Everything posted by vlaiv

  1. Same as plate solving, which uses stars and the distances between them, but this method is simpler: record a planet and measure its diameter in pixels, then open Stellarium, set the time to that of your recording, and note the apparent diameter it reports. Divide the two to get the arc seconds per pixel value for your scope and camera.
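     The final division is trivial, but as a sketch (all numbers here are hypothetical placeholders, not values from any real recording):

     ```python
     # Hypothetical example values - substitute your own measurements.
     planet_diameter_arcsec = 20.5  # apparent diameter read from Stellarium
     planet_diameter_px = 41.0      # diameter measured on your recording

     arcsec_per_px = planet_diameter_arcsec / planet_diameter_px
     print(f'Sampling rate: {arcsec_per_px:.3f}"/px')
     ```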
  2. Have you tried sigma clip stacking instead of regular stacking? I don't know APP at all, but I do know that you can remove hot pixels with some sort of sigma clipping, and that you usually need to enable / select that feature in the software.
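     I don't know how APP implements it, but a generic kappa-sigma clipped mean stack (a numpy sketch of the general idea, not APP's actual algorithm) looks roughly like this:

     ```python
     import numpy as np

     def sigma_clip_stack(subs, kappa=2.5, iters=3):
         """Mean-stack subs, iteratively rejecting per-pixel outliers.
         Hot pixels land far from the per-pixel mean and get masked out."""
         data = np.stack(subs).astype(np.float64)
         mask = np.ones_like(data, dtype=bool)
         for _ in range(iters):
             mean = np.nanmean(np.where(mask, data, np.nan), axis=0)
             std = np.nanstd(np.where(mask, data, np.nan), axis=0)
             mask = np.abs(data - mean) <= kappa * std
         return np.nanmean(np.where(mask, data, np.nan), axis=0)
     ```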
  3. Don't forget that the sensor is not right at the front of the camera but a little way inside - depending on the model, it can be something like 6.5mm in. Otherwise, yes, you seem to have it covered.
  4. As long as you've accounted for the optical path length of the filter wheel and you are happy with the magnification you are getting - it should not matter.
  5. That should be easy then ... @FLO What is supposed to be done with a SW scope that has no collimation screws (by design) yet goes out of collimation?
  6. Ubuntu should be a pretty stable distribution. I use it all the time on servers and it just works. A 64-bit version is available for 64-bit desktop processors. Astroberry and other RPI software are 32-bit because the architecture is 32-bit ARM. I believe that most RPI astro distributions in fact use Ubuntu as the base OS.
  7. Did you look at the above image, and especially the stats for it? Green seems to have been captured at half the rate of the other two colors - maybe such an approach would be best. You get proper color information, but if green is not as important for the target, maybe only spend half of the usual time on it and use the rest for R and B.
  8. What do you consider to be the best result - the most accurate color reproduction, or "the most pleasing" one? Mind you, the latter can be quite influenced by the images we see online, which may be processed to people's liking rather than to true color. OSC cameras are inherently more capable of color reproduction than mono+RGB - at least with RGB filters as most vendors make them. There are a few exceptions, whose filters don't have straight edges, that are better for color reproduction - but no astro camera reproduces color "out of the box". Color needs to be processed carefully if you want to get the true color of an object.
  9. Adjusting the secondary won't cut it in this case - you need to adjust the primary mirror, as it is parabolic. If the scope does not have an adjustable primary mirror - send it back. The fact that they did not include collimation screws in the primary mirror cell points to their belief that the scope should stay in collimation for the end user. Since yours hasn't, it's not working as expected, and this is not your fault - reasonable grounds for returning the scope, I would say.
  10. Because you can't really do that if you want to be accurate. There are cases where you can get exact information by skipping one of the channels - like doing LRGB and dropping any one of the channels, going for RGB, LGB, LRG or LRB - provided that your RGB filters precisely add up to the L filter and do not overlap. With the above approach you are effectively making up one color, in this case green. There is no reason for green to equal the average of R and B, but sometimes, for some targets, it works well enough. My idea was simply: why not use R and G the same way and make up blue? It might not work as well on this target, or maybe another formula will give better results here. The point is, in each case data is being made up to speed things up, and as long as you can get away with that, it is fine. Btw, there is a very nice image captured by a fellow astronomer from my country - have a look here: https://aristarh.rs/astrofotografija/mars-september-21-2020/ Captured at the same time, although @CraigT82 travels in time, judging by the fact that he has this image taken in the next hour or so. This was taken with the full RGB approach, and we can see that synthetic G sort of works - but maybe the clouds could be a bit thicker and richer in color with regular G.
  11. The above image was from a linear stack. There are some residual issues, either from flats or something else - see that darker line at the top, even after the crop. After binning, the data was rather smooth and OK to work with; I did not do much noise reduction at all. Here is what I did:
      1. Loaded the posted file, cropped away artifacts and separated the channels
      2. Loaded those separate channels into ImageJ - there I binned the data x3 and removed the background and background gradient
      3. Found a B-V 0.42 star and did a photometric measurement on it - for color balance
      4. Created an artificial luminance after color balance (L = 0.2126*R + 0.7152*G + 0.0722*B)
      5. Stretched the luminance in Gimp - levels, curves and a bit of noise reduction
      6. Loaded the RGB data in Gimp and did a color compose to get back a color image
      7. Pasted the stretched luminance over the RGB image (which was not stretched) and set the layer mode to Luminance
      8. Did a temperature correction to remove the yellow cast imparted by the atmosphere
      9. Boosted saturation
      10. Desaturated yellow, as the core was really yellowish/orange
      11. Saved as 8bit data ...
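      The binning and artificial luminance steps are easy to reproduce outside ImageJ; here is a minimal numpy sketch (the 0.2126/0.7152/0.0722 weights are the Rec.709 luminance coefficients quoted in the formula above):

      ```python
      import numpy as np

      def bin3(img):
          """3x3 average binning; crops so dimensions divide by 3."""
          h = img.shape[0] - img.shape[0] % 3
          w = img.shape[1] - img.shape[1] % 3
          return img[:h, :w].reshape(h // 3, 3, w // 3, 3).mean(axis=(1, 3))

      def synthetic_luminance(r, g, b):
          """Artificial luminance from colour-balanced channels."""
          return 0.2126 * r + 0.7152 * g + 0.0722 * b
      ```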
  12. Oh, I had a blast processing this data ... This is not how I would normally process the data, but I wanted to recreate as much as possible that "popular" look that most people go for - boosted saturation, too much blue and dark red and so on ...
  13. So your green is then (R+B)/2, if I'm not mistaken? G = (R+B)/2 => B = 2*G - R There you go: put in the G layer twice, set to addition, and the R layer once, set to subtraction, and you have a synthetic B layer from G and R. (I thought that synthetic G was a bit more involved than this.)
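      In code, the rearrangement above is just the following (a numpy sketch; the clip to zero is my own addition to avoid negative pixel values):

      ```python
      import numpy as np

      def synthetic_blue(r, g):
          """If G was synthesised as (R+B)/2, invert it: B = 2*G - R."""
          return np.clip(2.0 * g - r, 0.0, None)
      ```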
  14. The same way you create synthetic G, but in the other direction? How do you create synthetic G now? There must be some math involved, and I bet that we can "reverse" it to create B from true R and G.
  15. I don't use APP, so I can't tell. I think that @ollypenrice uses it quite a bit. Olly, could you shed some light on this? Maybe it is a Gamma setting or something like that?
  16. Why don't you try R and G and make a synthetic B? B is rather susceptible to seeing because of its short wavelengths. Maybe G will be less affected and you'll get a somewhat sharper image (less of a dent in the planet)?
  17. This data looks like it has had curves applied to it - which means it is not linear. This is what linear data for M31 should look like (roughly): the core is much brighter than anything else, and there is only a hint of other structures in the image. Here, for example, is a single luminance sub from my M31 a few years back: The histogram of such an image is much narrower: Something non-linear has been done to your image, and once you do a non-linear transform, you lose the ability to easily fix the color balance or remove gradients (the algorithms for that assume data linearity).
  18. Usually that would mean linear data without any manipulation, but let's work with what we have: There you go - yellow cast removed and colors closer to what you would find in a galaxy of that type. This should really be done while the data is still linear - color calibration / balancing and gradient removal - but it can be done.
  19. I usually just attach it here in the thread, but for very large files I've seen people use Dropbox, Google Drive, possibly WeTransfer or similar. In the "My profile" section there is a "My attachments" part - you can see how much space you've used up here on SGL. There is a 50GB limit, I think - which is plenty, I suppose.
  20. Processing astro photos is a skill to be learned. Maybe watch some videos explaining the process. You can also post your image here (32bit FITS/TIFF is the best option) for other, more experienced members to show you what the image really contains (there is usually more to the data than people starting out with processing are able to show).
  21. I think there is more to this data. Here is a quick extract of the green channel as luminance and a bit of stretching: Once the gradient is removed and the color calibrated and applied to this luminance, I think there could be a rather nice image. Your original rendition is too blue and the stars are without color - as if the whole image were monochrome with a blue cast over it. The data is not like that - there is quite a bit of red and yellow in there as well.
  22. Ok, that calculator is misleading. First of all, 3"/px is definitely not gross undersampling - it is just wider-field imaging, and you don't have to worry one bit about being undersampled. This is actually a very good sampling rate to be at when you have an 80mm scope. It is much better to see what sort of FOV you'll be getting on particular targets and to choose targets that suit said FOV. For example, M31 will barely fit into the FOV with this combination unless you take care to orient it properly: The Pleiades will be framed nicely ... So will Orion's nebula: The Heart nebula as well (not rendered very nicely here, but you can check it in Stellarium). In any case, this is a very nice combination for a large number of objects; it will give you many imaging opportunities and will be a nice skill-honing platform - don't rush to change anything until you've had a bit of fun with this combo.
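      For reference, the sampling rate a calculator like that reports comes from the standard small-angle formula; the example numbers below are made up for illustration, not anyone's actual setup:

      ```python
      def sampling_rate_arcsec_per_px(pixel_size_um, focal_length_mm):
          """arcsec/px = 206.265 * pixel size [um] / focal length [mm]."""
          return 206.265 * pixel_size_um / focal_length_mm

      # e.g. a hypothetical 3.76um-pixel camera on a ~258mm focal length scope
      print(sampling_rate_arcsec_per_px(3.76, 258.0))  # about 3 "/px
      ```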
  23. It could be that 0.0 degrees in Stellarium is relative to your plate-solved rotation. This way you can orient the FOV in Stellarium to your liking and it will tell you how much to rotate your camera in relative degrees - not absolute orientation? Try rotating your camera about 120 degrees, doing another plate solve and then syncing, to see if the FOV in Stellarium changes orientation and if you are closer to what you want.
  24. Not sure how your plate solving works, but here is the general "workflow" for getting to an exact location:
      1. Tell your mount to go to the selected coordinates
      2. Take an image
      3. Plate solve
      4. Sync
      5. If you are not at those exact coordinates - go to step 1, else done
      And yes, it will take a couple of adjustments of camera rotation to get to the wanted angle - plate solve should give you that information too - just adjust until you are close enough.
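      The goto/solve/sync loop above can be sketched as follows; `mount`, `camera` and `solver` are hypothetical stand-ins for whatever objects your imaging software exposes, not a real API:

      ```python
      def center_on(target_ra, target_dec, mount, camera, solver, tol_deg=0.01):
          """Iterate goto -> capture -> plate solve -> sync until on target."""
          while True:
              mount.goto(target_ra, target_dec)  # step 1
              image = camera.capture()           # step 2
              ra, dec = solver.solve(image)      # step 3
              mount.sync(ra, dec)                # step 4
              # step 5: close enough? (crude error estimate in degrees)
              if max(abs(ra - target_ra), abs(dec - target_dec)) <= tol_deg:
                  return ra, dec
      ```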