Everything posted by Annehouw

  1. @Z3roCool You have received some good input here, but it might be wise to direct your efforts at the root of your problem. "I cannot see a way of getting natural RGB data with my current setup" - is that correct? I could not find a description of your image processing pipeline, so I am a bit blind in helping you in that respect. As has been mentioned earlier, correcting for light pollution (and gradients) and getting a "correct" colour balance is something that a lot of post-processing software is very good at. I see no fundamental problem with your camera, filters or your lens. There are a lot of people producing fine images with a set-up like that, also from a light-polluted environment. Doing this systematically: 1) Look at your image processing pipeline first as a way of tackling the RGB colour and light pollution problem. Astro Pixel Processor, PixInsight or StarTools are popular choices (a sketch of the idea behind their gradient removal follows below). 2) The effectiveness of OSC vs LRGB is a nice conversation topic while emptying a few bottles of whisky, but either will work well. I do not think this is at the heart of your problem. You already have the mono + RGB filter set, so... 3) A light pollution filter may get you a bit less light pollution in your shots, but gradients and residual LP will still be present, so it is back to item (1); and if you master item (1), there is no need for an LP filter. Maybe you could share your workflow and the natural RGB colour problem in a bit more detail, with images?
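For what it's worth, here is a minimal Python sketch of the kind of synthetic background modelling these packages do: fit a low-order polynomial through the dim "background" pixels and subtract it. This only illustrates the idea; APP, PixInsight (ABE/DBE) and StarTools are far more sophisticated, and the 30th-percentile background selection here is a crude assumption of mine.

```python
# Minimal sketch of gradient / light-pollution removal on a linear mono image.
# Illustrative only; real tools use much more robust background rejection.
import numpy as np

def remove_gradient(img, order=2):
    """Fit a low-order 2D polynomial to the faint background and subtract it."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y, z = xx.ravel() / w, yy.ravel() / h, img.ravel()

    # Use only the dimmest pixels as background samples (crude star/nebula rejection).
    bg = z < np.percentile(z, 30)

    # Design matrix for all terms x^i * y^j with i + j <= order.
    terms = [(i, j) for i in range(order + 1) for j in range(order + 1) if i + j <= order]
    A = np.column_stack([x[bg] ** i * y[bg] ** j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, z[bg], rcond=None)

    model = sum(c * x ** i * y ** j for c, (i, j) in zip(coeffs, terms)).reshape(h, w)
    return img - model + model.mean()  # subtract the gradient, keep the pedestal
```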
  2. Hello Ian, As nobody replied, I will give you my (limited) experience with an iEQ30 Pro. Maybe the following link is of help? Post #9 in this thread: https://www.cloudynights.com/topic/612459-ioptron-cem60-how-to-set-the-zero-point/ 1) As for the other questions: after powering down and tear-down, the stored alignment is certainly no longer valid, as the orientation of the mount in all three axes will be different. 2) Correct. I am primarily an imager. I polar align, point the counterweight shaft downwards and set the zero position to that position. (I have learned that my iEQ30 Pro will disable tracking if I do not set a zero position.) To get accurate pointing for visual use you need to do the alignment steps. Again, as I am an imager, I just point the scope somewhere, do a plate solve and after that go to my target (I use GoTo++ in APT, which does that in a recursive way: goto, plate solve, measure the distance from target, goto, plate solve, etc.; a sketch of that loop follows below). How crucial an absolutely correct zero position is if you want to do the manual alignment process, I do not know. 3) See the link above. Anne
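For illustration, a Python sketch of that recursive centering loop. The callables are hypothetical stand-ins, not APT's or ASCOM's actual API, and the correction ignores RA wraparound and cos(dec) scaling for clarity; it just shows the goto/solve/correct iteration.

```python
# Sketch of the recursive "goto, plate solve, re-goto" centering loop.
# goto(ra, dec) slews the mount; expose_and_solve() plate solves a short sub
# and returns the actual (ra, dec). Coordinates in degrees. Hypothetical API.
def center_on_target(goto, expose_and_solve, target_ra, target_dec,
                     tol_deg=0.01, max_iters=5):
    cmd_ra, cmd_dec = target_ra, target_dec
    for _ in range(max_iters):
        goto(cmd_ra, cmd_dec)
        ra, dec = expose_and_solve()
        err_ra, err_dec = target_ra - ra, target_dec - dec
        if max(abs(err_ra), abs(err_dec)) <= tol_deg:
            return True                 # close enough: on target
        cmd_ra += err_ra                # nudge the commanded position by the
        cmd_dec += err_dec              # measured pointing error and try again
    return False
```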
  3. Thanks for the nice write-up! Looks like inpainting works very well. I have a copy of Affinity Photo. Time to give this a try.
  4. Well done, Olly! What "very easy" star reduction technique did you use? I have tried virtually all of them (Content-Aware Fill in PS, Straton, Starnet++, PI StarMask, PI MaskGen script) and all of them work to some extent, but none can handle my bright stars with diffraction spikes, so I need to use quite some opto-neuro feedback processing (also called painting ;-)). Most of the algorithms presume that stars are round, and I read somewhere that Starnet++ has been trained on round-star images only. In the back of my head, I am thinking of experimenting with convolving a star mask with a single star image of a nice spiky star to get a spiky star mask (sketched below). OTOH, when you put stars back in, it does not have to be pixel perfect.
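To make that idea concrete, a small Python sketch of the convolution, assuming you already have a binary star mask and a cropped image of one representative spiky star (both names are made up here):

```python
# Sketch: convolve a binary star mask with the image of a single spiky star
# (the "PSF with diffraction spikes") to obtain a spiky star mask.
import numpy as np
from scipy.signal import fftconvolve

def spiky_starmask(starmask, spiky_star):
    """starmask: 2D array, ~1 at star centres; spiky_star: small 2D crop of one star."""
    kernel = spiky_star / spiky_star.sum()        # normalise the star image
    mask = fftconvolve(starmask, kernel, mode="same")
    return np.clip(mask / mask.max(), 0.0, 1.0)   # rescale to [0, 1]
```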
  5. Congratulations on this result! I have tried lucky DSO imaging and failed miserably. Here's my story; maybe there is something in it for you. After seeing the magnificent results of a French imager under the name exaxe on Astrobin, I decided to try this myself. A bit of research told me that to beat the seeing you need to image somewhere between 10 and 100 Hz. With those short exposure times you need a telescope that can collect a lot of light and a camera with low read noise. Read noise is getting lower every year with modern CMOS cameras. I decided upon a ZWO 290M (uncooled) camera. To get more light in early experiments, I decided to image at f/3.1. This gives an image scale of 0.6 arcseconds per pixel, and that is way better than my typical seeing of 2-2.5 arcseconds. Enough to get started. The plan of attack was as follows:
     - Select a bright subject
     - Take bursts of movies in SER (raw) format, at 10 fps (or higher, signal levels allowing)
     - Generate darks with the same approach (they are essentially bias frames at these short exposures)
     - Process these movies (I created about 150 GB in a single session!) into an end result, using about 30-50% of the data (only the sharpest subs; see the selection sketch below)
     - Take conventional colour images of the same object with my OSC camera at a scale of 1 arcsecond per pixel (smeared by the seeing, but that is OK as this is just the colour data)
     - Combine in an LRGB approach
     - Done 😀
     I found SIRIL to be an excellent piece of software to handle this: https://www.siril.org/ The result? A rather sharp image, but with horrible, horrible raining noise. Unusable. To try and improve this, I would have to start dithering while making the movies. Now this is doable, but all in all it would take so much time to perfect the technique that I decided to use the not-so-many clear skies around here to image with my well-known set-up. Maybe I will try again sometime in the future. It looks a lot like planetary lucky imaging, but with new challenges. For further inspiration, here is one of the few practical write-ups I know of: http://astrophoto17.eklablog.com/acquisition-du-ciel-profond-en-mode-rapide-c28772848
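Here is a rough Python sketch of the "keep only the sharpest subs" step, using variance of the Laplacian as a sharpness score. SIRIL uses its own (better) quality metrics; this just shows the principle:

```python
# Sketch of frame selection for lucky imaging: rank frames by a sharpness proxy
# (variance of the Laplacian) and keep only the best fraction.
import numpy as np
from scipy.ndimage import laplace

def select_sharpest(frames, keep_fraction=0.4):
    """frames: iterable of 2D arrays; returns the sharpest keep_fraction of them."""
    scores = [laplace(f.astype(np.float32)).var() for f in frames]
    cutoff = np.quantile(scores, 1.0 - keep_fraction)
    return [f for f, s in zip(frames, scores) if s >= cutoff]
```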
  6. I use APP for pre-processing, but it is the same thing. Here is what works for me. (The algorithm can only work on what you give it. It needs your guidance to work on pieces of sky background that should be evenly lit, but aren't because of light pollution.)
     - Stretch the image to the extreme (in lightness and saturation), so you see what is going on. Where is the sky background? Are there faint traces of nebulosity, faint spiral arms, maybe IFN?
     - Place points small enough that you have an even patch, and large enough to have sufficient statistics, in the places where you think there should only be an even sky background. Less is more. This is the most important thing imho.
     - Do the calculation and look at the correction model. Is it smooth? It should be.
     - When you have integrated images over multiple nights, the gradient in your image can be quite complex instead of a gradual transition and, as a result, the correction model will need to be a bit less smooth.
     - Confronted with this, place additional points along either side of the transition in light pollution to help the algorithm in modelling the complexity.
     - Look at the model again. Then look at your image. Totally flat is not what you will get. Remember that the background in your final rendition will be much darker than this.
     It is not a black art, but it is somewhat of a craft, and as in any craft you have to practice it a lot to get a feeling for it. (A sketch of the idea behind the correction model follows below.)
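To show what the correction model is, a minimal Python sketch: take the median level around each placed point and fit a smooth surface through those samples. The patch size and smoothing value are arbitrary assumptions of mine (and points are assumed not to sit within 5 px of an edge); APP's and PixInsight's models are much more robust:

```python
# Sketch of a background correction model built from manually placed sample points.
import numpy as np
from scipy.interpolate import RBFInterpolator

def background_from_samples(img, points):
    """points: list of (x, y) positions placed on clean sky background."""
    pts = np.array(points, dtype=float)
    # Median of a small patch around each point = local background estimate.
    vals = np.array([np.median(img[int(y)-5:int(y)+6, int(x)-5:int(x)+6])
                     for x, y in pts])
    # Smooth surface through the samples; smoothing keeps the model gentle.
    rbf = RBFInterpolator(pts, vals, smoothing=1.0, kernel="thin_plate_spline")
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    grid = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
    model = rbf(grid).reshape(h, w)
    return img - model + model.mean()
```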
  7. Here is something that you can try: if you are using the native ZWO driver in PHD, change to the ASCOM driver. I had the same observation as you, and changing the driver changed the typical SNR from 20 or so to well above 100. What I noticed in the guide frame is that there were fewer stars to choose from, but the star shapes were much smoother.
  8. The image is nice! I also see the gradient. You could go for another round of ABE or DBE, followed by SCNR and a new color calibration (PCC would work well, I think).
  9. What Olly says, plus you might try the Screen, Mask, Invert procedure (sketched below). See here: http://www.astronomersdoitinthedark.com/dslr_llrgb_tutorial.php If you go to the video clip listing, it is the 9th item. The rest of the tutorial is very good as well.
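For reference, my reading of what one Screen-Mask-Invert pass does to normalised [0, 1] data, as a tiny Python sketch. The formula is my own derivation of the layer stack (screen a copy of the image over itself through its own inverted mask); the linked tutorial is the authoritative description:

```python
# Sketch of one Screen-Mask-Invert pass on data normalised to [0, 1].
# Screen blend through an inverted self-mask works out to x + x * (1 - x)^2:
# faint signal is lifted, highlights are left almost untouched.
def smi(img):
    return img + img * (1.0 - img) ** 2

# Typically applied a few times until the faint stuff comes up:
# img = smi(smi(img))
```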
  10. Hi Mick, I normally speak Dutch, so doing it twice is not a big leap... I do not have this flattener, but I do have the 6A III; same procedure. The WO website recommends a spacing of 66.2 mm from the flat end of the flattener to your sensor (see the WO website and then choose "more information"). You do this by using extension tubes between the flattener and your camera, and you make it precise by using the rotation extension in the flattener itself. The numbers are approximate; you will have to star test it to get to the last fractions of a mm. Make short enough exposures of a star field so that tracking does not influence the result. If the star shapes in the edges are comets pointing to the center, you do not have enough distance. If the star shapes are little arcs rotating around the center, you are too far out. Another caveat: if you use a filter between the flattener and the sensor, this also changes the required distance (most often by less than a mm, though). Test with that filter in place... Hope this makes sense to you?
  11. I did not see it with refractors either. But fast optics present a whole range of challenges, and these kinds of reflections are probably part of that game. Good to see that this one is now behind you!
  12. Could well be light pollution. I have something similar in all of my images from home (although the bottom right corner is a bit extreme, maybe). DBE, ABE or something similar will probably take care of it.
  13. @Allinthehead: Here is a recent example. I did some work in post to minimize them, but unfortunately I cannot get rid of them entirely. Of course this reflection is on all my stars, making them bigger than they need be.
  14. Interesting. I have the same sort of "suction cups" and use a QHY168C (same sensor as the ZWO 071). I have attributed these artifacts to my corrector lenses, but it might be in the camera after all. I do agree that a fast system is super addictive 😉
  15. I am using my (now 20-year-old) Hypergraph RC. It is a 32 cm F3.1/F9 system. I normally use it in the F3.1 configuration. See my avatar for a picture of it (in winter dress). Think of it as a sort of Celestron SCT with a Hyperstar lens (the camera is at the front). It is no longer made, as the designer now only makes much bigger scopes for professional use. The speed of the system helps imaging from Bortle 5. Still, it needs a lot of imaging time to overcome light pollution. I normally stick to targets that are close to zenith to maximize my chances. I am fairly certain that you can image these streamers with a smaller and/or slower instrument, provided you have dark skies. In the case of M63, there might be something else at play. You don't need a lot of exposure time to make a pretty picture of the galaxy. The brightness and the dust lanes, which to my eyes seem to hover in the halo above the plane of the galactic disc, are easily recorded. In astrophotography time is always scarce, so you have to decide when enough is enough. Most people probably do not know the extent of the halo and the streamers and hence stop imaging after a few hours. To be honest, I did not know of the existence of the streamers. I wanted some of the halo extending from the galaxy disc, but found out that there is more, and that's when I started looking for more information.
  16. ...and here is a version processed specifically for the streams. Too noisy to be a nice picture, but it shows the phenomenon (discovered in 1979....). A few hours of processing produces a less grainy appearance 😉
  17. What a nice image to look at! Nice composition and great imaging and processing.
  18. M63 has a remarkable halo and shows a few interesting tidal streams in really deep images. All of this is thought to be the result of mergers with companion dwarf galaxies (as our own Milky Way has done and still is doing). A fun fact is that the halo-to-total-mass ratio is about 12 percent, which is the largest ratio in any galaxy known to date.* Unfortunately my Bortle 5 skies do not co-operate in showing the faintest of the faint stuff. The halo is OK, but there is only a hint of the brightest of the tidal streams (left side, going in an arc over the top and into the right side). I spent about 22 hours imaging this subject. Really bringing out the tidal streams would have taken double that or so. In view of the sparse amount of imaging time here (I have not taken a single exposure between November and March!), I think this should be final. *Monthly Notices of the Royal Astronomical Society, Volume 454, Issue 4, 21 December 2015, Pages 3613–3621 (if you do an internet search with this string, you will find an interesting article on M63, its halo and tidal streams)
  19. Hi Apprentice, First and foremost: are you sure the mirror needs cleaning? It needs to be very dirty before you notice anything at all. So, if it is just a little bit dusty, leave it alone. If you insist on doing this, here is a procedure I follow. You do not need special cleaning liquid, just some dishwashing soap, lukewarm water, distilled water, cotton balls, a blower (not from a can, just the one you squeeze) and a bucket.
     - Pour a few centimeters of lukewarm water into the bucket and add a bit (only a bit) of dishwashing soap. Make sure it is well mixed.
     - Gently take the mirror out of the scope, being careful not to touch the surface with your fingers.
     - Put the mirror in the bucket.
     - Let it soak for half an hour or so.
     - Now take the mirror and set it on its side, while holding it. Soak a cotton ball in the water of the bucket and gently sweep it over the surface of the mirror without putting pressure on it: a movement parallel to the surface of the mirror, radial from the center of the mirror to the outside. Change cotton balls a few times.
     - Once done, take the mirror out of the bucket and lay it on its backside on a towel. Throw away the cleaning water and rinse the bucket.
     - Set the mirror at an angle in the bucket again (supported by your hand) and, with the other hand, pour distilled water over the mirror (it can come directly out of the bottle). Make sure all the soap is washed away.
     - Take the mirror out of the bucket again and put it on another towel, but this time standing at an angle, supported by one hand.
     - Use the blower to blow away any remaining water droplets.
     - You can now put it on the towel to dry the backside of the mirror and then re-insert it into the scope, being very careful not to touch the surface of the mirror.
     - Admire yourself in the mirror.
     It looks more complicated on paper than it is in practice. If you google "telescope mirror cleaning" you will find several other procedures that will work as well. But again, if it ain't broke, don't fix it.
  20. Here is my version. It was color calibrated using spectral reference stars. Saturation is a very personal thing though.....
  21. In APP: do you use DDP in the preview filter to generate the file for export to Photoshop?
     - When you choose "no stretch", is the blue star burned out already? If so, your subs are too long. In order to preserve star color, subs usually need to be on the short side.
     - If the answer to the above is "not burned out", experiment with the DDP settings until you see a nice image, but not one so bright that you lose the star color.
     - Save the image.
     - Bring it into PS for the last mile, but be careful with further stretching! (A rough sketch of what a DDP-style stretch does follows below.)
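As an aside, a rough Python sketch of what a DDP-style stretch does. This is a simplification of my own (roughly x / (x + k)); APP's actual DDP filter has more parameters:

```python
# Sketch of a DDP-style stretch: x / (x + k) brightens faint signal strongly
# while compressing star cores far less brutally than a hard curve.
import numpy as np

def ddp_stretch(img, k=0.01):
    x = img / img.max()   # normalise to [0, 1]; k is the "break point"
    y = x / (x + k)
    return y / y.max()
```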
  22. I have had a few over the years: AstroArt, MaxIm DL, Nebulosity. But I ended up with APT. It is cheap and has a very rich function set. It can handle DSLRs as well as CCD or CMOS cameras. Fast plate solving, automatic meridian flips, repeatable pointing in plans, etc. And not to forget: voice progress notifications 😉 But the most important thing for me is that the developer (Ivo) is very open and responsive. I noticed that the unguided dithering feature did not do what I was expecting. I sent him an email with my finding and my guess as to where the algorithm went wrong. No "not invented here": within a few days I had a special beta to test (which solved the problem), and after a few weeks it was in the official version as well.
  23. Adding to this: if you look at your single subs, is that star saturated (unstretched)? If it is, there is no way to get (a lot of) color back. If it isn't, then it is just a matter of processing. Stars are hard 😉 (A quick way to check is sketched below.)
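A quick sketch of that saturation check in Python, assuming a 16-bit mono FITS sub (astropy for the FITS read; the limit and margin are assumptions to adapt to your camera):

```python
# Sketch: load a single raw sub and see whether the star's peak sits at the
# sensor's full-well / ADC limit. Assumes a 16-bit FITS sub.
import numpy as np
from astropy.io import fits

def is_saturated(path, limit=65535, margin=0.99):
    data = fits.getdata(path).astype(np.float64)
    return data.max() >= margin * limit

# If this returns True on your unstretched subs, the colour is gone for good;
# shorter exposures (or a separate short set for the stars) are the fix.
```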
  24. Hi Adam, It would help to have some details: scope, exposure time and your processing steps. It looks to me as if you have stretched the stars too much, leading to loss of color saturation. What software are you using?
  25. Thanks all! As for La Palma: the fun fact is that it is a regular tourist destination, so cheap flights and often cheap accommodation (I am speaking of the recent past... for now you cannot go there). Add a car and you are all set. There are many dark places to image (or observe). The top of the volcano (Roque de los Muchachos) is very dark and above the inversion layer. It might not be the best spot, though. It often is very windy (there is a reason the pros have their equipment in big domes up there).