Everything posted by Bibabutzemann

  1. @pipnina Could you solve your issue with the blotchy background? I started a new thread about this issue. To have some comparison, it would be awesome if you could stack your flats, but additively, to enhance the problem. If you send me that image, I could compare it to mine and see if it's the same problem. I found out that the shape of that complex gradient changes when I move the focuser in or out. Here is the gif which shows what happens when I move the focuser from all in to all out. It contains 9 frames. Each frame is an additive stack of 20 flats with the vignetting removed:
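The additive stacking described above (summing flats instead of averaging them, so a fixed gradient grows faster than the random noise) can be sketched in a few lines of NumPy. This is a minimal illustration with synthetic frames, not the actual stacking done in the thread; the frame size, gradient shape and noise levels are made-up assumptions.

```python
import numpy as np

def additive_stack(frames):
    """Sum a list of equally sized 2D frames into one float image.

    Summing (rather than averaging) scales any fixed pattern up linearly
    with the frame count, while uncorrelated noise only grows with the
    square root of the count -- so a faint flat-field gradient stands out.
    """
    stack = np.zeros_like(frames[0], dtype=np.float64)
    for frame in frames:
        stack += frame
    return stack

# Synthetic example (assumed numbers): 20 "flats" sharing the same weak
# left-to-right gradient of 1 ADU, plus random noise of sigma = 5 ADU.
rng = np.random.default_rng(0)
gradient = np.tile(np.linspace(0.0, 1.0, 100), (100, 1))
flats = [1000 + gradient + rng.normal(0, 5, (100, 100)) for _ in range(20)]

summed = additive_stack(flats)
# In the sum the gradient amplitude is ~20x that of a single flat,
# while the per-pixel noise only grew by ~sqrt(20) ~ 4.5x.
```

The same idea applies to real flats once they are loaded as 2D arrays (e.g. from debayered raw files).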
  2. I get the same problem without the L2 filter. My camera is full spectrum modded, so that's why I'm using the L2 filter. I tested without the L2 filter and still have the same gradient. It would have surprised me if the L2 filter caused the issue, since it's a clip-in filter and therefore always rotates with the camera. I'm sure the gradient is independent of the camera, because, like I said, when I rotate the camera by 90 degrees, the gradient appears rotated by 90 degrees in the image. There is only one way I could change the form of the gradient: when I change the focus. But I don't know what to make of it. EDIT: I made a gif which shows how the gradient changes when I move my focuser from all in to all out. It contains 9 frames, and each frame is an additive stack of 20 flats. I only removed the vignetting.
  3. I didn't mean that you should use NINA for capturing, just the handy framing tool. You can use it on your normal PC without connecting to your gear. Of course it's always wise to review your data after a few hours to see if it's worth it. But before that I would still check the framing, especially the proper angle, so you don't need to start all over again. @alacant I would be interested to see comparison images with the same gear on different targets. With such dim objects I doubt that the UHC would perform very well, but I'm happy to be proven wrong. @carastro There are some good results on Astrobin done with the L-eXtreme and a colour camera.
  4. This is an OIII target. The L-eXtreme should give better contrast and, because of the narrower band, less light pollution gradient to deal with.
  5. If you want to go for such a long project with a dim object, I would use something like the N.I.N.A. framing tool to see which focal length and angle fit better. Since you are using the L-eXtreme, I don't see the benefit in additionally using a UHC filter. But for prettier star colours you could add some RGB data.
  6. Hi, I'm using the EOS 1200Da with the Astronomik L2 filter. I only work with raw files, in this case at ISO 1600. But as I mentioned, the pattern rotates when I rotate the camera, so I doubt that another camera would solve this issue.
  7. Hi, I have a problem with my 130 PDS. After doing background extraction on my calibrated and stacked image, I'm still left with a complex background gradient: The problem doesn't come from "complex" light pollution. Here is a comparison between a flat frame stack (additive, to visualize the problem) and around 70 light frames without registration and no flat frames (for better comparison). I applied background extraction to both images to remove the vignetting, so that I'm only left with the complex gradient. On both images it's almost identical. The light blue circle in the middle comes from the MPCC III. As you can see in the first image, the circle gets calibrated out just fine. But the flats can't fully remove the blotchy background, even though it's similar. So I did some tests to find the source. The problem doesn't come from the camera or coma corrector, because when I rotate my camera by 90 degrees, the pattern also gets rotated by 90 degrees in the image. So my next try was to look into the focuser with my flat panel attached. I saw some bright reflections on the corner of the secondary mirror and on the outside of the focuser tube. So I flocked both areas, and I also covered the back of the primary mirror to avoid a light leak. Again, I took some flats and stacked them additively to see if the problem got smaller. Here is the result: no difference. I'm slowly running out of ideas now. Maybe someone could give me a hint. Thanks for reading.
  8. Wow, the performance of this lens, even at the edges of a 6D, is amazing. Which f-ratio was used?
  9. Depends on your budget and your scope. For example: if you have a scope that can correct for a full frame sensor and a budget of 800 dollars, I would buy a used EOS 6D over a cooled ASI 533. I agree with the others; I would always save up for a Peltier-cooled camera before buying a fan-cooled one.
  10. Your flats are working nicely. Both of your images show a typical light pollution gradient coming from the south. I noticed that your single frame has only a little light pollution compared to my single frames. I don't know how much you stretched to make it so obvious. I also get this weird uneven background when I stretch my final image too much after doing background extraction in Siril, but it doesn't look as strong as in your image. But maybe you just stretch differently.
  11. You can simply calculate it as follows: inner diameter of the dew shield = outer diameter of the scope = 160 mm. The mirror diameter is 130 mm, so the distance between the mirror edge and the dew shield is 15 mm. This distance determines how long the tube plus dew shield is allowed to be (given the sensor size). The light path in the image is the outermost light that isn't obstructed by the dew shield: we have tan(alpha) = 15 mm / (540 mm + x), with alpha = FOV/2, so x = 15 mm / tan(FOV/2) − 540 mm. For an APS-C camera the FOV is 2.36°, so the maximum dew shield length is x = 189 mm. Note that this isn't the total dew shield length, only the part measured from the end of the tube. So as long as you are under 19 cm, you are right to assume that your dew shield doesn't affect the light path. Taking flats with a dew shield can also cause additional problems when it's a flexible dew shield. In which way did those mods improve your images, or how did you conclude that they help? If the cloth you are using for flats is too thin, internal reflections could affect your flats. The daylight sky is already too bright for a normal shirt in my experience. Maybe you could give us an unstretched single calibrated frame; then it's easier to see what is causing your gradients. I would use a flocked dew shield as a first measure to block out stray light. It's easy, risk-free and protects the secondary mirror from dew.
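The dew shield formula in the post above is easy to check numerically. Here is a small sketch that just rearranges tan(FOV/2) = clearance / (tube length + x); the function name and the specific numbers (15 mm clearance, 540 mm to the tube opening, 2.36° APS-C field of view) are taken from the post, everything else is an illustrative assumption.

```python
import math

def max_dewshield_length(clearance_mm, tube_mm, fov_deg):
    """Maximum dew shield extension beyond the tube before it clips the light cone.

    clearance_mm: radial gap between mirror edge and dew shield wall
    tube_mm:      distance from the mirror to the end of the tube (540 mm here)
    fov_deg:      full diagonal field of view of the sensor in degrees
    """
    half_fov = math.radians(fov_deg / 2.0)
    return clearance_mm / math.tan(half_fov) - tube_mm

# Numbers from the post: 15 mm clearance, 540 mm tube, 2.36 deg APS-C FOV
x = max_dewshield_length(15.0, 540.0, 2.36)
# x comes out around 188 mm, matching the ~189 mm quoted in the post
# (the small difference is just rounding).
```

Plugging in a full-frame field of view instead of the APS-C value shows immediately how much shorter the allowed dew shield becomes for a larger sensor.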
  12. Here is my attempt at M101. The stars are bloated; I don't know if this is due to bad seeing. But I didn't want to apply star shrinking, because it would most likely also shrink small background galaxies. So I decided to just let it be. 130 PDS, EOS 1200Da cooled, EQ3 Pro, 16 hours at ISO 1600, Bortle 5.
  13. Thanks again Vlaiv! This cleared lots of things up for me. Now it also makes sense to me why Lee gets better results with the 2.5x Barlow: his theoretical optimum would be 3.75 × 5 = 18.75 with the ASI224MC, so without the Barlow it is undersampled at f/10, and with the 2.5x Barlow it's only a bit oversampled at f/25. I agree that theoretical limits shouldn't discourage anyone from trying out setups that are not considered optimal. Like you said, there are many variables we could overlook 😀 But I think it's certainly helpful to know the limits, so when there are unexpected results one doesn't immediately jump to wrong conclusions. For example, a newer camera with more pixels may lead to better results, even though the old camera was already oversampling. That doesn't mean heavy oversampling is suddenly a good thing; physics won't change, but the new camera simply has a more advanced sensor (higher QE, less noise etc.). Also, I think it adds another layer to this awesome hobby to learn about the science behind the instruments we are using, likewise understanding the science behind the objects we are imaging 😀
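The optimum-sampling arithmetic above can be written down explicitly. Note this encodes only the rule of thumb used in the post (critical f-ratio ≈ 5 × pixel size in microns for a colour camera); the factor of 5 is an assumption carried over from the thread, not a universal law.

```python
def optimal_f_ratio(pixel_size_um, factor=5.0):
    """Rule-of-thumb critical f-ratio for planetary imaging.

    The post uses F ~= factor x pixel size (in microns), with factor = 5
    for a one-shot-colour camera -- an assumption from the thread, since
    the exact factor depends on wavelength and on mono vs. colour sensors.
    """
    return factor * pixel_size_um

# ASI224MC: 3.75 um pixels
f_opt = optimal_f_ratio(3.75)
# f_opt == 18.75, so f/10 without a Barlow undersamples,
# while f/25 with a 2.5x Barlow is only slightly oversampled.
```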
  14. Thank you for your answers 😀 The argument with 0.05" details that can be seen was regarding a 254 mm scope. The airy disc size would be 0.48". The airy disc of a 106 cm scope is 0.115". Since we are not (as) seeing limited with short exposures, I can see why it would make sense to sample at 0.07" with that scope, which isn't even half the size of the airy disc. So this would agree with what Vlaiv explained about the Nyquist sampling theorem, and with that scope a 0.05" detail is therefore less "blown up". (I don't consider lucky imaging processing here, only single ultra short exposures.) Good point, I have to think more about that. Your Jupiter image looks amazing btw! I really appreciate you taking the time and effort to explain this stuff in a simplified way. So let's say the seeing blur is 2" and the airy disc sizes for a 5" and an 8" scope are 0.96" and 0.61". This means the resulting FWHM would be 2.22" for the 5 inch scope and 2.09" for the 8 inch scope (I assume a guiding error sufficiently low to be negligible). Seems like just a tiny difference. But could it be true that for the 8" scope the curve falls more steeply before it hits the cut-off frequency? So that there is already more energy in those high frequencies right before the cut-off, resulting in more contrast even without sharpening. Like that:
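The FWHM numbers quoted above follow from adding the blur sources in quadrature, as if each were a Gaussian. A quick sketch (the Gaussian approximation itself is an assumption; the true seeing-convolved airy pattern is not exactly Gaussian):

```python
import math

def combined_fwhm(seeing_arcsec, airy_arcsec, guiding_arcsec=0.0):
    """Approximate total FWHM by adding blur sources in quadrature.

    Assumes each blur source is roughly Gaussian, so their FWHMs add as
    sqrt of the sum of squares. Guiding error defaults to zero, matching
    the "negligible guiding error" assumption in the post.
    """
    return math.sqrt(seeing_arcsec**2 + airy_arcsec**2 + guiding_arcsec**2)

fwhm_5in = combined_fwhm(2.0, 0.96)  # 5" scope: sqrt(2.0^2 + 0.96^2) ~ 2.22"
fwhm_8in = combined_fwhm(2.0, 0.61)  # 8" scope: sqrt(2.0^2 + 0.61^2) ~ 2.09"
```

This reproduces the 2.22" and 2.09" figures in the post, and shows why 2" seeing swamps the aperture difference: the smaller term enters only as its square.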
  15. Wow, this topic is really fascinating, but I'm also confused now. I don't want to "heat up" the discussion though, I'm just curious. There is science behind it; this is nothing philosophical. I think it's fair to say that there are so many variables involved that we can't deduce from this comparison that the Apo image would benefit from smaller pixels. I think it would be better to try out two cameras with different pixels during the same night with the same scope. @vlaiv showed with a downsampled Hubble image that with a resolution of 1"/pixel much more detail can be achieved. But is it that simple? What would happen if we downsampled the Hubble image to 2"/pixel, would it still look sharper than Lee's images? Also, according to Lee's source, Damian Peach, a scope can display details far below the Dawes limit. But that doesn't mean that those details are displayed proportionally correctly in the image, right? After all, even the image itself certainly doesn't have small enough pixels to show 0.05" fluctuations. But certain details have such a big contrast that the detail is still visible, just in a blown up way? Anyway, this isn't as seeing limited, because of the shorter exposures in planetary imaging. So now, regarding long exposures, I have another question: does "seeing limited" automatically mean that you won't get a sharper image when you increase aperture? Let's say the seeing is 2". Does that mean every scope that has an airy disc <2" can display exactly 2" resolution? I would think no, because both effects (airy disc and seeing) combine to a total resolution of >2", and a bigger aperture means that you will come closer to the 2" seeing limitation. Does that make sense? As you can see, I don't have much of a clue, so maybe someone can recommend a scientific paper or book where I can read more about this stuff? Anyway, @Magnum I think both your images look excellent and the processing was done quite tastefully (PixInsight?)
  16. Just to clarify something: first, my initial statement "That attempt looks more natural" was in comparison to his other attempts, where the background looked spotty and stars had big halos. None of those attributes represent real fluctuations in brightness. I didn't say it's the most natural one can achieve. We could discuss endlessly what is the most natural. And like I said, it's not always my prime goal to go "natural", and I don't think it's automatically the best. Like you said, cameras collect data differently than human eyes. It's photography, not observational astronomy. But Pitch Black Sky wanted to achieve a more natural picture, so there is that. I think Vlaiv's statement that he wouldn't call it astrophotography anymore was more related to your statements, such as that you do to your image "whatever you feel like". Topaz Denoise is where some people draw the line. I mean, there has to be a line somewhere, even if it's not a sharp line between art and photography. And again, we could now discuss semantics endlessly, but when someone alters his data so much and adds artificial data, he shouldn't feel surprised if others don't consider that work in the same realm (call it astrophotography or whatever) as that of others who only want to show real data.
  17. Well, I never said it is 100%. But it certainly comes closer to it than brushing away certain details. And instead of the eye-tracker thing, you could just see it this way: if you looked at the background with the telescope, you would see all those small galaxies. In the image you should see them as well if you look at that part 😉
  18. Well, since we haven't exactly defined what natural means, we could discuss semantics to no end. So I'm happy to abandon that word and simply say what I mean: by natural I mean that my picture shows what a gigantic telescope outside of Earth would show to the human eye. I do sharpening, noise reduction and background extraction to reduce the optical effects caused by atmospheric turbulence, light pollution, thermal noise etc., because those things don't come from outside of Earth. Sometimes my pictures are less "natural" (in the sense described above) when I want to focus more on bringing out certain details. That's where narrowband, for example, comes into play. But the data is still there; it's still light gathered from outside of Earth. I don't want to add artificial data in post processing. That's where I draw the line. I also take huge joy in finding quasars and galaxy groups in the background of my main object. For me it's not just art. There is a fascinating story behind all those small dots you see in the background. And colours are important to me because they tell whether an object is perhaps redshifted, or contains huge amounts of young stars or old stars. If I understand the physics behind it, it makes the image even more beautiful.
  19. Looks much more natural to me than your other attempts. I took this image into Darktable, sharpened with highpass and contrast equalizer and saturated the colours: Regarding Topaz: I do not use that software, because as I understand it, it doesn't do "simple" mathematical operations on your image to enhance what is already there. It makes educated guesses to clean the image up, based on the millions of images the neural network was fed with. Even if it looks natural, it is not. But that's preference. Some people are more into the artistic aspect of the hobby. You will find out for yourself where you want to draw the line in post processing. For me it's much more satisfying to show only the data I collected with my gear, even if that means a less clean look. @powerlord Well, you could also just use Hubble images as a luminance layer and go for it 😆
  20. I have no idea how StarTools works, but those big black halos around the stars make the image look unnatural. In PS/Darktable this happens when I sharpen too much or when the sharpening radius is too big. Also, I would really avoid clipping the background. You will lose so much of the precious data you collected; all the small galaxies in the background get lost. This also adds to an unnatural look. The third thing that stands out to me is that there are barely any colours left, but I guess that's preference. Again, I don't know StarTools or which steps cause these issues. I only tried StarTools during a trial period and realized I got the results I wanted faster with other software. Just for fun, I would try processing with Siril + Darktable, which are both free and easy to use.
  21. https://www.lightpollutionmap.info/#zoom=1.19&lat=14.7492&lon=-96.9789&layers=B0FFFFFFFTFFFFFFFFFFF This website will give you a rough estimate for your location.
  22. I always thought the focuser was really stable. I would have thought flex in the tube would be a bigger concern than the focuser. How did you measure the weight-induced tilt?
  23. Every scope gives you vignetting; there is no way around flats. As long as the vignetting is centered, everything is fine.
  24. Nice data! May I ask under which Bortle sky this was taken? 🙂
  25. Nice one, I totally agree on the processing part. I'm trying to process my M101 right now, and it's really not easy. The colour balance of M101 is still a mystery to me. M13 cluster, 106 x 60 s at ISO 1600, 130 PDS + EOS 1200Da cooled + EXOS 2 PMC8.