

Posts posted by vlaiv

  1. 46 minutes ago, Adam J said:

    It matters for saturation on targets like the M42 core and means you don't need multiple exposure lengths; saturation is saturation irrespective of stacking. I did say 12 (ok, 11.88) stops above, so I am confused by that comment. Also, I did not mention e/ADU at any point, just read noise at a given gain. You are right about the full well being 5k; it's a log chart, so not the easiest to read at a glance.

    Just wanted to point out that dynamic range is not really something people should concern themselves with.

    In fact, neither is read noise, provided it is in a reasonable range. Of course, it is better to have a camera with lower read noise, but that is much more important for planetary imaging, for example, where exposure lengths are very short.

    With DSO imaging there will be little difference between a camera with 1.3e read noise and one with 2e read noise, as the total impact can be controlled with sub duration. Let's say you want to image for a total of 4h. If you use 2 minute exposures on the 1.3e camera, what should the exposure length be on the 2e camera to get an equal result (everything else being equal)? That is rather easy to calculate. With 2 minute exposures you will have a total of 120 exposures, so total read noise will be sqrt(120) * 1.3 = ~14.24e. If you want the same total read noise from the 2e camera, you need to shoot (14.24 / 2)^2 = 50.7 exposures - let's round that up to 51 - so your exposure length needs to be 240 / 51 = ~4.7 minutes (this is the same as scaling the sub length by (2 / 1.3)^2, the ratio of the read noises squared).
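    A minimal sketch of that calculation (the function name is mine; the numbers are the ones from the paragraph):

```python
import math

def equivalent_sub_length(total_minutes, sub_minutes_a, read_noise_a, read_noise_b):
    """Sub length camera B needs so that its total stacked read noise over
    the same total imaging time matches camera A's."""
    n_subs_a = total_minutes / sub_minutes_a
    total_read_noise_a = math.sqrt(n_subs_a) * read_noise_a  # adds in quadrature
    n_subs_b = (total_read_noise_a / read_noise_b) ** 2
    return total_minutes / n_subs_b

# 4h total, 2-minute subs on a 1.3e camera vs a 2e camera
print(round(equivalent_sub_length(240, 2, 1.3, 2.0), 2))  # 4.73
```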

    Due to the nature of the objects we shoot and the light levels involved, we should not base our sub duration on the saturation point of the sensor. Yes, for most targets one might in fact want to take a few short filler exposures to avoid clipping star cores - that occurs almost always, not only on bright targets like M42 - and I don't see anything wrong with it (it is not a shortcoming of the sensor but rather part of a certain workflow).

    Hence, dynamic range is not really significant in DSO imaging. It is important in daytime photography - where we deal with single shots, and more dynamic range means more options in post processing (exposure correction, displaying things hidden in shadows, etc ...).

    In any case, I agree with you that this is interesting sensor, but I would not view it as a competitor to ASI183 - rather as a complement. It allows people to choose between sampling rates / resolutions for their particular setup without too much of a price difference for the same FOV.

     

  2. 3 minutes ago, Adam J said:

    So it's got very low dark current and still over 10 stops of dynamic range at 30dB gain, which in turn will give you about 1.2e read noise. Though I would most likely run it at gain 200 and 1.3e read noise for an incredible 12 stops of dynamic range and ~8k full well.

    That is very very good. 

    Adam 

    Not sure your calcs are right. At gain 200, e/ADU is about 0.25 (maybe closer to 0.3). With a 14-bit ADC you can only record about 5000e worth of signal, so not really an 8k full well. Dynamic range is indeed about 12 stops (~4915 / 1.3 = ~3780; log base two of that = ~11.88), but I'm not sure that is meaningful in AP - since you get a much larger dynamic range by stacking subs (and hence you can target your dynamic range).
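    For reference, the ADC-limited full well and resulting dynamic range can be computed like this (function names are mine; 0.3 e/ADU assumed, as in the text):

```python
import math

def adc_limited_full_well(adc_bits, gain_e_per_adu):
    """Max recordable signal in electrons when the ADC, not the pixel,
    is the limit."""
    return (2 ** adc_bits) * gain_e_per_adu

def dynamic_range_stops(full_well_e, read_noise_e):
    """Dynamic range in stops (powers of two)."""
    return math.log2(full_well_e / read_noise_e)

full_well = adc_limited_full_well(14, 0.3)       # 4915.2 e, i.e. ~5k, not 8k
dr = dynamic_range_stops(full_well, 1.3)         # ~11.9 stops
```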

  3. 2 minutes ago, Louis D said:

    So this aberration doesn't scale linearly with magnification like coma in a Newtonian?

    I'm not sure - it is an interplay between the scope's field curvature and the "expected" curvature of the flattener. It usually manifests in the corners of the image and can't be seen as you move toward the center - this is on sensors with a diagonal of about 20-30mm, so the affected area is usually 10+ mm from the optical axis.

  4. 3 minutes ago, Louis D said:

    To test your theory, check my image below of my 12mm Nagler T4 which focuses 19mm from the shoulder in two inch mode.

    That is not going to help much - it's a 12mm eyepiece, which means the field stop is about 17.1mm, so only about 8.6mm from the axis - about a third of the distance from field center compared to the edge of a 46mm field. Even if you introduce a large spacing error for the flattener, that part of the field will not be significantly distorted.

  5. 8 minutes ago, Louis D said:

    They were taken through my f/6 AstroTech 72mm ED with a TSFLAT2 field flattener ahead of the 2" GSO 99% dielectric diagonal spaced at 15mm with an SCT to M48 thread adapter (original diagonal nosepiece removed) with about 35 feet of separation indoors between the target and the scope's objective.  As I said above, it's clear that certain camera/eyepiece combinations will yield sharp images to the diagonal limits of the camera (a Galaxy S7 phone camera).  Check both edges for sharpness because I sometimes ended up with one or the other being slightly sharper due to camera tipping or centering issues.  There's very little residual field curvature or edge aberrations from the objective itself.  It's also unobstructed, so that's one less variable when it comes to evaluating eyepieces.  The images I captured pretty closely resemble what I saw through each eyepiece with my eyes while wearing eyeglasses to correct for my strong astigmatism.  Each eyepiece was focused for infinity corrected vision to hopefully show the field stop sharpness at its best since focusing for near or far sightedness can move the image plane ahead of or behind the field stop location causing it to look fuzzy.

    I don't think that distance will be much of an issue. It can only contribute a bit of spherical aberration, but with most of these eyepieces and their focal lengths the impact is too low to show in images. The field stop will be in focus - close focusing just moves the focal plane further out but does not change the focal plane of the eyepiece (which is ideally at the field stop, to render it sharp).

    On the other hand, use of a field flattener that has an exact operating distance could be an issue for edge performance with some of the eyepieces. You did get about the right spacing for it - it should sit about 128mm from the focal plane, and the GSO diagonal adds 110mm of optical path + 15mm for the SCT/M48 adapter, which makes about 125mm total - let's call that close enough. This would generally be OK if all eyepieces had their focal plane right at the shoulder - but eyepieces vary quite a bit, and some may need up to 10mm of adjustment either way - and that would make the field flattener introduce significant astigmatism, which would show up as EP astigmatism.

  6. 5 minutes ago, Louis D said:

    Nope, not the camera, it's the eyepiece.  Notice how sharp the 40mm Meade SWA is right almost to the edge with nearly the same apparent field of view.  You can also see that the 30mm ES-82 is fairly sharp nearly to the edge.  I tilted the camera to get the edge in sharpest focus possible to show the field stop sharpness in that little extra bit of image to the right of the main image since my camera only goes to about 75 degrees on the diagonal.  I bought a second super wide field camera to capture the entire field in one go in the "full width" images.  However, it is a lower resolution camera and I had to up-sample the images to match the image scale of the 75 degree camera, so the utility of those images isn't all that great.

    There just is no free lunch when it comes to wide field and good image correction across that field.  If you want both at the same time, it means getting a big, heavy eyepiece.  Notice how nicely corrected the Baader Scopos Extreme is.  However, its weight slots it between my 12mm and 17mm ES-92 eyepieces, which is to say very heavy.  Your best bet for well corrected and lighter weight is the 30mm APM UFF.  However, it is fairly expensive, though still cheaper than TV Panoptics.

    I was under the impression that you were using a DSLR and a lens. In case you have one, we can easily calculate the focal length of the lens you would need to capture any eyepiece up to its field stop. Let's take 100 degrees as the maximum AFOV currently available. If you use an APS-C sized chip (most likely in a consumer DSLR), that is about 28mm across the diagonal. You would need something like a 12mm lens to cover it.
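    That figure comes from simple rectilinear-lens geometry; a sketch (the function name is mine):

```python
import math

def lens_focal_length_mm(sensor_size_mm, afov_deg):
    """Focal length at which a rectilinear lens spans the given apparent
    field of view across the given sensor dimension."""
    return sensor_size_mm / (2 * math.tan(math.radians(afov_deg / 2)))

# 100 degree AFOV across a ~28mm APS-C diagonal
print(round(lens_focal_length_mm(28, 100), 1))  # 11.7
```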

    Btw - what scope did you use it with?

    And what was the distance to the rulers?

  7. I did not find anything wrong with the GSO 32mm in an F/6 8" dob, but then again I did not look for problems either. It served me well for a long time - I still have it, although I don't use it anymore because I replaced it with the 28mm 68 degree ES.

    The latter gives me a better view because of the higher magnification (it makes the sky darker under LP) and of course the nicer 68 degree AFOV vs ~50 for the GSO.

    I also thought about getting the largest possible FOV for another scope (an F/10 1000mm one), so I examined all the available options. My initial idea was to use a x0.67 reducer with the F/10 scope and the 28mm EP to get maximum TFOV - but that proved rather difficult, as the distance needed is 85mm - too small for 2" diagonals, larger than a 1.25" one needs (and the reducer needs 2" to be able to show that much sky).

    In any case, getting the largest possible TFOV requires a rather large field stop - around 46mm. Most eyepieces are reported soft around the edges - but that is no wonder, since most scopes are not well corrected over such a large image circle - most perform well up to 30mm or so. That makes me wonder whether it is at all feasible to get a large FOV at long focal length, or whether it is better to simply use a scope with a short focal length in the first place.
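    True field of view follows directly from the field stop diameter and the scope's focal length (small-angle approximation; the function name is mine):

```python
def true_fov_deg(field_stop_mm, scope_focal_length_mm):
    """True field of view in degrees from eyepiece field stop diameter and
    telescope focal length (small-angle approximation)."""
    return field_stop_mm / scope_focal_length_mm * 57.2958

# 46mm field stop in the 1000mm scope, native and with the x0.67 reducer
print(round(true_fov_deg(46, 1000), 2))         # 2.64
print(round(true_fov_deg(46, 1000 * 0.67), 2))  # 3.93
```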

  8. 4 minutes ago, Deisler said:

    Thank you for your reply, Sir.

    Could you please elaborate a bit on 'instead of Barlow' -

    I assume you meant Barlow may not be that important if I can have 5/10(stock)/15/25(stock)/32 EPs? 

    I think that is what was meant above. A UHC filter is a very nice observing tool - it helps when observing certain types of nebulae, and I think it's worth having.

    A barlow is also a nice tool, but having had a couple of them, in the end I decided that I like short focal length eyepieces more than a barlow + EP combination. If you do end up getting a barlow or a telecentric amplifier lens, then you probably won't need anything below 10mm in your EP collection, unless you plan to use that particular eyepiece without the barlow - a short focal length EP + barlow gives too much magnification. I'm guilty of using it like that, but only as a "let's see what can be done" gimmick, never for regular observing.

    If you are looking for cheaper comfortable eyepieces in short focal lengths, then do have a look at these:

    https://www.firstlightoptics.com/skywatcher-eyepieces/skywatcher-uwa-planetary-eyepieces.html

    I had one of those, and while I was not particularly impressed with its optical quality, it was indeed better than the stock eyepieces and served me well until I tried better / more expensive eyepieces. Later, in discussion with other members, I came to the conclusion that I might have had a rather poor sample. Mine was the 7mm.

    For short focal length EP without barlow I recommend that you stay above 5mm for the time being.

  9. Nice image. I would personally move the white point a bit closer to the nebulosity - it will be a bit brighter with a bit more dynamic range, and stars will be sharper and pop out more.

    Here is an example of what I mean, to be clear (a quick touch-up of the image you posted, cropped - hope you don't mind):

    [attached image]

  10. If you can't quite afford the premium of TV eyepieces, there is a quite good and much cheaper alternative that will get you almost there.

    Look at the Explore Scientific ranges. Not all of their lines are equally good, but the 68 and 82 degree ones are very good EPs.

    I personally have the following lineup for my 8" F/6 dob: 28mm/68 degrees (two inch EP) for low power / wide field observing, 16mm/68 degrees as the main DSO EP, 11mm/82 degrees for higher power DSO and some planetary, and the recently acquired 5.5mm/62 degrees (yet to see first light) for high power planetary.

    As John mentioned above, they also have x2 and x3 telecentric lenses that are the same thing as the TV Powermates (in terms of optical design) - and reportedly quite good as well.

  11. 3 hours ago, Deisler said:

    But when I used 10mm EP, I found the quality is quite poor and the lack of details is disappointing, although I don't really know what benchmark looks like to be honest. 

    Ok, this is somewhat telling.

    Although the stock 10mm EP is not the pinnacle of optical perfection, it should not be that poor, especially if you don't have a comparison point.

    I don't know how much observing you've done previously, but please make sure that you:

    - know how to recognize poor seeing. A 10mm EP in an 8" F/6 does not give too much magnification under most circumstances, but some nights offer worse seeing than most, and on those nights x120 can be too much. Maybe both of your sessions happened to have such poor seeing.

    - make sure your scope is well collimated. If your collimation is off, there is no "quality" EP in the world that will make the image look good.

    I'm saying the above just as a means to improve your observing, not to deter you from getting new eyepieces / other kit. Understanding seeing limitations and having a properly collimated scope will help with both old and new EPs.

     

  12. 1 minute ago, symmetal said:

    Well that told me didn't it. :D Thanks vlaiv.  So LP puts no restriction on how much faint detail you can see. It just means you have to allocate much more imaging time to be able to reduce the significantly more noise contributed by the light pollution in order to see that detail.  And deal with the gradients of course. I stand corrected. :smile:

    So hypothetically, if you had a camera with sufficient well depth that it didn't saturate, you could do DSO imaging in the daytime if there were no clouds, (and you weren't pointing at the Sun). 

    Alan

    Hypothetically, you don't even need a camera with very deep wells to do that. As long as a single exposure does not saturate the sensor, in principle you could image during the daytime.

    Of course, it would probably take decades to achieve something comparable to a single one-minute exposure at night :D

  13. 10 minutes ago, symmetal said:

    Are you saying that you can get an image of say a mag 21 object with a skyglow of mag 18 if you image for long enough. For every 16 photons from the skyglow you'll get 1 extra photon from your target once the noise has been reduced to an insignificant level by enough imaging time.

    Absolutely.

    Here is an example:

    M51 surface brightness profile:

    [attached image]

    As you can see, it falls off very quickly below mag 20 or so as you move away from the center. In fact, I'm almost certain that the tail of that galaxy is somewhere around mag 26-28.

    Here is a 2h exposure of M51 taken under mag 18.5 light pollution:

    [attached image]

    It comes down to processing. There is no "drowning" of the signal - signals just add up. If you have 16e/px/exposure from LP and 1e/px/exposure from the target, you will end up with an average background signal of 16e where there is no target and an average of 17e on the target. You subtract 16e and you are left with 0e for the background and 1e for the target (the black level) - then you stretch the image, and you end up with 0 for the background (black) and, for example, 50% luminance for the target (depending on how much you stretch).
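    As a toy numeric sketch of that subtract-and-stretch step (numbers from the example above; numpy used just for convenience):

```python
import numpy as np

lp = 16.0                          # e/px/exposure from light pollution
frame = np.full(10, lp)            # ten pixels, all seeing the LP signal
frame[5:] += 1.0                   # the last five also see 1 e of target

calibrated = frame - lp            # subtract the measured background level
stretched = calibrated / calibrated.max()  # linear stretch to the 0..1 range
print(stretched)  # background pixels 0.0, target pixels 1.0
```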

    The key here is setting the black point. If you have uneven LP - and you certainly will - how much gradient there is depends on your FOV (small FOV, small gradient; larger FOV, larger gradient, because more sky is captured and the difference in LP is larger at the edges) - you might need to remove that gradient as well.

    You are right that each exposure does not have the same gradient direction; however, there are algorithms to deal with this. In the process of sub normalization you can extract the planar component of each sub (interpolate the background with a flat plane - the assumption being that the FOV is small enough that LP changes linearly across it) and equalize those as well. That way you end up with a final linear gradient that is the same across the subs. When you stack, it averages out to - again - a linear gradient, which you can remove because there is a simple mathematical model for that background (fairly easy to fit).
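    A minimal sketch of fitting and removing such a planar background with a least-squares fit (synthetic data; the function name is mine):

```python
import numpy as np

def fit_background_plane(img):
    """Least-squares fit of a plane a*x + b*y + c to an image,
    modelling a linear light pollution gradient."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    return (A @ coeffs).reshape(h, w)

# synthetic sub: a pure linear gradient over a flat background
yy, xx = np.mgrid[0:50, 0:50]
sub = 5.0 + 0.02 * xx + 0.01 * yy
residual = sub - fit_background_plane(sub)   # ~zero everywhere
```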

    In fact, let me see if I can find linear stack of above image in my archive and I'll present you with measurements of background signal and faintest parts - just to see what sort of signal we are talking about.

    I've managed to find only the scaled (0-1 range) linear image and not the electron count one, but that will do, as we can still see how much brighter the background is than the tail.

    Measurement on median pixel value for background is:

    0.018510705

    Median pixel value just below NGC 5195 (companion dwarf galaxy) is 0.019229632

    The difference of the two is ~0.00072, or about x25.75 less signal (that faint region is around 3.88% of the LP signal). That is 3.5 magnitudes fainter than the LP level.
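    The magnitude figure follows from the standard flux-to-magnitude relation; a quick check (variable names are mine, values from the measurements above):

```python
import math

def mag_difference(flux_ratio):
    """Magnitude difference corresponding to a flux ratio."""
    return 2.5 * math.log10(flux_ratio)

background = 0.018510705
below_ngc5195 = 0.019229632
signal = below_ngc5195 - background    # ~0.00072
print(round(mag_difference(background / signal), 1))  # 3.5
```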

     

  14. 25 minutes ago, fwm891 said:

    Feeling a [removed word] but how do I link to someone's post so they know I've responded to their question?

    I've tried putting the '@' immediately before a name but it never highlights...

    Cheers

    If you want to mention someone, then yes: type '@', start typing their screen name, and wait a bit - a list of members will show up (auto-complete) - then select the member from the list. That way the mention will always "complete".

    [attached image]

    Just typing someone's screen name after '@' will not always work (at least in my browser it happened a couple of times that it did not), but the above method always works.

    You can also quote just a part of a post - which is handy if the post is long, has many images you don't want to repeat, or you want to address only a certain part of it. Just select the part you want and wait a second or so - a "quote selection" popup will appear - click on it:

    [attached image]

    As a third option, you can insert a link to a thread or a post - just copy the address of the relevant thread/post from the address bar and paste it into your text - the forum engine will create a nice looking excerpt from that post. Mind you, that is just for referring to something - people in that thread will not get a notification of any kind.

     

     

  15. 1 hour ago, symmetal said:

    That's my take on it but am willing to be proved wrong. 

    Not trying to prove you wrong, just wanting to expand on my answer above. I did hint that 300/7 is only an approximation, and that for Ha this factor is even greater.

    The graph you showed is direct Sun/Moon light - not something that will end up in your image if you are doing Ha imaging on the night of a full moon. At least it should not - nobody images with part of the Moon in the frame.

    What constitutes LP from moonlight is not the actual light of the Moon (the spectrum in your graph) but rather the part of that spectrum scattered in the atmosphere - the same thing that gives us daylight and blue skies when the Sun is up. I must put emphasis on the "blue skies" part of that sentence, in case it gets missed :D

    The thing with our atmosphere is that it scatters shorter wavelengths much more than longer ones. Hence the sky is blue - blue light, being of shorter wavelength, is scattered more than the green and red parts of the spectrum (unless you have a dust storm and larger particles in the air - then the sky turns brown, because larger particles scatter red light better).

    In any case, that is the reason Ha is shot more often than OIII under the Moon. SII is usually fainter, so it suffers more from the small amount of LP that does get through the filter. Filter width also plays a part, and yes, Ha filters tend to be narrower than the rest.

    But you can image under LP and under moonlight - no swamping happens - signals just add. What does happen is that you get poorer SNR per total imaging time - sometimes considerably so. For example, not counting moonlight, just plain LP: I calculated that a difference of about 2 magnitudes means about x6 less imaging time for me for the same SNR. I'm currently under mag 18.5 skies and plan to move to mag 20.8 skies. One hour there will give the same SNR as 6 hours here.
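    That factor follows from the usual sky-limited assumption: imaging time for equal SNR scales with the sky flux, i.e. with 10^(0.4 * Δmag). A quick check (the function name is mine):

```python
def extra_time_factor(sky_mag_difference):
    """Factor of extra imaging time needed under the brighter sky for the
    same SNR, assuming exposures are sky-noise limited."""
    return 10 ** (0.4 * sky_mag_difference)

print(round(extra_time_factor(2.0), 1))  # 6.3
# the full mag 18.5 -> 20.8 span (2.3 mag) gives a factor of ~8.3
```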

    The same is true for moonlight - you can image faint targets under a full moon, but nobody does it, because you would spend 5-6 hours on the target and get something like half an hour's worth of data. Combining subs with different SNR is not readily available in stacking software, and the gradients are a pain to remove - so most people opt not to bother for such a small gain.

  16. 1 minute ago, andrew s said:

    The book goes into the details. What is happening is that they are using Feynman's path integral approach to give a picture of what is happening. As with classical variation methods all paths are considered and in this case all are traversed at the speed of light c.

    I don't  really like it as an image as people tend to believe it is real rather than a mathematical tool.

    Regards Andrew 

    When I think about it, I just imagine a spherical wave bouncing all over the place, which then interferes with itself. This of course means two things:

    Each path must be straight (no bent paths or curves), and if we assume the "ripple" is a wave packet - meaning it has a "length" of a certain number of wavelengths - it is also limited in time: we can't sum paths that won't converge at the point of interest in the given time.

  17. How about time?

    That is one of the aspects that confuses me - do we add all trajectories that can add up, with the speed of light being constant, within the duration of the photon "wave packet"? What about those before / after? Do they add to their own probability sums? (Do we have different sums depending on total travel time?)

    And do we add only straight paths? Some of the graphical explanations I've seen involve "curved" trajectories for photons. Others, when saying "we need to sum up all trajectories", mention nebulous things like photons going to the Andromeda galaxy and back :D

     
