Everything posted by vlaiv

  1. If you can't quite afford the premium of TV eyepieces, there is a quite good and considerably cheaper alternative that will get you almost there. Look at the Explore Scientific ranges. Not all of their lines are equally good, but the 68 and 82 degree lines are very good EPs. I personally have the following lineup to be used with my 8" F/6 dob: 28mm/68 (two inch EP) for low power / wide field observing, 16mm 68 as the main DSO EP, 11mm 82 degrees for higher power DSO and some planetary, and the recently acquired 5.5mm 62 degrees (it still hasn't seen first light) - for high power planetary. As John mentioned above, they also have x2 and x3 telecentric lenses that are the same thing as TV Powermates (in terms of optical design) - and reportedly quite good as well.
  2. Ok, this is somewhat telling. Although the stock 10mm EP is not the pinnacle of optical perfection, it should not be that poor, especially if you don't have a comparison point. I don't know how much observing you've done previously, but please make sure that you:
     - know how to recognize poor seeing. A 10mm EP in an 8" F/6 is not going to give you too much magnification under most circumstances, but some nights offer worse seeing than most, and in those cases x120 power can be too much. Maybe it happened that you had two such poor seeing observing sessions.
     - make sure your scope is well collimated. If your collimation is off, there is no "quality" EP in the world that will make the image look good.
     I'm saying the above just as a means to improve your observing and not to deter you from getting new eyepieces / other kit. Understanding seeing limitations and having a properly collimated scope will help with both old and new EPs.
  3. Hypothetically, you don't even need a camera with very deep wells to do that. As long as a single exposure does not saturate the sensor - in principle you could image during the daytime. Of course, it would probably take decades to do something comparable to a single one minute exposure at night.
  4. Absolutely. Here is an example: M51 surface brightness profile: As you can see, it very quickly falls off below mag 20 or so as you go away from the center. In fact, I'm almost certain that the tail of that galaxy is somewhere around mag 26-28. Here is a 2h exposure of M51 from mag 18.5 light pollution:
     It is down to processing. There is no "drowning" of the signal - signals just add up. If you have 16e/px/exposure from LP and 1e/px/exposure from the target - you will end up with an average background signal of 16e where there is no target and a 17e average signal on the target. You subtract 16e and then you are left with 0e for the background and 1e for the target (black level) - then you stretch your image and you end up with 0e background (black) and 50% luminance for example (depends on how much you stretch). The key here is setting the black point.
     If you have uneven LP - and it certainly will be uneven; how much gradient there is depends on your FOV (small FOV - small gradient, larger FOV - larger gradient, because more sky is captured and the difference in LP will be larger at the edges) - you might need to remove that gradient as well. You are right that each exposure does not have the same gradient direction, however there are algorithms to deal with this - in the process of sub normalization you can extract the planar component of the sub (interpolate the signal with a flat plane - the assumption here is that the FOV is small enough so that LP changes linearly) and equalize those as well. That way you will end up with a final linear gradient that is the same across the subs. When you stack, it will just average out to - again - a linear gradient, which you can remove because there is a mathematical dependence on that background level (a fairly easy one to model and fit).
     In fact, let me see if I can find the linear stack of the above image in my archive and I'll present you with measurements of the background signal and the faintest parts - just to see what sort of signal we are talking about. I've managed to find only the scaled (0-1 range) linear image and not the electron count one, but that will do, as we can see how much brighter the background is than the tail. The measured median pixel value for the background is 0.018510705. The median pixel value just below NGC 5195 (companion dwarf galaxy) is 0.019229632. The difference of the two is ~0.00072, or about x25.75 less signal (around 3.88% of the LP signal is the signal of that faint region). That is 3.5 mag less than the LP level.
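     Just to make that last bit of arithmetic explicit, here is a minimal sketch - the two median values are the ones quoted above, and the conversion to magnitudes is the usual 2.5 * log10 of the flux ratio:

     ```python
     import math

     # Median pixel values measured on the scaled (0-1 range) linear stack, as quoted above
     background = 0.018510705      # LP background away from the galaxy
     faint_region = 0.019229632    # just below NGC 5195

     target_signal = faint_region - background    # ~0.00072
     ratio = background / target_signal           # ~25.75 - LP level vs faint region signal

     print(f"fraction of LP signal: {target_signal / background:.2%}")     # ~3.88%
     print(f"magnitudes below LP level: {2.5 * math.log10(ratio):.2f}")    # ~3.5 mag
     ```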
  5. This sounds like you still don't have info on that? It is under profile / account settings. Choose "Signature" in the left hand menu.
  6. If you want to mention someone, yes - put '@' and start typing their screen name and wait a bit - it will show a list of members (auto complete) - select the member from the list. That way you will always "complete" the mention. Just typing someone's screen name after '@' will not always work (at least in my browser it happened a couple of times that it did not work), but the above method always works. You can also quote just a part of their post - which is handy if the post is long, has many images that you don't want to repeat, or you want to address a certain part of their post. Just select the part that you want and wait maybe a second or so - a "quote selection" popup will appear - again, click on it. As a third option, you can actually insert a link to a thread or a post - just copy the address of the relevant thread/post from the address bar and paste it in the text - the forum engine will create a nice looking excerpt from that post. Mind you - that is just for referring to something, and people in that thread will not get a notification of any kind.
  7. Not trying to prove you wrong, just want to expand on my answer above. I did hint that 300/7 is only an approximation and that for Ha this factor is even greater. The graph that you showed is direct Sun/Moon light - not something that will end up in your image if you are doing Ha imaging on the night of the full moon. At least it should not - nobody images with part of the Moon in their frame. What constitutes LP from moonlight is not the actual light of the Moon (the spectrum on your graph) - but rather the part of that spectrum scattered in the atmosphere. The same thing that gives us daylight and blue skies when the Sun is up. I must put emphasis on the "blue skies" part of the above sentence in case it gets missed.
     The thing with our atmosphere is that it scatters shorter wavelengths much more than long wavelengths. Hence the sky is blue - blue light, being of shorter wavelength, is scattered more than the green and red parts of the spectrum (unless you have a desert storm and larger dust particles in the air - then it turns brown because larger particles scatter red light better). In any case - that is the reason for Ha being shot more often than OIII. SII is usually fainter, so it suffers more from the small amount of LP that does get through the filter. Filter "width" also plays a part, and yes, Ha filters tend to be narrower than the rest.
     But you can image under LP and under moonlight - no swamping happens - signals just add. What does happen is that you get poorer SNR per total imaging time - sometimes considerably so. For example - not counting the moonlight, just plain LP - I calculated that a difference of about 2 magnitudes for me means x6 less imaging time for the same SNR. I'm currently at mag 18.5 skies and plan to move to mag 20.8 skies. One hour there will give the same SNR as 6 hours over here. The same is true for moonlight - you can image faint targets under a full moon - but nobody does it because they would spend 5-6 hours on the target and get something like half an hour's worth of data. Combining subs with different SNR is not something that is readily available in stacking software, and gradients are a pain to remove - so most people opt not to do it for such a small contribution.
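     If it helps, here is a minimal sketch of where that "about 2 magnitudes means roughly x6 the imaging time" figure comes from, assuming the sky-noise-limited regime (read noise and dark current ignored):

     ```python
     import math

     def time_ratio_for_same_snr(delta_mag):
         """In the sky-noise-limited regime SNR ~ target_flux * sqrt(t / sky_flux),
         so holding SNR constant means total exposure time scales with sky flux,
         i.e. with 10^(0.4 * delta_mag) for a sky brightness difference of delta_mag."""
         return 10 ** (0.4 * delta_mag)

     print(time_ratio_for_same_snr(2.0))   # ~6.3x more time under skies 2 mag brighter
     ```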
  8. When I think about it - I just imagine a spherical wave bouncing all over the place - it then interferes with itself. This of course means two things: each path must be straight (no bent paths or curves), and if we assume that the "ripple" is a wave packet - meaning it has a "length" of a certain number of wavelengths - it is also "limited" in time, so we can't sum paths that won't converge at the point of interest in the given time.
  9. How about time? That is one of the aspects that confuses me - do we add all trajectories that can add up with the speed of light being constant for the duration of the photon "wave-packet"? What about those before / after? Do they add to their own probability sum? (Do we have different sums depending on total travel time?) Do we add only straight paths? Some of the graphical explanations that I've seen involve "curved" trajectories for photons. Others, on the other hand, when saying "we need to sum up all trajectories", mention nebulous things like - even photons going to the Andromeda galaxy and back.
  10. Is this for a reflective slit? I can't find any reference on the net for such slides (I have no idea what 4/92 stands for). Could you expand on it please?
  11. I have - in a 5" scope! Well - it was not color as we normally see it - it was more an "impression" of the color, or rather a feeling that leaves you wondering if you saw color at all, or whether your mind is playing tricks on you by relating what you see to images of the object you saw before and knowing in principle what color it should be (mind you - it did not happen to me on other objects, regardless of the fact that I also saw and remembered their color images). Part of the trick when trying to see color is not to get dark adapted. Once you get dark adapted - even light levels that should produce a color response suffer contrast loss and you start not to see color at the level you should. This is also important when observing planets - you will lose both color fidelity and perception of detail if you fully dark adapt. In fact, I think it is related to this: https://en.wikipedia.org/wiki/Purkinje_effect
  12. That is quite correct. The issue with our vision is that it is not linear - that is why we have magnitudes for stars (logarithmic dependence) and also why there is gamma on our displays. When we see something "twice" as bright - that does not mean it is twice the intensity of the light. This hurts contrast even more than the simple signal addition that the sensor experiences (LP + target photons give the signal level). With sensors, that LP level is just the black point on the histogram in case there is no gradient. In the gradient case it's a bit more complicated but in principle the same - a level of signal that will be subtracted / removed. The biggest issue is the associated noise. Since that signal also comes in the form of photons, and those hit randomly (a Poisson process) - there is noise that one can't remove because it's random, and it is equal to the square root of the signal level. The more LP signal there is - the more associated noise there will be. What filters do is rather simple - remove unwanted signal and keep wanted signal. Removing the unwanted signal brings back the "black point" (and also makes gradients less of a hassle) - but it also removes the noise from that signal (the square root of 0 is 0 - no signal, no associated noise).
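     A tiny illustration of that point (shot noise only, the numbers are made up): the LP level itself subtracts out, but its square-root noise stays and drags the SNR down:

     ```python
     import math

     def snr(target_e, lp_e):
         """Shot-noise-only SNR per exposure: the LP level can be subtracted,
         but its Poisson noise, sqrt(signal), cannot."""
         return target_e / math.sqrt(target_e + lp_e)

     print(snr(target_e=100, lp_e=0))      # 10.0 - no LP
     print(snr(target_e=100, lp_e=1600))   # ~2.4 - same target signal, much lower SNR
     ```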
  13. Never thought of that, but indeed - it depends on where you look from - the southern or northern hemisphere.
  14. It works because all other imaging "works" when the full moon is out. Let me explain. The issue with the full moon is light pollution. For visual it reduces the contrast, making the target almost invisible (and in some cases invisible) to the naked eye. Not so for photography. The sensor will gather both photons from the target and photons from the LP. The issues we have when imaging under LP can be summarized as:
     - annoying gradients (LP is not equal in all parts of the image and produces a gradient)
     - lower SNR. This is the important part. We don't need the LP signal and we can remove it from the image by levels (and leave only the target), or in more fanciful ways that deal with gradients as well. What remains is the shot noise associated with that signal - you can't remove that. The additional noise lowers our SNR because it combines with other noise sources (read noise, dark current noise and shot noise from the target).
     The whole issue with LP can be overcome by using longer exposure - you can get very good images in heavy LP provided that you image for enough time. Back to the full moon and narrow band. If you are using narrow band filters that have a band pass of about 7nm then you are in fact doing the following:
     - passing all the light from the target at the particular wavelength (Ha for example)
     - cutting down all other light. This includes the broadband LP signal.
     If we consider the 400-700nm range that is used for imaging - it is 300nm wide. With a narrow band filter we remove everything except 7nm - or in other words we reduce LP by 300/7 = ~x43. In reality we reduce LP even more because LP is strongest at other wavelengths (even Moon light). x43 is about 4 magnitudes of difference, so you are making your sky 4 magnitudes less bright. So if the full moon puts you in Bortle 9 skies (mag 18 skies) - using a narrow band filter returns you to Bortle 1 (mag 22 skies).
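     The 300/7 figure converted to magnitudes, as a quick sketch of the arithmetic above:

     ```python
     import math

     lp_reduction = (700 - 400) / 7                 # ~43x less broadband LP through a 7nm filter
     delta_mag = 2.5 * math.log10(lp_reduction)     # ~4.1 mag darker sky background

     print(f"x{lp_reduction:.0f} reduction = {delta_mag:.1f} magnitudes")
     print(f"mag 18 moonlit sky -> roughly mag {18 + delta_mag:.1f} effective sky")
     ```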
  15. If by CA filter you mean a filter that will lessen chromatic aberration, then you can use any of the following:
     - A simple Wratten #8 filter. This works very well, but casts a yellow tint on the view.
     - Baader Contrast Booster filter - this works well on an F/10 achromat on the Moon (I was really surprised) - probably on other planets as well. Not as good as I expected on bright stars.
     - For complete removal of any CA - Baader Continuum filter. This is rather a "specialty" filter usually used for white light solar observing. It imparts a strong green cast on the image. I found that it transformed my F/5 achromat into a spectacularly sharp moon scope (although, like I said - green cast). It is probably very useful for double star observing as well (it helps with seeing besides CA).
     - Look also at the Baader Fringe Killer and Baader Semi APO filters (I used neither and opted for the Baader Contrast Booster instead on my F/10 achro).
  16. Why do you worry about luminance star cores? As for parts of the target that get blown out - use a few shorter exposures. Stack them in a separate stack aligned with the main stack. Multiply the short stack by the appropriate value after stacking (the ratio of exposure lengths) and "copy/paste" the blown out region of the target. Star cores will certainly reach saturation in luminance after any sort of stretch, even if they are not blown in the linear data. There is simply no way around it. The dynamic range is just too great to be able to see both the faint stuff and star cores without blowing out the latter. Where people go wrong is color composition. If you compose your image prior to the stretch - you will end up with stars that are white in the center. The best way to do color composition and retain star color is to:
     1. Make sure your RGB subs are not saturated in star cores - use the same technique for each channel as described above - take a few short exposures in each channel and mix them in with the long exposures after stacking. There is a simple way to do that via pixel math - "replace the long stack pixel value with the scaled short stack value if it is over some threshold - like 90% of max ADU" (see the sketch after this list).
     2. Stretch your luminance data without color composing until you are satisfied - you can even do things like sharpening / denoising and such at this stage to remove luminance noise.
     3. Out of the linear, wiped color channels create three images that contain the RGB ratio rather than actual color values. First do color calibration. Next stack the R, G and B channels via max stacking into a temporary image and then divide the R, G and B channels by this temporary image. You must be careful not to have negative or zero values in your RGB images - so scale them to 0-1 (or just above 0, to 1) prior to making the max stack. After you get the ratio images - apply gamma correction (simple gamma 2.2 is enough) to each of them.
     4. Produce the R, G and B channels of the final image by multiplying the stretched luminance with each of the R, G and B ratio images. Voila, star color in the cores will be accurate and not blown out.
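     Here is a minimal numpy sketch of the pixel math from step 1 - the array names and numbers are made up for illustration, and the stacks are assumed to be already aligned and in the same units:

     ```python
     import numpy as np

     def blend_hdr(long_stack, short_stack, exposure_ratio, threshold):
         """Wherever the long stack exceeds the threshold (e.g. ~90% of max ADU),
         substitute the short stack scaled by the ratio of exposure lengths,
         so blown out regions get valid, unsaturated data."""
         return np.where(long_stack >= threshold, short_stack * exposure_ratio, long_stack)

     # Toy example: 300s vs 30s subs on a 16-bit ADU scale
     long_stack = np.array([[1200.0, 65000.0], [20000.0, 64000.0]])
     short_stack = np.array([[120.0, 5500.0], [2000.0, 6100.0]])
     print(blend_hdr(long_stack, short_stack, exposure_ratio=10, threshold=0.9 * 65535))
     ```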
  17. Not even going to pretend that I understood what you just said - a quick Google search on "curved spacetime well defined time coordinate" yielded this result: https://arxiv.org/pdf/1401.2026.pdf The paper is titled "Quantum fields in curved spacetime". Might be of interest to someone.
  18. What seems to be the problem with incorporating curved space time into quantum field theory? Too small a contribution of gravity to make a significant impact within the limits of the calculations?
  19. I wonder why it is not a "known" thing then? Many people use CCD sensors that are based on "consumer" chips. In fact dark scaling depends on a stable bias and many people do it. In any case - it's worth a try to take subs with a bit of time in between.
  20. This should be a rather simple but somewhat time consuming activity. You'll need something like an hour or two of your time under the stars. The procedure would be the same as preparing a PEC curve - disable any PEC if you have it, fire up PHD2 and do calibration, but disable guide output in PHD2. Make sure guide logging is enabled in PHD2 and run a "guide" session for about an hour or so (the HEQ5 has a worm period of 638s if I remember correctly - and you want at least a couple of cycles to get a good PE curve). Once you are done, you can load the PHD2 guide log in software called PECPrep - part of the EQMOD project. It is used for analysis of PE data and generation of PEC curves. A simple screenshot of the PE curve in PECPrep is all you need. In fact - maybe just the PHD2 log of this session is all that potential buyers will need - they can use either PECPrep or some other software to analyze it.
  21. So no one has a clue what this might be? Could someone do the same measurement with their bias subs on a CCD camera? I'm inclined to believe that this might not be a fluke and that this stuff can happen to people. Something like this happening on occasion could well be the solution to the flat calibration mystery that was never solved but happens sometimes to @ollypenrice.
  22. No, you would not feel gravity instantly. Imagine you are in a hollow center of a planet and there is a shaft leading to its surface. Inside that central cavity you would not feel any gravitational pull. Once you start climbing up the shaft you would very gradually start feeling a gravitational pull. In fact, the gravitational pull that you would experience at a certain height inside the shaft would be the same as if there were no mass "above" you - only the mass "below" you (only mass that is closer to the center of the planet than you are would exert gravitational pull on you - everything that is at a larger distance would act as another hollow sphere, thus producing no gravitational pull). This leads us to a very interesting conclusion - gravity is strongest on the surface of the planet; once you start going higher away from the planet, or back down the shaft, it starts reducing.
     I think this is related to how the brain works - or rather how "understanding / knowing" works. In order to know something we need to establish some sort of equivalence - a mapping of the phenomenon to something that ultimately boils down to our experience. For example - it is much easier to accept that two electrons repulse one another because most people can relate to there being a virtual photon exchange - a momentum transfer. At some point this boils down to "hit one billiard ball with another and it will bounce off". Most people don't have any issues with Newtonian gravity because they relate it to "fields" and these acting as some sort of elastic cord between bodies that pulls them together. Once you start talking about space time curvature - all analogy goes down the drain.
     Imagine the following: when most people hear about space time curvature - they expect space time to actually be "curved", so a spacecraft orbiting a planet is following this "curve". This means that all points along that trajectory somehow form a "curved straight line" or something. But at the exact same time and at the exact same spatial coordinate - shoot a beam of light, or put an object moving at a higher velocity. It will no longer follow the same path - so there is no actual "curve in space time" (in the sense we started thinking about it) - confusion sets in, there is nothing to relate to and we "lose knowledge" - or rather we conclude that we don't know / understand it.
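     The hollow-sphere picture above is easy to put into numbers for an idealized uniform-density planet (the uniform density is my simplifying assumption - real planets are denser toward the core, which changes the profile inside the planet, but not the shell argument itself):

     ```python
     import math

     G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

     def g_in_shaft(r, planet_radius, planet_mass):
         """Acceleration at distance r from the center of a uniform-density planet.
         Inside, only the mass enclosed within r pulls on you (outer shells cancel),
         so g grows linearly with r; outside it falls off as 1/r^2."""
         if r == 0:
             return 0.0
         if r <= planet_radius:
             enclosed_mass = planet_mass * (r / planet_radius) ** 3
             return G * enclosed_mass / r ** 2
         return G * planet_mass / r ** 2

     R, M = 6.371e6, 5.972e24  # roughly Earth-like radius (m) and mass (kg)
     for frac in (0.0, 0.5, 1.0, 2.0):
         print(f"r = {frac:.1f} R -> g = {g_in_shaft(frac * R, R, M):.2f} m/s^2")
     ```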
  23. That one is pure beauty, but with a price to match. If it's a well behaved sensor (in terms of amp glow / darks / FPN and calibration in general) - it's going to be a real hit (for anyone who can justify the cost). My only concern is data size. A single calibrated frame is going to be around 234MB or so.
  24. That price is excluding VAT. I'm outside the EU so exported goods are sold without EU VAT. The price is actually about 32-33% higher for me than that (10% customs fee and then 20% VAT on all of it, including postage).