Everything posted by vlaiv

  1. I doubt that you are undersampling at 1.94"/px. Look at this spot diagram: the RASA is not quite what we would call a diffraction limited system. If it were, those dots in the first row would be something like half the size of the smallest dot in the diagram (the blue one at 500nm / 10mm, for example - about 1 to 1.5 little squares in this grid). In other words, most of those spots are about 5-10 times larger than a diffraction limited 8" aperture would produce. For a diffraction limited aperture of this size, the optimal (planetary) sampling rate is about 0.26"/px, which means the equivalent sampling rate, without the influence of atmosphere and tracking, would be 1.3"/px to 2.6"/px (depending on wavelength and spot size). Add seeing and tracking and you can see how it easily shoots above 2"/px. But that is fine - the RASA 8 is not meant to be a diffraction limited imaging scope; it is meant to be a fast wide field imaging scope, and in wide fields resolution is sacrificed, so larger spots don't really matter.
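A quick numeric sketch of the sampling arithmetic above (not from the original post; the aperture and wavelength values are assumptions for an 8" system):

```python
import math

# Critical (planetary) sampling for a diffraction limited aperture:
# the cutoff spatial frequency of the optics is D / lambda, so the
# pixel scale should be about lambda / (2 * D) radians per pixel.
RAD_TO_ARCSEC = 206265.0

def critical_sampling_arcsec(aperture_m, wavelength_m):
    """Optimal pixel scale ("/px) for a diffraction limited aperture."""
    return RAD_TO_ARCSEC * wavelength_m / (2.0 * aperture_m)

aperture = 0.203          # assumed 8" aperture, in meters
wavelength = 500e-9       # assumed green light

optimal = critical_sampling_arcsec(aperture, wavelength)
print(f"optimal sampling: {optimal:.2f}\"/px")   # ~0.25"/px

# Spots 5-10x larger than the diffraction limit scale accordingly:
print(f"equivalent range: {5*optimal:.1f} - {10*optimal:.1f}\"/px")
```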
  2. Large mirror * large sensor = large speed
  3. F/ratio works as a speed measure only if (physical) pixel size is kept the same - it is not just about focal length. If you want a ratio to define speed (something per something), the best thing to use would be aperture per pixel size in arc seconds (the reciprocal of resolution in "/px) - or, perhaps easier to remember, just multiply aperture by resolution. Now we can compare different combinations of gear, like 200mm at 1"/px versus 80mm at 2"/px. 200mm at 1"/px is just 200mm per 1px per arc second, so 200 (or 200 x 1 = 200). 80mm at 2"/px is 80mm per 0.5px per arc second, which is 160mm per 1px per arc second, so 160 (or 80 x 2 = 160). The first system is 200 and the second is 160 - the first is faster by 1.25 in SNR (200/160) for a given time (in the light dominant regime), or by 1.25^2 = 1.5625 in signal strength / photons per pixel (see the sketch below). We can do a similar thing with respect to FOV, like @andrew s pointed out above, rather than per pixel - that is useful if we bin / resize the image afterwards, while this per-pixel approach is better if we want to compare systems without binning / resizing.
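A minimal sketch of that speed metric - purely the arithmetic from the post above, with illustrative function names:

```python
# "Speed" metric: aperture (mm) x pixel scale ("/px).
# Assumes the light dominant regime, as described in the post.
def speed(aperture_mm, pixel_scale_arcsec):
    return aperture_mm * pixel_scale_arcsec

a = speed(200, 1.0)   # 200mm at 1"/px -> 200
b = speed(80, 2.0)    # 80mm at 2"/px  -> 160

print(f"SNR ratio for equal time: {a / b}")        # 1.25
print(f"photons per pixel ratio:  {(a / b) ** 2}") # 1.5625
```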
  4. I don't think that there was any color adjustment done in the stacking process. It's probably my choice of reference star or something that I overlooked. Although, it is kind of strange to have such high blue values for a star that should have more red than blue.
  5. Most UV/IR filters for astronomical cameras should be more or less uniform - like, for example, these 1.25" L filters. They should not significantly reduce the green and red parts of the spectrum ...
  6. http://www.astrosurf.com/brego-sky/A_glimpse_to_the_night_sky/Articles_files/filters.pdf This is the equation: lambda(theta) = lambda_0 * sqrt(1 - (n_air / n_filter)^2 * sin^2(theta)). For 3nm filters, we need a maximum shift of 1.5nm at 656.3nm. 654.8 / 656.3 = ~0.99771446. Now we square that and subtract from 1, and we get ~0.0045658566. The refraction index of air is 1.000293 and that of glass is around 1.5, so we have 0.445 * sin^2(theta) = 0.0045658566. From this, sin(theta) = ~0.101356 = ~0.1, so the angle is 5.74 degrees. This gives an approximate ratio of ~4.975, or F/5. Interesting exercise. I expected that a longer F/ratio would be needed for such a narrow filter, but it turns out that 3nm filters are good for F/5 and slower. They can certainly be used on even faster systems, with a slight loss of aperture due to the filter acting as an aperture stop. I also calculated this for the on-axis beam - the edge-of-field beam will have an additional angle added that depends on focal length and sensor size (but it is relatively small in comparison). I started this post believing that there would be a significant difference between 3nm and 7nm on fast optics, but it turns out that for the regular F/ratios we use in imaging, 3nm is just fine.
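The same calculation as a small script, using the blue-shift formula above; the effective filter index of ~1.5 is an assumption (manufacturers quote effective indices that may differ):

```python
import math

# Interference filter blue-shift with tilt:
#   lambda(theta) = lambda0 * sqrt(1 - (n_air / n_filter)**2 * sin(theta)**2)
# Solve for the steepest allowed marginal-ray angle, then convert
# to F-ratio with F = 1 / (2 * tan(theta)).
def max_f_ratio(lambda0_nm, max_shift_nm, n_air=1.000293, n_filter=1.5):
    ratio = (lambda0_nm - max_shift_nm) / lambda0_nm
    sin2 = (1.0 - ratio**2) / (n_air / n_filter) ** 2
    theta = math.asin(math.sqrt(sin2))
    return 1.0 / (2.0 * math.tan(theta)), math.degrees(theta)

f_ratio, angle = max_f_ratio(656.3, 1.5)
print(f"max half-angle: {angle:.2f} deg, fastest F-ratio: F/{f_ratio:.2f}")
# -> about 5.8 degrees and roughly F/4.9
# (the post rounds sin(theta) down to 0.1, giving 5.74 deg and F/4.975)
```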
  7. Actually, if everything is right, the above color is how it should look in outer space, not from earth. I do have doubts about whether the above measurement is accurate. For example, I used a 5570K star (according to Gaia DR2) as a reference. It should have the following linear RGB values: 1.135763, 0.972170, 0.875711 - so linear R should be something like 30% stronger than B. The data from your camera has much stronger B than R: 68281.670806, 84433.395310, 96729.032408 - linear B is in this case 41% stronger than R. This is something that usually does not happen with OSC cameras: they usually have stronger R than B. Even the IMX571 QE graphs hint that R should be more sensitive - R has both a higher peak and a broader curve, and so does G. In the above 5570K reference star G is stronger than B, and in the QE graph G is clearly stronger than B, yet in the measured star values B is stronger than G? I don't see how this could happen unless something is wrong with either the reference star, my measurement, or the data.
  8. Well, it's not gray. Supposedly it's the above shades, if my measurements and math are correct. The reference star used was TYC 4590-1433-1 at 20:54:22.679 +78:07:05.94. Gaia DR2 quotes this star as having an effective temperature of 5570K. The corresponding linear sRGB triplet is: 1.135763, 0.972170, 0.875711. Photometric measurement of the star gives: 68281.670806, 84433.395310, 96729.032408, and the measured nebulosity linear RGB values are: 18.910516739, 14.435421944, 6.550098896. The resulting nebulosity RGB values are: 3.14548, 1.6621, 0.593 (see the sketch below). However, I'm rather skeptical about these results, for two reasons: 1. The data is not calibrated (at least flat fields were not applied - this makes background removal very difficult and probably not precise). 2. The data is 16 bit. There are 6 hours of data in 4 minute exposures. That is 90 subs stacked, which is a 6.5 bit improvement (log2(90)) over the camera bit count of around 14 at gain 100 - so it's about 20 bits, and 4 bits of precision were lost to the 16 bit format.
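For reference, a sketch of the calibration arithmetic implied above - per-channel scale factors derived from the reference star, applied to the nebulosity measurement (the 1e4 factor is just an arbitrary normalization to match the quoted numbers):

```python
# The measured star acts as a white reference:
# scale = expected / measured star flux, applied per channel.
star_expected = [1.135763, 0.972170, 0.875711]        # 5570K, linear sRGB
star_measured = [68281.670806, 84433.395310, 96729.032408]
nebula_measured = [18.910516739, 14.435421944, 6.550098896]

scale = [e / m for e, m in zip(star_expected, star_measured)]
nebula_rgb = [n * s * 1e4 for n, s in zip(nebula_measured, scale)]

print([round(v, 4) for v in nebula_rgb])  # ~[3.1455, 1.6621, 0.593]
```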
  9. I'll certainly give it a go - not sure how accurate this will be.
  10. Well - we have a data set to work with, right? One just needs to measure a reference star in the image (or a few) and measure the light intensity in each channel in the nebula - on linear, background-wiped data. This will tell us what sort of color we can expect on that nebula. Here is a handy website for star temperature to color conversion: http://www.brucelindbloom.com/index.html?ColorCalculator.html One just needs to enter the star color temperature and the right settings to get the linear RGB ratio (make sure you select D65 for sRGB and gamma 1.0 to leave it linear). A rough scripted alternative is sketched below.
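For those who prefer to script it, here is a rough offline approximation of that calculator: a Planck spectrum integrated against analytic fits to the CIE color matching functions (the piecewise-Gaussian approximation from Wyman et al. 2013), then converted from XYZ to linear sRGB (D65). Treat it as a sketch - results will differ slightly from the calculator's:

```python
import math

def g(x, mu, s1, s2):
    # piecewise Gaussian: different sigma on each side of the peak
    s = s1 if x < mu else s2
    return math.exp(-0.5 * ((x - mu) / s) ** 2)

def cmf(l):  # approximate CIE x/y/z color matching functions, l in nm
    x = 1.056*g(l,599.8,37.9,31.0) + 0.362*g(l,442.0,16.0,26.7) - 0.065*g(l,501.1,20.4,26.2)
    y = 0.821*g(l,568.8,46.9,40.5) + 0.286*g(l,530.9,16.3,31.1)
    z = 1.217*g(l,437.0,11.8,36.0) + 0.681*g(l,459.0,26.0,13.8)
    return x, y, z

def planck(l_nm, T):  # relative blackbody spectral radiance
    l = l_nm * 1e-9
    return l**-5 / (math.exp(1.4388e-2 / (l * T)) - 1.0)

def temp_to_linear_rgb(T):
    X = Y = Z = 0.0
    for l in range(380, 781):           # integrate over the visible range
        p = planck(l, T)
        xb, yb, zb = cmf(l)
        X += p * xb; Y += p * yb; Z += p * zb
    # XYZ -> linear sRGB (D65 white point)
    r  =  3.2406*X - 1.5372*Y - 0.4986*Z
    g_ = -0.9689*X + 1.8758*Y + 0.0415*Z
    b  =  0.0557*X - 0.2040*Y + 1.0570*Z
    return [c / g_ for c in (r, g_, b)]  # normalize so G = 1

print(temp_to_linear_rgb(5570))  # compare with the 5570K triplet in this thread
```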
  11. I agree that the best place to start would be measurement. There is, however, reason to believe it could be gray(ish). Look at these curves: 6000K is white. This is the power / energy curve - we need to convert it to a photon curve, but it is clear that the "red" side of the spectrum is going to dominate (and it dominates for 4000K and less anyway - even in this graph). Now we take such light, which is dominant in red - has a higher photon count there - and scatter it: we scatter more blue and less red. We have more red and less blue to start with, and we scatter more blue and less red - my reasoning is that it will sort of balance out and leave us with about the same amount of blue and red in the scattered light - a sort of more * less = less * more situation (a rough numeric check is below). Equal amounts of blue and red (and about the same amount of green, as it sits "between" these two extremes) will create a rather balanced RGB composition - a gray color, or at least grayish.
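A rough numeric check of that reasoning (T = 4500K as an assumed orange-ish star and the two sample wavelengths are my choices): the two factors don't cancel exactly, but they clearly pull in opposite directions:

```python
import math

def photon_flux(l_nm, T):
    """Relative blackbody *photon* spectrum (energy spectrum / photon energy)."""
    l = l_nm * 1e-9
    return l**-4 / (math.exp(1.4388e-2 / (l * T)) - 1.0)

T = 4500.0
blue, red = 450.0, 650.0

source_ratio = photon_flux(blue, T) / photon_flux(red, T)
rayleigh_ratio = (red / blue) ** 4      # Rayleigh scattering goes as 1/lambda^4

print(f"source photons, blue/red:      {source_ratio:.2f}")   # < 1, red-dominant
print(f"Rayleigh efficiency, blue/red: {rayleigh_ratio:.2f}")  # > 1, favors blue
print(f"scattered light, blue/red:     {source_ratio * rayleigh_ratio:.2f}")
```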
  12. But are those really "brown" nebulae - or simply gray nebulae, for the reasons I described above, that turn brown once their light passes through our atmosphere?
  13. A nebula will make stars behind it look redder - in the same way that our atmosphere makes our Sun look yellow/orange/red (depending on how much atmosphere the light passes through). The color of the nebula, on the other hand, will behave like our atmosphere - blue - or rather, light from stars (not only those directly behind it, but in every direction) will be scattered off it, and the blue components will be scattered more than the red. We have a G2V star illuminating our atmosphere - it is white, and the star turns yellow while the sky turns blue. I linked above how frequent stars of each color are - about 96-97% of all stars have a yellow / orange / red tone to them, which means their light is not white. Once you take such light and scatter it, it will turn "whiter" - the same process that makes our sky blue as it turns white light into blue. This process of scattering moves light "to the right" in the following diagram. So if the original light is white - the middle of this diagram - our atmosphere will scatter such light and its color will move to the right, and thus we end up with a blue sky. If the original light is orange and a nebula scatters it, it will also move right - it can end up being pale yellow, white or very light blue. This is what I would expect from a nebula that scatters the mostly orange light of surrounding stars - to be on average "gray" (or white - same thing, different intensity).
  14. I'm actually at 45 degrees North - but thanks for pointing that out - will take it into consideration.
  15. I don't see why it would scatter blue light in the direction of the stars but not in our direction. This is a possibility.
  16. The deck will be surrounded by walls on three sides (at least a meter and a bit high) and there will be a sort of window section towards the main volume. This will effectively make the main volume act like a large "warm room". Something like this: This is so I can monitor the scope from below. That is actually a very good idea - I was just thinking about it. Maybe it could even be a "sliding" type door - there is enough room after the stairs to incorporate it into the floor itself.
  17. When you say 2 metre inside dimensions and later say 2.5 metres squared, what do you mean exactly? I have an object that is 4.7m x 2.7m inside dimensions (5x3m ground allowance, minus minimal 15cm walls on each side) and I plan to make a "gallery" style observing deck on less than half of that - this is my current idea. Now, I have the option to go for a 2m x 2m deck + 0.7m for the stairs to get to it, or perhaps to go with 1.8m x 1.8m and have both more room for the stairs - 0.9m in this case - and more room in the main "volume" of the object, which should double as garden shed and observatory. In the first case, I'll have a 2.7 x 2.7 main room + some small room under the gallery - the gallery will sit at 1.4m height, so there is enough height to put a desk underneath to serve as an imaging station, and there will be a bit of storage room as well. In the second case, I'll have proper width stairs (well, better than 0.7m in any case) and also a bigger main "room" - 2.9 x 2.7 - but a bit less space on the deck, as it will only be 1.8 x 1.8m. I'm also wondering if I could get a bit more height below the gallery - if I could, then I think it would even be possible to sit below it, with a table and monitor under there. This is the floor plan in my permit: Side plan: I have quite a bit of "room" to do things differently - for example, I don't have to have a roof "slab" - I can go with just a regular roof, and the walls don't need to be 25cm thick - I can do 15cm ones (and I plan to, to increase inside space). Here is what I envisioned: the two obvious problems are the size of the gallery / deck above and the height of the deck. Half of the roof will slide over the other half, so I'll get an open deck like that. While I won't be building the whole thing yet, foundations will be done in about two weeks' time (if everything goes as planned - not something that happened often this year) and I need to know where to put the pillar anchor (that will be about 1 cubic meter of concrete with reinforcement, and a 200mm reinforced concrete pillar will probably be OK at that height).
  18. Don't try to push it too hard if it won't let you. This is: Registax RGB align, Registax wavelet sharpen (just a bit), Gimp scale down + brightness + contrast adjustments.
  19. Good point - that is just what I would expect from a "color corrected" workflow: while the data is still in the linear stage, measure star RGB ratios and compare them with the expected linear RGB ratios. I did that on several occasions and I even did color calibration against a "ground bound" color reference. In both cases I get a significant color shift towards the red part of the spectrum due to the atmosphere. For example, here is a ground reference on Jupiter (which turned out very yellow):
  20. Atmospheric scattering will. And my point is - what do we want to show in an astro image: the object as viewed from the earth in the given imaging conditions, or the object as it is? If we want to show the object as it is - without the filter imposed by our atmosphere - we need to do the inverse of what atmospheric scattering does to color: we need to boost the blue part of the spectrum - do a color conversion on our image. Once we do that, we get the actual color of the nebulosity - which I believe is much more gray than depicted in the above image. Why do we have white balance on our cameras? Because we want to see the "actual color of the object" (in this particular case, the color of the object under the most common lighting conditions - daytime lighting). If you look at this image, and you look at the left side - you would not say: oh, I'm happy with this image, look how nice this person's orange skin tone is! You know that this is not what the skin color is, and you would correct the image with white balance. In the above case, we might not know what the actual nebulosity color is, but we do know the filters that are at play, and if we want to find out and depict the "natural" color of the nebulosity, we need to color correct for what we know is happening - atmospheric scattering and possibly interstellar reddening.
  21. It happens both in the nebula and in our atmosphere, I would say - with a difference. Scattering in the nebula is what produces the glow of the nebula - so one could expect shorter wavelengths to be dominant there, as they are scattered more. Most stars around a nebula will be on the longer part of the spectrum, since bluish stars are very rare and the majority of stars in the universe have a reddish color - 97% of stars are in fact on the yellowish / orange side. The nebula scatters mostly the blue part of that, so it sort of balances out (or maybe not - I'm just guessing here), producing a gray color. Bright reflection nebulae that have a bluish color have it because a rather luminous, but rare, bluish star illuminates them. Then this gray color hits our atmosphere and another round of scattering occurs. This time blue light is scattered away, leaving a yellow/orange cast on the color in question (if it was gray to start with). You know - this thing happens:
  22. The way I see it, there are at least three "correct" renditions, depending on what you want to show.
      1. The image from the surface of the earth. I think this one is probably the least interesting, as it does not leave much room for comparison. Take an image of an object at an elevation of 100m with the target at an altitude of about 45 degrees, and compare it with someone else's image taken at 2500m above sea level with the target at zenith - there will be a difference in colors, probably a significant one. In general we want to eliminate the atmosphere from our images, I would say, although it is a correct rendition in the sense that the object really has that color when observed from the surface of the earth (if color correction has been applied and all that ...).
      2. What the object would look like imaged from a spacecraft orbiting the earth or the sun, or from somewhere in our stellar neighborhood. Here you need to compensate for earth's atmosphere and remove its influence.
      3. What the object would look like if imaged from relative proximity - meaning its own stellar neighborhood. Here you need to compensate for earth's atmosphere as well as for interstellar reddening, if there is any in the direction of the object.
      I think that all of the above is theoretical rather than "mandatory" or even recommended for AP, since most people don't know how to properly process color from their camera to start with, let alone do the above corrections. The camera sensor does not see color the way we see it, and the first thing to be done is a color transform - usually split into white balancing and a color correction matrix (I learned that recently). The next thing is to apply the proper sRGB gamma transform if we want to encode images for display on computers etc. (a minimal sketch of this order of operations is below). Most people don't bother with those two steps and adjust curves "by hand" until they are happy with what the image looks like - and afterwards most will say that "there is no proper color in AP anyway", for some reason.
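A minimal sketch of that order of operations on linear data. The white balance gains and color correction matrix below are made-up placeholders, not values for any real camera:

```python
import numpy as np

# 1. white balance (diagonal), 2. color correction matrix (3x3),
# 3. sRGB gamma encoding for display.
wb_gains = np.array([1.9, 1.0, 1.6])            # placeholder R/G/B gains
ccm = np.array([[ 1.6, -0.4, -0.2],             # placeholder matrix;
                [-0.3,  1.5, -0.2],             # rows sum to 1 so that
                [-0.1, -0.5,  1.6]])            # gray stays gray

def srgb_encode(c):
    c = np.clip(c, 0.0, 1.0)
    return np.where(c <= 0.0031308, 12.92 * c, 1.055 * c ** (1 / 2.4) - 0.055)

def camera_to_srgb(rgb_linear):
    balanced = rgb_linear * wb_gains      # 1. white balance
    corrected = balanced @ ccm.T          # 2. color correction matrix
    return srgb_encode(corrected)         # 3. display gamma

pixel = np.array([0.10, 0.18, 0.09])      # example raw linear camera RGB
print(camera_to_srgb(pixel))
```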
  23. I guess this is down to atmospheric scattering. Here is a little experiment. First, our two color targets as I believe they look outside of our atmosphere - one is the dust in your image, and the other is the reference - a G2V star at 6500K - our sun. We don't see our sun as white in regular daylight, but rather yellowish - this is due to Rayleigh scattering in our atmosphere (the sky takes the blue color for itself and leaves the sun's color yellowish). We can mimic this by adjusting the image temperature in Gimp, for example. Nice - we already have a brownish color for our dark nebula (although it is gray in nature). Let's do something astrophotographers like to do in their processing - let's boost saturation. In the end, if I compare the two - the part of LDN1228 from the image you posted last and the color I made by this simple process of atmospheric scattering and boosted saturation - it's a pretty good match, right? (A physical version of the same experiment is sketched below.)
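The same experiment can be done physically rather than with the Gimp temperature tool - a sketch that passes a gray target through approximate Rayleigh extinction. The optical depth value and the representative band wavelengths are assumptions:

```python
import math

# Rayleigh optical depth is roughly 0.098 at 550nm at sea level
# and scales as 1/lambda^4.
TAU_550 = 0.098

def rayleigh_transmission(l_nm, airmass=1.0):
    tau = TAU_550 * (550.0 / l_nm) ** 4
    return math.exp(-tau * airmass)

gray = [1.0, 1.0, 1.0]                  # gray nebula outside the atmosphere
bands = [610.0, 550.0, 465.0]           # rough R, G, B band wavelengths

for airmass in (1.0, 2.0):              # zenith vs ~30 degrees altitude
    rgb = [c * rayleigh_transmission(l, airmass) for c, l in zip(gray, bands)]
    print(f"airmass {airmass}: {[round(v, 3) for v in rgb]}")
# blue is attenuated the most -> the gray target picks up a yellow/brown
# cast, which boosting saturation then exaggerates
```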
  24. Yes, that is a good question. Did you do color calibration on the stars?
  25. OK, so this is literally the first image in a google search for LDN1228: And this is the second: and this is yours: In both of the above images, the grey area that I marked out is darker, while the whole nebulosity has a beige or brownish tint to it - and I suspect that is a color shift due to atmospheric scattering. Yours has red in the dark areas - I also marked that out.