Everything posted by vlaiv

  1. That is quite interesting. I think there is not much difference between regular (average) and median stacking if the data is normalized. What sort of background normalization did you select in DSS?
  2. I can't be sure, but I think the red rectangle represents the common stackable surface - meaning the part of the image that stays in frame. How much did the frame move during the course of the recording?
  3. I think that pixel area relation (I'm not sure exactly what it is, but by the sound of the name) is pretty much the same as linear interpolation as long as there is no significant rotation - and even with rotation, it won't be as good as other interpolation methods. The choice of interpolation method is very important, as it determines pixel-to-pixel correlation - a balance between sharpness and the denoising that comes from blur. I personally prefer sharpness, since the denoising due to that slight blur acts only in the high frequency domain, so low frequency noise remains unaltered. In the end, I think this experiment shows something most people don't really think about: the resolution of an OSC camera is indeed half of what the pixel size suggests, and interpolation debayering is the same as taking a mono image and enlarging it to 200% in software - unnecessary and void of detail. This is why I always prefer the split / super pixel approach (see the sketch below).
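
     A minimal sketch of the super pixel idea, assuming an RGGB pattern and numpy (the function name is mine):

```python
import numpy as np

def superpixel_debayer(raw):
    """Collapse an RGGB Bayer mosaic into one RGB value per 2x2 cell.

    The result is half resolution in X and Y - no pixel values are
    invented, unlike interpolation debayering.
    """
    r  = raw[0::2, 0::2].astype(np.float64)   # red sites
    g1 = raw[0::2, 1::2].astype(np.float64)   # first green site
    g2 = raw[1::2, 0::2].astype(np.float64)   # second green site
    b  = raw[1::2, 1::2].astype(np.float64)   # blue sites
    return np.dstack([r, (g1 + g2) / 2, b])   # average the two greens

# a 4x4 mosaic becomes a 2x2 RGB image
print(superpixel_debayer(np.arange(16.0).reshape(4, 4)).shape)  # (2, 2, 3)
```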
  4. If you don't mind doing another go, try using Lanczos interpolation for registration - just to see what sort of difference you get in star FWHM. I suspect that pixel area relation behaves pretty much like linear interpolation and will introduce significant blur and widen FWHM - more so when the data is close to proper sampling (with oversampling you already have blurring at the pixel level, so additional blurring does not make much of a difference, but if the data is sharp, it will be noticed). Comparing the two kernels makes the difference obvious - see the sketch below.
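
     A quick way to see why the kernel matters is to compare the two resampling kernels directly (a sketch assuming numpy; pixel area relation itself may differ somewhat from a pure tent kernel):

```python
import numpy as np

def linear_kernel(x):
    # "tent" kernel behind bilinear interpolation - strong high-frequency roll-off
    x = np.abs(np.asarray(x, dtype=np.float64))
    return np.where(x < 1, 1 - x, 0.0)

def lanczos_kernel(x, a=3):
    # windowed sinc - much closer to an ideal low-pass, so star profiles stay sharp
    x = np.asarray(x, dtype=np.float64)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)
```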
  5. Worse star sizes in what respect - FWHM, or visual size of stars after stretching? When you split the bayer matrix and then stack the resulting frames, the choice of alignment interpolation method is very important. Using, say, bilinear interpolation for sub alignment has less impact on an oversampled image, so a high quality interpolation needs to be used for split data. In any case, compare star sizes with a robust FWHM estimation method. Some FWHM algorithms have issues with the actual FWHM measurement because they rely on background noise estimation. I think AstroImageJ has a good FWHM measurement tool - maybe use that one to compare star sizes. (A simple Gaussian-fit estimate is sketched below.)
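
     For a quick sanity check outside AstroImageJ, here is a rough sketch of a Gaussian-fit FWHM estimate (assuming numpy and scipy; this is not AstroImageJ's actual algorithm). Fitting the background as a free parameter sidesteps the background-noise-estimation issue:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian2d(coords, amp, x0, y0, sigma, offset):
    # circular 2D Gaussian on a flat background
    x, y = coords
    return offset + amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2))

def star_fwhm(cutout):
    """Fit a circular Gaussian to a small star cutout; return FWHM in pixels."""
    y, x = np.mgrid[:cutout.shape[0], :cutout.shape[1]]
    p0 = (cutout.max() - cutout.min(),               # amplitude guess
          cutout.shape[1] / 2, cutout.shape[0] / 2,  # centroid guess
          2.0,                                       # sigma guess in pixels
          np.median(cutout))                         # background guess
    popt, _ = curve_fit(gaussian2d, (x.ravel(), y.ravel()),
                        cutout.ravel(), p0=p0)
    return 2.3548 * abs(popt[3])   # FWHM = 2*sqrt(2*ln(2)) * sigma
```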
  6. RCD is just another interpolation technique. With OSC sensors, every other pixel in X and Y is red; similarly, every other pixel is blue, and you can look at green as green1 and green2 (although they are the same color) - in that regard, they are also spaced one pixel apart. Interpolation debayering "makes up" pixel values - however "educated" that guess is, it is still a calculated value and can't replace genuine detail. For example, in the first row the second pixel is green. In order to add red and blue to it, the two adjacent R pixels can be averaged, and since this is close to the edge, only one blue pixel (the one below) can be used to replicate that color. However you slice it, interpolation debayering is like zooming in to 200% in software - the image gets bigger but no detail is resolved at that scale. There are other options to consider:
     1. Using super pixel mode or, as @ONIKKINEN mentioned, splitting the bayer matrix into color components - which produces results as if one were using a "mono + filter" camera with half the resolution in X and Y (see the sketch below).
     2. Using bayer drizzle - a proper implementation of bayer drizzle will work, but at the expense of SNR. The data needs to be dithered properly, and it will almost reconstruct the full resolution that a mono version of the sensor would have.
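
     A sketch of the channel split approach, again assuming an RGGB pattern; each plane is then registered and stacked as if it came from a mono camera:

```python
import numpy as np

def split_bayer(raw):
    """Split an RGGB mosaic into four half-resolution mono planes.

    Each plane is exactly what a mono sensor with half the pixel count
    would have recorded behind that filter - nothing is interpolated.
    The two green planes can simply be treated as extra green subs.
    """
    return {"R":  raw[0::2, 0::2],
            "G1": raw[0::2, 1::2],
            "G2": raw[1::2, 0::2],
            "B":  raw[1::2, 1::2]}
```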
  7. Yes, but the process of them becoming visible will last a couple hundred thousand years. Little will change in a month - the galaxy will become roughly 1 / (12 × 100,000) brighter in the image over the course of a month (if we assume uniform luminosity and a 100,000 year "appearing" period - see the arithmetic below). It won't just pop into view. If it's bright enough, you won't see a change; if it's not, odds are that noise will be larger than the change in brightness of the galaxy over the course of a month as more of it comes into view. Yes - about 94% of galaxies that are now visible we can't reach even if we set off towards them now at the speed of light (they are already in the process of "fast" receding). Almost all will eventually vanish from view, but that will take a couple of billion years, so we are good for the next Messier marathon (downside is that we can't blame the expansion of the universe for those galaxies that we fail to see ).
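
     The arithmetic, using the round numbers from above:

```python
# if a Milky-Way-sized galaxy takes ~100,000 years to fully "appear"
# and shines uniformly, one month adds only a tiny fraction of its light:
appearing_years = 100_000
monthly_fraction = 1 / (appearing_years * 12)
print(f"{monthly_fraction:.1e}")   # ~8.3e-07 - far below any practical noise floor
```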
  8. I'm a computer programmer, and for me Git is a source code version control system - so I don't really get what you mean by that.
  9. My first reaction was that this is not possible - but it turns out that it is, although not quite the way you said. The problem is the expanding universe: there are a bunch of galaxies that we will never see, as light from them will never reach us - the expansion of the universe is greater and will always stay ahead of the distance covered by light (and increase its lead over time). However, it turns out that our present "reach" is about 18 Bly, while the universe is about 13.8 By old and galaxies did not need much time to form - around 1 By. That leaves a shell roughly 3.2 Bly thick containing first galaxies whose light can still reach us in the future. So there are galaxies yet to be seen - but will we be able to see them appear in our lifetime, or maybe over future centuries (for future astronomers to compare images)? Well, no. Think about it: our nearest star is 4 ly away, so it takes 4 years for its light to reach us. The distance to our neighboring galaxy is 2.5 Mly, and the Milky Way is what, 100,000 ly across, give or take - so depending on the angle, it could take up to 100,000 years of "appearing" for a galaxy the size of the Milky Way. Not quite "popping into existence", where one moment it's not there and the next it is. It would also take millions of years between two consecutive appearances (remember - space has expanded, and the distance between galaxies that existed then is now much bigger).
  10. The simplest approach is to calibrate your subs and then bin them - or even bin after stacking, while the data is still linear. The last option is quickest, and the first reduces storage space requirements for stacking. In theory, you can bin all of your subs prior to calibration - that means lights, flats, darks, flat darks, whatever you use - and that is equivalent to "firmware" binning, i.e. letting your CMOS camera firmware bin the data for you (not the same as CCD on-camera binning, but the same as software binning you would do yourself). I'm against that sort of thing, though, because you lose some control. For example, you lose the ability to bin in more sophisticated ways than a regular sum / average bin: you could have a hot pixel map and remove hot pixels from the binning process, so most 2x2 (or 3x3 - it does not matter) groups of pixels would average all their pixels, but some would simply exclude hot (or dead) pixels and average 3 out of 4 (see the sketch below). As far as the mathematics goes, the order of operations makes no difference if you use regular calibration and binning - you can calibrate binned data or bin calibrated data.
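
     A sketch of what that more sophisticated binning could look like (assuming numpy; function names are mine):

```python
import numpy as np

def masked_bin2x2(img, bad_pixel_mask):
    """2x2 average binning that skips flagged hot/dead pixels.

    bad_pixel_mask is True where a pixel should be excluded; each 2x2
    cell averages only its good pixels (4 of 4, 3 of 4, ...).
    """
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2   # trim odd edges
    vals = img[:h, :w].astype(np.float64)
    good = ~bad_pixel_mask[:h, :w]
    vals[~good] = 0.0
    # sum values and good-pixel counts over each 2x2 cell
    cell_sum = (vals[0::2, 0::2] + vals[0::2, 1::2]
                + vals[1::2, 0::2] + vals[1::2, 1::2])
    cell_count = (good[0::2, 0::2].astype(np.float64) + good[0::2, 1::2]
                  + good[1::2, 0::2] + good[1::2, 1::2])
    return cell_sum / np.maximum(cell_count, 1)   # avoid /0 if all four are bad
```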
  11. Sorry, my fault, I wasn't clear - this is what I meant: "skip (interpolation debayering)" and not "(skip interpolation) debayering". Or in plain English - don't debayer using an interpolation method, as that artificially raises the sampling rate. Use a debayering method that does not do this - like super pixel or, preferably, channel-split debayering.
  12. Which one? Sigma clip stacking is sure to get rid of them, provided that the data is dithered - and yours appears to be. Put kappa between 2 and 3 (2 gives a noisier result but removes more outliers; 3 gives a smoother result but removes fewer outliers). For the number of iterations, 3 is enough. (A minimal version of the algorithm is sketched below.)
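
     For reference, the core of the algorithm is quite small - a minimal sketch assuming numpy and registered subs stacked along the first axis:

```python
import numpy as np

def kappa_sigma_stack(subs, kappa=2.5, iterations=3):
    """Average registered subs, rejecting per-pixel outliers.

    subs: array-like of shape (n_subs, height, width). Values further
    than kappa standard deviations from the per-pixel mean are masked
    and the statistics recomputed; 3 iterations is typically enough.
    """
    data = np.ma.masked_invalid(np.asarray(subs, dtype=np.float64))
    for _ in range(iterations):
        mean = data.mean(axis=0)
        std = data.std(axis=0)
        outliers = (np.abs(data - mean) > kappa * std).filled(False)
        data = np.ma.masked_where(outliers, data)
    return data.mean(axis=0).filled(np.nan)
```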
  13. Much like anything else these days, mobile phones suffer from the megapixel craze as well. We can see that in ever increasing PPIs that reach 400 and more. But the thing is, even when holding the phone really close - like 10" or so - the human eye can't resolve individual pixels; far from it. Normal / very good eyesight can resolve down to 1 minute of arc - one degree / 60 - so you don't really need more than 60 pixels per degree. Here is a table of some phone / tablet displays (together with the number of pixels per degree and the usual viewing distance). As a comparison, a usual computer screen of, say, 24" diagonal with 1920 x 1080 resolution has ~92 PPI pixel density (2203 px along the 24" diagonal) and is viewed from what, 20-25" away (equivalent to a mobile phone with 200 PPI viewed at 10") - and we don't really have any issues when viewing in those conditions; we never see individual pixels of the image.
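
     The pixels-per-degree figure is easy to compute - a quick sketch, from which the "equivalence" of the two setups above falls right out:

```python
import math

def pixels_per_degree(ppi, viewing_distance_inches):
    """How many display pixels one degree of visual angle spans."""
    # one degree at distance d covers roughly d * tan(1 deg) inches of screen
    return ppi * viewing_distance_inches * math.tan(math.radians(1))

print(pixels_per_degree(200, 10))   # phone:   ~34.9 px/deg
print(pixels_per_degree(92, 22))    # monitor: ~35.3 px/deg - same geometry
```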
  14. Depends on what you are after. If you want to do wider field imaging, or use a small scope up to say 80-100mm, there is nothing wrong with aiming at 2"/px. In fact, with such a small telescope (80mm for example) you'll be hard pressed to go higher than 2"/px.
      Want to go at 1.5"/px? Then get a 6"-8" telescope. I'd personally aim at a ~1000-1100mm scope with an OSC camera and skip interpolation debayering. An 8" F/5 newtonian is a good combination for 1.5"/px - just make sure you get a good coma corrector that does not introduce spherical aberration, as that lowers the optical resolution of the telescope and makes it hard to actually achieve 1.5"/px.
      Want to attempt imaging close to 1.2"/px? First, make sure your mount is up to it - get one that guides below 0.5" RMS, preferably in the 0.2-0.3" RMS range. Also make sure you get 1.5" FWHM seeing nights during the year. Get a 12" RC or 11" EdgeHD, skip interpolation debayering (super pixel mode) and bin the resulting data x2.
      Want to attempt 1"/px? Don't. (The image scale arithmetic behind these pairings is sketched below.)
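
      The image scale arithmetic (the 3.75 um pixel here is just an assumed typical OSC pixel size):

```python
def arcsec_per_pixel(pixel_um, focal_length_mm):
    """Image scale in arc seconds per pixel."""
    return 206.265 * pixel_um / focal_length_mm

# 8" F/5 newtonian (~1000 mm) with an assumed typical 3.75 um OSC camera;
# split / super pixel debayering makes the effective pixel 7.5 um:
print(arcsec_per_pixel(7.5, 1000))   # ~1.55 "/px
```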
  15. Ha! There is a thought. Do you think JWST will suffer from light pollution issues? I mean, a multi-day exposure with such an aperture is bound to pick up some of the cosmic microwave background. We do call it microwave, but it is actually black body radiation (very cold indeed) - which means some of it must be in the IR part of the spectrum as well.
  16. It is certainly possible, and does not differ much from a single person combining data from multiple setups. Some iconic images are in fact collaborations - you can find those online. For example, in this one you can see that data from 5 different observatories was used to produce the final result.
  17. Interesting idea. I guess the main issue will be getting something of optical quality - flat enough that it does not introduce too many aberrations when put in the optical train.
  18. That only works with gold plated telescopes - not with regular ones.
  19. I guess it is not as simple as adding another layer (bayer matrix), and mono differs significantly from color in the manufacturing process, so it does not pay off to make a mono version of a sensor that is aimed at the consumer market (DSLR) rather than industry. Most mono sensors in astro cameras are made for industrial / security and similar applications - where volume production pays off.
  20. It depends on what sort of magnification you are after. I'm going to assume a 7mm exit pupil limit, but you should check it - some people have wider pupils than 7mm and some narrower, so that is individual (see the magnification arithmetic below the links).
      For x7 magnification, there is this little scope: https://www.teleskop-express.de/shop/product_info.php/info/p13750_TS-Optics-50-mm-f-4-ED-travel-refractor--spotting-scope-and-guiding-scope-with-Crayford-focuser---perfect-optics.html
      If you can live with x9 magnification, there is this one: https://www.teleskop-express.de/shop/product_info.php/info/p14747_TS-Optics-62-mm-f-8-4-4-Element-Flatfield-Refractor-for-Observation-and-Photography.html or perhaps something like the 60mm TS doublet: https://www.teleskop-express.de/shop/product_info.php/info/p10095_TS-Optics-PhotoLine-60-mm-f-6-FPL53-Apo---2--R-P-Focuser---RED-Line.html
      With 70mm you have quite a bit of choice - 70/500 as a cheap option, but you'll have to swap the focuser for a 2" version so you can use 2" eyepieces to get a wide field of view. Then there are a number of 70-72mm ED doublet variants - however, you'll be limited to x10 magnification with those.
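
      The magnification figures above come straight from the exit pupil relation - a quick sketch:

```python
def lowest_useful_magnification(aperture_mm, max_exit_pupil_mm=7.0):
    """Magnification at which the exit pupil just fills the eye's pupil."""
    # exit pupil = aperture / magnification, rearranged:
    return aperture_mm / max_exit_pupil_mm

for aperture in (50, 62, 70):
    print(f"{aperture} mm -> x{lowest_useful_magnification(aperture):.1f}")
# 50 mm -> x7.1, 62 mm -> x8.9, 70 mm -> x10.0 - matching the picks above
```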
  21. Yep, that AliExpress listing did not bother to change the image, although they did change the letters in the text:
  22. As far as I know, a mono version of that camera never existed. The IMX410 is a color-only sensor (I might be wrong, but as far as I can see it is color only), and no other manufacturer has a mono camera based on it. The mono camera is not in the discontinued section of the ZWO site and thus was probably never in their lineup. My guess is that the AliExpress listing is either wrong, a scam, or simply a marketing trick to game google search results (a page created for something people are likely to search for, but with an item that was never in stock and is always "discontinued" - however, you can look at other items that "we have in stock", that sort of thing).