
vlaiv (Members - Posts: 13,265 - Days Won: 12)

Everything posted by vlaiv

  1. I'm not entirely sure it will solve the problem. With the x0.63 reducer and without binning you will still be working at 0.52"/px - which is about x3 oversampled. You can process this image, and you'll be able to work with the x0.63 reducer if you include binning in your workflow. You don't have to bin at capture time - you can bin later.
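As a sketch of the sampling arithmetic above - the 2350mm native focal length and 3.76µm pixel size are assumptions for illustration (chosen to reproduce the 0.52"/px figure quoted here), not values confirmed in the post:

```python
def pixel_scale(pixel_um: float, focal_length_mm: float, binning: int = 1) -> float:
    """Sampling rate in arcseconds per pixel: 206.265 * pixel / focal length."""
    return 206.265 * pixel_um * binning / focal_length_mm

# Assumed setup: 2350 mm scope with a x0.63 reducer and 3.76 um pixels
fl = 2350 * 0.63                              # effective focal length, mm
print(round(pixel_scale(3.76, fl), 2))        # ~0.52 "/px unbinned
print(round(pixel_scale(3.76, fl, 2), 2))     # ~1.05 "/px binned 2x2 - coarser sampling
```

Binning simply multiplies the sampling rate by the bin factor, which is why it can be done either at capture time or later in software.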
  2. I was just amused by the strange wording of the image being "pixelated" and all of that ... You are right - the sub has low SNR, in part due to the resolution it was taken at. Binning will help with that, of course (your calculation of the equivalent exposure is right, but not really important here - once we bin this image, it will have signal equivalent to 180s taken with binned or larger pixels).
  3. How can an image be "pixelated" because "pixels are starved for light"?
  4. @recceranger If you want, you can upload your subs to some file transfer service (Google Drive, WeTransfer, Dropbox) and people will have a look and stack them for you.
  5. In astrophotography, a single sub will often look underexposed compared to daytime photography, but a single sub really should not be your guide (nor the histogram, for that matter). Once you stack your data, you will be able to process it, and one of the steps in that processing is to set "proper exposure" using levels / curves. Light from DSOs is very faint and you can't really properly expose them unless you do multi-hour exposures (and that is what stacking really is - combining shorter exposures into one long exposure).
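A quick numerical sketch of why stacking acts like one long exposure (all numbers below are invented for illustration): averaging N subs shrinks the random noise by sqrt(N), so a target buried in noise in a single sub becomes clearly detectable in the stack.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 10.0        # photons per sub from a faint target (made-up units)
noise_sigma = 20.0   # per-sub noise (sky + read), made-up
n_subs = 100

# n_subs noisy measurements of the same faint signal, 1000 "pixels" each
subs = signal + rng.normal(0.0, noise_sigma, size=(n_subs, 1000))

single_snr = signal / noise_sigma      # one sub: SNR = 0.5, target buried in noise
stacked = subs.mean(axis=0)            # average stack = one long exposure
stacked_snr = signal / stacked.std()   # noise shrinks by sqrt(n_subs) -> roughly x10 SNR

print(single_snr, round(stacked_snr, 1))
```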
  6. Those are hot pixels and are removed by using a sigma-rejection stacking technique.
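A minimal sketch of sigma rejection with NumPy (a toy implementation, not any particular stacker's algorithm): per pixel, values that deviate from the stack mean by more than kappa standard deviations are excluded before averaging, so a hot pixel that appears in only some subs gets rejected.

```python
import numpy as np

def sigma_clip_stack(subs: np.ndarray, kappa: float = 2.5, iters: int = 2) -> np.ndarray:
    """Average-stack subs, rejecting per-pixel outliers beyond kappa sigma.

    subs: array of shape (n_subs, height, width). A hot pixel present in only
    some subs (e.g. after dithering) is an outlier at its position and is
    dropped instead of surviving into the stack.
    """
    data = subs.astype(np.float64)
    mask = np.ones_like(data, dtype=bool)
    for _ in range(iters):
        kept = np.where(mask, data, np.nan)
        center = np.nanmean(kept, axis=0)
        spread = np.nanstd(kept, axis=0)
        mask = np.abs(data - center) <= kappa * spread + 1e-12
    return np.nanmean(np.where(mask, data, np.nan), axis=0)

# Toy example: 10 identical subs, one of which has a hot pixel
subs = np.full((10, 4, 4), 100.0)
subs[3, 2, 2] = 5000.0                 # hot pixel in one sub only
result = sigma_clip_stack(subs)
print(result[2, 2])                    # hot pixel rejected, value back to 100
```

Note that kappa has to be low enough relative to the stack size: with only 10 subs, a single outlier can never exceed 3 sigma of its own column, which is why the sketch defaults to 2.5.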
  7. There is simply no way the spider can be in focus with the main optical elements - it is probably due to something else in the optical train. Do you use any sort of focal reducer? Why do you have such severe vignetting? When you adjusted the collimation of your telescope, did you take care not to change the primary / secondary distance? How close is the camera to the back end of the telescope? That RC scope has something like 160mm of back focus distance. Changing mirror spacing will change this as well, and using a focal reducer will make this distance smaller. When you position the sensor closer to the focuser, you can start "bypassing" the installed baffles of the telescope and can have a direct line of sight between the sky and the sensor - see this as an example: This is taken from a collimation video made by TS ( https://www.youtube.com/watch?v=aLwTkyJZM1Y ). If you put the camera very close, there will be a gap in the baffle system even on the optical axis (it will be even larger if you move the eye to the side, away from the optical axis). However, this alone will not cause the spider to be visible on the sensor, as it is not focused light (it is like taking a sensor without a lens and expecting to see something on it). What can happen is that you have some sort of lens element - like a focal reducer - that strangely focuses light from the spider onto the sensor.
     - Try flats first.
     - See if you have the spacing between primary and secondary right (if you fiddled too much with collimation, focus should be around 160mm away from the 2" end of the focuser).
     - If you have a focal reducer, try changing the distance between it and the sensor to a more sensible distance. Looking at the vignetting - if you are using one, it is probably at a high reduction factor.
  8. Top one is IR/UV cut. Bottom one is a simple AR (anti-reflex) coated window - or "plain glass" type. You don't really need a filter for lunar imaging, but if you want one to optimize for seeing conditions - in your case (and otherwise) I would recommend the Baader solar continuum filter. It is a sort of narrowband filter, although it is not really centered on any particular wavelength that is prominent in narrowband imaging. The second one I would recommend is a Ha filter (not too narrow - one of about 10-15nm FWHM would be ideal). Why not narrow? There is no particular wavelength of light you are trying to catch here. These filters serve a different purpose in lunar imaging - they serve to reduce seeing effects. This image contains all the clues: seeing is the same thing as the prism above - it is bending of the light, and there is an important feature of this bending that we can see in the image - red is bent less than green, which is in turn bent less than blue (that is why we have dispersion, as each wavelength is bent slightly differently). Because of this, there are two important things we achieve with a NB filter. First, we can select the red part of the spectrum, as it gets bent less - seeing affects longer wavelengths less than short ones. Second, wavelengths that are close together are bent by about the same amount, so there is no "smear" due to different amounts of bending. Why not very narrow? Because the effect on close wavelengths is small and we still need enough signal to get good SNR in a short exposure. We should not block too much light - that way we would get a very noisy image in the exposures that are of interest (less than 5ms). Why green over red? I recommended solar continuum over Ha - although green is bent more - because it is a trade-off. The longer the wavelength, the less detail is accessible with the same aperture: you'll get sharper detail at ~540nm than at 656nm without the influence of atmosphere. Atmosphere will bend green light a bit more - but we are doing lucky imaging and trying to beat the seeing, and both filters are narrow enough to provide the benefit of reduced wavelength "smear". There are two more benefits - one of which is applicable to your case, and the other might not be (but is still valid). You are using an OSC sensor. It has an RGGB bayer matrix, which means that half of the pixels in your sensor are sensitive to green light while only 1/4 of them are sensitive to Ha. Also, QE in green is higher than in Ha. This means that you'll have higher SNR per recording with the solar continuum filter versus the Ha filter. The second benefit is related to refractors. Refractors are usually optimized so that the best color is green, around 510nm. This is where the optics are the sharpest and have the least spherical aberration (spherochromatism). If you are using a refractor for lunar imaging, this will give you an edge in sharpness of the recording (and subsequently less need for wavelets). Just remember - the optimum F/ratio is not the same for all NB filters, and in particular not the same for the solar continuum and Ha filters. The ASI2600 has a 3.76µm pixel size, and therefore its optimum F/ratio for solar continuum is F/13.9 while for Ha it is F/11.4.
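The optimum F/ratio figures quoted above follow from the critical-sampling relation F = 2 x pixel size / wavelength; a quick sketch that reproduces them:

```python
def optimum_f_ratio(pixel_um: float, wavelength_nm: float) -> float:
    """Critical-sampling F/ratio for a given pixel size: F = 2 * pixel / wavelength."""
    return 2.0 * pixel_um / (wavelength_nm / 1000.0)

# ASI2600: 3.76 um pixels
print(round(optimum_f_ratio(3.76, 540), 1))   # solar continuum (~540 nm) -> 13.9
print(round(optimum_f_ratio(3.76, 656), 1))   # Ha (656 nm) -> ~11.5 (the post rounds to 11.4)
```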
  9. I would say that the main issue is the size of the stars. If that is your complete FOV at 3000x3000, odds are that the stars are simply too large and DSS does not see them as stars. You are imaging at about 0.33"/px (2350mm of FL + 3.75µm pixel size) - which is probably about x5 oversampled. Try using super pixel mode for debayering first, and then look into binning if that does not help.
  10. Two different things, really. Aperture dictates the possible level of detail that can be observed. Due to the nature of light and diffraction effects, aperture limits the level of detail that can be resolved - a larger aperture is able to resolve finer detail. Focal ratio acts in a couple of ways to improve high power views:
     - Slower scopes often have shallower curves that are easier to make, so any optical aberrations will be smaller on slower scopes for the same level of manufacturing quality. Other aberrations also get smaller with slower scopes - in reflecting scopes, a spherical primary will show less spherical aberration if the scope is slow, and with a parabolic primary there is less coma if the telescope is slow.
     - It is much easier to get high powers with regular eyepieces. A 6mm plossl or ortho will be very uncomfortable for viewing, but the 18mm version will be quite comfortable - and they will yield the same magnification on F/5 and F/15 systems respectively.
     - Slower scopes make simple eyepieces perform better.
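The eyepiece remark can be checked with a one-liner - the 100mm aperture below is an assumption for illustration; the claim holds for any fixed aperture:

```python
def magnification(aperture_mm: float, f_ratio: float, eyepiece_mm: float) -> float:
    """Magnification = scope focal length / eyepiece focal length."""
    return aperture_mm * f_ratio / eyepiece_mm

# Same (assumed) 100 mm aperture:
print(round(magnification(100, 5, 6), 1))    # F/5 scope, 6 mm eyepiece  -> 83.3x
print(round(magnification(100, 15, 18), 1))  # F/15 scope, 18 mm eyepiece -> 83.3x, same power
```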
  11. It does reduce noise - just not read noise. There are several noise sources, and one of them is thermal or dark current noise. Cooling helps bring down that noise, but more importantly it keeps the sensor at a set temperature so that dark current can be properly measured (dark calibration). Dark current varies with temperature in a non-linear way (it doubles about every 6°C), and it is very hard to remove that dark current signal when you have a varying sensor temperature.
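The doubling behaviour can be sketched like this - the reference dark current of 0.02 e-/s/px is a hypothetical value, not from any datasheet, and the ~6°C doubling temperature is sensor-dependent:

```python
def dark_current(i0_e_per_s: float, t0_c: float, t_c: float, doubling_c: float = 6.0) -> float:
    """Dark current at temperature t_c, scaled from reference i0 at t0_c,
    assuming it doubles every `doubling_c` degrees (~6 C per the post)."""
    return i0_e_per_s * 2.0 ** ((t_c - t0_c) / doubling_c)

print(dark_current(0.02, 0.0, 12.0))   # +12 C = two doublings -> 0.08 e-/s/px
print(dark_current(0.02, 0.0, -12.0))  # cooled by 12 C -> 0.005 e-/s/px
```

The non-linearity is the point: a drift of a few degrees during a session changes the dark signal substantially, which is why set-point cooling matters more than the cooling itself.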
  12. No - read noise has absolutely nothing to do with temperature. It is the same for the cooled and non-cooled versions of the camera. Read noise depends on the gain used.
  13. https://www.teleskop-express.de/shop/product_info.php/info/p13758_TS-Optics-APO-Refractor-106-700-mm---FDC100-Triplet-Lens-from-Japan.html with matching FF/FR: https://www.teleskop-express.de/shop/product_info.php/info/p13759_TS-Optics-0-75x-REFRACTOR-Reducer-Corrector-for-TS-106-mm-f-6-6-Apo.html Just an alternative - I don't have any info on how good that scope is - but it is 100mm+ and close to your target FL (525mm).
  14. Sure, take some time to cool off. Things like that are quite stressful and hobbies are supposed to be enjoyable. Before you look into getting another secondary, you can try the masking trick on the secondary as well. It will block the funny light scatter, but since it is close to the sensor it won't be "invisible" like on the primary - it will present itself as vignetting of sorts on that side. Luckily, flats should take care of that.
  15. Oh, I see. You were thinking of still using that? I would advise against it - even if your sensor is not illuminated by that portion, the part where the mirror is chipped will still get some light and will scatter that light all over the place. You'll get a bunch of diffraction effects / spikes and a lighter background. The best thing to do is simply to replace that secondary flat - it should not be expensive.
  16. It does not really work like that. Every point on the sensor is illuminated by the whole primary mirror - so that mark will impact the whole image. It might not show at all in your images, or it could cause mild diffraction effects. If that happens, not all is lost - you can simply "patch" it with a small black circular mask. Small circular masks barely impact the image at all (only to a very small degree - like a very tiny secondary obstruction), and people often use such masks to mask off a problematic part of the mirror - either mechanical damage or a place on the mirror where the surface figure is not good (a turned-down edge or zone or similar). I wanted to give you an example in the form of a picture - but it looks like google search is down at the moment (at least for me).
  17. @Stuart1971 I'm not sure that I can. I don't use APP, so I have no idea what those might be. A single SNR value that relates to the whole image just does not make sense. If we have uniform signal - like a single pixel, or an extent of nebulosity that is all more or less at the same signal level (like a faint tidal tail or an arm of a galaxy) - then it makes sense to talk about signal to noise ratio. But an image contains a wide variety of signal levels - faint stuff, no-signal areas, high-signal areas in the cores of galaxies or in stars, etc ... so a single SNR figure applied to the whole image is nonsense. I'm sure the app author created that SNR figure to be meaningful in some way - the problem is, we don't really know what he meant and what it stands for (it might be something like the ratio of average signal in the image versus background noise). Star shape is a number that probably describes how round a star is, although again - I can't really tell what the actual number means. It would make sense for a single number to be a value from 0 to 1, where 1 is a perfect circle and 0 is effectively a line. Anything in between would be some sort of ellipse (in that case it would be nice to include the angle of elongation as well). Here, some numbers are higher than 1 and some are smaller - I have no idea what they mean. Quality is, again, some measure of the quality of a sub that you can use to decide whether to keep the sub in the stack or discard it. Again - we don't really know how it's calculated.
  18. By the looks of it, it is the same scope with improved mechanics (3.5" focuser, fancier tube rings with matching dovetail plate, dew shield in color, and a focuser adapter). The original triplet with FPL-51 is rather good, and I don't really see much room for improvement with FPL-53 glass. FPL-53 is a marketing magnet, and if they had used it I'm sure it would be all over the page - yet there is not a single mention of it.
  19. While we are at it, does anyone have a sensible idea of how to provide a 3/8" connection on the bottom and top sides of a 3d printed part that is load bearing? Would a simple 3/8" insert be enough to attach the tracker to the tripod on the bottom and carry a ball head and camera on top?
  20. Not sure if compactness will be feasible. 3d printed parts are not that tough and require quite a bit of bulk to provide sufficient stiffness. They will, however, be lighter than metal counterparts. As far as a miniature platform for mobile phone tracking (even DSLR) goes - there is this: https://www.teleskop-express.de/shop/product_info.php/info/p6540_TS-Optics-NanoTracker---compact-camera-tracking-mount-for-astrophotography.html Now that is compact. I really doubt that anything 3d printed will be that small.
  21. Yes, sun also has spectral features. (source: https://www.esa.int/ESA_Multimedia/Images/2017/12/Solar_spectrum)
  22. Ah, I see. No. With stars, the light is produced mostly by hydrogen too, but the process is different from the Ha and Hb emission of nebulae. Emission lines in hydrogen come from energized electrons that jump energy levels in the atomic hydrogen of the nebula. With stars, the process that generates light is much different - it is due to thermal motion of gas in the outer layers, and also due to fusion happening in the core (photons created that way are vastly more energetic than visible light, but by the time they reach the surface they have scattered so many times and given off so much energy to atoms in the sun that they blend into the thermal spectrum). The funny thing about stars is that we don't have emission lines but rather the opposite - absorption lines. If we observe the spectrum of a star, we will see dips rather than peaks at Ha and Hb - simply because photons from the thermal process at those frequencies are more likely to bump into hydrogen atoms and be absorbed. See this post as an example. The process that forms the color of stars is the same one that happens here on earth when you start warming up a piece of iron - depending on temperature, it will glow with a different color. A star of a given surface temperature has the same color as a piece of iron at that temperature. We never really see blue color in iron because it would need a much higher temperature (above 7000K). With a nebula it is fair to assume that large parts of it will be energized the same and the Ha/Hb ratio does not change much, but with stars we don't actually record Ha/Hb light - rather, the continuum around those wavelengths. We exploit the fact that a NB filter is not infinitely narrow and has some FWHM - like 3nm or 7nm - and record "around" Ha/Hb. The ratio of the spectrum continuum around Ha and Hb will depend on the color temperature of the star, and that is the information we can use to deduce actual color.
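As a sketch of that last point - modelling the star as a blackbody (Planck's law), the continuum ratio around Ha (656nm) and Hb (486nm) is a clean function of surface temperature. The two temperatures below are just illustrative examples of a cool and a hot star:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance B(lambda, T), Planck's law."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / math.expm1(H * C / (wavelength_m * KB * temp_k))

def ha_hb_ratio(temp_k: float) -> float:
    """Continuum ratio around Ha (656 nm) vs Hb (486 nm) for a blackbody star."""
    return planck(656e-9, temp_k) / planck(486e-9, temp_k)

print(round(ha_hb_ratio(4000), 2))    # cool (red) star: more Ha than Hb continuum, ratio > 1
print(round(ha_hb_ratio(10000), 2))   # hot (blue) star: ratio < 1
```

Measuring that ratio through the two NB filters is exactly the temperature-dependent information the post describes using to deduce star color.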
  23. The thing with lenses is that they are often optimized for terrestrial use. Macro lenses are optimized for very near focus, while most lenses are optimized for "medium" distances. There are good portrait lenses (optimized for relatively close, but not as close as macro) and so on... None of them is optimized for focus at infinity the way scopes are. With scopes, when you image close targets you introduce a lot of spherical aberration; similarly, it is hard to make a lens with a large focus range and good spherical correction. Take for example the Samyang 135mm F/2 lens, often used in astrophotography and regarded as a very good and sharp lens: This is the MTF at F/8 and 30 lines/mm. 30 lines/mm equates to a pixel size of 1000µm / 30 = 33µm. At F/8 and 135mm of focal length, an equivalent telescope would require a 2µm pixel size for critical sampling. At F/2, this lens is supposed to have the same (or similar) performance: At that F/ratio, a diffraction limited telescope of the same aperture and focal length would need a 0.5µm pixel size for critical sampling - that is a roughly x60 smaller pixel size!
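The critical-sampling figures in that comparison follow from p = lambda x F / 2. Assuming ~500nm green light (an assumption that reproduces the quoted numbers):

```python
def critical_pixel_um(f_ratio: float, wavelength_nm: float = 500.0) -> float:
    """Pixel size (um) for critical (Nyquist) sampling: p = lambda * F / 2."""
    return (wavelength_nm / 1000.0) * f_ratio / 2.0

print(critical_pixel_um(8.0))         # F/8 -> 2.0 um, as quoted
print(critical_pixel_um(2.0))         # F/2 -> 0.5 um
print(33.0 / critical_pixel_um(2.0))  # the 33 um MTF pixel is ~66x (roughly x60) larger
```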
  24. There is no such thing as a quality 200mm F/2 telescope. A telescope will optically almost always be better than a lens - telescopes are (usually) diffraction limited and lenses are not. When you start making telescopes very fast, you start lowering their optical quality. Take for example the RASA at F/2 - it is no longer a diffraction limited telescope; in fact it has at least half the resolution its aperture would suggest. Speed is not reflected in the F/number alone. Sometimes an F/6 telescope can be faster than an F/4 telescope, and even an F/2 telescope. Anyway, it is complex stuff, and for the question you asked - the telescope wins. The problem is, even if you just take the two parameters you listed - optical quality and speed - you don't get the full picture. I would argue that my 8" F/8 telescope wins over any lens in the two categories you mentioned - but only over a very tiny field of view. As soon as you start wanting a bigger FOV, then my telescope might not necessarily win compared to some lens.
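One way to make "speed is not reflected in the F/number alone" concrete: for extended targets, the photon rate per pixel scales with aperture area times the sky area each pixel covers. The comparison below uses invented numbers - if both setups are worked at the same final sampling (say via binning), the larger-aperture F/6 scope beats the smaller F/4 one:

```python
def relative_speed(aperture_mm: float, pixel_scale_arcsec: float) -> float:
    """Relative photon rate per pixel for extended targets:
    aperture area x pixel solid angle (arbitrary units)."""
    return aperture_mm ** 2 * pixel_scale_arcsec ** 2

# Both hypothetical setups deliver the same final sampling of 2"/px:
fast_small = relative_speed(100.0, 2.0)   # 100 mm "fast" F/4 scope
slow_big = relative_speed(200.0, 2.0)     # 200 mm "slow" F/6 scope, binned to 2"/px

print(slow_big / fast_small)              # the F/6 scope collects 4x more light per pixel
```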
