Everything posted by vlaiv

  1. I was under the impression that the Photon line is the cheapest of the cheap - metal tube ones, yet very decent performers (judging by your image). https://www.teleskop-express.de/shop/product_info.php/info/p10227_TS-PHOTON-6--F4-Advanced-Newtonian-Telescope-with-Metal-Tube.html Although there appears to be a carbon tube version of that line as well. I like the way it looks with that black and red combination. In any case, it seems to be a rather good match for the ASI1600.
  2. I see that you have a TS 6" F/4 newtonian - is that their Photon line? What coma corrector are you using, and how do you find the scope?
  3. I do it every single time. It's not about pixel peeping, it's about appreciating the scale of things. I also like to sometimes spot a very faint background galaxy, or see something else that I would miss by looking at the "larger picture".
  4. I'd say you need a shaft coupler: Nema 17 bracket And a visit to your local hardware store that sells hinges, L brackets and all those accessories for doors, windows and such (sorry, I don't know the exact term in English). In any case, I think you'll find something suitable there for mounting the nema bracket to the mount - or at least something that you can easily modify yourself.
  5. I don't know, but I do know it's the 16th today and I'm eagerly awaiting results.
  6. Have no idea. The best detail on the image capture process that I could find in a quick search is this: It tells me that many data sets were combined to create this image. It might well be some sort of Ha filter blend.

In reality, Ha can't be properly displayed on any display device, and it certainly can't be encoded in sRGB format, as it is a pure spectral color. Here is a chromaticity diagram showing all colors of the sRGB color space compared to the full range of colors most humans are capable of seeing. The outer rim of this diagram represents pure spectral colors - the rainbow. Ha, for example, is a more saturated red than any red inside the sRGB color space (redder than any red displayed on a computer screen). It is also a bit darker - meaning that for the same light intensity, you would get something like this: Left is the reddest that a display can show, and right is (lacking a bit of the saturation Ha normally has) Ha red at the same intensity. This is due to how our vision cells work:

Note that sRGB can't accurately represent any of the rainbow / spectral colors, so every spectrum you see on your computer monitor will be wrong. In the image above, the author really tried to simulate spectral colors as best they could - which shows in the Ha part - by darkening the same red color to mimic what our eyes would see (note the sensitivity of the L type cells). For that reason we can say that any image containing Ha is a false color image, as far as our ability to display images goes, but it also gives us a guide on how best to incorporate pure Ha regions - not as bright red, but rather as dim deep red, if we want to best approximate the visual appearance of the object.

This however does not hold for "color mix" situations. When one has an Ha region filled with stars too far away to be resolved (like in another galaxy), or perhaps with OIII present as well - it is much harder to tell the exact color.
Luckily, we can give some idea of how it should look - or rather, where on the chromaticity diagram it should be located. If one has two light sources of different color, their combined color will simply lie on the line connecting those two colors in the above graph. Stars + Ha will have this color: I outlined the region of color that such a combination can have - btw, that series of numbers is what is called the Planckian locus, and it represents the colors of black bodies of different temperatures (the source of the above diagram of star colors - each spectral class has a certain temperature, and each temperature has a certain color - like when you heat steel: it first glows deep red, then yellow, and in the end it turns white and bluish). We get a similar thing when we connect the OIII line on the outer, spectral locus at 500nm to the Ha point at 656nm - the line joining them gives all possible colors of Ha/OIII nebulae (the actual color will depend on the relative intensities of each spectral line). Fun stuff, isn't it?
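The "line on the chromaticity diagram" rule is easy to verify numerically: additive mixing is linear in CIE XYZ, so converting each light's chromaticity to XYZ, summing, and projecting back always lands on the segment between the two sources. A minimal Python sketch (the Ha and OIII xy coordinates below are approximate values plugged in purely for illustration):

```python
def mix_chromaticities(xy1, Y1, xy2, Y2):
    """Additively mix two lights given xy chromaticity and luminance Y.
    Mixing is linear in XYZ, so convert xy,Y -> XYZ, add, convert back."""
    def to_XYZ(xy, Y):
        x, y = xy
        return (x * Y / y, Y, (1 - x - y) * Y / y)
    X1, Ya, Z1 = to_XYZ(xy1, Y1)
    X2, Yb, Z2 = to_XYZ(xy2, Y2)
    X, Y, Z = X1 + X2, Ya + Yb, Z1 + Z2
    s = X + Y + Z
    return (X / s, Y / s)

# Approximate spectral chromaticities (illustrative values):
HA = (0.73, 0.27)      # ~656 nm, deep red
OIII = (0.008, 0.538)  # ~500 nm, blue-green

print(mix_chromaticities(HA, 1.0, OIII, 1.0))
```

Changing the relative luminances slides the result along the Ha-OIII segment, which is exactly the range of possible Ha/OIII nebula colors described above.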
  7. Here is the HST image for comparison: and the "stellar bar": I think that HST images in general match star colors very well - which leads me to believe they have been properly calibrated for display.
  8. What scope is that? It's really hard to find a scope that is corrected for 45mm, isn't it?
  9. I don't think that any software will let you calibrate with a mismatched image pixel count. If you used darks in your calibration workflow, check the following:
1. You used darks with matching settings (gain, offset, exposure length)
2. You did not turn on dark optimization by accident (or on purpose)
  10. That dark is not for the above camera. You seem to have mixed up your calibration files between cameras. Is this a dark from an ASI071 or similar? Do you happen to have that camera as well? The image size is 4944 x 3284, the pixel size reported by SGP in the FITS header is 4.78µm, and the camera name is "ASI Camera(1)". It also shows some glow along the bottom of the frame and in two places to the left: but it is far less pronounced than that of the IMX183.
  11. Yes indeed - you'll see this, but only when you stretch your darks. Can you upload a single dark FITS file (you can add it as an attachment in a reply) so we can check for amp glow?
  12. That is amp glow. Nothing to worry about - proper dark calibration deals with it. Have a look here for more details:
  13. It's rather simple - we know what is right because we measure things. As long as you stick to measurements and standards, you'll get an objective result. As soon as you start doing things some other way, it becomes subjective. There is a well defined absolute color standard / color space - CIE XYZ. There is a well defined relative color space that we all use on the internet - sRGB. There is a well known transformation between the two. The only thing missing is the transformation between your particular camera and these color spaces (you can actually go directly from camera to linear sRGB, bypassing XYZ, if you want, since all the transforms are linear). When you choose a certain white balance preset on a DSLR, you apply one of these transforms, which converts what has been captured into actual color. They were preloaded at the factory. For astronomy purposes we don't need to do as much work as DSLR makers do, since they must account for different lighting conditions (cloudy, sunny, indoors, tungsten, halogen ...). In astronomy, colors don't depend on ambient light - objects are the light and we capture it. We just need the proper transform from our linear data into linear sRGB, and then to apply the proper gamma function for the sRGB color space.
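The pipeline described above is just linear algebra plus one non-linearity at the end. A minimal Python sketch, using an identity matrix as a stand-in for the measured camera-to-linear-sRGB matrix (the real matrix has to come from calibrating your particular camera):

```python
def srgb_gamma(c):
    """Standard sRGB transfer function for a linear channel value in [0, 1]."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def camera_to_srgb(raw_rgb, M):
    """Apply a per-camera 3x3 linear transform, then the sRGB gamma.
    M is a hypothetical camera -> linear sRGB matrix (measured in practice)."""
    lin = [sum(M[i][j] * raw_rgb[j] for j in range(3)) for i in range(3)]
    lin = [min(max(c, 0.0), 1.0) for c in lin]  # clip out-of-gamut values
    return [srgb_gamma(c) for c in lin]

# Identity matrix as a placeholder for the measured camera matrix:
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(camera_to_srgb([0.5, 0.2, 0.05], I))
```

The key point is the order of operations: the matrix is applied to linear data, and the gamma function comes last, exactly once.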
  14. I always wondered what the exact definition of, for example, the UBVRI system is. If one searches Google for UBVRI response curves, one can hardly find two graphs that look the same - and these are just filter response curves, not combined instrument + filter response curves. How does one account for instrument + filter response to obtain the same results as someone else using a different instrument and filter set?
  15. In principle you should go for longer total exposure on the fainter channels. The problem is that you don't really know which channel is fainter until you start imaging. If you really want the best possible image, here are a few pointers. Ha is usually the strongest, followed by OIII, with SII often being the faintest component. This is of course target dependent, but it holds for most targets. Don't be afraid to go out and capture additional data - most of the best images are captured over several nights. You can always start by allocating equal time slots for each channel, and then, once you do a preliminary stack, check which channels lack quality and go out the next night to get more data on those. One more thing - if you use Ha as luminance, it pays to spend more time on it: luminance needs to be smooth, and we have much higher sensitivity to noise in luminance data than in color. With individual exposures it is in principle the same - you should "tune" the exposure length for each channel. This will depend on the strength of the signal, the level and type of light pollution you have, and the width of your narrowband filters. Again, you can't tell until you start shooting. The overall differences will be rather small, and it really is not worth having different exposure lengths for each channel, because you'll need a different set of darks (if your camera can't use dark scaling; even with dark scaling, the resulting SNR will be slightly different).
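To see why extra time on a faint channel pays off, one can model the SNR of a stack directly. A rough Python sketch under the usual shot-noise model; all the rates below (e-/s per pixel) are made-up numbers for illustration:

```python
from math import sqrt

def stack_snr(signal_rate, sky_rate, dark_rate, read_noise,
              sub_len, total_time):
    """Approximate SNR of a stack: signal and shot-noise variances scale
    with total time; read noise variance enters once per sub-exposure."""
    n_subs = total_time / sub_len
    signal = signal_rate * total_time
    noise = sqrt(signal + sky_rate * total_time + dark_rate * total_time
                 + n_subs * read_noise ** 2)
    return signal / noise

# Hypothetical strong (Ha-like) vs weak (SII-like) channels, equal time:
print(stack_snr(10.0, 2.0, 0.1, 1.6, 300, 3600))  # strong channel
print(stack_snr(2.0, 2.0, 0.1, 1.6, 300, 3600))   # weak channel
```

With equal time the weaker channel ends up with clearly lower SNR - which is exactly the cue, after a preliminary stack, to add more subs to that channel on the next night.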
  16. Because there is a well established theory of color matching - colorimetry. There are infinitely many spectra that produce the same color in human vision, because human vision is trichromatic: it has three types of receptors, and our brain "synthesizes" what we perceive as color from the level of stimulus in each of them. The stimulus in each receptor depends on the spectrum, but in a way that involves "summation". If I asked you to name 3 numbers that add up to 12, you'd be correct to respond with both 6+5+1 and 4+4+4. In the same way, two or three different spectra can produce the same color. This has nothing to do with the idea of you seeing a "different sort of red apple" than I do. It means that if I show you two apples that look the same to me, you'll find that those two apples look the same as well. Hence the name - color matching.

The best way to understand color matching, as opposed to just two people seeing the same image of something, is as follows. Imagine a scene imaged with two different cameras - say Canon and Nikon (or Sony, or insert your favorite brand here _____). After that, the image from each camera is displayed on its own computer screen - one from Dell and another from LG (or BenQ, or insert your favorite monitor brand here ___). Anyone who has seen the scene and sees the images on the computer screens, if all is well and the cameras and monitors have been properly color calibrated, will agree that the scene had the same colors as both the first and the second computer monitor. This is the definition of color matching - you are able to replicate color regardless of the equipment used to record it or the device used to display it, and every observer agrees that the color matches. Things get a bit more complicated when we apply color to an object, as the color we see / record will depend on the light source used to illuminate it, but in principle it again comes down to the "sum" of the spectrum reaching our eye in the end.

For this reason, there is a well defined color for an astronomical object. In fact, because astronomical objects are illuminated by only one kind of light - either the light they emit themselves or, in the case of reflection nebulae, planets and similar, a nearby light source - and we can't have different illumination conditions, things are even simpler than in daytime photography, where you need to be careful about the type of lighting in order to recreate the proper color of an object (what is known as white balance). Further, we know what color each type of star needs to be - physics tells us what sort of spectrum to expect from each star, and colorimetry tells us how to convert that spectrum into color. There you go - actual colors by spectral type, and if you want to be more precise, there are tables of stellar colors: http://www.vendian.org/mncharity/dir3/starcolor/details.html
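The "numbers that add up to 12" analogy can be made concrete: two different spectra can produce identical receptor sums. A toy Python sketch with made-up response curves (these are deliberately simple illustrative weights, not real cone sensitivities):

```python
# Toy 3-channel receptor responses over 4 wavelength bins
# (purely illustrative - not real cone sensitivity curves):
RESP = [
    [1.0, 1.0, 0.0, 0.0],  # "L"-like receptor
    [0.0, 1.0, 1.0, 0.0],  # "M"-like receptor
    [0.0, 0.0, 1.0, 1.0],  # "S"-like receptor
]

def stimulus(spectrum):
    """Each receptor 'sums' the spectrum weighted by its response curve."""
    return tuple(sum(r * s for r, s in zip(row, spectrum)) for row in RESP)

# Two different spectra (metamers) that produce identical stimuli:
spec_a = [1.0, 1.0, 1.0, 1.0]
spec_b = [2.0, 0.0, 2.0, 0.0]
print(stimulus(spec_a), stimulus(spec_b))  # both (2.0, 2.0, 2.0)
```

Both spectra yield the same triple of receptor sums, so to this toy observer they are the same color - that is metamerism, and it is why color can be matched across different recording and display devices.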
  17. I did the above with my AZ GTi and it fixed a bit of the backlash that I had.
  18. In about 1h maybe? I tend to stay up late and usually sleep in in the mornings. Luckily, my job (partially responsible for that sort of sleep cycle) allows me to start late as I work from home. Btw, I'm 1h ahead of GMT so it's well past midnight here ...
  19. I think that perhaps the easiest thing to do is just take regular flats. Then it is fairly easy to get the illumination percentage - simply measure the ADU values in the center (e.g. the average of a small circular selection) and the ADU values in the corner. Divide the two and you'll get the illumination percentage. In principle, the SNR loss is the square root of the reciprocal of the illumination percentage, at best. For example, if one has 50% illumination at the edge, this means that the center receives x2 the signal compared to the edge (the reciprocal of 0.5, which is 50%), and the SNR of pure signal is the square root of the signal, so in this case the SNR at the edge is sqrt(2) = ~x1.41 worse, at best. It is never quite that good, because we also have dark current noise and read noise, which don't depend on illumination and stay the same.
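The calculation above is easy to script. A small Python helper that turns the two flat-frame ADU measurements into the best-case SNR penalty at the edge:

```python
from math import sqrt

def snr_penalty(edge_adu, center_adu):
    """Best-case SNR penalty factor at the edge, from flat-frame ADU values.
    Shot-noise-limited: SNR ~ sqrt(signal), so the penalty factor is
    sqrt(center / edge). Read and dark noise make the real loss worse."""
    illumination = edge_adu / center_adu
    return sqrt(1.0 / illumination)

# Example ADU values: 50% illumination at the edge -> ~1.41x worse SNR
print(snr_penalty(15000, 30000))
```

Anything much beyond a factor of ~1.4 (i.e. below 50% illumination) starts to cost real integration time in the corners, even though flats correct the brightness falloff itself.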
  20. I think the point in the above comment is as follows: you pay a "premium" price of £69 because you are hoping to get a wide field eyepiece. It is still an entry level price, I'm aware of that, but let's look at what you get if you opt for the 42mm version. Instead of the advertised 65° of AFOV, you get closer to 58° AFOV. If you happen to have a faster scope, only, let's say, 80% of the field is usable. That is ~46° of usable field. If I told you that you could have 42mm with 46° AFOV for £69, or perhaps 40mm with 42° for less than half that price at £30, which one would sound like a better deal? An alternative would be to consider the following offer: 42mm with 58° and poor outer field performance in F/5-F/6 scopes for £69, or 35mm with 68° (effectively the same true field) and very good performance in F/5-F/6 for £89. In the end, everyone can decide for themselves whether the asking price suits them - as long as they are properly informed about the performance of the eyepiece. Interestingly enough, £69 is the exact amount a local dealer here is asking for those eyepieces, branded as GSO, and if I were in the market for one of them, it would make more sense to get it locally, thus skipping import fees and paying lower postage. On a separate note, for low power moon viewing in a 6" F/12 CC you want a 30-32mm plossl - again around £30-£40 - but spare no expense on a good diagonal.
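The "effectively the same true field" claim is easy to check: true field is roughly AFOV divided by magnification. A small Python sketch, assuming a hypothetical 1000mm focal length scope (pick your own numbers):

```python
def true_field(afov_deg, ep_focal_mm, scope_focal_mm):
    """Approximate true field of view: apparent field / magnification."""
    magnification = scope_focal_mm / ep_focal_mm
    return afov_deg / magnification

# Hypothetical 1000 mm focal length scope, eyepiece numbers from the post:
SCOPE = 1000.0
print(true_field(58, 42, SCOPE))  # 42 mm with ~58 deg effective AFOV
print(true_field(68, 35, SCOPE))  # 35 mm with 68 deg AFOV
```

With those numbers, the 42mm/58° and the 35mm/68° eyepieces frame almost exactly the same patch of sky, so the comparison really is about edge performance and price.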
  21. I think that is very sensible. We often speak of good / bad performance, but really, to some people these are just nuances. I remember when I first noticed the outer field astigmatism of a 32mm plossl in an F/5 scope. I had used that eyepiece for a very long time and never noticed it until I actively searched for it. The problem is that once you see such a thing, it is very hard to "unsee" it. The best course of action is to try for oneself.
  22. It would be really interesting to measure vignetting and the illuminated field down to half illumination (50% of peak intensity). I guess that would be interesting to anyone trying to image. Light reduction of more than 50%, while correctable with flats, will simply lower SNR too much.
  23. This is very, very strange. There are 3 possible causes that I can think of, and I'm not sure it is any of them (I'll explain why):
1. No anti blooming on the CCD
2. OAG prism diffraction
3. Some sort of wire over the telescope - like a phone or power line that runs over your imaging location
For the first one it's obvious - this happens on CCD sensors that don't have anti blooming, and your camera has it according to the specs. Also, when this sort of electron spill happens, it looks different: it runs in the vertical direction because of the CCD structure - columns of pixels are used to read out the sensor, so there is a "channel" connecting each vertical column of pixels. The second looks different as well - it happens only on bright stars on one side of the image, where the OAG prism sits, and is usually perpendicular to the closest edge (because we tend to orient the OAG prism parallel to the closest edge) - but in principle it is perpendicular to the OAG prism. The effect is very small, if it happens at all, and the solution is simple - move the OAG prism a bit further from the sensor. There is an additional telltale sign that it might be happening - flats that show the OAG prism shadow. The third is straightforward - a power line over your imaging site acts as a secondary spider vane and creates a diffraction spike. The problem with that is that you already have diffraction spikes in your image, and you know what they look like - a cable diffraction spike will look like that: a straight beam of light that grows fainter as one moves away from the bright star. It will not show this: This is what happens when you have a moving bright star in the frame and you shake the telescope. The only other explanation that I can offer - and I'm certain it can't be true, as it would imply something very silly, but it might lead you on the right track to figure out what is happening - would be: you have a flip mirror that you flip during the exposure for some strange reason, and that causes shakes in the telescope.
The action of flipping the mirror will move the reflection of the bright star across the image in a straight line. There will be some shake, which results in the above wiggly line, and some "bounce" action that creates the line on the other side of the star. That is really my best guess - otherwise, I'm not sure I can come up with a sensible answer.
  24. Indeed - this looks like the sweet spot for a versatile scope. I read somewhere (not a really credible source - a description on a retailer website) that it has color correction comparable to a 90/900. That might actually be true if a more exotic but not overly expensive glass was used (for example, the Bresser 102/460 has some sort of ED glass in it; it still shows quite a bit of false color, similar to the Skywatcher ST102 - but at 4" the difference between F/4.6 and F/5 should be obvious). There are a couple of scopes with this focal length - 102/660. There is a Celestron model and a Bresser model. I wonder how this one compares to those two.