
Flats and things that can mess them up.


ollypenrice

Perhaps the final answer is to construct pseudo flats from exposures of the actual sky taken around the time and place of the light frames, and at the full light-frame exposure time. Then we could just subtract the pseudo-flat and be sure it was a pretty accurate match. Bit time-consuming though!

This works fantastically well at flattening the background in the light, but the difficulty is getting enough signal-to-noise in the flat to avoid adding extra noise to the light.

NigelM


This works fantastically well at flattening the background in the light, but the difficulty is getting enough signal-to-noise in the flat to avoid adding extra noise to the light.

NigelM

Yes, I suppose it's analogous to subtracting darks. You'd really need to capture and stack many sky-background exposures for the pseudo-flat, so not a very practical proposition for long-exposure deep-sky images at least.

Adrian
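The pseudo-flat idea above can be sketched numerically: median-stack many sky-background exposures, then subtract the stack from each light. This is only an illustration with synthetic data (the frame values and helper names are made up; real frames would of course come from FITS files):

```python
import numpy as np

def make_pseudo_flat(sky_frames):
    """Median-stack sky-background exposures to suppress random noise.

    Stacking N frames reduces the noise in the pseudo-flat by roughly
    sqrt(N), which is why so many frames are needed.
    """
    return np.median(np.stack(sky_frames), axis=0)

def subtract_pseudo_flat(light, pseudo_flat):
    """Subtract the pseudo-flat from a light frame (additive model)."""
    return light - pseudo_flat

# Synthetic demo: a vignetted sky background plus noise.
rng = np.random.default_rng(0)
background = np.fromfunction(
    lambda y, x: 1000.0 - 0.5 * ((y - 50) ** 2 + (x - 50) ** 2) ** 0.5,
    (100, 100),
)
sky_frames = [background + rng.normal(0, 5, background.shape) for _ in range(50)]
pseudo_flat = make_pseudo_flat(sky_frames)

light = background + rng.normal(0, 5, background.shape)
flattened = subtract_pseudo_flat(light, pseudo_flat)
# The residual is now close to zero everywhere (noise only).
```

Stacking 50 frames here drops the pseudo-flat's noise by roughly sqrt(50), which is NigelM's point: with too few frames the subtraction adds more noise than it removes.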


A funny old game is flat fielding. If you think about it, it really should be so easy :eek:

Richard Crisp does create some controversy from time to time but at least he explains the theory and maths. Take a look here -

http://www.narrowban...elding_page.htm

Have a quick read of that page and then download his PDFs.

Dave.

And no, I'll not be taking questions :rolleyes:

Also, from personal experience I can tell you that the light source is very important. Laptop screens don't always work too well, no matter what you're told. I even have the inverse flats to prove it. Somewhere.


I'll have a read, but his web site design skills are definitely 'controversial', i.e. migraine-inducing grey text overlaid on a background of more grey text is not exactly encouraging :(


I really don't understand why people do this sort of thing with their websites :( When I see one of these I tend not to read it! Sometimes highlighting the text can help, as it changes the colours.

As for flats I have a Gerd Neumann Aurora panel and no problems whatsoever :) A new white T shirt worked quite well too, but being able to grab flats without moving anything after imaging is a great benefit of the GN panel. So quick and easy :)


What are you doing snooping around the SBIG website, Olly :tongue: ?

My Geoptic panel creates green flats, but just as I always have a massive colour cast from the sky itself, any colour balance routine seems to work well fighting both these odd colours.

The science may well be perfectly correct in the article, but for the average amateur, I'm convinced reflections are to blame to far greater extent than slight spectral deviations.

I for one don't think that a light panel will all that accurately replicate how light reaches the sensor during imaging in terms of what angles photons may finally arrive at. I'm out on thin ice here perhaps, but I think a flat panel emits a whole lot more light at rather odd angles than the typical night sky. Inside a tube these stray rays might well build up unwanted patterns on the CCD after numerous bounces, perhaps made worse the faster the system.

Rob, you have used a truss design I believe. Do you go about in the same way taking flats with a truss OTA, compared to a closed tube? Is the truss more forgiving in any way?

/Jesper


My Geoptic panel creates green flats, but just as I always have a massive colour cast from the sky itself, any colour balance routine seems to work well fighting both these odd colours.

The science may well be perfectly correct in the article, but for the average amateur, I'm convinced reflections are to blame to far greater extent than slight spectral deviations.

I for one don't think that a light panel will all that accurately replicate how light reaches the sensor during imaging in terms of what angles photons may finally arrive at. I'm out on thin ice here perhaps, but I think a flat panel emits a whole lot more light at rather odd angles than the typical night sky. Inside a tube these stray rays might well build up unwanted patterns on the CCD after numerous bounces, perhaps made worse the faster the system.

I agree that a colour cast in flats is not that big a deal; it definitely shouldn't be an issue in LRGB or narrowband imaging as you can adjust the exposure for each filter to ensure you have sufficient signal in the flat you create for that filter. You have to balance the colours yourself anyway when you combine.

It is more of an issue for OSC, since you have to get enough exposure in all three channels for a single exposure length. My lightbox gives out way more blue than anything else. I think my red channel is a bit near the readout noise level (though I haven't measured it), which may also explain why I am getting a bit of residual vignetting: the blue channel is nearing the right side of the histogram whilst the red is stuck near the left edge, and the consensus seems to be that all three peaks need to be at least a third of the way in.

(It is expected to be unbalanced since I have used tri-colour LEDs and the output of the blue element is far more than the green, while the red is pretty low. I have tried adding additional red LEDs but it hasn't helped much, so I am thinking of re-making it, perhaps using potentiometers to adjust the colour balance of each channel manually.)
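For what it's worth, the "peaks at least a third of the way in" rule of thumb is easy to check numerically once the flat is debayered. A rough sketch, assuming a 16-bit RGB array; the ADU numbers are invented to mimic the blue-heavy lightbox described above:

```python
import numpy as np

FULL_SCALE = 65535          # 16-bit ADU range
MIN_FRACTION = 1.0 / 3.0    # rule-of-thumb minimum for each channel peak

def check_flat_channels(flat_rgb):
    """Return per-channel median ADU and whether each clears the 1/3 mark."""
    report = {}
    for i, name in enumerate("RGB"):
        peak = float(np.median(flat_rgb[..., i]))
        report[name] = (peak, peak >= MIN_FRACTION * FULL_SCALE)
    return report

# Synthetic example mimicking a blue-heavy lightbox:
rng = np.random.default_rng(1)
flat = np.stack([
    rng.normal(6000, 200, (100, 100)),   # red: too low, near read noise
    rng.normal(30000, 500, (100, 100)),  # green: fine
    rng.normal(52000, 800, (100, 100)),  # blue: nearing the right side
], axis=-1)

for name, (peak, ok) in check_flat_channels(flat).items():
    print(f"{name}: median {peak:.0f} ADU, {'OK' if ok else 'TOO LOW'}")
```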

Again, colour balancing the resulting OSC output is pretty easy in PixInsight now I have figured out how to do it easily (using the auto-STF function with the channels unlocked produces a good colour balance, and this can be made permanent by dragging it into the histogram process). The images still tend to be a bit too green if left to automated processes (due to the IDAS LP filter, I think), but that's easy to tweak thereafter.

I don't think you're right about light panels not replicating sky sources. You absolutely do want photons from your flat light source to traverse every single path through your scope/camera systems; both the useful direct paths from source via lens/mirror to focus on the camera and via all the possible reflected paths. A single photon from the sky could traverse any of those paths and the flat is effectively a map of how much light makes it through all of those paths.

Let's say you took a closed-tube newtonian and you had a flat-field light source with some kind of 'barn doors' (i.e. those things you see on TV/movie lighting to direct the light). You could arrange things such that the light source only illuminated the actual mirror at the bottom of the tube, using lots of small barn doors to prevent any of it hitting the sides of the tube on the way in. Now take your flat frames. This would not be any use in flattening your light frame, since light from the sky (stars and background) does hit the sides of the tube on the way in and some of it gets reflected on to the mirror. None of that reflected light would be corrected in your resulting image and you'd end up with some kind of ring or donut.

That's the whole point the original article that started this thread is making. Basically cameras are sensitive to IR to some extent, your telescope's paint/anodising/flocking/baffles will reflect some of the IR even though you can't see it, and if your flat light source doesn't emit IR then your flat won't correct for it. So either you need to make flats using a light source that replicates the sky (the sky itself, or a wide enough spectrum light source), or you need to filter out the stuff you can't correct for (e.g. by using a narrowband filter or an IR blocking filter), or you need to use an anti-reflection surface that doesn't reflect IR or other wavelengths that you can't reproduce in your flats.


Ian, you're probably spot on.

I merely struggle with the thought experiment that is light from flat panels vs the sky. What I wonder is if the distribution of photons from the whole range of directions really is bang on the money. In the real world photons from a portion of the sky can only reach the CCD within a very narrow angular window. The sky is of course big, so photons will be coming from all directions to form the whole source within light grasp. The flat panel is, on the other hand, just a multitude of omnidirectional light sources in immediate proximity, where light from any corner of the panel can reach any corner of the inside of the OTA - unlike the sky.

When imaging, a nearby star out of view can cast very nasty reflections on an image - out of reach for the flats to cure. I ponder if the flat panel simply is an - albeit proven functional - array of such troublesome light sources, and this poses a bit of a thought problem to me...

My case is not even weak at best, so pardon my waffling...

/Jesper


I merely struggle with the thought experiment that is light from flat panels vs the sky. What I wonder is if the distribution of photons from the whole range of directions really is bang on the money. In the real world photons from a portion of the sky can only reach the CCD within a very narrow angular window. The sky is of course big, so photons will be coming from all directions to form the whole source within light grasp. The flat panel is, on the other hand, just a multitude of omnidirectional light sources in immediate proximity, where light from any corner of the panel can reach any corner of the inside of the OTA - unlike the sky.

Thinking about it this way might help. All you need to consider is that light travels in straight lines. A light 'ray' entering the scope from an effectively infinite distance (the sky) will travel through the scope on exactly the same path as a light ray that originates from a flat-field source that is just in front of the scope or even right on top of it. There is no line that you can draw from the sky into the scope that you cannot draw from the surface of a flat field source just in front of it. The converse is also true: there is no path that light from a flat field source can take into the scope that cannot also be taken by light from the sky.

Clearly there is only a small angle through which rays can directly travel through the optics to focus on the camera chip. Regardless of the fact that one source of light is a set of objects at infinite distance and the other is right in front of the scope, both sets of rays follow the same paths. (You can't form a focussed image of the flat light source because it is too close to the lens or mirror, but we don't care about that, just that an equal number of photons follow all of the paths through the optics to the camera. Actually the fact that the source is out of focus helps, since it will deal with any small scratches, dust or small-scale variations in illumination from the source).

But there are many other rays that could end up on the chip and they all involve reflection off one or more surfaces in the scope/camera combination. Equally there are rays that will never end up on the chip at all since there is no path of reflections that will get them there before they are absorbed by the scope surface or reflected back out of it again. Again the flat-field source is a perfect analogue in that it produces rays that trace all of those paths, both the ones that reflect to the chip and the ones that never will.

The only wrinkle I can think of here is that you do not want the surface of your flat field source to produce specular (shiny) reflections itself, since that would produce reflections that wouldn't otherwise exist when imaging the sky. Using a surface that produces diffuse reflections is the solution to that (I have not seen a flat field EL panel, but I think they are made by sandwiching the EL foil behind a diffuser, and my lightbox certainly has a diffusing surface on both sides of the diffuser.)

The only case where it could be different is if you put the flat field source inside the scope tube, since now you have effectively blocked some of the paths that the light from the sky would take. In this I include any dew shield on the front of the scope. This answers an earlier question about whether you should do the flats with or without the dewshield; the answer is that you should do them however you took the lights so that the flat contains all possible reflection paths that the light does.

When imaging, a nearby star out of view can cast very nasty reflections on an image - out of reach for the flats to cure. I ponder if the flat panel simply is an - albeit proven functional - array of such troublesome light sources, and this poses a bit of a thought problem to me...

My case is not even weak at best, so pardon my waffling...

No, it is helpful to think these things through from all angles (pun intended).

You are absolutely right that flats will not remove non-uniform/directional illumination. That could be from stars or other objects outside the FOV, or indeed stars and objects in the FOV with stray reflected light that makes it to the focal plane, or the moon off to one side illuminating part of the dewshield or tube wall, or the neighbour's security light doing the same, or a general glow from light pollution in the sky.

Conversely, if your flats didn't include light that got to the chip via reflected paths, then flat-fielding would add another layer of non-uniformity to the non-uniform illumination :) So instead of (say) a moon gradient getting brighter from the left of your image towards the right, you'd end up with a moon gradient that did so but had a hole or a bright spot in it due to under or over-correction by the flat.

You might say "who cares" but that applies all the way from gross and obvious sources of non-uniform illumination to really small and subtle ones. The whole calibration process is about peeling away the unwanted layers in your image in a consistent manner until you are left with the best representation of what your camera actually recorded. You don't need to be adding more unwanted variation to the image when your objective is to remove it.

Flats ensure that if there were no directional illumination (whatever the source) then you would end up with a perfectly flat, even sky background in the image. In the real world you will still end up with gradients or reflection artefacts if you have too much stray light bouncing around the scope, whatever the source. (Of course flats also deal with vignetting due to the fact that the optical system does not perfectly convey the light to all parts of the FOV, dust shadows, and variations in the response of each sensor element in the camera.)
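For reference, the division being described is the standard calibration step: subtract the dark, then divide by the master flat scaled to unit mean. A minimal sketch with synthetic data; the vignette map and dust shadow are fabricated stand-ins for a real optical system:

```python
import numpy as np

def apply_flat(light, dark, master_flat, flat_dark):
    """Standard flat-field calibration:
    calibrated = (light - dark) / normalised_flat,
    where the flat is itself dark-subtracted and scaled to unit mean.
    """
    flat = master_flat - flat_dark
    normalised_flat = flat / flat.mean()
    return (light - dark) / normalised_flat

# Synthetic demo: 20% vignetting plus a small dust shadow.
shape = (100, 100)
yy, xx = np.indices(shape)
vignette = 1.0 - 0.2 * (((yy - 50) ** 2 + (xx - 50) ** 2) / 50**2).clip(0, 1)
dust = 1.0 - 0.3 * ((yy - 20) ** 2 + (xx - 70) ** 2 < 25)  # dust doughnut
attenuation = vignette * dust

sky = 1000.0 * attenuation       # uniform sky, seen through the optics
flat = 30000.0 * attenuation     # the flat sees the same attenuation map
calibrated = apply_flat(sky, 0.0, flat, 0.0)
# calibrated is now flat: every pixel has (almost exactly) the same value.
```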

To put it another way: if you make a flat, and then make another flat and use it to flatten the first flat, you should end up with an image where every pixel has the same value. (That doesn't happen in the real world, due to the different noise sources, but if you had a perfect flat it would.) You still have to process out the gradients some other way, and more specific reflection artefacts are pretty hard to get rid of unless you clone them out in PS, which is why everyone spends so much time on baffles, flocking and anti-reflection coatings on lenses, and also why the moon is such a bane to broadband imagers.
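That flat-divided-by-flat sanity check is easy to simulate: with noise-free flats the result is perfectly uniform, and with realistic noise it is uniform only to within the combined noise of the two frames. All the numbers here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
yy, xx = np.indices((100, 100))
vignette = 1.0 - 0.3 * (((yy - 50) ** 2 + (xx - 50) ** 2) / 50**2)

def take_flat(noise_adu):
    """One simulated flat: vignetting pattern plus random noise."""
    return 30000.0 * vignette + rng.normal(0, noise_adu, vignette.shape)

flat_a = take_flat(0.0)
flat_b = take_flat(0.0)
perfect = flat_a / (flat_b / flat_b.mean())   # noise-free: exactly uniform

noisy_a = take_flat(150.0)
noisy_b = take_flat(150.0)
real = noisy_a / (noisy_b / noisy_b.mean())   # real world: uniform + noise

print("perfect spread:", perfect.max() - perfect.min())
print("noisy spread:  ", real.std())
```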

Again, back to the original point, if your flat doesn't represent the way light travels through your scope/camera, then you will introduce unwanted artefacts.


Thanks Ian for your time to write down these answers.

One day the penny might drop at my end. Going back to my out-of-sight star that causes an unwanted reflection though: how come the flat panel doesn't introduce these unwanted reflections, given that it sends so much light at all angles? Is it simply that it actually does, but does so so uniformly that there only remain the slight variations caused by the actual optics?

From my experience I have come to the conclusion that not all sets of flats are the same. Some exposure/brightness settings just don't work for me, and I generally go 'low and slow' - meaning brightness of the panel and exposure time - to get a good flat. If I go 'high and fast' it generally derails entirely. Is this then glare from the panel material playing a part, as you suggest - I can certainly see how this can happen - or is it a tiny extra portion of IR that's allowed to have an influence? Or is it simply entering domains that the CCD isn't designed to handle? (It's perhaps like imaging the moon with my OSC - it simply can't be done regardless of exposure time.)

I agree that the dew shield should be in place. I find them useful for more than dew! And indeed OS states that the dew shield of their Veloce series of fast astrographs forms a part of the optical system - despite being a mere barrel.

/Jesper


Some good and interesting stuff here, great thread.

Jesper, with my truss scope, I have a camping mat dewshield round it all the time so it's more like a closed tube as far as stray light is concerned. I have too many light sources near my obs so as well as keeping dew under control, I have to have some sort of light shroud.

The point about slower, dimmer flats is similar to something I've found too. I tend to shoot quite low ADU flats (8-15k) and find that these work much better than ones with a higher ADU.

In practice, I'm more concerned about my flats removing dust artifacts etc. than gradients, as there are several other ways to remove gradients, although flats are obviously the best. Dust bunnies are much more of an issue though, especially with a heavily stretched image.

Cheers

Rob


Going back to my out of sight star that causes an unwanted reflection though, how come the flat panel doesn't introduce these unwanted reflections given that it sends so much light at all angles. Is it simply so that it actually does, but does it so uniformly, that there only remains the slight variations caused by the actual optics?

Yes, the flat panel will produce those reflections, which is what you want. No, that doesn't just leave the optical etc. variations in the flat; it also leaves the variations caused by the reflections. You need to be dividing apples by apples when you flatten, and if you managed to leave out (some component of) the reflections (e.g. by having a light source that lacks IR, as per the original article), you end up dividing your apples by doughnuts - quite literally in the examples we have seen.

From my experience I have come to the conclusion that not all sets of flats are the same. Some exposure/brightness settings just don't work for me, and I generally go 'low and slow' - meaning brightness of the panel and exposure time - to get a good flat. If I go 'high and fast' it generally derails entirely. Is this then glare from the panel material playing a part as you suggest - I can certainly see how this can happen, or is it a tiny portion extra of IR that's allowed to have an influence? Or is it simply entering domains that the CCD isn't designed to handle. (It's perhaps like imaging the moon with my OSC - it simply can't be done regardless of exposure time).

Yes, most discussions of flats report that not all of them work and that you need to get the correct brightness level in the flat for them to work well. I'm not sure why, but I dug out my HAIP last night and I'm going to read that first and then look a bit more at the maths, as I suspect slightly different algorithms may be used by different packages; really it should be a simple enough formula, and if properly implemented I can't see why the actual brightness of the flats is so critical (it must be things to do with normalisation, dynamic range of image vs. flat, etc.).

Noise is a significant issue but that is easier to understand. You just need to treat flats like darks and lights: take plenty and stack them. Opinions vary about the need for things like flat darks; the standard texts say to use them, but PixInsight just rescales the normal darks by trying to match noise levels in the dark and the light or flat. Using a DSLR it doesn't work at all for me, as it never finds a correlation - probably due to the funky processing on camera making dark current non-linear and screwing up their approach.
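The "take plenty and stack them" advice follows from uncorrelated noise averaging down as roughly 1/sqrt(N). A quick numerical illustration; the noise level and ADU values are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
noise_adu = 20.0    # per-frame random noise, illustrative
n_frames = 25

# One noisy flat vs the mean of 25: noise should drop by about 5x.
single = rng.normal(30000.0, noise_adu, (200, 200))
stack = np.mean(
    [rng.normal(30000.0, noise_adu, (200, 200)) for _ in range(n_frames)],
    axis=0,
)
print(f"single-frame noise: {single.std():.1f} ADU")
print(f"stack-of-{n_frames} noise: {stack.std():.1f} ADU")
```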


..... I can't see why the actual brightness of the flats is so critical (must be things to do with normalisation, dynamic range of image vs. flat, etc.) ...

One reason is to ensure that the camera is operating comfortably within the linear part of its response curve, avoiding the non-linear region at each end of the dynamic range. I always assumed that was the only reason but perhaps there's more to it than that.

Adrian
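One way to find the linear zone being described here is to shoot the panel over a range of exposure times and flag where the mean ADU departs from a straight line fitted to the short exposures. This sketch uses a fabricated sensor response (a linear region with a knee near full well); real data would come from actual flat exposures:

```python
import numpy as np

def mean_adu(exposure_s, gain=10000.0, knee=50000.0, full_well=65535.0):
    """Simulated response: linear up to a knee, compressing toward full well."""
    signal = gain * exposure_s
    return np.where(
        signal < knee,
        signal,
        knee + (full_well - knee)
             * (1.0 - np.exp(-(signal - knee) / (full_well - knee))),
    )

exposures = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
means = mean_adu(exposures)

# Fit a line through the shortest exposure and flag where the sensor
# departs from it by more than 2%: that's the edge of the linear zone.
slope = means[0] / exposures[0]
deviation = np.abs(means - slope * exposures) / (slope * exposures)
for t, m, d in zip(exposures, means, deviation):
    flag = "linear" if d < 0.02 else "NON-LINEAR"
    print(f"{t:4.1f}s  {m:8.0f} ADU  {flag}")
```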


One reason is to ensure that the camera is operating comfortably within the linear part of its response curve, avoiding the non-linear region at each end of the dynamic range. I always assumed that was the only reason but perhaps there's more to it than that.

Yes, that's what I find puzzling: most CCDs have a pretty wide linear response range, so the actual exposure shouldn't matter much if you get within that range, but people seem to report problems even when they should be in the comfort zone. On the other hand, it's not so surprising that DSLR users are stuck with a bit more trial and error, since the relationship between exposure and the numbers that you get from the camera is definitely more murky.


Yes, if people are finding that a moderate change in brightness of flats (within the linear zone) DOES make a difference, there must be some other non-linear behaviour going on somewhere.

Add flat-fielding to auto-guiding on the list of things more art than science :undecided:

Adrian


  • 1 month later...

As an update to my previous post about having problems with my light box not giving out enough red, I built a new one using separate red, green and blue LEDs and took great care over the data sheets to try to match the intensity of each using resistors and mini-potentiometers.

I still haven't managed to equalise the RGB channels with my DSLR (I suspect the issue is somewhere between the wide margin of error the LEDs have and the differing sensitivities of the three filters on the chip). Even so, I now have red, green and blue peaks that are away from the ends of the histogram and the results are a lot better (far less noise in the red channel). I still have to colour balance after calibration and stacking, but that is not a problem.
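For anyone attempting the same matching exercise, the series resistor for each LED channel follows from Ohm's law: R = (V_supply - V_forward) / I. The forward voltages and currents below are illustrative placeholders, not values from any particular datasheet:

```python
# Series-resistor calculation for balancing LED channels in a lightbox.
def series_resistor(v_supply, v_forward, current_a):
    """Ohm's law: the resistor drops the excess supply voltage at the chosen current."""
    return (v_supply - v_forward) / current_a

V_SUPPLY = 12.0
leds = {
    "red":   {"v_f": 2.0, "i": 0.020},   # red LEDs have a lower forward voltage
    "green": {"v_f": 3.2, "i": 0.010},
    "blue":  {"v_f": 3.3, "i": 0.005},   # throttle the efficient blue channel
}
for name, spec in leds.items():
    r = series_resistor(V_SUPPLY, spec["v_f"], spec["i"])
    print(f"{name}: {r:.0f} ohm")
```

Driving the blue channel at a lower current (larger resistor) and the red at a higher one is the crude electrical version of what the potentiometers would let you tune by eye.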

