
Rainbow colours 🌈



I second the above comment - it sounds interesting!

I would guess that beyond the red part we'd see some purple. There is a lot of NIR floating around in daylight, however, so it may be hard to discern from the background purple cast that NIR induces in full-spectrum DSLRs.

For UV I have a harder time guessing. I think our CMOS sensors are typically only sensitive about 50-100 nm below the visible spectrum on that end, because the dropoff is harsher there.


IR and UV would be visible.

None of the recorded colors would look like the original.

When shooting pure spectral colors, in principle we can record them (it depends on the camera gamut) - but in all likelihood we lack a display capable of showing them.

Take a look at this graph:

[Image: CIE 1931 xy chromaticity diagram showing the sRGB gamut triangle (D65 white point)]

It is something called an xy chromaticity diagram, and in this particular case it shows the sRGB color space (what computer screens can usually show) versus the whole gamut of human vision.

The outer border of this shape is where the spectral colors lie (in fact they are labeled from 380 to 700, the wavelength in nanometers of each spectral color).

Any screen that we produce and use to display colors is usually made with three primaries (we often call them R, G and B). Each of the colors our screen uses to mix the resulting color will be a dot on the above graph.

A display is capable of showing only the colors inside the triangle formed by those three dots (or, if we used more primaries, only those inside the convex hull made by connecting the dots).

This shows that we can never render more than three spectral colors with a trichromatic display (and only then if the primaries are themselves spectral colors).
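
Here is a minimal Python sketch of that idea, using the standard sRGB primary chromaticities (published values, not read off the graph above) to test whether a given xy point falls inside the sRGB triangle:

```python
# A minimal sketch: test whether an xy chromaticity falls inside the
# sRGB triangle drawn on the CIE 1931 xy diagram.
import numpy as np

# sRGB primaries (CIE 1931 xy chromaticities, D65 white point)
R = np.array([0.64, 0.33])
G = np.array([0.30, 0.60])
B = np.array([0.15, 0.06])

def inside_srgb_triangle(xy):
    """Return True if chromaticity xy lies inside the sRGB gamut triangle."""
    xy = np.asarray(xy, dtype=float)
    # Solve xy = R + s*(G - R) + t*(B - R); inside iff s >= 0, t >= 0, s + t <= 1
    M = np.column_stack((G - R, B - R))
    s, t = np.linalg.solve(M, xy - R)
    return s >= 0 and t >= 0 and (s + t) <= 1

print(inside_srgb_triangle([0.3127, 0.3290]))  # D65 white point -> True
print(inside_srgb_triangle([0.17, 0.80]))      # saturated green outside sRGB -> False
```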

Look at this diagram here:

[Image: xy chromaticity diagram comparing the gamuts of several color spaces]

It is the same as above, but showing different color spaces. You can note certain things about some of them.

1. A color space does not need to have only three components. CMYK, for example, has four, and this in turn means its gamut won't be a triangle (it has more vertices because of the defined primaries). Don't be confused into thinking the number of colorants must equal the number of vertices - putting ink on paper is not the same as emitting light; it is the number of light-emitting primaries that equals the number of vertices.

2. ProPhoto RGB is a "virtual" color space - no display in the world can be made to show the full ProPhoto RGB gamut. Two of the three primaries it uses do not exist in nature (they lie outside the gamut of human vision). It is a purely mathematical color space.

RGB color spaces use different primaries (different blues, greens and reds), so you can't always simply treat R, G and B as interchangeable - you need to know which color space you are working with. This is the reason people get different colors for objects in astrophotography: they don't pay attention to color spaces (especially their camera's raw space and the conversion to a standard color space).
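
A small sketch of why the color space matters, using the well-known linear sRGB to XYZ (D65) matrix; a camera's raw space would need its own matrix, which is not shown here:

```python
# A minimal sketch: the meaning of an (R, G, B) triplet depends on the
# color space's primaries. For linear sRGB the RGB -> XYZ (D65) matrix is
# standard; its columns are the XYZ coordinates of the three primaries,
# which is where their dots sit on the xy diagram.
import numpy as np

SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def xy_chromaticity(xyz):
    """Project an XYZ tristimulus value onto the CIE xy chromaticity plane."""
    X, Y, Z = xyz
    return X / (X + Y + Z), Y / (X + Y + Z)

# Pure "red" in linear sRGB ...
xyz_red = SRGB_TO_XYZ @ np.array([1.0, 0.0, 0.0])
print(xy_chromaticity(xyz_red))   # ~ (0.64, 0.33) - the sRGB red primary

# ... but the same (1, 0, 0) triplet in a camera's raw space lands wherever
# that camera's "red" primary happens to be, so a different matrix is needed.
```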

Some cameras can't even record spectral colors - they have a narrower gamut.

In order to uniquely record each spectral color, the following must hold:

[Image: camera QE response curves with the R, G and B sensitivities at 500 nm marked]

When you look at the QE response for each color, then for every X coordinate (wavelength in nm) you must have a unique triplet of RGB values - I marked the triplet of RGB sensitivities for 500 nm above.

This triplet must be unique in order to uniquely record this spectral color.

We can see in the above image that we can't distinguish colors above about 900 nm - the sensitivities are the same, so each triplet is effectively 1:1:1. We can't know whether we recorded 900 nm, 950 nm or 1000 nm if we see a pixel with 1:1:1 values (or 0.2:0.2:0.2, as the actual values depend on exposure length, or on the brightest pixel in the image if everything is scaled to 0-1).
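
A tiny sketch of why 1:1:1 and 0.2:0.2:0.2 are the same thing once exposure is factored out - normalizing the triplet by its sum removes the exposure scaling:

```python
# A minimal sketch: pixel values scale with exposure, so only the *ratios*
# within an RGB triplet carry wavelength information. Normalising by the
# sum makes 1:1:1 and 0.2:0.2:0.2 identical.
import numpy as np

def normalise(rgb):
    rgb = np.asarray(rgb, dtype=float)
    return rgb / rgb.sum()

print(normalise([1.0, 1.0, 1.0]))   # [0.333 0.333 0.333]
print(normalise([0.2, 0.2, 0.2]))   # [0.333 0.333 0.333] - indistinguishable
print(normalise([0.8, 0.3, 0.1]))   # a distinct ratio -> a distinguishable color
```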

On the other hand, mono + RGB is notoriously poor at recording pure spectral colors - look at this:

[Image: spectral response of a mono sensor combined with RGB interference filters]

Take any spectral color in, say, the 600-700 nm range. It will only record a red component and no blue or green, so any pixel value will be of the form X:0:0, where X is an arbitrary number (it depends on exposure length).

You can't distinguish any spectral color in that range. The same is true for most of the G and B ranges as well.
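
Here is a rough sketch of that degeneracy with made-up response numbers (the real curves from the plot are not reproduced here); it flags wavelengths whose normalized triplets are identical and therefore indistinguishable:

```python
# A minimal sketch with hypothetical response curves: for each wavelength,
# form the normalised R:G:B triplet and check whether any two wavelengths
# share a triplet - those wavelengths cannot be told apart by the camera.
import numpy as np

wavelengths = np.arange(580, 701, 20)          # nm, hypothetical grid
# Hypothetical mono + RGB-filter responses: beyond ~600 nm only "red" responds
red   = np.array([0.6, 0.9, 0.9, 0.85, 0.8, 0.75, 0.7])
green = np.array([0.3, 0.0, 0.0, 0.0,  0.0, 0.0,  0.0])
blue  = np.array([0.0, 0.0, 0.0, 0.0,  0.0, 0.0,  0.0])

triplets = np.stack([red, green, blue], axis=1)
triplets = triplets / triplets.sum(axis=1, keepdims=True)   # exposure-independent

for i in range(len(wavelengths)):
    for j in range(i + 1, len(wavelengths)):
        if np.allclose(triplets[i], triplets[j], atol=1e-3):
            print(f"{wavelengths[i]} nm and {wavelengths[j]} nm give the same "
                  f"triplet {np.round(triplets[i], 3)} - indistinguishable")
```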

Bottom line:

- UV and IR will be recorded

- No display will ever show the image correctly (until someone makes a display that can show individual spectral colors instead of working with primaries)

- Not all cameras are even capable of recording / distinguishing spectral colors (they can record, i.e. count photons, but can't make a distinction between different wavelengths).


It is an interesting question, but even if it is recorded by the sensor your eyes won't see it anyway unless we map the UV/IR to a colour range our eyes can see. I know my eyes can't see in the infrared range nor the ultraviolet - maybe some folks are more fortunate, or unfortunate, depending on how you look at it! I don't believe eyes are like ears - my son can clearly hear bats; I just hear tinnitus :(


31 minutes ago, Adreneline said:

“depending on how you look at it! I don't believe eyes are like ears - my son can clearly hear bats - I just hear tinnitus :( “

Always amuses me when some people say “you see what I’m saying?” Nope, but I heard it. 😁

chaz

 


3 minutes ago, DaveS said:

Have a look at these filters from Astronomik; you'll have to scroll down to see the curves.

Don't know if anyone uses them though.

Actually, as far as color gamut is concerned, the best filters to combine with mono are absorption filters rather than interference filters.

Due to lower QE they are not as popular, but they are much cheaper.

[Image: transmission curves of a colour (absorption) filter set]

A good combination would be blue, green and orange. However, blue and green reach only about 70% peak transmission.

The upside of absorption filters is that you lower the probability of nasty reflections :D (if properly implemented with AR coatings, of course).


 

11 hours ago, pipnina said:

 

For UV I have a harder time guessing. I think our CMOS sensors are typically only sensitive about 50-100 nm below the visible spectrum on that end, because the dropoff is harsher there.

The UV-end sensitivity limit can depend largely on the rest of the optics, particularly the antireflective coatings, which are designed for visible wavelengths. If you pick the right camera sensor and optics, though, it is possible to measure down to at least 320 nm, with the ultimate cutoff at ~300 nm due to the ozone layer. See this spectrum taken by Christian Buil with a UVEX spectrograph, for example:

http://www.astrosurf.com/buil/UVEX_project_us/images/image-collee-810.jpg

At the IR end an unfiltered camera can potentially reach to ~1200 nm (with holes due to absorption by oxygen and water vapour), but with colour cameras the spectrum tends to look white beyond ~750 nm, as all three colour filters become transparent there - like this "rainbow" of a cool star (at the top) taken by Christian Buil with an unfiltered DSLR, for example:

http://www.astrosurf.com/buil/staranalyser/wr5_2.jpg

When photographing the full spectrum of a rainbow, though, I suspect the limiting factor will be the low contrast of the rainbow relative to the sky background as the sensitivity of the camera drops off in the UV and IR. I too would be interested to see an example! (Rainbows taken with UV- and IR-pass filters could be interesting.)

Cheers

Robin


22 hours ago, vlaiv said:

Actually, as far as color gamut is concerned, the best filters to combine with mono are absorption filters rather than interference filters.

Due to lower QE they are not as popular, but they are much cheaper.

[Image: transmission curves of a colour (absorption) filter set]

A good combination would be blue, green and orange. However, blue and green reach only about 70% peak transmission.

The upside of absorption filters is that you lower the probability of nasty reflections :D (if properly implemented with AR coatings, of course).

Astrodon do a set of Johnson-Cousins photometric filters that might fit the bill. Baader also do a set, using a combination of coloured glass and dielectric coating.


2 minutes ago, DaveS said:

Astrodon do a set of Johnson-Cousins photometric filters that might fit the bill. Baader also do a set, using a combination of coloured glass and dielectric coating.

If we are talking color reproduction, I'd probably choose the following combination:

Wratten #80A:

[Image: transmission curve of the Wratten #80A filter]

#8:

[Image: transmission curve of the Wratten #8 filter]

And a regular UV/IR cut filter.

In fact I would use the UV/IR cut all the time, and use three channels: no filter, #80A and #8.

These are all very affordable and have very high total transmission. They separate the spectrum well enough that we can get proper colors with a color correction matrix.
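
As a sketch of that last step, here is how a color correction matrix is applied to camera RGB; the matrix values below are purely illustrative, since a real CCM has to be derived for the actual camera + filter combination:

```python
# A minimal sketch: applying a colour correction matrix (CCM) to camera RGB.
# The matrix below is invented for illustration only.
import numpy as np

CCM_EXAMPLE = np.array([      # hypothetical camera-RGB -> sRGB-like matrix
    [ 1.60, -0.45, -0.15],
    [-0.30,  1.50, -0.20],
    [ 0.05, -0.55,  1.50],
])
# Each row sums to 1, so a neutral camera pixel (g, g, g) stays neutral.

camera_rgb = np.array([0.40, 0.35, 0.30])       # some raw pixel (linear)
corrected = CCM_EXAMPLE @ camera_rgb
print(np.round(corrected, 3))
```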

The only place where they have an issue is around the yellow/orange part (in the above graphs), that is the 570-610 nm range. Luckily, most cameras have a sloped response curve at that point:

[Image: typical colour camera spectral response curves]

and the total per-filter response will be the camera response and the filter response multiplied together.
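
A minimal sketch of that multiplication with made-up curves (not the Wratten or camera data shown above):

```python
# A minimal sketch: the effective response of a channel is the camera's
# spectral response multiplied by the filter's transmission at each
# wavelength; the total signal is the integral of that product.
import numpy as np

wl = np.linspace(400, 700, 301)                         # nm

# Hypothetical camera response: broad hump peaking around 550 nm
camera = np.exp(-0.5 * ((wl - 550.0) / 90.0) ** 2)

# Hypothetical yellow long-pass filter (values invented):
# a smooth step that opens up around 490 nm
filt = 1.0 / (1.0 + np.exp(-(wl - 490.0) / 10.0))

effective = camera * filt                               # per-wavelength response
total_signal = np.trapz(effective, wl)                  # relative channel signal
print(round(total_signal, 1))
```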

 

