
50 Shades of Grey


Thalestris24

Recommended Posts

... or gray :)
As an imager who obsesses over things like bit depth, detail, resolution, colour balance etc., I was wondering how many shades of grey I could distinguish. I know my monitor can display 2^8 = 256 levels, but that doesn't mean I can actually see them all as different (my age + early-stage cataracts probably don't help...). Apparently the normal human eye can only discern between 30 and 50 shades of grey. Anyway, I had a go at an online (but non-scientific) test and it said I could distinguish 37. I found it really difficult to decide whether two close shades were actually different or not. See how you fare! Here's the link:

http://time.com/4663496/can-you-actually-see-50-different-shades-of-grey/

So, for monochrome imaging, even 12-bit images (4096 levels) should give very good reproduction, and 16-bit images (65536 levels) may actually be overkill? I can still only distinguish a mere 37 levels! But combining false-colour narrowband images probably helps to bring out details that otherwise might not be so apparent.

Just musing: according to various sources, the trichromatic eye can distinguish between 1 and 10 million different colours (about 100 distinguishable shades for each of the 3 cone types gives 100^3 = 1 million), perhaps 7 million on average. But I don't think that means colour astro imaging is better! I don't think there are very many different colours out there in space from a telescope + camera point of view. On the other hand, when processing and viewing a colour image we can dynamically alter the colour relationships, so maybe that gives us, in principle, a way of seeing more levels via colour than in mono? I'll have to think about that one...

Louise


I can do 42.

But I would like to see that same test done a bit differently - the whole monitor being grey and the blocks pure grey, so no black or white borders. These introduce a bias depending on how much of the surroundings is dark or light - our eye/brain adapts to that, and we perceive a shade as brighter or darker than it really is.

Check out this video (optical illusion on gray):

 


45 minutes ago, Thalestris24 said:

So, for monochrome imaging, even 12-bit images (4096 levels) should give very good reproduction, and 16-bit images (65536 levels) may actually be overkill?

Don't forget there can be quite a difference between the effective bit depth and the bit depth of the AD converter. 

This is from Christian Buil

"a remark about CMOS and CCD sensors compared dynamic. For the very popular CCD camera ATIK 460EX, the measured electronic gain is 0.274 e-/ADU, the RON is 5.5 e- and digital coding is on 16 bits, so the true dynamic is 11.67 bits… CMOS technologie is not so ridiculous !"

The additional read noise eats up the extra bits!
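Buil's figure is easy to reproduce: the usable dynamic range is the full-well charge (gain times ADC range) divided by the read noise, and its base-2 log gives the effective bit depth. A minimal sketch, using the ATIK 460EX numbers quoted above:

```python
import math

# Effective dynamic range from Christian Buil's quoted figures for the
# ATIK 460EX (values taken from the quote above).
gain_e_per_adu = 0.274   # electronic gain, e-/ADU
read_noise_e = 5.5       # read noise (RON), e-
adc_bits = 16            # digitisation depth of the AD converter

full_well_e = gain_e_per_adu * (2 ** adc_bits)   # e- at ADC saturation
dynamic_range = full_well_e / read_noise_e        # distinct usable levels
effective_bits = math.log2(dynamic_range)

print(f"Full well ~ {full_well_e:.0f} e-")
print(f"Effective bit depth ~ {effective_bits:.2f} bits")  # ~11.67 bits
```

So despite the 16-bit converter, the read noise leaves only about 11.67 usable bits.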

Regards Andrew

 


13 minutes ago, vlaiv said:

I can do 42.

But I would like to see that same test done a bit differently - the whole monitor being grey and the blocks pure grey, so no black or white borders. These introduce a bias depending on how much of the surroundings is dark or light - our eye/brain adapts to that, and we perceive a shade as brighter or darker than it really is.

Check out this video (optical illusion on gray):

 

Yeah, there are lots of examples of optical illusions out there!

The Time.com test does say it's not scientific and what you can actually see can be influenced by many different factors. Still, it gives one an idea.


Hi Louise

The eye and brain can be satisfied with quite poor quality.

For instance, 300 dpi is good enough for a 6x4 print - that's 1800x1200 pixels.

So why do we buy 20 Mp cameras?

And 8-bit is fine for a display.

But for astro work, a heavy gamma stretch of an 8-bit DSO image would not do, hence the need for 12-, 14-, or 16-bit cameras.
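To illustrate the point (a rough sketch, with an assumed gamma of 0.25 standing in for a "heavy stretch"): the stretch expands the darkest tones so much that adjacent 8-bit input levels land far apart on the display, i.e. visible posterisation, whereas 16-bit data has plenty of intermediate levels to fill the gaps.

```python
# Why heavy stretches need more than 8 bits: a gamma-0.25 stretch leaves
# big gaps between adjacent dark input levels when there are only 256 of
# them, but much finer steps when there are 65536.

def stretched(v, levels, gamma=0.25):
    """Map an integer input level to an 8-bit display value after a gamma stretch."""
    return round(255 * (v / (levels - 1)) ** gamma)

# The first few dark levels from an 8-bit vs a 16-bit camera:
print([stretched(v, 256) for v in range(4)])      # big jumps between levels
print([stretched(v, 65536) for v in range(4)])    # much finer steps
```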

Michael


23 minutes ago, michael8554 said:

Hi Louise

The eye and brain can be satisfied with quite poor quality.

For instance, 300 dpi is good enough for a 6x4 print - that's 1800x1200 pixels.

So why do we buy 20 Mp cameras?

And 8-bit is fine for a display.

But for astro work, a heavy gamma stretch of an 8-bit DSO image would not do, hence the need for 12-, 14-, or 16-bit cameras.

Michael

Indeed, the stretching is the thing! However, people still produce nice images from 12 bits, e.g. DSLRs and ZWO/QHY 4/3 CMOS cameras and the like. Screens are typically only ~96 dpi, but that's how we view our astro images. I've never actually printed out an astro image, even though my printer is theoretically capable of 1200 dpi. I suspect a printed image wouldn't be as satisfying or have the same sort of quality. Maybe I'll try it one day, even though I wouldn't know what to do with a print!

Louise


On 12/06/2018 at 01:01, Thalestris24 said:

Indeed, the stretching is the thing! However, people still produce nice images from 12 bits, e.g. DSLRs and ZWO/QHY 4/3 CMOS cameras and the like.

Yes, but the combination of subexposures and the use of floating-point values can increase the effective bit depth substantially over the source's 12 bits. The real need for an extended number of grey levels, well beyond what we can discern, is the stretching done in image processing - making widely disparate intrinsic brightnesses visible in the same image.
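A quick sketch of that first point, with made-up numbers: if the true signal sits between two ADU levels, read noise dithers each quantised sub around it, and a floating-point average of many subs recovers the fractional value that no single 12-bit frame can represent.

```python
import random

# Averaging many quantised subexposures in floating point recovers
# intermediate levels a single frame cannot represent. Signal and noise
# values here are illustrative assumptions.
random.seed(1)
true_signal = 100.3          # hypothetical flux in ADU (not an integer!)
read_noise = 2.0             # ADU, assumed Gaussian

def one_sub():
    # quantise to integer ADU, as a 12-bit converter would
    return round(true_signal + random.gauss(0, read_noise))

stack = [one_sub() for _ in range(10000)]
mean = sum(stack) / len(stack)
print(f"single sub: {one_sub()}  stacked mean: {mean:.2f}")  # mean ~ 100.3
```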

Surprised to score 41 on a dodgy mobile screen!


4 hours ago, coatesg said:

Yes, but the combination of subexposures and the use of floating-point values can increase the effective bit depth substantially over the source's 12 bits. The real need for an extended number of grey levels, well beyond what we can discern, is the stretching done in image processing - making widely disparate intrinsic brightnesses visible in the same image.

Surprised to score 41 on a dodgy mobile screen!

Yes, of course you can increase the S/N by calibration and stacking, and you can selectively enhance levels by stretching and adjusting. My train of thought, though, was that one can still only ever see a limited number of greyscale levels. A monitor can reproduce many more colours than greyscale levels, and the eye can discern a lot more RGB colour shades too. It follows that a pseudocoloured greyscale image (as opposed to a false-colour one) could, in principle, show a lot more distinct levels than the original greyscale one (the resolution would stay the same). You'd need software to create the pseudocolours from the greyscale levels, but I don't think that would be difficult to do.

It's interesting to flip between a full-colour image and a greyscale version. If you set it up under Windows 10, Ctrl+Win+C switches between a colour display and a monochrome greyscale one. If, like me, you watch TV on your PC, you can go retro and watch in b+w! Nostalgic :) If I can get into it, I'll see if I can write something that will pseudocolour an image according to greyscale value. I just have to be in the mood...
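The mapping itself really is simple - a minimal sketch, assuming a plain hue ramp as the pseudocolour scheme (the ramp direction and 0.7 hue span are arbitrary choices, not anything standard):

```python
import colorsys  # stdlib HSV <-> RGB conversions

def pseudocolour(level, levels=256):
    """Map a grey level (0..levels-1) to an 8-bit RGB triple along a hue ramp,
    so nearby grey levels that look identical get visibly different hues."""
    hue = 0.7 * (1 - level / (levels - 1))   # blue/violet (dark) -> red (bright)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (round(r * 255), round(g * 255), round(b * 255))

# darkest, middle and brightest grey levels get clearly distinct colours:
print(pseudocolour(0), pseudocolour(128), pseudocolour(255))
```

Applying `pseudocolour` per pixel to a greyscale image would give the flip-between view described above.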

Louise


