
Light units and information



I would like to start a thread that I will contribute to very little - maybe a question here and there. I hope to learn a bit about something that continues to puzzle me.

How much information (how many units of light) does it take to form a complete representation or image that we can photograph or see visually? I.e., does one packet of light, one photon, contain all the information about an object?

Any thoughts are much appreciated, Gerry

I'll be interested in this thread too, so I'm replying in part to follow and in part to say that I'm certain a single photon does not contain the full information about an object.

It would be a good start to define what is full information about an object - and even if there is such a thing :D

 


I'd guess that a single photon can only carry information about the part of an object it was emitted, reflected or refracted by. To get the full "picture" of an object you need to collect photons emitted/reflected/refracted by all regions of the object, or else risk making assumptions.

Or from another angle, isn't a photon the definition of the smallest unit of information? You need a lot of information to build a picture!


Some background to start off with. The best theory of light and matter is called Quantum Electrodynamics (QED), a subset of Quantum Field Theory.

I am not an expert on this topic but I have spent many happy hours trying to get a grasp of it. The unfortunate thing is that almost everything you have read about it in popular science books and articles is wrong! This is even true when written by eminent scientists. About the best introduction is "QED: The Strange Theory of Light and Matter" by Richard Feynman, but even this can give you the wrong impression about what is physically happening rather than how things are calculated.

The key thing about QED is that it is a field theory. It is about how the electromagnetic field behaves and interacts with matter. It is a wave theory, and particles are secondary.

A photon is an excitation of the EM field and can be created (emission) or annihilated (absorption). However, apart from these two events photons are hard to pin down.

Photons don't have a position operator, so you can't even ask where they are (only where they are created or annihilated). The concept of a light beam as a stream of photons is completely wrong. Most states of the electromagnetic field don't have a well defined number operator, so you can't even ask how many photons there are! Most of the time the excitations of the EM field are delocalised. It's best to think of photons only at the acts of creation and annihilation.

So the OP's question is difficult to address directly. But, to try anyway, let's take the example of the image of a narrow slit illuminated from behind, with the image captured on a screen (or CCD etc.). If we reduce the intensity of the beam so there is only one photon's worth of energy in the beam at a time, the image on the screen will be built up one detection at a time, as each photon is annihilated at some spot. If you go on long enough you will see a well-formed diffraction pattern emerge (the image of the slit). So it comes down to how well defined you want the pattern to be before it is clear what it is - in other words, what signal-to-noise ratio is acceptable in the image.
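Here is a quick illustrative sketch of that build-up (my own toy example, not anything from the post: it treats the Fraunhofer sinc^2 pattern of a single slit as the probability density for where each photon lands, with made-up units and photon counts):

```python
# Illustrative sketch: build up a single-slit "image" one detection at a time.
# The normalised sinc^2 intensity profile is used as the probability of a
# photon being annihilated in each screen bin. Counts are Poisson, so the
# signal-to-noise ratio at the peak grows as sqrt(counts).
import numpy as np

rng = np.random.default_rng(42)

x = np.linspace(-10, 10, 400)        # screen coordinate, arbitrary units
intensity = np.sinc(x) ** 2          # np.sinc(x) = sin(pi*x)/(pi*x)
pdf = intensity / intensity.sum()    # detection probability per bin

for n_photons in (100, 10_000, 1_000_000):
    hits = rng.choice(len(x), size=n_photons, p=pdf)  # one bin per photon
    image = np.bincount(hits, minlength=len(x))
    peak = image.max()
    print(f"{n_photons:>9} photons: peak bin = {peak}, "
          f"peak SNR ~ sqrt(peak) = {np.sqrt(peak):.0f}")
```

With 100 detections the pattern is barely recognisable; by a million the central peak and side lobes are unmistakable, and the SNR at the peak has grown by a factor of 100.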

That's my mind dump for now. Not sure if it helps or if it is the type of discussion you wanted, @jetstream.

Regards Andrew 


Looks like someone is (as ever?) researching this sort of thing:
https://medium.com/the-physics-arxiv-blog/researchers-take-low-light-images-using-less-than-one-photon-per-pixel-22f03c391235
And the nature of the object and the (subjective) quality of the image do play a role...

The idea of *triggering* detectors to reduce noise is often used.

But then I only typed in: "How many photons constitute an image".
"Search" is quite good re. random questions in plain language? 🤡
 


Thanks for the link @Macavity, a very interesting paper. I scanned the arXiv paper and it is a fascinating read. Not too technical.

Given the need to illuminate the target fully with entangled photons, it's not applicable to astronomy just yet, a pity.

Regards Andrew 


Due to the random (Poisson) nature of photon arrivals, the signal-to-noise ratio goes as sqrt(N), so to measure the brightness (flux) from the object to say 1% you need 100^2 = 10^4 photons.

To resolve the object into say 100x100 spatial points you then need to do this for each of the 10^4 points, so you need a total of 10^8 photons.

If we also want to know the spectral distribution, i.e. the distribution of energies emitted, we then need to know this for each wavelength. If we resolve this from say 4000 to 7000 Angstroms (visible light) at 1 Angstrom intervals, we need to do it 3000 times over, so our total photon count is now 3x10^11.

This is for one instant in time. If you want to know how the object changes with time, you have to repeat all of this for each time interval.

Note this assumes perfect, noise-free detectors and no interfering photons from other sources.
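To restate that arithmetic as a throwaway calculation (nothing assumed beyond the numbers above):

```python
# Restating the photon budget above: shot noise means measuring a flux to
# fractional precision p takes N = (1/p)^2 photons, since SNR = sqrt(N).
precision = 0.01                           # 1% flux measurement
photons_per_sample = (1 / precision) ** 2  # 1e4

spatial_points = 100 * 100                 # 100 x 100 resolution elements
spectral_bins = 7000 - 4000                # 1 Angstrom bins from 4000-7000 A

total = photons_per_sample * spatial_points * spectral_bins
print(f"{total:.1e} photons per time sample")   # 3.0e+11
```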

Robin


Then of course this only tells you what the object looks like from a particular viewpoint.

To know everything about the light from the object to the highest precision you would have to collect all the light. Anything else is just an approximation.

(Anyone else watch DEVS  ? 😉  )


5 hours ago, andrew s said:

Not sure if it helps or if it is the type of discussion you wanted

Oh yes, I like to hear ideas on this. I have a dog-eared copy of QED and an even more dog-eared brain lol!

As the light travels through space, near and around objects with gravity, and finally reaches us, is it a true representation of the object at the instant in time we imaged it?

 


1 hour ago, robin_astro said:

Then of course this only tells you what the object looks like from a particular viewpoint.

Is it possible or likely that an object could look different from a slightly different viewpoint? Can gravity have an effect on what we see or image, with respect to the light wave stream and photons from an object? If so, would this be static or an ever-changing dynamic?


3 hours ago, Macavity said:

Looks like someone is (as ever?) researching this sort of thing:
https://medium.com/the-physics-arxiv-blog/researchers-take-low-light-images-using-less-than-one-photon-per-pixel-22f03c391235

Excellent link - thanks Chris. I'm glad I'm not the only one wondering about this!


36 minutes ago, jetstream said:

Is it possible or likely that an object could look different from a slightly different viewpoint? Can gravity have an effect on what we see or image, with respect to the light wave stream and photons from an object? If so, would this be static or an ever-changing dynamic?

I was thinking of something much more prosaic. To completely reconstruct an asymmetric object we need to view it from different angles. For example, from here on Earth we cannot tell what the far side of the Moon looks like. Or with a galaxy, observations from a different viewpoint would reveal more about its internal structure.


3 hours ago, andrew s said:

Thanks for the link @Macavity, a very interesting paper. I scanned the arXiv paper and it is a fascinating read. Not too technical.

Given the need to illuminate the target fully with entangled photons, it's not applicable to astronomy just yet, a pity.

There is little "magic" in this though, as far as I can see. The entangled pairs are merely being used to reduce the background noise so that every photon counts. The image has only a single bit of depth (black or white) and the rest is image processing based on assumptions or a priori knowledge about the image.
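A hedged toy Monte Carlo of that gating idea (my sketch, with invented rates - not the paper's actual method): events that don't arrive within a short window of a "herald" click get discarded, which rejects almost all of the uncorrelated background while keeping the signal.

```python
# Toy model of heralded/coincidence detection. Signal photons arrive paired
# with a herald click; background counts are uncorrelated in time. Keeping
# only events within a short window of a herald rejects most background.
# All rates and window sizes here are made-up illustrative numbers.
import numpy as np

rng = np.random.default_rng(0)
T, window = 1.0, 1e-6                         # 1 s run, 1 microsecond window

heralds = np.sort(rng.uniform(0, T, 200))     # herald (partner) click times
signal = heralds + rng.normal(0, 1e-9, 200)   # true partners: ~coincident
background = rng.uniform(0, T, 20_000)        # uncorrelated dark counts

def gate(events, heralds, window):
    """Keep events whose nearest herald is closer than `window`."""
    pos = np.searchsorted(heralds, events)
    left = heralds[np.clip(pos - 1, 0, len(heralds) - 1)]
    right = heralds[np.clip(pos, 0, len(heralds) - 1)]
    nearest = np.minimum(np.abs(events - left), np.abs(events - right))
    return events[nearest < window]

print("signal kept:    ", len(gate(signal, heralds, window)))      # ~200
print("background kept:", len(gate(background, heralds, window)))  # ~8
```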

Cheers

Robin


11 hours ago, robin_astro said:

There is little "magic" in this though, as far as I can see. The entangled pairs are merely being used to reduce the background noise so that every photon counts. The image has only a single bit of depth (black or white) and the rest is image processing based on assumptions or a priori knowledge about the image.

Probably a matter of taste, but the experimental setup is pretty magical to me.

I agree the image reconstruction is quite "basic".

Regards Andrew 


12 hours ago, jetstream said:

As the light travels through space, near and around objects with gravity, and finally reaches us, is it a true representation of the object at the instant in time we imaged it?

 

Clearly, gravity distorts our image of objects. Just look at the images of Einstein rings, crosses and other shapes. It can also enhance the apparent luminosity of the object.

We see the object as it was when the light was emitted, not as it is now. Indeed, as the light from the different elements of an Einstein cross (or ring etc.) travels different paths, it was emitted at different times. This allows us to see some events, live, more than once.

Regards Andrew 


12 hours ago, jetstream said:

Is it possible or likely that an object could look different from a slightly different viewpoint? Can gravity have an effect on what we see or image, with respect to the light wave stream and photons from an object? If so, would this be static or an ever-changing dynamic?

A bit like the last reply. Strong gravity acts like a badly made lens. Look at an object through the bottom of a wine glass: you can see significant distortions which shift as you move it. Similarly, as the gravitating objects move, or as our perspective changes, the image shifts and the distortions vary.

Regards Andrew 

PS the effects are due to physics, not the effects of alcohol!


1 hour ago, andrew s said:

PS the effects are due to physics, not the effects of alcohol!

:grin:

1 hour ago, andrew s said:

It can also enhance the apparent luminosity of the object.

I wonder why and how this can happen? Does the lens "shrink" the object's apparent size, and by this make it seem brighter?

So, as gravity warps spacetime, I wonder what happens to the streams of light that are following it? Are there photons in the mix?

Entanglement... isn't this just the bee's knees of puzzlement. I also wonder how much of a light stream, or how many photons, the eye/brain takes to form an image - I would presume that we need a steady stream of light to do it. But then again I think our brain stores images viewed through a telescope and each viewing session builds on the next.

Another scope aperture question after this ^^ lol!


1 minute ago, jetstream said:

I also wonder how much of a light stream, or how many photons, the eye/brain takes to form an image - I would presume that we need a steady stream of light to do it. But then again I think our brain stores images viewed through a telescope and each viewing session builds on the next.

I think it is something like 7 photons that is the threshold to trigger a sensation.

The eye/brain is very interesting in how it processes an image. It is well capable of seeing at very low photon levels, and we know that at these levels there is a lot of noise - but we never see that noise.

This is because our brain has some sort of noise filter.

I only once managed to see the noise associated with light, and that was "cheating" - I took an Ha filter (a Baader 7 nm one) and stood in a relatively dim room while outside it was a bright summer day with plenty of sunshine. I held the filter in front of my eye (not too close) and looked outside. It was a noisy image!

I guess a couple of factors combined to trick the brain into shutting off its denoising. It was bright outside - enough for the pupils to contract. It was rather dim in the room itself (compared to outside) - I guess that prevented too much stray light from interfering. And it was not very dark, so I guess the brain was still in daylight "mode" - expecting plenty of light.


9 minutes ago, vlaiv said:

I think it is something like 7 photons that is the threshold to trigger a sensation.

Is this right Vlaiv? I'm actually shocked at this - excellent info.

 

10 minutes ago, vlaiv said:

This is because our brain has some sort of noise filter.

Ah, you've uncovered part of my desire to understand photons and how many it takes to form an image...! I agree about the filter and want to know more about this. I can say we can wreck our eyes during the day for night use later; not sure how it happens. Polarized LED lighting (my 65" TV) absolutely wrecks my eyes - including the filter.

It is incredibly amazing what I see through a scope when things are right with the eye/brain system. Some physicists say it has to be noise - but it is not, and a few others see this too: very faint background patches (dust etc.?) all over the place, richer in some areas than others.

I would like to understand how to ensure the filter will work, through technique, and also whether it can be enhanced somehow... thoughts?

I also need to know more about aperture and light streams or packets...


21 minutes ago, jetstream said:

Is this right Vlaiv? I'm actually shocked at this - excellent info.

I can say we can wreck our eyes during the day for night use later; not sure how it happens.

Here is some starter info:

http://math.ucr.edu/home/baez/physics/Quantum/see_a_photon.html

Maybe the most important paragraph is the first one:

Quote

The human eye is very sensitive but can we see a single photon?  The answer is that the sensors in the retina can respond to a single photon.  But neural filters only allow a signal to pass to the brain to trigger a conscious response when at least about five to nine arrive within less than 100 ms.  If we could consciously see single photons we would experience too much visual "noise" in very low light, so this filter is a necessary adaptation, not a weakness.
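A toy Poisson calculation shows why such a threshold works (my own sketch - the pooled dark-event rate below is an invented illustrative number, not data from the article):

```python
# Toy model of the ~7-photon neural threshold. Dark events (spontaneous
# rhodopsin isomerisations; rate invented here for illustration) rarely sum
# to 7 within a 100 ms window, so the brain almost never "sees" noise,
# while a dim real flash comfortably crosses the threshold.
import numpy as np

rng = np.random.default_rng(1)
trials, threshold = 100_000, 7

dark_rate = 10.0                              # assumed noise events/s, pooled
dark = rng.poisson(dark_rate * 0.1, trials)   # counts per 100 ms window
flash = rng.poisson(10, trials)               # dim flash, mean 10 photons

print("false 'flash seen' rate:", np.mean(dark >= threshold))   # ~1e-4
print("real flash seen rate:  ", np.mean(flash >= threshold))   # ~0.87
```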

As for the daytime exhaustion of rhodopsin - take a look here:

https://en.wikipedia.org/wiki/Night_vision

See the "Biological night vision" section.

Here is an interesting part:

Quote

All photoreceptor cells in the vertebrate eye contain molecules of photoreceptor protein which is a combination of the protein photopsin in color vision cells, rhodopsin in night vision cells, and retinal (a small photoreceptor molecule). Retinal undergoes an irreversible change in shape when it absorbs light; this change causes an alteration in the shape of the protein which surrounds the retinal, and that alteration then induces the physiological process which results in vision.

The retinal must diffuse from the vision cell, out of the eye, and circulate via the blood to the liver where it is regenerated. In bright light conditions, most of the retinal is not in the photoreceptors, but is outside of the eye. It takes about 45 minutes of dark for all of the photoreceptor proteins to be recharged with active retinal, but most of the night vision adaptation occurs within the first five minutes in the dark.[4] Adaptation results in maximum sensitivity to light. In dark conditions only the rod cells have enough sensitivity to respond and to trigger vision.

Interesting fact - keep your liver clean if you want good night adaptation.

High levels of light during the day deplete these chemicals, and it takes a day or two for things to get back to normal.


In GR, gravity is the bending of spacetime. Light just follows the straightest possible path through the curved spacetime. The straightest path is called a geodesic, and anything in freefall follows one, e.g. planets in orbit round their suns.

It's best not to think of photons in this context; good old-fashioned wave theory works best here.
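For a sense of scale, the standard weak-field result (textbook GR, not something derived in this post) for the deflection of a ray with impact parameter b past a mass M is alpha = 4GM/(c^2 b). For a ray grazing the Sun that gives the famous ~1.75 arcseconds:

```python
# Weak-field GR light deflection, alpha = 4GM/(c^2 b), for a ray that just
# grazes the Sun (impact parameter b = solar radius). Expect ~1.75 arcsec.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m

alpha = 4 * G * M_sun / (c**2 * R_sun)              # radians
print(f"{alpha:.3e} rad = {math.degrees(alpha) * 3600:.2f} arcsec")
```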

Regards Andrew 


49 minutes ago, jetstream said:

It is incredibly amazing what I see through a scope when things are right with the eye/brain system. Some physicists say it has to be noise - but it is not, and a few others see this too: very faint background patches (dust etc.?) all over the place, richer in some areas than others.

I would like to understand how to ensure the filter will work, through technique, and also whether it can be enhanced somehow... thoughts?

Yes, the eye/brain is a very clever and complex system. @vlaiv has given some great references. However, it is not infallible: it seeks patterns where none exist, and it fills in where data is missing. (My tinnitus is a good example, where it is filling in for the loss of high-frequency hearing. Ear/brain in this case.)

Expectations can reliably condition what is and is not observed. 

Regards Andrew 


29 minutes ago, jetstream said:

I also need to know more about aperture and light streams or packets...

OK - that is a really tough question. What is light made of, and what are photons in fact?

We often think of light as either a wave or a stream of bullet-like particles. Neither is in fact true.

The best way to describe the phenomenon is QED - and its language is mathematics. If we start to "visualize" things or try to find an analogy in everyday life, we are bound to get it wrong. However, we don't necessarily need to be right here - we can still try to describe it in layman's terms and perhaps wrong analogies.

Let's start with waves, or rather the wave function. Imagine a pond with a rock thrown in - waves start to spread out from that location.

[Image: ripples spreading out across a pond]

Would light be that ripple? And what is the photon in this case?

That ripple is actually a wave function. It is a complex-valued function - something that ripples in the electromagnetic quantum field. Depending on whether or not you subscribe to the notion of the wave function being an element of reality, you'll favor some interpretations of QM over others. See here to be confused further: https://phys.org/news/2012-04-quantum-function-reality.html

In any case, we must look at that wave function - whether it is real or just a mathematical tool used to calculate things rather than something that actually "waves around" - and see how it behaves in order to understand photons and light.

Now imagine having two simple cameras that are trying to detect "photons" from this wave, one on each side. What happens when the wave hits either camera? Well, one of two things: either nothing, or one of the cameras detects a photon and the whole wave changes shape instantaneously.

The wave can hit a camera and do nothing. That is an important thing to realize. Also important: if we assume there is only one photon in this wave disturbance (here we are making a mistake - I'll explain later why), then once that photon "hits" something - either camera A or camera B - the whole ripple changes at the same time. Does not make sense? Well - spooky action at a distance, the faster-than-light thing; you are not the only one to be baffled by this :D
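A deliberately crude toy model of that two-camera picture (my illustration, not proper QED - the probabilities are arbitrary stand-ins for how much of the wave overlaps each camera):

```python
# Crude toy model of "one excitation, two cameras": each trial produces at
# most ONE click. Probabilities p_A and p_B stand in for |psi|^2 integrated
# over each camera's aperture; the remainder is "the wave passes, no click".
import numpy as np

rng = np.random.default_rng(7)
p_A, p_B = 0.3, 0.2    # illustrative |psi|^2 weights, chosen arbitrarily

outcomes = rng.choice(["A clicks", "B clicks", "no click"],
                      size=100_000, p=[p_A, p_B, 1 - p_A - p_B])
labels, counts = np.unique(outcomes, return_counts=True)
for label, count in zip(labels, counts):
    print(f"{label}: {count / len(outcomes):.3f}")   # ~0.3 / 0.2 / 0.5

# Note: "A clicks" and "B clicks" never happen in the same trial - that is
# the non-local "whole ripple changes at once" part of the story.
```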

Now let's look at why we can't really say there is one photon in that ripple (or rather in our poor analogy for it).

[Image: a localised wave packet - a few oscillations that fade away on either side]

Here is one function - it is wave-like. Imagine that function represents a "particle". A particle with an exact speed (momentum) would have a precisely known frequency/wavelength. In order to have an exactly known wavelength and frequency, you need a cyclic function - a sine/cosine thing - that repeats to infinity. The above function is not like that. Our pond ripple is not like that.

Do we know its precise location? Well, we could say the "center" of the above function is the position - but what if the ripple is not symmetric? In fact the "particle" is anywhere along that oscillation. The more exactly you know the wavelength, the more the function needs to repeat and the longer it gets - the more the position is spread out and the less you know where the particle is.

Now imagine you only have one "up/down" change - it will be very localized. Here is another image:

[Image: a sequence of wave packets, from a single ripple up to many ripples approaching an endless sine wave]

On the left is what we are talking about - a single ripple, then a few ripples, then more ripples - the further you go, the closer you get to a sine wave going off to infinity. At the top we have a very well defined location, but we don't really have a defined wavelength - where do we even start to measure it? With a few ripples it is a bit easier to say the wavelength is probably this much, but suddenly we start to have a problem with the location, and so on...

This is the uncertainty principle explained in terms of the wave function.
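That trade-off is easy to check numerically (a standard Fourier-analysis sketch of mine, nothing photon-specific): for a Gaussian packet the product of the position spread sigma_x and the wavenumber spread sigma_k comes out at the uncertainty-principle minimum of 1/2.

```python
# Numerical check of the position/wavelength trade-off: the narrower a
# Gaussian wave packet is in x, the broader its Fourier spectrum in k.
# For a Gaussian the product sigma_x * sigma_k comes out at exactly 1/2.
import numpy as np

x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]

for sigma_x in (0.5, 1.0, 2.0):
    psi = np.exp(-x**2 / (4 * sigma_x**2)) * np.exp(1j * 5 * x)  # k0 = 5
    power = np.abs(np.fft.fftshift(np.fft.fft(psi))) ** 2
    k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(len(x), d=dx))
    k_mean = np.average(k, weights=power)
    sigma_k = np.sqrt(np.average((k - k_mean) ** 2, weights=power))
    print(f"sigma_x = {sigma_x:3.1f}  ->  sigma_k = {sigma_k:.3f}  "
          f"(product = {sigma_x * sigma_k:.3f})")
```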

A single photon has a well defined wavelength - now take the above ripple. Can you decompose it into a single well defined wavelength? You can't really, because it would need to go off to infinity - you would need an infinite number of such well defined wavelengths. For this reason we can't really tell how many photons are in a wave function ripple.

Like Andrew said above - the photon number operator is not well defined.

Once an energy transfer takes place - a camera detects a photon - it is removed from the above wave function, which then changes shape. That interaction is a very complex thing, as it involves a bunch of entanglements with the environment and so on...

We did not mention one more very important feature of the wave function, and that is interference with itself.

We often say light interferes, or a photon interferes with itself - in reality it is the wave function that interferes with itself. It defines the probability that an energy transfer will take place.

How is this for starting to define what light is? :D

 

 

