

Photometry with DSLR



I've come to the conclusion that you can't properly carry out photometry on variable stars with a DSLR, despite what the AAVSO guide says. The reason is the Bayer matrix in the sensor.

According to the AAVSO guide you have to carry out the measurement in green, because that is closest to the V range of wavelengths. The photometry works by counting the electrons in a circular aperture around the star, and in an annular region outside it to estimate the sky background. The inner area may be, say, 10 pixels across, but that would include green, red and blue pixels. This means that you won't be counting green photons in some of the pixels covered by the star, so the flux values will be wrong. You may get some kind of reading if the debayering has estimated green values for the non-green pixels, but that is just an estimate, not a measurement.
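To be concrete, the measurement I'm describing is ordinary aperture photometry, roughly like this numpy sketch (the frame, centre and radii are just for illustration):

```python
import numpy as np

def aperture_flux(img, x0, y0, r_ap=5.0, r_in=8.0, r_out=12.0):
    """Sum the counts inside a circular aperture around (x0, y0) and
    subtract the sky, estimated as the median of a surrounding annulus."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - x0, yy - y0)

    aperture = r <= r_ap                      # star aperture
    annulus = (r >= r_in) & (r <= r_out)      # sky annulus

    sky_per_pixel = np.median(img[annulus])
    return img[aperture].sum() - sky_per_pixel * aperture.sum()
```

On a colour sensor, every pixel inside that aperture sits behind a red, green or blue filter, which is what worries me.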

I've tried it with my DSLR and I get readings that look about right, but I wouldn't be happy posting those results to AAVSO. I guess it would be fine for your own interest to plot some light curves, but that's as far as it goes.

Looks like I'll have to start saving up for a CCD camera.

Cheers

Steve

 


Lots of variable star observers use DSLR photometry. If you just use the data in the green channel it's fine. The photometry apertures for the star and the background sky take this into account. Have a look at this article by Des Loughney of the BAA Variable Star Section: https://britastro.org/vss/JBAA 120-3 Loughney.pdf

The resulting photometry is very similar to V-filter photometry. It can be submitted to the BAA VSS or AAVSO databases as “TG” photometry.

 


Yes you can - you don't need to debayer the image using any sort of interpolation - you can simply extract the Bayer matrix channels as separate images and use those as monochromatic images.
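A minimal sketch of that extraction, assuming an RGGB pattern (other cameras use GRBG, GBRG or BGGR, so adjust the slices for your sensor):

```python
import numpy as np

def split_bayer_rggb(raw):
    """Split a raw RGGB Bayer frame into R, G1, G2 and B sub-images.
    Each sub-image has half the linear resolution of the sensor."""
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]   # green pixels on the red rows
    g2 = raw[1::2, 0::2]   # green pixels on the blue rows
    b  = raw[1::2, 1::2]
    return r, g1, g2, b

# For V-like photometry you would measure on g1 and g2 (or their sum),
# treated as a monochromatic image.
```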

Another thing that you can do is create a color conversion model. Trichromatic sensors with a Bayer matrix have a certain QE response curve in each color.

The V filter also has a certain response curve.

You can derive a transform that best models the V response curve from the R, G and B curves.

To put it simply:

[image: DSLR R, G and B channel response curves]

You scale the blue and red channels and add them to or subtract them from green, so that the marked-out areas roughly cancel out or add up, and you get close to this curve:

[image: V filter response curve]
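As a rough sketch of what deriving such a transform can look like in practice (the star data and the model form below are purely illustrative - coefficients fitted by least squares against comparison stars with catalogue V magnitudes):

```python
import numpy as np

# Instrumental magnitudes of some comparison stars in the three DSLR
# channels, plus their catalogue V magnitudes (all values made up).
r_inst = np.array([11.82, 10.95, 12.40, 11.10])
g_inst = np.array([11.35, 10.40, 12.05, 10.70])
b_inst = np.array([11.60, 10.55, 12.55, 10.95])
v_cat  = np.array([11.30, 10.38, 12.00, 10.66])

# Model: V ~ g + zp + c_r*(r - g) + c_b*(b - g),
# i.e. green plus small colour-term contributions from red and blue.
A = np.column_stack([np.ones_like(g_inst), r_inst - g_inst, b_inst - g_inst])
(zp, c_r, c_b), *_ = np.linalg.lstsq(A, v_cat - g_inst, rcond=None)

def to_v(r, g, b):
    """Transform instrumental r, g, b magnitudes to an approximate V."""
    return g + zp + c_r * (r - g) + c_b * (b - g)
```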


Seems a bit dodgy to me.  Any photons of the right wavelength which fall on the red and blue pixels won't be counted.  Aren't you relying on the fact that you get roughly the same proportion of missed photons on the comparison star as the target star?  You'll be missing about 40% of the photons.  You'd have to make sure that the star image covered a minimum area of the sensor. 

 


1 minute ago, woodblock said:

Seems a bit dodgy to me.  Any photons of the right wavelength which fall on the red and blue pixels won't be counted.  Aren't you relying on the fact that you get roughly the same proportion of missed photons on the comparison star as the target star?  You'll be missing about 40% of the photons.  You'd have to make sure that the star image covered a minimum area of the sensor. 

 

Sensors have a QE smaller than 100% - roughly 50-80% depending on how good they are (some even as low as 25-30%) - so you are already losing as much as half of the photons falling on any one particular pixel.

This works for both absolute and relative photometry - you just correct for it (in relative photometry there is nothing to be done, as it affects both the target and the comparison star). It is not much different from using a telescope of a different diameter - just more or fewer photons gathered over a period of time. Want better SNR? Increase the time over which you collect.
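To see why it cancels in relative (differential) photometry, a tiny example with made-up numbers:

```python
import math

# Background-subtracted aperture sums, in electrons (made-up values).
flux_target = 48_200.0
flux_comp   = 61_500.0
v_comp      = 9.42    # catalogue V magnitude of the comparison star

# Any constant loss factor k (QE, missing Bayer pixels, aperture size)
# multiplies both fluxes, so it drops out of the ratio:
#   -2.5 * log10(k*F_t / (k*F_c)) == -2.5 * log10(F_t / F_c)
v_target = v_comp - 2.5 * math.log10(flux_target / flux_comp)
print(f"target magnitude ~ {v_target:.3f}")
```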


Jeremy, the article you quoted looks interesting. I'll work my way through it.

Vlaiv, I notice that the author of the article works out the average and standard deviation of a set of results - echoes of a previous discussion.

Needs some thought.

 

 


On 18/12/2020 at 15:56, woodblock said:

You'd have to make sure that the star image covered a minimum area of the sensor. 

As in any imaging, you just have to avoid being undersampled, i.e. the size of the star image (FWHM) should be more than 2 pixels, but in this case the size of the "pixels" is twice that of the actual pixels. (Since you will only be using the green channel it is not actually as bad as this, as half the pixels are green.)
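A quick back-of-envelope check of the sampling in the extracted green channel, using the pessimistic factor of two above (the pixel size, focal length and seeing are only example values):

```python
# Rough sampling check for the green sub-image of a Bayer sensor.
pixel_um    = 4.3      # native photosite size in microns (example)
focal_mm    = 1000.0   # focal length in mm (example)
fwhm_arcsec = 4.0      # typical star FWHM, in arcsec (example)

native_scale = 206.265 * pixel_um / focal_mm   # arcsec per native pixel
green_scale  = 2 * native_scale                # extracted green pixels are 2x coarser
fwhm_pixels  = fwhm_arcsec / green_scale

print(f"FWHM spans {fwhm_pixels:.1f} green-channel pixels "
      f"(want comfortably more than 2)")
```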


It's not uncommon to defocus the image in photometry so that it is oversampled. This minimises the pixel-to-pixel variations which, together with guiding errors, add to the noise.

Regards Andrew 

