
Imaging a pulsar. Is it possible with amateur equipment?


dokeeffe


Until we acquire the first photons, I'd very much like the other participants in this thread, who have already offered some software expertise, to join the discussion again and start considering possible methods of processing the data.

In a 3 ms exposure the signal isn't going to be much higher than the noise level, which we'll try to combat by capturing a great number of frames. The idea is to sort the frames based on which part of the period they represent and then stack each of those 11 groups individually, producing 11 frames covering the 33.5028583 ms period of the pulsar.
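To make the sorting concrete, here is a minimal, untested sketch of how a frame timestamp could be mapped to one of the 11 phase groups. The reference time t0 and the list of timestamps are hypothetical placeholders, and the period value is simply the one quoted above:

# Sketch: sort frames into 11 phase groups by timestamp.
PERIOD = 0.0335028583   # pulsar period in seconds (value quoted above)
N_GROUPS = 11

def phase_group(t, t0=0.0, period=PERIOD, n_groups=N_GROUPS):
    """Return the index (0..n_groups-1) of the phase bin a timestamp falls into."""
    phase = ((t - t0) % period) / period   # fraction of the period, 0..1
    return int(phase * n_groups) % n_groups

# Example with made-up timestamps (seconds):
timestamps = [0.000, 0.003, 0.006, 0.009, 0.012]
groups = {}
for i, t in enumerate(timestamps):
    groups.setdefault(phase_group(t), []).append(i)
print(groups)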

3 hours ago, Stub Mandrel said:

Sounds a great project Marko, keep us informed of progress. You may have to do some very clever stacking to get this to work, accurate frame timestamps will be critical.

I had no idea there were any pulsars in Europe! :icon_biggrin:

There are many pulsars here in Croatia, they call them lighthouses. They are on the islands so there are crabs near them too :icon_biggrin:


4 hours ago, m a r k o said:

Hi,

I'm not sure whether to contribute here or to start a new topic and link this one there. Anyway...

I'm resurrecting this topic and calling for all interested parties to participate.

My friend Lovro and I have started an attempt to image the M1 pulsar (or is it? More on this later) here in Croatia. Both of us have very limited funds for our projects, so we are starting a collaboration between our astronomy club Beskraj and Team stellar (who will provide an AZEQ6 mount), and we hope to receive help from you here regarding the processing of the data.

I have an OrionUK 10" f/4.8 newt and an ASI178 color camera, which I got from selling my homemade 16" primary. Lovro has an ASI120 and experience in astrophotography; we are still missing an OAG. We could also ask an acquaintance to lend us his massive AstroPhysics mount if the AZEQ6 proves too shaky. I hope to get decent data from the 10", and I have regular access to fairly dark skies, so that should help. Capturing the data shouldn't be an issue, but timing will be. I get around 331 fps out of 333 fps for 3 ms exposures when testing here at home...

Hope to hear from you soon,

Marko

Hi,

331 fps is impressive - this is with the ASI178 and the smallest ROI, I presume?

Ok, here are a couple of pointers for you:

1. Make sure your guiding is as good as it can be. I think you will need an OAG; any possible flex, even a tiny one, in a guide scope will mess things up. One might guide well enough for 5-10 min subs, but here, because there will be no data to align frames on, you must ensure that for the whole duration of the recording (basically the night) you don't get any movement of your target. Also try to get guiding RMS below 0.5" if possible. If you can't achieve these two things with your setup, then I think there is not much point in trying to image the pulsar.

The easiest way to test this would be to go out one evening (choose a night of good seeing) and shoot a star for the maximum amount of time (for example 4 h: 2 hours before and 2 hours after the meridian). For this it does not matter whether you use short or long subs - 1 minute subs will be fine. Then stack all of your frames without aligning them first, just a simple average - but do flip the frames taken after the meridian (both vertically and horizontally) so that they stack onto the frames taken before it. The star you are shooting does not need to be very faint; in fact it should show clearly on the 1 minute subs (this will tell you whether your guiding really is 0.5" RMS or not - if it is, the star should be a really tight dot of light).

There is one more thing to do here - make sure that you can return the scope to the exact same position after the meridian flip, so that the star you are shooting is in the center of the frame in both cases. It is best to try this without looking at the star itself or using it as a reference - just see whether you can use goto / plate solving to put the star dead center from coordinates alone, both east of the pier and west of the pier after the flip.

If you can do all of the above, and after stacking without alignment you get something resembling a star in a long exposure photo, then you are good on the "mechanics" part. It must not be smeared or deformed too much, or show any kind of trailing - look at the post above, where I made an example of how it should look (just stacking, no aligning / centering of frames). Depending on how good your guiding is it will be more or less smeared (less is better).
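For what it's worth, here is a rough sketch of that "stack without alignment" test. It assumes the subs are already loaded as 2D numpy arrays and that you know which ones were taken after the flip - the variable names and the loading step are placeholders of mine, not a finished tool:

import numpy as np

# pre_flip and post_flip are lists of 2D numpy arrays (the 1 minute subs),
# loaded however you like (e.g. with astropy.io.fits) - hypothetical inputs here.
def unaligned_stack(pre_flip, post_flip):
    """Average all subs with no alignment; flip post-meridian subs both ways first."""
    frames = list(pre_flip)
    for f in post_flip:
        frames.append(np.flip(f, axis=(0, 1)))   # vertical + horizontal flip
    return np.mean(np.stack(frames).astype(np.float64), axis=0)

# A tight, star-like blob in the result (no trailing, not badly smeared)
# suggests the mount/guiding side is good enough.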

2. You should run some tests to figure out how your camera behaves in 8 bit capture. If you are planning to use the ASI178 in 8 bit mode, I guess the best place to start would be a gain of around 0.25 e/ADU (check the charts on the ZWO website for gain settings close to 0.25 e/ADU - it should be somewhere around 110-120). The ASI178, when doing 8 bit capture, actually samples at 10 bits and just outputs the 8 most significant bits. Ideally you would want 1 photon on the sensor to end up as 1 ADU in the file. You can check all of this by comparing 16 bit and 8 bit modes with different gain settings on the same star, shot one after another (a kind of basic photometry - you really want the 8 bit mode not to crop the histogram, i.e. not to throw away the least significant bits or clip the histogram to the left). Also do tests with "darks" and "flats" to see if you can get images with the same statistics in both 8 bit and 16 bit mode. Here we assume the 16 bit mode is fine and does not clip anything, so we want our 8 bit images to behave the same.
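A quick statistical comparison of the two modes might look like the sketch below. The file names are hypothetical, and I'm assuming the frames were saved as FITS (a SER reader would do just as well); the idea is only to compare the relative scatter and to spot left-side clipping, not to do rigorous photometry:

import numpy as np
from astropy.io import fits   # assumes frames saved as FITS; adjust for SER if needed

def frame_stats(path):
    """Return mean, standard deviation and minimum of a calibration frame."""
    data = fits.getdata(path).astype(np.float64)
    return data.mean(), data.std(), data.min()

# Hypothetical file names - the same dark (or flat) shot in each mode at the same gain.
for label, path in [("16 bit", "dark_16bit.fits"), ("8 bit", "dark_8bit.fits")]:
    m, s, lo = frame_stats(path)
    # The relative scatter (std/mean) should be similar in both modes; a minimum
    # stuck at 0 in 8 bit mode would suggest the histogram is being clipped on the left.
    print("%s: mean %.2f  std %.2f  std/mean %.3f  min %.0f" % (label, m, s, s / m, lo))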

3. As a final test to see if you would be able to capture pulsar, I would recommend using 3ms exposures with above guiding and above camera settings on 15 mag star for 1/10th of total planned shooting of the pulsar (so for 4h of real shooting you would use 24 mins imaging for this - I usually say 4 hours because that is what I usually get out of the night when imaging - two hours pre and two hours post meridian, but you might image for 6 hours for example). Processing those frames with dark and flat calibration (please pay attention to use at least 32bit precision for this), and stacking without alignment. After that bin as needed and see if you have meaningful star signal on your final image. If you do have it with reasonable SNR, I guess you are good to go for shooting the data by the end of the year. Only thing that is left to do is use/write special software to align frames based on timestamp. If you are planing to write your own software and need pointers, just ask. I'm planing to shoot this as well and will be writing software to stack frames, so you can use my software for processing.
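As a sketch of the processing for this test (my own illustration of the steps just described, with hypothetical array names; the master flat is assumed to be normalised to a mean of 1):

import numpy as np

def calibrate(light, master_dark, master_flat):
    """Dark-subtract and flat-field a single 3 ms light frame at 32 bit float precision."""
    return (light.astype(np.float32) - master_dark) / master_flat

def bin2x2(img):
    """Simple 2x2 software binning (sum of each 2x2 block)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    v = img[:h, :w]
    return v[0::2, 0::2] + v[1::2, 0::2] + v[0::2, 1::2] + v[1::2, 1::2]

# 'lights' is a hypothetical iterable of raw 3 ms frames (2D numpy arrays);
# master_dark / master_flat are pre-made float32 masters.
def stack_test(lights, master_dark, master_flat):
    acc, n = None, 0
    for frame in lights:
        cal = calibrate(frame, master_dark, master_flat)
        acc = cal if acc is None else acc + cal   # no alignment, straight sum
        n += 1
    return bin2x2(acc / n)                        # average, then bin for SNR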

Even if I don't manage to record my own data, I would be happy to write the processing software for you if you manage to capture yours.


On 13/08/2017 at 17:37, vlaiv said:

One might guide well enough for 5-10 min subs, but here, because there will be no data to align frames on, you must ensure that for the whole duration of the recording (basically the night) you don't get any movement of your target.

Hi vlaiv, I was hoping to hear from you. Thanks for the offer to write the processing software; I have no knowledge in this field and will absolutely need your help here.

 

I will test point 2 and the frame rate again tonight and will report the results here.

An idea: could planetary stacking software be used for spatial frame alignment and stacking? The whole night's capture would be sorted by timestamp into, say, 11 groups, and then each group could be aligned using the pulsar as a "planet" and stacked. Then the guiding would only need to be excellent during the night, not perfect.

Which capturing software and equipment will you use?


17 hours ago, m a r k o said:

Hi vlaiv, I was hoping to hear from you. Thanks for the offer to write the processing software; I have no knowledge in this field and will absolutely need your help here.

 

I will test point 2 and the frame rate again tonight and will report the results here.

An idea: could planetary stacking software be used for spatial frame alignment and stacking? The whole night's capture would be sorted by timestamp into, say, 11 groups, and then each group could be aligned using the pulsar as a "planet" and stacked. Then the guiding would only need to be excellent during the night, not perfect.

Which capturing software and equipment will you use?

I think that there is absolutely no way we can use any kind of stacking software that aligns things based on any sort of signal in the image.

In a 3 ms exposure there is simply not going to be any perceivable signal to align frames by. This is why tracking is so important. If you look at the spreadsheet I made and the calculations, the expected rate is something like 0.06 e- per exposure - meaning that on average only every fifteenth frame or so will record a single electron of signal. So how would you align the 14 out of 15 frames that have no signal at all? (This is only an average: light follows a Poisson distribution, so it is really random - one frame might have 2 e-, then 30 frames with no signal, then 3 e- on one, and then again a series of frames without any signal.) And even on the frames where you do have signal, the noise level is such that the SNR per frame is much less than 1.
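To get a feel for what 0.06 e- per exposure means, here is a toy simulation - purely illustrative, using the figure quoted above rather than anything taken from the actual spreadsheet:

import numpy as np

rng = np.random.default_rng(0)
mean_electrons = 0.06            # expected signal electrons per 3 ms frame (figure quoted above)
n_frames = 100_000               # hypothetical number of frames in a run

electrons = rng.poisson(mean_electrons, n_frames)
print("frames with 0 e-:  %.1f%%" % (100 * np.mean(electrons == 0)))
print("frames with 1 e-:  %.1f%%" % (100 * np.mean(electrons == 1)))
print("frames with 2+ e-: %.2f%%" % (100 * np.mean(electrons >= 2)))
# Roughly 94% of frames carry no target signal at all, which is why
# aligning individual frames on the pulsar itself is hopeless.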

Now, on "sorting" and "aligning" frames into time slots, my idea is as follows: based on its timestamp, each frame can either land exactly on a slot (highly unlikely - if you treat time as a real number, the probability of this is 0 :D ) or, far more likely, between two slots. So we take the timestamp of each frame and interpolate the frame into the two neighbouring slots depending on where between them it lands. This will have a slight smoothing effect on the animation between frames, but it will work fine. Here is an example; for the sake of simplicity I will set the sampling time interval to integer values (the same as the slot numbers):

slot 0, slot 1, slot 2 .....

So if we record a frame with time 0.5, it is split equally between slots 0 and 1 - we divide the pixel values of the frame by 2 and add it to both stacks. If we want to stack a frame with time 1.8, we split it between slots 1 and 2, but slot 1 gets 1/5 of the frame and slot 2 gets 4/5 of it - to the stack of slot 1 we add frame * 0.2 and to the stack of slot 2 we add frame * 0.8. During stacking we keep, for each slot, a running total of the weights we have added, and at the end we divide each slot's stack by that total.

It is important to note that we don't actually align frames at all, in the sense of shifting pixel values along the x and y axes or applying any rotation. This is simply because we have no way of determining what the proper shift in x and y or the rotation angle would be. (With good polar alignment and an OAG there will be no rotation of the FOV, and if tracking is precise enough there will be only small shifts in x and y - which produces much the same effect as a long exposure: a star of a certain FWHM. The better the seeing and guiding, the smaller the star's FWHM and the higher the SNR per star pixel, because the star covers fewer pixels.)

Now, you could simply assign each frame to its nearest slot, but the difference between that approach and the one I'm suggesting is similar to resizing an image with nearest-neighbour resampling versus linear resampling - linear resampling gives a nicer looking image, and in our case I think it will give us better SNR.
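Here is a minimal sketch of that two-slot weighting in Python. It assumes the frames are already calibrated 2D arrays and that the timestamps have been converted to "slot units" (so 1.8 means between slot 1 and slot 2, closer to slot 2); each slot is normalised by the total weight it received:

import numpy as np

def stack_into_slots(frames, slot_positions, n_slots):
    """Distribute each frame between its two neighbouring time slots.

    frames         - iterable of 2D numpy arrays (calibrated lights)
    slot_positions - frame times expressed in slot units (e.g. 1.8)
    """
    sums = None
    weights = np.zeros(n_slots)
    for frame, pos in zip(frames, slot_positions):
        if sums is None:
            sums = np.zeros((n_slots,) + frame.shape, dtype=np.float64)
        lo = int(np.floor(pos)) % n_slots
        hi = (lo + 1) % n_slots
        w_hi = pos - np.floor(pos)   # fraction going to the upper slot (0.8 for pos 1.8)
        w_lo = 1.0 - w_hi            # remainder goes to the lower slot (0.2 for pos 1.8)
        sums[lo] += frame * w_lo
        sums[hi] += frame * w_hi
        weights[lo] += w_lo
        weights[hi] += w_hi
    # Divide each slot by the total weight it received to get an average frame per slot.
    return sums / np.maximum(weights, 1e-12)[:, None, None]

# Each returned slice is one phase bin of the pulsar's cycle; binning and
# stretching these should reveal the blinking if everything else worked.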

Equipment-wise, I'm planning to use the following: an RC 8" (F/8, FL 1624 mm) and an ASI1600 (which will limit my frame rate, but it has the biggest pixel size and the most sensitivity, and it is also mono). Mind you, you don't have to use a mono camera; you can do this with an OSC camera as well, just bin the resulting frames by at least x2 (you will want to bin them anyway to get better SNR). The mount is an HEQ5 that has been tuned and belt-modded (I can sometimes get less than 0.5" RMS guiding if the night is good and there is no wind), with an OAG + ASI185 for guiding (this is my regular setup now for imaging with the RC8).

For capture software I'm going to use either SharpCap (probably, unless it turns out that something will prevent a successful capture) or FireCapture.

I'll also use a flat panel at the end of the session for capturing the flat field.

Besides light frames, I'll capture dark frames (also 3 ms, so you can think of them as bias frames because of the short exposure, but they are in fact darks - same exposure as the lights), and probably quite a large number of them (10,000+), plus flats and flat darks (the flats I'll capture in 16 bit mode, and I'll also take a really big number of those, 1000+).


Thanks for the link, David. The more I think about it, the more I want to try both methods to get results. Both will be challenging...

 

Here's the link I promised. Spoiler: the study concludes that the Crab Nebula isn't a supernova remnant but the result of a head-on collision of two massive stars.

lempel.pagesperso-orange.fr/un_os_dans_le_crabe_uk.htm

 

And a very recent study:

https://www.kth.se/en/forskning/artiklar/new-observations-reveal-crab-nebula-s-polarised-emissions-for-first-time-1.746056

 


2 hours ago, m a r k o said:

Thanks for the link, David. The more I think about it, the more I want to try both methods to get results. Both will be challenging...

 

Here's the link I promised. Spoiler: the study concludes that the Crab Nebula isn't a supernova remnant but the result of a head-on collision of two massive stars.

lempel.pagesperso-orange.fr/un_os_dans_le_crabe_uk.htm

 

And a very recent study:

https://www.kth.se/en/forskning/artiklar/new-observations-reveal-crab-nebula-s-polarised-emissions-for-first-time-1.746056

 

This is very interesting. It did strike me as odd, when I was looking at the GIF animation of this pulsar posted earlier, that there seem to be two distinct peaks, one brighter and one less bright, and I wondered what might be causing this.


Here's a simple design for a pulsar disc for use with a 3.75 degree (96 step) stepper rated up to about 800 pulses per second.

It has ten slots, each covering 9 degrees, with a 27 degree blank area between them. There's also a 3D-printable bracket which will be a push fit on the front of a C90 MAK, which with its 1250 mm focal length should be ample for imaging the pulsar.

To sync with a pulsar at 30 Hz it needs to rotate three times per second, so that's 3 x 96 ~ 300 pulses per second, well inside the capabilities of the stepper (about 800 pps) and fast enough to give smooth movement. A more common 48-step stepper would need to run at about 600 pulses per second, still doable as the torque required will be small.
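Just to sanity-check those numbers in code form (this is only the arithmetic above written out, nothing new):

# Back-of-envelope check of the chopper disc numbers quoted above.
pulsar_hz = 30          # approximate optical pulse rate of the Crab pulsar
slots = 10              # slots on the disc
steps_per_rev = 96      # 3.75 degree stepper

rev_per_sec = pulsar_hz / slots            # 3 revolutions per second
pulse_rate = rev_per_sec * steps_per_rev   # 288 steps per second, ~300 as stated
print(rev_per_sec, pulse_rate)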

This approach would allow the use of a crystal-controlled Arduino or similar, which would easily deliver the required accuracy in rotation speed without the complexity of PLLs linked to PC sound cards.

[Attached images: pulsar disc design and 3D-printed stepper holder bracket]


To report on the ASI178 frame rate and capture settings:

 

A one minute capture produced 19,481 frames at 324 fps and 160x120 pixels (the result of 2x hardware binning) in the RAW16 color space. The .ser file size was 713.56 MB.

This is the only setting at which I could get a 300+ frame rate. RAW8 would hang at smaller resolutions or couldn't reach 300+ fps, only about 250 at most.
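As a rough check on the data volume at these settings (assuming 2 bytes per pixel for RAW16 and the frame rate reported above), the one-minute file size adds up, and a full night gets large quickly:

# Rough data-volume estimate for the reported capture settings.
width, height = 160, 120
bytes_per_pixel = 2                 # RAW16
fps = 324

frame_bytes = width * height * bytes_per_pixel        # 38,400 bytes per frame
one_minute = 19481 * frame_bytes / 2**20              # ~713 MiB, matching the .ser size above
four_hours = fps * 4 * 3600 * frame_bytes / 2**30     # ~167 GiB for a 4 hour run
print("1 min: %.1f MiB, 4 h: %.0f GiB" % (one_minute, four_hours))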


