
Can someone help me out with the read noise issue I am having on my Player One Ares-C camera?



I just got the new Ares-C from Player One, and there is some kind of read noise issue going on. It's almost like the low read noise mode is not even working, and I have to set the gain to 200 just to get read noise comparable to the graphs on the product page.

The read noise at unity gain is supposed to be around 1.35–1.38e, but I only reach that at gain 200; otherwise I am well over 1.5e at unity gain, which is way out of specification.

 

Can anybody help me out? I have consistently gotten similar values across 3 tests.

[Attachment: readnoise3.JPG]

Edited by zernikepolynomial
added picture

  • 2 weeks later...
On 24/06/2023 at 01:12, AstroFin said:

Following this thread with great interest since I also bought an Ares-C from Player One. Haven’t had a chance to test it yet though… 

While they did offer to take the camera back, Player One went against their 30-day policy: they simultaneously claimed it was not a true "performance failure" (so they would not cover shipping) yet was strange enough to warrant taking it back. Shipping back to their factory costs far too much, and I would rather save for another camera than pay for that. Not to mention the risk of anything sent back to China simply disappearing if it takes too long.

 

You might have better luck, but anyone else should wait for more performance results before buying from them. My results may in fact be the true average, which is not much better than a typical 533MC camera from other manufacturers once you account for how it responds to gain. Mine runs over 1.5e at unity gain (125) with a 16k full well, and reaches about 1.5e at gain 150 with a 12k full well. My sweet spot will probably be 1.38e at gain 200 with a 7k full well, or maybe even whatever I can get down at a 4k full well. It will just be a balance between not overexposing with my fast Newtonian at f/3.75 and taking the shortest exposures possible due to its weight.


5 hours ago, zernikepolynomial said:

While they did offer to take the camera back, Player One went against their 30-day policy… […]

First things first: are you aware that ZWO and (unless I am mistaken) Player One do not use SharpCap for their chart generation? So you have three different measurement methods in play here. I think Player One have chosen a more generous method of measurement, as your figures are not far from the ZWO figures for the ASI533MC Pro. You say 1.38e at gain 200 with a 7k full well; ZWO quote about 1.3e at gain 200 and a 5k full well, so a little lower, but at what looks like a slightly higher relative gain.

The differences between normal read noise and LRN mode are small as these things go: at unity gain you are talking about 0.11e between the two modes. I would expect read-noise variability from sensor to sensor to be larger than that tiny difference. If it were my camera, the difference between 1.38 and 1.5 would not bother me at all, and you won't visually detect it in your results, that's for sure.

But as above, I think the bigger issue is that you are comparing apples and oranges if you look at those charts and expect them to perfectly match results from SharpCap.

So I think you should stop testing the sensor in SharpCap and take some images with it. There are many other sources of noise beyond read noise that everyone forgets about, and added together they make your little differences in read noise insignificant in terms of total noise. Just expose very slightly longer if it is really eating away at you and you will have negated the effect anyway.

This is a panic over nothing. Go take some images. 🙂

Adam

Edited by Adam J
  • Like 2

31 minutes ago, Adam J said:

First things first: are you aware that ZWO and (unless I am mistaken) Player One do not use SharpCap for their chart generation? […]

I think you are being a bit unfair here. There is nothing wrong with SharpCap itself, and just advising him to "take some images with it" is plain silly when the OP has a genuine worry here. I would imagine manufacturers also measure camera performance by taking exposures and measuring the pixel values within them (as SharpCap does). These technical specs can't be seen in a single sub-frame, at least not easily, so I really don't think looking at a sub-frame has any meaningful value here. But the small technical differences can be measured, and they can have a real effect in integrations that you would not otherwise have been able to see. I'm sure you know all this...

Definitely not a panic over nothing.

However @zernikepolynomial, if you look at your graph and the published graph, they do look similar in many parts. SharpCap has skipped gain 125 and gone directly from 100 to 150, which is a problem if you wanted the stats for gain 125, but I'm not sure there is a way to force SharpCap to use specific gain values for the graph. Both 100 and 150 look like fairly accurate measurements compared to the published values, so I think it is possible the read-noise drop actually does happen at 125 but just isn't shown. The actual amount of read noise is slightly higher, which I think is a bit concerning, but it doesn't look too bad.

The "low read noise" modes are something I don't really trust myself. My Rising Cam branded IMX571 camera has a low read noise mode, but there is no documentation on what it actually means. I do notice that the framerate drops by exactly half when it is engaged, which makes me think it is some sort of double-exposure internal dark-calibration thing, which would not be useful at all for long-exposure imaging, where it probably gets disengaged. This would not be apparent in SharpCap measurements, since the exposures used are very short; hence my distrust of it.

If we assume your camera is not using the "LRN" mode (whatever that is), then at gain 150 you only have a discrepancy of 1.49 versus 1.417, which is not too much. Still a bummer that it's more than it should be.
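If anyone wants to sanity-check their own copy outside of SharpCap, the textbook way to get a read-noise number from exposures is a pair of bias frames: difference them to cancel fixed-pattern structure, then scale by the e-/ADU gain. This is only a minimal sketch of that idea, not SharpCap's actual code; the file names and the gain figure are placeholders for your own setup:

```python
import numpy as np
from astropy.io import fits  # assuming the frames were saved as FITS

# Two bias frames: shortest possible exposure, lens capped, same gain setting.
bias1 = fits.getdata("bias1.fits").astype(np.float64)
bias2 = fits.getdata("bias2.fits").astype(np.float64)

# Differencing cancels fixed-pattern structure (offset gradients, amp glow),
# leaving temporal noise only. A difference of two frames carries sqrt(2)
# times the single-frame noise, hence the division.
diff = bias1 - bias2
read_noise_adu = np.std(diff) / np.sqrt(2)

# Convert ADU to electrons. 0.25 e-/ADU is a placeholder - take the value for
# your gain setting from a photon-transfer measurement or the published chart.
# Note: if 12-bit data is left-shifted into a 16-bit file, divide the ADU by 16 first.
E_PER_ADU = 0.25
print(f"read noise ≈ {read_noise_adu * E_PER_ADU:.2f} e-")
```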


1 hour ago, ONIKKINEN said:

I think you are being a bit unfair here… […] Definitely not a panic over nothing.

Yes, his worry is genuine, but that doesn't mean he should be worrying about 0.1e of read noise or less, by your own estimate. The maths just doesn't support that: 0.1e is not going to have a real effect in integrations, especially for long-exposure DSO imaging. When I have time I will come back and prove to you with some modelling that 0.1e of read noise simply has no effect, as the other noise sources are more significant. As above, this is almost certainly within normal variation from sensor to sensor.

But if you don't believe me, take a look at Dr Robin Glover's (yes, the man behind SharpCap) presentation on read noise from the 2019 Practical Astronomy Show; it's on YouTube. You would spot the back of my head in the audience if you knew where to look. One of his conclusions is that read noise in modern sensors is so low in general that it is no longer a significant contributor to overall signal-to-noise ratio, so long as your exposures are not crazy short, like 10 seconds for broadband.

Here is a link:

At 39 minutes 30 seconds he shows the effect of read noise from different cameras on overall signal to noise. At 46 minutes he compares 2.5e read noise to 7e read noise, and even for that huge difference you get basically the same signal to noise as long as you expose for 120s or more.
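You can also check the maths yourself with a few lines rather than take the video's word for it. A minimal sketch, assuming the usual simplified noise model (target and sky shot noise accumulate with time, read noise is paid once per sub, dark current ignored); the signal and sky rates are made-up example numbers, not measurements from this thread:

```python
import numpy as np

def stacked_snr(signal_rate, sky_rate, read_noise, sub_len, total_time):
    """SNR of a stacked image under a simple noise model: shot noise from
    target + sky grows with exposure, read noise is added once per sub."""
    n_subs = total_time / sub_len
    signal = signal_rate * total_time
    var_per_sub = (signal_rate + sky_rate) * sub_len + read_noise**2
    return signal / np.sqrt(n_subs * var_per_sub)

# Faint target (0.1 e-/s/px) under a 2 e-/s/px sky, 2 hours total integration.
for rn in (1.38, 1.5, 2.5, 7.0):
    row = "  ".join(f"{stacked_snr(0.1, 2.0, rn, sub, 7200):.2f}"
                    for sub in (10, 60, 120, 300))
    print(f"RN {rn:>4}e  ->  SNR at 10/60/120/300s subs: {row}")
```

Run it and the 1.38e and 1.5e rows come out essentially identical for anything but very short subs, and even 2.5e versus 7e converges once the subs get long enough, which is exactly the point of the talk.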

In the end it's just not going to make much difference, and it might not even be a real difference at all, just a difference in how read noise is being calculated, for example in the outlier handling of hot pixels.

For short-exposure stuff like solar system imaging it will matter more, but the IMX533 is more often a DSO camera.

Adam

Edited by Adam J
  • Like 2

9 hours ago, Adam J said:

First things first: are you aware that ZWO and (unless I am mistaken) Player One do not use SharpCap for their chart generation? […] This is a panic over nothing. Go take some images. 🙂

 

Adam, you could not be further from the truth. My discussion and tests with Player One were long and well documented; all you had to do was ask.

 

First things first: Player One did in fact use SharpCap to compare their test camera to mine.

Player One's test camera in normal mode:

[Image: RealNormal.png]

 

Player One's test camera in LRN mode:

[Image: RealLRN.jpg]

 

Then I compared my results to theirs:

An example of my LRN mode without cooling:

[Image: readnoise.JPG]

 

An example of my LRN mode WITH cooling enabled:

[Image: readnoise3.JPG]

 

I conducted many more tests; some were clearly erroneous results, not even consistent with the gain curve. I refined my experimental environment and ran several tests under exactly the same conditions, then compiled them into a graph comparing them to Player One's results (Sean [removed word]):

[Image: readnoisegraph.png]

 

As you can see in the results, I was able to establish that LRN mode suddenly started working in one of the tests (test 5), but I was not able to reproduce it despite keeping the exact same conditions for every test. For the tests on the above graph, I used stable natural sunlight with paper as a diffuser, and I checked that the ROI was on a perfectly flat illuminated area, so systematic error was not a factor. After this, Player One said via email that the results were very strange and offered to take the camera back. When I accepted the offer, they informed me I would have to pay shipping, but the costs were too high, so I reconsidered. This went against their 30-day policy; they claimed it was not a sufficient performance failure to warrant their paying the shipping cost. I found that understandable to some degree, but still a bit disappointing, because Player One acknowledged the result was unusual and had me run extensive tests without coming to a conclusion.
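For anyone wondering what this kind of sensor analysis actually measures under the hood: it is essentially a photon-transfer test. You take pairs of frames at increasing illumination; the slope of variance against mean signal gives the gain, and the intercept gives the read-noise floor. SharpCap's exact implementation isn't public, so this is only a minimal sketch of the general method, run here on simulated data (the gain and read-noise values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_GAIN = 0.25   # e-/ADU, made-up example value
TRUE_RN = 1.4      # read noise in electrons, made-up example value

mean_adu, var_adu = [], []
for level_e in (50, 100, 200, 500, 1000):   # flat-field levels in electrons
    # Simulate a pair of flats: Poisson shot noise plus Gaussian read noise.
    f1 = rng.poisson(level_e, 200_000) / TRUE_GAIN + rng.normal(0, TRUE_RN / TRUE_GAIN, 200_000)
    f2 = rng.poisson(level_e, 200_000) / TRUE_GAIN + rng.normal(0, TRUE_RN / TRUE_GAIN, 200_000)
    mean_adu.append((f1.mean() + f2.mean()) / 2)
    var_adu.append(np.var(f1 - f2) / 2)     # pair difference cancels fixed pattern

# Variance vs mean is a straight line: slope = 1/gain, intercept = RN^2 in ADU^2.
slope, intercept = np.polyfit(mean_adu, var_adu, 1)
gain = 1.0 / slope                          # e-/ADU
print(f"recovered gain ≈ {gain:.3f} e-/ADU, "
      f"read noise ≈ {np.sqrt(max(intercept, 0)) * gain:.2f} e-")
```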

[Attachments: readnoise2.JPG, readnoise4.JPG–readnoise13.JPG]


7 hours ago, Adam J said:

Yes, his worry is genuine, but that doesn't mean he should be worrying about 0.1e of read noise or less, by your own estimate… […]

 

Adam, I have watched that video multiple times in the past, and I have actually derived his equation for calculating optimal exposure time myself, based on a desired relative noise level, the camera's noise characteristics, and the light pollution. If you run the calculation yourself, you will see that going from 1.38e to 1.5e requires (1.5/1.38)^2 = 1.18 times the original exposure length to reach the same noise target (18% longer). That is significant, and it does warrant investigation. Now, will it be noticeable in the images? It depends. If you're doing long-exposure astrophotography (fewer but longer exposures), then no. If you are doing short-exposure astrophotography, it can make a significant difference. Also, if your signal is very weak, it can be VERY noticeable. All that information is in the video you yourself referenced.
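For anyone who wants the actual numbers: rearranging that noise criterion gives the minimum sub length at which read noise adds no more than some chosen fraction to the sky-limited noise, and that exposure scales with the square of the read noise. A minimal sketch of the calculation (the 5% target and the sky rate are example values, not measurements from this thread):

```python
def min_sub_exposure(read_noise_e, sky_rate_e_per_s, extra_noise_frac=0.05):
    """Shortest sub exposure for which read noise inflates the sky-limited
    noise by at most extra_noise_frac:
        sqrt(1 + RN^2 / (sky * t)) <= 1 + extra_noise_frac
    """
    c = (1 + extra_noise_frac) ** 2 - 1   # ~ 2 * extra_noise_frac when small
    return read_noise_e ** 2 / (c * sky_rate_e_per_s)

SKY = 0.5  # e-/s/pixel, a made-up moderately dark sky
for rn in (1.38, 1.5):
    print(f"RN {rn}e -> minimum sub ≈ {min_sub_exposure(rn, SKY):.0f} s")

# Exposure scales with RN^2, hence the 18% figure above:
print(f"(1.5 / 1.38)^2 = {(1.5 / 1.38) ** 2:.2f}")
```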


9 hours ago, ONIKKINEN said:

I think you are being a bit unfair here… […] However @zernikepolynomial, if you look at your graph and the published graph, they do look similar in many parts… […]

 

My results are not similar to Player One's LRN mode. They are only similar to the normal mode, both on the Player One website and in the results I just published, except at gain 0. I still believe something happens after it jumps up from gain 0 that messes up the results, because it randomly started working properly in test 5 of the published results. The way LRN works is based on how they integrated memory into the camera; it's an easy-to-miss bit published on the Player One product page.

Anyway, I agree that the camera is still usable, just underperforming and disappointing.


1 hour ago, zernikepolynomial said:

 

Adam, I have watched that video multiple times in the past… […]

So what I will say is that you have now posted a large amount of information that you did not include in your original posts.

I had also worked out the 18%, but the thing is, in most cases people overexpose anyway, so unless you are keeping your exposures as short as possible you will probably never notice: you are likely to be above the required exposure in either case, so the read-noise difference has no effect on a stacked image. For example, when you look on AstroBin, most people are taking 300s or similar exposures, way longer than required, and producing fantastic images. So unless you want to use 20-second exposures and moving to 24-second exposures is a problem for you, you are just not going to notice it. It is also important to remember that this is 18% more sub-exposure, not 18% more total exposure.

The main thing that drives my exposure time is not trying to get the shortest possible exposures for my read noise and sky conditions, but rather that I don't want to be dealing with thousands of subs.

My guess is that Player One took 20 cameras, tested them, and presented the best results. Sensor-to-sensor variation is a real thing.

Can you post a dark frame? The general issue with this type of sensor measurement is that the stats can be affected by a small number of bad pixels that in practice calibrate out, although I don't know the exact method used by Dr Glover to reject these. Perhaps you could find someone with a camera using the same sensor to compare against.
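To illustrate what I mean about bad pixels: a handful of hot pixels can drag a naive standard deviation well above the true noise floor, while a sigma-clipped estimate barely moves. A minimal sketch, assuming a saved dark frame and a placeholder gain value:

```python
import numpy as np
from astropy.io import fits

dark = fits.getdata("dark.fits").astype(np.float64)  # placeholder file name

# Naive estimate - a few hot pixels can inflate this badly.
raw_std_adu = np.std(dark)

# Iteratively reject pixels more than 5 sigma from the median, then re-measure.
pixels = dark.ravel()
for _ in range(3):
    med, std = np.median(pixels), np.std(pixels)
    pixels = pixels[np.abs(pixels - med) < 5 * std]

# Note: a single dark also contains fixed-pattern noise; for a purely temporal
# figure, difference two darks as described earlier in the thread.
E_PER_ADU = 0.25  # placeholder: use the gain for the setting you tested
print(f"raw:     {raw_std_adu * E_PER_ADU:.2f} e-")
print(f"clipped: {np.std(pixels) * E_PER_ADU:.2f} e-")
```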

You may have said above, but have you tried a different capture program?

Try not to dwell on it and be too disappointed. It's still going to take great images in the right hands. My experience is that one of the biggest factors in end image quality is the imager themselves. 

Adam

 

 

 

Edited by Adam J
  • Like 1

Have you actually completed any astro images with the camera? With all of mine I've used all sorts of gain settings and exposure times; the end results all have noise regardless of the settings used, and all of them WILL need post-process noise reduction applied to smooth the image out. The only factors that visibly introduced more noise were a very short total integration time, imaging in blue or OIII, or imaging close to the horizon, the Moon, or any local light sources.

Edited by Elp

I think the takeaway from this thread and other posts I have read is that Player One over-promise and under-deliver on their camera data. This is not the first time I have seen this issue with their cameras, which is odd, as Player One is owned by one of the original partners in ZWO: they split after a disagreement, one stayed at ZWO and the other formed Player One. So you would think the cameras would be on par with their ASI counterparts… but that does not seem to be the case… 🤔

@Luke Newbould may well have some insight as he is a big advocate for these cameras….


45 minutes ago, Stuart1971 said:

I think the takeaway from this thread and other posts I have read is that Player One over-promise and under-deliver on their camera data… […]

 

To be fair, I generally see the Uranus-C perform better than the ZWO counterpart, and my Ares-C is also performing better at certain gain settings than the ZWO counterpart. For example, as Adam pointed out, my camera performs better at a 7k full well than a standard ZWO.


I just want to let everyone know that this is not a Player One bashing thread. I just heard about someone on the Cloudy Nights forum for whom Player One did a very nice thing and offered a replacement. They do take the time to try to solve these problems; it's just that mine is not bad enough to justify paying for shipping. The other person had a minor dust issue, but they still offered a replacement. :thumbsup:

Edited by zernikepolynomial

16 minutes ago, zernikepolynomial said:

I just want to let everyone know that this is not a Player One bashing thread… […]

Thanks for sharing your experience. I had thought about Player One, as I'm considering going to a 533MM at some point, but after reading your comment about paying for shipping, that puts me off tbh. There's no middle ground: either it's working or there's an issue, and "we will fix/replace it but you'll have to pay shipping" puts me off, which is also why they need a UK agent. I will probably stick to ZWO, knowing that if an issue arises and I bought from FLO, they'll sort it.

  • Like 1

18 hours ago, bottletopburly said:

Thanks for sharing your experience… […]

Yes, they need a UK distributor and retailer. Until then it's not attractive to me.

Adam

  • Like 1

Interesting thread! I only just saw I got tagged in this, so I'm sorry about the late response.
Unfortunately I've not had the chance to run sensor analysis on the cameras I'm currently testing; they just got unboxed, briefly tested, then mounted on my scopes, so I can't offer anything for reference, I'm afraid. First light tests went great, though, at least!

Regarding the variance observed between P1's graphs and the ones in this thread, my gut feeling is that there are a few things at play:

 

First is likely a 'silicon lottery' issue: it's probably safe to say that if you got 100 of these cameras on the bench to test, you'd get 100 very slightly different end results! (It's the same issue as in other fields; PC overclockers run into it all the time, for example.)


Second would be the test-to-test output variance in SharpCap's sensor analysis, be it through slight variances in the test environment (unavoidable without a laboratory-grade test bed, I'd guess!) or just the general shot-to-shot 'noise' in the measurements from each frame; something makes it so that SharpCap sensor analysis numbers never truly match for 5 tests in a row. I reckon the graphs published by P1 could just be from a great sensor, taking the best graph out of 10, for example? (The toy simulation after this list puts a rough number on the purely statistical part of that spread.)

 

Thirdly, environmental variables could be at play with equipment this sensitive and numbers this small! Testing on a day where you can practically feel the static crackling in the air? Using a regulated, well-earthed power supply? The test computer's earth connection? Strong sources of unshielded magnetism nearby? The list goes on, and there's probably something at play that's near impossible to anticipate too; worth considering, imo!
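On the second point, it is worth separating the purely statistical scatter from the environmental part: for full-frame measurements the pure shot-to-shot scatter turns out to be tiny, so run-to-run differences much bigger than that point at the environment or the camera's state rather than chance. A toy simulation, with made-up frame size and noise level:

```python
import numpy as np

rng = np.random.default_rng(1)
TRUE_RN = 1.4                    # 'true' read noise in electrons, made up

estimates = []
for _ in range(10):              # ten repeated 'sensor analysis' runs
    # Each run: difference of one pair of simulated 512x512 bias frames.
    diff = rng.normal(0, TRUE_RN, (512, 512)) - rng.normal(0, TRUE_RN, (512, 512))
    estimates.append(np.std(diff) / np.sqrt(2))

spread = 100 * (max(estimates) - min(estimates)) / TRUE_RN
print(f"min {min(estimates):.3f} e-, max {max(estimates):.3f} e-, "
      f"spread ≈ {spread:.2f}% of the true value")
```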

 

Hope that helps! 🙂


On 08/07/2023 at 20:55, Luke Newbould said:

Interesting thread! I only just saw I got tagged in this… […]

 

The tests I conducted were pretty rigorous and well thought out: stable natural sunlight, a flat ROI, a grounded, separately powered USB hub (also tested with a direct computer connection; no difference), and a paper diffuser. Another user on CN conducted his own tests and came out with a very similar result to mine.

The smoking gun for this being about LRN engaging properly or not is test 5. Test 5 followed its own complete curve, well outside the standard deviation of the other tests. Sean from Player One stated that if it were an instability, the curve would be unstable, with the points at each gain deviating wildly; that was not the case, and it looks like a true systematic change in test 5. The silicon cannot get better across one whole run of gains and then revert for the same gain ranges in the other runs. Again, if it were instability in the system, we would expect wild results at individual gains.

This is why we need more tests from other people: if many or some of the cameras are having issues engaging LRN properly, it might be fixable. It might even be a SharpCap issue with the camera, resolvable with an update. As far as hardware goes, read noise tends to be related to the amplifier and the ADC. One interesting thing to note, in mine and other people's tests with this camera: engaging the cooler results in a very slight bump in read noise. There could be MANY reasons for that behaviour, so I am not worried about it unless it becomes more noticeable, like this general LRN issue.

  • Like 1

This might not be relevant, but what version of SharpCap are you using? I read somewhere that you need to use the beta version with the Player One cameras. I am not sure if this would make any difference to the issues raised in this thread, though.

  • Like 1

On 16/07/2023 at 01:20, AstroPCB said:

This might not be relevant, but what version of SharpCap are you using? […]

It was already tried; it made no difference.

  • Like 1
