Adam J

Members
  • Posts: 4,939
  • Days Won: 1

Everything posted by Adam J

  1. Got to be honest, that is not its only issue: the collimation is off and it has a touch of astigmatism too. Still, that might be hard to see hidden under the CA. Adam
  2. It doesn't quite work as simply as that; it's not just about the focal point being different, it's about spherical error changing with wavelength. Hence, although you can focus for R, G and B individually, at best focus B and sometimes R will not be as sharp as G, and that leads to blue bloat or a purple halo irrespective of refocusing. Longitudinal chromatic error and spherochromatism are traded against each other in all lens designs, so if you want to bring all colours to focus at the same point you normally end up trading against a reduction in focus quality for some wavelengths, normally blue. Also, as others point out, you can't do this for Lum anyway (there is a rough blur-circle sketch of that case after this list). To avoid this design compromise you need a triplet design slower than F6; a quad design will do better and a doublet worse. In most cases modern processing will fix things for you, but for galaxy imaging you're going to inevitably lose some resolution. Adam
  3. The cooler is not in direct contact with the sensor. The pins for the sensor are on the rear, so heat is conducted away via these, which causes some non-uniformity in cooling. Once the tablets have been refreshed you need to leave them to dry the chamber for 48 hours or so before re-testing. Adam
  4. As others have said Ivan, it's on the sensor and caused by moisture in the chamber, and it's just going to keep getting worse. Your best bet is to open it up and refresh the desiccant. Adam
  5. It will likely have some bloat in the blue and not be what I would consider a true apo, but it should be manageable with blue filters cutting at 420-430nm or higher and/or a UV/IR cut like an Astronomik L3 or Baader UV/IR. Once you do that, I am sure it's going to be great for the money. The reason I conclude that is hidden in the spot diagram: Askar normally generate the blue line at 430nm, but for some 'inexplicable' reason they chose to specify it at 440nm for this scope. That says it all to me; it's not a true APO, but it's good for its price. Adam
  6. OMG you are right, OP why have you not bought it yet lol. Also, mono is much faster in terms of data collection than an OSC due to lum. Just more effort, there is a difference. The effort is worth it. Adam
  7. Just to be clear, are the flats not working? You don't explicitly say, or show a stacked image.
  8. If your budget is about £300 then this is impossible to beat at the price, but you would need to pick up as a minimum an LRGB set or an Ha filter to get started. https://www.astrobuysell.com/uk/propview.php?view=201930 Four years back you would have been looking at £900 for this camera used and £1250 new. If you don't like that, there is also an Orion 533MC camera on Astro Buy Sell at the moment for £550. These days you have more options, so unless you can only muster £250 max I can't recommend a DSLR any more to start out. Adam
  9. My honest opinion is that you need to switch out that telescope; the NEQ3-2 Pro is not going to handle it for astrophotography. It will struggle with anything bigger than a 60mm refractor, and the shorter the focal length the better. If you change the scope you can make the mount and the camera work for wide-field targets. I assume the X-T2 is not modified; if so, you will need to choose your targets with care. Adam
  10. An interesting thought occurs to me: turning the settings up to the max, while not visually pleasing, might produce extremes of results that give a clue as to the underlying process or direction being used by the AI. Adam
  11. It only matters to me for three reasons: 1) if the result looks fake; 2) if the result unintentionally adds elements that did not appear at all in the original image; 3) if the result removes elements from the original image that you don't want to lose. For me the example above covers all three. 1) I find the stars too perfect to be believable. 2) A weird approximation of a planetary nebula was added. 3) In adding that nebula-like object, a star was totally removed from the image as opposed to repaired. It could well be settings that the OP has selected causing these issues, but that's another reason to take care. Now, does that mean I would not use it myself? No, I would use it. It means that you need to keep a very close eye on what it's doing. For me the biggest sin is adding anything not real to the image, as it moves things too close to the art side and too far from the science side of the hobby. To put a finger on it, I believe it's sometimes treating bright linear structures as stars. Another one I have seen is it creating little blobs on diffraction spikes. All in all it doesn't matter, but it makes me think more is going on than just the claimed AI-administered deconvolution. So I would be checking my images. It's just an opinion, don't be offended by it. Adam
  12. My issue is I have seen them in images from a couple of well known and liked forum users. I had cut some examples out of those images and was about to post, then thought twice, as I don't want them to see it as a commentary on their image; they are still good images. In the end, if you want to go looking, take a look at recent Cygnus Loop images and, if you are sufficiently familiar with the target, you will spot anomalies. I have decided that for me winning a technical debate is not worth it if it involves making someone I like feel bad about their image. I am willing to comment on not liking the result here as the OP opened a debate on the subject. One thing I might do is reproduce it myself, but I am a busy man, so that will be on my own time, not just because you demand it of me. I suppose I could post it just to you via PM, but I get the sense that in itself won't get me too far...
  13. OK, wide field for me means around 200mm focal length. What's your idea of wide? An FMA180 Pro fits your budget.
  14. And StarXTerminator too. I have seen it remove small planetary nebulae before, and glowing edges to nebulosity too. For example, I imaged the Question Mark Nebula and, by blinking between the before and after images in PS, I picked up a shocking number of nebula elements that it had removed along with the star field. Looking at other people's images of that target recently, there is a pronounced epidemic of missing objects within their images, and it's all caused by blind trust in the star removal program. Adam
  15. No, I know it's not taking the stars from the Hubble images, that would be ludicrous and that's not how it works. But the issue with AI is that it's not really possible to know what it's doing. I am surprised that he used Hubble images though, as that would in my view have taught the AI to add diffraction spikes to the stars. What I am saying is that it has created a mathematical model of what a star should look like based on what it's been shown (unsurprisingly a Gaussian profile), and then, using that model, it is replacing the stars with AI-generated stars that conform to that model. Hence it is not adjusting the stars or fixing them. It is sampling them to determine what the star colour and the luminosity profile should be and then mathematically generating a replacement (I've put a toy sketch of that 'sample then regenerate' idea after this list). There is no need for it to refer to another image, because it's been taught what a star should look like. If it was looking at the distortion you see in the original image, modelling it and then adjusting it, it would never result in the bright star being removed, as it would apply the same model to that star. It probably uses a totally separate model (not because it was told to but because it has learnt to) to operate on nebulosity or galaxies etc. What it has done with that star is fail to identify it as a star, remove it for some reason, and then treat it with the nebula model. I can't say for sure of course, but that is my hypothesis. The fact that it can turn a star into a planetary nebula, however, shows that the model is performing extrapolation and that Blur XT is not just purely acting on the data it has; it is applying a model and generating new data that it is placing into the image. It's done that because the AI made a mistake, but it is revealing nonetheless. The problem with your hypothesis of how it works is that I have seen examples of bright nebulosity being turned into stars by Blur XT. I have seen a section of the Witch's Broom recently posted on here where a filament of nebula has been turned into something that looks like stars on a string, like a pearl necklace. Those stars don't exist, there is nothing for it to correct; it's added them from scratch into the image. I'll go find you a good example of that. None of that makes it invalid as a tool, even if I don't personally like the results, but I think it's important to be aware that your resulting image is left with some artificially generated elements within it. Adam
  16. The thing is, it's very difficult to know what the AI is really doing. In the end its goal is to show you what it thinks you want to see. But I think it's a mistake to think that it is "reshaping" the stars. It's much more likely that it is simply sampling what is present and replacing it (not reshaping it) with a 2D Gaussian profile. In that sense it's closer to being computer generated than it is enhanced. But that for me is not the issue; the issue is that it looks computer generated. So what would I do with that data? I would prefer a starless image to one with fake-looking stars added back into it. It's like my wife using AI filters on her photos to change how she looks. Yeah, she looks younger, but a younger fake-looking version of my wife. Hence I tell her I would rather look at the real thing than something I know is not real. Just my view. Adam
  17. My view with this sort of extreme of AI processing is that if you are willing to do this, you may as well be willing to use a tool that takes the contents of a star catalogue and artificially generates the star field from the data. My honest opinion is that your end result simply looks fake. It catches the eye as odd in the same way as early CGI: too perfect, too symmetric. Adam
  18. The Riccardi is not as harsh and is a better balanced reducer. Adam
  19. The 294MM will work with 1.25 inch filters for systems slower than F4. Adam
  20. Yeah, and leave you with a huge, expensive and heavy filter wheel that you will never use unless you get a full-frame mono camera. If he is thinking about sticking with 1.25 inch to save money, I really don't think the cost of 2 inch filters and a wheel is going to be an option. Oh, and the cost of a scope capable of covering a full-frame sensor, even if you had one. Adam
  21. So, as above, my AD and Baader 1.25 inch filters are about 27-28mm; the 1.25 inch is the thread size, not the size of the clear aperture, so the cell wall has thickness. Even 31mm unmounted filters sit on a lip that is smaller than the 1.25 inch thread and reduces the clear aperture by about 2mm to 29mm. None of that stops you using the filters with a 2600; it will just result in vignetting in the corners that you may need to crop out if not fully corrected by flats (there's a rough clear-aperture check sketched after this list). You will likely still get use of over 80% of the sensor area, so still more than a 294 would give you. But long term you will want to switch to 36mm filters to get the most from your new camera. However, all in all, if I was not able to afford quality 36mm filters in the near future I would get the 294MM and stick with the current filters, as flats will correct them fine so long as you are not imaging at less than about F4. Adam
  22. It looks like the end of the line for the ASI1600MM Pro may be incoming; certainly it's been removed from sale on ZWO's own site, and that's normally a sign that a camera is on its way out. https://astronomy-imaging-camera.com/product-category/cameras/dso-cameras/ Only the GT is still showing. Adam
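On post 2, a minimal sketch of the longitudinal part of that trade, using invented focal-shift numbers (real values depend on the glass and the design). At green best focus, which is effectively the Lum case where you cannot refocus per wavelength, a residual focus shift of delta at f/N turns into a blur circle of roughly delta/N, which is where the blue bloat comes from.

```python
# Rough geometry only, with made-up focal-shift numbers: at green best focus
# (e.g. shooting Lum, where you cannot refocus per wavelength), a residual
# longitudinal focus shift of delta at f/N gives a blur circle of about delta/N.
f_ratio = 6.0
pixel_um = 3.76                        # assumed pixel size, just for comparison
focal_shift_um = {"G 550nm": 0.0, "R 650nm": 40.0, "B 430nm": 120.0}  # invented
for band, shift in focal_shift_um.items():
    blur_um = shift / f_ratio          # approximate geometric blur diameter
    print(f"{band}: ~{blur_um:.1f} um blur vs a {pixel_um} um pixel")
```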
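On posts 15 and 16, here is a toy sketch of the 'sample the star, then regenerate it from a model' idea. This is not BlurXTerminator's actual algorithm, just an illustration of the distinction I am drawing between reshaping the existing pixels and replacing them with a model-generated star; the flux, FWHM and colour numbers are invented.

```python
# Toy illustration of "sample then regenerate" (NOT the real Blur XT algorithm):
# measure a couple of parameters from the original star, then synthesise a
# perfectly round 2D Gaussian replacement instead of reshaping the real pixels.
import numpy as np

def gaussian_star(size, flux, fwhm, colour=(1.0, 0.8, 0.6)):
    """Return an RGB stamp of an idealised model star."""
    sigma = fwhm / 2.3548                          # FWHM -> sigma
    y, x = np.mgrid[0:size, 0:size]
    r2 = (x - size / 2) ** 2 + (y - size / 2) ** 2
    profile = np.exp(-r2 / (2 * sigma ** 2))
    profile *= flux / profile.sum()                # carry the measured flux
    return profile[..., None] * np.array(colour)   # tint with the sampled colour

# Pretend these came from photometry / a PSF fit on the original star:
measured_flux, measured_fwhm = 5200.0, 3.1
replacement = gaussian_star(size=25, flux=measured_flux, fwhm=measured_fwhm)
print(replacement.shape)   # (25, 25, 3) stamp ready to drop back into the image
```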
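And for posts 19 and 21, a rough clear-aperture check using the usual rule of thumb: the filter needs to cover the sensor diagonal plus the light cone's spread over the filter-to-sensor distance, which grows as the f-ratio gets faster. The ~15mm filter-to-sensor spacing is an assumed figure, so check your own train; this is geometry, not a ray trace.

```python
# Rule-of-thumb vignetting check (assumed spacing, not a ray trace): a filter
# needs a clear aperture of roughly sensor diagonal + (filter-to-sensor
# distance) / f-ratio to avoid clipping the light cone at the corners.
def min_clear_aperture(sensor_w, sensor_h, filter_to_sensor, f_ratio):
    diagonal = (sensor_w ** 2 + sensor_h ** 2) ** 0.5
    return diagonal + filter_to_sensor / f_ratio

sensors = {"2600 (APS-C, 23.5x15.7mm)": (23.5, 15.7),
           "294 (4/3, 19.1x13.0mm)": (19.1, 13.0)}
for name, (w, h) in sensors.items():
    for f in (4.0, 5.0, 7.0):
        need = min_clear_aperture(w, h, filter_to_sensor=15.0, f_ratio=f)
        print(f"{name} at f/{f:.0f}: need ~{need:.1f}mm "
              f"vs ~29mm usable in a mounted 1.25 inch filter")
```

On those assumed numbers the 2600 comes out a couple of millimetres short in the corners while the 294 clears the ~29mm figure, which is the reasoning behind "flats will cope, but 36mm filters are the long-term answer" above.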