Adam J

Members
  • Posts: 4,955
  • Joined
  • Last visited
  • Days Won: 1

Everything posted by Adam J

  1. An interesting thought occurs to me: turning the settings up to the max, while not visually pleasing, might produce extremes of results that give a clue as to the underlying process or direction being used by the AI. Adam
  2. It only matters to me for three reasons: 1) if the result looks fake; 2) if the result unintentionally adds elements that did not appear at all in the original image; 3) if the result removes elements from the original image that you don't want to lose. For me the example above covers all three. 1) I find the stars too perfect to be believable. 2) A weird approximation of a planetary nebula was added. 3) In adding that nebula-like object, a star was totally removed from the image rather than repaired. It could well be the settings the OP has selected causing these issues, but that's another reason to take care. Now, does that mean I would not use it myself? No, I would use it. It means that you need to keep a very close eye on what it's doing. For me the biggest sin is adding anything not real to the image, as it moves the result too close to the art side of the hobby and too far from the science side. To put a finger on it, I believe it's sometimes treating bright linear structures as stars. Another one I have seen is it creating little blobs on diffraction spikes. All in all it doesn't matter, but it makes me think more is going on than just the claimed AI-administered deconvolution. So I would be checking my images. It's just an opinion, don't be offended by it. Adam
  3. My issue is that I have seen them in images from a couple of well known and liked forum users. I had cut some examples out of those images and was about to post, then thought twice, as I don't want them to see it as a commentary on their images; they are still good images. In the end, if you want to go looking, take a look at recent Cygnus Loop images and, if you are sufficiently familiar with the target, you will spot the anomalies. I have decided that, for me, winning a technical debate is not worth it if it involves making someone I like feel bad about their image. I am willing to comment on not liking the result here, as the OP opened a debate on the subject. One thing I might do is reproduce it myself, but I am a busy man, so that will be on my own time, not just because you demand it of me. I suppose I could post it just to you via PM, but I get the sense that in itself won't get me too far...
  4. OK, wide field for me means around 200mm focal length. What's your idea of wide? An FMA180pro fits your budget.
  5. And StarXTerminator too. I have seen it remove small planetary nebulae before, and glowing edges to nebulosity too. For example, I imaged the Question Mark Nebula and, by blinking between the before and after images in PS, I picked up a shocking number of nebula elements that it had removed along with the star field (a simple difference-image check, like the sketch after this list, shows exactly what was stripped out). Looking at other people's recent images of that target, there is a pronounced epidemic of missing objects within their images, and it's all caused by blind trust in the star removal program. Adam
  6. No, I know it's not taking the stars from the Hubble images; that would be ludicrous and that's not how it works. But the issue with AI is that it's not really possible to know what it's doing. I am surprised that he used Hubble images though, as that would, in my view, have taught the AI to add diffraction spikes to the stars. What I am saying is that it has created a mathematical model of what a star should look like based on what it's been shown (unsurprisingly a Gaussian profile), and then, using that model, it is replacing the stars with AI-generated stars that conform to that model. Hence it is not adjusting the stars or fixing them. It is sampling them to determine what the star colour and luminosity profile should be and then mathematically generating a replacement (the Gaussian-replacement sketch after this list illustrates the idea). There is no need for it to refer to another image, because it's been taught what a star should look like. If it was looking at the distortion you see in the original image, modelling it and then adjusting it, that would never result in the bright star being removed, as it would apply the same model to that star. It probably uses a totally separate model (not because it was told to, but because it has learnt to) to operate on nebulosity or galaxies etc. What it has done with that star is fail to identify it as a star, remove it for some reason, and then treat it with the nebula mode. I can't say for sure, of course, but that is my hypothesis. The fact that it can turn a star into a planetary nebula, however, shows that the model is performing extrapolation and that Blur XT is not purely acting on the data it has; it is applying a model and generating new data that it is placing into the image. It's done that because the AI made a mistake, but it is revealing none the less. The problem with your hypothesis of how it works is that I have seen examples of bright nebulosity being turned into stars by Blur XT. I have seen a section of the Witch's Broom recently posted on here where a filament of nebula has been turned into something that looks like stars on a string, like a pearl necklace. Those stars don't exist; there was nothing for it to correct; it has added them from scratch. I'll go find you a good example of that. None of that makes it invalid as a tool, even if I don't personally like the results, but I think it's important to be aware that your resulting image is left with some artificially generated elements within it. Adam
  7. The thing is that it's very difficult to know what the AI is really doing. In the end, its goal is to show you what it thinks you want to see. But I think it's a mistake to think that it is "reshaping" the stars. It's much more likely that it is simply sampling what is present and replacing it (not reshaping it) with a 2D Gaussian profile. In that sense it's closer to being computer generated than it is enhanced. But that, for me, is not the issue; the issue is that it looks computer generated. So what would I do with that data? I would prefer a starless image to one with fake-looking stars added back into it. It's like my wife using AI filters on her photos to change how she looks. Yeah, she looks younger, but a younger, fake-looking version of my wife. Hence I tell her I would rather look at the real thing than something I know is not real. Just my view. Adam
  8. My view with this sort of extreme AI processing is that if you are willing to do this, you may as well be willing to use a tool that takes the contents of a star catalogue and artificially generates the star field from the data. My honest opinion is that your end result simply looks fake. It catches the eye as odd in the same way as early CGI: too perfect, too symmetric. Adam
  9. The Riccardi is not as harsh and is a better balanced reducer. Adam
  10. The 294MM will work with 1.25-inch filters for systems slower than F4. Adam
  11. Yeah, and leave you with a huge, expensive and heavy filter wheel that you will never use unless you get a full frame mono camera. If he is thinking about sticking with 1.25-inch to save money, I really don't think the cost of 2-inch filters and a wheel is going to be a player. Oh, and the cost of a scope capable of covering a full frame sensor, even if you had one. Adam
  12. So, as above, my AD and Baader 1.25-inch filters are about 27-28mm; 1.25 inch is the thread size, not the size of the clear aperture, so the cell wall has thickness. Even 31mm unmounted filters sit on a lip that is smaller than the 1.25-inch thread and reduces the clear aperture by about 2mm, to 29mm. None of that stops you using the filters with a 2600; it will just result in vignetting in the corners that you may need to crop out if not fully corrected by flats. You will likely still get use of over 80% of the sensor area (the rough geometry sketch after this list shows why), so still more than a 294 would give you. But long term you will want to switch to 36mm filters to get the most from your new camera. However, all in all, if I was not able to afford quality 36mm filters in the near future, I would get the 294MM and stick with the current filters, as flats will correct them fine so long as you are not imaging faster than about F4. Adam
  13. It looks like the end of the line for the ASI1600MM Pro may be incoming; certainly it has been removed from sale on ZWO's own site, and that's normally a sign that a camera is on its way out. https://astronomy-imaging-camera.com/product-category/cameras/dso-cameras/ Only the GT is still showing. Adam
  14. I would be shocked if it was not compatible. I am sure they will get back to you. Adam
  15. My advice is to buy a flattener or a reducer before you buy a dedicated camera. Adam
  16. Yeah, that's not normal. Just make sure it's not your power supply causing issues before you consider returning it. Adam
  17. Got confused with the model number; I was thinking of a different camera. Not a fan of their model numbering system.
  18. It's not the camera as such if it's working in SharpCap; the other platforms just haven't got support fully sorted out yet, or it's a driver issue. I would give it a couple of months before assuming it's going to be a problem. That's the trouble with being an early adopter.
  19. Yeah, but remember you need at least 24 exposures for dithering to be able to properly reject hot pixels.
  20. I think I answered this one recently, but there is a difference in the specified bandpass for the blue filter: 400nm for the new one vs 380nm for the old one. There may be other tweaks to the coatings, but all in all you can still use the old CCD set on CMOS cameras without issues, as I do. So no, it's not a pure rebranding. Adam
  21. This sort of thing is exactly why I will not be "upgrading" from my 5nm Astrodon filters to Chinese-made filters any time soon, even if I can supposedly get 3nm Chinese filters for less than the cost of 5nm AD filters. Adam
  22. I don't think that would be the case. The big risk with their kit is putting the power supply into one of the outputs instead of the input; there is very little to go wrong apart from that. Certainly, on balance of risk, I would not be paying £60 for a low-capacity power supply. Also, if buying from FLO, I would be shocked if you got treated like that. Adam
  23. I am sure that any 12 - 13.8 volt supply will do just fine. Adam
  24. You simply measure the bearing: one number will be the inside diameter, another the outside diameter, and the final one the depth of the bearing. Then search Google for those measurements and select a nice shielded ceramic one. Adam
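A minimal sketch of the before/after check described in post 5, assuming the original frame and the starless output are the same size and already registered (they will be if one was generated from the other). The file names are hypothetical and it assumes numpy and Pillow are available; the idea is simply that anything the star-removal tool stripped out, stars or otherwise, shows up in the difference image.

```python
import numpy as np
from PIL import Image

# Hypothetical file names: the frame before star removal and the starless result.
original = np.asarray(Image.open("before_star_removal.tif"), dtype=np.float32)
starless = np.asarray(Image.open("after_star_removal.tif"), dtype=np.float32)

# Everything the tool removed (stars, but also any nebulosity it mistook for
# stars) appears as positive values in the difference image.
removed = np.clip(original - starless, 0, None)

# Normalise to 8 bits and save so the removed structure can be inspected by eye.
removed = removed / max(removed.max(), 1e-6) * 255.0
Image.fromarray(removed.astype(np.uint8)).save("removed_by_tool.tif")
```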
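The Gaussian-replacement sketch below is purely an illustration of the hypothesis in posts 6 and 7: sample a star's local background and peak brightness, then overwrite it with an idealised 2D Gaussian. It is not a claim about how BlurXTerminator is actually implemented; it just makes the distinction between "reshaping" an existing star and generating a replacement concrete. It assumes a single-channel float image and a star safely away from the frame edge.

```python
import numpy as np

def replace_with_gaussian(img, x0, y0, radius=10, sigma=2.0):
    """Overwrite the star centred at (x0, y0) with a synthetic Gaussian profile."""
    patch = img[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1]
    background = np.median(patch)      # crude local background estimate
    peak = patch.max() - background    # sampled star brightness

    # Idealised replacement profile: the original shape (seeing, guiding errors,
    # optical aberrations) is discarded entirely; only the sampled peak survives.
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    gaussian = peak * np.exp(-(x**2 + y**2) / (2 * sigma**2))

    img[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1] = background + gaussian
    return img
```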
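And a rough geometry sketch behind the "over 80% of the sensor area" figure in post 12. The sensor dimensions are the published IMX571 values and the 27mm clear aperture comes from the post; the 15mm filter-to-sensor gap and the F5 focal ratio are assumptions of mine purely for illustration. The model is deliberately crude (the beam footprint on the filter is treated as shifting directly with field position), so treat the printed number as an order-of-magnitude estimate rather than a prediction for any particular setup.

```python
import numpy as np

sensor_w, sensor_h = 23.5, 15.7   # IMX571 (ASI2600) sensor dimensions, mm
clear_aperture = 27.0             # typical mounted 1.25-inch filter, mm
filter_distance = 15.0            # assumed filter-to-sensor gap, mm
f_ratio = 5.0                     # assumed focal ratio

# A field point is fully illuminated only if its off-axis distance plus half the
# beam footprint on the filter still fits inside the clear aperture.
beam_radius = filter_distance / (2 * f_ratio)
usable_radius = clear_aperture / 2 - beam_radius

# Sample the sensor on a grid and count the fully illuminated fraction.
x = np.linspace(-sensor_w / 2, sensor_w / 2, 1000)
y = np.linspace(-sensor_h / 2, sensor_h / 2, 1000)
xx, yy = np.meshgrid(x, y)
fraction = ((xx**2 + yy**2) <= usable_radius**2).mean()

print(f"Fully illuminated fraction of the sensor: {fraction:.0%}")
```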