Everything posted by wimvb

  1. I don't use APP, so I can't advise on its use. I work exclusively in PixInsight. So, assuming that APP works in a similar way to PI: in Photoshop you have to apply a permanent stretch in order to see anything on screen. PI (and I assume APP too) can apply a screen stretch. This means that the image is stretched on screen, but not in reality. Any processes you apply are done on the unstretched (linear) image. Only when you apply a permanent stretch will it be applied to the image. All processes after that are applied to the stretched, non-linear image. If you upload the tiff (or fits) file as it comes from your stacking program, i.e. without any other processing applied and unstretched, I can have a look at it.
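To make the distinction concrete, here is a minimal numpy sketch of a midtones transfer function, the kind of curve PixInsight uses for stretching (the function shape is the standard MTF, but the sample values are made up, and this is only an illustration of screen vs permanent stretch, not PI's actual code):

```python
import numpy as np

def mtf(x, m):
    """Midtones transfer function: maps the midtones level m to 0.5,
    while keeping 0 -> 0 and 1 -> 1."""
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

# Linear stack output, normalised to [0, 1] (made-up sample values)
linear = np.array([0.001, 0.01, 0.1])

# Screen stretch: a stretched COPY is computed for display only;
# `linear` itself is untouched, so any process still sees linear data
display = mtf(linear, 0.01)

# Permanent stretch: the working data itself is replaced
permanent = mtf(linear, 0.01)
```

The point is simply that the screen stretch produces a throwaway copy for the monitor, while the permanent stretch overwrites the working data.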
  2. They cite astronomy as one possible instance. Since most cameras are found in devices that are thinner than a centimetre, and lenses need to become more advanced, we may very well see a market for small curved sensors in mobile devices. It's usually the cheapest solution that wins, and a curved sensor with a simple lens may be cheaper than a flat sensor with a more complex lens. The lenses on that web site are probably not typical for the application of curved sensors once the technology matures. When I worked in the semiconductor industry, colleagues sneered at me when I told them about these exciting new organic LED devices I had heard about at a conference. "Never" was the general consensus. That was almost 30 years ago. Nowadays we have OLED TV screens. I've learned never to say never. And who knows, we may need them in f/1 fast astrographs. 😁
  3. In another image on the company web site, the sensor does look curved, so the flat appearance is probably just an illusion, or due to the imaging angle.
  4. Probably. Cheapest for certain. Square peg and round hole. We're just not sure yet how that hole curves.
  5. Very interesting development. The curvature radius doesn't have to be an exact match to the focal plane curvature; it only needs to be significantly better than a flat sensor. But I do wonder about long-term characteristics. Bending silicon induces stress in the material, and this creates defects in the crystal. This in turn can lead to lower quantum efficiency and a shorter lifetime. It's certainly not a trivial feat to make a good and stable curved sensor.
  6. What @tomato wrote. Oddly enough, you can't use a white point (255, 255, 255) as reference, because that may leave midtones with a colour cast. PixInsight takes (certain) non-saturated stars as white reference, or the entire galaxy. I assume that the Calibrate Star Colour tool in APP does something similar. This is best done in the linear stage, before stretching throws the midtone balance off.
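As a sketch of the star-based white-reference idea (not APP's or PI's actual algorithm, and all flux numbers below are made up): average the measured colour of a few unsaturated stars, then scale the channels so that this average comes out neutral:

```python
import numpy as np

# Hypothetical (R, G, B) fluxes of a few unsaturated reference stars
stars = np.array([
    [1.10, 1.00, 0.85],
    [1.05, 1.00, 0.90],
    [1.20, 1.00, 0.80],
])

# The average star colour serves as the white reference
white_ref = stars.mean(axis=0)

# Per-channel factors (relative to green) that make the reference neutral
scale = white_ref[1] / white_ref

# Multiplying the linear image channels by `scale` would neutralise
# the average star colour without forcing the brightest pixel to white
```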
  7. I also don't use TGVDenoise any more nowadays. This is partly because I use longer integration times from a much darker site, and partly because it introduces very subtle artefacts in my images. But the main reason I posted that link was to show you MMT on colour noise. I tend to look at an image in terms of lightness and colour, luminance and chrominance, and I use the appropriate colour space for noise reduction. The small-scale "salt and pepper" noise is usually lightness noise; I mostly use MLT to reduce that. Larger-scale colour mottle doesn't show up in the L channel. That I treat with MMT. Many people just blur the RGB image to get rid of colour noise, but I don't like that.
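The luminance/chrominance split can be sketched in a few lines of numpy (a toy 1-D example with synthetic data; real tools like MMT work on multiscale layers of a 2-D image, so this only shows the separation idea):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "image": a smooth lightness ramp plus independent colour noise
signal = np.linspace(0.2, 0.8, 64)
rgb = np.stack([signal + 0.05 * rng.standard_normal(64) for _ in range(3)])

# Luminance = per-pixel channel mean; chrominance = per-channel residual
lum = rgb.mean(axis=0)
chroma = rgb - lum

# Smooth ONLY the chrominance; the luminance detail is left untouched
kernel = np.ones(5) / 5
chroma_smooth = np.array([np.convolve(c, kernel, mode="same") for c in chroma])

denoised = lum + chroma_smooth
```

Because the chrominance residuals average to zero at every pixel, smoothing them leaves the luminance exactly as it was; only the colour mottle is suppressed.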
  8. I set the mount parking feature in the mount module for that:
       • Focus
       • Frame
       • Guide
       • Capture
       • Set mount parking to about 20 minutes after the sequence ends
       • Off to bed (or a good movie first)
     I know the scheduler automates much of this, but I like the illusion of being in control.
  9. Excellent. If you want to improve this further, you can reduce the colour mottle (chrominance noise) with multiscale median transform. Have a look here: http://wimvberlo.blogspot.com/search/label/noise%20reduction
  10. The 10 arcseconds default is more than many mounts can achieve. Any small backlash will result in eternal corrections. Better to use e.g. 30 arcseconds. That's still only a small percentage of the sensor width.
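The arithmetic behind "a small percentage" (all hardware numbers below are hypothetical): pixel scale in arcsec/pixel is 206.265 × pixel size (µm) / focal length (mm), and the limit is compared against the full sensor width at that scale:

```python
# Hypothetical rig: 3.76 um pixels, 6248 px wide sensor, 530 mm focal length
pixel_um = 3.76
width_px = 6248
focal_mm = 530.0

arcsec_per_px = 206.265 * pixel_um / focal_mm     # roughly 1.46 "/px
sensor_width_arcsec = arcsec_per_px * width_px    # roughly 9100 arcseconds

limit = 30.0  # deviation limit in arcseconds
fraction = limit / sensor_width_arcsec            # well under 1 % of the width
```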
  11. I downloaded and examined your image. The histogram peaks are aligned close enough, but that only got you a fairly neutral background. You also have to get the white point right. I don't know enough about Astro Pixel Processor, but suggest you have a look at their tutorials.
  12. Do you have a copy of Making Every Photon Count by our very own @steppenwolf (aka Steve Richards)? To use his terms:
       • Find and Frame: the alignment module in Ekos
       • Focus: the focus module in Ekos
       • Follow: the guide module in Ekos
       • Film (a bit of artistic licence on his part): the capture module in Ekos
     More on Ekos here: https://indilib.org/about/ekos.html
  13. I assume that you also have a dew shield? The combination of shield and heater keeps dew off my maknewt, even if all else is dripping wet.
  14. I'd advise against using the Scheduler if you're just starting with Ekos. Use the alignment module as per @rickwayne's reply. Then start guiding (the guide module in Ekos), and finally go over to the capture module to set up a capture sequence. The Scheduler will automate all this, but get to know what each module does before automating. I've been using INDI for about 3 - 4 years and I still only use the scheduler on rare occasions.
  15. Do GoTo with your camera, not an eyepiece. As @Davey-T suggested: test on a bright and familiar star first.
  16. Here's my attempt. As I wrote before, there's not much more to pull out of the data.
  17. Nice. I liked the previous, colourful version too.
  18. That may not be necessary. The essence of CMOS workflows is to gather many short subs. Because of the low read noise, this allows you to gain dynamic range while keeping bright areas (stars and core) from blowing out. Beware: this comes at the cost of many megabytes or gigabytes of data.
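A back-of-the-envelope check on why low read noise makes many short subs viable (a simplified shot-noise-plus-read-noise model; the flux, noise, and exposure numbers are made up, and dark current and sky background are ignored):

```python
import math

rate = 2.0        # target flux in photo-electrons per second (assumed)
read_noise = 1.5  # e- RMS, typical of a modern CMOS sensor (assumed)
t_sub = 60.0      # seconds per sub
n_subs = 60       # one hour total integration

signal = rate * t_sub * n_subs

# Stack of many short subs: read noise is paid once per sub
snr_stack = signal / math.sqrt(signal + n_subs * read_noise**2)

# One hypothetical exposure of the same total length: read noise paid once
snr_single = signal / math.sqrt(signal + read_noise**2)
```

With read noise this low, the stack comes within a couple of percent of a single long exposure of the same total length, while each 60 s sub is far less likely to blow out stars and the core.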
  19. Much better. I played with your image in Pix, and got about the same results as you. Differences are only in the details. I will post my results later. I noticed that the core was blown out, so you might test shorter exposure times, maybe 90 or 120 seconds, or a lower gain.
  20. All the above about focus is for a guide scope, basically. For an OAG you can get elongated/misshapen stars when defocused, and guide programs may not even be able to find a star to guide on if focus is off. Besides, even guiders (OAG or scope) are subject to star light spreading due to atmospheric conditions, and will have a finite star width at best focus. Just make sure that the guide camera isn't too much undersampled.
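A quick sampling sanity check for the guider, using the same pixel-scale formula (206.265 × pixel size in µm / focal length in mm); the hardware numbers below are hypothetical, and the "a few times the imaging scale" guideline is a common rule of thumb, not a hard limit:

```python
# Guide camera on a short guide scope vs the main imaging train
guide_scale = 206.265 * 2.9 / 240.0    # arcsec/px for the guider (assumed optics)
image_scale = 206.265 * 3.76 / 530.0   # arcsec/px for the imaging camera (assumed)

# Sub-pixel centroiding tolerates a coarser guider scale, but the ratio
# shouldn't grow to many times the imaging scale
ratio = guide_scale / image_scale
```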
  21. Is APP faster at pre-processing than PI? I find PI slow when there are lots of large files (read: cmos) involved.
  22. Apples and pears, I'm afraid. Atmospheric conditions are also at play here, and may have caused a difference in apparent resolution. Plus, as Rodd said, the central obstruction (camera) of the RASA, decreases the effective aperture. Unfortunately, that makes the RASA not quite f/2. Pixel size difference has also to be taken into account. The effectiveness of an optical system to collect photons is proportional to the square of pixel size divided by the square of f-ratio, unless there is (severe) undersampling.
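That proportionality can be checked with two hypothetical systems (the constant of proportionality is dropped, since only the ratio matters; pixel sizes and f-ratios below are made up):

```python
def effectiveness(pixel_um, f_ratio):
    # Photon-collection effectiveness per pixel, proportional to
    # (pixel size / f-ratio)^2, ignoring the constant factor
    return (pixel_um / f_ratio) ** 2

# An f/2 system with small pixels vs an f/4 system with pixels twice as large
fast_small = effectiveness(3.76, 2.0)
slow_big = effectiveness(7.52, 4.0)
# Doubling the pixel size exactly offsets doubling the f-ratio
```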
  23. I downloaded your data, and it seems to me that the image is already stretched. Unfortunately that makes it a lot harder to do anything more with it. Can you post the unstretched image from your stacking program? I.e., do not apply the stretching that DSS suggests.
  24. I found his tutorials too casual/cursory. As in his book, he describes what each tool can do, but unless you're an absolute beginner, there isn't very much to be learned. The book is a good reference, though, especially for a beginner. YMMV, as they say. Since he increased the subscription rate, I no longer put my money there. Too little ROI (return on investment). On the other hand, I may put my money on Adam Block's site. Expensive? Yes. But if you follow his YouTube channel, you know there's quality there. Just my €0.02.
  25. Mostly internet resources:
       • YouTube videos by Richard Bloch
       • Light Vortex Astronomy tutorials
       • The pixinsight.com web site
       • PixInsight resources: http://www.pixinsight.com.ar/en/
       • Warren Keller's book is an ok reference, but isn't really my style. It's definitely not my PixInsight "bible"
       • RBA's web site Deep Sky Colors has a section with interesting tutorials (RBA's book Mastering PixInsight)