Everything posted by tomato

  1. Very informative write up, thanks. Still waiting for the cloud dispersion device that really works.😏
  2. I am a total novice as regards smartphone AP, but I found the Celestron NexYZ easy to use; I was snapping the moon about 10 minutes after taking it out of the box.
  3. This was my latest attempt at my M81 data, a 100% qualitative approach. Now where is the brown slider for the mid tones…
  4. Does APP’s Colour Star Calibration Module work on a similar principle? The trouble is I get varying results when I use it. Sometimes it adjusts the colour of the DSO to something akin to what I regard as correct (based on existing library images) but other times it can be way off even though the stars appear to have been brought into line. But once again this is me comparing the result to what I think is correct based on the body of images already out there, rather than anything scientific.
  5. But can you compare colours in daytime photography, with light in abundance, to astrophotography? When ambient light is scarce, colour is less well defined, as far as our eyes and brain interpret it. In Vlaiv’s example we know what the correct colours are because we already have a first-hand image in our memory of the object, or at least something similar to compare it to. I love the deep colourful renditions of the Milky Way but I’ve never seen it like that with my own eyes, even from a very dark site. If we just went with the scientific raw, unprocessed data, image-wise we would see hardly anything. If the colour information collected by the camera is a correct reproduction of the light entering the camera (and I agree it must be), then why do we use tools to remove the green? If we use the argument that the green is there in error and not part of the image we are trying to reproduce, then surely we have crossed the “faithful reproduction” line and all bets are off in terms of what you can do with the processing? That’s the button I was referring to in my previous post. When one appears in the latest releases of the software, I’ll be first in line.
  6. Well, I’d venture there are no external or similar self-imposed constraints on varying the colour, and as the adjustments that can be made in the software are many and varied, this is the result. I found this IKI dataset particularly challenging, trying to balance the IFN with the galaxies, which might have something to do with it. I produced several versions before arriving at the one I posted. I do like processing luminance data to tease out faint and fine detail, and achieve optimum brightness, contrast etc., but colour processing leaves me cold. This might sound like heresy, but if someone could produce a standard workflow that gave an ‘authentic’ colour result every time I’d be happy to just press that button. Of course that is never going to happen because all of our raw data is unique. I find the ‘auto’ settings for colour adjustments usually produce a result far away from what is regarded as acceptable, particularly with LRGB data, NB being more subjective anyway. It’s not just different individuals’ takes on the same data. Here are a couple of M31 images which I assure you use the same OSC data. The second one was produced about 12 months after the first; I thought the extra time would have honed my processing skills a bit, but I actually still prefer the first one. That’s why I prefer image capture to processing, much more straightforward.😉
  7. There are a few alternatives to the traditional flanged steel tube with stabilising fins, such as plastic pipe filled with concrete, or even a brick built pier. My DIY skills in those areas aren’t up to it and one advantage of the steel pier is it can be unbolted and moved, which I had to do when my cable ducts below it flooded. Fortunately when I purchased my second hand dome it came with a substantial Altair steel pier. It may be a simple device compared to other Astro kit, but the pier does need to be right, fit and forget is what you want.
  8. Sounds good to me. For the QHY268c (same sensor as your camera) I use APP to calibrate the lights and enter a master dark, dark flat and flat.
  9. I agree the consensus is to use darks, flats and bias calibration frames with CCD and DSLR cameras, and flats, flat darks and darks for CMOS cameras. The reasons behind this are discussed in this thread:
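The calibration recipe described above for CMOS cameras (darks, flats and flat darks, no bias) boils down to simple frame arithmetic: subtract the master dark from each light, then divide by a flat that has itself been dark-subtracted and normalised. A minimal sketch of that arithmetic, using tiny hypothetical 2×2 frames purely for illustration (real subs are full sensor arrays, and tools like APP handle the stacking of masters for you):

```python
import numpy as np

def calibrate_light(light, master_dark, master_flat, master_flat_dark):
    """CMOS-style calibration: subtract dark, divide by normalised flat."""
    flat = master_flat - master_flat_dark   # remove dark signal from the flat
    flat_norm = flat / flat.mean()          # normalise so division preserves overall flux
    return (light - master_dark) / flat_norm

# Hypothetical toy frames: a light with vignetting, uniform dark signal of 10 ADU.
light     = np.array([[110.0, 105.0], [100.0, 95.0]])
dark      = np.full((2, 2), 10.0)
flat      = np.array([[210.0, 200.0], [190.0, 180.0]])  # same vignetting pattern
flat_dark = np.full((2, 2), 10.0)

cal = calibrate_light(light, dark, flat, flat_dark)
print(cal)  # vignetting removed: every pixel comes out flat at 92.5
```

Because the toy light and flat share the same vignetting profile, the calibrated frame comes out uniform, which is exactly what the flat-field division is meant to achieve.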
  10. I can’t comment directly on APT, but certainly for SGP and NINA, if PHD2 and your mount are connected to the imaging control software and you request it to dither between subs, then it happens. There are dithering parameters that can be adjusted in PHD2, but I’m not sure if these are overridden by the imaging software, assuming they also exist in that software.
  11. I have just realised that Scooby Doo is the same nebula @steppenwolf was referring to. It’s a small sky…
  12. Great capture, I guess I should know this but are the colour transitions in the trail a real effect, related to the layers in the atmosphere?
  13. Just a quick heads up that a member of the Wolverhampton Astronomical Society is scheduled to be on Central news tonight (6 pm) talking about Astrophotography, I'm looking forward to it.
  14. The Scooby Doo Nebula (part of the Heart Nebula). Of course I take this hobby seriously.😄
  15. I think the design and manufacturing costs probably rule out a friction drive from the likes of SW and iOptron, but you never know. There are a couple of similar friction mounts out there, but nothing like in the numbers of Mesu units that have been supplied.
  16. Cardboard off, first speed; cardboard on, second speed. In fact, with the slippage that is likely to occur when turning it, the speed is infinitely variable.😉
  17. One observation, when using an OAG on a refractor fitted with a FR I get sausage shaped stars on the guide cam, but PHD guides on them, no problem.
  18. I think this will be fine, my pier has 8 machined spacers under it to get the flange flush with the false floor in the observatory and it ain’t moving anywhere when bolted down, even with ~100 kg all up load riding on it.
  19. Lovely images, a good thing about digital data, it doesn’t go off over time.👍
  20. This was also processed with StarTools 1.8 Alpha and AP 1.10. Started with the ST Compose module using L + Synthetic L from RGB, RGB, Ha into NB accent. Then the default workflow, and then a lot of adjustment in AP with HSL and colour tools to try and achieve "conventional" galaxy colouring and lose the blue tone on the IFN. I am still not a fan of IFN; to me it detracts from the galaxies and makes the processing, at least for me, too darn tricky. Still, it's part of the skyscape, so I should stop moaning and learn how to process it. Fantastic data, as always.
  21. I use a 120 mini with a ZWO OAG v2 on an Esprit 150 FL 1050 mm. I have to date never failed to have a guide star in the FOV, but there have been occasions where only 2 or 3 are visible. The USB connection on the camera is a bit fragile, best to loop the cable back round and velcro to the camera body.
  22. Total respect for the dedication to find a way through. When I took the plunge and purchased a PI licence and then hardly used it, I thought what a waste of money, but then @Laurin Dave introduced me to the photometric mosaic script and it was worth buying just for that.
  23. Cracking mosaic. Did you use any specialist processing software to stitch the panels together? I couldn’t get anything as smooth as this without using the likes of APP or the PI scripts.