tomato

Everything posted by tomato

  1. Have you had a try with the Photometric Mosaic script in PixInsight? It worked wonders on the data I collected for a 12-panel LRGB mosaic of M31 last year. You will need a wide-field image of the region to act as a reference image. I am no expert with the script, but with help from @Laurin Dave, the basic workflow I followed was: “the first step will be to create the individual panels as stand alone items... next crop these so that there are no artefacts at the edges.. this is important as any artefacts will mess up the gradient removal.. next step is to plate solve each panel using Pixinsight's Script... Image Analysis.. Image Solver.... once this is done you need to co-register them all, this is done using Pixinsight Script... Utilities.... Mosaic by Co-ordinates..... now would be the time to a) go for a long walk or do overnight... Once you have all the registered panels, including the wide field reference frames, the fun starts with the Mosaic script...” The script help file is very useful: https://drive.google.com/file/d/1sxPP-L2WMMQEESTsZVFEtQr8xr1ugti4/view I note your data is NB, but this shouldn’t be a problem.
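The gradient-matching idea the script automates can be shown in miniature: before blending two co-registered panels, measure the background offset in their overlap and remove it so the seam disappears. This is just a hand-rolled numpy sketch of the principle, not the PixInsight script itself, and the panel values are invented:

```python
import numpy as np

# Two co-registered panels as float arrays; NaN marks pixels outside each panel.
h, w = 100, 160
panel_a = np.full((h, w), np.nan)
panel_b = np.full((h, w), np.nan)
panel_a[:, :100] = 1000.0          # panel A covers the left side
panel_b[:, 60:] = 1050.0           # panel B covers the right, 50 ADU brighter

# Measure the median background difference in the overlap region.
overlap = ~np.isnan(panel_a) & ~np.isnan(panel_b)
offset = np.nanmedian(panel_b[overlap] - panel_a[overlap])
panel_b -= offset                  # match panel B's background to panel A

# Blend: average where both panels exist, otherwise take whichever is present.
mosaic = np.where(overlap, (panel_a + panel_b) / 2,
                  np.where(np.isnan(panel_a), panel_b, panel_a))
print(round(float(offset), 1))     # 50.0: the seam offset we injected
```

With the offset removed, the blended mosaic is flat across the seam; on real data the offset would be a gradient model rather than a single number, which is exactly why artefact-free panel edges matter so much.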
  2. I have a QHY268c OSC camera which is going on the scope soon for Sept/Oct/Nov, so I’ll try the calibration on this one first. I just need to sort out a lens so it can image the monitor screen.
  3. I can provide a calibrated monitor for the calibration device, it’s manipulating the data that I would need help with.
  4. Ok, thanks. I would really like to follow the recommendations, so can anybody tell me if this can be done using existing processing software (e.g. PI, StarTools, APP or Affinity Photo), without using spreadsheets to calculate the values of each pixel? Sorry if this sounds like the lazy option, but I think it has more chance of being widely adopted if it is relatively straightforward and easy to carry out. Perhaps someone smarter than me could write a PI script, assuming one doesn’t already exist?
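For what it's worth, the "spreadsheet" maths is exactly what array tools do in a couple of lines: a colour transformation is a 3×3 matrix applied to every pixel at once. A minimal numpy sketch, where the matrix values are made up for illustration (not a real camera-to-sRGB matrix for any sensor):

```python
import numpy as np

# A small RGB image (values 0..1) and an illustrative 3x3 colour
# transformation matrix -- the numbers are invented, not measured.
rgb = np.random.default_rng(0).random((4, 4, 3))
M = np.array([[ 1.6, -0.4, -0.2],
              [-0.3,  1.5, -0.2],
              [ 0.0, -0.5,  1.5]])

# Apply the matrix to every pixel at once: out = M @ [r, g, b] per pixel.
out = np.einsum('ij,hwj->hwi', M, rgb)
out = np.clip(out, 0.0, 1.0)       # keep the result in displayable range
```

Note each row of M sums to 1.0, so a neutral grey pixel stays grey after the transform; PI's PixelMath can express the same three dot products directly on the R, G and B channels.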
  5. So we can have both. And on the perceptual element of the processing, taking @jager945’s and @wimvb's point, do we still need a white reference? So going forward, would it be possible to see an initial scientifically colour-calibrated image (just to demonstrate that this has been done correctly) and then a final perceptually altered image? Would I be correct in saying every imager's scientific version would look the same when viewed on the same monitor, or, as I suspect, would all other elements of the images need to be standardised for this to be true?
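The white-reference step itself is simple in principle: choose a region assumed to be spectrally neutral and scale the channels so it comes out grey. A rough numpy sketch of the idea (not PI's actual BackgroundNeutralization/ColorCalibration implementation, and the colour cast here is synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((50, 50, 3)) * [1.2, 1.0, 0.8]   # image with a colour cast

# Reference region assumed to be spectrally neutral (e.g. an average of
# many star colours, or a G2V star) -- here just a patch of the frame.
ref = img[10:20, 10:20]
scale = ref.mean() / ref.mean(axis=(0, 1))        # per-channel correction

balanced = img * scale
patch = balanced[10:20, 10:20].mean(axis=(0, 1))
# After scaling, the reference patch averages to neutral: r == g == b
```

Two imagers who agree on the white reference would get the same channel weights from the same data, which is roughly what a standardised "scientific version" would require.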
  6. Getting back to the original post, can we assume the judges of the IKI M81/M82 competition will be assessing the entries on the basis of what makes a pretty picture rather than a scientific image, or is it possible to do both? If it is the latter, then there must still be an element of subjective judgment to achieve a balance between the two, and if the assessment is 100% quantitative then why do we need the judges?
  7. SharpCap is great value for money IMHO; the Polar Alignment tool is brilliant and you will make good use of it if setting up each time. The developer has done a lot of work on optimising the best settings for use with CMOS cameras, and new features are always being added. Other software you may not be familiar with is Astro Pixel Processor, which I primarily use for calibration and stacking, but it will also do post-processing. Also check out StarTools for processing, an innovative package with a very well thought-out workflow, great value for money. And I should also mention NINA, a recent (and free) integrated image capture package and my current choice of software for this operation. An awful lot of choice these days, enjoy the research!
  8. Hi Neil, I wouldn’t recommend shortening the CW shaft unless you are 100% certain you will never increase the weight that’s currently riding on the mount. A shorter shaft could also reduce the resale value, as a shorter bar would limit what could be put on the mount; better to make a new shorter shaft and keep the original. I actually have a longer-than-standard CW shaft on mine, as I have a big load of scopes on the Mesu. It just clears the inside of the dome, but “just clears” works for me, as I’m not in the dome when it’s running.
  9. I've been following the @vlaiv initiated discussion on the many and varied colour renditions of the IKI Obs M81/M82 data with great interest and have made a few low-level contributions to it. I confess to not fully understanding the discussions on colour transformation and reproduction, but it has motivated me, along with the ongoing UK permacloud, to try and get closer to what would be regarded as the 'right' colours on some LRGB data I acquired on M81 back in 2020. This is my latest effort, to be compared with what is deemed to be an accurate rendition of the galaxy. Clearly not there yet, but if nothing else it has put more hours on my processing pilot's log. Reference image:
  10. May I suggest you do as much as you can while it’s cloudy to ensure your kit all works as it should when the clear night comes. Seven weeks ago I was all ready to go with my first automated unattended imaging session, now I’ve forgotten how to switch it all on. I’ve come to the conclusion that in the UK clear skies are a more precious commodity than all the imaging kit combined.
  11. I could be mistaken but I think that Pillars image is narrowband, where there seems to be much more leeway when it comes to assigning a colour palette.
  12. I have the 150, and the dew shield can droop a little if the two locking screws are not tight. I have fitted a foot-long plastic tube extension to the existing shield (not dew-control related), so some droop is inevitable. However, as you say, it doesn't affect the performance of the scope at all. With the standard flattener fitted, the focuser is out at 6.3 cm with an imaging camera on there.
  13. Is it worth doing this on the IKI Obs data using an existing transformation matrix or would the IKI Obs have to create one with their camera? It would be really interesting to see the results.
  14. Have you considered buying second hand from SGL classifieds or Astro Buy Sell? The HEQ5, with and without belt mod, seems to come up quite regularly. I have imaged with a 102 mm APO on a belt-modded HEQ5; it guided at 0.6-0.7” RMS all night long. I found it quite manageable to carry and set up, and I even took it on holiday in the UK on one occasion.
  15. Very informative write up, thanks. Still waiting for the cloud dispersion device that really works.😏
  16. I am a total novice as regards smartphone AP, but found the Celestron NexYZ easy to use; I was snapping the moon about 10 minutes after taking it out of the box.
  17. This was my latest attempt at my M81 data, a 100% qualitative approach. Now where is the brown slider for the mid-tones…
  18. Does APP’s Colour Star Calibration Module work on a similar principle? The trouble is I get varying results when I use it. Sometimes it adjusts the colour of the DSO to something akin to what I regard as correct (based on existing library images) but other times it can be way off even though the stars appear to have been brought into line. But once again this is me comparing the result to what I think is correct based on the body of images already out there, rather than anything scientific.
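The principle behind star-based colour calibration can be sketched quite simply: measure a set of star fluxes per channel, compare with catalogue values, and solve for one scale factor per channel by least squares. The numbers below are invented for illustration, and this is not APP's actual implementation:

```python
import numpy as np

# Measured (instrumental) and catalogue fluxes for a handful of stars,
# one row per star, columns R, G, B. Values are invented for illustration.
measured = np.array([[120.0,  90.0,  60.0],
                     [300.0, 240.0, 150.0],
                     [ 80.0,  70.0,  50.0]])
catalogue = measured * [0.8, 1.0, 1.3]   # pretend "true" colours

# One least-squares scale per channel: minimise |measured*k - catalogue|^2
k = (measured * catalogue).sum(axis=0) / (measured ** 2).sum(axis=0)
# k recovers 0.8, 1.0, 1.3 exactly in this synthetic case
```

On real data the fit is only as good as the star measurements: saturation, mismatched catalogue magnitudes or too few stars can bring the star colours "into line" while still dragging the DSO off, which may be why the results vary.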
  19. But can you compare colours in daytime photography, with light in abundance, to astrophotography? When ambient light is scarce, colour is less well defined, as far as our eyes and brain interpret it. In Vlaiv’s example we know what the correct colours are because we already have a first-hand image in our memory of the object, or at least of something similar to compare it to. I love the deep colourful renditions of the Milky Way but I’ve never seen it like that with my own eyes, even from a very dark site. If we just went with the scientific raw, unprocessed data, image-wise we would see hardly anything. If the colour information collected by the camera is a correct reproduction of the light entering the camera (and I agree it must be), then why do we use tools to remove the green? If we use the argument that the green is there in error and not part of the image we are trying to reproduce, then surely we have crossed the “faithful reproduction” line and all bets are off in terms of what you can do with the processing? That’s the button I was referring to in my previous post. When one appears in the latest releases of the software, I’ll be first in line.
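For reference, the usual "remove the green" tool (SCNR with the average-neutral rule, as found in PixInsight and StarTools-style workflows) amounts to one line of per-pixel maths: cap green at the average of red and blue. A minimal numpy sketch of that rule:

```python
import numpy as np

def scnr_average_neutral(rgb):
    """Suppress green: g' = min(g, (r + b) / 2), per pixel.

    This is the 'average neutral' rule used by SCNR-style tools;
    red and blue are left untouched.
    """
    out = rgb.copy()
    cap = (rgb[..., 0] + rgb[..., 2]) / 2
    out[..., 1] = np.minimum(rgb[..., 1], cap)
    return out

pixel = np.array([[[0.3, 0.9, 0.3]]])     # a green-dominated pixel
result = scnr_average_neutral(pixel)       # green capped at 0.3
```

The rule is justified on the grounds that broadband deep-sky objects are rarely green, so a green excess is treated as noise or gradient rather than signal, which is exactly the "is this still faithful reproduction?" question raised above.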
  20. Well, I’d venture there are no external or similar self-imposed constraints on varying the colour, and as the adjustments that can be made in the software are many and varied, this is the result. This IKI dataset I found particularly challenging, trying to balance the IFN with the galaxies, which might have something to do with it. I produced several versions before arriving at the one I posted. I do like processing luminance data to tease out faint and fine detail and achieve optimum brightness, contrast etc, but colour processing leaves me cold. This might sound like heresy, but if someone could produce a standard workflow that gave an ‘authentic’ colour result every time, I’d be happy to just press that button. Of course that is never going to happen, because all of our raw data is unique. I find the ‘auto’ settings for colour adjustments usually produce a result far away from what is regarded as acceptable, particularly with LRGB data, NB being more subjective anyway. It’s not just different individuals’ takes on the same data. Here are a couple of M31 images which I assure you are using the same OSC data. The second one is from about 12 months after the first; I’d hoped the extra time would have honed my processing skills a bit, but I actually still prefer the first one. That’s why I prefer image capture to processing, much more straightforward.😉
  21. There are a few alternatives to the traditional flanged steel tube with stabilising fins, such as plastic pipe filled with concrete, or even a brick built pier. My DIY skills in those areas aren’t up to it and one advantage of the steel pier is it can be unbolted and moved, which I had to do when my cable ducts below it flooded. Fortunately when I purchased my second hand dome it came with a substantial Altair steel pier. It may be a simple device compared to other Astro kit, but the pier does need to be right, fit and forget is what you want.
  22. Sounds good to me. For the QHY268c (same sensor as your camera) I use APP to calibrate the lights and enter a master dark, dark flat and flat.