Filroden

Members
  • Content Count: 1,360
  • Joined
  • Last visited
  • Days Won: 2

Filroden last won the day on January 24 2017

Filroden had the most liked content!

Community Reputation: 1,404 Excellent

2 Followers

About Filroden
  • Rank: Sub Dwarf

Profile Information
  • Gender: Male
  • Location: East Cleveland
  1. Good first effort. PixInsight can take a while to get to grips with, even for the basics, but once you become familiar with it, it gets a lot easier to use. I'd recommend hitting your background with some noise reduction, particularly colour noise. TGVDenoise would be ideal on both RGB/K and chrominance. Extract a luminance copy of the image and apply it as an inverted mask to protect the brighter areas so noise reduction only works on the background (see the masking sketch after these posts). It should help reduce the speckle.
  2. Given your caveat, Lightroom should offer you most of what you need. However, it is not very good at taking linear astro-data and stretching it: it's really designed for normal photography and does not give fine control over applying non-linear stretches to an image's "exposure" (see the stretch sketch after these posts). Personally, I would only use Lightroom once that initial stretch has been done. If you want more control over the various functions (contrast, tones, etc.), that would probably mean using a more dedicated photography package like Photoshop (not cheap), Affinity Photo (much cheaper) or GIMP (free), which give finer controls and can all do non-linear stretching using iterations of either levels or curves. You shouldn't assume image calibration and stacking has to be a time-consuming process, either. It can be if you want to do a really good job, and PixInsight as well as many other packages can do that very well, but you can do simple and quick calibrations and integrations in Deep Sky Stacker and then take the resulting TIFF into Lightroom for finishing touches. You can run a trial version of PixInsight for free for a limited period, but it sounds like it wouldn't be right for you at this stage in your astro-imaging journey.
  3. If you use an FL of 468 and a pixel size of 4.54, it will solve (the image scale arithmetic is sketched after these posts). And guess what it does? Makes the image more blue!! LOL
  4. I fear we're mixing colour terminologies between RGB and HSL (or HSV). Saturation that does not change the colour balance (hue) only makes sense when considering HSL(V). To change saturation in RGB without affecting colour balance requires changing the ratios of R, G and B and the luminance to which they apply. Of course, you can convert between the two at will, so you can apply "saturation" tools to RGB values, but they will not appear to work the same:
     H53 S75 L50 = R128 G116 B032 with a luminance of 49%
     H53 S50 L50 = R128 G120 B064 with a luminance of 50%
     H53 S25 L50 = R128 G124 B096 with a luminance of 52%
     (rounding error as PS only shows integer values)
     So both scales describe the exact same colour balance, just in different terms (see the conversion sketch after these posts). I think this derives from RGB typically being used for print and HSL(V) for screen.
  5. Glad you got this resolved. Isn't SGL a wonderful place?
  6. GIMP should have all the same core processing abilities as Photoshop, though they may use different terminology. Where Photoshop will possibly shine is in the available (paid-for) extensions, which can make astro-processing easier. Before investing, you may want to trial more dedicated astro-processing software such as Startools, PixInsight, Astroart, etc. These have more astro-oriented processes such as gradient removal and colour correction built in. Each has its strengths and weaknesses, and which package you eventually settle on will largely be a subjective choice based on the workflow you prefer. Some of them also offer better calibration and integration tools than DSS. There are plenty of web and YouTube tutorials for each of them and I think they all offer trials. Edit: I should add, these packages have different ways to do colour calibration. E.g. PixInsight can plate solve your image, match the stars to their spectral types using an online database and then apply colour corrections to get your image as close to their "true" colour as possible.
  7. I doubt you can avoid it. If you want coloured stars, you may be better off developing two versions of the same image: one with a very light stretch just to get what colour you can in the stars (the galaxy will probably be lost in the background) and one to process the galaxy, then join the two images so the stars from one are combined with the galaxy from the other (a blending sketch appears after these posts).
  8. I note you're only using DSS and Lightroom to process your images, which could explain the blue colour cast. Whilst M101 does have blue arms, the giveaway that the colours in the image are not balanced is that the background, which should be a neutral grey, is much bluer and slightly greener, suggesting red needs a boost and blue needs dialling back. Also, the centre of the galaxy should be a yellow hue. Your stars are fairly saturated, so unfortunately they aren't going to help you balance the colours. Colour balance needs to be done early in processing: you'd need to separate the RGB channels and work them individually before they are stretched, and each will require its own stretch (see the balancing sketch after these posts). That way you can boost the red channel, which looks to be the weakest, and tone back the blue. This is probably beyond DSS and Lightroom, so you may need to use a third package to do the bulk of the heavy lifting and use Lightroom for the finishing touches (vibrance, sharpness, clarity, etc.). Did you image under heavy light pollution or under moonlight? Either may have introduced the colour cast, so you may also want to remove any gradients in the image before stretching.
  9. So any planet, whether inferior or superior, can be seen at midnight at very high latitudes during winter, because it's also possible to see the Sun at midnight at those latitudes! I think the BBC have oversimplified something to the point that it's now actually more confusing. As Louis just posted:
  10. Hi Carl, I have a Nexstar Evolution 9.25 OTA I'm not using. However, I wouldn't want to post it as I no longer have all the original packing materials. If you don't find anything before restrictions are eased, I travel down to Stroud and Bristol quite often, so could deliver it.
  11. I've only started to skim my digital copy but I can already see how useful the additional reference guide is going to be. I was expecting a fairly short second book but it's 263 pages on its own! Nicely written too. More conversational than lecturing.
  12. I'm liking the less bright full image more than the brighter cropped image. It's not so much the brightness but the high contrast across the cropped area that makes it difficult for me to see on a screen. I did have a quick play with the jpg to test this and I definitely found a lower contrast (and less bright) image easier to look at. That said, the contrast/brightness does clearly show the quality of the data and brings detail into focus that you just would not find in a lesser image!
  13. Just to add to the answer regarding darks for your flat frames: you will only need to redo these if you use a different exposure length for your new flats. If your new flats are the same exposure as the original flats, then the existing darks should calibrate them (a small matching sketch appears after these posts). In practice, though, I find my flats always have different exposures. I expose to a fixed target ADU value and use a light source that is variable - sunlight - so they vary quite a lot between sessions. However, because my light source is so bright, my flats only take a second or two to expose, so it's very quick to run off 30 new darks per filter (each filter also changes my exposure time).
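
For post 1's suggestion of noise reduction through an inverted luminance mask, here is a minimal Python sketch of the idea. It is not PixInsight's TGVDenoise: a Gaussian blur stands in for the denoiser and the Rec. 709 luminance weights are an assumption, but it shows how the inverted mask confines smoothing to the faint background.

# Minimal sketch of masked noise reduction: an inverted luminance mask
# protects bright regions so smoothing only acts on the faint background.
import numpy as np
from scipy.ndimage import gaussian_filter

def masked_denoise(rgb, sigma=2.0):
    """rgb: float array of shape (H, W, 3) with values in [0, 1]."""
    # Simple luminance estimate (Rec. 709 weights).
    lum = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    mask = 1.0 - lum  # inverted: ~1 on the background, ~0 on stars and bright cores
    # Gaussian blur stands in here for a real denoiser such as TGVDenoise.
    smoothed = np.stack([gaussian_filter(rgb[..., c], sigma) for c in range(3)], axis=-1)
    # Blend: the background takes the smoothed pixels, bright areas keep the original.
    return mask[..., None] * smoothed + (1.0 - mask[..., None]) * rgb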
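
For post 2's point about applying a non-linear stretch to the linear data before finishing in Lightroom, here is one common choice of stretch (an arcsinh curve) sketched in Python. The stretch factor and black point are arbitrary illustration values, not a recommendation, and levels or curves iterations would do the same job.

# Minimal sketch of a non-linear stretch applied to linear (unstretched) data.
import numpy as np

def asinh_stretch(data, black_point=0.0, factor=500.0):
    """data: linear image normalised to [0, 1]; a larger factor gives a stronger stretch."""
    d = np.clip(data - black_point, 0.0, 1.0)
    return np.arcsinh(factor * d) / np.arcsinh(factor)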
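
The plate-solve figures in post 3 imply an image scale of roughly 2 arcseconds per pixel. The standard formula is worked below; the 468 and 4.54 values come from the post, while reading them as millimetres and microns is my assumption.

# Image scale from the settings quoted in post 3:
#   scale ["/pixel] = 206.265 * pixel_size [um] / focal_length [mm]
pixel_size_um = 4.54
focal_length_mm = 468.0
scale = 206.265 * pixel_size_um / focal_length_mm
print(f"{scale:.2f} arcsec/pixel")  # ~2.00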
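
The triplets quoted in post 4 reproduce exactly if the figures are read as Photoshop-style HSB (i.e. HSV) values, which is an assumption on my part; the quoted luminance percentages are not computed here. Python's standard colorsys module shows the conversion and how lowering saturation at a fixed hue only shifts the RGB ratios.

import colorsys

for s in (75, 50, 25):
    r, g, b = colorsys.hsv_to_rgb(53 / 360, s / 100, 50 / 100)
    print(f"H53 S{s} V50 -> R{round(r * 255)} G{round(g * 255)} B{round(b * 255)}")

# Prints:
# H53 S75 V50 -> R128 G116 B32
# H53 S50 V50 -> R128 G120 B64
# H53 S25 V50 -> R128 G124 B96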
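
For post 7's two-version approach (a light stretch to keep star colour, a heavy stretch to reveal the galaxy, then a join), here is a rough Python sketch. The arcsinh stretch, the blend-by-star-mask approach and the stretch factors are all illustrative assumptions, and generating the star mask itself is not shown.

# Sketch of the two-stretch idea: gentle stretch for star colour, hard stretch
# for the faint galaxy, blended with a star mask.
import numpy as np

def stretch(data, factor):
    return np.arcsinh(factor * data) / np.arcsinh(factor)

def combine(linear, star_mask, soft=50.0, hard=1000.0):
    """linear: float image in [0, 1]; star_mask: ~1 on and around stars, ~0 elsewhere."""
    stars = stretch(linear, soft)    # light stretch, preserves star colour
    galaxy = stretch(linear, hard)   # heavy stretch, brings up the faint galaxy
    m = star_mask[..., None] if linear.ndim == 3 else star_mask
    return m * stars + (1.0 - m) * galaxy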
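
For post 8's advice to balance the channels before stretching, here is a crude Python sketch that scales the red and blue channels so the sky background matches green. Using the per-channel median as the background estimate is my simplification; dedicated tools model the background (and any gradients) far more carefully.

# Sketch of per-channel balancing: estimate the sky level in each channel and
# scale so the background ends up neutral before the main stretch.
import numpy as np

def neutralise_background(rgb):
    """rgb: linear float array (H, W, 3). Scales R and B to match G's background."""
    bg = np.array([np.median(rgb[..., c]) for c in range(3)])  # crude sky estimate
    gains = bg[1] / bg  # boosts the weak channel, pulls back the strong one
    return rgb * gains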
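
Post 13's rule for reusing darks with new flats boils down to an exposure-match check, sketched below; the tolerance value and the helper name are made up for illustration.

# Existing dark-flats only calibrate new flats if the exposure time matches
# (temperature and gain should match too).
def needs_new_dark_flats(existing_dark_exposures, new_flat_exposure, tolerance=0.05):
    """Return True if no existing dark-flat set matches the new flat exposure (seconds)."""
    return not any(abs(t - new_flat_exposure) <= tolerance for t in existing_dark_exposures)

# e.g. dark-flats at 1.2 s and 2.5 s, new flats at 1.8 s -> shoot new dark-flats
print(needs_new_dark_flats([1.2, 2.5], 1.8))  # True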