
Keep close to the authentic look of the object, or create a stunning, award-winning, over-processed photo?



Hello Guys,

I am an absolute beginner; I started astrophotography less than a month ago, and I am not sure whether this kind of question is typical only of newbies or common in general. Is astrophotography about showing objects clearly, as close as possible to how they actually look, or is it an art meant to attract an audience, where loose interpretation is allowed (within the limits of good taste, of course)? To be honest, before I started I was sure it was more about showing objects the way they look, but apparently I was wrong.

On the other hand, how can we be sure what an object really looks like (especially in terms of colours) if every scope/lens has a different colour sensitivity? I will give you an example. As far as I know, the Rosette Nebula is made of hydrogen, maybe with some trioxygen in the middle, so your first thought is that it should be reddish, and bluish inside, which most photos of this object confirm. But I see a lot of renditions where it is entirely blue, rusty brown or whatever. Same with the Orion Nebula: nowadays I see a lot of renditions where its outer layers of nebulosity are... WHITE. And I would say those photos collect the most recognition.

My apologies for the chaotic way of explaining my doubts, but I really wonder what your opinion is on natural photos, maybe with slightly bumped colours and contrast, versus stunning, overedited "art". The truth is probably somewhere in the middle, in a sweet spot, but my concern is that those overedited/unnatural photos are far more popular.

Edited by raf2020

The line between art and science is very thin. Colours are almost always an interpretation and, to be honest, I doubt they resemble the actual colours of the subject, even when the photographer genuinely tries to set a white balance as close as possible to what the object might look like in reality.

The bluish/golden renditions that you are seeing were probably captured in narrowband (typically H-alpha, OIII and SII) and then mapped to the Hubble palette: SII to red, H-alpha to green and OIII to blue.

Of course, there are tons of variations of colour mapping, and they will all produce different results. Sometimes people shoot with only two bands and map one band to one colour and the other to the remaining two colours.

These, including the Hubble palette, are not intended to render the true colours of the object, but rather to exaggerate the contrast between the different gases and wavelengths, in order to make them stand out more and be more visible.
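As a minimal sketch of what such a mapping amounts to (my own illustration, assuming three registered narrowband stacks as arrays scaled to 0..1):

```python
import numpy as np

def hubble_palette(sii, ha, oiii):
    """Classic SHO mapping: SII -> red, H-alpha -> green, OIII -> blue."""
    return np.clip(np.stack([sii, ha, oiii], axis=-1), 0.0, 1.0)

# Hypothetical frames; in practice these come out of your stacking software.
sii, ha, oiii = (np.random.rand(64, 64) for _ in range(3))
rgb = hubble_palette(sii, ha, oiii)

# A bicolour variant maps one band to one channel and the other band to the
# remaining two, e.g. Ha -> red, OIII -> green and blue:
bicolour = np.clip(np.stack([ha, oiii, oiii], axis=-1), 0.0, 1.0)
```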

Even the same data, given to two different astrophotographers, will inevitably produce two different results, because it will be post-processed according to the taste of each.

As for me, I try not to over-saturate or over-stretch the image, but to keep a more natural, soft and blended feel. But I am shooting with a colour, astromodified DSLR. That might change when I get into monochrome and narrowband territory.

Edited by endless-sky

Hi raf2020,

You have raised a good question, one that has crossed my mind on occasion. Every once in a while an image will be posted that looks both amazing and natural, and quite rightly it will receive a lot of good comments. On the other hand, we sometimes see images that, although great in their own way, are clearly over-processed, yet they still get a lot of good comments.

The point here, though, is not only what it is we are trying to achieve, but who is to say how an object should look? Or who is to say it's over-processed? Let's face it, without high-tech telescopes, filters and software we would not even be able to make out more than a whitish smudge of light for some of these distant galaxies, so asking how it should look is a moot point. When we see Hubble images we are told what the colours represent - hydrogen, oxygen, helium, etc. - they are processed that way to bring out the scientific detail. Amateur astronomers, however, are under no such obligation and can stretch or shrink whatever colours happen to bring out the most detail, or perhaps just look the most pleasing in the eye of the imager.

In the final analysis, beauty is in the eye of the beholder. To my mind, I am just impressed to see the amazing amount of detail that some members are able to bring out in their images, and I do think they are free to process their own images as they see fit. I really don't care too much about the colours; what I don't like to see is an image so over-sharpened or overworked that it creates weird effects.

I’m sure there will be a number of different points of view being raised.


40 minutes ago, raf2020 said:

Is astrophotography about showing objects clearly, as close as possible to how they actually look, or is it an art meant to attract an audience, where loose interpretation is allowed? [...] My concern is that those overedited/unnatural photos are far more popular.

Take a look at some of the articles at https://clarkvision.com/articles/do_you_need_a_modified_camera_for_astrophotography/


I think it's a fine - and difficult - balance, but yes, details and the amount of nebulosity / parts of the subject have a higher priority than colours (and their resemblance to reality). Going to extremes, many subjects that are rich in H-alpha would show their best with a monochromatic camera and a narrowband H-alpha filter. In that case you would have no colours, but the highest amount of detail your eyes can see (for that imaging-train combination). This is because your eyes have more "resolution" in greyscale than in colour.


1 hour ago, raf2020 said:

Is astrophotography about showing objects clearly, as close as possible to how they actually look, or is it an art meant to attract an audience, where loose interpretation is allowed?

It doesn't matter!
It is a hobby, no more. Neither is it a competition. No one I have met has ever claimed to be taking amateur images of the Orion Nebula to push the boundaries of science :)

If the end result pleases you, then it has achieved its goal. If it doesn't, then there is always room to learn more about processing and about what _raw_ material makes for good end products.
The only suggestion I would make is to preserve all your unprocessed data. That way, if you find you like one form or another better, or just want to experiment on the few nights when the skies aren't clear, you have everything you need to go back to step #1.

Edited by pete_l

@Moonshed true, this was one of my concerns. If we agree that the final photo should be close to the original look of the object, what should serve as the reference appearance? That probably doesn't exist.

@runway77 thank you, I will check this out.

@endless-sky I think you're right, and it's reasonable that details are more important, but is there really no way to keep the colours at least close to reality, instead of making a whole-blue Rosette or a white Orion Nebula? On the other hand, I have to admit those photos still look somehow amazing to me. I'm not sure whether it's because I like them or because I appreciate the tons of work put into post-processing them. My wife commented that it is the same story as with photoshopped girls on magazine covers: everybody knows it's kind of fake, but everybody still likes it. That's probably quite an apt comparison.

@pete_l Thank you for your input. It is a very good point that amateur astrophotography is a hobby, not a science. I always keep my raws in two locations: one in a safe place, untouched, even with the failed subs; the second a production copy that I actually work on.

Thank you all for taking the time to respond. Knowing more experienced users' points of view will let me calibrate my own, as there is nothing worse than going to one extreme or the other, imho.

 


Depends what you want to show, really.

It takes more skill and attention to detail to show an object as it really looks, but it is certainly possible. Most of the time, it is not what people want to see.

You can also capture true colour - as the eye would see it - most of the time. In fact, you can capture it all of the time; the issue is with our displays. They can't display all the colours the human eye can see. For this reason we have what is called the gamut of a display - the range of colours it is capable of reproducing.

No display can properly show spectral colours, for example. You can't faithfully show rainbow colours - they will always be less saturated, lacking the "punch" of real rainbow colours (which are pure spectral colours).
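That claim is easy to check numerically: converting a pure spectral colour (here roughly the 520 nm point of the spectral locus, chromaticity taken approximately from CIE tables) to linear sRGB produces a negative component, i.e. a colour no sRGB display can reproduce:

```python
import numpy as np

x, y = 0.0743, 0.8338                  # approx. CIE xy of ~520 nm spectral light
X, Y, Z = x / y, 1.0, (1 - x - y) / y  # xyY -> XYZ with Y = 1

# Standard XYZ -> linear sRGB matrix.
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])

print(M @ np.array([X, Y, Z]))  # R comes out strongly negative: out of gamut
```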

An important thing to understand is that the objects we image have a huge dynamic range, often beyond what our eyes support. For that reason we need to "compress" into the image a dynamic range our eye can easily see all at once. (Our eyes are in fact capable of an extraordinary dynamic range - but not at the same time. We function perfectly in broad daylight under direct sunlight, but also in the dark under moonlight - vastly different levels of light. But think what happens when someone turns on headlights at night: you instantly stop seeing properly in the dark.)

For that reason, images are often stretched in a non-linear fashion, to show faint detail together with bright detail. Take this example:

[image: M31 - the usual stretched presentation (left) vs. how it would appear to the eye under dark skies (right)]

This is the same image of M31. The right side is the image as it would appear to our eyes under dark skies; the left is how this image is usually presented to people - just because otherwise we would miss a bunch of the detail present in the image.
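A minimal sketch of such a non-linear stretch, using the arcsinh curve commonly applied for this purpose (assuming linear stacked data scaled to 0..1):

```python
import numpy as np

def asinh_stretch(img, a=0.01):
    """Lift faint signal while compressing highlights; smaller `a` = harder stretch."""
    return np.arcsinh(img / a) / np.arcsinh(1.0 / a)

linear = np.array([0.001, 0.01, 0.1, 1.0])
print(asinh_stretch(linear))  # faint values are boosted far more than bright ones
```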

As for colour - well, you can choose which colour you want to present, and all it takes is a bit of attention to detail. You can choose between three different "natural" colour looks of an object:

1. The colour of the object as viewed from Earth. This one is the easiest to achieve - but it is not the "proper" colour of the object.

The Earth's atmosphere changes the colour of an object, depending on the object's altitude. The Moon looks yellow, while it is actually grey. Similarly, the Sun appears yellow - but it is white. It appears orange or even red at sunrise/sunset, because the light passes through more atmosphere.

The atmosphere scatters blue light and removes it from the colour of the object - the lower the object is, the more blue light is removed.

2. The object as viewed from orbit around the Sun - one just needs to remove the influence of the atmosphere to get this type of colour. This is still not the proper colour of the object.

There could be interstellar dust between us and the object, and that can cause reddening of the image.

3. The object as viewed from orbit around the object itself, with nothing in between causing issues - this would be the proper colour of the object. It can be obtained by removing any interstellar reddening.

As for the first step - there is something called colour calibration, and it can easily be performed. Take a calibrated display (say, your mobile phone) and shoot an image of it with your telescope/camera while it displays a colour calibration chart. Derive a colour transform matrix from that.

That is what is done for DSLR cameras in the factory, and that is how you end up with the different colour balance presets. We don't need to change colour balance, as objects in outer space can't be illuminated by different light - we just need the regular colour transform matrix and you'll have "proper" colour.
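A rough sketch of that last step with hypothetical numbers (in practice `measured` would be the linear RGB of the chart patches read off your captured frame, and `reference` their known values):

```python
import numpy as np

measured = np.random.rand(24, 3)   # camera's linear RGB per chart patch
reference = np.random.rand(24, 3)  # known linear RGB of the same patches

# Least-squares fit of a 3x3 matrix M so that measured @ M ~= reference.
M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

corrected = measured @ M  # apply the same transform to the image's pixels
```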

In the end, I would say I advocate a "documentary" approach to imaging, and I don't really like overly processed images.

Edited by vlaiv

8 minutes ago, raf2020 said:

…is there really no way to keep the colours at least close to reality, instead of making a whole-blue Rosette or a white Orion Nebula? […]

Regarding reaching a balance between getting the colours more or less "correct" and perhaps losing detail, or getting the maximum detail without making M42 all more or less one colour, there is a solution. The more experienced imagers, when processing their images in Photoshop, will create layers: a separate layer for each colour and for bringing out different aspects of the nebula, plus one for the original, with some layers covering only a selected part of the nebula. They work on each layer until it looks the way they want, and when happy they blend all the separate layers into one image, achieving a realistic combination of all the colours, bringing out the Trapezium stars, and not blowing out the bright central part of the nebula.
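In code terms the idea boils down to a masked blend. A rough sketch with hypothetical layers - the real work is, of course, done interactively in Photoshop:

```python
import numpy as np

# Hypothetical: one version stretched hard for the faint outer nebulosity,
# one stretched gently to preserve the bright core and the Trapezium.
faint = np.random.rand(64, 64)
core = np.random.rand(64, 64)

mask = np.clip(core / core.max(), 0, 1)     # ~1 where the core is bright
blended = mask * core + (1 - mask) * faint  # core detail kept, faint stuff lifted
```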

I must add, though, that although I have attempted working in PS layers, I didn't get it right and haven't yet returned to it, so you may find my description of working in PS layers gets corrected by the experts.


1 hour ago, raf2020 said:

My wife commented that it is the same story as with photoshopped girls on magazine covers: everybody knows it's kind of fake, but everybody still likes it. That's probably quite an apt comparison.

I wouldn't say it's the same kind of comparison. When we do astrophotography and processing, we don't invent data; we don't take the healing brush and make parts of a nebula disappear because they don't match some standard of what a beautiful nebula should be.

We work with the data we have, and we try to bring the signal (the object) up as high as possible while rendering the noise as low as possible - without creating artefacts.

We are manipulating the data that's there. But smoothing out the "skin" of a nebula would be counterproductive, because we would actually get rid of detail instead of showing more of it.


@endless-sky a good point of view. Thanks!

@vlaiv a nice elaboration on the colours, thanks!

@Moonshed atm my PS skills are very basic, so each time I experiment more than follow some planned point-to-point workflow, and I usually still merge layers because it's handy for me when I try the next PP step. I know it isn't best practice ;)

Edited by raf2020

6 minutes ago, endless-sky said:

I wouldn't say it's the same kind of comparison. When we do astrophotography and processing, we don't invent data; we don't take the healing brush and make parts of a nebula disappear... […]

But of course we do :D

Do we use denoising and sharpening? Those are "morphological" transforms, in essence similar to a healing brush at the pixel level.

Do we use local contrast enhancement? Again, that is a "morphological" transform. It can sometimes even lead to order inversion: pixels that are higher in signal than other pixels end up darker than those other pixels. That is a partial "negative" of the image.
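That order inversion is easy to demonstrate numerically. A tiny made-up example with a strong unsharp mask (not any particular software's implementation):

```python
import numpy as np

# Four pixels: the 6 sits next to a bright region, the 5 next to a dark one.
img = np.array([10.0, 6.0, 5.0, 0.0])

# 3-tap mean blur (edges clamped), then a strong unsharp mask.
blur = np.convolve(np.pad(img, 1, mode="edge"), np.ones(3) / 3, mode="valid")
sharp = img + 1.5 * (img - blur)

print(img[1] > img[2])      # True:  pixel 1 is brighter in the data
print(sharp[1] > sharp[2])  # False: after sharpening the order has inverted
```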

Here is an often-seen example:

[image: a typical processed rendition of the Trapezium region]

but reality is more like this:

[image: a more realistic rendition of the same region]

How is that different from emphasized cheekbones? :D

 


I have no idea how to make an award-winning photo; the only thing I have ever won was here on SGL, guessing the answer to a Christmas cracker joke.

My vote is: process to your goals or taste. Process it for you. Enjoy the processing.

It is lovely to see multiple interpretations. These images can be processed in so many different ways, and putting your interpretation or taste into the image makes it special.

The universe without life would be a bit empty; it is nice to see life indirectly in the image!


It's interesting to have a look at these competition entries - everyone had exactly the same data, and nothing was permitted to be added or subtracted.

No two are the same, and the variety of results is amazing.

Yet they must all be 'right', as they are from the same data!

 


Firstly we must distinguish false-colour imaging from broadband imaging. In narrowband images, colours are ascribed to particular wavelengths, and so to particular gases - in the same way colours on a geology map might be ascribed to different rocks. Your blue and brown Rosettes are in this category. Nobody claims that the Rosette really is this colour. This imaging has its origins in science, but it has been taken up by amateurs, particularly those with no access to dark sites, because the narrowband filters used to isolate the gases also obstruct light pollution.

That leaves broadband or 'natural colour' imaging, and here I think you're in danger of presenting us with a false dichotomy. I don't think it is true that highly exaggerated images are admired. Most experienced audiences want to see images in which there is a sense of a little being left in the data. An image should never look as if it has been pushed beyond what the data had to give. A good imager can wring every last drop out of the data while making it look perfectly relaxed. Nor should any processing technique leave visible traces. This applies particularly to noise reduction and sharpening. If you're an imager yourself, you can spot the signature of processing excess a mile away. The holy grail is invisible processing. It is that which is most admired in the community.

My processing aims are:

- invisible processing.

- colour close to what our eyes would see if they could see faint colours in the dark. (My Rosettes are always red, and slightly bluer in the middle!) Star colour should agree with spectral class. (A good test.)

- as much information to be found as possible. (Long exposures and careful stretching to find the very faintest signal possible, especially signal so faint as to be entirely imperceptible to the eye in any telescope. I see that as the whole point.) So I will certainly exaggerate the contrasts between features if it helps distinguish structures in the target, but I won't invent any.

- colour saturation (as opposed to colour itself) is largely mood-dependent!

17 hours ago, vlaiv said:

 

[image: a typical processed rendition of the Trapezium region]

but reality is more like this:

[image: a more realistic rendition of the same region]

 

 

Vlad, if the Trapezium looks more like the lower image in your telescope you need to replace it!!! :D :D

Olly

Edited by ollypenrice
Typo

15 minutes ago, ollypenrice said:

Vlad, if the Trapezium looks more like the lower image in your telescope you need to replace it!!! :D 

:D

In my scope it looks more like this:

[image: the Trapezium region at the eyepiece of a small scope]

But you see the problem - my scope is far too small to be able to show the full extent of the nebulosity :D


5 minutes ago, vlaiv said:

:D

In my scope it looks more like this:

[image: the Trapezium region at the eyepiece of a small scope]

But you see the problem - my scope is far too small to be able to show the full extent of the nebulosity :D

In truth it is hard for the camera to match the eyepiece on the stars, and hard for the eyepiece to match the camera on the nebulosity. (But the camera will beat the eyepiece most of the time on stellar colour...)

Olly

Edited by ollypenrice
Deconvolution!!!

2 minutes ago, ollypenrice said:

In truth it is hard for the camera to match the eyepiece on the stars, and hard for the eyepiece to match the camera on the nebulosity. (But the camera will beat the eyepiece most of the time on stellar colour...)

Olly

It is actually not an issue with the camera, but rather with the display device.

Our displays are 8-bit, and often not even that - some are 6-bit displays.

This means that, at best, the difference between the brightest star and the least bright star that the display can show is about 1:256 - or, converted to magnitudes, ~6 mags of difference.
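A quick check of that arithmetic (the magnitude scale is 2.5 x log10 of the intensity ratio):

```python
import math

print(2.5 * math.log10(256))    # ~6.0 mag span for an 8-bit display
print(2.5 * math.log10(2**16))  # ~12 mag span for 16-bit data
```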

At the eyepiece you can easily see 10 or more magnitudes of difference (a 12-mag star next to a 2-mag star).

That is part of the problem.


Some very good replies to what is also a really good question from the OP, and one I am sure many people who take up amateur astro imaging ask themselves. I know I did when I started, not too long ago.
As a total beginner I looked at all the marvellous Hubble images thinking how beautiful they were; then, getting into astronomy and realising they didn't actually look like that, or have those colours, in real life (to humans anyway) left me a bit confused.

But, as @ollypenrice explains, when you consider that they are narrowband images, using very specific narrow bands of frequencies of the spectrum that we as humans would not pick out naturally, then in essence the actual colours we map to these frequencies don't really matter. We have to assign some colours to them in order for our eyes to see them as we do in these magnificent images.

With broadband, using LRGB filters or OSC cameras, it makes sense to assign the correct colours, in my opinion.

But it is all very subjective, and you have to do what you are happy with. Nobody will ridicule you whatever you do. There were a few competitions on SGL last year using provided data to produce an image of various DSOs, so everybody used the same data. But if you look at the images submitted, the colours vary so much and the amount of processing also varies greatly; some are quite subtle and some really stretch the data, though most lie very much in the middle.

Personally I think some of the most natural-looking images are just the Ha images in mono (though obviously not natural to our eyes, as they are NB Ha only), but having said that, I probably most enjoy looking at subtly processed bi-colour or Hubble-palette NB images.

Anyway, I am rambling now and probably talking rubbish, but those were the sort of conclusions I came to when I started imaging. As a relative newbie I tend to aim for images that are not over-processed but contain enough detail to make them interesting, and that goes for both narrowband and broadband.
In essence I have done very little with LRGB yet, as most of my data has been taken with NB filters, due to either an almost full moon when my skies are clear or just general light pollution, and I have stuck to bi-colour images if there is very little SII, or the Hubble palette if there is enough data in Ha, OIII and SII.

Steve

Edited by teoria_del_big_bang

I had a small conversation with a guy, which probably proves there is such a thing as the "natural look of objects". Let me try to explain, though I am not an expert on this matter and I have no guarantee that our conclusions were right. In general, the human eye can perceive light of wavelengths up to 700-750nm. H-alpha is at 656nm, SII at 672nm and OIII at 495-500nm. The issue is that both cameras and human eyes overlap some of those signals, so for example OIII is registered by both the blue and the green pixels. That's why OSC cameras with dual-band filters show it as blue-green-ish. To make things even more complicated, we should also consider what @vlaiv has mentioned - the issues with colours on displays.

So, as per the above, the "natural look" of, for example, the Rosette Nebula - if we consider it as made mostly of hydrogen with some trioxygen inside - will definitely be reddish, and bluish inside, so other concepts are just freestyle interpretations/art.

But still, on the other hand... we can't be sure what is in the way between us and the object - what kind of gases, dust, etc. are affecting the colours. That's probably why I will give up thinking about this topic: weighing the natural look of objects from Earth against the natural look an observer would see close to the object sounds pointless, so finding an unequivocal answer is impossible.

 


16 minutes ago, raf2020 said:

We can't be sure what is in the way between us and the object.

But you can :D

Say you image the Rosette Nebula. There are stars embedded in that nebulosity, right?

Stars have spectra, and from spectral analysis we can figure out the spectral class of a particular star. We know the temperature for each spectral class, and from its temperature we know what colour a star is: Planck's law.

All stars have one of these colours:

[image: the sRGB triangle on the CIE xy chromaticity diagram, with the Planckian locus and corresponding temperatures marked]

Here we see the sRGB triangle on the xy chromaticity diagram, with the Planckian locus marked with the corresponding temperatures.
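As a crude illustration of that temperature-to-colour relationship, one can sample Planck's law at three nominal R/G/B wavelengths and normalise (proper colorimetry would integrate the spectrum against the CIE colour-matching functions; the wavelengths here are my own choice):

```python
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(wl, T):
    """Spectral radiance B(wl, T) from Planck's law."""
    return (2 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k * T))

wl = np.array([620e-9, 540e-9, 460e-9])  # nominal R, G, B wavelengths
for T in (3000, 5800, 10000):
    rgb = planck(wl, T)
    print(T, rgb / rgb.max())  # cool stars lean red, hot stars lean blue
```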

We can take a certain star in the image that we know is at the same distance as our object and compare the expected and recorded colour of that star - from that we can infer any interstellar reddening and devise an inverse transform.

For galaxies, we can do a statistical analysis, because we have an idea of what sort of colours we can get and we know the relationship between temperature and luminosity, so again we can calibrate on the whole galaxy - fit the data to what is expected and derive the transform by statistical means (it will not be 100% correct - but think of it as fitting a curve to a set of data: it won't hit each point exactly, but it will be a good approximation).
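A minimal sketch of the reddening correction with made-up numbers: compare the colour a reference star should have (from its spectral class) with the colour actually recorded, and use the ratio as a per-channel inverse transform:

```python
import numpy as np

expected = np.array([0.90, 1.00, 0.95])  # linear RGB predicted from spectral class
recorded = np.array([0.99, 1.00, 0.76])  # what the image shows: blue suppressed

scale = expected / recorded  # per-channel correction (fit over many stars in practice)
corrected_pixel = np.array([0.40, 0.30, 0.20]) * scale
```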

 


Haha @vlaiv, it sounds reasonable, but isn't it going too far beyond what we call "amateur astrophotography"? :D Even for me, a person who has considered it more as a way of showing objects as close as possible to their natural look, it is probably too much :D Maybe one day imaging technology will be so advanced that we won't need to lean on the science so hard and kill the joy :D But still, you are probably right.


1 hour ago, raf2020 said:

…isn't it going too far beyond what we call "amateur astrophotography"? :D […]

99.9% of all astrophotos are made by stacking.

If you were to read how it is done in terms of the mathematics involved, you would certainly think it is overkill to know all that - and you would be right. As far as making a photo goes, you don't need to know how to do any of that - as long as there is software that will do it for you.
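For the curious, the core of the averaging step can be sketched in a few lines (assuming the frames are already registered; real stacking software does a great deal more):

```python
import numpy as np

def sigma_clipped_mean(frames, kappa=3.0):
    """Average registered frames, rejecting per-pixel outliers
    (satellite trails, cosmic ray hits) beyond kappa sigma."""
    stack = np.stack(frames)                        # (n_frames, H, W)
    mu, sigma = stack.mean(axis=0), stack.std(axis=0)
    keep = np.abs(stack - mu) <= kappa * sigma + 1e-12
    return (stack * keep).sum(axis=0) / keep.sum(axis=0)

subs = [np.random.rand(32, 32) for _ in range(20)]  # hypothetical registered subs
master = sigma_clipped_mean(subs)
```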

It is exactly the same with the method I've written about above - it can literally be reduced to a single button with the caption "calibrate color".

There is plate solving, which will help software identify each star. There are catalogs (like Gaia DR2) that contain temperature information for a great many stars. Software can recognize stars - that is how the alignment part of stacking works.

Once the approach is implemented in software, it is just the push of a button, really - and you would not be thinking about it beyond the level of "select color correction type: 1. earth bound, 2. solar system bound, 3. orbit of object" :D

 


1 hour ago, vlaiv said:

It is exactly the same with the method I've written about above - it can literally be reduced to a single button with the caption "calibrate color". […]

Hello Vlaiv,

what you're explaining here - isn't that what Siril does with its "Photometric calibration" functionality?

AstroRookie

 

