Posts posted by ollypenrice
-
2 hours ago, AKB said:
The diagonal measurement seems to be 15.97mm (according to FLO) so about 0.6 of an inch.
Your point still stands that bigger is better, of course, but one inch is nearly as big as an ASI2600?
Tony
The diagonal of the 2600 is 28.3mm which, for me, seems an awful lot bigger than 15.97mm...
I agree with the thrust of Adam J's analysis of budget and priorities.
Regarding tying yourself to one manufacturer or system, I would say don't. Leave yourself with as many options as possible for when software starts acting the goat and for when a new option appears on the market. Although this may not be an option for you, I can tell you as someone who hosts a number of remotely operated rigs that a basic desktop near the mount, with a lot of USB ports and no hubs or astro-dedicated computing hardware, is very, very reliable.
Olly
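As a quick check of the sensor sums above, a diagonal follows from the published width and height by Pythagoras. A minimal sketch (the 23.5 x 15.7 mm APS-C dimensions for the ASI2600 and the 11.31 mm square chip for the 533 are assumptions, taken from commonly published specs):

```python
import math

def diagonal_mm(width_mm: float, height_mm: float) -> float:
    """Sensor diagonal from its width and height (Pythagoras)."""
    return math.hypot(width_mm, height_mm)

# Assumed published dimensions: ASI2600 (APS-C) 23.5 x 15.7 mm,
# ASI533 11.31 x 11.31 mm (square chip).
print(round(diagonal_mm(23.5, 15.7), 1))    # ~28.3 mm
print(round(diagonal_mm(11.31, 11.31), 2))  # ~15.99 mm
```

The ratio of the two diagonals, not the "1 inch" label, is what tells you how the fields of view compare.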
-
20 minutes ago, Laurin Dave said:
Oh no! We have a widefield Flaming Star and a huge California-to-M45 already... Madness! But Paul is mulling over something an order of magnitude madder than that...
😁lly
-
This is posted on the deep sky board but since it's a 42 panel Samyang 135 mosaic I'll pop it here as well. Love this lens!
Olly
-
The 1 inch sensor of the 533 strikes me as a bit small for a deep sky camera. In any upgrade I'm sure you'd want something larger. Many deep sky objects are surprisingly extended.
CMOS colour cameras are good, but what is your level of light pollution? If it's high you might find mono more productive.
Olly
-
58 minutes ago, vlaiv said:
I would like to point a few things out on this topic.
- A fixed set of intensities mimics the way we see. We are capable of seeing a larger intensity difference than our computer screens can show - but only because of the anatomy of our eyes - we have an iris that opens and closes depending on the amount of light. If we fix the size of the pupil - then there is only a small range of brightness that we can perceive at the same time.
Having a fixed range of brightness in processing is not a limitation of image processing / display systems - it is, for the most part, a limitation of our physiology.
- The problem is not showing a very large intensity range in a single image. The problem is with detail / structure. To be able to see detail and structure in something we need to distinguish enough intensity levels to make out those details and structures. Since we have a limited "budget" of intensities - we can choose which part of the large range we will show in fine detail / granularity. We can opt to show a fine-grained high intensity range, or medium intensity range, or low intensity range. We can also opt to show, say, the low and medium ranges but with less granularity - or maybe everything with the least granularity. This is a processing choice.
- There are tools that perform "high dynamic range" - but I would caution against using those. They will show both low intensity and high intensity regions in high fidelity - or rather, the tool will attempt to do this using a certain technique. Unfortunately this technique is unnatural and in my opinion should be avoided.
It relies on using a non-monotonic stretch of intensities - something our brain easily recognizes in everyday life as unnatural.
Our brain does not feel cheated if we keep one thing in our processing - the order of intensities. If something is brighter in real life and we keep it brighter in the image - all is good, but if we reverse intensities - our brain will start to rebel and say: this is unnatural.
This translates into the slope of the curve in processing software - if we keep it monotonically rising from left to right - the image will look natural (it may look forcefully stretched and bad in a different sort of way - but it will look ok in the domain I'm discussing) - so:
that is ok - and we have no immediate objection to the image, but this:
feels unnatural - note how the curve first rises, then falls, then rises again - it is not constantly rising going from left to right. This in turn produces some very unnatural-looking regions of the image - like strange clouds, or parts of buildings that lack detail and look flat / gray.
The slope of the curve in Curves is the level of detail in the given intensity region - the more it slopes, the more detail. If it's flat - less detail.
A basic stretch in astronomy images looks like this:
and this is for a reason - we want to show detail in the faint regions - the left of the curves diagram - so we give that part a steep slope, and we don't care about detail in the high intensity regions - so we leave the right part flat.
Here you can see that - in order to have two regions with a steep slope - we need to go back down - and change the direction of the slope - which is no good, as it reverses intensities and confuses our brain (in effect it creates a negative image superimposed on the positive image - and we find negative images unnatural).
You might be applying the same technique without realizing it - if you create two layers that you stretch differently and then selectively blend them - you might end up in the above situation, with a reversal in slope.
I'll try to find a classic example of this in high dynamic range processing and post a screen shot.
Here it is - bringing out the central part of M42 with all the detail - while still maintaining detail in the faint outer reaches. This is intensity inversion.
I agree that HDR techniques usually produce an unnatural look and I only use them in astronomy in cases like M42, where the alternative is, to my eye, worse. If the camera itself cannot span the dynamic range there is no way to present it with any curve.
In daytime photography I sometimes enjoy using HDR precisely because it does give an unnatural look. It makes the picture look like a picture, so to speak. It presents an object or view in a way that we don't normally see it and this can stimulate the eye-brain to look again and think again. I like to get it to the level where you think, 'There is something slightly odd about this...'
Of course, you may not like it at all!
Olly
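The monotonicity point in the quoted post can be sketched numerically. This is a minimal illustration, not any particular tool's algorithm: an asinh stretch (a common astro stretch) never reverses the order of intensities, while a curve that dips in the middle, of the kind criticised above, does:

```python
import numpy as np

def asinh_stretch(img: np.ndarray, strength: float = 100.0) -> np.ndarray:
    """Monotonic non-linear stretch: lifts faint signal strongly
    without ever reversing the order of intensities."""
    return np.arcsinh(strength * img) / np.arcsinh(strength)

def is_monotonic(curve) -> bool:
    """True if the transfer curve never decreases from left to right."""
    return bool(np.all(np.diff(curve) >= 0))

levels = np.linspace(0.0, 1.0, 1001)         # input intensities 0..1
print(is_monotonic(asinh_stretch(levels)))   # True: order preserved
# A curve that rises then falls reverses intensities in the upper range:
hdr_like = np.where(levels < 0.5, levels * 2, 2 - levels * 2)
print(is_monotonic(hdr_like))                # False: inversion present
```

Checking `np.diff` of the final transfer curve is a cheap way to test whether a layered blend has accidentally introduced the slope reversal described above.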
-
Just thinking aloud...
One of the interesting challenges of widefield image processing is that the imager usually has a higher dynamic range in the target than is found in restricted fields of view. We are likely to find patches of obscuring dust which lie well below the background brightness, we have the clear background itself, then nuanced levels of reflecting dust and the full range of emission nebulosity. The dynamic range of the software does not, unfortunately, expand to allow for this. What is more, the imager will want to bring out very large-scale structures in the faint, usually dusty, signal and doing so consumes more of the available range. This is borne out by the fact that, when I add high res close-up data, the background of the original processing is almost always much darker than in the widefield.
Olly
-
I'll say it again: dither is for DSLRs and, maybe, CCDs.
CMOS? Why?
Olly
-
3 minutes ago, Mr H in Yorkshire said:
Life is hard for perfectionists, as I was once counselled by a manager. Do what I have done: get in touch with your inner slob and relax... 😉
What bugs me is the inherent pretentiousness, though. Method. Two syllables. Methodology. Five syllables. Three of these syllables are redundant and incorrect but hey, man, five syllables have more gravitas than three, right? And, get this, if someone calls you out on it, you can slag them off as a pedant. Win-win!
Believe me, my inner slob has better things to do!!!
lly
-
( I can't reveal too much about our methodology as it is against the rules of the competition to display method or results prior to the exhibition )
Of course you can't reveal too much about your methodology because the phrase doesn't make sense! Does a biology student reveal their 'biology'? What you are not allowed to reveal is your method. It shouldn't, but this use of 'methodology' to mean 'method' drives me raving nuts!!!
lly
-
This is a CMOS sensor so is it worth dithering? Personally, I wouldn't bother.
Olly
-
34 minutes ago, glafnazur said:
Truly an amazing image Olly. What I really like about it is that it puts everything into perspective, it shows where objects, that we know so well, are in relation to each other and their size. It always amazes me how big some of these objects are.
That's the fun of it, I think. You do need to bear in mind, though, that there are distortions of proportion inherent in rendering a curved reality onto a flat picture. Don't take relative sizes as gospel but as approximations.
Olly
-
8 hours ago, Meluhanz said:
Watching those first images come in is truly amazing.
Olly
-
2 hours ago, tomato said:
Just on the question of the Sy135’s (and the RASA8) amazing ability to capture this much signal in 2 hrs, I’m wondering just how much your dark sky contributes to this. I’ve just watched a video on the AIC about remote imaging in Chile; the sky at the site records an SQM of 22, and from memory your location is not far from this?
We hit SQM22 several times a year but it's not our default, so to speak. We regard anything less than 21.5 as sub-standard but workable down to about 21, depending on the project.
The only place I've ever imaged is here, so I really don't know how our site compares. I do think it would be hard to beat in mainland Europe. Two guys from the professional observatory to our south said they wished they could swap sites with me. They still managed to discover the first exo-planet, though.
Olly
-
20 hours ago, tomato said:
Phew, relieved to see you didn’t pick up the Spaghetti Nebula in all its glory or else I would be selling up now. 😉
A wondrous image, as always.
Confession time: I hadn't realized that the frame included the Spaghetti. On reading your comment, I tracked down an old horse-drawn computer on which I had 33x15 minutes of Canon 200L Ha data. Just the job! Here we go:
Big one here: https://ollypenrice.smugmug.com/Other/Emission-Nebulae/i-Tqb6kBP/A
Olly
-
9 hours ago, Steve Ward said:
Paul spotted your question, Steve, and emailed me with this information:
The nebula at the top right of O2M is a combination of several LDNs - 1534, 1532, 1528, 1527 and the lower "eye" is IC 2087.
Olly
-
7 hours ago, Meluhanz said:
I soo want to use the ASI 183 camera that I bought.
The Star Adventurer Pro manual states a max payload of 11 pounds, so my thought was that it could carry even my Nexstar 8SE, along with the Orion StarMax 102mm.
My last resort will be to use my Canon EOS 600D with my 18-55mm lens.
Unfortunately the 'go to' priority in internet mount discussions tends to be payload. It shouldn't be: the priority should be accuracy. This is what underlies Elp's post above.
To understand this you need to understand the following: an imaging system has a resolution defined by how many arcseconds of sky it places on one pixel. Plenty of online calculators will give you this:
https://astronomy.tools/calculators/ccd
You can increase resolution either by using a longer focal length or smaller pixels. The unit is arcseconds per pixel ("PP).* The instability of the atmosphere (the 'seeing') usually means that going below 1"PP is a waste of time.
Because pixels are getting smaller this limit is reached at increasingly short focal lengths.
Now for what this has to do with your mount: in order to capture the resolution of the system in the real world, your mount needs to track with an error about half as large as your pixel scale. That is going to require a very small error. If your imaging system is working at 2"PP you need to track with no more than 1" of error. The native error of an unguided mount like the EQ6 is between 20 and 30 times more than that. Ouch! That is why we use autoguiders. These will bring an EQ6 down to an error of between 0.5" and 1".
If you image at a scale finer than your mount's accuracy will allow, you might as well image with a shorter focal length. You will capture the same details and have a wider field of view. The Star Adventurer is designed for low resolution imaging. You may be within its payload but you are vastly beyond its tracking precision.
Olly
* In casual daytime photography discussions 'resolution' is often used with sloppy inaccuracy to mean pixel count on the chip. This is nonsense.
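The pixel-scale arithmetic behind those calculator sites is simple enough to sketch. The constant 206.265 converts micrometres over millimetres to arcseconds; the 3.76 µm pixel size here is an assumed example (typical of recent CMOS sensors), paired with the Samyang 135 mentioned elsewhere in the thread:

```python
def pixel_scale(pixel_um: float, focal_mm: float) -> float:
    """Imaging resolution in arcseconds per pixel ("PP).
    206.265 converts (micrometres / millimetres) to arcseconds."""
    return 206.265 * pixel_um / focal_mm

# Assumed example: 3.76 um pixels on a 135 mm lens.
scale = pixel_scale(3.76, 135)
print(round(scale, 2))      # ~5.74 "PP: low resolution, widefield
print(round(scale / 2, 2))  # rough guiding target: half the pixel scale
```

Note that the same 3.76 µm pixels on an 800 mm focal length land under 1"PP, which is why the longer focal length demands so much more of the mount.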
-
The Star Adventurer is a light, portable mount intended to carry cameras and lenses. Even if your reducer works, your focal length will be about 800mm, a thoroughly telescopic focal length requiring a thoroughly telescopic mount. Why not buy a used prime focus camera lens and an adapter to fit your camera, and try that on the Star Adventurer? You cannot expect it to track at 800mm, unguided. My £6000 Mesu mount could not do that.
Olly
-
22 minutes ago, Laurin Dave said:
Marvellous image Olly.. I think you need to get it printed up and placed on the side wall of Les Granges! Great detail at full res... spotted a little Crab, a trace of Spaghetti plus countless more objects which this image places in context.
Dave
More pasta would be nice but, unfortunately, it really doesn't show well in OSC. You never can tell. Some Ha targets come out in a blaze of light but a few play hard to get and we don't have a NB option at present.
I was curious about the substantial nebula below the Monkey Head. Large and bright, I had no idea what it was and had to ask Paul. It's Sh2-261 and it gets next to no attention. I found this nicely done archived image on here and, again, the comments express the same surprise and unfamiliarity. I certainly think we should do it in the RASA and blend it in here, too.
Olly
-
14 minutes ago, step_hen said:
Wow. So much to see. Amazing work Olly and team. I've just bought a Samyang 135!
Stephen
Good choice. Combined with StarXTerminator or StarNet++ it is a real demon. Do not buy the Wega adapter. It is junk.
Olly
-
Another of those mad endeavours! Here is the blurb:
Orion
bordered by Gemini, Auriga, Taurus, Eridanus, Lepus, Canis Major and Monoceros.
The celebrities here include the Spaghetti, Jellyfish and Monkey Head Nebulae, the Crab, the Hyades, the Cone, IC447, the Rosette, the Meissa Nebula, Barnard’s Loop, Barnard 32, The Boogey Man, M78, The Horse and Flame, M42 and the Running Man, NGC2179, The Witch Head and the Wolf. We have the glow of Sirius lower left and plenty of room above Aldebaran upper right. The uppermost bright star is Elnath, definitively placed in Taurus in 1930, prior to which it was also placed in Auriga. Enough glitter for Saturday night at the London Palladium! Most of these photogenic celebs were gently enhanced by the addition, very discreetly, of a little telescopic data which fifteen years of imaging has left in the bank.
This has 42 panels, each with a target of 40 x 3 minute subs. The hard work was done by Paul Kummer, who planned the mosaic, captured it, calibrated the 42 individual images, combined them, sorted out the field geometry and handed me a large linear image to post-process. This I did, blending in the telescopic data along the way. All stars, however, are Samyang.
Peter Woods provided both mount and camera (M Uno and TS2600 OSC). The remarkable Samyang 135 lens is mine, used wide open here at F2. This robotically controlled rig lives in one of our remote imaging sheds.
And here's the pic. If you want to find the Crab you'll need to see the big one: https://photos.smugmug.com/Other/Emission-Nebulae/i-VFj7SdZ/0/437c53cb/X3/ORION MONOCEROS 10K web-X3.jpg
Edit: Please scroll down the thread for an important improvement to the image.
Olly, Paul and Peter.
-
Difficult target done well. And yes, I agree about the extended spiral.
Olly
-
I have used two Mesu friction drive mounts and consider them perfect. Whatever my budget, I would never choose another make, and I host two of the most obvious rivals in my robotic sheds.
The Mesu has a two stage reduction and I think that the metallurgy is very important.
If Mesus slip, I haven't noticed. My number one mount lost not one single sub to guiding error in over ten years of commercial use. I know that this is hard to believe but there it is.
Olly
New to imaging after a few years lay off
in Getting Started With Imaging
Hi Tony, I haven't found anywhere which gives that as the diagonal. Here, for example:
https://www.astroshop.eu/astronomical-cameras/zwo-camera-asi-533-mc-pro-color/p,64475#specifications
Had the chip been as you thought then it would indeed have been only insignificantly smaller than the APS-C.
Three loud cheers for Astroshop EU in the link above because they give the chip size in mm, x axis and y. Why on earth is this vital information not stated clearly on every spec sheet? It is infuriating. I've been banging on about this for years. You need these dimensions for plugging into planetaria to give simulations of field of view. This is certainly something the OP should do before choosing any camera.
Olly
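For the field-of-view simulation mentioned above, the chip dimension in mm plus the focal length is all the planetarium software needs. A minimal sketch of the underlying geometry (the 11.31 mm chip side for the 533 and the 135 mm focal length are assumed example values):

```python
import math

def fov_arcmin(chip_mm: float, focal_mm: float) -> float:
    """Field of view in arcminutes along one sensor axis for a given
    focal length, from the exact angular geometry."""
    return math.degrees(2 * math.atan(chip_mm / (2 * focal_mm))) * 60

# Assumed example: a ~11.3 mm square chip behind a 135 mm lens.
print(round(fov_arcmin(11.31, 135), 1))  # ~288 arcmin (~4.8 degrees) per side
```

Running this for both axes of any candidate camera, at your intended focal length, is exactly the check Olly recommends doing before choosing.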