Posts posted by ollypenrice
-
This looks interesting but, on Kindle, it's rather expensive and has a large file size, which makes me wonder whether the book's format will suit a Kindle. Pictures, graphs and other illustrations don't work well on the Kindle, for me.
Thanks in advance,
Olly
-
1 hour ago, TiffsAndAstro said:
I'm hoping I can just slew and centre M87 in NINA over multiple nights. DSLR dark frames might not be an issue for too much longer; I think it's telescope and astro cam time.
After what I can only assume is a decent three-point polar alignment, it slews and is usually about 4000 pixels out. Then it adjusts and is about 40 pixels out, then one last go and it's a couple of pixels out.
In that case you'll find nothing difficult about multiple night imaging. Have fun!
Olly
-
1 hour ago, TiffsAndAstro said:
Confession time: I only took 5 dark frames. Also, I used 'library flats' because I am lazier than Cuiv and it was just another test. I wasn't really expecting to actually see any galaxies, and I gave up counting at twenty. I think I've managed to clean the fingerprint off the front element that I think caused the black hole.
It's good to know the cause and likely solution. I'm rushing a bit into more complexity, but so far it's been OK. Most problems are with niggly USB things. And focus. And clouds. I need time to fail more, then fail a bit less the next time.
Many DSLR users don't use darks because they add more noise than they remove, and using only 5 will certainly do that. A large dither (12 pixels) is by far the best way to go.
Shooting multiple nights is perfectly simple provided you have the camera consistently aligned. I always align along RA and Dec (either in portrait or landscape) because it is repeatable. Once set up, simply slew slowly in one axis while taking an exposure of about 3 seconds. You'll get star trails and these will show the current orientation of the camera. Once they are horizontal or vertical on the chip you're good to go.
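The star-trail trick above can be reduced to a little arithmetic: the angle of the trail on the chip is the camera's residual rotation. A minimal sketch, assuming hypothetical endpoint pixel coordinates read off the trailed exposure (the function name and numbers are illustrative, not from any capture software):

```python
import math

def trail_angle(x0, y0, x1, y1):
    """Angle of a star trail relative to the chip's horizontal axis, in degrees.
    (x0, y0) and (x1, y1) are the trail's endpoint pixel coordinates."""
    return math.degrees(math.atan2(y1 - y0, x1 - x0))

# A trail running exactly along a pixel row means the camera is aligned with RA:
print(trail_angle(100, 200, 900, 200))  # 0.0 -> aligned
print(trail_angle(100, 200, 900, 230))  # ~2.1 degrees of residual rotation left to dial out
```

Repeat the slew-and-expose until the reported angle is close to zero (or ninety, for portrait orientation).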
You can plate solve to reframe, though I never used to do this, working manually. I just looked at the star pattern round the edges of the chip.
Olly
-
More integration is always better but the benefits are very target-specific. If you are chasing faint signal like tidal tails, faint galaxy arm extensions, accretion loops, outlying nebulosity, IFN, etc., then the benefits are enormous. When you are trying to drag something faint out of the background sky, you will never have enough. If you have a reasonably bright object already but would like to sharpen its brighter details, the benefits are considerable. You need signal to sharpen. Multiply by four and you'll see a difference. If you are imaging an old elliptical galaxy with little structure, more signal will make it cleaner and smoother and, quite probably, a little bigger but the benefits will not compare with the previous examples.
With a screamingly fast F2 system, cooled CMOS camera and a very, very dark site we regard 2.5 hours as a minimum, so that would equate to about 11 hours in a small refractor of comparable focal length.
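The scaling above follows from integration time growing with the square of the focal ratio (for the same signal per unit area of sky). A minimal sketch; the f/4.2 refractor figure is an illustrative assumption chosen so the numbers line up with the 2.5 and 11 hour figures quoted:

```python
def equivalent_hours(hours, f_from, f_to):
    """Integration time needed at focal ratio f_to to match the signal gathered
    in `hours` at focal ratio f_from, assuming time scales as (f-ratio) squared."""
    return hours * (f_to / f_from) ** 2

# 2.5 h at f/2 is roughly equivalent to 11 h in a small f/4.2 refractor:
print(equivalent_hours(2.5, 2.0, 4.2))  # ≈ 11.0 hours
```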
Olly
-
This is a good one for comparison. https://www.cloudynights.com/topic/616189-m51-how-far-can-i-push-until-it-breaks/
Olly
-
29 minutes ago, murwille said:
Hopefully all of this was a hoax! Is Alyn Wallace Dead : “Alyn Wallace: Alive or Dead?” – County Local News I don't know yet what to believe.
I trust any hoaxers are feeling proud of themselves and are happy to have reached their personal peak on the food chain where they will be a happy find for any passing slugs and snails.
Olly
-
It's a total non-issue if you have a decent number of subs and a good outlier-rejection algorithm, as others have said. Now that I'm working with CMOS in 3-minute subs, rather than CCD with up to 30-minute subs, satellite trails simply vanish in the stack, despite the increase in the wretched things.
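The outlier rejection being relied on here is typically a per-pixel sigma clip during stacking. A minimal sketch of the idea (not any particular stacking program's implementation; the kappa value and test data are illustrative):

```python
import numpy as np

def sigma_clip_stack(subs, kappa=3.0):
    """Mean-stack a pile of subs, rejecting per-pixel outliers (e.g. satellite
    trails) lying more than kappa standard deviations from the pixel's median."""
    stack = np.asarray(subs, dtype=float)   # shape: (n_subs, height, width)
    med = np.median(stack, axis=0)
    std = np.std(stack, axis=0)
    # keep only values near the per-pixel median; tiny epsilon keeps flat pixels
    mask = np.abs(stack - med) <= kappa * std + 1e-12
    return np.sum(stack * mask, axis=0) / np.maximum(mask.sum(axis=0), 1)

# Ten clean subs of a flat patch of sky, plus one crossed by a bright trail:
subs = [np.full((4, 4), 100.0) for _ in range(10)]
trail = np.full((4, 4), 100.0)
trail[2, :] = 5000.0  # the satellite
subs.append(trail)
print(sigma_clip_stack(subs)[2, 0])  # the trail row comes back to ~100
```

With enough subs, the trailed values are in the minority at every pixel and are simply thrown away, which is why short CMOS subs make trails a non-issue.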
Olly
-
Super resolution in the galaxies but is the black point slightly clipped? It's very flat and black. I wonder if you could trade a bit of noise for a bit more faint stuff?
Olly
-
I'd put it on a Skywatcher AZ EQ6. AZ (Alt Az) is much nicer for visual because eyepiece position and orientation are more consistent and this is OK for planetary imaging, too. It's quick to set up, with no polar alignment. But... when you want EQ orientation, you have it. Finally, you will not be under-mounted, which is just so nice.
For Ha viewing I would just buy a dedicated Ha scope at whatever budget suits your pocket.
Olly
-
For focus, your capture software should give you a Full Width Half Max reading on either a chosen star or an average across the chip.
https://en.wikipedia.org/wiki/Full_width_at_half_maximum
This means that it looks at the bell curve of brightness going from one side of the star, through the middle, to the other side, and it measures the stellar width half way up the curve. It can't accurately measure from one side of the full stellar image to the other because the faintness of the star's edges makes it impossible to be sure just where it starts and ends. If you're using this FWHM facility in your software, the star must not be saturated. If it is, the top of the curve will be chopped off, leaving it flat-topped and the measurement unreliable. To avoid saturated stars you can choose fainter ones or shorten the exposures, but exposures of 3 seconds or more average out the seeing and give more stable FWHM readings. Do expect them to vary from reading to reading, though, and keep the lowest in mind.
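The half-maximum measurement described above can be sketched in a few lines. This is a simplified, hypothetical version of what capture software does internally, run on a synthetic Gaussian star profile (all names and numbers are illustrative):

```python
import numpy as np

def fwhm(profile):
    """Full width at half maximum of a 1-D, background-subtracted star profile.
    Finds where the curve crosses half the peak on each flank, with linear
    interpolation between samples."""
    profile = np.asarray(profile, dtype=float)
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    left, right = above[0], above[-1]

    def cross(i, j):
        # position where the profile crosses `half` between samples i and j
        return i + (half - profile[i]) * (j - i) / (profile[j] - profile[i])

    lx = cross(left - 1, left) if left > 0 else left
    rx = cross(right + 1, right) if right < len(profile) - 1 else right
    return rx - lx

# A Gaussian star of sigma = 2 px: theory says FWHM = 2.355 * sigma ≈ 4.71 px
x = np.arange(21)
star = np.exp(-((x - 10) ** 2) / (2 * 2.0 ** 2))
print(fwhm(star))  # close to 4.71, within the error of coarse pixel sampling
```

This also shows why a saturated star breaks the reading: with the peak clipped flat, the "half maximum" level and the peak position are both wrong.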
In practice, I would use the B-Mask in live view on your bright alignment star, frame up the image and then check FWHM just before starting the run. The thing about FWHM is that you'll find a focus star without having to leave the target whereas, with a B mask, there might not be a suitable one in the frame. So... initial bright star focus with B mask and then FWHM on the target region after that. Check focus regularly - as ever.
Olly
-
10 hours ago, Gerr said:
For me, way better. We can now trace spiral detail much further into the core.
Olly
-
13 minutes ago, Gerr said:
Hi Olly,
Core isn’t saturated in the linear data. I guess I like the centres bright as long as I don’t obscure the dust lanes. A number of processing connotations can be followed dependent on your ‘artistic’ side. It’s what makes astro processing so difficult as there is no hard and fast rule to follow - whatever looks good I suppose!! 😂
Thanks,
Geraint.
I'm not sure I agree. Not destroying information in processing is as much scientific as artistic and going from unsaturated to saturated does destroy information.
Olly
-
I agree with Elp. This would be great with the colour balance eased away from red in the bright parts.
Olly
-
An attractive nebula, nicely done.
Things I might look at would be 1) noise - NoiseXterminator is so effective. 2) Colour noise in the background sky. It's quite 'colour busy', and just selecting the background and reducing saturation might make a difference.
Olly
-
On 20/02/2024 at 10:54, JOC said:
For viewing the heavens the key seems to be the apparent magnification from the EP.
I wonder if the active ingredient here is the exit pupil?
Olly
-
I'm not a fan of mini-computers on the scope. I host six remote instruments and have replaced a lot of them for their owners. For long term reliability I like an observatory desktop with a lot of USB ports and I gather the cables at the dovetail and at the top of the counterweight bar and let them hang from there. Clearly this isn't a portable solution.
Olly
-
1 hour ago, Rodd said:
Yes, lovely. I'm going to see if I can find my old linear TEC140 data and see what difference modern processing tools make.
Olly
-
I'm surprised this hasn't piqued more interest. It certainly doesn't look like near IR in your data.
Olly
-
That green glow... I bet you could be persuaded...
These are great galaxies.
Olly
-
It's very good and the colour is far more accurate than the bright blue spirals which we often see with M101.
The core is saturated in the first one and still saturated, though less so, in the second. The first thing I'd want to do is look at the core in the linear data. Is that saturated? If it isn't, there is no need for it to become saturated during the stretch. You just need a better stretch, or a blending of two stretches. Maybe just a hand-shaped stretch in Curves would do it. What does the linear core look like?
Olly
-
For me, this image didn't seem exceptional until I clicked for the largest size - and then I found that the central bulge and dust lane were absolutely stunning. Unfortunately the last one seems to have been posted at lower resolution, or am I giving it the wrong clicks? It needs to be seen in large format to show its class.
What I do think is that the fainter outer regions are noisy, with a pronounced grain. I'm sure Russ Croman's Noise Xterminator would fix that easily and might allow you to give the lower brightnesses a bit more of a stretch. I wouldn't apply it to the brighter parts or dust lane. Those are superb.
Olly
-
Ultra-promising.
Olly
-
5 hours ago, Budgie1 said:
Interesting. This is the blue channel from our OSC camera.
I'd probably vote for reflection, but who knows?
Olly
-
4 hours ago, TiffsAndAstro said:
Apologies for the title, but it's hard to describe what I'm trying to search for.
With limited visibility in my backyard, roughly west, I can see Orion as it drops, followed by other stars at the same distance from the NCP.
Is there a way to take mosaics as they pass by and then stitch them together?
Because it might look cool.
The answer is 'yes.' This is a mosaic combining 42 individual images.
I can't tell you how to do it in five minutes, though!
Olly
What is the best broadband filter in 2023 for OSC cameras?
in Imaging - Discussion
Posted · Edited by ollypenrice
Clarification
Don't try to calculate crop factors at all. They are entirely meaningless in astrophotography. 'Crop factors' imply that you can improve resolution by reducing chip size, which is pure nonsense.
There are two dimensions which matter:
1) The size of the pixels. These, along with focal length, determine the resolution in arcseconds per pixel. (The pixel count of the chip on its own says nothing at all about the resolution.)
2) The dimensions of the chip in mm. These determine the field of view.
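The two dimensions above reduce to two standard formulas: image scale in arcseconds per pixel is 206.265 × pixel size (µm) ÷ focal length (mm), and field of view follows from the chip dimensions and focal length. A minimal sketch with illustrative numbers (the 3.76 µm pixel and 23.5 mm chip width are typical example values, not a specific recommendation):

```python
def pixel_scale(pixel_um, focal_mm):
    """Image scale in arcseconds per pixel."""
    return 206.265 * pixel_um / focal_mm

def field_of_view_deg(chip_mm, focal_mm):
    """Field of view in degrees along one chip dimension (small-angle approximation)."""
    return 57.2958 * chip_mm / focal_mm

# e.g. 3.76 micron pixels and a 23.5 mm-wide chip on a 530 mm focal length scope:
print(round(pixel_scale(3.76, 530), 2))        # 1.46 arcsec per pixel
print(round(field_of_view_deg(23.5, 530), 2))  # 2.54 degrees across
```

Note that neither formula mentions pixel count or any 'crop factor': scale comes from pixel size and focal length, field of view from chip size and focal length.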
If you muddy the waters by using the kinds of shorthand used in daytime photography you go right up a gum tree.
Olly