Everything posted by ollypenrice

  1. I haven't seen an FT Crayford though I have their rack and pinion on my TEC 140. It's excellent. You're saying that the FT uses a smooth roller running on an anodized smooth drawtube? Olly
  2. I have a Moonlite but don't use it. For imaging it is, in my view, a joke. A smooth steel roller runs on a shiny anodised drawtube. Come now, how is that meant to work? Baader have gone through two iterations to improve on that, firstly with the steeltrack (coarse steel strip replaces shiny drawtube) and then diamond (coarse industrial diamond strip replaces coarse steel strip.) The Moonlite just demonstrates that you can sell anything if it's finished in pretty anodised colours. Lovely on a Dob, smooth as you like and always horizontal. I wouldn't use one anywhere else. Olly
  3. A go-to and tracking mount is not necessarily a photographic mount. On a modest budget I would forget photography and buy a nice basic Dob. (This is an astrophotographer speaking, by the way.) Olly
  4. Sure. This is HaLRGB at lower res than yours, about 1.8"PP. It's a very tight crop but there are nice little things in there. Likewise your NB has found things that this didn't. Olly
  5. Deliciously resolved details in the dusty part of the trunk feature. Are you tempted to try this in LRGB as well? There are entirely different things to see in the top of the trunk. Super image. That kit is working! Olly
  6. I'm using 140mm at 0.9"PP and like the result. (The comparison in our earlier conversation was between this and a 14 inch ODK at 6.2"PP which I felt was oversampled.) I agree that the OP would be pushing it going for 1"PP and your guide above agrees broadly with my experience. Olly
  7. Oooh! OOOHHH!!! But that is not what you said when I posted a comparison of 0.6"PP and 0.9"PP images of M101 and M51! You said you saw more detail in the 0.6"PP versions. I said I didn't think it was worth bothering with!! Olly. By the way (on more serious matters) I had to copy and paste your green grin icon since I don't have it offered any more. I like that icon. I feel it resembles me to perfection!
  8. Almost, apart from 'because of the 127 MAK's focal ratio compared to a faster scope, the field of view is reduced...' The field of view is reduced because the focal length is longer, not because the focal ratio is slower. My advice would be to remember that focal ratio is what's called a derived variable. It describes the relationship between two primary qualities of a telescope. These are its aperture and its focal length and they are hard, physical properties which never deceive. In AP we have to be wary of focal ratio because it can be deceptive. We cannot just say 'F2 is twenty times faster than F10' (as is claimed for the Hyperstar focal reducer, for example) because the aperture is unaltered, so the amount of light from the galaxy is unaltered, so how can the exposure time be reduced? By making a tiny image of the galaxy rather than a big one is the answer. Not very impressive after all! To start from the beginning, you want a certain resolution in your galaxy images. Say you choose 1 arcsec per pixel. You need a scope and camera combination which gives about that. So your pixel size is now matched to your focal length. Now, how fast do you want it to be? That will depend on aperture. (Yes, more aperture will mean a faster F ratio, but don't be fooled. The 'active ingredient' here is aperture. More light.) The 127 Mak has rather a small aperture for its focal length. The same focal length with more aperture would be preferable for deep sky imaging. (I'm putting it this way rather than saying 'The Mak is a bit slow for DS imaging' because I want to keep focal length and aperture up front in the explanation, given the confusion brought by F ratio.) Olly
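The arithmetic behind this can be sketched in a few lines: image scale follows from pixel size and focal length alone, and the f-ratio is merely focal length divided by aperture, a derived quantity. The 127 Mak figures and the 3.76 µm pixel size below are illustrative assumptions, not a recommendation.

```python
def image_scale(pixel_um: float, focal_mm: float) -> float:
    """Arcseconds per pixel: 206.265 * pixel size (um) / focal length (mm)."""
    return 206.265 * pixel_um / focal_mm

def f_ratio(focal_mm: float, aperture_mm: float) -> float:
    """Derived variable: relationship between focal length and aperture."""
    return focal_mm / aperture_mm

# 127 Mak: 127 mm aperture, 1500 mm focal length (nominal figures),
# paired with an assumed 3.76 um pixel camera.
print(round(image_scale(3.76, 1500), 2))   # ~0.52 "/pixel
print(round(f_ratio(1500, 127), 1))        # ~11.8, i.e. around f/12
```

Note that nothing in `image_scale` mentions the f-ratio: resolution is set by focal length and pixel size, while aperture then decides how quickly you gather the light.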
  9. When these scopes are good they are very good. There was a long and detailed post by Singlin on here a few years ago in which he went through all he'd done to get his into good shape. I can't find that thread but there is good info on this one. Laser Jock runs through an interesting list of mods. I think any reflector is best guided by OAG because of potential mirror movement. Olly
  10. Could we have a recording of the lens cap going 'Ding' please? 😁lly
  11. You don't have to upgrade, though. I prefer 5 over 6 myself. Olly
  12. It was a 150 F8 refractor. Sorry not to have been clearer. Mind you, how much difference would there be? Olly
  13. Agreed. At one time I did have the 150 F8 on an EQ3/2 and it was possible but marginal, irritatingly wobbly and hopeless in the wind. Olly
  14. I have two EQ sixes and never have to lock the clutches hard. I'm not a human torque wrench but I apply maybe a fifth the force I could apply if I really tried. Maybe less but certainly not more. Olly
  15. I suppose I ought to try but AstroArt does the job juslikethat in a nanosecond, I'm used to it and I can't see anything wrong with it! The rejecting of bad subs is an interesting one. Once you have sufficient subs to get an effective sigma rejection how bad does a sub have to be to be worth rejecting? At some point, yes, you have to bin a bad one but a bit of soft focus, a hint of trailing...? I bung 'em in, especially in RGB. Olly
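For anyone unfamiliar with sigma rejection, here is a minimal numpy sketch of the idea (not AstroArt's actual implementation): per-pixel values lying more than k sigma from the stack mean are excluded, so a trailed or slightly soft sub only contributes where it agrees with the rest.

```python
import numpy as np

def sigma_clip_stack(subs: np.ndarray, k: float = 2.5) -> np.ndarray:
    """subs: (n_subs, h, w) array of registered subs.
    Returns the mean image with per-pixel outliers beyond k sigma rejected."""
    mean = subs.mean(axis=0)
    std = subs.std(axis=0)
    # Epsilon keeps identical-valued pixels (zero sigma) from being rejected.
    keep = np.abs(subs - mean) <= k * std + 1e-12
    return np.where(keep, subs, 0.0).sum(axis=0) / keep.sum(axis=0)

# Nine clean subs plus one carrying a bright trail pixel (invented values):
stack = np.full((10, 2, 2), 100.0)
stack[9, 0, 0] = 5000.0                # the outlier
print(sigma_clip_stack(stack)[0, 0])   # -> 100.0, trail rejected
```

With enough subs in the stack, the rejection quietly handles the odd defect, which is why a mildly imperfect sub can still earn its keep.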
  16. Processing is a huge part of imaging... Olly
  17. I would concentrate on two things with the original data. 1) Get the colour channels balanced and 2) Stretch without clipping the black point. I'm sure your data has more to give. Olly
  18. Complicated. If you are a beginner you will find it much easier to shoot equal amounts of L and R and G and B. As your processing skills improve you will find that you can shoot considerably more L than colour and that this will be an advantage on targets with very faint traces which you are trying to bring out. But you will need more complex processing in order to avoid the luminance bleaching out all the colour from the RGB. The simplest route at capture is to scroll through the filters, as in LRGB, LRGB, etc. The more sophisticated way is to shoot blue and luminance when the object is at its highest elevation because the atmosphere is at its least harmful at high elevation. Blue is the worst hit by the atmosphere and luminance carries all the detail, so it wants to be sharp. The most sophisticated approach involves shooting the luminance on nights with the best seeing (the most stable seeing, not necessarily the most transparent sky.) If you use an FWHM measurement to focus you will soon learn what is a good value for your rig and when you have a nice low FWHM you should shoot luminance. When the seeing is poor and the FWHM values are high, shoot colour. This is vital in high resolution imaging but only 'useful' in lower resolution and widefield imaging. I hope that's not 'too much information!' If it is, just shoot equal amounts of LRGB, LRGB, etc. Olly
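The seeing-based scheduling rule at the end reduces to a one-line decision; the 2.5 arcsecond threshold below is an invented example, since the right figure is whatever a nice low FWHM looks like on your own rig.

```python
def choose_filter(fwhm_arcsec: float, good_fwhm: float = 2.5) -> str:
    """Shoot luminance in good seeing, colour otherwise.
    good_fwhm is a hypothetical threshold; learn your rig's typical value."""
    return "L" if fwhm_arcsec <= good_fwhm else "RGB"

print(choose_filter(1.9))   # steady night -> "L"
print(choose_filter(3.4))   # poor seeing  -> "RGB"
```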
  19. The reducer will have a specified 'chip distance,' the required distance between the reducer and the chip. You need to know what this is and then add or remove hardware to get your chip to that distance. The reducer manufacturer will surely have published this somewhere. Olly
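The spacing arithmetic is simple subtraction. The 55 mm chip distance and the camera and filter-wheel depths below are hypothetical numbers for illustration; substitute your reducer's published figure and your own hardware measurements.

```python
def spacer_needed(chip_distance_mm: float, *existing_mm: float) -> float:
    """Required chip distance minus everything already in the light path
    between reducer and chip; the remainder is spacer length to add."""
    return chip_distance_mm - sum(existing_mm)

# Example: 55 mm required; camera back focus 17.5 mm, filter wheel 20 mm.
print(spacer_needed(55.0, 17.5, 20.0))   # -> 17.5 mm of spacers
```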
  20. Wide field at low resolution makes fewer demands on tracking precision than the contrary. Some deep sky objects are, in fact, so large that they are rarely photographed at all. This is a famous image taken with the Canon 'nifty fifty' lens. http://sguisard.astrosurf.com/Pagim/Orion_constellation-HRVB-50mm.html Barnard's loop and the Meissa Nebula are DS objects but how often do you see them? The considerable skills needed to make an image like this can be learned on data like these. Olly
  21. I also know lots of people who've had exactly this problem with Team Viewer. They've usually moved to Any Desk. If Team Viewer don't want to provide a free service they should stop offering it. Offering it and then withdrawing it under patently spurious suspicions is fraudulent. Olly
  22. Dave, remember that, if you have an RGB image as well, this may give you all the 'short' exposures you need. I quite often use the RGB-only as a less saturated stack to cover regions saturated in the luminance layer. I say 'quite often,' by which I mean, 'Quite often in the rare cases where I need less exposure.' You can extract a synthetic luminance from the RGB if you like but it will work in the same way as a luminance layer in its natural form. When layer masking three exposure lengths for M42 I don't shoot the shortest in luminance at all. Why bother? If you're trying to reduce signal there is no point in capturing luminance in order to capture more! Olly
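Extracting a synthetic luminance is just a per-pixel combination of the three channels. A plain mean is sketched below; weighted combinations are an equally valid choice. The pixel values are invented for illustration.

```python
import numpy as np

def synthetic_luminance(rgb: np.ndarray) -> np.ndarray:
    """rgb: (h, w, 3) linear image. Returns an (h, w) unweighted channel mean
    as a stand-in luminance layer."""
    return rgb.mean(axis=2)

rgb = np.array([[[0.2, 0.4, 0.6]]])     # one pixel, R/G/B
print(synthetic_luminance(rgb))          # ~0.4
```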
  23. Vlaiv's is an informative post regarding dynamic range. Most of the people I know using modern CMOS cameras are using five-minute subs. It's easy to see if you're saturated the moment you look at the linear stack and, if you suspect an area might be saturated, you just mouse over to read its values. My guess is that multiple sub lengths will be as rarely needed as with CCD imaging. SQM 21.6 is dark! We occasionally hit 22 here but it's usually 21.6 or so. Guests who've been to Namibia say the zenith is similar there to here but that the sky is dark to lower elevations, which is to be expected. Olly
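The 'mouse over the suspect area' check can also be run in one line over the whole linear stack. The 16-bit full-well value below is an assumption; use your camera's actual saturation limit.

```python
import numpy as np

def saturated_fraction(img: np.ndarray, full_adu: int = 65535) -> float:
    """Fraction of pixels at or above the assumed saturation value."""
    return float((img >= full_adu).mean())

# Invented 2x2 linear stack: two pixels clipped, two not.
img = np.array([[65535, 1200], [30000, 65535]])
print(saturated_fraction(img))   # -> 0.5
```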
  24. Imaging with narrowband filters is looooong exposure territory or it may not be worth it. Remember that a narrowband filter cannot bring in new light. All it can do is exclude light not of its selected wavelength. Your chip gets just as much Ha signal without the filter as with it. The difference is that it ONLY gets the Ha signal with it. Why bother? Because, if you have enough of it, you can drag out structures and details which only exist in Ha. But you need a lot of signal... Olly
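A toy shot-noise calculation shows why the filter still pays even though it adds no Ha photons: the Ha signal rate is unchanged while the broadband sky rate collapses, so the noise under the signal falls. All rates below are invented purely for illustration.

```python
import math

def snr(signal_rate: float, sky_rate: float, t: float) -> float:
    """Shot-noise-limited SNR for signal accruing at signal_rate over a
    sky background accruing at sky_rate, both in counts per second."""
    return signal_rate * t / math.sqrt((signal_rate + sky_rate) * t)

# Same hypothetical Ha rate (1 count/s) for a 600 s sub; the filter
# changes only the sky rate it lets through.
print(round(snr(1.0, sky_rate=100.0, t=600), 1))   # unfiltered: ~2.4
print(round(snr(1.0, sky_rate=1.0,   t=600), 1))   # filtered:   ~17.3
```

The signal term is identical in both calls; only the noise under it shrinks, which is exactly the 'it ONLY gets the Ha signal' point above.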