Posts posted by ollypenrice

  1. You're right that there is a slight R/L elongation but it really is very slight. I think it's fine to try to get to the bottom of it but it would be a shame to let it spoil the image for you.

    Olly

    Edit: While we're being pernickety, I'd be careful where you place your background markers in DBE; in this image I suspect they were a tad too close to the galaxy. There's the tiniest hint of a dark ring around it, maybe? Or am I inventing it? ABE tends to do this on galaxy images, I've noticed.

  2. You might be surprised by how small a roll-off observatory can be. The most compact design is usually called the 'sentry box', with the entire shed rolling off when observing. This way the shed can be a tight fit around the mount and scope, since it doesn't need to house the observer as well. They can also be made incredibly easily: you start with a small concrete base and pier. You buy the smallest, simplest shed that will fit over your setup; these can be wooden, plastic or metal. You make a plywood floor the size of the shed's footprint and fit it with little wheels running on rails. The floor needs a slot cut into it so it can run halfway round the pier, and then you just build the shed up on the wooden rolling floor.

    I've made two sheds like this but the simple idea of using a rolling floor to carry a standard bought-in shed didn't occur to me so I spent ages welding up chassis and frames myself. Curses!! 

    Olly

    • Like 2
    • Haha 1
  3. 6 hours ago, Stub Mandrel said:

    Thanks Olly.

    I do aim to keep things slightly unbalanced.

    Something I have noticed is that guiding is noticeably worse when low down (<20 degrees), but this may be a combination of bad seeing and poor contrast due to LP.

    I think it's the seeing. It's exactly the same for us even with pretty dark horizons.

    Olly

    • Thanks 1
  4. Vlaiv is perfectly correct that the round stars test is almost meaningless but at least with round stars you can produce an attractive image. When setting up the first Mesu we always had round stars but by tuning the guide parameters they got smaller and smaller!

    I don't think there's any reason to believe that the HEQ5 is less accurate than the EQ6, provided both are within payload.

    Any backlash adds to the guide error, and the worse the seeing the more it does so. (When corrections are few there is a reduced tendency for the mount to be sent oscillating across the backlash.) So losing the backlash is a very good idea. The other ways to combat backlash are to run slightly east-heavy and slightly polar-misaligned. Running east-heavy keeps the drive in 'push' mode, and the misalignment means you can disable guiding in one direction in Dec, letting the other push in the direction of correction. This does work and won't stop you from doing 15-minute subs.

    The duration of guide subs is also something to tune in the light of prevailing conditions. Our EQ sixes thrive on short guide subs if the seeing is good because they have pretty rapid PE. However, if the seeing is bad this has them chasing it, so we do better with longer subs which average out the position of the guide star's image. (It's worth remembering that the guide trace has no way of knowing where the real star is. It only knows where its image is. So very short subs do give a better trace, but is that a trace of the real star or just of its image? Curses!!) There's a little sketch of this averaging effect at the end of this post.

    The other variable within the guide trace is the position of the scope between corrections. Again we have no means of knowing, but Avalon claim that with their backlash-free belt drive the correction is fed in faster so that, between guide inputs, the true location of the scope is on target for more of the time. With any backlash oscillation in play it's a good bet the situation between inputs will be worse.

    Olly

    • Like 4
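    As a toy illustration of that averaging point (made-up numbers only; the 1/sqrt(N) behaviour is plain statistics, not a model of any particular mount or seeing):

```python
# Toy sketch: why longer guide subs average out seeing jitter.
# The guide star's true position is 0; each short sample is displaced by
# random 'seeing' noise. Averaging N samples (a longer guide exposure)
# shrinks the scatter of the measured centroid roughly as 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(1)
seeing_rms = 1.0                  # jitter per short sample, arbitrary units
n_trials = 10_000                 # repeat many times to estimate the scatter

for n_samples in (1, 4, 16):      # ever longer guide exposures
    centroids = rng.normal(0.0, seeing_rms, (n_trials, n_samples)).mean(axis=1)
    print(f"averaging {n_samples:2d} samples -> centroid RMS {centroids.std():.2f}")
# Roughly 1.00, 0.50, 0.25: the longer 'exposure' reports a position closer
# to the real star, at the cost of issuing corrections less often.
```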
  5. I also use the AP adapter and it would astonish me if AP didn't do a good job of blackening. They're a pretty serious organization.

    If using paint, be aware that you shouldn't use anything made with dyes, which can be reflective at other wavelengths (I forget which). You need a paint made with pigments, such as the high-temperature paints sold for barbecues and stoves. There's a paper about this somewhere on the net.

    Olly

    • Like 1
  6. 7 hours ago, Whirlwind said:

    However, all of these are going to be dependent on the system being used. The FSQs are well corrected, sealed units so there is less likelihood that these will show some form of aberration. There's likely a number of factors that come into play. What we do know is that the size of the shadowing is dependent on the distance to the sensor. The further away, the larger it becomes, but also the more diffuse, as the effect is spread out over a larger area of the CCD. The reason we don't see dust from an objective is because it is so far away that effectively it has been 'diffused' to being not visible.

    On that basis it then becomes a question of the percentage change where moving the sensor relative to the dust makes a noticeable difference. If the back focus distance between CCD and dust were 50mm and by focusing you changed that by 1mm, then the difference in the dust shadow will be more significant compared to one where the back focus distance is much larger (let's say 150mm). It will also depend on the size of the pixels, as you will be more sensitive to such changes as they get smaller. Hence it may not be that such effects aren't there in remote setups, just that they aren't noticeable as the setup is designed. However, more flexible setups may find they are more susceptible to this kind of fluctuation.

     

    Edit: apologies for the typos, I hate typing on an iPad.

    The theory is perfectly sound, I agree, but does this actually ever happen in practice? I use pretty much the same setup as the OP - sometimes exactly the same depending on which camera he's using - and I've never seen such an effect.

    I've also hosted two robotic TEC140s and, again, never encountered any refocusing effects on flats. (The rough numbers sketched at the end of this post suggest why the effect is usually too small to notice.)

    Olly
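    As a rough sketch of the arithmetic behind Whirlwind's percentage argument (the 50 mm and 150 mm spacings and the 1 mm shift come from the quote above; the f/5 focal ratio is an assumption, purely for illustration):

```python
# A dust shadow's diameter scales as (distance from sensor) / (focal ratio),
# so a fixed 1 mm change in that spacing matters proportionally more when
# the dust sits close to the sensor.
def shadow_mm(distance_mm: float, f_ratio: float) -> float:
    """Approximate dust-shadow diameter on the sensor, in mm."""
    return distance_mm / f_ratio

F_RATIO = 5.0   # assumed f/5 system, for illustration only

for dist in (50.0, 150.0):
    before = shadow_mm(dist, F_RATIO)
    after = shadow_mm(dist + 1.0, F_RATIO)       # the 1 mm spacing change
    change_pct = 100.0 * (after - before) / before
    print(f"dust at {dist:.0f} mm: shadow {before:.1f} -> {after:.1f} mm "
          f"({change_pct:.1f}% change)")
# At 50 mm the shadow grows by 2%; at 150 mm by only about 0.7% - consistent
# with such shifts rarely being noticeable in practice.
```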

  7. 4 minutes ago, JohnSadlerAstro said:

    I'm afraid I'm just plain superstitious on the issue of flats taken the next morning, after 'nothing has changed'. I have never got next-day calibs to work satisfactorily. Possibly the setup is angered when the by-3am-half-dead astronomer puts sleep before photos and staggers to bed. :D 

    You mention there is vignetting; to be honest I can't see any. :) Perhaps the effects of the website compression and screenshotting have masked it, though.

    John

    Well, I'll admit it, wild horses wouldn't drag me into doing flats at the end of ten hours out under the winter stars! The darned things can wait, and always have done, other than in summer when it is impossible, here, to cool the cameras by day. If, in summer, I need new flats then I do have to shoot them in the wee hours because shooting them in an observatory at 55 degrees C is not really an option!

    But as for being superstitious - sure. Oh yes. Like most imagers I have a shedload of personal superstitions. E.g.: Never use software to do anything you can do manually, Never add a gadget to 'simplify' your rig, Never use a hub when you can add another cable, Never listen to theory when you can use experiment... I could go on!!! But I'm a dinosaur and know it.

    Olly

    • Haha 1
  8. 30 minutes ago, Whirlwind said:

    It probably depends on which side of the focuser the flattener is.

    I use both Petzval scopes (whose flatteners are on the objective side of the focuser) and flattened triplets (whose flatteners are on the camera side of the focuser) and have never found that flats were sensitive to fine focus. And think about all those robotic scope users: they can't shoot flats 'per focus'. No, I simply don't believe that flats are sensitive to fine focus at the level of refocusing carried out during a night's imaging. It simply flies in the face of everyone's experience.

    Olly

  9. 2 hours ago, kirkster501 said:

    Wow, thanks guys/Olly. What a wealth of genius this forum is.

     

    Run that past me again @ollypenrice please?  So you use a master BIAS as your dark for the flats?  Do you not use a BIAS as a "bias" for the flats as well?  You develop this flat before you start fiddling with lights?

     

     

    Yes, I use a master bias as my 'dark for flats'. The flats need this, and only this, by way of calibration. Don't subtract the bias a second time; once is just right! (There's a minimal sketch of this calibration order at the end of this post.)

    And yes, I always make my master flat before using it on a set of lights. I need to believe in it. To believe in it I need to look at it - though I can sometimes be wrong in thinking a flat is OK when it isn't, or thinking it isn't when it is. This doesn't happen often but it does happen occasionally. Even with a full frame camera the TEC with flattener gives a seriously flat field, almost free from vignetting. Any vignetting that I see comes from my tight-fisted use of 2-inch mounted filters. When I use the Atik 460 in the flattened TEC I would describe the vignetting as zero.

    You won't like this because you've already said previously that you don't like the 'one flat fits all filters' model, but when I find I have a dodgy flat for, say, green I just dump it and run the luminance flat for green. I do a lot of multi-panel mosaics and these need flat background skies. I've never had a problem doing it this way...

    Olly

    Edit: my calibration files may not be immortal but they are certainly geriatric! Flats for every night with an observatory-based mount? Nooooooooo. One month's lifespan is unlucky, six months is lucky. A year is not unknown, not that I'd ever admit it on a forum...
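    A minimal sketch of that calibration order, with synthetic numpy arrays standing in for real frames (the file handling, combine method and numbers are assumptions for illustration, not a transcript of any particular stacking program):

```python
# Sketch of the order of operations: the bias goes into the flats once, and
# the resulting master flat then divides the dark-subtracted lights.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)

# Stand-in frames; in practice these come from your FITS files.
bias_frames  = [rng.normal(300, 5, shape) for _ in range(20)]
flat_frames  = [rng.normal(20_000, 100, shape) + 300 for _ in range(20)]
light_frames = [rng.normal(1_000, 30, shape) + 300 for _ in range(5)]
master_dark  = np.full(shape, 300.0)

def make_master(frames):
    """Average-combine a stack of frames (simple mean)."""
    return np.mean(np.stack(frames), axis=0)

master_bias = make_master(bias_frames)

# The flats get the master bias subtracted ONCE (it stands in for the
# 'dark for flats'); it is not subtracted again anywhere else.
master_flat = make_master([f - master_bias for f in flat_frames])
master_flat /= np.median(master_flat)   # normalise so dividing preserves levels

# Lights: dark-subtract, then divide by the normalised master flat.
calibrated = [(light - master_dark) / master_flat for light in light_frames]
```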

  10. 5 hours ago, JohnSadlerAstro said:

    Hi,

    Does photoshop have the ability to mask an area and then apply a brightness change to that mask (like Star Tools)? If so, that's probably the best way. You might need to make several attempts at getting the mask the right size/blurriness, though.

    John

    P.S. Nice M101! :) 

    John, PS does have this ability but it also has layers and the eraser tool. For many, many processing adjustments I find this a far simpler method than masking for precisely the reasons you imply when you say,  'You might need to make several attempts at getting the mask the right size/blurriness, though.' If, instead of worrying about shaping a mask, you carry out a modification globally to a bottom layer, you can then use a soft edged eraser to remove the top layer where you prefer the bottom, modified, one. And you can see the result in real time as you do so. (As Sara once said to me, 'All you ever do is use the bloody eraser!'  Guilty as charged, m'lud.)

    Olly

    • Like 1
    • Haha 1
  11. Firstly, any anomalies are unconnected with dust on your objective, which is far too far out of focus to have any effect. (Look at that big central obstruction in the middle of the light path of a Ritchey-Chrétien!) Whatever is stopping your flats from working perfectly, it isn't that. Dust bunnies are created by contaminants much closer to the chip.

    And, although I follow Oddsocks' logic, I have never ever detected any issues arising from changes in focus between flats and capture. We refocus regularly but nobody shoots a set of flats prior to doing a refocus - do they? I've never heard of anyone doing so and they'd be pretty unpopular at star parties. If, inadvertently, you'd shot your flats at a very different point of focus then, yes, they would probably have an issue but the tiny adjustments we make to focus on an imaging run simply do not require replicating when shooting flats. Is there any chance of a big accidental change in focus?

    Are you using the TEC flattener? I do find this needs to be kept clean because it's a huge piece of glass a good way from the sensor so it can create very large, faint bunnies.

    PI always strikes me as hard work for stacking and calibrating and, on the two occasions when guests have tried to persuade me to use it, they've demonstrated it and found they'd done something wrong, so it didn't work... In AstroArt I just make a master flat by average-combining the individual flats and putting in a master bias as a dark flat. Lots of people use this short cut. It works perfectly for me. The idea is that, for short flat exposures, there is no significant difference between a dark and a bias.

     I never assemble flats in the same pass as stacking lights, though. With experience you get a feeling for when a flat is 'right' and when it's suspect. I always want to look at a master flat before applying it. Why they are sometimes wrong I don't know, but occasionally they are, and I just reshoot them.

    Your 'crescent moon' artefact would be an easy cosmetic fix in Photoshop (he claims! Famous last words). Copy the layer; make the top layer invisible and inactive. Open Curves. Put a fixing point on the curve at the value of the background sky next to the crescent (put the cursor on a bit of this background sky and Alt-click). Put a second fixing point on the curve below that. Pull down the curve above the top fixing point till the crescent is no longer brighter than the background. Make the top layer active and visible, then use the feathered eraser to remove just the crescent. Dodge and Burn brushes at 1%, set to shadows, will provide a final tweak.

    I tried on a screen grab:

    [attached image: SteveKirk.JPG]

    Any cosmetic fix is a last resort, of course. I never use them myself. Ahem.

    Olly

  12. On 5/2/2018 at 10:01, MrsGnomus said:

    I had another go at it this morning from scratch, trying to incorporate Olly's suggestion of pulling out more of the faint structures and also using Wim's idea of deconvolution.  This might be a slight improvement.  I still resorted to PS to get rid of the Lum artefact and for one or two other tweaks, like Unsharp mask.  

    [attached image: B_Whale_FIN.jpg]

    Superb job with more outer glow coming into view. I'd still be tempted to brighten and warm up the core reds a tad but this must be close to what the data has to offer (which is a lot - it's great.) My gut feeling is that this is at the limit of saturation and bang up against the noise floor. Really excellent.

    Because one of our cameras is very old and has a plethora of dead columns I'm used to clean-up operations, as Mrs G mentioned. I really think that there is very little which can't be invisibly mended! For a background repair in this case I'd just work on the LRGB, ignoring the defect, then do a quick stretch of the RGB to get the background up to exactly the same RGB values as the LRGB (Colour sampler tool in Ps for me) then paste the LRGB onto the RGB and erase the defect.

    Regarding the PI and Ps approaches I think another important thing is to pick the one you enjoy using. For me the processing is the fun part of AP and I want to enjoy it. If you enjoy using masks more than layers, or vice versa, base your choice on that.

    What a great thread and happy result.

    Olly

    • Like 1
  13. 13 hours ago, NigeB said:

    Hi Geordie85, MrsGnomus, Olly,

    Wow....

    I'm astonished - your images are fantastic - thank you so much for your efforts. I have to say this is a revelation to me; I knew I was not getting the best out of the data, but I didn't expect to see such a big difference between my outputs and those of people who really know what they're doing! I guess I'm partly pleasantly surprised that the data I took was good enough to achieve these results; it's also sobering to see how far off my own processing approach is. I've just shown my wife, and she is similarly amazed.

    Thanks also for setting out your processes in detail - this is an enormous help, and I'm going to go back and work through this from scratch this evening following your methods. MrsG... Yes, that's a flat field problem in the L frame - I took twilight flats on the first evening, but not subsequently - big mistake, because this feature appeared in the later nights. 

    Not at all! Thank you for your comments and efforts!

    Olly, am I right in thinking you're not a big fan of PI? Why is that? (I recall a post from you some years ago where you were talking about the usefulness of layers, which as far as I know, PI doesn't have).

    Thanks again; it's going to take me a while to go through the information in your posts and properly understand the logic, but I'll post an updated version of my own once I've done that.

    Much appreciated!

     

    Nigel

     

    I'm a fan of parts of PI but, in the important matter of layers versus masks, give me layers every time. Layers and masks don't resemble each other, on the face of it, but they serve the same purpose in allowing the imager to process selectively only one part of the image at a time. Masks have to be made to cover exactly what you want to be covered and this is very difficult. In Layers you modify the whole bottom layer and then simply erase, as fully or partially as you prefer, the top layer to let the modification into the result image. You can see the consequences of each touch of the eraser as you make it and can go back as you please. I also find the Ps colour selection tool a simple and powerful way of selecting certain parts of an image.

    In the early days of PI its users were often tempted into gross excess, particularly in the use of HDR wavelets, but now there are plenty of nicely processed PI images. Check out Barry Wilson's images, for instance; his processing is always subtle and invisible.

    Olly

  14. Although I get the hell out of PixInsight as soon as I can (as Mrs and Mr Gnomus know :D) I heartily agree with Mrs G's advice and her result is lovely. I also like Geordie's, but I prefer Mrs G's higher black point, lower saturation and a stretch which stays below the noise floor.

    My PI approach to RGB is always very simple: combine the colours at equal weighting, edge crop, screen transfer function 'preview stretch' then DBE and SCNR green set as low as will do the job. (The original post was very green.) However, I've recently been trying PI's photometric colour calibration and it seems convincing. After that I head for Ps.

    The background sky is a peach in Mrs G's rendition. Dead even, just the right brightness and just the right amount of grain, with no sense of noise reduction. All I can think of to try on this rendition to take it a touch further (and it might not work) would be to put a little upward kink in the curve just above the (pinned) background sky to pull out the faint outlying tidal structures, and a slight lowering of the cyans in red (Ps, Selective Colour) to perk up the Ha components. Because Mrs G has done such a good job on the stretch, my impression that there might be a tiny bit more left in the data could be an illusion. For me the perfect stretch does look as if there's a bit more left in there.

    Olly

    Edit: To emphasize a point already made, I would never stretch the colours individually. Combine first, then stretch.

    • Like 1
  15. The project would have been worth it just for that lower right hand corner of swirling dust, which is sumptuous.

    I think this is a target which is bound to look unfamiliar in NB because so much of the broadband structure is from reflection rather than emission. To see it in emission only is exciting. I'm surprised that there is so much emission nebulosity, in fact.

    Olly

    • Like 1
  16. 13 hours ago, xdjdx said:

    Hello Mickey

    I have an ASI174MM Cool and I was unable to remove the amp glow, until I discovered the following: cover the sensor with the supplied plastic cap, then cover the sensor end of the camera, up to the vent fins, with four layers of aluminium kitchen foil held in place with an elastic band. The amp glow then calibrates out with the darks produced.

    [attached image: IMG_1670.JPG]

    This is an excellent post and deserves a wide public.

    Atik provide screw-on metal caps for their chip windows, as do some other camera makers. I think they are necessary.

    Olly

     

  17. On 3/23/2018 at 11:23, Oddsocks said:

    Whenever you see 'Bas Relief' (3D) effects in subtracted or divided images, this is almost always due to mechanical movement between the two source images; in this case the two images are the flat master (or flat stack) and the target image stack.

    DBE has detected and enhanced the image artefacts resulting from mechanical differences between the master flat and the image stack. The dark areas around the galaxies are most likely due to poor placement of the sample points in DBE/ABE.

    Most likely there was some physical movement of the L filter, camera, flattener, focuser etc, either during the flats acquisition or the lights acquisition and this may have affected the entire series of flats, or lights, or just a few individual subs in either stack.

    Since you report that RGB were OK and only L was affected, I would suspect the filter wheel was perhaps knocked during the sequence. Possibly the L filter is loose in its holder, or the wheel detent is a bit loose in the L position, allowing some movement as the telescope assumes different positions for flats and target imaging; or possibly the flattener-to-filter-wheel coupling is a little loose. Even slight focuser movement can cause these effects.

    The artefacts are quite large, making me think these are dust particles further up the image chain than the filters. You can measure the donuts in the image and determine how far from the sensor the dust particles are, and that will give you a better idea of what moved (filter, the coupling between filter wheel and flattener, focuser, OTA, etc.) using the formula:

    D = P × d × f

    where D = the distance from the sensor to the dust casting the shadow, P = the width of the dust donut in pixels, d = the width of a single pixel and f = the focal ratio of the telescope. If you use the pixel size in millimetres then the distance will be in millimetres too; if you use pixel size in microns the distance will be in microns, so you just need to convert accordingly. Once you have the distance calculated you can work out what moved.

    HTH.

     

    This must be one of the most informative posts ever seen on SGL. Every day's a school day when you read an Oddsocks post! Much appreciated here. A quick worked example of the formula follows below.

    Olly

    • Like 1
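    A worked example of the quoted D = P × d × f formula (every number below is invented purely to show the arithmetic):

```python
# Oddsocks' relation, giving the dust-to-sensor distance:
#   D = P * d * f
# P = donut width in pixels, d = pixel width, f = telescope focal ratio.
pixel_size_um = 5.4    # assumed 5.4 micron pixels
focal_ratio   = 7.0    # assumed f/7 system
donut_px      = 180    # assumed measured width of the dust donut, in pixels

distance_um = donut_px * pixel_size_um * focal_ratio
print(f"dust sits roughly {distance_um / 1000:.1f} mm in front of the sensor")
# -> roughly 6.8 mm, which you would then compare with the known spacings of
#    the camera window, filters, flattener, etc. to work out what moved.
```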
  18. It's a great shame that Astro Tech can't be bothered to help you. Frankly, that's disgraceful. I can't help with the collimation, but your experience illustrates exactly why I stay away from this kind of scope and stick to simple designs.

    Is there an optical shop within reach of your home? It would be great if you could employ them not just to sort it out but to take you through the procedure.

    Olly

  19. On 3/17/2018 at 12:36, RayD said:

    You could try eliminating an issue with the OAG prism by installing the Lodestar directly in a 1.25-inch eyepiece holder in your focuser to make sure you can indeed get focus; that will then let you calculate a rough distance. It's a basic starting point but may stop you chasing around in circles.

    I would do this. I recently installed a replacement Lodestar into a robotic rig I host and it produced elongated stars. I then lent the guys my spare Lodestar and that was fine in the same OAG, so the camera has a problem, probably a tilted chip. I'd check it in a normal scope to eliminate/confirm camera and OAG problems.

    Olly

  20. 1 hour ago, souls33k3r said:

    Here's an idea, do what the animals do. p155 around the observatory and mark your territory. This should keep the opportunists away :p

    I think my observatories are in multiple ownership, then. There's a badger, a local dog, two cats and a fox making claims this way - though the fox uses a slightly different kind of marker... 

    :eek:lly

    • Haha 3