Everything posted by BrendanC

  1. Hey, thanks! In the meantime, here's one of them stacked with just flats... ... and just lights alone. Both have just had ABE and EZ Soft Stretch applied so they're visible. So I'm wondering whether something has gone terribly wrong with my flats somewhere along the line? I have recently taken quite a few images over an unprecedentedly clear few nights in the UK, so it's entirely possible I've mixed them up.
  2. Good idea - problem is the files are quite large, but here you go. Stacked in PI but saved as FITS to reduce the size. About 100MB each. Happy to provide anything else that might help. NGC 2403.fit NGC 2683.fit
  3. Hi all, Well the story continues. Here are two more examples. Weird gradients that don't appear in the subs and, to my eye, seem similar but not identical in each image. I've tried stacking with just bias, with darks, in APP, in PI, always the same result, very difficult to shift (if at all) using ABE or DBE. These have just had ABE applied and a quick EZ Soft Stretch. I think it's a fairly recent problem. I don't know what's causing it, and it bothers me. Any ideas? Thanks, Brendan
  4. Hi all, This is NGC 2419 taken in good conditions (no Moon, Bortle 4). I don't understand why the cluster and stars in the middle have a slightly darker ring around them. I can't shift it entirely, using ABE, DBE, or the light pollution tool in APP. Is this over-correction of the flats? Or was the sky just like this anyway ie high cloud perhaps? This is with an ASI533MC, cooled to -10C, through a 130PDS Newtonian with 0.9x coma corrector, 50 darks, 50 dark flats, 25 flats, about 2 hours' worth of total exposure, 60s subs, gain 101 and offset 50. Any takers? Thanks Brendan
  5. Hi all, So I spent quite some time recently getting to grips with the subframe selector. After much tweaking I got a nice expression that tends, on the whole, to reject 20% of frames, which is pretty much exactly what I wanted as a nice rule of thumb, at least as a starting point. So, imagine my surprise when WBPP started rejecting more frames, from among the remaining 80%. I don't want this! I want to have my carefully honed expression reject the frames, and then use the non-rejected ones wholesale. I also don't want to have to rely on Blink - I prefer the rigour of maths to the fickleness of my eyes. I've looked around, searched online, but cannot find anything that just simply switches this 'feature' off. Turning off measurement doesn't work, nor does selecting a different weighting formula, which is what is kind of implied (for an earlier version) here: https://pixinsight.com/forum/index.php?threads/wbpp-2-4-1-measures-despite-being-asked-not-to-measure.18225/ Any takers? Thanks Brendan
  6. It's actually better with the darks. Believe me, I've been through all those hoops with creating dark libraries, light leakage and all! Thanks again.
  7. Hey, thanks both for your help! @ONIKKINEN, I really appreciate you doing all this. I was just using bias, but I did a test with darks yesterday and it did show an improvement. I always used to use them, I'm just a bit lazy too! It's just that I'd noticed very little difference, if any, with just using bias because of the 533's zero amp glow. I guess that for some images it does make a difference. So, I'm going back to 'proper' calibration. I also stacked another image last night and it came out fine, so I think this was light pollution, which again is a lot clearer to see for some reason when I stack with darks. It's just that horrible feeling when you think something is wrong, but you don't know which of the many possible things, or combinations of things, it might be. So, all good, and I'm just going to carry on. Totally get what you're saying about the sensor not being exactly central, but if what I'm getting through is workable, I'm not going to start making micro adjustments here and there. Thanks again both of you. :)
  8. Excellent, thank you so much for this. As I say, other images have calibrated out fine, so maybe I need to look at something else. I haven't had the camera that long so it could just be light pollution from shooting in a different area of sky. I just needed a check on the flats as a first step. Thanks again.
  9. [originally posted another file here but not needed]
  10. Hi all, I've been enjoying my ASI533MC Pro, but I've recently noticed vignetting and gradients at the edges that I'm finding hard to remove. The flats should be fixing this, I think, but they're not. So, could someone take a very quick look at the attached flat, give it a bit of a stretch, and tell me whether it's OK? Cos it looks like my sensor might be way out of centre, and I need to collimate. This is through a 130PDS with a 0.9x coma corrector, and a 2-inch SVBONY UV/IR cut filter, created in APT using the flats aid to get to 19,000 ADU, and a Lacerta flat panel. I've been stacking with just bias, no darks cos I tested this and there's no discernible difference (I also stacked with darks and got the same result). If it helps, I've attached a bias and image file too. I've checked previous flats, with this and another dualband filter, and get the same pattern, but I'm sure I didn't have these issues. So, it might be something else. I don't know. I've collimated the heck out of this thing but if I'm too far out, I might just have to give it (yet) another go. (See the flat-centring sketch after these posts.) Thanks, Brendan F_26103_0.28379s__-10C.fit L_26136_60s__-10C.fit B_22587_1s__-10C.fit
  11. Nice! I've managed to incorporate the formula successfully into my planning spreadsheet, so now I can quickly tell how far a proposed object is from the Moon on any given night.
  12. Hey, thank you! It's all about the search terms I guess - didn't even know about angular distance. This could be exactly what I was looking for - and proves it wasn't quite as easy as I initially thought. Ah, post-edit... I don't understand how he's getting his sine, cos etc values. For example, he's saying that d1=-16.58. Then he says sin(d1)=-0.285. If I do sin(-16.58) in Excel, I get 0.768. This is the same across all the functions - very different results. There's clearly something very basic I'm doing wrong here. I've tried converting to radians etc, and cannot understand how he's obtaining those values for sins and cosines. Any ideas? Post-post edit... The input to the sine etc functions needs to be in radians, so I just convert from degrees to radians, then convert the answer back into degrees. All good. (See the degrees-to-radians sketch after these posts.)
  13. After more thinking, this isn't about distances, it's about angles. The smaller the angle, the closer the objects. The largest angle would be 180 degrees, that is, on opposite sides of the sky. So, if I know the difference between the alt for two objects, I have one angle. Same with the az, although I would need to start counting down when it goes past 180 degrees. So given those two angles, would that be sufficient to derive the angle between the two objects? I have angle up, and angle across. Can I then get the 'diagonal' (for want of a better word; as I said, I'm no mathematician) angle? I have a feeling this is either absurdly simple, or fiendishly difficult. Stop me if I'm talking nonsense again.
  14. Hi all, I'd like to use a spreadsheet to calculate the distance between a given DSO, and the Moon, in the sky. Sounds easy enough. At first I thought this was simple: just do something like hyp squared = x squared + y squared, figure out what x is (whether RA1-RA2 or Az1-Az2), figure out what y is (whether Dec1-Dec2 or Alt1-Alt2), and you're done. Which would work if this were a regular flat grid with Cartesian coordinates. But it isn't, it involves polar coordinates for RA or azimuth. I think. I'm not a mathematician. The more I think about this, the less I understand the problem or answer. I think that essentially the problem is calculating the direct line distance between two points on a globe. I've found some formulae that calculate an equivalent problem - between two points on Earth, using lat and long - but they go around the surface of the globe, whereas I want to go direct. I think. As for the answer, well I don't even really know what units this would be in! Degrees? Arcmin? It's sort of like asking how big the sky is. So, are there any physics/maths geniuses who could offer some insight into this? Ideally with some sort of formula that I could implement in Excel to help me with it? Or if someone could tell me this is a nonsensical notion to begin with, that would be fine, and it would help me to stop fretting about this. Of course, I could just fire up Stellarium and look, but that would be too easy... (See the angular-separation sketch after these posts.) Thanks, Brendan
  15. I've only ever stacked in PI using WBPP, but I like it. I watched the full, free Adam Block WBPP intro series on YouTube, and he says somewhere in it that there is no difference whatsoever in the result, whether you do it manually (the hard way) or using the script (the easy way). It's initially quite intimidating, like everything in PI. There are some foibles I still don't like, for example always having to tell it to debayer lights, and it takes a very, very, very, very, very long time to stack (even longer than APP, believe it or not), but other features are great, mainly the ability to just load files and have it read the FITS headers and assign everything accordingly. So, no need to upload bias, darks, dark flats, lights etc. separately - just chuck it all in and, provided you've got the right types in the header, it all automagically sorts everything out. You can even point it to a folder and it will pick up everything in there and do its thing. This means you could, if you wanted to, load up your entire bias, darks and dark flats library, and it will automagically assign the right ones to the flats and lights for that session. In fact, that's what I've done, except I only use bias frames now after lots of comparing (thank you ZWO for the amp glow reduction circuitry in the ASI533MC), so I just keep the master bias in place, load up lights and flats, and go. Very handy. (See the FITS-header sketch after these posts.)
  16. Thanks for doing this, really appreciate it. I just stretched your XISF file and cannot see the diagonals! This is interesting. The only step I haven't done is LinearFit, so I'll look into that. In the meantime, I've stacked the files in APP and I don't get the artefacts. So, that implies acquisition isn't a problem. But, when doing that, I think I might have found a stray flat file hidden among the flats that might be causing an issue. So, I've got PI back on the case, currently restacking, will try Linear Fit, and see how I get along. Thanks again!
  17. I'm pretty new to PI so it's entirely possible I'm doing something wrong, although I'm familiar with DSS, Siril and APP so I know about stacking. In fact, I'm stacking the files right now in APP to see whether there's any difference. Would be great if you'd like to take a look: https://1drv.ms/u/s!AqovBuVZMwj3mJIg9H5vuCW4cXpa1Q?e=Fs8Imm That folder contains the XISF and separate TIF files, so just grab what you want. I'd be very interested to see if you can replicate this, and if you can get it to work, how you did it. Thanks!
  18. Hi all, I'm completely at a loss as to what's going wrong here. I've got a load of RGB data, and Ha data, from an OSC camera (ASI533MC Pro). Within PixInsight, I stacked all the RGB, and all the Ha, using WBPP. Then, I've extracted the G and B from the RGB, and the R from the Ha, aligned them using Star Align, and used Channel Combine to bring them together with the Ha for R. All good so far. HOWEVER, when I start stretching, I get this bizarre set of parallel, diagonal stripes. I cannot figure this out at all. None of the individual channels show this. The example image is just combined, stretched in GHS, and then some curves applied to show what I mean. Any ideas? I'm stumped. I've added another version with yellow lines to emphasize where the lines are. (See the channel-swap sketch after these posts.)
  19. Splendid, thanks all. I did a LOT of thinking before getting the ZWO Duoband. It is very wide for something called narrow! 15nm Ha and 35nm O3. Still, given I'm in a Bortle 4, and it was a lot cheaper second-hand than the really narrow Optolongs and Antlias and Altairs, I guessed it would be ok as an intro to dualband imaging with an OSC, and I've been pleased with the results so far. It's just that I've increasingly been wondering at what point there's a crossover between when to use just Ha, and when to use dualband. Looks like there isn't one. Plus, at some point I will very probably upgrade, probably to the Altair because I like their quality assurance certification, especially after seeing Cuiv's videos about filters, so I wanted to get a grip on this before I did anything new.
  20. This is kind of what I've been thinking too - just shoot duoband and then mix with RGB if I need to, for example for stars. I very probably am thinking too much about it! But it's been bugging me for a while and I wanted some opinions, so thanks for this. So, would you ditch the Ha filter, or the duoband filter? Or keep both?
  21. Hi all, I started out with a modded DSLR three years ago, then got a 7nm Ha filter - all good: shoot galaxies, planetary and reflection nebulae and clusters as just broadband RGB without the Ha filter, use the narrowband filter for emission nebulae to add a ton of definition to the red channel. Then I had an ASI1600M for a short while. Didn't take to mono at all, but understood that the broad distinctions still applied, except that I could also shoot S2 and O3 for emission nebulae. So, I took a step back and now have an ASI 533MC Pro, which is great. I got a ZWO duoband filter too, and have enjoyed using it for a few months. But now, I'm starting to realise, I'm a bit confused as to exactly when is best to use each filter. I expect I'll continue with broadband as before - galaxies, planetaries, reflection, clusters. I also get that if the Moon is above around 50% phase, I should use the Ha filter because the O3 of the duoband could be wiped out. So, that means I'm back to where I started, doesn't it? That is, broadband RGB, or broadband RGB with Ha. So, what's the advantage of using my duoband filter? At what point in between purely Ha when there's a Moon, and purely RGB when there's less Moon, should I be using duoband? And what's the real benefit? Thanks, Brendan
  22. Hi all, I've used the scope simulator a lot in the past, for putting together mosaics. It's always worked fine. Tonight I was using it for a two-pane mosaic of M45. The rotation said 104.4 degrees. However, when I looked at the preview from my camera, it was quite a way out from what I expected. I noticed the FOV angle said East, so on a hunch, wondered if this translated to 75.6 degrees (i.e. 180 minus 104.4). On rotating my camera to that angle, it was as per the Telescopius preview. So, I don't understand why the scope simulator gave me a different angle from the one that actually worked. I've looked and can't find a way to make the simulator give a FOV angle West, in case that would change things. Any ideas? Thanks, Brendan
  23. Nice one, thank you! OK, so I'm feeling my way through this, and I don't know which metrics to select in the drop-down menus (there are two?). I can't find Electrons / Data numbers or Normalised (0-1). Which two(?) would I select just to get an average ADU? (See the average-ADU sketch after these posts.)
  24. Thanks, and I'm starting to wonder whether I might use my JitBit macro recorder to do something similar.
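
Sketches referenced in the posts above

Flat-centring sketch (for post 10): a rough way to check whether the illumination in a flat peaks near the middle of the sensor, assuming Python with numpy and astropy installed. The filename matches the flat attached in that post; the 64-pixel block size is an arbitrary smoothing choice, and this is only a sanity check, not a substitute for a proper collimation test.

    import numpy as np
    from astropy.io import fits

    # Load the raw flat (still Bayer-mosaiced on an OSC camera, which is fine here)
    data = fits.getdata("F_26103_0.28379s__-10C.fit").astype(float)
    h, w = data.shape

    # Block-average to smooth out noise, dust motes and the Bayer pattern
    bs = 64
    small = data[:h // bs * bs, :w // bs * bs].reshape(h // bs, bs, w // bs, bs).mean(axis=(1, 3))

    # Where is the brightest part of the smoothed flat, versus the sensor centre?
    py, px = np.unravel_index(np.argmax(small), small.shape)
    print(f"Illumination peak near ({px * bs + bs // 2}, {py * bs + bs // 2})")
    print(f"Sensor centre at       ({w // 2}, {h // 2})")

If the peak sits a long way from centre on every flat, off-axis illumination or collimation would be worth chasing; if it is roughly central, the gradient problem more likely lies elsewhere.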
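Degrees-to-radians sketch (for post 12): the discrepancy described there is just the unit convention of the trig functions, which expect radians in both spreadsheets and most programming languages. A minimal check in Python, using the d1 = -16.58 value quoted in the post:

    import math

    d1 = -16.58  # declination in degrees, from the worked example quoted in post 12

    print(math.sin(d1))                # wrong: treats -16.58 as radians, giving a spurious positive value
    print(math.sin(math.radians(d1)))  # about -0.285, matching the worked example

In Excel the equivalent is =SIN(RADIANS(-16.58)), and results of ASIN or ACOS can be brought back to degrees with DEGREES().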
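Angular-separation sketch (for post 14): the quantity being asked about is the angle between the two directions on the sky, and it comes out in degrees (up to 180). The spherical law of cosines gives it directly from RA/Dec; the same formula works with Alt/Az (altitude in place of Dec, azimuth in place of RA) for a single moment in time. A minimal Python version, with made-up example coordinates rather than real ephemeris values:

    import math

    def angular_separation_deg(ra1, dec1, ra2, dec2):
        """Angle in degrees between two sky positions, all inputs in degrees."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        cos_sep = (math.sin(dec1) * math.sin(dec2)
                   + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
        # Clamp to [-1, 1] before acos to guard against floating-point round-off
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

    # Hypothetical DSO and Moon positions, purely to show the call
    print(angular_separation_deg(83.8, -5.4, 120.0, 15.0))

In a spreadsheet, with Dec1, Dec2, RA1 and RA2 as cell references in degrees (multiply RA in hours by 15 first), the same line becomes =DEGREES(ACOS(SIN(RADIANS(Dec1))*SIN(RADIANS(Dec2)) + COS(RADIANS(Dec1))*COS(RADIANS(Dec2))*COS(RADIANS(RA1-RA2)))).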
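FITS-header sketch (for post 15): WBPP's automatic sorting depends on the frame-type keywords written by the capture software, so it can be worth eyeballing them before pointing WBPP at a whole library. A small sketch assuming astropy; the folder name is a placeholder, and the exact keyword (IMAGETYP here) depends on what your capture software writes.

    from pathlib import Path
    from astropy.io import fits

    # Placeholder folder; point it at your own calibration/lights directory
    for path in sorted(Path("calibration_library").glob("*.fit")):
        header = fits.getheader(path)
        frame_type = header.get("IMAGETYP", "unknown")
        exposure = header.get("EXPTIME", header.get("EXPOSURE", "?"))
        print(f"{path.name}: {frame_type}, {exposure}s")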
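Channel-swap sketch (for post 18): not what PixInsight does internally, just a numpy illustration of the same recombination (the Ha stack supplying red, the broadband stack supplying green and blue), assuming the two stacks are already registered and saved as 3-plane FITS files with these hypothetical names.

    import numpy as np
    from astropy.io import fits

    rgb = fits.getdata("rgb_stack_registered.fits").astype(np.float32)  # shape (3, H, W)
    ha = fits.getdata("ha_stack_registered.fits").astype(np.float32)    # shape (3, H, W)

    combined = np.stack([
        ha[0],   # red taken from the Ha stack
        rgb[1],  # green from the broadband stack
        rgb[2],  # blue from the broadband stack
    ])

    fits.writeto("hargb_combined.fits", combined, overwrite=True)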
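Average-ADU sketch (for post 23): as a cross-check independent of the drop-down unit choices, the mean ADU of a frame can be read straight from the file, assuming numpy and astropy; the filename is a placeholder.

    import numpy as np
    from astropy.io import fits

    data = fits.getdata("flat_frame.fit")
    print(f"Mean ADU:   {np.mean(data):.0f}")
    print(f"Median ADU: {np.median(data):.0f}")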