Everything posted by Filroden

  1. And the forewarning has given me time to sharpen my stake!
  2. My Canon would go through the routine every time I turned it on, so there was no need to do it more frequently.
  3. It's just how my workflow works. Because I capture L, I don't need to carry detail from the RGB (so I collect less of it and can apply a lot of blurring to reduce noise); I only use its colour information. If you collected enough RGB on its own, it would become like an OSC. But I gather L much quicker than RGB, so I prefer it for detail. When I don't have a lot of L, I have extracted a synthetic L from the RGB and combined it with the real L. Sometimes it improves the image, sometimes not. So the suggestions are possible, but given I have L data, I would end up throwing away detail. That said, the Ha is so clean I think I could use it instead of L. The other reason not to replace R with Ha is that you would get false star colours.
  4. Very little detail carries into the image from the RGB; it is only used to add colour to another layer, which could be L, Ha or a blend of both. I could use Ha in place of red for the colour, but I would then also need to use Ha instead of L, or blended with it. So more like LHa-HaGB.
  5. I fold about half a dozen sheets of white paper so they hang from the end of the scope and act as a diffuser, then I put up a completely white image on my TV screen. As long as you expose for long enough not to capture the screen refresh, it seems to work well.
  6. It's a complete redesign of the body so they can't offer an upgrade kit. They have offered $200 off a new camera if you owned the previous version. Given the camera's popularity, resale value of the older versions should be high, so it wouldn't be too much of a loss. I just need to be more careful. My flats above have been stretched to highlight the variations. The "flat" flats look more like yours (example of the L filter below). I've always had about that much vignetting but the flats seem to fully correct it, so I've not been too bothered by it.
  7. Would you believe it...the only filter with any discernible difference is the Ha filter, which has two nice blobs in the centre. There's quite a bit of dust on the sensor. It's a design flaw with the first two versions of the ZWO ASI1600 camera: ZWO included 4 desiccant tablets inside the camera next to the sensor window, so any time you unscrew the camera body there is a risk of grinding the tablets, spreading dust everywhere. The third version removes this flaw. Looks like I need to open it again.
  8. When I did some quick integrations of my M81/82 images last night, the dust was clear in the L channel but invisible in the Ha. I think it's because there is so little background in the Ha and the signal is weaker, so it's just harder to spot. Now I have time to properly calibrate all the files, it will be interesting to see the difference. I will also see if I can balance the flats and subtract the Ha from the L to see if there is any real difference. Given it's a new filter, I would hope it is clean, so the flat should mainly be dealing with dust near the sensor and the vignetting of the optics. If so, I could possibly be lazy in the future and correct Ha using the L flat. That, or I have to figure out how to get my light box working so I can increase the intensity when doing Ha flats and reduce the exposure time. 32s exposures mean I also need to correct with darks; with 1s exposures I could get away with using the bias as a dark.
  9. Thank you. It is such a richer image than the original luminance that I used. I'm going to have to study the tutorials about blending it into L and into R. I may try three versions: 1) mono Ha with stars removed and tinted red; 2) LHaRGB; 3) HaRGB, the latter to see if it retains more detail without the L channel. I've just taken flats using a white image on my LED TV. The Ha flat took 32s compared to between 1s and 2s for the LRGB. I can't see much difference between the results yet, but I've somehow managed to put more dust onto the camera, having (I thought) given the whole system a good going over with the blower when I fitted the Ha filter.
  10. Well, here's early sight of my integration of the Ha images. I only managed 34 x 30s and 22 x 60s, all taken at 300 gain, 50 offset and -20C. I haven't inspected them, rejected any images, or applied any calibration. This is uncropped and only stretched with levels and curves. I have to say, Ha data is so much cleaner to work with than LRGB! I'll be taking flats tomorrow and doing a proper integration of this and my LRGB data (not that I got much before it went behind the steam vent). I also took a quick peek at M81 and it's a breathtaking galaxy (but that one really does need calibrating!).
  11. The developer website shows the comparison, mainly differences in object databases: http://skysafariastronomy.com/ I went with the Pro but bought it when they ran one of their 50% discounts, which are quite frequent (maybe once every month or two). You can sync your settings across iDevices, but otherwise there is no need to use the cloud.
  12. How is the fog? I'm seeing lots of bad reports for the whole south of England including high pollution alerts.
  13. Well, that didn't last long. The target is already too close to my neighbour's heating vent and there is no other easy target to its east. So I've moved to M81/82 to see what I can get. Just doing LRGB as the targets are small in my fov and any Ha data will be lost.
  14. Just took my first Ha sub. What a nightmare finding focus. I had to reduce the step range on my autofocus as the stars quickly went too far out of focus to be measured. Here's a single 60s sub of NGC2239 at 300 gain, 50 offset and -20C. Edit: looks like I still might have tilt/spacing issues. The stars in the upper right are looking elongated in every sub so far (though an improvement over my previous settings, which had all four corners looking elongated).
  15. My first Ha sub. A single sub of 60s at 300 gain, 50 offset and -20C. No calibration, just a quick levels and curves stretch and no noise reduction. Going to go lie down now... Actually, I've set my first sequence to be 30 x 60s Ha, 10 x 30s RGB and 30 x 30s L. Hopefully I can repeat that sequence a few times to add to my original data.
  16. If the reboot worked it's worth one last try, though why it stops working I have no clue. I can't imagine the cold would have such a quick effect.
  17. I've just taken everything outside. It's looking like my first clear night in about three weeks. I've just recalibrated the autofocus and need to do the same for my StarSense to get accurate gotos. And then I'm hoping to get my first Ha images of the Rosette.
  18. I remember calibrating my Mac and laptop, but I hadn't calibrated the new TV. However, having run it now, the calibrated display has an overly warm hue, making whites appear a pale red. When I bring up a colour chart, I have to say the uncalibrated version looks much better than the calibrated one.
  19. Err...maybe...I'm quickly finding my Spyder after the house move and setting it running!
  20. It helps that my main monitor is a 40" 4k TV though sometimes I think images have more contrast and richer colours on the iPad.
  21. In Photoshop, if you have Noel's actions, I think it can bring the colour from the star's halo (which shouldn't be saturated) into the centre. It's pretty neat and I think I found a way to do it in PI but I've not been able to replicate it again!
  22. The more I look at it the more I think you could be right. It's not quite straight (at 1:1 scale it looks very straight but zoomed in it does show some variation). I think its position in the corner has thrown me.
  23. I think it's a reasonable assumption to make in general. We know the total distribution of stars in the galaxy (and in others), so integrating over that distribution should define the correct colour given enough stars. However, at large scale there are uneven distributions of stars (clusters spring to mind, or whether you are imaging towards the core, an arm of the galaxy, or into the halo) where one colour may dominate. So, applied with caution, it's my preferred method of balancing the colour (after background removal), but I wouldn't hold it up as being accurate. If the image were plate solved and each star's colour positively identified from a database of spectroscopic measurements, and those values were used to balance the colour, then I'd think it closer to a true colour.
  24. The one you posted yesterday, though of the two today, I prefer the first of those too. It was this area (I've cropped and boosted the contrast/brightness). It just looks too straight and suggests a stacking artifact rather than the edge of the nebula.
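As an aside on the calibration arithmetic discussed in post 8 (correcting lights with flats, and using a bias as a stand-in dark for very short flat exposures), here is a minimal numpy sketch. All names and the toy frames are my own illustration, not Filroden's actual pipeline; the key point is that dividing by a normalised flat cancels vignetting without shifting the overall signal level.

```python
import numpy as np

def calibrate(light, master_dark, master_flat, master_bias):
    """Basic frame calibration: subtract the dark from the light,
    then divide by the bias-subtracted, median-normalised flat."""
    flat = (master_flat - master_bias).astype(np.float64)
    flat /= np.median(flat)          # normalise so division preserves signal level
    return (light - master_dark) / flat

# Toy frames: a radial vignette baked into both the flat and the light.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
vignette = 1.0 - 0.3 * (((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h / 2) ** 2)
bias = np.full((h, w), 100.0)
dark = np.full((h, w), 110.0)
flat_frame = bias + 5000.0 * vignette   # flat sees the same vignetting
light = dark + 2000.0 * vignette        # so does the light
cal = calibrate(light, dark, flat_frame, bias)
# After calibration the vignette cancels and the field is uniform.
```

The same routine is why a 1s flat can use the bias as its dark: at that exposure the dark current contributes almost nothing beyond the bias offset.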
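The workflow in posts 3 and 4, taking only colour from the RGB and all detail from L (or Ha), can be sketched as a simple luminance substitution. This is a hedged illustration of the general LRGB idea, not the specific PixInsight/Photoshop steps the posts refer to; `apply_luminance` is a hypothetical helper, and the mean-of-channels luminance is a deliberate simplification.

```python
import numpy as np

def apply_luminance(rgb, new_l, eps=1e-6):
    """Keep only the colour ratios from an RGB stack and drive the
    brightness with a separately captured luminance layer."""
    old_l = rgb.mean(axis=-1, keepdims=True)    # crude luminance of the RGB data
    return rgb * (new_l[..., None] / (old_l + eps))

# One-pixel demo: colour ratios come from RGB, brightness from the L layer.
rgb = np.array([[[0.2, 0.4, 0.6]]])
new_l = np.array([[0.8]])
out = apply_luminance(rgb, new_l)
# Channel ratios (here B/R = 3) survive; the pixel brightness becomes 0.8.
```

Because only the ratios survive, heavy blurring of the RGB for noise reduction costs nothing in detail, which is exactly the rationale in post 3.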
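Post 23's balancing method, assuming a large ensemble of stars averages to white, reduces to equalising the mean per-channel star fluxes. A minimal sketch, assuming star fluxes have already been measured (the helper name and demo numbers are mine):

```python
import numpy as np

def gains_from_stars(star_fluxes):
    """star_fluxes: (N, 3) array of per-star R, G, B fluxes.
    Assuming the star ensemble averages to white, return per-channel
    gains (normalised to green) that equalise the mean channel fluxes."""
    means = star_fluxes.mean(axis=0)
    return means[1] / means

# Demo: a red-heavy, blue-weak field gets red scaled down and blue up.
fluxes = np.array([[2.0, 1.0, 0.5],
                   [4.0, 2.0, 1.0],
                   [6.0, 3.0, 1.5]])
gains = gains_from_stars(fluxes)
balanced = fluxes * gains          # channel means are now equal
```

As the post cautions, this breaks down when one stellar population dominates the field; a plate-solved, spectroscopically anchored calibration would replace the "average is white" assumption with known per-star colours.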