Everything posted by pipnina

  1. My dad (retired) treats the garden as his baby. In previous years I've grown things in there too (only things I could eat later, mind you), but I quickly got tired of sowing 25 plants and only getting a few ears of corn for it. Now I just take over his potting shed with astro bits. The tree in the background is an apple tree that predates the house (the land used to be an orchard). Sadly, while it looks to have character from a distance, it is likely to die in a few years as the trunk is hollow in various places (big enough for birds to nest in, and they have! Though with little success due to the resident cat).
  2. At this point, I would suggest maybe getting a mini PC that's x86-based? For example: https://www.ebay.co.uk/itm/114733154309 Select the Core i3, 4GB of RAM and the 120GB SSD. It costs less than many RPi boards and kits with the same memory, probably has a faster CPU, and also includes storage, a fair bit of it, which is going to be faster than the SD card on a Pi. And as a bonus it's still quite small! I am tempted to give this a go myself! For N.I.N.A. users, simply use the provided Windows install on a unit like these, although for that it might be worth getting them configured with 8GB of RAM, and enjoy! Myself, I'd be putting something like Ubuntu MATE or similar on them and configuring x11vnc to start at boot. This would let me control it remotely as I do with my RPi and use Kstars + PHD2. Given the stock issues with the RPi, it might be worth a go!
  3. You might be the first person I've spoken to who has any familiarity with the name at all! The people at FLO, as well as Es Reid, the optical specialist they partner with, had no idea what the company was. They'd never heard of it! I got it basically third-hand, so my chain of knowledge to the scope's origin is broken. I don't know how old it is or anything! I just know it has been a source of great pain, and I may as well have bought a new triplet instead of this second-hand one with all the money I spent getting it up to standard 😕
  4. My SET Optics 130mm triplet after a night of imaging. Cable management is hard, but at least it works! I seem to always end up with scopes comically oversized for my poor abused HEQ5; those two 5kg counterweights don't come close to balancing this!
  5. The best advice I can think of is to use a program like Stellarium to find objects you want to see on a given night, make a note of them and what constellation they're in, and then draw out some star hopping routes so you can find the object in the eyepiece when you're finally in the field. It will take some practice as star hopping is a skill, and the telescope AND finder (on most dobs) will have their field flipped vertically and horizontally! I'd look at brighter objects like M81/82, M51, the Leo Triplet, etc. at the moment, and soon you'll be looking for the Ring Nebula and such. Happy hunting!
  6. Also a cracking image! I notice a good trend of RASA owners and this object emerging. I'm starting to think I should have picked one up and put up with manual filter swapping back in September when I went with my APO instead! Certainly it's very hard to pass up the speed of f/2 imaging given the UK's weather! How have you balanced the colour in this image? I notice the other two examples in this thread have a very neutral grey colour in the IFN, while you have an almost golden colour in yours. I can only assume at least some of this comes down to artistic interpretation in the various images. I guess the most accurate interpretation we could make would be using PixInsight's SPCC tool after background extraction? I can see some reddish hints in areas of your image though, which, as you say, could be little bits of hydrogen?
  7. Cracking shot Robin! Definitely looks like IFN to me. I think I might have to suggest something on the Stellarium GitHub, because it's currently listed as an HII region... I could possibly give it a try; what regions could you recommend I point at? I chose this area because it is one of the few non-galaxy objects I can point at at the moment haha. I know there is IFN around Polaris and near M81/82, but is it bright(er) than this object, I wonder?
  8. Whoops! Thankfully I can correct the title haha. Given how hard it has been for me to pick out IFN in the past, I am very surprised to have caught some while using a Ha filter! It must be a very bright piece of IFN!
  9. With the big moon and uncertain weather last night, I decided to try and find a narrowband target to make use of the relatively clear weather. At this time of year, of course, that is no easy task! I eventually found two objects in Stellarium, in the LDN and SH2 catalogs, which seemed promising, were in a good sky position and would fit in my scope's FOV! Perfect.

     One night's imaging later (and half my subs thrown away due to a frosting sensor window! Not doing -15°C again!) I have my subs and put the scope to bed before heading to work. At work I look up these objects to see what I might expect from my images. To my horror, they look like IFN objects, which I was shooting with a 3nm Ha filter!

     Fully expecting a blank void, I download the subs to my PC when I get home, calibrate and stack them. The SNR is poor... but is this an object I detect anyway? It looks like *something*, but even binned 4x4 the image has such a low SNR it's hard to make out. I will need more assistance! So I use RawTherapee's denoise features and turn them up to 11, and in a separate attempt I use PixInsight's annotation feature. The grainy blob in the middle of my image DOES line up with where pix expects SH2-73 to be!

     My image is not pretty, but given I only ended up with 2h40m of subs to include, and possibly of the wrong wavelength bandpass, I think I am pleased! I'll include the downsampled, unedited TIFF if anyone else wants their own look. Now I wonder if Stellarium is lying about the objects being HII regions, or if my idea of what an HII region is, is wrong! In my mind, HII regions are clouds of hydrogen that glow mostly in H-alpha! master.tif
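A minimal numpy sketch of what the 4x4 binning mentioned above does, assuming a single-channel float image (the function name is just illustrative): averaging each 4x4 block trades resolution for roughly a 4x per-pixel SNR gain when the noise is uncorrelated.

```python
# Minimal sketch: 4x4 software binning by averaging, as mentioned in the post above.
# Assumes a single 2D (mono) float image; averaging 4x4 = 16 pixels gives roughly
# a 4x per-pixel SNR improvement when the noise is uncorrelated.
import numpy as np

def bin_average(img: np.ndarray, factor: int = 4) -> np.ndarray:
    """Average-bin a 2D image by `factor` in both axes (trimming any remainder)."""
    h, w = img.shape
    h -= h % factor
    w -= w % factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Example: a synthetic noisy frame's noise sigma drops from ~1.0 to ~0.25 after binning.
frame = np.random.normal(loc=100.0, scale=1.0, size=(1024, 1280))
print(frame.std(), bin_average(frame).std())
```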
  10. Whenever I had issues like this with DSS, I did a factory reset and stacked things again. Set everything back to default and uncomplicated settings. Use the stacking mode where DSS won't do any cropping, use bicubic debayering, don't use dark pixel removal or anything, also only use "average" stacking methods and make sure the output directory is set correctly. Also as Elp says, ensure the issue isn't in the images being fed to DSS too! DSS is a wonderful free program but I did also do my fair share of cursing at it as well haha.
  11. Stellarium misclassified LDN 106 and SH2-73 as HII regions, when in reality they are IFN. 3+ hours of Ha imaging last night down the drain 😕

  12. I've never heard of someone using a Ha filter for visual successfully; maybe with night vision scopes it could be effective. The problem is that while the colour vision of humans is relatively sensitive to 656nm, our night vision is not. Our cones (colour-seeing cells) are not very sensitive in general, and become useless below a certain light level. To allow us to maintain some night vision, we evolved to have rod cells too. These are much more sensitive but only see one colour, which turns our vision black and white. Note also how the grey curve (the rod cell response spectrum) loses essentially all sensitivity above 600nm; this means any Ha object would need to be bright enough to be visible to our cones if we were to observe it with our eyes! An object of such brightness does not exist besides bright stars/the sun. However, most (all?) objects that emit Ha also emit Hydrogen-beta, which is around 486nm. While this line is only about 1/3 as bright as Ha, it falls in our rods' most sensitive wavelength band, which means some hydrogen nebulae can be observed much more easily with a Hb filter! M42 is likely to be one of these nebulae.
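A rough back-of-envelope version of that argument in Python: the scotopic (rod) efficiency numbers are approximate readings of the CIE V'(λ) curve, and the 1/3 factor is the typical Hb-to-Ha brightness ratio mentioned above, so treat the output as illustrative only.

```python
# Rough illustration of why an Hb filter can beat an Ha filter for visual use.
# Scotopic (rod) efficiencies are approximate values from the CIE V'(lambda)
# curve; the 1/3 factor is the typical Hb/Ha brightness ratio. Illustrative only.

scotopic_efficiency = {
    486: 0.8,      # H-beta: close to the rods' ~507 nm sensitivity peak
    656: 0.0005,   # H-alpha: far into the region where rods barely respond
}

ha_brightness = 1.0        # take the Ha line as the reference brightness
hb_brightness = 1.0 / 3.0  # Hb is roughly a third as bright

rod_response_ha = ha_brightness * scotopic_efficiency[656]
rod_response_hb = hb_brightness * scotopic_efficiency[486]

print(f"Rod response to Ha: {rod_response_ha:.4f}")
print(f"Rod response to Hb: {rod_response_hb:.4f}")
print(f"Hb looks ~{rod_response_hb / rod_response_ha:.0f}x brighter to dark-adapted eyes")
```

Even with only a third of the photons, the Hb line ends up hundreds of times more visible to dark-adapted rods on these assumed numbers, which is the whole case for Hb filters on nebulae like M42.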
  13. I'm sure there are sprays and such we can use to deter them... I hope! My dad's shed has had a spider or two but thankfully it has stayed mostly arachnid-free. Or at least the spiders are hiding and small...
  14. Wow that really shows what a competent processing wizard can do! What denoise algorithm are you using? I have struggled to find one that doesn't make the whole image look like a jpeg artifact, even the one pix includes.
  15. I just re-calibrated the data with the proper input pedestal and re-stacked twice. The first dataset is all subs from both nights thrown into WBPP; the second is only the first night's subs (so 12 subs per channel for the first-night-only image vs 45-ish for the all-data image). I cropped them down to 1280x1024, so this is now only a 30MB total download. I will be honest: when I look at individual channels from the stack, autostretched, I can see the difference in SNR much more easily. I guess it all threw me off because it doesn't look like a 5x integration difference in SNR! All_data_correct_calibration_crop.tif First_night_only_correct_calibration_crop.tif
  16. I did some looking through the settings: WBPP has an output pedestal, which adds a certain ADU *post*-calibration, and the separate ImageCalibration tool has both output *and* input pedestals. Setting the input pedestal to match my camera offset does appear to reduce the overcorrection, albeit a bit hard to see given how dark and noisy the subframe is to begin with! I stacked the blue frames that I calibrated with the input pedestal and hit autostretch (right), and compared it to autostretched blue on a previous stack with no pedestal in calibration (left). Thank you very much for telling me this exists! I have been banging my head against this flats problem for what feels like forever haha.
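To see why matching the camera offset helps, here is a minimal sketch of the underlying arithmetic with made-up ADU values (not a claim about exactly how PixInsight applies its pedestal settings): any additive offset left in the lights gets divided by the flat along with the signal, so the vignetted corners come out too bright, which looks like an overcorrecting flat.

```python
# Made-up ADU numbers showing why a leftover additive offset makes a
# normalised flat appear to overcorrect. Underlying arithmetic only;
# not a description of PixInsight's exact calibration pipeline.
import numpy as np

flat = np.array([0.80, 0.90, 1.00])   # normalised flat: 20% vignetting at one edge
true_sky = 100.0                      # uniform sky signal actually hitting the sensor
offset = 256.0                        # camera offset / pedestal still present in the lights

light = true_sky * flat + offset      # what the uncalibrated light frame records

# Dividing by the flat with the offset still present boosts the vignetted
# edge more than the centre -> bright corners, i.e. "overcorrected" flats.
print(light / flat)             # [420.   384.44 356.  ] - no longer flat

# Remove the offset first (darks/bias, or an equivalent pedestal setting)
# and flat division recovers the uniform sky.
print((light - offset) / flat)  # [100. 100. 100.]
```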
  17. As in normalise both stacked results with a linear fit, and register them so flicking between them in an image viewer or Photoshop will let me see a flickering 1:1 comparison? It'd be worth a shot! Transparency might have been lower for night 2, but I don't know about a whole Bortle lower!

      Might help here to see the subs: the sub with the full date stamp in the file name is from the second night; the sub that's just M51_Light_Blu is from the first night. I notice, putting them to the same stretch level in Kstars, that the subs from night one are definitely darker, but I am not sure how much the signal in M51 is being attenuated by; it doesn't look like a 5x loss to me, but my eyes are not so keen for this sort of thing. I found two of the darkest subs in the middle of each dataset and put them side by side at the same stretch, and they look similar in terms of signal and noise, but the new dataset is definitely brighter. And looking at the stats, given my offset of 256 (which yields minimum ADUs around 40-60 on my cam), it seems that the average ADU difference between the two nights is only about 80-90 ADU. It seems a bit suspicious that it would make such a big difference in the final product.

      As for my overcorrecting flats... I am still experimenting with the issue. I think I need to apply darks to my main data and bias frames to my flats (flats are around 0.05 to 0.01 second exposures, so flat-darks are maybe not so necessary?). Getting good, uncontaminated darks is quite hard though. I've buried my camera before and still seen light get in (I even stuck some socks over the front of it in a dark room once while trying haha). It is a bit odd, because my much noisier, stronger-dark-current DSLR had no issue with flat calibration, despite me never using darks or bias frames! M_51_Light_2023-04-23T23-43-01_001.fits M_51_Light_Blu_001.fits
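For anyone following along, a small sketch of what the "linear fit then blink" suggestion amounts to, assuming the two stacks are already registered and loaded as single-channel float arrays (PixInsight's LinearFit process is the proper tool; the function below is just the underlying idea, and the variable names are placeholders):

```python
# Sketch of matching one stack to another with a linear fit before blinking them.
# Assumes both stacks are already registered, single-channel, and loaded as float
# arrays; PixInsight's LinearFit process does this properly - this is only the idea.
import numpy as np

def linear_fit_match(reference: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Return a*target + b, with a and b chosen by least squares to match reference."""
    a, b = np.polyfit(target.ravel(), reference.ravel(), deg=1)
    return a * target + b

# Hypothetical usage (array names are placeholders for the two loaded stacks):
# matched = linear_fit_match(reference=night_one_stack, target=both_nights_stack)
# Flicking between night_one_stack and matched in a viewer then gives a fair
# 1:1 comparison of noise and signal.
```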
  18. Hm, ok, maybe the processing will mask it a bit. I just put the two nights together into a simple LRGB combination without stretching, and the second night by itself in another simple LRGB combination. I stacked the two nights combined by adding all the raw files into the WBPP script, so I avoided trying to stack two master lights (PIX refuses to stack fewer than 3 images anyhow). Maybe this makes it a little easier; I don't really see any appreciable difference myself. both-night-raw.tif 2nd-night-only raw.tif
  19. I have been trying to capture a nice and clean image of M51, and in my first session I caught an already quite nice image with only 1.5 hours in RGB. On the 23rd, night 2, I managed to bring in over 5 hours! I figured I would see a big improvement sticking the two stacks together... Alas, there appears to be no improvement at all 😕 Could it simply be a matter of the second night having that much worse SNR per sub, or could it be difficulty getting multiple nights to work well with each other in pix in general? When I stack night one and night two individually, they do look very similar; I am struggling to tell a difference SNR-wise. Is it just bad luck, or am I realistically limited to one night per image or per colour? Thanks. Image13 is both nights together with a quick pix process (ABE, SPCC, stretch and colour boost only); the other Image13 is, as described, only data from the second night. Image13-2nd night only.tif Image13.tif
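A quick back-of-envelope check of what improvement to expect, assuming the subs from both nights are of similar quality (which may not hold here) and that SNR grows roughly with the square root of total integration time:

```python
# Rough expected SNR gain from combining the two nights, assuming comparable
# sub quality and SNR ~ sqrt(total integration time). Purely back-of-envelope.
import math

night1_hours = 1.5   # first session, RGB
night2_hours = 5.0   # second session (the "over 5 hours" night)
total_hours = night1_hours + night2_hours

gain_over_night1 = math.sqrt(total_hours / night1_hours)   # ~2.1x
gain_over_night2 = math.sqrt(total_hours / night2_hours)   # ~1.1x

print(f"Expected SNR gain vs night 1 alone: ~{gain_over_night1:.1f}x")
print(f"Expected SNR gain vs night 2 alone: ~{gain_over_night2:.1f}x")
```

On these assumptions the combined stack should only be around 10-15% better than the 5-hour night on its own, which would be very hard to see by eye, so two similar-looking results are not necessarily a sign that anything went wrong in the combination.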
  20. A lot of heavy technical discussion in this thread. I'm happy to enjoy the great picture myself!
  21. My Raspberry Pi with Astroberry runs Kstars, which is planetarium software with an embedded astrophotography tool called Ekos. It has a lot of features: a live guidecam feed, plate solving, autofocus, image capture, goto, assisted polar alignment, built-in mount guiding (or connected to PHD2), and even a scheduler which lets you set up a target, an image capture sequence and start/end times, and it will automatically start capturing the target at the programmed start time. On Astroberry it's quite stable, but on my desktop the updates bring more and more problems; I think my desktop must have ended up running beta branches or something.

      I control Kstars on the Pi by viewing its desktop in a web browser on my PC indoors. I set my home router to always assign the Pi's MAC address the local IP 192.168.0.33, so I type that into Firefox and the Pi's desktop appears!

      As for board cost, how the mighty have fallen! I remember the original RPi Model B in 2012 or so cost £30 for the board, as the higher-memory version. I only paid around £80 for my RPi 4 2GB with enclosure (a bad enclosure that I replaced; the metal in it acted like a Wi-Fi Faraday cage haha). There is a YouTube video on RPi alternatives at much lower cost; it is only a roundup, but it seems in some cases we can ditch the RPi for one of the alternatives.
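Since the router always hands the Pi the same address, a small check like the following (a hypothetical convenience script; 5900 and 7624 are the usual defaults for VNC and the INDI server, and may differ on a given setup) can confirm everything is up before walking out to the scope:

```python
# Quick reachability check for the Pi on its reserved address before a session.
# The IP is the one fixed in the router above; 5900 (VNC) and 7624 (INDI server)
# are typical defaults and may differ on your setup. Hypothetical convenience script.
import socket

PI_ADDRESS = "192.168.0.33"
PORTS = {"VNC": 5900, "INDI": 7624}

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in PORTS.items():
    state = "reachable" if port_open(PI_ADDRESS, port) else "not reachable"
    print(f"{name} ({PI_ADDRESS}:{port}): {state}")
```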
  22. You are right, it would be a bit redundant in your case where you'll be next to the scope. I was having a slightly confused moment: I use my RPi 4 to control the scope from a distance and didn't make the connection that your and my use cases weren't quite the same haha. As for the usability: it's honestly really good. If you use Astroberry, which is a fork of the Raspbian project that has a desktop, Samba server, wifi hotspot and even a VNC server set up out of the box, you just have to burn the Astroberry disk image to the SD card, plug it into the Pi, put the power on, and in a minute or so the "astroberry" wifi hotspot appears. If you connect to that, you can open a web browser or a VNC viewer app and connect to the Pi's IP address, and it will present you with the login screen, with some configuration of the INDI server (telescope hardware drivers) on the left. I find it very convenient and flexible, not to mention reasonably priced, but as you say the use case is limited in your situation. Also, re availability: yeah, I struggled to get one and settled on the 2GB model... only to find out that I *really* needed the 4GB instead. It is a very neat setup though.
  23. I'm quite fond of my Raspberry Pi for scope control, but that does require a laptop, or it can be a bit fiddly to control via a VNC app on a phone. You also need the 4GB Pi 4 at a minimum, and to source a good-quality 5V 3A supply and cable.
  24. I think wind affects it more than we think. Even in winds Clear Outside classes as green, we can get gusts a few times in a sub that wobble things about a bit.
  25. Given that you are using PixInsight, I am surprised Spectrophotometric Colour Calibration hasn't been mentioned yet! It works absolute wonders for colour rendition, even under limited-signal conditions. I processed this for a friend (not my data), who uses a modded D3200 (full spectrum). The signal is clearly limited, I think more so than in your image, yet with PixInsight SPCC the colour looks very natural and vibrant, especially in the Cigar!