Everything posted by ONIKKINEN

  1. Pretty good sharpness and SNR, how much data is that? Looks quite nice already. One thing you could do if the magenta stars bother you is an inverted SCNR. I've used that for OSC processing a few times when I made some mistake that turned things too magenta. To do that you first invert the image, after which magenta turns green, then you run SCNR green and invert back. You will then find that the magenta is gone, but as a caveat, any actual magenta you wanted to keep is gone too, so it's a compromise.
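For illustration, here is a minimal numpy sketch of that invert, SCNR green, invert-back trick, assuming a linear RGB image scaled to [0, 1] and using the "average neutral" flavour of SCNR (the exact protection method a given program applies may differ):

```python
import numpy as np

def scnr_green_average_neutral(img):
    # Replace green with min(G, (R + B) / 2) - the "average neutral" SCNR variant.
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    out = img.copy()
    out[..., 1] = np.minimum(g, 0.5 * (r + b))
    return out

def remove_magenta_cast(img):
    inverted = 1.0 - img                            # magenta turns green
    cleaned = scnr_green_average_neutral(inverted)  # suppress the (now green) cast
    return 1.0 - cleaned                            # invert back
```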
  2. Thanks, they are very similar-looking galaxies. This one seems like a more difficult and time-consuming version of M33.
  3. It does look different from what most of those cases looked like, but my bet is still that something is on the sensor itself, or at least close to it. The pattern looks like rows and columns of pixels; if there is some residue on the sensor itself, I think it could result in shadows like this.
  4. Could it be the dreaded oil leak issue the 2600 suffered a few years back? It was fixed in newer production runs, but since the camera is second-hand, maybe this is one of the affected models.
  5. Let's hope these weren't the last good nights of the season, it sure was good to actually do some imaging. Looking forward to seeing your shots whenever they're ready!
  6. Thanks both, I actually had some trouble dialing in the "crispy" and faint parts of the image to where they felt in balance. It feels like an unusual galaxy to process, something about it is just strange. The core is quite bright but it gets very faint quickly, yet those faint arms still have speckles of sharp nebulosity, so I found it quite weird to process.
  7. Thank you! Was it clear over your place this week? Porkkalanniemi had 2 back-to-back fully clear nights with not an arcsecond of cloud in the sky; I can't say I remember that happening in the past few years, so very rare.
  8. Thanks and good luck with your image! I had been planning to image this since last year, but decided to wait until a night of good seeing arrived, and I think it was worth the wait. It's not a very large galaxy, so every little bit helps.
  9. NGC 2403 with 214 x 120s for a total of just a bit over 7 hours: taken this week under decent seeing on 6.3 and 7.3 from an SQM ~21 location with my usual kit: 8" Newtonian, TeleVue Paracorr, RisingCam IMX571 OSC camera. Processed in PI and Photoshop. The target for this galaxy was 20 hours, but it looks pretty good already. Might still put in the extra hours to improve the faint outer regions if weather allows this year, but we will see as only 6 weeks remain until summer. Feedback welcome -Oskari
  10. It's mounted on an AZ-5, with all sorts of parts I had at hand. The phone holder part itself is a Celestron smartphone eyepiece adapter that didn't survive its first Finnish winter: some plastic bits broke, which prevents its intended use. The holder is zip-tied to a Skywatcher L-bracket that I bought years ago but never ended up using, and the L-bracket sits on the AZ-5, which sits on top of an EQM35 steel tripod with the polar alignment peg sawed off. I have full alt-az movement this way, but only limited roll control because the zip tie prevents full movement and forces me to use only portrait mode. So yeah, a monstrosity that MacGyver would be proud of, but it works.
  11. Another version, this one stacked with properly working calibration frames: pulled out some of the H-alpha regions better. It's getting to the point where it has spent a little too long in the oven; I think this is all I can pull out of the image for now. I am noticing that this kind of very wide-field Milky Way processing is very different from my usual 1-meter focal length telescope images, a whole new game to learn it seems. The abundance of Ha in a 35-minute image greatly surprises me. Most of Barnard's Loop is visible, as is the vast cloud of Ha above it (not sure what this is called, if it has a name). It should be said that the horizon is maybe a thousand pixels from the bottom of the image here, so far from ideal conditions to try and pull out faint stuff. Will absolutely be doing more imaging with the phone to see what it can do!
  12. I solved the flat issue just minutes ago; the acceptable exposure time range is quite narrow with my very blue LED panel because the 10-bit sensor has quite a shallow full well depth (I have to fit both red and blue into the histogram, which is difficult with my panel). I also found out that there is a median dark value of 64, which I subtracted with a stacked master bias frame. I can't see any internal reflections in the flats; I did see a few in some light frames where a nearby car had its lights on just outside the frame, illuminating the foreground - horrible streaks across the entire image, so it can happen it seems. 20 x 17s darks with an aggressive cosmetic correction hot pixel sigma seem to have eliminated the hot pixels, and I also manually dithered every 10-20 exposures or so, which kind of happens naturally with untracked imaging when I have to re-center anyway. To be fair it was -7 °C when imaging, but I think the phone runs much hotter as it was noticeably warm to the touch. It may be the case that your phone was really operating at +30 °C or more, which is much worse. Summer will tell if that is the case for mine as well. BlurXterminator is really good here; the field is not at all flat and corner stars are kind of banana or cross shaped - far from a perfect lens in the camera. The newest AI version can correct these aberrations as long as they are fairly consistent and can be measured from the stars.
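As a minimal sketch of the calibration arithmetic described above, assuming the frames are already loaded as float numpy arrays (the array names are hypothetical, and the pedestal being removed via the master bias is the ~64 ADU median dark value mentioned in the post):

```python
import numpy as np

def make_master_flat(flat_frames, master_bias):
    # Subtract the bias/pedestal from each flat, median-combine,
    # then normalise so the master flat has a mean of 1.0.
    combined = np.median(np.stack([f - master_bias for f in flat_frames]), axis=0)
    return combined / np.mean(combined)

def calibrate_light(light, master_dark, master_flat):
    # Dark subtraction removes the pedestal and thermal signal,
    # flat division corrects vignetting and dust shadows.
    return (light - master_dark) / master_flat
```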
  13. The built-in camera app outputs JPEG and raw .DNG, but there are only a few settings to pick from, so I am not sure whether those are true raw images or not. DeepSkyCamera beta outputs raw .DNG files which are fully linear images, which is easy to see since every image is strongly green, as one would expect a colour camera to output if there is no internal meddling with levels. Lots of hot pixels everywhere too, which makes me believe that there is no internal noise reduction applied to the images before saving - so I will call it a true raw image.
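A rough way to check for that green bias yourself is something like the following, assuming the rawpy library can open the phone's .DNG files (the filename is hypothetical): an unprocessed linear raw should show green CFA medians well above red and blue, since no white balance or gamma has been applied yet.

```python
import numpy as np
import rawpy

raw = rawpy.imread("IMG_0001.dng")      # hypothetical filename
data = raw.raw_image_visible.astype(float)
colors = raw.raw_colors_visible         # CFA index per pixel, typically 0=R, 1=G, 2=B, 3=G2

for idx, name in enumerate(["R", "G", "B", "G2"]):
    channel = data[colors == idx]
    if channel.size:
        print(f"{name}: median {np.median(channel):.1f}")
```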
  14. I know right? It's hard to believe a phone can do this, but apparently it can. I am particularly pleased to see that the phone outputs true raw images with no sRGB gamma or internal noise reduction applied to the frames, which is why proper stacking becomes possible.
  15. 126 x 17s untracked exposures taken with DeepSkyCamera beta. Calibrated with darks and flats and stacked in Siril, processed with PixInsight (including BlurXterminator - which is why the stars are nice) and Photoshop. Flats didn't quite work perfectly; still experimenting with flat exposure times. It's crazy how technology has improved, a few years back I would never have believed a phone could take this image. If you look closely you can see the Flame Nebula, the H-alpha region where the Horsehead would be (not seen at 6.81mm focal length obviously), part of Barnard's Loop, the Rosette Nebula, and some open clusters like M35. Still a short integration, yet all those things have pushed through the noise floor - very impressed with the performance of the Pixel 7. There is also a foreground, which I had to crop as I haven't quite figured out yet how best to integrate it with the sky image. Gradient removal especially will be tricky with the foreground visible; I think I need to stack the foreground separately and try to combine that with the star image in Photoshop. For a first test shot I will call this a complete success, it greatly exceeded my expectations to put it mildly.
  16. Starship landing on the Moon seems very problematic; I'll believe they are seriously considering doing that when they actually attempt the landing, but we are a ways off from attempt #1. The Apollo lunar descent module was wide with a low center of gravity. I wonder why we are trying to "re-invent the wheel" with landers, since the Apollo design worked for what it was supposed to do. The reason these recent landers tend to be vertical is that there is only so much space inside the fairing of whatever launch vehicle launched the thing, and most of that space is vertical rather than horizontal. You could fit the spacecraft sideways, but that creates extra engineering problems with the design. Some kind of folding landing gear system that expands after the lander portion of the spacecraft is detached from the transfer stage would do the trick, but again more moving parts = more failure points, which is an undesirable trait for safety reasons. I'm no NASA engineer either, just played too much Kerbal Space Program and tipped over probably dozens of spacecraft due to top-heavy designs. A good lander is about as wide as it is tall, and lands the last portion of the journey vertically with no horizontal movement. The recent lander that fell over had too much horizontal velocity and tipped over as soon as one of the legs dug into the lunar surface, which can't happen for manned missions.
  17. Great images, the third one is especially impressive. Seems like there isn't really any background empty space in this region, just dust all around.
  18. You should probably get in contact with Skywatcher directly and ask them what part fits and what doesn't so there isn't any guesswork involved.
  19. This is about 10 degrees away from the ecliptic, so not a geostationary satellite, I would think. It could be a geosynchronous satellite that has the semi-major axis of a geostationary satellite but some inclination to its orbit; if it has at least 10 degrees of inclination it would find itself right around here - or an old satellite that has been sent to a "graveyard" orbit with some inclination so as to not needlessly take up a spot on the limited geostationary belt. It seems very large for a satellite at this distance though. If you can find this in the individual subs and measure some movement you could maybe draw some conclusions from that. Maybe try stacking the dataset in 3 batches? For example the first 1/3 of the data, the middle part, and the end of the night. If it is stationary in all of them then it might be some internal reflection or something scope-borne rather than an object in the sky, as a satellite in a 10-degree inclined orbit will have moved noticeably during even a couple of hours. *Hmm, no wait, I got confused. The ecliptic doesn't have anything to do with geostationary orbits, but it still looks like it's not exactly over the equator.
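As a back-of-the-envelope check on that last point, a geosynchronous satellite with roughly 10 degrees of inclination drifts several degrees in sub-satellite latitude within a couple of hours (fastest around the node crossing), so a genuinely inclined satellite should move visibly between the first and last subs of a night. A rough sketch, using the small-inclination approximation and assuming the pass starts at the node:

```python
import math

inclination_deg = 10.0
sidereal_day_h = 23.934
t_hours = 2.0

# Sub-satellite latitude ~ i * sin(2*pi*t / T), measured from the node crossing.
phase = 2 * math.pi * t_hours / sidereal_day_h
latitude_deg = inclination_deg * math.sin(phase)
print(f"Drift after {t_hours} h: roughly {latitude_deg:.1f} degrees")  # ~5 degrees
```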
  20. Sure did, all I had to do was click the shutter button every 5 minutes - no need to worry about dark adaptation since the sky brightness jumped from the usual magnitude 20.9 to 18.4, so it really wasn't dark at all. To be fair it wasn't quite this spectacular in person, the phone is really doing some heavy lifting here. It's funny how it goes: when the aurora is directly overhead and in every direction like this time, it's actually more difficult to see the striking patterns and "curtains" of light. That's why they were only easily seen towards the northern horizon, where the edge of the light show brought contrast against the darker background sky. Same thing with the Pixel 7, you can't manually tell the camera to go into astrophotography mode. The phone needs to be stationary for a good while, maybe as long as 10 seconds. I think they did this to encourage people to find a way to mount the camera properly instead of trying handheld or braced handheld imaging, so the software can do its thing properly. My setup is cobbled together from broken and unused parts I had lying around: an EQM35 steel tripod with the polar alignment peg sawed off and an AZ-5 on top of that, a Skywatcher L-bracket on the AZ-5 and a broken Celestron eyepiece smartphone holder zip-tied to the L-bracket 😁. Which is why everything is in portrait mode; it can't rotate fully to landscape because the zip tie prevents full roll control. But it works, and cost me nothing, so I will call it functional kit.
  21. Some videos of the light show we just had, taken with a Google Pixel 7 in astrophotography time-lapse mode from southern Finland at around 60 degrees of latitude. First one is towards Orion, 4 minutes worth. PXL_20240303_182029622.NIGHT.mp4 Next one towards the Pleiades, also 4 minutes. PXL_20240303_183039778.NIGHT.mp4 Then the best one, 5 x 4-minute time lapses stitched together, towards north this time, where most of the show was going on. MOVIE.mp4 Not sure the videos will look nice in forum format, might need to click to full screen and let them be scaled to your display - let me know if there is a better way to upload. Couldn't figure out a (free and easy) way to stitch the videos together for the last one without a loss in resolution, so this is just a Google Photos autogenerated "video highlight" which reduced the resolution, but I think it still looks OK. Gotta say I am very impressed with the Pixel 7 and the automatic astrophotography mode. Really there is no trial and error involved, just stabilize the phone somehow and click go, and it will do its thing for 4 minutes and spit out a stacked image and a video of the frames that went into it. The stacked images don't really work for aurora in this case, since these ones moved really fast, so there's no point in stacking all that movement into a single image.
  22. To the naked eye it was just green here; some photographic red, but not much and not for long.
  23. Bright aurora here in southern Finland. These were pretty strong, I wonder if UK observers got a chance to see some as well? It's cloudy now though, but it was good while it lasted.
  24. To register the images to each other, first drag and drop both of the stacks into the "Conversion" tab in Siril. Give the sequence a name and then press the Convert button; this creates a sequence with just those 2 files. You have to have them in the same format here, so both in RGB with no channel extractions or anything like that. Then in the Registration tab choose "Global star alignment" as your registration method and hit "Go register"; now they are as well aligned as they can be. You can choose which of the 2 stacks is the reference frame by having it as the first frame you input into the Conversion tab. If you want to align the duoband to RGB, then drop the RGB image first, or vice versa if you want to use the framing of your duoband image.
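For anyone who prefers to script this instead of using the Siril GUI steps above, a similar two-image alignment can be sketched in Python with the astroalign package. This is only an illustrative alternative, not the Siril workflow described in the post; the filenames are hypothetical and the stacks are assumed to be channel-first RGB FITS cubes:

```python
import numpy as np
import astroalign as aa
from astropy.io import fits

rgb = fits.getdata("rgb_stack.fits").astype(np.float32)      # reference framing
duo = fits.getdata("duoband_stack.fits").astype(np.float32)  # image to be aligned

# Solve the star-based transform on single-channel (mean) views,
# then apply the same transform to each duoband channel.
transform, _ = aa.find_transform(duo.mean(axis=0), rgb.mean(axis=0))
aligned = np.stack([aa.apply_transform(transform, ch, rgb[0])[0] for ch in duo])

fits.writeto("duoband_registered.fits", aligned, overwrite=True)
```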
  25. Quote from the site: can't say I like that. To my eyes this looks like a VX scope with a new coat of paint and a new focuser (which is a welcome addition, if it's better than the VX one, that is). The aluminium tube in the VX series is quite frankly awful, and I will argue it's the worst component of the scope, so if this is the same tube with another coat of paint then it's not "ideal" (sorry, had to). Also, looking at the pictures posted on the site, the mirror cell appears to be very similar to the VX cell, differing only in the way it's attached to the tube (the VX is held from the side with 3 screws, with the mirror cell being completely inside the tube; this appears to be screwed to the back - otherwise the actual mirror-holding part looks the same). The flimsy and narrowly spaced tube rings are also the same as on the VX scopes. So just the focuser is brand new then? I will be cynical and say that this seems like an attempt at re-marketing the generally not-well-received VX series.