Posts posted by JonCarleton

  1. To be clear, the uploaded image displayed by this website (and others) appears with magenta stars... yes. The original .fits file on my local machine does NOT have magenta stars. The .jpg on my local machine that I uploaded here does NOT have magenta stars. There is apparently a problem with the conversion for web display; something about my .jpg is not being converted properly.

    I have heard others complain about this issue on other sites, and not just Linux/UNIX users. But it apparently isn't a universal problem. I do intend to solve it one way or the other :).

     

    • Like 1
  2. 1 hour ago, ollypenrice said:

    I don't normally see any degradation between here and my view of an image in Photoshop. Are you in the right colourspace? The internet standard is sRGB.  (I process in ProPhoto RGB for its better gamut and then go into Photoshop's Edit > Convert to Profile to choose sRGB. Don't confuse this with 'Assign Profile.')

    Olly

    I only own one Windows machine. I use it ONLY for PoleMaster and iPolar (because those two require Windows), then it goes back in the closet. And yes, I am careful to be mindful of sRGB in my exports. I have been doing the file conversion to .jpg in GIMP, but perhaps I will try something else.
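    For anyone who wants to script that last export step instead of doing it by hand in GIMP, here is a minimal Python sketch (using Pillow/LittleCMS) of converting a master's embedded ICC profile to sRGB before writing the .jpg. The filenames are placeholders, and it assumes an 8-bit RGB TIFF; a 16-bit master would need to be scaled to 8-bit first.

        # Minimal sketch: convert a master TIFF's embedded ICC profile to sRGB
        # and save a JPEG for web upload. Filenames are placeholders; assumes
        # an 8-bit RGB TIFF (scale a 16-bit master down to 8-bit first).
        import io
        from PIL import Image, ImageCms

        src = Image.open("master.tif")            # placeholder filename
        srgb = ImageCms.createProfile("sRGB")

        icc_bytes = src.info.get("icc_profile")   # embedded profile, if any
        if icc_bytes:
            in_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
            out = ImageCms.profileToProfile(src, in_profile, srgb, outputMode="RGB")
        else:
            out = src.convert("RGB")              # no profile: hope it was already sRGB

        out.save("master_srgb.jpg", quality=92,
                 icc_profile=ImageCms.ImageCmsProfile(srgb).tobytes())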

     

  3. 7 hours ago, Stuart1971 said:

    That's a great image, well done..👏🏻

    Not sure what software you use for processing, but in PixInsight, if you invert the image and run SCNR it will remove those magenta stars for you…👍🏻

    I use PI, but they aren't magenta on my monitor. I did a monitor calibration to try to correct it, but it didn't help. Perhaps I need to use a different tool to create the .jpg... or just upload a .tiff instead. Could be something in the .fits conversion.
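    For anyone curious what that invert-and-SCNR trick actually does, here is a rough numpy sketch of the idea: magenta is the complement of green, so you invert, clamp green with average-neutral SCNR, and invert back. It assumes a float RGB array scaled 0-1; loading and saving the image is left out.

        # Rough sketch of the "invert, run SCNR, invert back" trick for magenta stars.
        # Assumes a float RGB array in [0, 1]; file loading/saving is left out.
        import numpy as np

        def scnr_average_neutral(rgb):
            """Average-neutral SCNR: clamp green to the mean of red and blue."""
            out = rgb.copy()
            out[..., 1] = np.minimum(out[..., 1], (out[..., 0] + out[..., 2]) / 2.0)
            return out

        def remove_magenta(rgb):
            """Magenta is green's complement, so invert, suppress green, invert back."""
            return 1.0 - scnr_average_neutral(1.0 - rgb)

        # usage: cleaned = remove_magenta(np.clip(image, 0.0, 1.0))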

     

  4. I keep trying. Still not what I want, but better than the last attempt. 3 hours with a Radian Raptor and QHY183M one-shot. I have to say, what I see after upload is a poor copy of the .jpg I see on my local monitor. At least the playing field is level; we all have that to deal with, I feel sure.

    NGC2239.jpg

    • Like 8
  5. I notice a lot of folks with issues using the Pi4 as a hotspot and not getting NTP. One can get fairly inexpensive ($10 US) USB GPS devices on eBay and other sources. You get a rock-solid time standard along with LAT/LON. That is what I use when I go to remote sites. Just search "GLONASS GPS RECEIVER USB." They all say "Windows Support," but there is plenty of software for Linux/Pi that works very well.

    I use a VK-172 (read as: cheap!) and it works fine.  You will have to install some software and write a script to read the location and time, then set it per your needs.  Once that's done, it just works, forever.

    EDIT: OK, apparently I did it the hard way a few years ago. There is now INDI support for a great many of these devices: https://indilib.org/devices/auxiliary/gps.html  There is also a device compatibility page that is worth a look, as some of the USB GPS devices are better than others.
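    If anyone wants to do it the scripted way anyhow, below is a rough sketch of the sort of thing my script does: pull one TPV report out of gpsd with gpspipe, print the position, and set the clock from the GPS time. It assumes gpsd and gpsd-clients are installed and already talking to the dongle; the sudo/date step is Linux-specific.

        # Rough sketch: read time and position from a USB GPS through gpsd,
        # then set the system clock. Assumes gpsd + gpsd-clients are installed
        # and gpsd is already talking to the dongle (e.g. a VK-172 on /dev/ttyACM0).
        import json
        import subprocess

        def read_tpv(max_sentences=50):
            """Return the first TPV (time/position/velocity) report gpsd emits."""
            out = subprocess.run(["gpspipe", "-w", "-n", str(max_sentences)],
                                 capture_output=True, text=True, check=True).stdout
            for line in out.splitlines():
                try:
                    report = json.loads(line)
                except json.JSONDecodeError:
                    continue
                if report.get("class") == "TPV" and "time" in report:
                    return report
            raise RuntimeError("no TPV fix seen; check the antenna's view of the sky")

        tpv = read_tpv()
        print("lat/lon:", tpv.get("lat"), tpv.get("lon"))
        # GPS time is UTC in ISO 8601; GNU date accepts it directly.
        subprocess.run(["sudo", "date", "-u", "-s", tpv["time"]], check=True)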

     

     

     

  6. I prefer to just add the software I want to an Ubuntu distro. That method has served me well since Pis have been around. They are my go-to mount computers.

    I know Ubuntu is going to be around and is rock solid on a Pi 3, 4 or 5. I'm going to need INDI drivers for whatever imaging platform I use, so that download is a given. I have used Siril on a Pi4 and an OrangePi5 with good results, though I did have to compile Siril on the OrangePi5. At least the OrangePi Ubuntu made it easy to find the library requirements for the compile. I can also run CCDciel or Ekos, ASTAP, and GIMP with little effort. None of the ARM SBCs are going to run PixInsight anytime soon, so that isn't a factor.

    That said, I have found in the past some real nuggets of genius in unusual places by "just trying new things." If you never try, you never find them.

  7. Interesting.  I knew there was a community out there for the platform, but I didn't suspect that it was mostly Windows-based.  I've never used ASCOM and came to CdC and CCDciel from KStars.  I prefer the platform to KStars because it seems to run better on my hardware. I typically run a Pi4 with the drivers on each setup and the imaging/guiding/planetarium inside on the desktop.  Ubuntu as the common OS.

    Thank you for your replies!

  8. I have been using CCDciel and SkyChart (Cartes du Ciel) on Raspberry Pis for a few years now. I realize that this software set doesn't have the following of KStars/Ekos, but I wondered if it was still represented in this group. For me, at least, it works well, is simple to demonstrate, and makes a good showing at star parties and the like.

    This software set relies on indiserver and the INDI drivers under Linux, or ASCOM drivers under Windows. Again, much the same as KStars/Ekos.

  9. I have been using Pi4s for a couple of years to run all my rigs. During this time, I have tried all sorts of combinations of VNC, ssh -Y, and any other option I could find. What works best for me is to run ONLY indiserver with the drivers on the Pi4 at the mount, with that Pi4 on the local network via WiFi (or an ethernet cable). Then I run the imaging software, Ekos with KStars or CCDciel with Cartes du Ciel (I prefer the latter), on a desktop inside or a laptop, on the same local network, of course.

    The only exception to this setup is that on a mount with which it is difficult to get excellent guiding, I install PHD2 on the mount Pi4 and run PHD2 via ssh -Y on the imaging desktop/laptop. I speculate that with troublesome mounts, even the slightest networking delay between PHD2 and the guide pulse makes a difference. Since there is a small delay in transferring the guide image to the software over the network, removing that delay can tame a troublesome mount.

    This setup has the added advantage of removing the need to transfer files from the mount Pi4 afterwards, as each frame lands on the imaging machine at the time of capture. This also means that having a large amount of storage for the image files on the mount Pi is no longer necessary.

    Ekos will happily start and operate a remote indiserver instance.  For CCDciel, you may either SSH into the mount pi and start indiserver or use a free utility called IndiStarter.
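    If you would rather not add IndiStarter, the SSH route is basically a one-liner; here is a rough sketch of how it could be wrapped from the imaging machine, with the hostname and driver list as placeholders for whatever your own mount Pi runs.

        # Rough sketch: start indiserver on the mount Pi from the imaging machine.
        # The hostname and driver list are placeholders for your own setup.
        import subprocess

        MOUNT_PI = "astro@mount-pi.local"                   # placeholder host
        DRIVERS = ["indi_eqmod_telescope", "indi_qhy_ccd"]  # placeholder drivers

        # -f backgrounds ssh once the command is running; nohup keeps indiserver
        # alive after the ssh session drops. INDI listens on port 7624 by default.
        subprocess.run(
            ["ssh", "-f", MOUNT_PI,
             "nohup indiserver -v " + " ".join(DRIVERS) + " > ~/indiserver.log 2>&1"],
            check=True,
        )
        print("indiserver should now be reachable on", MOUNT_PI.split("@")[-1] + ":7624")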

    I do run remote from time to time. When away from the home network, I bring along a spare WiFi router configured with the same spec as my home network. I typically just run a bit of ethernet cable from the laptop to the WiFi router. This way, nothing has to change on the mount Pi... almost. If you do not have a means to connect to the Internet when remote, you =MUST= SSH onto the mount Pi and set the correct time/date. This is, of course, unnecessary if you have a GPS device on your setup that provides time to the mount Pi. I do not. That said, without a GPS device, one has to make sure the imaging software knows the Lat/Lon/Alt of your remote location. Bad things happen if the time/date is wrong on the mount Pi or the location is wrong in the imaging software.

    I always verify the mount Pi has the proper date/time and the imaging software has the proper location when operating at a remote site, even if I am using a GPS device as a time and location source. This is not necessary for most home setups: your location is fixed and was presumably set when you installed the imaging software, and the standard setup for most Pi operating systems is to go looking for a network time server the moment a network connection is up.
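    As a rough sketch of that sanity check, something like the following can be run from the imaging machine before the first slew; it just compares the mount Pi's clock against the local one over SSH. The hostname is again a placeholder.

        # Rough sketch: compare the mount Pi's clock against this machine's clock
        # before starting a session. The hostname is a placeholder.
        import subprocess
        import time

        MOUNT_PI = "astro@mount-pi.local"   # placeholder host

        remote_epoch = int(subprocess.run(
            ["ssh", MOUNT_PI, "date", "-u", "+%s"],
            capture_output=True, text=True, check=True).stdout.strip())
        drift = abs(time.time() - remote_epoch)

        if drift > 2.0:
            print(f"mount Pi clock is off by ~{drift:.0f}s -- fix it before slewing")
        else:
            print("mount Pi clock looks sane")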

  10. I bought a Dwarf II when they first came out and ended up not doing much with it, as using it on a phone was tedious for me and I didn't have a big enough tablet to make much improvement. One day, for grins, I tried putting Android on a Raspberry Pi4, then loading the Dwarf II software. Eureka! Now I could use the little contraption on a 1080p HDMI screen. For me, this was a game changer. It has limitations, and you won't get "pixel-peeper-proof" images, but it isn't a bad little device if you work within its limits. Here is the Orion Nebula: 1200x10-second subs over 2 nights. Processed with ASTAP (grading), Siril (stacking), PixInsight (general processing) and GIMP (finishing, sizing and type conversions).

     

     

    M42_Dwarf.jpg

    • Like 6
  11. I am fortunate that my area is Bortle 4. When I bought this farm in 1996, I was tired of living in the city and took my plane up at night looking for the largest area of "dark" I could find. I circled this general area on a map and brought it to a real estate agent the next day, saying, "Find me something in this circle." Back then, this was Bortle 2. A shame I wasn't doing astronomy of any sort back then.

    Thank you for your kind comment!

  12. This is another in my apparent "see if I can do better" reshoot series. This is 260x120sec subs with a Radian Raptor using a QHY183C. Processed with ASTAP (grading), Siril (stacking), PixInsight (general processing), GIMP (finishing, sizing and type conversions). Once again, I recommend clicking on the image, as the thumbnail, large though it is, does not accurately present the image for some reason.

    M33.png

    • Like 2
  13. This is the Bode's Galaxy area shot with a Radian Raptor using a QHY183C for 220x120second exposures.  Imaged with CCDciel. Processed with ASTAP->Siril->PixInsight->GIMP.  Frankly, I think that I need to use the bigger scope to get the detail I'd like to see in this area.

    Interesting. I notice that the preview image is significantly different from the image one sees when it is clicked; the preview seems washed out. If this image interests you, I suggest clicking the preview image.

      

    M81-82.png

    • Like 6
  14. I shot this image a few years ago, but hope to have improved some since then.  This is a redo from the past several clear nights.  Certainly, the processing tools now are much better.  

    This is 600x120sec exposures with a Radian Raptor using a QHY183C one-shot color camera.  Imaging with SkyChart->CCDciel->Phd2 using INDI drivers on a remote Raspberry Pi4.  Processing with ASTAP (grading), Siril (calibration and stacking), PixInsight (general image processing), and GIMP (final touch-up and sizing).  All Linux.

     

    IC405.jpg

    • Like 6
  15. I always do a "first-light" now with Messier-42.  This, because it was Messier-42 that first knocked me out of my chair when I put a cheap web-cam on the back of a department store refractor and initially caused this addiction.

    I bought the Dwarf II when it first came out with the thought that I could at least use it for solar, to the small extent I care about solar, and my wife might use it for bird photography. It turned out that while it was "OK" for solar, it wasn't anywhere in the neighborhood of "wonderful," and trying to decide if something was in focus on a tiny phone screen was not ideal for my lousy vision. So, it sat in the box for a while.

    There have been, just recently, some "really not terrible" posts on other services that support the Dwarf II user community. So I decided to give it another try. Full disclosure: I hate phone apps, because my vision is "worn out." This is why I =do= astrophotography; I can't see anything in an eyepiece. So, I did a small hack. I used a Raspberry Pi4 and installed Android as the OS, after which I loaded the Dwarflab app. Success! I could use a big HDMI screen, a keyboard and a mouse.

    The device turned out to be remarkably simple to use to collect images, although, being Alt/Az, I find that 1200-1500 ten-second images is a good place to start. But I can set it up and start imaging in about 5 minutes and just leave it to its own devices. This image is the result of one such session. This set was around 1300 images at 10 seconds over 4 nights. No filters. Bortle 4 sky. Graded with ASTAP, with 10% discarded for quality. Stacked with Siril. Some processing in PixInsight. And finally, finished in GIMP. All Linux.

    It isn't a "pixel-peeping-proof image," but not bad, in my opinion, for the limited amount of effort involved.

    M42_Dwarf.jpg

    • Like 1
  16. On 18/12/2023 at 04:26, WolfieGlos said:

    I image in OSC and I've never had a good result with Siril's NR, and their own info/help files suggest that it works best on mono images, so I assume this is why.

    I also use OSC cameras rather than MONO.  I do run all Linux, however, so perhaps the library files Siril uses for image processing in Linux make a difference.  I know that even in PixInsight, I can see slight differences between Windows render and Linux render, especially when writing tiff and fits files.

    Siril now has a pretty good Noise Reduction function in the Image Processing section. Along with "Remove Green Noise," which is just the SCNR algorithm in disguise, it does a respectable job. The whole Image Processing suite in Siril is worth looking at again if you haven't seen it lately. Even the Background Gradient Removal function is at least as good as PixInsight's DBE routine, although the new GraXpert AI background extraction tops them all at the moment (and is now available as a PI plugin).

    Another advantage of Siril is its platform independence.  I even have it running under Linux on an OrangePi5 Plus, though I had to compile from source.  There are distributed download-and-go versions for Linux, Mac and Windows.  The other advantage of Siril is that it is a very fast and capable stacker.

    That said, GIMP is also an excellent choice for similar reasons.

     

  18. Alas, the new Linux Ubuntu version of PoleMaster is for amd64 and will not run on arm64 hardware (all the Pi boards). I could replace my mount Pi4 with a little Intel SBC, I suppose. However, I use a WaveShare motor HAT on the Pi4 driving a stepper motor for a focuser (indi_wmh_focuser). I'd also have to buy or build some kind of Intel-compatible focuser... probably something USB-ish. Unfortunate.

  19. COOL! I have a dedicated Windoze machine (the only one I own) that I ONLY use for PoleMaster and iPolar. I tried running the old version of Linux PoleMaster on a Pi3 once, but it was only barely functional, and that version wouldn't run on a Pi4 or an X86_64 box. I run my INDI drivers and INDI server for each mount on their own Pi4 and image from in the house over the network with a Linux desktop. It will be VERY nice not to have to set up a Windows machine with a monitor out at the mount just for the polar alignment. If this new version will run on a Pi4, I can just bring out a portable monitor and a mouse and get the job done.

    To this, I must add that I recently had a great experience with QHY technical support. QHY cameras require an SDK on the INDI server in addition to the indi_qhy_ccd driver. The SDK runs on X86_64 hardware and the Raspberry Pi3 and Pi4 without issue. However, I bought an OrangePi5 Plus running Ubuntu and it would not work, nor was it possible to get a clean compile from source on that hardware. I contacted support, not expecting much. But they were all over it. They had me up and running the next day.
