Everything posted by discardedastro

  1. I've got InfluxDB for some bits at home and it works... okay. The query syntax is a pain to figure out, and stuff like migrations/backups/data archival is a nightmare. We're currently looking to migrate our InfluxDB setup at work to TimescaleDB, which is similar but built on PostgreSQL and so avoids a lot of the problems InfluxDB has with data management (much of which is sadly now addressed only in their commercial release - which we have at work, but it's a five-figures entry price). https://www.timescale.com/ https://github.com/timescale/timescaledb - the open source version. Whichever solution you end up with, Grafana is an awesome bit of software for dashboards - it may not be quite as configurable as what you have there, but it's much better for long-term data analysis, so may be a useful counterpart. You can use Timescale with standard PostgreSQL drivers, which node-red has. If the dashboard is busy, I'd check the number of concurrent connections you're limiting the broker to. As for Ethernet vs WiFi - there is no question: for anything you want 24/7 reliability on, stick to Ethernet.
  2. That's fair - my apologies for something of a rant that didn't contribute much. I thought it worth calling out the things to consider with MQTT vs other protocols people might be familiar with.
  3. I can't find anything on the NASA/IBM story, but IBM are superb at press releases and have been strong promoters of MQTT for a long while. Edit: I did actually find it - IBM were promoting a "Cognitive Telescope" project which appears to have turned into a few CS undergrad projects and went no further. MQTT was part of the picture largely to align with IBM's cloud telemetry portfolio. It's all vanished off the internet now as far as I can tell - just a lot of dead links that don't show up in the internet archive etc. But MQTT is a perfectly good message broker which is suitable for control applications if and only if you put an application layer on top of it to handle the fact that, in network-protocol terms, it employs UDP-style semantics by default. If a TCP packet does not arrive at the final destination, the sender is aware and can take action. If an MQTT frame is sent to a destination but not acknowledged, it is still considered delivered. QoS 1 in MQTT only guarantees delivery semantics between broker and client, not between two clients, and does not guarantee delivery if your client suffers from Byzantine failure modes (e.g. it received the frame, parsed it, and then blew up - that still counts as received). Compare that to HTTP RESTful APIs, where a bidirectional exchange between the two final endpoints validates proper receipt and processing; even QoS 2 in MQTT doesn't guarantee application-level delivery. If NASA want to use it for control applications with an application that handles all this atop the basic protocol, then that's all jolly good, but it is not something you get for free with MQTT - you have to think about it as a developer - and that's why I'd generally advise against using it as a building block for applications where you need that, because many other equally accessible things can do it "for free" 😊 MQTT is superb for fire-and-forget telemetry, which is what it was designed for. 
It's worth noting MQTT isn't alone in having some of these problems, and error management as part of your control protocol is essential regardless of transport - MQTT (like similar messaging/fanout brokers in common modes of operation) just makes you handle more of this yourself than you get out of the box with other approaches. ASCOM over MQTT is a protocol dumpster fire waiting to happen. With my network/systems engineering hat on, I've seen "oh, just bung protocol X into protocol Y as a transport/encapsulation to modernise it" go horribly, horribly wrong plenty of times before, even with active protocol conversion going on. Assumptions made by applications, and even protocol semantics, can break in incredibly hard-to-debug ways at the application layer. ASCOM Alpaca is a better way to do ASCOM-over-IP since it can do protocol-aware translation at the network boundary, though ASCOM, being a basically "closed" ecosystem, isn't really the way to go for this imho. The whole ecosystem for remote instrument control is a mess, but stuff like INDI (which is network-native, so has an understanding of these client/server-at-a-distance protocol considerations baked in, as well as being properly open-source and open-community) is the way to go. I'd stick to HTTP RESTful interfaces where required for control, or ensure you're doing application-level confirmation of commands in an RPC-style manner, so you know the remote application both received your message and is able to act on it. Doing local fallback for safety, as Gina is already doing, is a sensible approach.
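To make the "application layer on top" concrete, here's a minimal sketch of the RPC-style acknowledgment pattern described above. In-memory queues stand in for MQTT topics purely for illustration - a real client would publish and subscribe through a broker, and the topic/field names here are made up:

```python
import queue
import threading
import uuid

# In-memory queues standing in for MQTT topics (illustrative only);
# a real client would publish/subscribe via a broker such as Mosquitto.
command_topic = queue.Queue()
ack_topic = queue.Queue()

def send_command(action, timeout=2.0):
    """Publish a command with a correlation ID, then block for an explicit ack."""
    correlation_id = str(uuid.uuid4())
    command_topic.put({"id": correlation_id, "action": action})
    try:
        ack = ack_topic.get(timeout=timeout)
        if ack["id"] == correlation_id and ack["status"] == "done":
            return "confirmed"  # the far end says it received AND acted
    except queue.Empty:
        pass
    return "failed"  # no confirmation: stop, alarm, or retry - never assume success

def device_loop():
    """Device end: process one command, then acknowledge it explicitly."""
    cmd = command_topic.get()
    # ... actually drive the focuser/mount/etc. here ...
    ack_topic.put({"id": cmd["id"], "status": "done"})

threading.Thread(target=device_loop, daemon=True).start()
result = send_command("focuser_stop")
print(result)  # "confirmed" once the device acks within the timeout
```

The key point is that the timeout path is where all the real design work lives - the caller has to decide what a missing ack means for safety, which is exactly what MQTT alone won't do for you.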
  4. And on that note, I've literally just finished making v2, which has some active cooling. Metal fan grilles - these are mostly to keep small crawling things out but might help with dust somewhat; I might add some polypropylene dust filter material to the input port, but we'll see how it goes. The port on the left is the inlet - I wanted to use the rubber mounting supplied with the Noctua fan to minimise vibrations (though in practice they're not really significant enough to worry about much), so the metal filter is on the inside; on the right it's on the outside to make life easier for screw mounting. I rejigged the internals somewhat - added some 3M adhesive foam blocks to hold the electronics in place. The Wago blocks could do with a better mounting scheme, but it's good enough for now - nothing is exactly moving around much, and this keeps most things out of sitting in any water. The fan driver I'm using is a dumb-as-chocolate little board from Amazon with a thermistor which is sort of hanging out in the middle of the box. This PWM-modulates the little Noctua A4x20 fan, which can deliver quite a lot of air at full pelt while being quite quiet. I've covered the top in alu foil - I wanted to check this was all going to work OK before I dismantle the dome again to paint the surface under the dome black. I actually cracked the dome at the edge when assembling it, so I might seek to get another dome of the same design (or two, for spares) before I take it off and muck about re-siliconing.
  5. Oh, that's neat. I've got some aluminium tape which I'm using to wrap my enclosure (aside from the dome) to help reject solar load - it'll also completely reject UV. You could layer that and/or black flocking over the PLA and it'd be fine for extended use, I'm sure.
  6. MQTT's great for sensor data - but I'm not sure I would use it for control data. It doesn't have very good mechanics for handling failures; well, any, really. You can guarantee delivery to a broker, but you can't guarantee the broker delivers it onward. This means you either implement some timeout-based RPC mechanics - i.e. wait for a reply, at which point you're often better off with a synchronous system like HTTP - or just assume your message got there. In which case things like "move focuser in" *wait* "stop moving focuser" can go badly wrong - what if the stop command didn't arrive? You also get very little feedback from MQTT itself about end-to-end message delivery. Things like RabbitMQ/AMQP or Kafka work better in this regard, allowing consumers to retry message handling or receipt with guaranteed-sequence semantics, even in the face of network failures and hardware errors like reboots mid-command. Grafana is a fine thing for graphing; I use a system based on InfluxDB, MQTT, and some small Python scripts that take MQTT/HTTP data and feed it into InfluxDB. Grafana then pulls data from Influx for display. That works pretty nicely - though I'd consider TimescaleDB next time around. JSON is a great thing for messaging, but the size overhead is pretty nasty. I've been using CBOR for a few things here and there as an alternative, which works really well and is very compact, but I haven't tried it on embedded platforms yet.
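To put a rough number on the JSON-vs-CBOR overhead, here's a deliberately minimal sketch: a toy CBOR encoder covering just small unsigned ints, text strings, and maps (in practice you'd use a proper library such as cbor2), compared against compact JSON for a small sensor reading:

```python
import json

def _head(major, arg):
    """CBOR initial byte: 3-bit major type plus a short-form length/value."""
    if arg < 24:
        return bytes([(major << 5) | arg])
    if arg < 256:
        return bytes([(major << 5) | 24, arg])
    return bytes([(major << 5) | 25]) + arg.to_bytes(2, "big")

def cbor_encode(obj):
    """Toy CBOR encoder - small unsigned ints, text strings, and maps only.
    Illustrative; use a real library (e.g. cbor2) for anything serious."""
    if isinstance(obj, int) and obj >= 0:
        return _head(0, obj)                     # major type 0: unsigned int
    if isinstance(obj, str):
        data = obj.encode("utf-8")
        return _head(3, len(data)) + data        # major type 3: text string
    if isinstance(obj, dict):
        out = _head(5, len(obj))                 # major type 5: map
        for k, v in obj.items():
            out += cbor_encode(k) + cbor_encode(v)
        return out
    raise TypeError(f"unsupported type: {type(obj)}")

reading = {"t": 21, "rh": 65}
json_bytes = json.dumps(reading, separators=(",", ":")).encode()
cbor_bytes = cbor_encode(reading)
print(len(json_bytes), len(cbor_bytes))  # 16 9 - CBOR roughly halves it here
```

The saving grows with numeric-heavy payloads, since CBOR encodes numbers in binary rather than as decimal text.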
  7. Yeah, the distortion aspect is why I've kept mine low - I have some black foamboard, and my current plan is to make a shroud for the lens to "peek out" from, flush with the bottom of the hole for the camera - if I then heat the dome in future, this will also act as a thermal barrier to allow for easier heat regulation of the camera (as the dome will be airflow-isolated from the main box). I'm also going to paint everything under the dome on the top surface with Black 3.0 paint (to avoid any issues with reflections), and then cover the rest of the box's topside/sides around the dome with aluminium foil tape - that'll help to reject most of the solar IR load, which will minimise the thermal gain. I've been tinkering with some simple CFD stuff to work out the best way to manage the airflow/thermals within the enclosure. I have a Noctua 40mm fan plus a pair of metal grilles and polypropylene filter material (to keep the bugs etc out) to match, so my current plan is to place two 40mm holes in the bottom of the box - since this makes avoidance of water ingress easier - and go with a simple push-vent setup, with a PWM controller to modulate the fan. I might cook up a little uC controller to manage the fan and the heating element based on the ambient and internal temperatures via a few temp sensors. I do have enough room to accommodate a small cooled fan, but could also accommodate a Peltier element and heatsink on the existing camera as a hack, which would certainly be cheaper and probably be enough to get some consistency, which is key. I've also started writing an INDI client in Rust; this is a pain in the proverbial because Rust has no nice XML parsers (XML being something of a dying format, and certainly not typically used in the systems domain). A little bit of me is tempted to try and update indiserver to speak something sensible like JSON or CBOR, but I stopped speaking C/C++ long ago! 
Still, I'm almost at the point of having two-way comms working, and from there it shouldn't be hard to do basic capture and start implementing control loops based on image brightness, etc. I suspect I'll end up building something of an ecosystem - Rust isn't a great language for astro (there is a FITS library, but very little in the way of existing astrophotography/image processing code, so it'd be first-principles-and-up) so shoving the images to a Python processor for the thinking might be easier, but on the other hand Rust is really easy to make correct, stable and verifiable code in and so strikes me as a fantastic language to write 24/7 AP/obsy code in - hence in part my desire to have an INDI client lib...
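For anyone curious what the INDI wire format actually looks like: it's a stream of XML elements over TCP (conventionally port 7624). Here's a minimal sketch of parsing a simplified setNumberVector message with Python's stdlib - the device and property names below are just illustrative:

```python
import xml.etree.ElementTree as ET

# A simplified INDI setNumberVector message. Real traffic is a continuous
# stream of such elements over TCP, which complicates parsing considerably.
message = """
<setNumberVector device="CCD Simulator" name="CCD_TEMPERATURE" state="Ok">
  <oneNumber name="CCD_TEMPERATURE_VALUE">-10.5</oneNumber>
</setNumberVector>
"""

def parse_number_vector(xml_text):
    """Extract (device, property, {element: value}) from a setNumberVector."""
    root = ET.fromstring(xml_text)
    values = {el.get("name"): float(el.text) for el in root.iter("oneNumber")}
    return root.get("device"), root.get("name"), values

device, prop, values = parse_number_vector(message)
print(device, prop, values)
```

The hard part in a real client isn't this per-element parse - it's incremental parsing of the never-ending document the server streams, which is exactly where XML libraries tend to fight you.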
  8. Wow - that's quite a beautiful shot. Can I ask what the acquisition parameters/equipment were?
  9. Definitely will need to arrange for some environmental control - currently 'tis a little hot in the box!
pi@unreliable-witness:~ $ sudo /opt/vc/bin/vcgencmd measure_temp
temp=80.0'C
The sensor sat around 60°C. I think a little 80mm fan or similar, with a good fan filter to stop the bugs getting in and some decoupling to reduce vibration transmission, will be required. Cool-down isn't going to be very fast in a sealed enclosure.
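A sketch of the sort of control logic I have in mind: parse the vcgencmd output, then apply simple hysteresis so the fan doesn't chatter around a single set point. The thresholds here are illustrative guesses, not tested values:

```python
import re

def parse_vcgencmd(output):
    """Parse the output of `vcgencmd measure_temp`, e.g. "temp=80.0'C"."""
    match = re.search(r"temp=([\d.]+)", output)
    if match is None:
        raise ValueError(f"unexpected output: {output!r}")
    return float(match.group(1))

def fan_should_run(temp_c, fan_on, on_above=65.0, off_below=55.0):
    """Hysteresis control: switch on above one threshold, off below a lower
    one, and hold the current state in between. Thresholds are guesses."""
    if temp_c >= on_above:
        return True
    if temp_c <= off_below:
        return False
    return fan_on  # inside the dead band: keep whatever we were doing

temp = parse_vcgencmd("temp=80.0'C")
print(temp, fan_should_run(temp, fan_on=False))  # 80.0 True
```

In a real loop you'd poll every few seconds and drive a GPIO/PWM pin with the result; the dead band is what stops the fan flapping on and off at the boundary.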
  10. I haven't - I was going to have a stab at writing something up myself. Feels like a good bite-sized project to combine camera control via INDI with some simple automated processing; I'd like to try and write something in Rust or Python (using AstroPy/scikit/numpy). Hoping to do some automation e.g. automatically determine sky state, cloud cover, etc.
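As a starting point, the sky-state idea can be sketched with nothing but summary statistics over a frame - a bright, low-contrast frame suggests cloud, while a dark frame with high pixel-to-pixel contrast (stars) suggests clear sky. The thresholds below are entirely made up; a real classifier would be tuned against actual captures:

```python
import statistics

def sky_state(pixels, bright_thresh=120.0, contrast_thresh=25.0):
    """Very rough all-sky frame classifier (thresholds are illustrative):
    classify on mean brightness and pixel-to-pixel spread alone."""
    mean = statistics.fmean(pixels)
    spread = statistics.pstdev(pixels)
    if mean > bright_thresh:
        return "daylight/bright"
    if spread < contrast_thresh and mean > 40.0:
        return "likely cloudy"   # uniform glow: cloud reflecting light
    return "likely clear"        # dark background with star pixels

clear_frame = [10, 12, 9, 200, 11, 10, 180, 12]  # dark sky, a few stars
print(sky_state(clear_frame))  # likely clear
```

With real frames you'd compute the same statistics over a numpy array (or per-region tiles for partial cloud), but the shape of the decision is the same.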
  11. asc-2020-08-11.mp4 Managed to assemble this fairly quickly - still using KStars for capture and PI for processing, and clearly, this close to the house, it suffers from some lighting interference, but it works pretty well - quite happy with it optically as a starter for 10 with the cheap lens supplied by ZWO.
  12. Thanks for the info - definitely interesting. Noise is definitely an issue - I'll capture some dark frames because it's going to need it. Starting to think a smarter plan might be to replace the bit of balsa wood with a two-stage TEC, or just replace the 120MC with something cooled. But I think with dark subtraction I'll get some usable data out. Sat out and fiddled with focus a bit - very hard with so much noise and such an insensitive sensor, plus the image scale. test-asc-30s-30g.fits
  13. Finally got everything together to make my own ASC - I'm not using the Pi HQ cam because they're still hard to get hold of, and I had an ASI120MC lying around. The enclosure is a fairly standard thing. My only "d'oh" moment during assembly - other than not measuring my hole quite right, hence the tiny overhang of the dome, but I figure that's immaterial - was that in using a cloth to tidy up some bits of silicone I inadvertently dragged some across the dome. Got most of it off and I don't think it'll be visible, just a thin smear left - but annoying! I have a separate Dribox which has the 12V supply in and a simple gland for the power into the box. Should be good to IP68 - we'll see in the coming thunderstorms! Internally it's a mess and a half. A big block of balsa I had lying around was adjusted to make a pretty good height match for the camera, but I'll replace this with a 3D-printed mount sometime soon if this works. Simple 12V/5V step-down (in practice I could have just used a 5V PSU, but keeping the internals 12V-sourced means adding dew heaters etc later is easier) and then a Pi4. The external WiFi dongle is again something I had spare, with a nice big antenna; I need to run some ducts through the garden and sort out my Ethernet properly, as this position doesn't have wired cabling yet. No cooling or ventilation thus far - I'll lob in some silica gel sachets for now. If this works, I'll probably dismantle it, mount everything properly on 3D-printed bracketry, paint the top surface inside the dome with antireflection paint, make a collet for the camera out of black foamcore, and maybe consider airflow/dew removal - will see how it fares in the meantime. Software-wise I'm just using INDI and KStars for testing. I'd like to have a go at making something on this front, so will start playing with that tomorrow, I guess!
  14. You'll really want flats to fix vignetting - darks won't do anything for that. An inexpensive approach can be as simple as an old monitor or TV showing a white/grey signal (I use a spare tablet w/ HDMI out and have the desktop background set to an 80% grey), or several layers of plain white bedsheets/linens/t-shirts stretched over the end of the lens with elastic (taking care to avoid any creases), pointed at a reasonably even light source (such as the dawn/twilight sky). Black PVC tape will fix any errant LEDs. If you're going to be moving things around much, then a Polemaster is an excellent addition to the arsenal.
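For illustration, the calibration arithmetic works out like this - a toy sketch with plain Python lists standing in for image arrays (real tools do the same with numpy arrays per pixel): subtract the dark, then divide by the normalised master flat to undo the vignetting.

```python
def calibrate(light, dark, flat, flat_dark):
    """Standard frame calibration: (light - dark) / normalised master flat.
    Lists stand in for 2D image arrays purely for illustration."""
    flat_corrected = [f - fd for f, fd in zip(flat, flat_dark)]
    mean_flat = sum(flat_corrected) / len(flat_corrected)
    return [
        (l - d) / (fc / mean_flat)   # dividing by <1 at the edges boosts them
        for l, d, fc in zip(light, dark, flat_corrected)
    ]

# A vignetted "image": edge pixels receive half the light of the centre,
# and every pixel carries 5 ADU of dark signal.
light = [55.0, 105.0, 55.0]
dark = [5.0, 5.0, 5.0]
flat = [55.0, 105.0, 55.0]   # the flat shows the same fall-off
flat_dark = [5.0, 5.0, 5.0]

print(calibrate(light, dark, flat, flat_dark))  # all pixels now equal
```

The point of the example: darks remove the additive offset, but only the flat division fixes the multiplicative fall-off towards the edges.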
  15. As an ex-TVC-adjacent bod (across the road at Centre House) for a few years, the fact that ST connectors found a home in BNC panels does not surprise 🙂 Both ends terminated is definitely the simplest way to get a good exit pupil with a reasonable NA. Definitely better to get the exit pupil of the point source as small as reasonably possible, I think; the flipside is your entry pupil is also tiny, and so getting light in is tricky. I'd maybe consider looking at cheap laser diodes you could align and glue or otherwise secure onto the far end of the patch cords, but you'd end up testing at a fixed wavelength rather than broadband. How do the point sources look defocused? Yes, the couplers mechanically work on the ferrule outside diameter only, so you're safe with that. FS are well known in industry for being cheap as chips, basically by virtue of volume on a restricted, 100% CN-sourced product portfolio. Their quality control is okay, and for most passives they're fine. We use them for stuff where optical quality isn't critical and where we can tolerate failure (short links in datacentres etc).
  16. That's an excellent set of result images you've got there! The ST connectors are a bit oldschool but should work great for that application - as you say, the round hole is easy to make and you get a free clamp. The other option I'd consider is LC-LC alignment sleeves with flanges - these can be secured with wire/bolts but need a rectangular cut-out, which is a bit trickier. The polishing materials you've got there are meant to be used with a connector - the only way to make a "clean" end on a bare fibre is score-and-tension cleaving. You also need an alignment puck to hold the connector flat on the polishing sheet (which should be on something nice and flat - a surface plate in a lab; a piece of flat glass is the common field option). I would well imagine that on your 9um bare fibre you'll not have a clean end. If you have a cheap USB microscope, that'll actually do alright at showing the end. This is an SC connector under a microscope (a Nikon SMZ18 at about 8x mag, if memory serves) with a single mode fibre. You can just about see the core in the middle of the ferrule (a tiny dark spot inside the dark grey spot in the middle). This is what's in the back of a particular LC ferrule - this one's a bit odd (and failed) but from right to left is the outer jacket, an acetal binder, the 250um coating, and the 125um fibre itself.
  17. Getting a sharps bin is a fairly important thing if you're cutting fibre and leaving shards - not a good thing to have in your carpet, especially if you have pets/kids or might sell someday to someone who does... Kevlar snips are a thing e.g. https://www.millsltd.com/mills-kevlar-scissors.html My last Thor Labs order was five figures (optical breadboard, frame, etc, plus some piles of fibre and tools) and no lab snacks contained within. Gutted. I'd definitely stick to ferrules/connectors for the launch end - if nothing else you're guaranteed a fairly flat end which will accept light from free space a bit more readily.
  18. Sounds good. I was thinking on this some more, and going for a fibre specifically targeting a low mode field diameter at visual wavelengths may yield an even smaller point source, e.g. https://www.thorlabs.com/thorproduct.cfm?partnumber=SM450 is £5/m and has an MFD of about 3um from 488 to 633 nm. The 9um stuff (standard telecoms fare) is specified at around 8.6-9.2um at telecoms wavelengths, but I've no idea what the behaviour would be at visible wavelengths. Sadly I don't have MFD measurement kit (it's expensive stuff and we generally just push that onto the manufacturers to specify, or pay NPL to check it).
  19. Still grappling with comet alignment in PI, but in lieu of working that out, stacked a bunch of frames from last time. ISO 320 Z6 on the 200PDS with very, very rough and ready polar alignment, 20x5s subs stacked. Had to move the telescope to a place with no polestar visibility to get the shot. A single sub below (you can see what I mean by rough alignment). Stacking revealed just how vignetted the full-frame sensor is in the 200PDS, and I forgot to take flat field frames as part of calibration, so I had to crop quite a bit and stack without calibration.
  20. Since they both have 125um fibre diameters, they'll both be pretty robust - look for G.657.A1/A2/B3 fibre if you want robustness to bending; this is the stuff commonly used in FTTH networks and the like and will tolerate bends as tight as 25mm without any loss. Pretty much all of these will have at least a 250um coating around the outside. Removing this will help - you can do it easily with some 250/125um strippers like https://www.millsltd.com/tri-hole-miller-stripper-ripley-80677.html which are pretty inexpensive. For shining light into the far end, I'd keep the far end terminated in a connector and get something like a "visual fault locator" - these will give you a red light, but will typically have a non-contact mounting flange for a 2.5mm ferrule as you'd find on an SC connector, e.g. https://www.ebay.co.uk/itm/10mW-10KM-Visual-Fault-Locator-Fiber-Optic-Laser-Cable-Tester-Test-Equipment/261476281179?epid=1374228066&hash=item3ce1336b5b:g:pjMAAOxyMjJTlov6 - alternatively, you can just dangle your SC/LC ends into a box with a good bright light source. You ideally want the fibres to be pointing at the bulb, so getting some SC or LC couplers might be helpful mechanically. If you want to further constrict the exit aperture of the glass, I'm not sure how to go about that. Glass processors for fibre exist but are super expensive (e.g. https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_ID=11031 - and when Thor tells you to call for price, given they frequently list stuff in the £50-100k bracket, you know it isn't going to be cheap). We normally use fusion splicers (which are still expensive - halfway decent ones are 3-4k with a cleaver/stripper etc) to make joints in fibre. Down at 9um you're into proper microscopes to do anything much with manual manipulation. Edit: Almost forgot, you might do well with a fibre cleaver. You can get cheap ones off eBay/Aliexpress. 
These make, using a tension-and-score method, a very clean cut of the fibre (within 0.5 degrees of flat, typically). That'll give you a much cleaner edge than snapping or cutting with scissors (we typically use scissors/diamond snips if we want a bad endface - this is often beneficial to provide a low-reflectance termination, since there won't be a flat end to bounce the light back down the core).
  21. What you want is single mode OS2 fibre - this will be 9/125, i.e. ~9um core in 125um cladding. The cladding is bonded to the fibre core - the protective sleeve (furcation tubing, for its proper name) over the top is optically separate. Singlemode is cheap as chips and easily gotten (e.g. https://www.fs.com/uk/products/96103.html ). You might also want to try an APC finish - this is angled 8 degrees at the connector to reduce reflections into the fibre from unterminated connections but might also help cut down on any light other than the primary mode. The mode field diameter of most OS2 cables is ~8-9um at 1310nm (infrared) and will be in the same ballpark for visible spectra. You won't get points any smaller than this without external optics.
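If you want a rough feel for how MFD scales with wavelength, the Marcuse approximation for step-index fibre can be sketched as below. The core radius and NA are assumed nominal SMF-28-ish values, not from any datasheet - and note standard OS2 fibre is multimode below its ~1260 nm cutoff, so visible-wavelength figures describe only the fundamental mode:

```python
import math

def mode_field_diameter(core_radius_um, numerical_aperture, wavelength_um):
    """Marcuse approximation for the fundamental-mode MFD of a step-index
    fibre: w/a = 0.65 + 1.619*V^-1.5 + 2.879*V^-6, MFD = 2w. Below the
    single-mode cutoff wavelength the fibre carries higher-order modes too,
    so visible-band results are indicative only."""
    v = 2 * math.pi * core_radius_um * numerical_aperture / wavelength_um
    w_over_a = 0.65 + 1.619 * v**-1.5 + 2.879 * v**-6
    return 2 * core_radius_um * w_over_a

# Nominal SMF-28-like parameters (assumed, not from a datasheet):
core_radius_um, na = 4.1, 0.117
mfd_1310 = mode_field_diameter(core_radius_um, na, 1.31)
mfd_550 = mode_field_diameter(core_radius_um, na, 0.55)
print(round(mfd_1310, 1))  # ~9.3 um at 1310 nm, matching the datasheet ballpark
print(round(mfd_550, 1))   # noticeably smaller at 550 nm (fundamental mode only)
```

The takeaway matches the intuition above: the fundamental-mode spot shrinks towards visible wavelengths, but you won't get below the core scale without external optics.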
  22. Very rough and ready one using an old 80-200mm f/4.5 Nikkor and Z6, no tracking, 60x2.5s, CosmeticCorrection and CometAligned in PixInsight, quick SCNR/histogram/curves and knocked the red down a bit to lose all the CA from the lens. Also shot some stuff with an 85mm f/1.4 Zeiss Milvus which I'll crunch tomorrow and should do much better on the CA front. Telescope needs to move 10 feet to the left to be able to catch it, so if the weather looks promising I'll do that tomorrow and accept I'll lose my perfect PA...
  23. It takes some getting used to for sure, coming from mostly Python, and it's definitely still overkill from a performance perspective for some applications I'm tinkering with (currently writing an API server for a web app to handle optical time domain reflectometry traces, which is fun), but it's lovely for low-level stuff. I was writing a binary file parser for the Telcordia SOR format (which is pretty darn obscure), and after the first few days bashing my head against compiler errors and whatnot, I managed to adapt my thinking and now view the compiler as a close ally. It genuinely stops you writing things in "bad" ways - I had to be much more explicit and precise than I'm used to, but when you do that it gets you a lot of goodness for free and a better, more robust app to boot. I've even fuzzed this new parser (with rust-fuzz) and it's robust against random binary inputs and all the rest, with a very amateur developer at the helm! Feels like it'd be a very good fit for observatory software, where robust testing, crash-free operation and stable, continuous, sometimes-timing-sensitive apps are the order of the day.
  24. I'm really hankering for my transparent dome to actually show up so I can start playing with this. Reading through this I'm quite tempted to have a go at doing something with Rust, which I've been writing quite a bit of lately and enjoying for a systems programming language. Very, very fussy compiler but once it compiles, it'll run, and writing tests is super simple...
  25. Caught it just setting below the hedge tonight. Nikon Z6 w/ an ancient 70-200 f/4.5 telephoto, 1/25th of a second at f/4.5.