Everything posted by powerlord

  1. You're nearly right. The proper cable may or may not have thicker wires, but the secret is that it has MORE of them. With USB, the data lines are used to tell the PSU what voltage and current the device can take (USB-C can additionally supply all sorts of voltage and current levels). In the absence of these, a PSU will usually only deliver 5v at 1a max. Most devices will play it safe when talking to a 'dumb' PSU (which is how they now see it, thanks to the cable) and will draw only 500ma or 1a, in case they cause the PSU to stop working (most cheap 500ma ones will just drop to low voltage if you try to draw more than 500ma from them - or worse, do something nasty). I'm simplifying here - USB-C actually spreads the current across multiple different wires in the cable - 12 in total for data and power.
Your cheap USB lead undoubtedly had only 2 wires inside (earth and +ve), hence you got 5v and 500ma or 1a - it's a 'dumb cable'. The proper lead will have a minimum of 4 wires and lets the S50 negotiate and probably get its 2a. A more sophisticated negotiation happens with things like new phones and laptops where USB4/USB-C features are used, and they might get 12v and 2a or something - and then even more than 4 wires are required! USB does not help itself here, in stuffing 'USB' marks on cables whether they are 2 wire or more (USB-C can be 12 wire!).
To avoid this sort of thing happening (and this is true to a lesser extent with USB 2 as well, where 4 wires are required for full power negotiation), I always bin the USB leads that come with devices for charging. Unless it's something like a laptop or phone, they will nearly always be rubbish 2 wire things. Before you bin it, pull it apart and wonder at the 2 wires - or at the fact you've just destroyed a 4 wire one 🙂 Stick to using good quality leads which are advertised as 'data leads'. For USB-C, just to confuse things further, you can instead search for 'PD' leads (power delivery).
But data cables will also work fine (though not for the very high 40w+ delivery required for laptops, unless specced accordingly). The thing that really annoys me is that this is their 5th go at getting USB right - and still it's a bloody mess.
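The defaults described above can be sketched as a toy model. To be clear, the wire counts, current limits and the single 12v profile are illustrative numbers chosen for the sketch, not the real USB-PD state machine:

```python
# Toy model of the 'dumb cable vs data cable' negotiation described above.
# The thresholds and the 12 V profile are illustrative, not the real spec.

def negotiate(wires: int, device_max_amps: float) -> tuple[float, float]:
    """Return (volts, amps) a device will settle on, given the cable's wires.

    2 wires  -> no data lines: device plays it safe at the USB default.
    4 wires  -> D+/D- present: simple handshake, up to ~2 A at 5 V.
    12 wires -> full USB-C: PD can raise the voltage too (modelled here
                as a single 12 V / 3 A profile for brevity).
    """
    if wires < 4:               # 'dumb' charge-only lead: earth and +ve only
        return 5.0, min(device_max_amps, 0.5)
    if wires < 12:              # data lead: device can negotiate more current
        return 5.0, min(device_max_amps, 2.0)
    return 12.0, min(device_max_amps, 3.0)   # USB-C PD (illustrative)

for wires in (2, 4, 12):
    v, a = negotiate(wires, device_max_amps=2.0)
    print(f"{wires:2d}-wire lead: {v:g} V at {a:g} A = {v * a:g} W")
```

With a 2a device like the S50, the model shows exactly the difference described: the dumb lead gets it 2.5w, the data lead 10w.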
  2. ooo, I'm no expert Elp, but I don't think that's correct. @vlaiv will explain it better than I can though, so I'd best not try.
  3. Shot last night with my newly acquired Canon 24-70 F4 L USM stuck on one of my asi2600s. Only about 2 hours worth of 60 second subs - and since most of it was through a tree, I thought I'd keep the tree in. stu
  4. that's why it's called the beige galaxy @ONIKKINEN ! Or not. yeh... as I say - quick and dirty - I didn't feel the data deserved any more attention than it got. It's funny - that's 3 years I've been doing this now, and even 2 years ago I'd have been pleased as punch with that. But now, I suppose, I'm more jaded. With 10 more hours of data, and another 10 hours of Ha, it would be decent, but until then... Isn't it amazing though, that we can get stuff like this from our back garden in a few hours - stuff that so surpasses what was possible even 40 years ago - and yet we can't be bothered editing it properly. I think that is truly incredible. stu
  5. A bit of a different one. I took the S50 up to Scotland to see the family at xmas. They live in a high village in central Scotland, with amazing views to the NW - looking down over the whole of central Scotland and beyond, up to the mountains. And though there was not a single time in the week I could see a single star, one afternoon the gale-force winds and rain stopped, and the sun came out just long enough to get a great view. So I set the S50 up and shot some video. As it only shoots video at 12fps or so, I put it through Final Cut Pro and used optical flow to tween the frames up to 60fps. I've shot a few intro shots on my phone to try and demonstrate how far away stuff is. The bridge you will see is the Erskine Bridge over the Clyde - it's 30 miles away - and the BA plane you see is coming in to land at Glasgow airport, to the left of it. Later you will see a triangular mountain in the distance - that's one of the closer ones you can see - Ben Lomond, 50 miles away; The Cobbler is about 60 miles away. I like the last shot, where we are looking down on the blades of a wind turbine popping up over a field. Funnily enough, the family were more impressed with that than with most of the astro shots I share. It's the first time ever anyone has noticed you can see the Erskine Bridge, for example.
  6. What's wrong with the moon? Have a good look at that. Easy to see, even if it's a bit cloudy. Good luck
  7. Using up the last of the subs I've managed to eke out of the last 2 months - here's 4 hours of Andromeda, shot with the redcat. stu
  8. This is the only other image I've been working on the last 2 months (see ic342 for the other one). In the end, 10 hours of data over 4 sessions - all shot in broadband with the asi2600 and redcat. I've not shot it in broadband before. It has to be the easiest edit I've ever done - once it was integrated there was really no noise worth 'xterminating' so I didn't bother. it's just had a wee crop, and a tweak to the levels, curves and saturation. It's worth doing the clicky dance to the full res one - there's a hell of a lot of fine detail in there I've not noticed before in my NB attempts.
  9. Happy new year all! With the minging weather, I've not been posting much - this is one I've been trying for the last few months, on the odd occasion I can see a star and it doesn't start raining halfway through a night that had a "0% chance of rain" according to the forecasts! It's one I've tried before, getting dejected and giving up after not seeing a thing in a single sub. But - I know it's a tricky one - peering through the dust of half our galaxy to see it... so I thought I'd just get stuck in and start gathering data. In the end I gathered 16 hours of RGB data, and another 15 hours of Ha/O3 data (L-Ultimate), with my 200pds and asi2600. Edited in Siril and Affinity Photo. I've posted a crop below as well. Lots of dust, and some faint outer red spiral arms (top left) just about visible, I think, which I've not seen before? stu
  10. one thing is, do make sure it's focused BEFORE you try to goto anywhere - it needs to be focused to plate solve. You can always focus again later. Not sure about the other issues.
  11. I'm sure I'm not the only one who finds @vlaiv's theoretical explanations of things difficult to really visualise and understand (no offence Vlaiv!). Recently he was trying to explain aperture to me when I was talking about night vision. I tend to be a very visual guy - I have no problem rotating and working with 3d models in my head, etc. - but when presented with equations, they don't naturally lend themselves to the same visualisation (even though 50% of my degree was maths). So for folk like me, I find the Huygens Optics channel very interesting, and it just so happens his latest video makes clear what Vlaiv was trying to tell me: Again, thanks Vlaiv for making me think - it's just that I sometimes need a different medium to understand, I think. stu
  12. wow. yeh that's what I'm talking about. that truly is amazing. I wonder how long till we get the seestar s500 that does that for all of us.
  13. But the SNR is clearly better, or else they'd be useless for their purpose, surely? Have a look at the video - the milky way is clearly visible in real time, with images bright to the human eye. It still seems similar to the planetary case to me?
  14. I just watched this excellent Veritasium video: And it got me thinking when they were looking at the night sky and the milky way, etc. Seems to me that if you stuck one of those onto a telescope, you could then attach a camera to it and take very short <10ms images with it, aka planetary imaging. And with enough of them, remove the emissive noise. Then apply the same lucky imaging techniques as we do to planetary to nebulas and galaxies. Couldn't this enable unprecedented terrestrial resolution in astrophotography?
  15. The main difference is going to be selectivity. Stacking live, it's going to stack a frame if it thinks it's decent, then it's on to the next one. When you stack afterwards, you have the benefit that ALL the frames can be analysed and the software can make decisions about which is the best 'master' and, depending on the software, weighting, which ones not to bother stacking, etc. Screening is pretty straightforward - at least on a Mac - I'd imagine something similar is possible on windoze. On a Mac, the space bar 'quick views', i.e. pops up an instant picture of the file. For fits, once you install a quickview viewer for fits (I use the free QuickFits), I can just go to the directory, press the space bar to quickview the first file, and then use the cursor keys to move down the list. Takes no time. Cmd+delete to delete any rubbish ones as I go. Then it's into Astro Pixel Processor, where it will also make decisions about which to stack based on quality, if I want. However, if you're a beginner, I'd start with the Seestar's stacked fit and learn to make that better first. Here's a free process I'd suggest, based on Siril with starnet2++ installed and set up with it:
- open it in Siril. Set Siril to autostretch (you will now see it)
- go to plate solve, enter the target, set FL to 250 and pixel size to 2.9µm, and plate solve
- go to colour correction and do a photometric colour correction - save that if you like as file-platesolved-colourcorrected.fit
- histogram stretch - use auto
- change the view to linear
- use Siril to do a star removal and create a star mask [you are now looking at a starless version]
- save this as a tiff
- open the stars-only one it created in Siril and save that as a tiff too
Now, go into your favourite photo editor that supports layers and load both tiffs. You will use a blend layer on the stars - depending on the software it might be called 'lighten' or 'screen' - experiment with them. You can now turn that layer off and concentrate on the nebulosity - play with curves, contrast and saturation - try to bring out the nebulosity while keeping the background darker, but not pitch black. Add some denoising. Pop the stars layer back on, maybe add some saturation to that too, and probably curves to pull the brightness of the stars down a bit. Finally, save it as a jpg. The above may sound like a mammoth task, but with a bit of practice it really only takes 10-15 mins. btw - nothing here is specific to the S50 - that is a general colour workflow which will work for ANY astrophotography. stu
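For the curious, the 'screen' and 'lighten' recombination step at the end is just simple per-pixel maths, which can be sketched in a few lines of numpy (the tiny arrays below are made-up values purely for illustration):

```python
import numpy as np

# The two blend modes mentioned above, applied to the starless and
# stars-only layers (pixel values normalised to 0..1).
# 'screen' brightens wherever either layer has signal;
# 'lighten' simply takes the per-pixel maximum.

def screen(starless: np.ndarray, stars: np.ndarray) -> np.ndarray:
    return 1.0 - (1.0 - starless) * (1.0 - stars)

def lighten(starless: np.ndarray, stars: np.ndarray) -> np.ndarray:
    return np.maximum(starless, stars)

# tiny 2x2 example: faint background with one bright star pixel
starless = np.array([[0.1, 0.2], [0.1, 0.1]])
stars    = np.array([[0.0, 0.0], [0.9, 0.0]])
print(screen(starless, stars))   # star pixel: 1 - 0.9*0.1 = 0.91
print(lighten(starless, stars))  # star pixel: max(0.1, 0.9) = 0.9
```

This is why 'screen' tends to look a touch brighter than 'lighten' where star and nebulosity overlap - it adds the signals (minus their product) rather than just keeping the bigger one.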
  16. Sure, I saw that stuff - but that's just using the asiair HW really - it hasn't really added 3rd party devices to the asiair. I mean, it's interesting and I may give it a try sometime, but originally I thought the plan was to try and 'hack' the asiair app/software to support 3rd party apps - that would have been more interesting, to me at least. stu
  17. I didn't think there had been any success though as of yet, or am I mistaken ?
  18. I used to use led panels, but frankly I just don't think they add anything other than extra hassle. What I did do is make an easy-to-use 'flat flap'® -
  19. no need to overthink it - all I do is stick a white sheet over the end in the morning and take 'em. Job done. Just set the exposure to get in the middle of the histogram, and use the same iso, focus position and aperture as during the night - only change the exposure.
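The 'middle of the histogram' rule above can be turned into quick arithmetic: flats respond roughly linearly with exposure, so you can scale a test exposure by the ratio of the target level to the measured median. A minimal sketch (the function name and the example numbers are made up for illustration):

```python
# Rule-of-thumb flat exposure: scale a test shot linearly so the median
# lands near half of full scale. Assumes the sensor response is linear,
# which is a good approximation away from saturation.

def next_flat_exposure(test_exposure_s: float, measured_median: float,
                       full_scale: float = 65535) -> float:
    """Return the exposure that should put the median near half scale."""
    target = full_scale / 2
    return test_exposure_s * target / measured_median

# e.g. a 1/100 s test flat came out dim: median ~16000 ADU on a 16-bit scale,
# so roughly double the exposure is needed
print(next_flat_exposure(0.01, 16000))
```

One test shot and one correction usually gets you close enough; check the histogram on the second shot and repeat if needed.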
  20. Yeh, I would say, not in their defence but just as a sort of 'bigger picture' view - this is pretty much the default way open source is used in companies in my experience. i.e. most large companies use it everywhere, but never publish changes they make to it. It doesn't make it right. But zwo are in the majority here tbh. The few companies who do "come clean" and publish are very much in the minority.
  21. Hi Rob, good effort. It doesn't mention flats? Those would have helped eliminate the background gradient, I think. As is, if you haven't had a go with Siril's background extraction, I'd try that - there might be a bit more detail of the dust around the Sisters visible. Also, for RGB it's worth getting Siril to plate solve and then doing a photometric colour calibration. Siril can also do star removal now if you install starnet++, and here that would really help you pull out background detail without blowing out the stars. cheers stu
  22. unless you can trick it into under-exposing, I don't see how it will help. Unlike the Dwarf, at least for now, you have no control over exposure. So for nebulas etc. it's gonna go to its max. But for stuff like the moon and planets, it's going to pick what it thinks is the correct exposure - so fitting an ND will just make it take longer exposures - at least until manual shutter speed is added. However, for M42 I think you might be right - since it will be defaulting to 10 seconds, fitting an ND might help bring more detail out of the core, I suppose. But to be honest, I don't think it's got the FL to make much of a difference. If you look at my core - that was with an asi678 and 1000mm FL. At 250mm with the S50 sensor, there's only a handful of pixels in that core anyway - I don't see there being much point imho. More interesting for manual exposure settings would be planets - or Jupiter and Saturn anyway. Tiny as they'll be, you would at least be able to make out a bit of colour on them if you could control the exposure, so it'd be great if zwo added that imho - it could be a hidden 'show advanced' setting, so no need to complicate things for users, and tbh it would be easy to add. They are a small company, and I think it'll be tricky for them to identify their market demographics tbh - e.g. how many are 'never had a telescope before, what is a nebula?' vs the one-eyed men with a bit of a clue vs the seasoned astro chap buying it as a nice wee portable toy. Knowing where to focus your limited budget is key, I'd have thought. tbh, though there's not a rat's chance in hell of it happening, I'd love zwo to see the bigger picture and open source the S50 - they'd get the sales they want multiplied, and the community would see what they could do with it. Creality did the same thing under pressure with the Ender 3 3d printer, and it changed the whole market.
This is different, as I'd imagine any other company would struggle to develop the hardware, plus the limited market just won't support it. But that all benefits ZWO - by open sourcing it they'd gain a massive developer community, and that limited dev budget problem disappears. Are ZWO the sort of company to take that sort of leap of faith? I fear not. But then again, they did disrupt the market with the asiair. I'd love to be proved wrong and see them do it - I think it'd be amazing to see what the dev community (and you can be sure I'd be in there) could do with it!
  23. Shot over 4 nights in the last 2 or 3 weeks. Wanted to spend a long time on the target for the first time and bring out some of the dust. 5 min subs, redcat and asi2600. Probably took about 30 hours, but only kept the best - lots wasted with cloud. Probably worth doing the clicky dance to zoom in a bit. Happy with this one - came out well I reckon ? stu
  24. 2nd clear night in a row. Ap - one of the few things that make winter more bearable 😁 M45 and ic342 are the current targets..