Everything posted by vlaiv

  1. Not all M42 threads are the same. There is M42x0.75 and there is M42x1. The former is the T2 thread and the latter is, well, the M42 lens thread. You can't securely screw one onto the other, and you'll need an adapter to attach the lens to the camera.

     With most of those lenses you'll need to stop them down to F/4 even if they are faster, and they are not particularly sharp as far as stars go. Your best bet is to decide on the focal length you want to use and then get a lens with x2 or x3 longer focal length. When you shoot, bin your data x2 or x3, because these lenses are simply not sharp enough for modern small pixels. They want pixels in the 9-12um range or larger to give a sharp image. Even modern lenses are not up to the task with small pixels. I played around with a Helios 58mm f/2 lens and ended up getting a Samyang 85mm to give me a ~50mm "feel" with small pixels (I use split debayer on the ASI178 and then bin once more x2, which turns 2.4um pixels into 9.6um ones).

     BTW, here is an example of an image taken with the 58mm Helios. I did not bother to process it particularly well - it is just B/W (not debayered, just binned) - an image of the Veil nebula, 1h total exposure from a Bortle 7/8 sky with the cooled color ASI178. The lens was stopped down to F/4, but even so the stars are soft and bloated - not particularly sharp (and there are diffraction artifacts on bright stars).
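The split-debayer-plus-bin arithmetic above can be sketched in a few lines. This is a minimal illustration of the sampling-rate calculation only; the 85 mm focal length and 2.4 um pixel size are the values mentioned in the post.

```python
# Sketch: effective pixel size and sampling rate after split debayer + binning.
ARCSEC_PER_RAD = 206265.0

def pixel_scale(pixel_um: float, focal_mm: float) -> float:
    """Sampling rate in arcseconds per pixel."""
    return pixel_um / (focal_mm * 1000.0) * ARCSEC_PER_RAD

native = 2.4                 # ASI178 native pixel, um
after_split = native * 2     # split debayer: 4.8 um effective
after_bin = after_split * 2  # additional x2 software bin: 9.6 um effective

print(pixel_scale(after_bin, 85.0))  # ~23.3 "/px - coarse sampling suited to soft lenses
```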
  2. Here is the "contraption" in action: contraption.mp4

     I've learned a lot by making this. First, 3d printed gears work, and work well. Second, I can use M5 threaded rod as a shaft - provided I actually 3d print "screw on" sections of the shaft. If I don't do that, things bend and shift around, but once I create a "train" of gears, spacers and press-fit pieces for the bearings, everything just works as it should. I did not make the focusing knobs with press-fit tolerance for the 608 bearings in the above setup, and it shows - there is a bit of play because of it. On the other hand, the second shaft is smooth and does not move a bit. In any case, I would not mind using this as a focuser mechanism at all. Even the fact that it runs "backwards" is really not that objectionable - once at the eyepiece, we don't really think in terms of "rack in" and "rack out" - we just move the focuser in the direction where things get sharper (star images smaller). It runs smooth, and the audio in the above recording does not do it justice - probably because it is so quiet that the phone had trouble recording it (it's not silent, especially in the part where I force the "draw tube" in and out, but it is quiet).

     BOM:
     - 2x 180mm long 6mm linear rods
     - 4x SKF 6mm PTFE bushings
     - 4x 608 bearings
     - 2x 100mm (but should be less) of M5 threaded rod

     In the final design, I'll probably secure the focusing knobs with a few M5 nuts as well, and possibly exchange one pair of 608 bearings for 8mm PTFE bushings (or not - it will depend on available space / size of the unit). Linear elements (in this instance rods) for focuser and smooth motion - PASS! (with flying colors, I might add).
  3. Was this at full aperture? Try stopping it down to F/2.8 and see if that improves the corners.
  4. I guess the best course of action would be to contact that reputable company and see if they know the reason for the missing last lens (or if it is supposed to be there in the first place) - unless someone here happens to have the exact same model so you can compare.
  5. There seem to be at least 9 different models of the Samyang 135 F/2.0 - and on top of that, you have the cinema version, T2.2. It could be that what you see in the top image is, say, the Canon EF mount, and you have some other mount? Each of those mounts has a different flange focal distance and could have a different rear end. Sometimes an optical element is used to move the focus position closer or further.
  6. I guess none of those AIs is capable of answering free form questions or writing an article?
  7. That makes sense, but only if the target is larger than 30mm, or is more nicely framed on a 30mm sensor when using the reducer.
  8. How long does it take to double check all the data fed into AI engine? Centuries?
  9. Yep. Forgot to answer this one: it sort of does in some cases.

     Say you have a 4" scope and an ASI1600 and you want to speed things up, but you don't want to lose FOV - you really like your FOV. It turns out that the ASI6200 has roughly x4 the sensor surface of the ASI1600, so if you get a scope with twice the focal length, you'll have the same FOV. You get a 6" scope that is "slower" as far as F/ratio goes, but has twice the focal length. You bin the pixels of the ASI6200 to match the imaging scale you had with the ASI1600, and you have the same FOV - and now you have a faster system, as 6" is larger than 4" and everything else (pixel scale) is the same - and you managed to keep the FOV the way you like.

     To be able to do the above, you need a 6" scope with a large enough imaging circle for the ASI6200. This is where the imaging circle comes into play.

     Another example: if you want the fastest possible scope for small galaxies, sensor size again plays a part, as you can match a large sensor with a very large aperture and still have enough FOV to capture the galaxies (you'll fix the pixel size to match your needs with binning). All good and dandy - but you also need a big enough imaging circle to have a corrected image over such a large sensor.

     Bottom line: the imaging circle comes into the "speed" equation because it lets you use a large sensor, and with a large sensor you can use a longer focal length for the same FOV - which means you can use a larger aperture. In that sense, sensor size is part of the "speed equation", as it allows you to use a bigger scope (larger aperture) and, with pixel binning, still be on your target sampling rate.
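The upgrade path described above can be put into numbers with the crude speed metric of aperture squared times pixel scale squared. A hedged sketch follows: the sensor pixel sizes are published specs, but the 500 mm and 1000 mm focal lengths are illustrative assumptions, not anything from the post.

```python
# Crude relative "speed": aperture area x sky area per pixel (up to constants).
def speed(aperture_mm: float, pixel_um: float, focal_mm: float) -> float:
    scale = pixel_um / focal_mm * 206.265   # arcsec per pixel
    return aperture_mm**2 * scale**2

# Setup A: 4" (102 mm) scope at an assumed 500 mm FL with ASI1600 (3.8 um pixels)
a = speed(102, 3.8, 500)
# Setup B: 6" (152 mm) scope at an assumed 1000 mm FL with ASI6200 (3.76 um)
# binned x2 -> 7.52 um effective pixels, nearly the same FOV and pixel scale
b = speed(152, 7.52, 1000)

print(b / a)   # ~2.2x faster with the FOV preserved
```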
  10. A crude speed comparison is diameter^2 * pixel_scale^2. A more accurate comparison replaces diameter^2 with the "clear aperture surface" for both scopes - this one is harder to calculate as we have unknowns, but we can approximate things. Clear aperture surface is the equivalent light gathering surface after we factor in all the losses in the optical train.

      Say we have a triplet refractor. It will have ~99.5% effective coatings on each air/glass surface. Total transmission will then be ~0.995^6 = ~0.97, i.e. about 97% of the light will pass onto the sensor. Maybe we have 102mm of aperture, so the final clear aperture surface will be 51^2 * pi * 0.97 = ~7926mm2 of light gathering surface.

      Say we want to compare that to a 6" newtonian with 32% central obstruction. It has standard mirror coatings with 94% reflectivity. We have x2 mirrors, so 0.94 * 0.94 = 0.8836 is the reflectivity factor, and we have to account for the central obstruction as well: 152mm diameter -> 76mm radius, 76 x 0.32 = ~24.3mm radius of secondary obstruction. (76^2 - 24.3^2) * pi * 0.8836 = ~14390mm2.

      Now we just need pixel sizes and focal lengths to get arc seconds per pixel - square those as well and multiply by the clear aperture equivalent, and you'll get two numbers that represent the relative speed of the systems.

      By the way, standard losses are between 0.5% and 1% per glass/air interface for refractors, 94% or 97% reflectivity for standard and enhanced mirror coatings, and 99% for dielectric mirror coatings.
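The two worked examples above can be checked with a short script. This is just the arithmetic from the post wrapped in functions; the coating and reflectivity figures are the ones quoted there.

```python
import math

# Equivalent light-gathering surface ("clear aperture") for the two examples.
def refractor_clear_area(aperture_mm: float, n_surfaces: int = 6,
                         coating: float = 0.995) -> float:
    r = aperture_mm / 2
    return math.pi * r**2 * coating**n_surfaces

def newtonian_clear_area(aperture_mm: float, obstruction: float = 0.32,
                         reflectivity: float = 0.94) -> float:
    r = aperture_mm / 2
    r_obs = r * obstruction
    return math.pi * (r**2 - r_obs**2) * reflectivity**2

print(round(refractor_clear_area(102)))   # ~7929 mm2 (post rounds 0.995^6 to 0.97 -> ~7926)
print(round(newtonian_clear_area(152)))   # ~14392 mm2
```

Multiply each result by that scope's pixel scale squared to get the two relative speed numbers.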
  11. No, but how wide do you want to go? What is it that you actually want to do? Just measure the focal length? You can use just the central portion of the FOV for that - the coma-free region of the focal plane.

      Just find a way to get some images and leave the rest to existing software like AstroImageJ - it will measure stars and do basic photometry as well, and if you plate solve, it will give you RA/DEC positions. All of that will be in a table ready to be copy/pasted into a spreadsheet that you can then use to determine the pixel scale (and from that, the focal length). In fact, just uploading an image to astrometry.net will do the exact same thing - provide you with the pixel scale.

      As far as the raspberry pi and camera go - just do a google search on it and you'll get a plethora of hits like these:

      https://astronomynow.com/2022/02/25/raspberry-pi-high-quality-camera/
      https://adambaskerville.github.io/tabs/astro/
      https://www.amateurastrophotography.com/astrophotography-with-the-raspberry-pi
      https://www.instructables.com/Long-Exposure-and-Astro-Photography-Using-Raspberr/

      As long as you can take an exposure with it and you can fashion an adapter to put it at the prime focus of the telescope, you can combine the two to produce an image.
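Once a plate solve (e.g. from astrometry.net) reports the pixel scale, the focal length follows from one line of arithmetic. The 3.76 um pixel size and 1.55 "/px scale below are made-up example values, not from the post.

```python
# Focal length from solved pixel scale; 206.265 converts um and mm to arcsec.
def focal_length_mm(pixel_um: float, scale_arcsec_per_px: float) -> float:
    return 206.265 * pixel_um / scale_arcsec_per_px

# e.g. 3.76 um pixels solved at 1.55 "/px -> roughly a 500 mm focal length
print(round(focal_length_mm(3.76, 1.55)))
```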
  12. I also think that it is AI, and that brings us to an interesting "catch 22" situation: AI is used to generate a large quantity of content online, which is in turn used to train AI on what is correct / true.
  13. While we are on the subject of AI: I'm under the impression that more and more content on the web is generated by AI. Here is an example of what I mean. I'm making a 3d printed focuser, and I've always wondered whether PLA (a type of plastic) will be good enough for the gears I'm printing as part of the assembly, or whether I should invest in Nylon (or rather, Polyamide). Nylon gears are much longer lasting (much tougher) and don't require lubrication, but this is a very low torque, low speed application and my only concern is smoothness of motion. In any case, I wanted to read as much as I could comparing these two types of plastic for the given application - and this is the typical page I run into when doing a google search on the topic: https://www.techinpost.com/3d-printed-gears-a-complete-guide/ Do you think this article was written by a human or by AI?
  14. I think it is just a case of targeting fears in some people. People are more likely to take (wrong) advice without questioning it if it is presented as advice that will save them from harm - don't reboil water, don't use aluminium wrap because it does this or that (can't remember which) - or if it is presented as a possible cure for some unpleasant condition, like that lemon and stomach acid thing, although I'm not 100% sure about that one. One possible reason people might think lemon helps with acid is that it is less acidic than stomach acid while still being acidic enough to stop further acid production. So it effectively reduces the acidity of the stomach contents and acts to stop further acid production, or something like that.
  15. No, but lemon acting alkaline once ingested has been suggested.
  16. Not only that, but she gets really upset with the way I react when she tells me things like that
  17. The design stage of the prototype / proof of concept is finished. I've already printed some parts, but there is more to be printed before assembly. So far, I've spotted some things that will need to be changed for the actual focuser:

      1. With this arrangement, the focuser works "backwards". Since there is a reduction stage, there is also a reversal of rotation, so as-is this functions in reverse to a standard Crayford / rack and pinion with regards to "rack in" and "rack out" knob motion (what we are used to as "racking out" will actually move the draw tube inwards, and vice versa). In the final design I'll need to add an idler gear to reverse the rotation direction once more.

      2. Due to the need for an additional shaft for the idler gears, which also needs to be suspended on some bearings, I'll need to switch from the 608 bearings used in the above design to smaller bearings. I'll either need to source some really tiny bearings, or maybe again use bushings for these shafts as well. This would also mean switching from M5 threaded rod as shaft/shaft support to maybe 5mm or smaller smooth rod (which I'd need to source somewhere - aliexpress has many listings for such things, but that is another 40-60 days of waiting).

      I tried the idea of 3d printing a shaft to be used with a bushing and found:
      - The tolerance must really be dialed in. I managed to get close after 6 attempts.
      - The tolerance goes out the window as soon as I change the seam position / algorithm. "Aligned" creates the smoothest surface, but leaves, well, a seam along the length of the shaft, and that can be felt when rotating, as the bushings also have a groove - when the two meet there is a "bump". Using "random" creates a much worse looking surface finish (and probably feel of rotation - I did not try it, because a random seam position throws off the tolerance, which would also need to be dialed in again).
  18. Exposure time goes down - but not because of the F/ratio; rather, because you also change the sampling rate, or effective pixel size. If you reduce the focal length but at the same time reduce the pixel size to keep the effective pixel size the same (in arc seconds - in other words, the sampling rate), you won't speed up anything. You'll need the same amount of time to reach your target SNR as with the original focal length / F/ratio. This shows that it is not the F/ratio that is important, but two other quantities: aperture surface and effective pixel surface (i.e. how much sky is covered by a single pixel).
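The claim above can be demonstrated numerically: halve the focal length (faster F/ratio) while also halving the pixel size, and the crude speed metric does not move. All numbers here are illustrative assumptions.

```python
# Speed depends on aperture area and sky area per pixel, not F-ratio alone.
def relative_speed(aperture_mm: float, pixel_um: float, focal_mm: float) -> float:
    scale = pixel_um / focal_mm * 206.265   # arcsec per pixel
    return aperture_mm**2 * scale**2

# 100 mm aperture at 800 mm (f/8) with 7.6 um pixels
slow = relative_speed(100, 7.6, 800)
# Same aperture at 400 mm (f/4) but with 3.8 um pixels - same sampling rate
fast = relative_speed(100, 3.8, 400)

print(fast / slow)   # 1.0 - same aperture, same "/px -> no speed gain at all
```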
  19. I was thinking about this just this morning. My wife casually mentioned that she read online that boiling water more than once is harmful. My jaw dropped, so I went online and - sure enough, there are such claims scattered around the web.

      Ok, so it is a known thing that people tend to write utter nonsense just so they can publish something and earn some money off the people reading it (or watching youtube videos and so on). This is unfortunately not limited to the odd individual here and there - it has become mainstream in journalism: write sensational things and don't care about accuracy, truth, or whether you are in fact writing utter nonsense. So we can't expect the general public to behave any better than the professionals.

      The problem gets much deeper, however, once we include AI in the equation. Most AI / chat bots nowadays use a hive mind approach to knowledge: whatever is stated by the majority of sources online must be truth / fact. On the other hand, we have copy/paste journalism and the passing on of "news" / articles without any sort of background check, so some things become "common knowledge" even if they are completely wrong. This problem is only going to grow if "trusted" sources like AI / Google / chat bots start serving the same content as being "valid".
  20. Btw, if you want to do simple astrometry to determine the focal length of a telescope, what about just using a mobile phone camera with something like a 10mm ortho eyepiece? Mobile phone adapters are readily available (and cheap), and all you need is a scope of known FL, which will let you "calibrate" the ortho eyepiece to its exact FL.
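One way to read the calibration idea above: on a scope of known focal length, measure how many phone pixels some feature of known angular size spans; that calibrates the eyepiece-plus-phone combination, which can then be turned around on a scope of unknown focal length. This is only a sketch of that ratio argument, and every number in it is a made-up assumption.

```python
# Afocal calibration sketch: pixels-on-phone scales linearly with scope FL
# for a fixed eyepiece + phone, so one known scope calibrates the combo.
def calibrate(known_focal_mm: float, measured_px: float,
              angular_size_arcsec: float) -> float:
    # pixels per arcsec per mm of scope focal length
    return measured_px / (angular_size_arcsec * known_focal_mm)

def unknown_focal_mm(k: float, measured_px: float,
                     angular_size_arcsec: float) -> float:
    return measured_px / (angular_size_arcsec * k)

# Hypothetical: a 30" double star spans 450 px through a known 900 mm scope...
k = calibrate(900.0, measured_px=450, angular_size_arcsec=30.0)
# ...and 600 px through the unknown scope -> its FL is 1200 mm
print(round(unknown_focal_mm(k, measured_px=600, angular_size_arcsec=30.0)))
```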
  21. Centroid algorithms have sub-pixel accuracy - you can measure star positions with sub arc second precision with modern pixel sizes. What are your requirements in terms of FOV / sensor size, and what sort of budget do you have in mind? How DIY-capable are you? There are a few options out there that might be better suited to your needs. How about an RPI camera with an RPI? If budget is tight, how about something like a C270 web camera? All you need to do is remove the front lens and add a 1.25" nose piece (which can even be a piece of 32mm PVC pipe, slightly sanded down).
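To illustrate the sub-pixel point: the simplest centroid is an intensity-weighted mean over a small stamp around the star. A minimal sketch on a synthetic star follows (the stamp size and star position are arbitrary test values).

```python
import numpy as np

# Intensity-weighted centroid - recovers star position to well below one pixel.
def centroid(stamp: np.ndarray) -> tuple[float, float]:
    ys, xs = np.indices(stamp.shape)
    total = stamp.sum()
    return (xs * stamp).sum() / total, (ys * stamp).sum() / total

# Synthetic star placed at (x=2.3, y=1.7) inside a 5x5 pixel stamp
ys, xs = np.indices((5, 5))
star = np.exp(-((xs - 2.3)**2 + (ys - 1.7)**2) / 1.5)

cx, cy = centroid(star)
print(cx, cy)   # close to 2.3 and 1.7 - a small fraction of a pixel off
```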
  22. I would worry about retracting each column by a slightly different amount - that would cause tilt / misalignment, which would result in poor collimation. But maybe there is nothing to worry about if the top assembly is very stiff and does not allow for differing amounts of retraction?
  23. Hi and welcome to SGL. If you want to replace the stock focuser with the Baader one, you'll probably have to do some modding. As far as I can tell, the stock focuser does not have a T2 thread on it, so there is no easy way to attach the Baader. The best course of action is to try to replace it - adding another focuser on top of the stock one will cause issues with reaching focus, as already pointed out by Louis above. Can the stock focuser be removed? From images online, it seems to be a single piece with the top assembly that holds the secondary. That might cause a problem when trying to replace it, as the only way would be to cut it off.
  24. Why don't you perform your own crude white balance? Take a piece of white paper and, on a sunny day, take an image of it (don't let it sit in direct sunlight, but in shadow - just make sure it's a nice day with some blue sky) - or alternatively get a D65 light source (not as easy). If you have an all-sky lens for that camera, it will do; alternatively, use your smallest scope, like that 51mm spacecat in your signature. Take the image of the white paper, measure the average red, green and blue pixel values, and derive coefficients that will produce a 1:1:1 ratio on the white paper.
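The coefficient derivation in the last sentence can be sketched as follows. This assumes a debayered RGB frame of the white paper, and normalizes red and blue against green (one common convention; the post does not specify which channel to anchor).

```python
import numpy as np

# Average each channel over the white-paper frame and derive multipliers
# that bring the channel means to a 1:1:1 ratio (green used as reference).
def wb_coefficients(rgb_image: np.ndarray) -> tuple[float, float, float]:
    r, g, b = rgb_image.reshape(-1, 3).mean(axis=0)
    return g / r, 1.0, g / b   # multiply each channel by its coefficient

# Hypothetical "white paper" frame where the camera records R:G:B = 120:200:160
frame = np.tile(np.array([120.0, 200.0, 160.0]), (10, 10, 1))
print(wb_coefficients(frame))   # (1.666..., 1.0, 1.25) -> restores 1:1:1
```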