Everything posted by vlaiv

  1. I would not call that touch and go
  2. Because with interference filters, the thickness of the coating layers is what determines which wavelengths are reflected. When light hits at an angle, it is effectively going through a thicker layer. A fast F/ratio means that some of the light hits the filter at an angle further from the normal, and that light might be reflected instead of passed - effectively creating an aperture stop. This image shows an LPS filter and how its passband shifts towards the blue depending on the angle of incidence of the light ray.
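The size of that blue shift can be estimated with the usual thin-film formula lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2). A rough Python sketch - n_eff = 2 is an assumed typical value for hard-coated filters, not a measured one:

```python
import math

def shifted_wavelength(lam0_nm, angle_deg, n_eff=2.0):
    """Centre wavelength of an interference filter at a given angle of
    incidence (standard thin-film blue-shift formula); n_eff is the
    effective refractive index of the coating stack (assumed ~2 here)."""
    theta = math.radians(angle_deg)
    return lam0_nm * math.sqrt(1.0 - (math.sin(theta) / n_eff) ** 2)

# Marginal-ray angle of an F/2 light cone, measured from the filter normal:
f_ratio = 2.0
angle = math.degrees(math.atan(1.0 / (2.0 * f_ratio)))  # ~14 degrees
shifted = shifted_wavelength(656.3, angle)
print(f"H-alpha passband centre for F/2 marginal rays: {shifted:.1f} nm")
```

For an F/2 beam the marginal rays shift the passband centre by roughly 5nm towards the blue - enough to push a narrow line filter off its line.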
  3. That is interesting info. The ASI178mcc is supposed to have a UV/IR cut filter built in - according to ZWO. And I used a Hutech IDAS LPS P2, which is also supposed to act as a UV/IR cut filter, as it has the following response curve: In fact, it is even somewhat "minus violet", since it does not pass wavelengths below 410-415nm according to the above graph. Next time I plan to try it with a Baader Contrast Booster filter - it has a similar response to the IDAS LPS between 500 and 600nm but cuts into the violet part of the spectrum more aggressively: I bought that one for use with an achromatic refractor, to eliminate both a bit of LP and the violet halo. I should be getting an artificial star tomorrow, so I'll be able to run extensive tests on different aperture settings and filter combinations to see what sort of stars I can expect. I also got a very low profile rotator and quick changer - this will allow for RGB type imaging with the ASI1600. Maybe doing individual channels and focusing for each - R, G and B - will yield better results. Btw, this is what stars look like at F/1.4: Interestingly, all have a well defined "core" but also a significant "skirt". Profile plot of one star in each channel:
  4. Btw, I did not give the tech info - the above is one hour of exposure - 60 x 60s from a red zone (SQM around 18.5). Hutech IDAS LPS P2 filter used. Mount is an AZ-GTi. No guiding.
  5. The ASI178mcc is rather harsh on this lens as it has 2.4um pixels, but the lens does show some chromatic aberration even stopped down to F/2. Here is the same image processed in two different ways. One is an RGB compose in Gimp after calibration, stacking and some processing of the linear data in ImageJ (background wipe, simple color balance). The second is an RGB composition in ImageJ. For some reason (I think I know why), it has a rather intense orange color cast, but the luminance layer is much better processed - I used green for luminance here and did an RGB ratio color transfer (I need to rewrite the background wipe to account for color images - as is, I did a "per channel" background wipe, but I need to do it combined to keep the color balance). In any case, the second image is much better looking with respect to star shapes and overall sharpness. The green channel is rather nice. Here is a comparison of the R, G and B channels on a few stars: I also took some images at F/1.4 (fully open) with the intent of making a mosaic and binning the panels down to a size where these issues would not be objectionable, but everything dewed up after an hour so most of that data is not good. I'll process it at some point just to assess star shapes when fully open.
  6. Very interesting point, and indeed I think you are right. However, I have an observation to make - I'm not sure if I'm right about this as it just sprang to my mind: the Sun is much more massive but also much further away, and in fact its tidal influence is only about 46% of that of the Moon according to this article: https://oceanservice.noaa.gov/education/tutorial_tides/tides02_cause.html summarized in this image: However, if we look at three points - one at the center of the Earth and one on each side - those distances are much larger in comparison to the distance to the Moon than they are in comparison to the distance to the Sun. Not sure if I put that correctly, so here is another attempt - the radius of the Earth is much larger in comparison to the distance to the Moon than in comparison to the distance to the Sun. This means that tides due to the Sun should be much smaller, although the Sun exerts about half of the tidal pull of the Moon, right?
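This is exactly what the numbers show: the tidal (differential) acceleration scales as 2*G*M*r/d^3 - mass over distance cubed - so the Earth-radius-to-distance ratio enters directly. A quick sketch with rounded textbook values for the masses and mean distances:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
R_EARTH = 6.371e6    # Earth radius, m

def tidal_accel(mass_kg, dist_m):
    """Leading-order differential (tidal) acceleration across one Earth
    radius: 2*G*M*r / d^3 - mass over distance *cubed*, not squared."""
    return 2.0 * G * mass_kg * R_EARTH / dist_m ** 3

moon = tidal_accel(7.35e22, 3.844e8)    # Moon: mass, mean distance
sun = tidal_accel(1.989e30, 1.496e11)   # Sun: mass, mean distance
print(f"Sun/Moon tidal ratio: {sun / moon:.2f}")
```

The ratio comes out around 0.46, matching the NOAA figure - the Sun's far greater mass is almost, but not quite, cancelled by the cubed distance.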
  7. I think you are overthinking it. However you explain it - the Moon pulling on the water, the Moon making the Earth pull less on the water, or the water just following a straight path in bent spacetime - you are essentially describing the same phenomenon. Perhaps the third option would be "the most precise" explanation - as far as our understanding goes at this point. There is more to the whole story - low/high tide changes every 12h rather than 24h - this means there is a "counter bulge" on the opposite side. That one is due to the system having a certain springiness in it, and it is just oscillation in sync with the motion of the Moon (similar to what happens when marching on a bridge or wind blowing over structures - ah, yes, resonant frequency is the term). As for water moving away from the Earth - there is spinning but there is also pressure. Water is not hugely compressible but it does compress somewhat, and there are enormous pressures at depth. If we were to suddenly turn off gravity - all that ocean water would bounce off the surface of the Earth into space. The weight of the water is not the same at sea level, because a) sea level is not the same height over the surface of the Earth, and b) the surface of the Earth is not the same distance from the center of the Earth.
  8. I was thinking the same thing. How much does print speed influence all of this? My reasoning is that one can lower the fan speed if printing slowly and let the layers bond together better. The time spent on slower printing would be about the same as the above preparation and annealing (at least for small parts).
  9. I think that TS made a copy/paste error from their RC line of scopes. They even have a reflectivity graph on their website for the 8" RC model - dielectric coatings:
  10. There is no real answer to that question. I'm going to say - yes, that is rather normal under some circumstances. In any case, there are some other things that can be said - for example, the QHY183c is vastly oversampled here. Both images are processed / stretched more than the data allows, and both have very high background levels.
  11. That is a lot of hours and I'm sure the SNR of the resulting image must be high; however, you seem to have lost the "bridge", and the background looks too compressed / flat. The colors are also way too saturated in my opinion. Btw, the "bridge" that I'm talking about is the apparent brighter connection between M110 and M31, as seen in this image (crop) from the Wikipedia M31 page: Same region cropped from my M31 image - this part is clearly visible again, although the total exposure is nowhere near yours (16h total in LRGB - 4h each).
  12. Not sure why you decided to crop it like that? In any case, the data is rather good. There are some issues with flats, but I did not try to correct those. Here is one way it can be processed:
  13. That depends on the distance between the primary and secondary, and consequently on the focus position. Here is a quick scheme: A secondary in the position to the left will not reduce the effective aperture of the primary. A secondary in the position to the right will. If you want to know whether your primary will be stopped down - just measure your focus position and the diameter of the tube. For example, say you have a 245mm primary, a 255mm tube diameter, and the focal plane at about 100mm above the tube. This means the distance from the focal plane to the center of the secondary is 100 + 255/2 = 227.5mm. You have an F/6.3 scope, so the size of the beam at the secondary is 227.5 / 6.3 = ~36.1mm. In this case, using a 46mm secondary would not stop down your aperture, but it would reduce the fully illuminated field in comparison to a 50mm one. Of course, for your exact case you would need to measure the distance between the secondary center and the focal plane, as the above is just an example.
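The arithmetic above is easy to reuse for other tubes. A small sketch, assuming as in the example that the secondary sits on the tube axis (half a tube diameter below the focuser base):

```python
def beam_diameter_at_secondary(focus_height_mm, tube_diameter_mm, f_ratio):
    """Diameter of the converging light cone where it crosses the
    secondary, assumed to sit on the tube axis (half a tube diameter
    below the focuser base)."""
    secondary_to_focus = focus_height_mm + tube_diameter_mm / 2.0
    return secondary_to_focus / f_ratio

# Worked example from the post: 255 mm tube, focus 100 mm above it, F/6.3
beam = beam_diameter_at_secondary(100, 255, 6.3)
print(f"Beam at secondary: {beam:.1f} mm")  # ~36.1 mm
```

If the result is smaller than your secondary's minor axis, the primary is not stopped down.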
  14. Hi @c3dr1c What I'm seeing is just regular SkyWatcher sample-to-sample variation and periodic error in line with most mounts of that class. The periodic error seems to be around 20" or so P2P. As I see it - you can return the mount and try another one in that class, but there is no guarantee that you won't get similar issues (some mounts are fine, and some are like the one you have now). Another option is to tune the mount yourself. Of course, there is a third option - stretch your budget and get a better mount. I also had issues with my HEQ5 mount. Like you, I had a very bad spike in RA that repeated regularly, and it had more than 5" amplitude. It was due to a broken housing on one of the bearings: So I stripped down my mount, replaced all the bearings, replaced the grease and tightened everything properly. I also did a belt mod (you don't have to do that on your mount since you already have a belt transmission). Here is an example of a PE analysis that I did at some point: You will see that I had something like 46" P2P periodic error prior to the belt mod.
  15. Indeed, sorry for the out-of-place mention - I corrected a wrong autocomplete.
  16. I would say that over 2"/px is still a sensible value, and over 4"/px as well - it really depends on what you are doing. You can't shoot M31 with a regular camera and a single panel if you don't go to 2-3"/px. M31 is over 3 degrees long; that is 3 x 60 x 60 = 10800 arc seconds. If your camera has something like 3000 x 2000 px, you are bound to use ~3"/px or coarser to get it in frame. One can always do a 3x3 panel over M31 at 1"/px, but that is going to take some time to complete.
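The framing arithmetic in a couple of lines - a 3000 x 2000 sensor is an assumed round figure, not a specific camera:

```python
M31_DEG = 3.0            # apparent length of M31, roughly
SENSOR_LONG_PX = 3000    # long side of a hypothetical 3000 x 2000 sensor

target_arcsec = M31_DEG * 3600.0            # 3 x 60 x 60 = 10800"
min_scale = target_arcsec / SENSOR_LONG_PX  # coarsest-fit pixel scale
print(f'Need about {min_scale:.1f}"/px to span M31 on the long side')
```

The exact figure comes out at 3.6"/px for the full 3 degree extent - in the same ballpark as the round ~3"/px above, and well coarser than the usual 1-2"/px advice.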
  17. I'm with @AstroTim on this one. An SCT has a focal length that changes with focus position - the "further out" you focus (on a regular telescope that would be racking the focuser further out, but in the case of an SCT it means the two mirrors move closer together), the more the focal length is increased. It is not uncommon for an F/10 SCT to operate at F/12 or more. This is particularly pronounced when using 2" accessories (longer light path) and when using focal reducers. Focal reducers shift the focal point towards the scope - this means that one needs to push it "further out" to reach focus in a standard configuration (diagonal + eyepiece), again extending the focal length of the scope. Take a look here:
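The direction of the effect can be illustrated with the thin-element two-mirror formula 1/F = 1/f1 + 1/f2 - d/(f1*f2). The numbers below are purely illustrative (a 400 mm concave primary and a -100 mm convex secondary), not a real SCT prescription:

```python
def combined_focal_length(f1_mm, f2_mm, d_mm):
    """Effective focal length of two optical elements separated by d
    (thin-element combination formula): F = f1*f2 / (f1 + f2 - d)."""
    return f1_mm * f2_mm / (f1_mm + f2_mm - d_mm)

# Illustrative values only: concave primary +400 mm, convex secondary -100 mm.
for d in (322, 320, 318):  # decreasing separation = mirrors moved closer
    print(d, round(combined_focal_length(400, -100, d)))
```

Each 2 mm decrease in separation lengthens the effective focal length noticeably - which is why the same SCT can run anywhere from F/10 to F/12+ depending on where focus lands.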
  18. I'm not sure what the FOV of the electronic viewfinder is? It also means that the above 1.04M-dot display does not have 500ppi, as each R, G and B dot is counted separately.
  19. I found this interesting comparison online: In theory, with a lucky imaging technique and the right processing style, even a 4" scope should start to resolve this target. There are 4 nice stars around this target that can be used to judge how good a particular frame is (by the FWHM value of each).
  20. Indeed it does: I doubt it uses the above system; it probably uses an eyepiece and a small display, something like an inch or two. It really needs to have high DPI in order to show a good image; however, I believe that the limiting factor in the eVscope is not the display. It uses an IMX224 sensor (same as the ASI224), which has 1280 x 1024 pixels. This means that regardless of the display used, the resolution of the view will be about the same as proposed above - ~1000 px per 50 degrees, or a pixel size of about 3 arc minutes. It also has the ability to save the image and show it on a tablet - this will also be available in the "open source" version. If you are referring to the LCD displays on modern DSLR/mirrorless type cameras - I'm not sure those have very high DPI? Do you have any example? A quick search for Canon M6 screen specs returned this: It has a 3:2 screen, so 1250 x 832 or something like that. The diagonal is ~1501 pixels and it is 3", so about 500 ppi. That is a bit higher, but not much higher than the 400ppi of a display that is both cheap and already adapted for use on an RPI. If we do a search on high PPI devices we get this: That is double the resolution of the above mentioned screen. Ideally we would want something like 1200ppi, ready to be used on a Raspberry Pi and not very big in size - about 4-5" in diagonal. Not sure we can find such a device, but I think for the time being 400ppi will suffice. Really, in a recent test, if one was not looking for pixels they were not obvious, and they were only seen when the graphics emphasized them - sharp edges and saturated colors. In a nice astronomical image with smooth transitions, I doubt they will be seen. Another alternative would be to use a narrower AFOV eyepiece - like 30 degrees - but I guess that would be rather unpleasant.
  21. Just a quick update - I had a very rewarding quick session on M31 a moment ago. I opened an image of M31 from the wiki page - a very nice rendition with Ha combined - placed my phone on the floor and held the finder scope against the desk for added stability. Some notes: - when stable, the image is really sharp; there is some pincushion distortion that is evident if straight lines enter the FOV (the edge of the image for example) - about 70cm is enough to match the phone width (portrait mode) to the eyepiece FOV (my desk is about 70cm from the floor and I held the finder scope pressed against the desk), and about 5cm or so was the distance of the eyepiece from the back of the finder scope - I was rather hard pressed to tell single pixels apart on the astronomical image. On the phone UI it took some effort but they could be seen - I guess this is because of the graphics used. At a bit more zoom (finder lowered about 10cm below the desk), pixels started to become obvious even in the astro image. - A heavily processed image looks rather artificial through the eyepiece. It really needs just basic processing to look pleasing - finding the noise floor, from that finding max brightness (either dictated by the noise floor distribution or by the max brightness of the target) and then applying a gamma of 2.4 for a natural look.
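The basic processing described in the last note can be sketched in a few lines of numpy. The noise-floor estimate here (median + 2 sigma) is just an assumed stand-in for whatever statistic one prefers, not the exact method used in the post:

```python
import numpy as np

def display_stretch(img, gamma=2.4):
    """Subtract an estimated noise floor, normalise to peak brightness,
    then apply a gamma curve for a natural on-screen look.
    The floor estimate (median + 2 sigma) is an assumption."""
    floor = np.median(img) + 2.0 * np.std(img)
    out = np.clip(img.astype(float) - floor, 0.0, None)
    peak = out.max()
    if peak > 0:
        out /= peak
    return out ** (1.0 / gamma)

# Synthetic frame: flat sky background plus a bright diagonal of "stars"
rng = np.random.default_rng(0)
frame = rng.normal(100.0, 5.0, (64, 64)) + 1000.0 * np.eye(64)
stretched = display_stretch(frame)
print(stretched.min(), stretched.max())  # values end up in [0, 1]
```

Anything below the estimated floor goes to black, and the gamma of 2.4 lifts the faint end for a more natural look on screen.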
  22. I'm not sure we are talking about the same thing, so I'll expand a bit on my idea in case I was not clear in the initial post. Most people familiar with the eVscope know that this stacking platform uses a display in front of an eyepiece to show stacking results - thus creating the "sensation" of real observation through a telescope with enhanced capabilities. For those who want to try to recreate this concept - it is really easy to do so. Take an eyepiece - I used a 32mm GSO plossl - and unscrew the 1.25" barrel from it, exposing the field stop. Take your smartphone and turn it on to show some interesting picture. If your eyepiece does not have an exposed field lens, you can carefully place the field stop onto the phone screen. This should not scratch either the field lens or the phone surface. If you look through the eyepiece, you will see the image on the phone quite enlarged. You can do this with a computer screen as well. This works because an eyepiece is supposed to show us the image formed at the focal plane of the telescope. It is an actual image - if one placed a piece of paper there, the image would show on it; it would be rather tiny, but it would be in focus. The same thing happens with a sensor - the image is rendered on the surface of the sensor, and this is why the sensor can record it. Since an eyepiece can magnify an image placed at its focal plane (the field stop should be at the focal plane of the eyepiece), if we place a computer or phone screen there, the eyepiece will show that image as well - rather nicely, in focus and all. Since a 32mm plossl has a field stop diameter of about 27mm, the image that we see will be whatever is on the phone screen within those 27mm. This is a bit of a problem for a nice image, as you will see - phone pixels are not sufficiently dense to render a nice image without pixelation showing. I'll explain why. The first thing to understand is that humans resolve down to about one arc minute. If we have a line pair separated by a one arc minute gap, most of us should be able to tell that there are two lines there.
This also means that we will see individual pixels / pixelation if pixels are significantly bigger than one arc minute. But how big are phone pixels? You can calculate that, but we can also use an internet search. My particular phone has around 420ppi (pixels per inch), I believe. It has a 5.5" diagonal and 1920 x 1080 resolution. The diagonal of the screen will contain sqrt(1920^2 + 1080^2) = ~2203 pixels. This divided by 5.5 inches gives ~400ppi. I was a bit off, but close enough. We will use the 400ppi figure as I'm sure it is more accurate. Ok, so how big are the pixels when we are viewing with the eyepiece directly against the screen? Well, we have a 27mm field stop and we have 50 degrees of AFOV (or 52 degrees, depending on who you ask, but let's go with 50). 27mm is 1.063", and at 400ppi that gives around 425 pixels across the diameter. That is about 8.5 pixels across one degree of AFOV, or 8.5 pixels per 60 arc minutes. This makes each pixel 7 arc minutes big. No wonder we can easily see it. How do we make the pixels smaller? Well, there is a very simple way of doing it, and it involves the contraption from this image: Yes, it is a simple lens (or in our case we won't be using a simple lens - we will be using an achromatic doublet). Note that we can make a larger object look smaller by using a lens. We can also make a small object look larger - it just depends on where we place the object and where we want the image to form. But what will we achieve by using a lens? Well, in my case - and by the way, the same exact IPS panel is available for purchase online for about $120 to be used with a Raspberry Pi, so one does not need to do this with their phone - having a 5.5" screen means that we can have more than 27mm used to show the image. The height of the phone in landscape mode is 2.7" (1080px / 400ppi). In millimeters that is 68.58mm. We want to squeeze 68.58mm into 27mm, so we need a magnification (or rather minification?) factor of about x2.54.
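The ppi-to-arc-minute arithmetic above can be bundled into a few lines (same numbers as in the text):

```python
import math

def pixel_arcmin(ppi, field_stop_mm=27.0, afov_deg=50.0):
    """Apparent angular size of one screen pixel when the screen sits
    directly at the eyepiece field stop."""
    px_across_stop = ppi * (field_stop_mm / 25.4)  # pixels across the stop
    return afov_deg * 60.0 / px_across_stop        # arc minutes per pixel

# 1920 x 1080 panel with a 5.5" diagonal:
ppi = math.hypot(1920, 1080) / 5.5   # ~2203 px diagonal / 5.5" -> ~400 ppi
print(f"{ppi:.0f} ppi, {pixel_arcmin(ppi):.1f} arcmin per pixel")
```

At ~400ppi each pixel subtends about 7 arc minutes - several times coarser than the eye's ~1 arc minute limit, hence the obvious pixelation.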
With proper spacing of the display screen and a 50mm achromatic doublet from a finder scope (I just used a finder scope as it was easy to try, it kind of works well, and it is cheap - I bet almost anyone doing EEVA will have one in a drawer), we can have 1000 pixels in 50 degrees (1080 to be precise, but I think we can safely assume that at least 1000 will be visible - it would take very precise placement to have all 1080 fit in the FOV without gaps at top and bottom). Now things look a bit better. This is 20 pixels per degree, or each pixel is only 3 arc minutes. Still not what we need, but I don't think we could find a higher density display that easily. This btw makes me wonder what sort of display was used in the original eVscope? When I wrote that I tried all of this - I actually took the 32mm Plossl, my phone and the 50mm finder scope and was able to fit almost the full height (in landscape mode) of my phone into the FOV. Just remember to remove the finder scope eyepiece (mine just unscrews easily). An actual diagram of this telescope would look something like this: If I were to build this rig, I would also include the following as EEVA equipment: 1. ASI183mc/mm (or other vendor) - I would probably go for the cooled version, but one could also use the regular non-cooled version to save a few bob. 2. 102mm F/7 ED refractor (~4kg weight) 3. x0.6 Long Perng FF/FR 4. AZ-GTi in EQ mode. This should provide a rather decent FOV and also plenty of different resolutions / magnifications. At the lowest magnification it could almost fit the whole Pleiades into the FOV (the actual FOV is a bit larger, but astronomy.tools does not have a 0.6 focal reducer, and we should really only observe the circle that can be fitted inside this rectangle). While at the highest magnification it would have a FOV like this - for small galaxies: So how do I figure this? The ASI183 has 3672px in height. If we do a ROI at native resolution and take only 1/4 of those pixels, that is 918px, and stretching that to our 1000px display gives that sort of magnification.
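The relay-lens arithmetic, under the same assumptions as the text (400ppi panel, 27mm field stop, ~1000 of the 1080 rows visible in 50 degrees of AFOV):

```python
SCREEN_H_PX = 1080     # panel rows in landscape orientation
PPI = 400.0            # assumed panel density
FIELD_STOP_MM = 27.0   # 32mm plossl field stop
AFOV_DEG = 50.0

screen_h_mm = SCREEN_H_PX / PPI * 25.4        # 2.7" -> 68.58 mm
minification = screen_h_mm / FIELD_STOP_MM    # how much the relay must shrink it
visible_px = 1000                             # conservative, of the 1080 rows
arcmin_per_px = AFOV_DEG * 60.0 / visible_px  # apparent pixel size after relay

print(round(minification, 2))  # ~2.54x reduction needed
print(arcmin_per_px)           # 3.0 arc minutes per pixel
```

So the relay buys a bit over a factor of two in pixel density - from ~7 arc minutes per pixel down to ~3.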
This is "native" resolution. But we might want to use a 1/2 ROI instead - that will mean using super pixel mode instead of interpolating, and we will "squeeze" 1836 camera pixels onto 1000 display pixels for this sort of FOV: In the end, we can use super pixel mode and bin that x2, so we squeeze the whole 3672px of the camera onto 1000px of display for this sort of FOV: So as you see - you not only get an enhanced scope, you also get 3 eyepieces with 3 different magnifications to go with it. BTW @nfotis, I hope this explains things a bit better - and as you see, it has nothing to do with planetary observation, although we could use the same approach to do lucky imaging and automatic sharpening? Not sure how one would develop the sharpening part to be automatic, though.
  23. It was quite long ago, and if I remember correctly, I think I did each one separately - until I got the best looking image. Btw, I've managed to find the theoretical value since, and it shows that I was very close with this simulation and measurement. This is the expression for cutoff frequency, found here: https://en.wikipedia.org/wiki/Spatial_cutoff_frequency Lambda is expressed in millimeters and F# is the telescope F/number. Since airy disk size is given as: or X = 1.22 * lambda * F/number, we can see that the airy disk radius relates to the cutoff period by a factor of 1.22, to the critical sampling spacing by a factor of 2.44, and the airy disk diameter relates to the sampling spacing by a factor of 4.88 - this is very close to my measured value of 4.8.
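Those factor-of relations can be checked directly. The cutoff frequency is 1/(lambda * F#) with lambda in mm, and the critical (Nyquist) pixel spacing is half the cutoff period:

```python
def diffraction_numbers(lam_nm, f_number):
    """Cutoff frequency, critical pixel spacing and airy radius for a
    perfect circular aperture."""
    lam_mm = lam_nm * 1e-6
    cutoff = 1.0 / (lam_mm * f_number)      # cycles per mm
    nyquist_spacing = 1.0 / (2.0 * cutoff)  # mm per pixel, critical sampling
    airy_radius = 1.22 * lam_mm * f_number  # mm
    return cutoff, nyquist_spacing, airy_radius

cutoff, spacing, radius = diffraction_numbers(550, 10)
print(round(radius * cutoff, 2))       # airy radius vs cutoff period: 1.22
print(round(radius / spacing, 2))      # airy radius vs Nyquist spacing: 2.44
print(round(2 * radius / spacing, 2))  # airy diameter vs Nyquist spacing: 4.88
```

The ratios are independent of wavelength and F/number, so any input values give the same 1.22 / 2.44 / 4.88 factors.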
  24. Both scopes are operating at F/4.7 - F/4.8, right? They should give roughly the same brightness. Are you using a binoviewer in the 24" and not in the 15"? Viewing something with both eyes can make it appear brighter, although each eye gets only 50% of the light.
  25. This reminds me of another "well established truth" - that the F/ratio of a scope determines how fast one will acquire an image. Both lack a crucial piece of information. In this case it is the magnification used. 4" of aperture can create the same surface brightness of the projection on the retina as 8" - by using suitable magnification (in the F/ratio case, the missing piece is the sampling rate, or pixel size).
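The magnification point reduces to the exit pupil: the surface brightness of an extended object depends only on the exit pupil diameter, so equal exit pupils mean equal brightness per unit of retinal area. A tiny sketch:

```python
def exit_pupil_mm(aperture_mm, magnification):
    """Exit pupil = aperture / magnification; it sets the surface
    brightness an extended target presents to the eye."""
    return aperture_mm / magnification

small = exit_pupil_mm(100, 50)    # 4" scope at 50x
large = exit_pupil_mm(200, 100)   # 8" scope at 100x
print(small, large)               # same exit pupil -> same surface brightness
```

Both come out at 2mm, so the 4" at 50x and the 8" at 100x put the same flux per retinal area - the 8" simply shows the object at twice the scale.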