Everything posted by vlaiv

  1. Maybe these two will help? https://www.makeuseof.com/tag/ubuntu-remote-desktop-builtin-vnc-compatible-dead-easy/ http://microdebug.com/2017/06/02/ubuntu-without-monitor/
  2. Measuring a telescope's optical figure is certainly within your reach. You only need a couple of things: a camera, a star (artificial or real) and software. Maybe a filter as well if you are measuring a refractor - best to use NB filters like Ha or OIII, or perhaps Baader Solar Continuum as it has a peak around 540nm - closest to the visual peak wavelength. Look up WinRoddier software / Roddier analysis for this and an explanation of how to do it.
  3. Not sure this should go into the "equation". We can then say - ok, let's throw in eyepieces, and let's also talk about how much sleep you had the night before. Did you eat your porridge? Are you warm and comfy, and all the other important questions. A scope will provide a certain level of performance at the focal plane regardless of eyepiece and observer, and two scopes will differ in performance at the focal plane. In that sense we talk about the ability to deliver a certain contrast. Whether that contrast will be exploited - well, that is another matter. Sticking the best EP in the world into a scope/conditions combination that can't deliver, even with an eagle eye, won't help much.
  4. I'm not sure that the order is correct, or that in fact there is a distinct order. Even the matter of contrast is a subtle one - at what frequencies is there loss of contrast? For example, baffling will create contrast issues uniformly across all frequencies, while other things impact particular frequencies. I'm also sure that the effects of central obstruction can be seen much sooner than at 25% by area (which is 50% by radius / diameter); the general consensus is that it is usually not noticeable below 20-25% by radius.
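As a quick check on the figures above - converting an obstruction fraction measured by area to the equivalent fraction by diameter is just a square root:

```python
import math

# 25% of the aperture area blocked corresponds to an obstruction
# whose diameter is 50% of the aperture diameter
area_fraction = 0.25
diameter_fraction = math.sqrt(area_fraction)
print(diameter_fraction)  # 0.5
```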
  5. Not sure that I follow the argument; here is a counter argument - air will be pushed in the direction of least pressure.
  6. There is absolutely no difference. This is an old question asked by computer enthusiasts who overclocked their computers; many tests were done and the results are the same - it does not matter. The consensus is that a push / pull configuration is the best, as it uses two fans - one to push ambient air into the radiator and another to pull it out - thus creating steady flow.
  7. By the way - I just thought of a very nice and easy "protocol" to test whether your filter is impacted by the speed of your system. I've been thinking extensively about this filter shift because I now have an F/1.4 lens that I'm planning to use wide open in some cases, and I was wondering how much impact there will be on an LPS filter (which is essentially a rather broad type of filter). This led to the idea of a test protocol that is even more useful for NB filters. One just needs a flat panel and a few pieces of cardboard - or even a single piece of cardboard. The test consists of taking flats with clear aperture and then taking another set with exactly the same parameters - but this time using an aperture mask to slow your system down to, say, F/8 or F/10. We then measure mean ADU in both sets of flats and look for unusual dimming in the fast system. For example, if we have a refractor (with reflectors, the central obstruction must be taken into account when doing the calculations) at, say, F/4 and we stop it down to F/8 - we would expect the mean ADU value at F/4 to be 4 times higher than the mean ADU value at F/8 - the ratio of clear aperture surfaces. If it is less than this - then you are losing some light at F/4 due to part of the light cone being shifted out of band. Doing multiple apertures and plotting a graph could lead to a rule for how a particular filter behaves with respect to light angle?
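The protocol above reduces to a few lines of arithmetic. This is just a sketch with made-up ADU values - the function names and numbers are mine, not from any tool:

```python
def expected_ratio(f_fast, f_slow):
    # At fixed focal length, clear aperture area scales as 1/F^2,
    # so a flat at F/4 should read (8/4)^2 = 4x the ADU of one at F/8
    return (f_slow / f_fast) ** 2

def band_loss_fraction(adu_fast, adu_slow, f_fast, f_slow):
    # Fraction of light apparently lost at the fast F-ratio
    # (a positive value suggests part of the light cone shifted out of band)
    return 1 - (adu_fast / adu_slow) / expected_ratio(f_fast, f_slow)

print(expected_ratio(4, 8))                   # 4.0
print(band_loss_fraction(30000, 8000, 4, 8))  # 0.0625 -> ~6% light loss at F/4
```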
  8. Yes, I do have the Skywatcher version of this scope and indeed it is a very fine lunar and planetary scope. Many people say that it is a narrow field scope, but I don't see it that way. I don't feel particularly "boxed in" with my 8" F/6 scope, and that has only 100mm less focal length than the Mak102 with its 1300mm. Put a 32mm plossl eyepiece in and you have a fairly decent field of view. The scope is very sharp in use. I'm using mine mainly on an AzGTI mount, but I did not purchase the bundled version as it does not have collimation screws. There is no image shift when focusing that I could detect (I was not specifically looking for it though), however there is slight backlash in the focuser knob. I guess this will vary from item to item. Other things that I noticed: - The exit pupil is small, so the sky is much darker than in other scopes of similar size. - In light pollution and around bright light sources, the scope loses some contrast. This is due to the baffling not being the best, and this scope really needs a dew shield in use - it is a known dew magnet, but a good, longish dew shield will also mitigate the baffling issues. It delivers what you might expect from 4" of aperture. In fact, I had it once in the field next to a Skywatcher ST102 F/5 wide field refractor and the images it gave were similarly bright, although much more magnified in the Maksutov. M13 looked better in the Mak due to the additional magnification - it almost started to resolve some of its members and the smudge was filling the eyepiece. I managed NGC7331 with peripheral vision from Bortle 4 skies with that scope. In the end I took some rather nice images with that scope - the Moon, Jupiter and Saturn. Hopefully, I'll add Mars to the "small scope collection" next.
  9. not to worry, I'm sure friendly moderator will be along quickly to sort the issue
  10. Even two of the same filter - one being slightly tilted - will produce a reduction in the combined FWHM.
  11. Ah yes, I see what you mean. A filter with 1% off-band leakage and 3nm FWHM will have similar performance to one with 0.1% off-band leakage and 7nm FWHM. Want to shoot in moonlight? Here is what I would suggest to everyone wanting to do that - use a double stack: put two of the same filter in the optical train. Solar Ha observers use double stacked filters to enhance contrast by removing off-band leakage. A similar thing can be done with regular narrowband filters. One will lose a few percent of the filter's transmission, but will eliminate almost all off-band light even if there is significant (a few percent) leakage.
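To put rough numbers on the double-stack suggestion - a sketch assuming a filter with 90% in-band transmission and 2% off-band leakage (made-up but plausible figures):

```python
in_band, off_band = 0.90, 0.02   # single filter: transmission in and out of band

# Two identical filters in series multiply their transmission curves
stacked_in = in_band ** 2        # 0.81 -> ~9 percentage points lost in band
stacked_off = off_band ** 2      # 0.0004 -> off-band leakage cut 50-fold
print(stacked_in, stacked_off)
```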
  12. OD can be clearly seen from the filter response graph - these are the same thing. The higher the response curve of a filter at a particular wavelength, the more light it transmits. Most of these narrowband filters have 90%+ efficiency at the interesting wavelengths. Do a few percent matter? We might argue that they do - in fact one should make every photon count - but here is a counter argument: on a given night, choice of target and position in the sky, as well as atmospheric conditions, can have and often do have a more significant effect than this. AOD (aerosol optical depth) is often in the range of 0.1-0.5. This adds approximately that many magnitudes of extinction (the actual number of magnitudes depends on wavelength; for 550nm it is 1.02 x AOD while for 510nm it is about 1.2 x AOD). Without even talking about position in the sky, AOD alone can account for 0.2 mags of difference on average. If we calculate the intensity ratio from that - 10^(-0.2 / 2.5) = ~0.83. There you go - nearly 20% difference in transparency due to aerosol optical depth alone.
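The intensity-ratio arithmetic from the AOD argument, spelled out:

```python
# Convert an extinction of 0.2 magnitudes (typical AOD contribution)
# into a flux ratio: roughly 17% of the light is lost to aerosols alone
aod_mags = 0.2
flux_ratio = 10 ** (-aod_mags / 2.5)
print(round(flux_ratio, 3))  # 0.832
```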
  13. I agree with you - in this case, as long as you don't have any issues with flats (or don't use them if you don't have any dust or vignetting in your system), since this camera has such low dark current, one can use bias instead of darks. I just wanted to show why that is the case here, and under which circumstances it holds. After all, we are not doing rigorous scientific work, and as long as there is no impact on the final result one can certainly choose one approach over another.
  14. I still maintain that it is best to take darks; however, Atik is right to recommend that darks are not really necessary for certain sensors. This does not mean that taking darks is the wrong approach or should not be done - it means that you are highly unlikely to notice the difference. Let me show you why that is. On this page https://www.atik-cameras.com/specification-tables/ you will find specs for different models of their cameras, and this is the important bit: I outlined the 460Ex model. It has a dark current of about 0.0004e/s/px. That is very, very low current, and if you look, the next few models also have very low dark current. These are Sony ICX series CCDs. But not all CCDs have that low a current, and that low current is at -10C. Someone working at 0C will not have that low a current. Just to understand how low that current is, let's do a single 5-minute exposure at -10C. This will add 300s * 0.0004 = 0.12e of dark current to your image. That is very low dark current indeed, and I've seen cameras that have more variation between the mean values of dark subs than that. However, in some cases it might not be negligible. Take for example a 30-minute exposure at 2C (and we assume the doubling temperature is 6C). In that case we would have 1800 * 0.0004 * 2 * 2 = 2.88e of dark current. Now it is not negligible any more. I agree that not taking darks and instead using bias can sometimes give the same results as taking darks - but I would only "allow" it when one knows what sort of impact to expect given the camera parameters. However, try doing that with a Kodak sensor - any one of the last three - to see what will happen. It is again an Atik camera - does that mean it is so good as to not need darks? Most certainly not with those dark current rates.
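The dark current arithmetic above generalizes into a small helper. The 0.0004 e/s/px base rate is the Atik 460Ex spec quoted above; the 6C doubling temperature is the assumption from the post:

```python
def dark_signal(exposure_s, temp_c, base_rate=0.0004, base_temp=-10.0, doubling_c=6.0):
    # Dark current roughly doubles for every `doubling_c` rise in temperature
    rate = base_rate * 2 ** ((temp_c - base_temp) / doubling_c)
    return exposure_s * rate  # accumulated electrons per pixel

print(dark_signal(300, -10))  # ~0.12 e - negligible
print(dark_signal(1800, 2))   # ~2.88 e - no longer negligible
```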
  15. Not sure that I get what you mean by this? That 100dpi corresponds to about 87cm viewing distance or that one needs 300dpi for reading distance and for this reason it is best to print in 300dpi as people tend to look at objects at "reading" distance when they want to inspect it in more detail?
  16. We have two things here - square / square root and sin. The square root of something is not only the positive value - it's both positive and negative (it has two roots). Then, there are some trigonometric identities - this means that you can take the small angle as the correct answer even if you've got a large answer from arcsin on your calculator. In fact it is +/- 5.74 degrees. Now if you want the F/ratio you need to use the tan of that angle. Tan(5.74 degrees) = 0.1005184... and the inverse of that is 9.948426789... However, this is a "half F/number", as the incident ray comes in at 5.74 degrees on one side and 5.74 degrees on the other - so the 9.948... above is the ratio of the radius of the aperture to the focal length, not the diameter of the aperture to the focal length. We need to divide that by two, so we get: 4.97421339... = ~F/4.975 = ~F/5. Makes sense? Adjacent is the focal length and Opposite is the radius of the aperture in our case.
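The tan arithmetic above, reproduced as a sketch:

```python
import math

half_angle_deg = 5.74  # incident ray angle on each side of the optical axis
focal_over_radius = 1 / math.tan(math.radians(half_angle_deg))  # ~9.948
f_number = focal_over_radius / 2  # focal length over full aperture diameter
print(round(f_number, 3))  # 4.974
```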
  17. I should be moving to a new house some time next year (it was planned for this year but covid-19 and all that prevented us) and will change my sky from SQM18.5 to SQM20.8 (red zone bordering on white, to green zone; or Bortle 8 to Bortle 4). I did some calculations and for most interesting targets the improvement will be around x6 - meaning the same SNR that now needs 6 hours of exposure can be achieved at that location in one hour. That is a serious change and tells you how important dark skies are. I won't be using filters at that location because I believe they would hurt my images more than help. You should use a UV/IR cut filter if you have a refractor, as there will be significant bloating in the UV and IR parts of the spectrum. With reflectors you don't need one for luminance in monochromatic imaging, but I would use one for OSC as it will help with color information (easier to calibrate color). My estimate is that the IDAS LPS P2 reduces sky brightness by about 1 magnitude here where I am now - this is just an estimate, not an actual measurement - so SQM 19.5 instead of SQM 18.5.
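For the curious - the sky brightness ratio implied by that SQM change. Under purely sky-noise-limited imaging this would be the upper bound on the speed-up; the quoted x6 is lower because target shot noise and read noise also contribute:

```python
sqm_old, sqm_new = 18.5, 20.8   # magnitudes per square arcsecond
sky_flux_ratio = 10 ** ((sqm_new - sqm_old) / 2.5)
print(round(sky_flux_ratio, 1))  # 8.3 -> sky is ~8x darker
```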
  18. This is important regardless of other factors - if you can dither, do it, no question about it. If you don't plan on using dark frame optimization or calibrating with darks of a different length, then don't bother with bias subs. If you are going to use them - just use the lowest exposure value allowed by the software. The only way to do proper calibration is to use darks. Maybe some people don't need darks, but then they don't need proper calibration. Simple as that. Use darks. Half to 2/3 of the histogram would be my choice. These are calibrated with flat darks. You can use a master bias if you have low dark current, a strong flat panel and no mechanical shutter - meaning short flat exposures. Using bias will introduce a very small error in your flats which in most cases won't show in your image. If it shows - use flat darks; or, if you want to do it properly from the start, use flat darks.
  19. Broadband filters are designed for that specific purpose - at least the LPS ones. It is all about SNR in astrophotography, and LPS filters improve the SNR of your image. Simple as that. Use of broadband filters is actually more questionable at lower levels of LP than at higher levels. This is because LPS filters remove signal from both the LP and the target. In order for an LPS filter to be successful it needs to remove much more LP than target signal, since we want to remove the noise associated with LP - which is the square root of the LP signal level. If LP is weak - not enough of it will be removed to offset the fact that we are also removing part of the signal from the target. Another important thing is that the LP signal needs to be "bunched up" in a certain part of the spectrum (multiple parts of the spectrum are fine, as long as it is bunched up). The above holds true for broadband targets like galaxies, clusters, reflection type nebulae and similar. Emission type nebulae fare much better with a narrowband type of filter - a UHC type filter for example. Here is an example of a light pollution spectrum: we can see that the bulk of light pollution here comes in the ~550nm to ~620nm range, and if we can remove that we will remove much of the LP signal. Here is the spectral response of the Hutech IDAS LPS P2 filter: notice the match between the above sample of light pollution spectrum and the blocking part between about 530nm and about 620nm. There are two small band passes in this range, and they are there to offset the color change that such a filter makes. Most galaxies and clusters are made of star light, and reflection type nebulae also reflect star light. Here is an example of stellar spectra: stellar spectra are rather "uniform" in the 400-700nm range - there is no major "bunching up" of signal, unlike light pollution. This means that the filter only takes out a part of the target signal that is proportional to the width of the blocking part in the filter signature.
So the name of the game with broadband filters is to remove the bulk of the LP signal while leaving most of the stellar spectrum intact. If the LP signal is weak, we won't remove enough of it to offset the target signal we also remove, and we end up lowering the SNR in our images. To see this, just examine the case where there is no LP at all: using the filter will not reduce noise from LP, since there is no LP signal, but it will reduce the target signal - and a lower target signal means lower SNR. That is one extreme. The other extreme is heavy LP concentrated in just a few lines, like the Sodium lines. Somewhere between the two there is a tipping point where use of a broadband LPS filter becomes feasible.
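The tipping-point argument can be illustrated with a toy shot-noise model. All numbers here are invented for illustration - a filter that keeps 80% of a broadband target while blocking 80% of the LP:

```python
import math

def snr(target_rate, sky_rate, t=3600.0):
    # Shot-noise-limited SNR after exposure time t (seconds)
    return target_rate * t / math.sqrt((target_rate + sky_rate) * t)

target = 10.0                  # target photon rate, e-/s (made up)
keep_target, keep_lp = 0.8, 0.2

# Heavy LP: the filter wins despite the target loss
print(snr(target, 200.0), snr(target * keep_target, 200.0 * keep_lp))
# No LP at all: the filter only throws away signal
print(snr(target, 0.0), snr(target * keep_target, 0.0))
```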
  20. What is not going to be useful? The filter, or imaging? Neither is correct, btw. The only difference between heavy LP and light LP is the amount of time you need to image. Given enough time, you can make a good image even in Bortle 8 skies. A filter only helps this - by reducing the time needed. As for your comparison - did you do SNR tests on any of these images? An image being washed out is really not a problem - one can always remove the DC offset created by LP. We can even remove gradients created by uneven LP. What is problematic is the shot noise associated with the LP signal. This impacts the final SNR, but the final SNR also depends on the number of stacked subs.
  21. https://www.firstlightoptics.com/optolong-filters/optolong-l-pro-light-pollution-broadband-filter.html if you want to stay with Optolong, or maybe one of these: https://sciencecenter.net/hutech/idas/lps/plots/ (hover mouse over table on the right to see different charts - filter / LP source combination). I personally use Hutech IDAS LPS P2 and it does make a difference in SQM18.5 skies.
  22. First thing - it's not field curvature, it's lens distortion. Field curvature is something completely different. In fact, what you are dealing with is perspective projection, and lens distortion is just how much the lens deviates from perspective projection. Telescopes are perspective projection systems - image on the right - and they maintain straight lines. Lens distortion is how much those lines bend (left image - look at the lines on the concrete). In any case, both are a bad thing with wide fields, and both cause problems when stacking dithered subs if the stacking software can't deal with them. Any software that just uses linear transforms (3d matrices) to align subs will fail to properly stack images that have either of the two above (or a lens that is in between). However, I know two pieces of software that should enable you to stack images with dithering - one designed to deal with the above cases, and one that handles them by chance (maybe it was designed to deal with distortion, but I don't believe it was designed to deal with these kinds of distortions specifically). https://www.astropixelprocessor.com/ has this as a feature - it can actually model your lens and even transform the image to a different projection type. You'll have to pay for a license but there is a 30 day trial. Maybe give that a go? The second is DSS. DSS does not list this as a feature and there is no special processing, but from what I've seen, DSS does not use linear transforms when aligning subs. I've sometimes seen people complain that their stack gets warped by DSS, and I believe this happens when DSS fails to properly identify stars and does some strange alignment of the image - one side of the image is properly aligned and the other side has stars mistaken for different stars, so it is bent by DSS trying to match star positions. Unfortunately I can't find an example now, but I think it's worth having a go with DSS to see if it will manage to stack your subs properly.
You can even do a "sim" - take a few subs, add lens distortion in software, and try stacking those subs to see if DSS handles it properly. Btw, while searching for an example of an image mangled by DSS, I came across another piece of software that can handle lens distortion: http://www.astrosurf.com/buil/iris-software.html IRIS is free and it looks like it can handle lens distortions and align subs properly. In any case - you should dither, and hopefully one of these suggestions will work well for you.
  23. My computer screen is 23" diagonal and has 96dpi. Mobile devices have a very high DPI count, and there is even a named scale of different pixel densities. Screens produce their own light, and that is quite different than the reflected light of a print - higher contrast can be obtained from a display device than from a print. In any case, we can see what sort of resolution the human eye supports. We resolve at best about 1 arc minute. Let's say that the minimum viewing distance is 50cm. How small should a single dot be in this case? At a viewing distance of 500mm (that is 50cm), a 1 arc minute feature equals 0.14544mm, or if we want to convert that to DPI: 25.4mm / 0.14544 = 174 DPI. For a viewing distance of 30cm - which is "reading" distance - we thus need ~290DPI (174 * 50 / 30). This is why we say we need at least 300dpi for printed material. Would 100dpi be sufficient? Well, yes, if you have something that can only be viewed at 873.19mm, or 87cm, away - but bear in mind that people want to get close to these sorts of images to better see the detail. Your best bet is to aim for reading distance - around 30cm - and go for at least 300dpi. How to get such an image, and what sort of FOV can we expect here? The key to creating a large-size image with a smaller sensor is making a mosaic. At best, one should go for 1"/px - higher than that will be oversampling. At 300dpi this translates into 5 arc minutes per inch. Want to fill an A3 print? You will need: 11.7" x 16.5" = 3510px x 4950px = 58.5' x 1° 22.5'. You have about 1.3 degrees x 1 degree - not a very large FOV.
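The viewing-distance arithmetic above wrapped in a small function (1 arc minute resolving power assumed, as in the post):

```python
import math

def required_dpi(viewing_distance_mm, resolving_arcmin=1.0):
    # Physical size of a feature subtending `resolving_arcmin` at that distance
    dot_mm = viewing_distance_mm * math.tan(math.radians(resolving_arcmin / 60))
    return 25.4 / dot_mm  # dots per inch needed so one dot matches that size

print(required_dpi(500))  # ~174.6 dpi at 50 cm
print(required_dpi(300))  # ~291 dpi at reading distance
```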
  24. I've just been looking around and it reminds me very much of this one: https://explorescientificusa.com/products/twilight-1-mount and from the little I've been reading on it, it does not look like a very solid piece of kit.
  25. I think that good start would be Theoretical Minimum by Leonard Susskind https://theoreticalminimum.com/home There are nice lecture videos on youtube channel and you can use website to navigate to particular lectures in the series.