Everything posted by vlaiv

  1. On math, experience and a comment by Emil Kraaikamp, author of the AutoStakkert series of stacking software. Math suggests what movie length will not suffer from motion blur smear due to planet rotation. This depends on sampling rate (which depends on scope aperture). It is a really simple calculation. I've done some captures of planets and always used 3-4 minutes (or even longer) of captures without derotation, only with AS! software. I've read a comment from Emil Kraaikamp that AS!2/3 is capable of handling small rotation of the planet - precisely for the reasons I outlined - because it uses alignment points to deal with seeing induced tilt that stretches and distorts the image. Such distortion can move an alignment point up to 1-2 arc seconds from its original position on successive frames and AS!3 can deal with that without any issues - I don't see why it would not be able to deal with a similar shift due to rotation. I made the connection that if 0.5"/px can use 5-6 minute videos on Jupiter - 0.22"/px can certainly use half of that - 3 minutes, as rotation blur grows linearly with time (for the small planet rotation angles that we discuss here). 0.22"/px is roughly half of 0.5" and 3 minutes is roughly half of 5-6 minutes - makes sense? No, I'm not looking at it from a deepsky / long exposure point of view. I know precisely what I'm talking about and why. Not sure who Christopher is or why I should be telling him that he's been doing something wrong for the past 20 years? If you are implying that there is a person that is a proven planetary imager and that this person said that 30 seconds is the limit or something like that, I'd like to remind you that appeal to authority is a form of fallacy: https://en.wikipedia.org/wiki/Argument_from_authority Examine the arguments presented by this individual and the ones presented above and decide based on those.
  2. Diffraction of light in combination with aperture acts as a low pass filter on the image formed by the telescope. A single point of light is "distorted" into an Airy pattern - which consists of a disk and rings surrounding it. That is what is called the PSF of the telescope aperture (point spread function). It has an interesting property - it completely removes higher frequency components of the image (think of it as finer detail) - and for that reason we say that a telescope can't resolve things beyond a certain size. This is commonly referred to as the ability to split tight double stars in visual, for example. This highest frequency is known as the spatial cutoff frequency for an optical system: https://en.wikipedia.org/wiki/Spatial_cutoff_frequency This means that the image at the focal plane of the telescope simply does not contain higher frequency components and is band limited. We can now apply the Nyquist-Shannon sampling theorem https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem which says that in order to capture a band limited signal you need to sample it at twice the highest frequency component present in that signal. The size of a pixel dictates the sampling rate and also - arc seconds per pixel depends on focal length and pixel size. When we combine all of these - we get the above result. I use green light (~500nm) although in principle you should use 400nm blue as the shortest wavelength, but that is the most blurred part of the spectrum and very difficult to recover, so I choose 500nm as the reference wavelength. To give you the exact formulae used:
     f_ratio = 2 * pixel_size / wavelength (pixel size and wavelength in same units)
     arcsec_per_pixel = 206.3 * pixel_size / focal_length (pixel size in µm and focal length in mm)
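The two formulae above can be checked with a short script (function names and the example pixel/focal-length values here are mine, just for illustration):

```python
def critical_f_ratio(pixel_size_um, wavelength_um=0.5):
    """Critical-sampling focal ratio: f = 2 * pixel_size / wavelength.
    Default wavelength is the 500 nm green reference from the post."""
    return 2 * pixel_size_um / wavelength_um

def arcsec_per_pixel(pixel_size_um, focal_length_mm):
    """Sampling rate: 206.3 * pixel_size / focal_length."""
    return 206.3 * pixel_size_um / focal_length_mm

# Hypothetical example: 2.9 um pixels on a 2350 mm focal length scope
print(critical_f_ratio(2.9))        # -> 11.6 (critical f-ratio)
print(arcsec_per_pixel(2.9, 2350))  # -> ~0.255 "/px
```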
  3. Why do you think that? Let's do a bit of math to see, shall we? Jupiter's circumference is about 440,000Km, and it rotates in 9h and 56m - or 35,760s - from that it follows that the point on the equator that rotates the fastest moves at a speed of 440,000Km / 35,760s = ~12.3km/s. A 9.25" scope, being ~235mm, has a critical sampling rate of 0.22"/px. A single pixel covers 0.22". Distance to Jupiter is about 600 million kilometers now (or 588 at closest point, but let's go with 600 - nice round number). At that distance 0.22" is equal to 640Km of length. At the above speed, it takes 52s for the planet to rotate by a single pixel at the fastest moving point (center of the disk). If we had perfect conditions without atmosphere, any motion blur would be less than a single pixel (thus not really visible) even if we did not use alignment points in this time. Software used for stacking uses alignment points, and these alignment points have no trouble aligning features that are even 1-2 arc seconds displaced between subs due to seeing wavefront disturbance (tilt component). This alone can easily compensate for a multi minute video - there is no need to derotate even with videos lasting 5-6 minutes (where the fastest moving point would move by about 1.3" at most). To be conservative I recommended half of that, but I happily image for 5 minutes with smaller scopes (0.5"/px resolution for example).
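The arithmetic in this post can be reproduced in a few lines; all input values are the approximate figures quoted above:

```python
import math

circumference_km = 440_000           # Jupiter's equatorial circumference, approx.
rotation_s = 9 * 3600 + 56 * 60      # 9h 56m = 35,760 s
speed_km_s = circumference_km / rotation_s   # ~12.3 km/s at the equator

distance_km = 600e6                  # Earth-Jupiter distance, rounded
sampling_arcsec = 0.22               # critical sampling for a ~235 mm aperture

# Length subtended by one 0.22" pixel at Jupiter's distance:
pixel_km = distance_km * math.radians(sampling_arcsec / 3600)  # ~640 km

# Time for the fastest-moving point to drift by one pixel:
seconds_per_pixel = pixel_km / speed_km_s   # ~52 s
print(round(speed_km_s, 1), round(pixel_km), round(seconds_per_pixel))
```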
  4. Darks also contain bias signal that you should remove, even if dark current is very low and there is no significant amp glow.
  5. No. I personally don't understand what Photometric Colour Calibration does - and according to my understanding it does not produce correct color, but that is beside the point. Planck's law is given by this formula: B(ν, T) = (2hν³ / c²) · 1 / (e^(hν / kB·T) − 1). And it only uses one variable - T or temperature (the other is frequency, and we are interested in the range between ~400 and 700 nm) - all other things are constants - e : base of the natural logarithm, h : Planck's constant, kB : Boltzmann's constant and c : speed of light. It produces a distinct spectrum for each temperature. The other unknown is integration time (a multiplicative constant). In principle you only need to measure the graph in two points to be able to solve it for temperature. Once you have the temperature - you have the color of the star.
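A sketch of the two-point argument: sample Planck's law at two wavelengths, take the ratio (the unknown integration-time constant cancels), and solve for T. The wavelengths and bisection bracket below are my own illustrative choices:

```python
import math

h = 6.62607015e-34   # Planck constant (J s)
c = 2.99792458e8     # speed of light (m/s)
kB = 1.380649e-23    # Boltzmann constant (J/K)

def planck(wavelength_m, T):
    """Black body spectral radiance (wavelength form of Planck's law)."""
    return (2 * h * c**2 / wavelength_m**5) / math.expm1(h * c / (wavelength_m * kB * T))

def temperature_from_ratio(target_ratio, lam1=450e-9, lam2=650e-9,
                           lo=2000.0, hi=20000.0):
    """Recover T from the ratio of two measured points by bisection.
    The blue/red ratio grows monotonically with temperature."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if planck(lam1, mid) / planck(lam2, mid) < target_ratio:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Simulate "measuring" a Sun-like star at two points, then solve for T:
measured = planck(450e-9, 5800) / planck(650e-9, 5800)
print(round(temperature_from_ratio(measured)))  # -> 5800
```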
  6. Any particular reason for skipping the darks? Luckily, you have a set-point cooled camera, so you can create darks and apply them later on. Maybe give it a go and see if it improves your result.
  7. I'd say that is an excellent option. You might want to consider purchasing a few more eyepieces to go with it, as it only comes with one eyepiece. I'd recommend getting a 32mm plossl and maybe two more - like a 16-17mm one and an 8-9mm one. 32mm - wide field eyepiece, 25mm - lower power DSO eyepiece, ~16-17mm - general DSO eyepiece, <10mm - planetary eyepieces (don't go below say 5-6mm as you won't be able to use those most of the time due to seeing). Some people recommend getting a barlow lens to effectively double your EP collection. I've personally stopped using them for visual - I prefer the "pure" view with eyepieces only. Plossl eyepieces will work fine https://www.firstlightoptics.com/astro-essentials-eyepieces/astro-essentials-super-plossl-eyepiece.html and so will BST Starguiders (depends on your budget, but you'll get a discount on BSTs at FLO: 10% for 2 or more and 15% for 4 or more): https://www.firstlightoptics.com/bst-starguider-eyepieces/bst-starguider-60-15mm-ed-eyepiece.html
  8. Forgot to say: If you want natural looking stars - then you'll have to do separate exposures without filter and use just star color over original exposure with filter or alternatively - do some really clever math to get star color from filtered data. Stars emit black body spectra and you really only need two points to solve for black body spectrum shape and get star color based on temperature.
  9. No filter can help with moon glow when imaging galaxies - they can only hurt, even if the background seems darker. That is because they block light from the target as well. What you can do is go for UHC / narrowband type of imaging. You say that you are interested in emission nebulae? Those should be your targets of choice when the moon is out. L-Pro is the worst for this as it passes most of the spectrum besides the emission lines. It is a good LPS filter when you are in an area dominated by non-LED artificial lighting - like low and high pressure sodium lights. L-Enhance would be my filter of choice. L-Extreme is too narrow. It completely removes the H-beta line and leaves only Ha and OIII. Don't worry about sub duration. If you can't go long - then go short. It is better to go short with a narrowband filter in adverse conditions than with no filter at all. In the end - avoid imaging when the target is very near the full moon. Maintain at least 30 degrees of separation between them.
  10. Don't really know if you are going to get any results since you have no experience in planetary imaging. The problem with said scope is that it does not track, unless you plan on taking it off the dob mount and putting it on a very large EQ mount. An alternative to that is to put it on an EQ platform. In principle, you should take your ASI120, put it in a x2.5 barlow (x2 will work fine as well), take video of about 3 minutes at the highest frame rate that you can achieve (that means using 640x480 ROI instead of the full 1280x1024 or whatever the resolution of the ASI120 is) with 5-6ms exposures and then stack it and process it. Set gain fairly high. It would be good to take calibration videos as well - maybe just a "dark" video if you can (which is a short video with the same settings as the capture video - except with the scope covered). The problem is tracking. If you don't have tracking - well, Jupiter will zoom through your FOV in a matter of seconds if you manage to get it aligned in the first place (you are trying to align things by hand with about one arc minute precision).
  11. You don't need a barlow, and set exposure to ~5ms rather than 12ms. Also, shoot 30000 frames (go for 3 minute videos; above I've seen that you imaged for only 30s - Duration=28.473s).
  12. No. On Mono+Filters - you untick checkbox on FITS files tab, and you don't care for the rest of the settings. Just remember to stack each color to its own stack with appropriate calibration files.
  13. No. Super pixel mode takes group of 2x2 pixels that comprises one bayer matrix element and forms single pixel with RGB values - taking R from R pixel of bayer element, B from B pixel and G from average of two G pixels. Resulting image has half vertical and horizontal resolution but there are no interpolated pixels (regular debayering interpolates missing pixel values).
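A minimal numpy sketch of the super pixel idea, assuming an RGGB layout (R top-left, B bottom-right in each 2x2 cell - your camera's pattern may differ):

```python
import numpy as np

def super_pixel(raw):
    """Collapse each 2x2 RGGB bayer cell into one RGB pixel:
    R as-is, G as the average of the two greens, B as-is.
    Output is half the width and height, with no interpolation."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    return np.dstack([r, g, b])

raw = np.arange(16).reshape(4, 4)   # toy 4x4 "sensor"
print(super_pixel(raw).shape)       # -> (2, 2, 3)
```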
  14. DSLR and astro cameras have different set of settings and you need to adjust settings for dedicated camera: Use Fits files tab when working with dedicated cameras (Raw files is for DSLR) and also check that you have OSC / color camera. Select bayer pattern - probably RGGB and debayering method below.
  15. I'm slightly confused - you say ES diagonal? Is that the 1.25" or 2" version? I can't seem to find a 1.25" version, and the 2" version will not add 50mm. The shortest 2" diagonals add about 100-110mm of optical path. 173 - 31 = 142mm - that is the longest optical path that will come to focus (racking out the focuser just shortens that). Let's say that a) the ES diagonal is 2" and has a minimum optical path of 100mm b) the Powermate and ES Extender need 0mm of optical path (they move the focus point out by the same amount as they are long, so they don't change anything) c) the Combo Quark needs 67mm. 100mm + 0mm + 67mm = 167mm, but you only have 142mm available (racking out the focuser just shortens what you have available). On the other hand: losing the diagonal would make this: 67mm + 63mm of focuser travel = 130mm, still less than 142mm - you would not be able to reach focus in that configuration and would need some sort of extension as @michael.h.f.wilkinson pointed out. Maybe the best course of action would be to lose the diagonal (at least for imaging) and use a 30mm 2" extension tube in this configuration: focuser -> click lock -> 2" 30mm extension -> telecentric lens -> quark combo -> camera. For visual - get a 1.25" diagonal instead and maybe ditch the click lock adapter and go with the stock one. Most 1.25" diagonals have 70-80mm of optical path and then you'll have: 80mm + 67mm = 147mm.
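The back-focus bookkeeping above reduces to simple sums; here it is spelled out with the numbers from the post:

```python
focus_distance_mm = 173 - 31   # 142 mm: longest optical path that comes to focus

# With a 2" diagonal (~100 mm) + telecentric (~0 mm) + Combo Quark (67 mm):
with_diagonal = 100 + 0 + 67
print(with_diagonal, with_diagonal <= focus_distance_mm)        # 167 False -> too much path

# Without the diagonal, even the full 63 mm of focuser travel falls short:
without_diagonal = 67 + 63
print(without_diagonal, without_diagonal >= focus_distance_mm)  # 130 False -> needs extension

# Adding a 30 mm extension tube brings focus within the travel range:
print(67 + 30 + 63 >= focus_distance_mm)                        # True
```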
  16. This is way too low. Planetary imaging is a balance between exposure length and freezing the seeing. You want your exposure length to be short enough to freeze the seeing - but not shorter than that, since signal strength depends on exposure length and we want each of our subs to have enough signal so that stacking can work and read noise is not the dominant factor. You can get good results with x20 longer exposures of about 5ms. That is x20 stronger signal. If there is saturation - then back off a bit on the gain.
  17. That is an extremely large Jupiter image. What scope are you using, and are you using a barlow to get to that scale or do you drizzle? In either case - you are losing SNR to gain "empty" resolution. The level of detail is suitable for an image of this size: Which is 20% in both height and width - or 1/25th of the original image by surface. Back to the original question: ASI290 has better QE in the visible part of the spectrum, and maybe a tad higher read noise. ASI462 is much better for IR imaging - at around 850nm it is much more sensitive (it is more sensitive in the IR part than in the visual). Pixel size is the same and so is the size of the sensor itself. What are your capture settings?
  18. Here is an interesting question - even if the temperature in a nebula is very high - does room temperature gas (air) produce more energy transfer than it? Temperature is a measure of the average kinetic energy of particles in a gas. 10000 molecules per cm^3 in a nebula will all have high kinetic energy so that their average energy is also high - high temperature, but in air at room temperature and 1 atmosphere of pressure we have roughly 25 x 10^18 molecules per cm^3. Out of these 25 x 10^18 molecules, according to the Maxwell–Boltzmann distribution - I think there are more than 10000 that have higher energy than the average energy of those 10000 molecules in the very hot interstellar gas / nebula?
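The question can be checked numerically with the Maxwell-Boltzmann energy distribution. The nebula temperature isn't given above, so the 10,000 K used below (a typical HII-region figure) is an assumption of mine; the script just counts how many air molecules per cm^3 exceed the nebula's average energy under that assumption:

```python
import math

def fraction_above(x):
    """Fraction of gas molecules with kinetic energy above x*kT,
    from the 3D Maxwell-Boltzmann energy distribution:
    P(E > x*kT) = erfc(sqrt(x)) + 2*sqrt(x/pi)*exp(-x)."""
    return math.erfc(math.sqrt(x)) + 2 * math.sqrt(x / math.pi) * math.exp(-x)

T_room, T_neb = 300.0, 10_000.0   # kelvin; nebula temperature is an assumption
n_air = 2.5e19                    # molecules per cm^3 at room conditions
x = 1.5 * T_neb / T_room          # nebula average energy (3/2 kT_neb) in units of kT_room

# Air molecules per cm^3 above the nebula's average energy:
print(n_air * fraction_above(x))
```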
  19. Been there, done that. Well, not actually that with the SkyMax, but the whole thing is very familiar to me. I had a 130mm Newtonian scope and I decided to flock it and clean the primary mirror. The primary mirror cell is held in place by three screws - and those are screwed into some nuts. Once you unscrew the screws - well, the nuts become loose. The fact that it is an open tube does help a bit, but not as much as you would think, as it is almost a meter long and not very wide. How on earth am I going to get a spanner in there to hold the nut while I put the screw back in? In the end - it was a DIY solution of a very long wooden handle with a spanner taped at one end.
  20. From what I understand, Powermates should not extend the focus point the way barlows do? Even Tele Vue on their Powermates page say:
  21. Ok, found it on the wiki - and it sort of makes sense, although I never thought about it. It turns out that the natural way the hand moves is for the thumb to push and the fingers to pull - and that makes the distinction. The awkwardness that I was feeling when cutting with my left hand was not because I'm right handed - but because it is natural to push with the thumb. Although, I just tried, and I would not mind using left handed scissors with my right hand if needed (it only took a bit of mental effort to adjust the pulling / pushing action).
  22. Still confused. I took scissors just now and used them with my right hand - they cut; used them with my left hand - they cut as well. The only difference was in how I applied slight pressure to achieve the shear action. With my right hand I "pushed" with the thumb and "pulled" with the other fingers, and with my left hand I "pulled" with the thumb and "pushed" with the other fingers. In either case - you'll apply slight force in such a way as to "bring the scissor blades together" and not apart with respect to the joint.
  23. That is a lot of elements and I think that you don't have enough inward focuser travel for all of that. Lose the diagonal as a first step and see if you can focus properly with the x4 Powermate.