Everything posted by vlaiv

  1. The distortion model is probably related to lens / wide field images. When we image, we map a spherical "surface" onto a flat one - or rather, angles onto distances on the sensor. The larger the angle (or the smaller the radius of the sphere compared to sensor size), the more distortion there will be in the image. When you try to align subs with a very large offset this can cause issues, as star distances will not be equal between stars at the center of the field and those at the edge. Maybe the easiest way to explain it is to observe the "north pole" and 4 points on the equator. You can arrange them so that the lines connecting them along the surface of the earth are of equal length and always at 90 degrees to the one at the north pole. Now try placing those 5 points in a plane with the same properties - you can't. Angles don't map ideally to distances in a plane, and you have distortion. Not sure you need to concern yourself with that unless you are using a wide field lens or fisheye lens or something like that. As for star detection - I would try adjusting sensitivity, peak response and upper limit. These settings seem related to what you need, but I have no clue what each does based solely on their names (try increasing sensitivity, lowering peak response, and probably leave upper limit alone - not sure if changing that will help).
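As a rough illustration of why angles don't map linearly to sensor distance, here is a small sketch of the standard rectilinear (gnomonic) mapping r = f·tan(θ). The focal length and angles are made-up numbers, just to show how the deviation from the linear small-angle approximation grows with field angle:

```python
import math

# Assumed focal length in mm (arbitrary for this illustration).
f = 50.0

def sensor_radius_mm(theta_deg, focal_mm=f):
    """Distance from image centre for a star at field angle theta
    under a rectilinear (gnomonic) projection: r = f * tan(theta)."""
    return focal_mm * math.tan(math.radians(theta_deg))

def relative_distortion(theta_deg, focal_mm=f):
    """How much the mapped distance exceeds the linear approximation r ~ f * theta."""
    linear = focal_mm * math.radians(theta_deg)
    return sensor_radius_mm(theta_deg, focal_mm) / linear - 1.0

for angle in (1, 10, 30):
    print(f"{angle:2d} deg: {relative_distortion(angle):.4%}")
```

At 1 degree the deviation is negligible, which is why narrow-field imaging can usually ignore distortion, while at 30 degrees (fisheye territory) it is over 10%.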
  2. Presuming the only difference between these two subs is the way the master flat was prepared (not quite clear to me from your post) - in one case you used proper flat-darks with the same exposure (and other parameters) as the flats, and for the other you used bias frames instead of flat-darks - then the only difference that can show up will be in the flat application. You won't find a significant difference in noise levels or star shapes or anything like that. The one that uses bias can have either over- or under-correction by the flats (not sure which off the top of my head), but that's not necessarily the case. It will depend on a few factors: how much light blockage there is in the first place (either from dust particles or vignetting), and what the difference is between the bias mean value and the dark-flat mean value (the larger the difference, the more chance there will be an issue with flat calibration). The issue might not even be visible (even if it is there) unless you stack and stretch very hard, so there might be a problem that does not show at a normal level of stretch and you can end up with a good looking image anyway.
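A toy model of the effect, with made-up ADU numbers: if the flats carry some dark current and you subtract only bias, that residual stays in the master flat and skews the estimated vignetting slightly:

```python
import numpy as np

# Assumed toy values (ADU) - not from any real camera.
bias_level = 100.0          # mean bias
flat_dark_current = 5.0     # dark signal accumulated during a flat exposure
vignette = np.array([1.0, 0.8])   # true transmission: centre vs corner

flat_signal = 10000.0 * vignette
raw_flat = flat_signal + bias_level + flat_dark_current

# Proper calibration: subtract a flat-dark of matching exposure.
good_flat = raw_flat - (bias_level + flat_dark_current)
# Shortcut: subtract bias only -> residual dark current stays in the flat.
biased_flat = raw_flat - bias_level

good_norm = good_flat / good_flat.max()
biased_norm = biased_flat / biased_flat.max()
print(good_norm, biased_norm)
```

The properly calibrated flat recovers the true 0.8 corner transmission; the bias-only flat estimates the corner slightly too bright, and the error grows with the bias vs flat-dark mean difference - consistent with the point above that a larger difference means more chance of a calibration issue.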
  3. Have no clue - quick search on the internet gave me this page: https://www.lightvortexastronomy.com/tutorial-pre-processing-calibrating-and-stacking-images-in-pixinsight.html#Section6 In section 6 it deals with registration / alignment of images and it shows that window so I assumed that is readily available option for image registration under PI.
  4. I can't seem to find relevant section in help file - maybe you can make a screen shot of that section opened to see what options are available?
  5. That is integration of already aligned images. From PI tutorial, there is this section: I'll check to see what options are listed in help to see if we can change something to aid detection process
  6. Is there any sort of threshold for star brightness in PI? DSS has that, and if you lower it - it will find more stars, but if you lower it too much it will start mistaking noisy pixels for stars and alignment will fail. I don't use PI so don't know exact setup of it, but maybe PI help files could provide a clue, or list of stacking options?
  7. I don't think noise is the issue here. I had something similar in different software, and it might be the case with PI as well. Are you trying to align the Ha frames on their own, or are you trying to add them to some other stack? The issue I was having, which may be related, is when software tries to match stars not only by coordinates and relative spacing but also by star brightness. Ha subs will have significantly lower ADU values in stars than regular subs. Maybe PI tries to match subs with star intensity taken into account and fails for that reason? On the other hand it could be the SNR of the Ha subs - how long are they in terms of exposure?
  8. This is just "redistribution" of the noise - the signal will stay the same, and the noise in general will remain the same over the image - it will just change its distribution. It depends on the algorithm used to rotate - the interpolation. Some interpolation techniques give better results and some worse with respect to this. I'll do another example for you here, comparing bilinear and a more advanced interpolation. Here is the "base" sub - nothing but pure gaussian noise: Here are two subs rotated by 2 degrees - one with bilinear interpolation, the other with cubic O-MOMS: These are the two rotated subs - no pattern is visible yet, as we have not stretched the subs to show it. Now I stretched in a particular way to emphasize this pattern - the left one is bilinear interpolation, and it shows the pattern clearly. The right one is cubic O-MOMS - the pattern is there, but to a much lesser extent. Every algorithm will produce it to some level, because you need to cut into higher frequencies when working with a limited sampling rate, but some algorithms handle this much better. If you use Lanczos-3 resampling, it should keep this effect to a minimum.
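The averaging mechanism behind the grid can be shown with a numpy-only toy model (a 1-D sub-pixel shift rather than a full rotation, which is an assumption for simplicity): bilinear interpolation at fractional shift t averages neighbouring pixels, so the noise standard deviation drops by a factor sqrt((1-t)^2 + t^2). In a rotated image the effective fraction t varies across the frame, giving the noise / less-noise pattern:

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.standard_normal(200_000)   # pure gaussian noise, std = 1

def bilinear_shift(x, t):
    """Shift a 1-D signal by a fraction t of a pixel via linear interpolation."""
    return (1.0 - t) * x[:-1] + t * x[1:]

# Noise std after the shift depends on the shift fraction:
# untouched at t = 0, smoothed down to sqrt(0.5) ~ 0.707 at t = 0.5.
stds = {t: bilinear_shift(noise, t).std() for t in (0.0, 0.25, 0.5)}
print(stds)
```

Regions of a rotated frame that land near half-pixel offsets get the most smoothing and those near whole-pixel offsets get none, which is exactly the noise / less-noise grid described above. Higher-order kernels (cubic O-MOMS, Lanczos-3) spread the averaging over more taps and keep the variance much more uniform.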
  9. This is a consequence of a small-angle rotation with noise being clearly visible. Rotating an image by a small angle in the presence of noise imprints this effect in the noise. It will not be imprinted in the signal of the image - you need a hard stretch to show it. It can be explained as aliasing of high frequency components, or as a consequence of averaging noise pixel values in some regions and not in others - it is the same thing. You end up with a noise / less noise / noise / less noise pattern which shows itself as a grid. I created a similar pattern in this thread: It depends on the resampling method used - bilinear resampling gives the worst results.
  10. It's a bit hard to tell who you are referring to in your post. Don't use "rank" but rather the screen name of the person you are referring to. It would also help, when you quote someone, to actually write why you are quoting them. For example, in your post above you quoted my topic without any text on your part - I have no idea if that was an accident or something else. If you want to mention someone here on SGL without quoting them, there is a simple mechanism for doing so - just put an "at sign" before their screen name, like: @woodsie They will get a notification that you mentioned them and can respond. As for returning the camera, maybe try a couple of things first and see if that sorts out the problem: 1. Try different capture software. There are a couple of free alternatives out there that use ASCOM drivers: NINA (Nighttime Imaging 'N' Astronomy), SIPS - Scientific Image Processing Software by Moravian Instruments (I was using that but had some similar issues with the ASI1600 - images written in the wrong format and broken), and Sequence Generator Lite, which is free (the free version of SGP); I'm sure there is other software as well. Not sure what you are currently using, but it's worth trying other software to see what results you get. 2. Try setting your offset properly in the ASCOM driver - that should solve the issues with stripes and could possibly deal with the strange part of the image as well. Set it to somewhere around 50-60. 3. Maybe try reinstalling the drivers or using another computer. The above issue might be related to the USB port on your computer (not likely, but worth a shot). If you try different settings and still can't manage to solve the problem - yes, contact Telescope Express and ask for an exchange or refund, whichever you prefer.
  11. Just don't use drizzle - there is no point in doing so. Drizzle as an algorithm only works when certain preconditions are met, and in practice no one with an amateur setup will meet them. There is simply no benefit to drizzling and it only "hurts" your data. To utilize the drizzle algorithm, one needs a predictable PSF, oversampling based on that PSF, and the means to point the scope with sub-pixel precision. It requires the guide system and imaging system to be connected in such a way that dithers issued by the guide subsystem result in exact pixel-fraction shifts of the imaging system. While this can in principle be done, no software support exists (that I'm aware of).
  12. Maybe just do a visual examination of the particular sub that is causing the error? The second error that shows up sometimes is completely unrelated - PI reports that it cannot successfully parse the FITS header keywords SITELAT / SITELONG, which should contain the latitude and longitude of your observing site (I'm guessing here, but it would not be hard to check those FITS keywords to see what they really represent). This could be because the acquisition software is not writing these fields in the proper format, or because PI can't interpret what is written according to the FITS standard for some reason (not properly implemented in PI, or that part of the specification not implemented at all). In any case, it should make absolutely no difference to the stacking result - that is just metadata you can do without when stacking subs. As for the issues with stars - there could be a number of reasons why they are not detected. It could be that there is too much guiding error and the stars are shaped like trails rather than circles, so the algorithm just does not recognize them; or maybe they are too large to be considered stars (depends on the sampling resolution and how large a "search area" is configured in PI when detecting stars); or any number of other reasons. The best thing to do is to first visually inspect said subs to see what the star profiles look like (round or not, etc.), and then we can think further about a good course of action.
  13. Not sure about that one. In the classical interpretation of gravity, a hollow spherically symmetric shell will have no gravitational field inside, as the gravitational influences of all the small pieces cancel each other out perfectly. I have no idea what the case would be in GR though.
  14. Actually that was my point - there is spin without reference point. If you were out there spinning in empty space without being able to see any reference point - you would still know that you are spinning by stretching sensation in your head and feet. If you were in elevator without windows and there was a pull towards floor - you would not be able to tell if you were suspended in gravity field or you were uniformly accelerated in space. With rotation - you would be able to tell straight away as nothing acts as "negative" gravity source centered in your belly.
  15. Good point, but here is a "counter argument". First, you don't need someone else spinning for you to spin - you can eject particles and set yourself spinning. Now imagine the accelerating expansion of the universe and the fact that things can disappear beyond the horizon. You shoot some photons from a flashlight attached to your head - directed perpendicular to your body, like in the light example. It sets you spinning and the photons shoot off. At some point those photons will become causally disconnected from you, once they cross the event horizon. Now you are left alone in space - spinning without a reference point. I know - not a really realistic scenario, but still...
  16. Oh dear. Not sure what happened here - the ASI1600 Pro camera has a 4656×3520 resolution. The sub you attached in your post is in fact 5496×3672 pixels. The FITS header also says it has 2.4um pixels, while the ASI1600 Pro has 3.8um pixels. The stats from the FITS header correspond to an ASI183 - did you attach the wrong sub by any chance, or maybe you made a mistake about the model of your camera? In any case the offset is wrong - there is clipping to the left, as the histogram of your dark sub shows: There is also something wrong with the sub regardless of all that - here is what it looks like stretched - almost half of it is much brighter than the rest. It should not look like that. It might be due to the capture application, but there might be a fault in the camera. What capture app did you use?
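The left-side clipping mechanism can be sketched with a toy simulation (all ADU numbers below are made up, not from any particular camera): with a too-low offset, read noise pushes a large fraction of dark-frame pixels below zero, and the ADC clips them all to 0, cutting off the left half of the histogram:

```python
import numpy as np

rng = np.random.default_rng(1)
offset_adu = 2.0          # deliberately far too low for this read noise
read_noise_adu = 5.0

# Simulated dark frame: gaussian read noise around the offset,
# then rounding and clipping as a unipolar ADC would do.
dark = rng.normal(offset_adu, read_noise_adu, size=1_000_000)
dark = np.clip(np.round(dark), 0, 65535)

clipped_fraction = np.mean(dark == 0)
print(f"pixels clipped to zero: {clipped_fraction:.1%}")
```

With a sensible offset (say 50-60 ADU, as suggested above for the ASCOM driver), the same noise would sit entirely above zero and nothing would be lost.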
  17. Could be due to low offset. What offset value are you using?
  18. I gave the above example to show a couple of things. The first is very "mundane" - in the same way that objects in a gravitational field follow curved trajectories, one's legs, following a curved trajectory while spinning, will experience a "force". Neither of the two "forces" needs force-carrying particles to be exchanged to give rise to the force - both are consequences of "curved space-time". We might as well call them "pseudo" forces (as one often does for centrifugal force). The other thing has really baffled me for as long as I can remember. Straight uniform motion is relative: if you are floating in empty space and don't feel anything, you have no way of knowing that you are "moving with respect to something". If you are accelerating along a direction, you will feel that as a force in your reference frame - but you need energy expenditure to do so. Again, what is the thing you are accelerating with respect to? But rotation around an axis is the strangest of all - if you are spun and left on your own, you will feel the above-mentioned forces, and the same would happen in completely empty space with no reference point. You are rotating with respect to what? And you no longer even have energy expenditure, unlike in the accelerating case. All of this "sounds" very counter-intuitive, but I think there is a very "reasonable" explanation. Not sure if anyone has actually thought of it that way - maybe this is a path toward a GUT. We need to examine how waves behave in certain space-time configurations, and I'm sure all of the above will emerge from the behavior of waves. The math of it will show all the effects. After all, everything that exists is in fact a wave in quantum fields.
  19. Let's put gravity aside for the moment and consider the following example: you find yourself in empty space, far from the gravitational effects of other bodies, and you are imparted a spin around an axis that goes through your chest (front to back). You suddenly feel that "something is pulling" quite strongly on your legs, and there is a similar sensation in your head, but to a lesser extent. No force is being applied, yet you feel tension throughout your body - "something is stretching" you, and the faster you spin, the more uncomfortable it becomes. What force is doing this? There is no force, yet you feel the effect.
  20. DSS has two tabs - one for fits files and one for raw files (those from DSLR), are you sure you are looking at the right tab?
  21. Well, a fact of life is that OSC sensors do in fact sample at half the resolution - nothing to do with the problem you are facing, but I just wanted to mention it. What you have here is debayering in super-pixel mode, which combines 4 adjacent pixels into a single RGB pixel value. Other debayering algorithms work by interpolating the missing values (but can't restore the missing information in the high frequencies, so you get your image at "native" size while still effectively sampled at the lower rate). If you want a larger image, select some other debayering algorithm.
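A minimal sketch of what super-pixel mode does, assuming an RGGB Bayer layout (other layouts just permute the indices): each 2x2 cell collapses into one RGB pixel, halving the resolution on each axis:

```python
import numpy as np

def superpixel_debayer(bayer):
    """Collapse an RGGB Bayer mosaic into RGB super-pixels.
    Output is half the input size in each dimension."""
    r = bayer[0::2, 0::2]
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0   # average the two greens
    b = bayer[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

# Tiny 4x4 mosaic just to show the shapes and sample positions.
mosaic = np.arange(16, dtype=float).reshape(4, 4)
rgb = superpixel_debayer(mosaic)
print(rgb.shape)   # (2, 2, 3): half the linear resolution of the mosaic
```

Interpolating debayer algorithms instead estimate the two missing channels at every mosaic position, which keeps the full pixel count but cannot recover detail beyond what the colour-filtered sampling actually captured.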
  22. There is a huge difference between observing from Bortle 3/4 and Bortle 7/8. I'll give you my experience. Under Bortle 3/4, a 100mm ST102 scope could easily show at least 4 of the larger galaxies in Markarian's Chain. Under Bortle 7/8, an 8" Dob will struggle to show M51 (just the two cores, and you get the idea that something might be there), while M81/M82 could be seen most of the time, but very faint. As a contrast, with the same scope at Bortle 3/4 and the same M81/M82 targets - I was under the impression I was looking at car headlights. M51 showed the spiral structure and the bridge and looked almost like it does in images. So even 8" under heavy LP won't render anything close to what an 80mm shows under dark skies.
  23. How much back focus you have only matters if you can't reach focus at the required sensor-to-flattener distance. I think you should have no problem with the amount of back focus needed, as flatteners often move the focal point inward (particularly those that also act as reducers).
  24. Technically it's astigmatism. The angle is too small to produce an elliptical cross-section of the converging light cone (there is a small contribution from that effect, but the primary effect here is astigmatism). From what I can tell there is in fact a bit of coma as well - pure astigmatism is a symmetric aberration, so the star shape should be elliptical, but there is a bit of coma in the far corners. In any case, this mix of aberrations is due to a wrong flattener-to-sensor distance.
  25. Distance is the issue for sure. The prescribed distance for a flattener only strictly applies if the flattener is matched to that particular scope - otherwise it is a general guideline, and one should try different spacings to get good results. The above diagram should be right in principle, but again it depends on the optical configuration of the field flattener. I don't have enough knowledge of different flattener designs (if there are indeed different configurations) to provide more detailed insight, but I suspect that both things - the spacing and the above diagram - should be taken as guidelines rather than fact, unless specified exactly for the flattener / scope combination by the manufacturer. In any case, the answer is trial and error. There might even be a case where you can't find a "perfect" distance - where there is always some level of aberration in the corners. This can happen if the flattener does not correct a large enough field to cover the whole sensor. The 16200 is a large sensor with a 34.6mm diagonal - certain flatteners might have difficulty correcting stars that far off axis (not saying the TS one is such a flattener - but it can happen).