
globular

Members · 918 posts
Everything posted by globular

  1. I suspect the only horse's head I'll see round these parts is in the fields down the lane.
  2. Nicely done. I like doing open clusters, but mine end up scruffy. I've added this list to my list of lists. Thanks for the tip-off.
  3. Your StarSense will automatically do the (step 3) star align for you... so you don't need to manually find stars you know the names of. You do need to make sure your StarSense camera is aligned with your telescope though... but this is a one-off job (provided you don't remove the camera between sessions).
  4. At 0.5° across I thought I'd try with my N22T4, which gives 0.85° (and 100x and a 2mm exit pupil). I'll try my XW40 too, which gives 1.25°, 55x and a 3.75mm exit pupil - this often gives nice fairly black views, although it can be a bit washed out sometimes - i.e. much more condition dependent. The C8 is my only scope right now... I do have a long-term plan to get something like a 120 ED refractor - but right now I'm loving my C8 and am far from running out of good things to point it at that work really well. Maybe this dark stuff won't work too well for me in this scope and this location? But I'll have fun trying.
  5. For your CGEM II the alignment steps are: (1) Rough Align, (2) EQ Align, (3) Polar Alignment, (4) EQ Align #2 (optional), (5) Calibrate Mount (optional). For visual use you need to do 1, 2 and 3; 4 and 5 are unlikely to be necessary unless you are imaging, or unless 1 and/or 2 was very rough. Step 1 is pointing the mount north; step 2 is setting the correct degrees of latitude for your location; step 3 is star alignment; step 4 makes automatic slewing to targets more accurate; step 5 allows the mount to compensate for heavy or unbalanced loads that require slightly different movements depending on which way the telescope is pointing.
  6. But perhaps "special" skies? I'll try with my 20.4ish and see how I get on.... when my 100% clouds allow.
  7. Check your tracking is on the correct setting... one of Sidereal, Solar and Lunar. ** Gus beat me to it, but I've typed it so may as well post it 😁
  8. Competitions with judges will generally be scored on both how "accurately" the elements of the scene are reproduced and how "pleasing" the image is. If an entrant likes oval stars then the judges will likely score low on accuracy but might still be won over if the image is particularly pleasing. If an entrant distorted size to make one of Jupiter's moons look larger, and hence closer to the planet, then the judges might not be impressed with the unnatural look (unless it was a specific artistic category); but again the entrant might claw score back on aesthetics. Nice sharp focus will score high on accuracy, but a softer image might look nicer sometimes. Colour, though, seems to be judged ONLY in the "pleasing" category... justified on the basis that there is no right or wrong colour. I think vlaiv has presented a good argument that there could be both "accuracy" and "aesthetic" elements for colour. The colour management could have a standard (like there generally is in terrestrial photography) against which accuracy can be assessed. That doesn't mean, just like the other characteristics of the elements of the image, that there is no freedom to be artistic or to change colour balance to, for example, imply a particular set of environmental conditions to a viewer. Then, outside of competition, images could adhere to both the accuracy standard and an agreed viewing environmental condition - making them scientifically useful.
  9. Will they be able to handle the challenge of matching the existing one?
  10. LDN is a very long list! Can’t wait for a clear night to give a few a try.
  11. Good read Magnus. Lovely looking scope. Are you 11 feet tall? Must be something like that to reach the eyepiece?
  12. @Xilman would it be ok if I use your wonderful globular picture for my avatar?
  13. When I go on a sightseeing holiday I take lots of pictures. In the future I look back on them and they remind me of the trip and provoke happy memories. I do not come home and collate an album of stock images of all the sights I saw. They would, no doubt, be better lit, better framed, better images of the sights, but they would not have any of me in them nor the memories of the experience of creating them.
  14. Rather than counting perhaps using brightness as a weight would help reduce the dominance of background and ensure the best colour rendition is in the brighter areas? And as we only need to produce the calibration pairs once per camera (not each time we image like we do for flats) we could have a large number of colour samples rather than just a handful. (But I agree that colours more likely to be in astro images are more important if only a limited number of them are used).
  15. If I keep these pairs, rather than solving for a best-fit linear transform matrix, can I not then feed them into the image processing workflow and get it to calculate a transform that is optimised for the astro image being taken? Something like: for each pixel (or region of pixels) find the pair that is closest; count how many times each pair is used across the image; and solve for a linear transform matrix using the weighted pairs. i.e. calculate a transform matrix that fits best in the regions most important to the image being processed. This then seems similar to the well-established process of feeding in flats to inform the workflow about imperfections in your light path; this is feeding in information to inform the workflow about the colour profile of your camera. Anything else you do (with generalised tools or manual slider movements) will have to rely on typical averages, on most camera designs being similar, on the judgement or preferences of the user, etc. We could do that with imperfections... but we don't, because we know we'd get it wrong... maybe removing a very interesting feature.
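The weighted-fit idea in that post can be sketched roughly as follows, assuming numpy. All the calibration pairs and weights below are invented numbers purely for illustration; the weights stand in for "how many pixels in this image were closest to each pair".

```python
import numpy as np

# Hypothetical calibration pairs: camera RAW triplets and the XYZ
# triplets they should map to (all numbers invented for illustration).
raw_pairs = np.array([[0.9, 0.1, 0.1],
                      [0.1, 0.8, 0.2],
                      [0.1, 0.2, 0.9],
                      [0.5, 0.5, 0.5]])
xyz_pairs = np.array([[0.41, 0.21, 0.02],
                      [0.36, 0.72, 0.12],
                      [0.18, 0.07, 0.95],
                      [0.48, 0.50, 0.54]])

# Per-pair weights, e.g. how many pixels in the image being processed
# were closest to each calibration pair.
weights = np.array([10.0, 3.0, 1.0, 6.0])

# Weighted least squares: scale each row by sqrt(weight), then solve
# raw_pairs @ M ≈ xyz_pairs for the 3x3 transform M.
w = np.sqrt(weights)[:, None]
M, residuals, rank, _ = np.linalg.lstsq(raw_pairs * w, xyz_pairs * w, rcond=None)
print(M.shape)  # the image-specific 3x3 colour transform
```

The sqrt-scaling trick turns an ordinary least-squares solver into a weighted one, so pairs that dominate the image pull the fitted matrix towards themselves.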
  16. With a measured field stop of 21.7 that gives a calculated AFOV of 71 degrees. So a measured FOV of 74 gives geometric distortion of about +4.2%. Sounds fairly high… ?
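The distortion arithmetic in that post works out as stated; a one-line check, using only the numbers quoted above:

```python
# Numbers from the post: AFOV calculated from the measured 21.7 mm field
# stop versus the AFOV measured directly by eye.
calculated_afov = 71.0  # degrees
measured_afov = 74.0    # degrees

distortion_pct = (measured_afov / calculated_afov - 1) * 100
print(round(distortion_pct, 1))  # ≈ 4.2
```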
  17. Thanks @vlaiv. On the face of it most of that looks automatable... if there is software with exposed APIs to supply values and perform actions... surely these exist these days? One issue springs to mind... solving multiple points in 3-dimensional space to create what is, as I understand it, a linear transformation matrix, will only give a very good fit if the behaviour of RAW and the behaviour of XYZ are fundamentally the same/similar. In other words, using your temperature conversion analogy above, if you are converting from one linear scale to another linear scale (e.g. F to C) then all is well... but if you have C as one scale and, say, log(1/C) as another scale (a bit odd, but you never know) then a best-fit linear transformation is not going to be very good. My guess is you have done this enough (or know enough about how RAW and XYZ spaces work) to be happy that a linear transform is workable? In any case the regression technique used to solve for the transform will throw out enough info to calculate goodness of fit... so I guess I'll know myself when I try it. And then there is my second worry (or maybe it's the same one expressed differently)... Your process above is to derive a transformation using colours on a calibrated screen / camera. Presumably this is a fairly restricted range of frequencies? And in particular it will not include lots of frequencies present in different astronomical targets. So how do you produce a family of transformations? And if you do end up with a family of linear transformations... couldn't you replace them all with a single polynomial transformation, with degrees of freedom set at a level that gives low / acceptable goodness of fit? If this is doable then it would seem to support the hypothesis that this is "fairly accurate". But if it's not doable, doesn't that mean it may not be accurate enough, and/or not consistent enough an approach, to become an international standard?
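The "goodness of fit will tell me" point can be tested in advance on synthetic data. This sketch (invented data, assuming numpy) fits a linear transform to a genuinely linear target and to a log-distorted one, mirroring the log(1/C)-style worry above, and compares the relative residuals:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic RAW samples (illustrative only, not real sensor data).
raw = rng.uniform(0.05, 1.0, size=(200, 3))

# A made-up "true" 3x3 mixing matrix.
M_true = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.0, 0.1, 0.9]])

# Case 1: the target really is a linear function of RAW.
xyz_linear = raw @ M_true
# Case 2: a log-like nonlinearity is mixed in before the matrix.
xyz_nonlinear = np.log1p(raw) @ M_true

def relative_fit_error(src, dst):
    """Fit dst ≈ src @ M by least squares; return residual norm / signal norm."""
    M, _, _, _ = np.linalg.lstsq(src, dst, rcond=None)
    return np.linalg.norm(dst - src @ M) / np.linalg.norm(dst)

err_linear = relative_fit_error(raw, xyz_linear)
err_nonlinear = relative_fit_error(raw, xyz_nonlinear)
print(err_linear, err_nonlinear)
```

The linear case fits to machine precision while the log-distorted case leaves a clearly non-trivial residual, which is exactly the diagnostic the post anticipates.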
  18. @vlaiv are you saying that I can:
      * gather data from any astro target under any typical astro viewing conditions with my astro camera
      * apply THE raw to XYZ transformation matrix for my astro camera
      * and I'll have a standardised XYZ colour rendition (that you wish the world would accept as such, a bit like we already do for other characteristics: magnitude for star brightness, light year distance, or that stars are round)
      * and I can then apply a stock XYZ to RGB transform to view the image in an environment of my choosing, knowing the colour rendition will be as intended
      * and I can derive the single RAW to XYZ transformation matrix for my camera as a one-off exercise in normal terrestrial conditions?
      If so I would be very tempted to enter the AP world... all the hours of manual tinkering with images that I see discussed on the forums is the main turn-off for me. Do you have another thread somewhere that explains in more detail how to go about deriving the XYZ transformation matrix for a given camera... steps and suggested software etc?
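The pipeline described in that post can be sketched end-to-end, assuming numpy. The RAW→XYZ matrix here is an invented placeholder standing in for the one-off camera calibration; the XYZ→sRGB matrix and gamma curve are the published sRGB (D65) constants:

```python
import numpy as np

# Hypothetical one-off RAW→XYZ matrix for a given camera (invented
# numbers, standing in for a real calibration result).
M_raw_to_xyz = np.array([[0.6, 0.3, 0.1],
                         [0.2, 0.7, 0.1],
                         [0.0, 0.1, 0.9]])

# Standard XYZ → linear sRGB (D65) matrix: a published constant.
M_xyz_to_srgb = np.array([[ 3.2406, -1.5372, -0.4986],
                          [-0.9689,  1.8758,  0.0415],
                          [ 0.0557, -0.2040,  1.0570]])

def raw_to_srgb(raw_pixels):
    """Chain the two linear transforms, then apply the sRGB gamma curve."""
    xyz = raw_pixels @ M_raw_to_xyz.T
    lin = np.clip(xyz @ M_xyz_to_srgb.T, 0.0, 1.0)
    # Piecewise sRGB transfer function.
    return np.where(lin <= 0.0031308,
                    12.92 * lin,
                    1.055 * lin ** (1 / 2.4) - 0.055)

srgb = raw_to_srgb(np.array([[0.2, 0.4, 0.6]]))
print(srgb)
```

The key property is that everything up to the final gamma step is linear, so the camera-specific part collapses into a single matrix applied once, with the display-side conversion being a stock transform.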
  19. Mr B Bird or Mrs S Tarling would be my guess.
  20. My most enjoyable sessions usually occur when I’m sketching. I tend to linger longer at the eyepiece and really take in the details. If I’m not sketching it’s often a quick ‘wow’ and on to the next target. Keep it up
  21. If they tend to be travelling inwards towards the sun (and I imagine that being attracted to the sun is likely to dominate in this region of space) then it would seem logical that the Earth would get in the way on the side facing away from the sun more often than not… i.e. the bit in darkness at the time… I'm just guessing / speculating… and I'm sure you'd rather get a response from someone who knows… so sorry about that.
  22. Isn’t it just that there are more hours of darkness after midnight than before… especially in the summer due to daylight saving. Or have I misunderstood your point?