
What may be a silly question… 🤔


Stuart1971


So here is an image of my camera sensor. When looking at the sensor like this, from the front, would the bottom left of the sensor correspond to what I see in one of my images when opened in PixInsight, i.e. the bottom left of the image?

I have an issue with tilt and have been looking at images in PI with the aberration tool, and it clearly shows that the bottom left corner of my images has slight tilt, so I am trying to work out which corner of the sensor this corresponds to.

[Image: camera sensor viewed from the front]

Edited by Stuart1971

1 hour ago, Stuart1971 said:

So here is an image of my camera sensor. When looking at the sensor like this, from the front, would the bottom left of the sensor correspond to what I see in one of my images when opened in PixInsight, i.e. the bottom left of the image?

I have an issue with tilt and have been looking at images in PI with the aberration tool, and it clearly shows that the bottom left corner of my images has slight tilt, so I am trying to work out which corner of the sensor this corresponds to.


It will depend on the optics in front of the sensor. A simple lens, for example, produces an inverted image.

The simplest way to figure out what is happening with your optics, no thinking required, is to cover a small piece of one corner of the sensor, take an image, and compare.
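
If you then want to identify the masked corner from the data rather than by eye, a minimal Python sketch along these lines could compare mean counts in each corner of the sub (assuming astropy is installed; the file name is just a placeholder):

```python
# Minimal sketch: find the masked (darkest) corner of a FITS sub by
# comparing mean counts. "corner_test.fits" is a placeholder name.
import numpy as np
from astropy.io import fits

data = fits.getdata("corner_test.fits").astype(float)
s = 256  # corner box size in pixels

corners = {
    "top-left":     data[:s, :s],
    "top-right":    data[:s, -s:],
    "bottom-left":  data[-s:, :s],
    "bottom-right": data[-s:, -s:],
}
for name, box in corners.items():
    print(f"{name}: mean ADU = {np.mean(box):.1f}")

# Caveat: FITS viewers differ on whether row 0 is drawn at the top or the
# bottom, so "top" here means the first rows of the array as stored.
print("masked corner:", min(corners, key=lambda k: float(np.mean(corners[k]))))
```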


21 minutes ago, Mandy D said:

It will depend on the optics in front of the sensor. A simple lens, for example, produces an inverted image.

The simplest way to figure out what is happening with your optics, no thinking required, is to cover a small piece of one corner of the sensor, take an image, and compare.

Thanks

Well, it makes no odds about the optics, as this is happening after that. I just want to know how the image corners correspond to the sensor corners after the image is taken. I know that the optics will do things, but that is moot here… 👍🏻


24 minutes ago, Stuart1971 said:

Thanks

Well, it makes no odds about the optics, as this is happening after that. I just want to know how the image corners correspond to the sensor corners after the image is taken. I know that the optics will do things, but that is moot here… 👍🏻

Sorry, not sure what your complaint is. The masking of a corner will positively identify that corner in your image, regardless of optics.


47 minutes ago, Stuart1971 said:

Thanks

Well, it makes no odds about the optics, as this is happening after that. I just want to know how the image corners correspond to the sensor corners after the image is taken. I know that the optics will do things, but that is moot here… 👍🏻

Some types of scope fully invert the image, both left/right and up/down, while other types invert in just one direction, like a Newtonian.

There is then the issue of sensor orientation in firmware: how is it read out, left to right or in reverse? (This comes down to internal addressing.)

It is very hard to tell which corner of the image corresponds to which side of the optical arrangement without testing it.

Covering one corner of the sensor is a sensible way to go about it. You can use a piece of foil and a filter cell, or just a filter wheel / slider to put the tin foil in.
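
For what it's worth, every combination of optical flips and readout direction reduces to a flip or a rotation of the pixel array; a quick numpy sketch of the possibilities:

```python
# Minimal sketch of how readout direction / optical flips act on an image
# array. Any chain of flips reduces to one of four cases: identity,
# 180-degree rotation, or a single mirror about one axis.
import numpy as np

img = np.arange(12).reshape(3, 4)      # stand-in for a raw sensor readout

flip_lr = np.fliplr(img)               # columns read out in reverse order
flip_ud = np.flipud(img)               # rows read out in reverse order
rot_180 = np.flipud(np.fliplr(img))    # both reversed = 180-degree rotation

print(img, flip_lr, flip_ud, rot_180, sep="\n\n")

# Two flips cancel into a pure rotation, so chirality is preserved;
# a single flip mirrors the image and changes chirality.
assert np.array_equal(rot_180, np.rot90(img, 2))
```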


42 minutes ago, Mandy D said:

Sorry, not sure what your complaint is. The masking of a corner will positively identify that corner in your image, regardless of optics.

I think it's a fair question. If you consider a DSLR, the lens produces an inverted image on the sensor, but what's saved (or output to the image capture software) is restored to the 'correct' alignment, so a piece of dust at the bottom of the sensor will actually affect the top of the output image.

If you're using an astro imaging camera it's not obvious (to me at least) whether the same 'correction' applies. 

However, as you point out, masking a corner of the sensor will clarify for the OP how to relate the sensor position to the camera output.


3 hours ago, Mandy D said:

Sorry, not sure what your complaint is. The masking of a corner will positively identify that corner in your image, regardless of optics.

Yes, sorry, I was getting all confused in my head and overthinking it… thanks… 👍🏻

Edited by Stuart1971

39 minutes ago, newbie alert said:

Thought you sorted your tilt, or have I got the wrong Stuart?

I have, on my QHY268C camera, but I have since moved to a filter wheel with a new 268M mono camera, which is a bit more awkward to get on the laser jig, so I was trying to work out an even simpler way; hence the info I asked for above. Failing that, I will 3D print an adapter to hold the full imaging train with the wheel… 👍🏻

Edited by Stuart1971

7 hours ago, vlaiv said:

Some types of scope fully invert the image, both left/right and up/down, while other types invert in just one direction, like a Newtonian.

There is then the issue of sensor orientation in firmware: how is it read out, left to right or in reverse? (This comes down to internal addressing.)

It is very hard to tell which corner of the image corresponds to which side of the optical arrangement without testing it.

Covering one corner of the sensor is a sensible way to go about it. You can use a piece of foil and a filter cell, or just a filter wheel / slider to put the tin foil in.

Wait, so should I have been flipping my Newtonian's images horizontally? (Assuming the sensor's long edge points towards the front and back of the tube, if that makes sense?)

I never considered that I was looking at everything backwards in only one direction!


22 minutes ago, pipnina said:

Wait, so should I have been flipping my Newtonian's images horizontally? (Assuming the sensor's long edge points towards the front and back of the tube, if that makes sense?)

I never considered that I was looking at everything backwards in only one direction!

You can always plate solve your images.

Ideally you want DEC to be vertical, RA to be horizontal, and both to "grow" up and to the right.

However, some targets might be better in some other orientation, so there is no definitive rule on that.

Your recent HH and Flame image is rotated roughly 220 degrees:

[Plate-solve screenshot of the HH and Flame image]
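
If you'd rather read the orientation from the data, here is a small sketch using astropy's WCS on a solved file (the file name is a placeholder, it assumes the solver wrote a standard CD matrix, and angle sign conventions vary between solvers, so treat the numbers as indicative):

```python
# Minimal sketch: read the sky orientation of a plate-solved FITS file
# from its WCS. "solved.fits" is a placeholder name.
import numpy as np
from astropy.io import fits
from astropy.wcs import WCS

wcs = WCS(fits.getheader("solved.fits"))
cd = wcs.pixel_scale_matrix          # 2x2 matrix, degrees per pixel

# Angle of celestial north relative to image "up" (one common convention),
# and the determinant's sign, which says whether the frame is mirrored:
# a conventional north-up, east-left image has det < 0.
north_angle = np.degrees(np.arctan2(cd[0, 1], cd[1, 1]))
mirrored = np.linalg.det(cd) > 0
print(f"north at {north_angle:.1f} deg from up, mirrored: {mirrored}")
```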


32 minutes ago, pipnina said:

Wait, so should I have been flipping my Newtonian's images horizontally? (Assuming the sensor's long edge points towards the front and back of the tube, if that makes sense?)

I never considered that I was looking at everything backwards in only one direction!

It's all relative and subjective in space, where there is no real up or down, and even left and right carry little meaning IMO. Whatever composition makes the best image is the "right" way to take the image. Up to the imager to choose that orientation, of course.


Just now, ONIKKINEN said:

It's all relative and subjective in space, where there is no real up or down, and even left and right carry little meaning IMO. Whatever composition makes the best image is the "right" way to take the image. Up to the imager to choose that orientation, of course.

Yes, but there is something called chirality, and one should not invert it.

It is "handedness" - pretty much like the fact that we have right handed screws. If you turn that screw clockwise - it will move "in". No amount of rotation / orientation and direction of observing can switch that screw to be left handed.

For that you need to use a mirror.

A regular objective (such as a refractor, or the primary of a reflector) swaps up/down and left/right, effectively rotating the image by 180 degrees. Add one diagonal mirror to the mix and you have now changed chirality. That mirror can be the secondary of a Newtonian, or the diagonal mirror used with refractors and MCTs/SCTs for visual.

That could be thought of as changing the nature of the object, as no "normal" way of observing will produce such an image; only the use of a mirror will.
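
The screw argument can also be put in numbers: chirality is just the sign of a transform's determinant. A minimal numpy sketch:

```python
# Minimal sketch of chirality as the sign of a determinant. A rotation
# has determinant +1 and preserves handedness; a mirror has determinant
# -1 and flips it, which is exactly the right-handed-screw argument.
import numpy as np

theta = np.radians(220)  # any rotation angle
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
mirror = np.array([[-1, 0],          # flip about the vertical axis
                   [ 0, 1]])

print(np.linalg.det(rotation))          # +1.0: orientation preserved
print(np.linalg.det(mirror))            # -1.0: chirality flipped
print(np.linalg.det(mirror @ mirror))   # +1.0: two mirrors = a rotation
```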


2 minutes ago, vlaiv said:

Yes, but there is something called chirality, and one should not invert it.

It is "handedness" - pretty much like the fact that we have right handed screws. If you turn that screw clockwise - it will move "in". No amount of rotation / orientation and direction of observing can switch that screw to be left handed.

For that you need to use a mirror.

A regular objective (such as a refractor, or the primary of a reflector) swaps up/down and left/right, effectively rotating the image by 180 degrees. Add one diagonal mirror to the mix and you have now changed chirality. That mirror can be the secondary of a Newtonian, or the diagonal mirror used with refractors and MCTs/SCTs for visual.

That could be thought of as changing the nature of the object, as no "normal" way of observing will produce such an image; only the use of a mirror will.

The problem is, almost all telescope systems will have a mirror or two, or three, in them. So the average person looking through a telescope is just as likely to have a different experience as the next one with another system; hence RA and DEC coordinates are used in real measurements, not left- and right-handedness, which are rather subjective and useless.

But I get what you mean: some are more "right" than others if we compare to naked-eye orientation.


9 hours ago, ONIKKINEN said:

The problem is, almost all telescope systems will have a mirror or two, or three, in them. So the average person looking through a telescope is just as likely to have a different experience as the next one with another system; hence RA and DEC coordinates are used in real measurements, not left- and right-handedness, which are rather subjective and useless.

But I get what you mean: some are more "right" than others if we compare to naked-eye orientation.

I'm not sure you do fully understand.

RA and DEC coordinates are part of a coordinate system. In our 3D space (which is orientable; not all spaces are) there is a notion of handedness: a coordinate system can be left or right handed. The RA/DEC system is a right-handed coordinate system.

Maybe the best explanation is this: if you look at an image of a hand, you can always tell whether it is someone's left or right hand.

[Image of a left hand]

The hand in the image is a left hand. No matter from what angle and in which orientation you image it, everyone will be able to tell that it is a left hand.

The only operation that "messes up" this information is mirroring the image (either by use of a real mirror, or by flipping about an axis).


Not sure if you figured this out already, Stuart, but here's a picture of my QHY268M. I've added a label to my camera showing where the top edge of my sensor is when I image using NINA.

This is the orientation of my sensor when capturing with a refractor, so there shouldn't be any flipping of the image in either axis. If you have the filter wheel port at 9 o'clock and the USB cable at 3 o'clock, the top of your sensor should be at 12 o'clock 🙂

[Photo of the QHY268M with a label marking the top edge of the sensor]


6 minutes ago, Richard_ said:

Not sure if you figured this out already, Stuart, but here's a picture of my QHY268M. I've added a label to my camera showing where the top edge of my sensor is when I image using NINA.

This is the orientation of my sensor when capturing with a refractor, so there shouldn't be any flipping of the image in either axis. If you have the filter wheel port at 9 o'clock and the USB cable at 3 o'clock, the top of your sensor should be at 12 o'clock 🙂


Thanks for that, and yes, I know where my sensor top is, but I don't think I asked the initial question very well…

What I wanted to know is: when looking at the sensor from the front of the camera, where do the photons that hit the bottom left corner of that sensor show up in the subsequent image opened in PI? Do they show in the same bottom left corner of the image?

The scope has no bearing on the question I asked; it's purely about which light hit which part of the sensor and how that correlates to the image on screen…

Hope that makes sense… it does in my head… 👍🏻


1 minute ago, Stuart1971 said:

Thanks for that, and yes, I know where my sensor top is, but I don't think I asked the initial question very well…

What I wanted to know is: when looking at the sensor from the front of the camera, where do the photons that hit the bottom left corner of that sensor show up in the subsequent image opened in PI? Do they show in the same bottom left corner of the image?

The scope has no bearing on the question I asked; it's purely about which light hit which part of the sensor and how that correlates to the image on screen…

Hope that makes sense… it does in my head… 👍🏻

Ah, that makes sense! In that case the image is just flipped about the vertical axis, as if you were looking at yourself in a mirror. If you know where the top of the sensor is, then it's only the left/right sides that will be swapped when you look at the sensor per your original post. Up/down won't be flipped.


6 minutes ago, Richard_ said:

Ah, that makes sense! In that case the image is just flipped about the vertical axis, as if you were looking at yourself in a mirror. If you know where the top of the sensor is, then it's only the left/right sides that will be swapped when you look at the sensor per your original post. Up/down won't be flipped.

So as I look at the sensor from the front, the information recorded by the bottom left corner shows in the top left when I open the image in PixInsight… correct?


9 minutes ago, Stuart1971 said:

So as I look at the sensor from the front, the information recorded by the bottom left corner shows in the top left when I open the image in PixInsight… correct?

Not quite. If the aberration inspector tool in PixInsight says the bottom left corner of the image is out of tilt, this corresponds to the lower right of the sensor when you look at it per your original post.

This assumes the top edge of the sensor and of the image are both at the 12 o'clock position.
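
Put as code, that mapping is just a left/right swap; a minimal sketch, assuming the top of the sensor and of the image are both at 12 o'clock and the capture software adds no extra flips:

```python
# Minimal sketch of the mapping described above: viewing the sensor from
# the front mirrors left/right relative to the image, while top/bottom
# match. The corner names are purely illustrative.
def image_corner_to_sensor_front(corner: str) -> str:
    vert, horiz = corner.split("-")              # e.g. "bottom-left"
    horiz = {"left": "right", "right": "left"}[horiz]
    return f"{vert}-{horiz}"

print(image_corner_to_sensor_front("bottom-left"))   # -> bottom-right
```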

