
What really is EEVA



I know this might seem like a silly question, seeing as I have been playing around at this for a while now - a while in this case being about two years, with a gap in the middle.

I tried reading the "what is EEVA" section again, and when I did I remembered why I didn't bother going beyond the second section on my first attempt - it seemingly turned into quite a lot of complaining and whingeing about what counts as what in EEVA.

So in the past I have used a 12" dob, a 5" SCT and a 4" frac, and I currently have a 130 Newt, a 224MC and a newly acquired 183MC to use.

Anyway, has EEVA as a branch of astronomy been clearly defined yet?

Oh, and to me it is enjoying looking at an object on the screen whilst capturing data to play with later.


I don't think it is clearly defined.

It is more an "observing style" / "astronomy practicing style" than anything else.

Imagine you have two approaches to astronomy so far:

- Without computers

This style has been practiced extensively for centuries - it consists of observing, sketching and various measurements with analog devices (clocks, angle measures, etc.).

Many aspects of this style are still practiced in what we call visual astronomy - either just observing for joy, or doing more serious work like timing double star periods or variable star periods - all with analog means.

- With computers

This has grown into mainstream professional astronomy, and most of the time astronomy is practiced without being anywhere near a telescope at all. It is all robotic - images are taken or other data gathered via surveys and then analyzed "offline", months after being recorded.

Most imagers end up with a similar style of work - robotic sheds / observatories where you program a sequence for image acquisition and later process your data.

- EEVA is a sort of mix of these two styles - it provides the means to take the "analog" approach, but with the use of electronics. It is real time / connected to the sky and the equipment, but you use electronic devices to enhance whatever you are doing. It can be image intensifiers / night vision that let you see deeper visually, or it can be a computer and camera that do the same except you are not at the eyepiece - but it is still enhanced and real time.

A nice side of EEVA is that it can be practiced in groups. Sure, you can observe in a group, but it is still a somewhat isolated experience - everyone is having their own experience at the eyepiece even if all are observing the same target. With EEVA you can put up a screen / projector, and the image that is captured in real time can be viewed by multiple people at once. It becomes a bit more of a shared experience.

Does this help?

 


Well put Vlaiv. I would add that it is a means of observing using electronic aids and in a relatively short time. Imaging tends to end up being more about the means of generating the image than about the object of interest.

Mike


Does this mean that watching an enhanced image of, say, Mars live on screen using APT, FireCapture, SharpCap or similar, while at the same time recording the data for subsequent processing, comes under this category of EEVA?

EDIT: Would it still be EEVA even if not recording the data but just watching the image on screen, since EEVA means Electronically Enhanced Visual Astronomy?

Edited by Moonshed
32 minutes ago, Moonshed said:

Does this mean that watching an enhanced image, say Mars, live on screen using  APT, FireCapture, Sharpcap or similar, and at the same time recording the data for subsequent processing, comes under this category of EEVA?

Personally I would have said yes

EDIT: Given vlaiv's description above.

Edited by bomberbaz
31 minutes ago, Moonshed said:

Does this mean that watching an enhanced image, say Mars, live on screen using  APT, FireCapture, Sharpcap or similar, and at the same time recording the data for subsequent processing, comes under this category of EEVA?

EDIT: would it be EEVA anyway even if not recording but just watching the image on screen, as it means Electronically Enhanced Visual Astronomy?

Yes, ideally you would want to do it with an application that has the ability to enhance the view for you.

As an example - the human eye "records" at around 30fps, which is equivalent to a 33ms exposure. Any seeing influence is averaged over this period and we see a blurred image (as we often do when observing visually). Ideally, a planetary EEVA application would still capture at lucky imaging speeds - more like 5-6ms rather than 33ms - and would let you choose:

- enhanced real time view

Here the app would take 5-6 successive frames, combine them (align + stack), do a bit of sharpening (a small amount, otherwise noise would be amplified significantly) and a bit of denoising - and display a live feed at the speeds we are used to visually, like 20-30fps. This would somewhat lessen the atmospheric impact but would still be considered real time.

- extra enhanced view

Here the app would take many more successive frames and do the same processing - stacking, sharpening, denoising - but "deeper", to create a more stable and better looking image, updating the view every few seconds. This would let you examine details more closely but would still allow you to follow changes in "real time" - like Jovian moon / shadow transits and similar events that happen over dozens of minutes.

Of course there would be a "rec" button that would allow you to save the raw data and process it even better afterwards.
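The "enhanced real time view" above can be sketched in a few lines. This is only an illustration of the stack-then-sharpen idea, not code from any existing application - the function names, burst size and sharpening amount are all my assumptions:

```python
import numpy as np

def box_blur(img):
    """Cheap 3x3 box blur (stand-in for a proper Gaussian blur)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def enhanced_frame(burst, sharpen=0.5):
    """Turn a short burst of fast exposures into one 'live' frame.

    Averaging N frames cuts random noise by roughly sqrt(N); a mild
    unsharp mask then pulls back some detail without amplifying that
    noise too much.  (Frame alignment is omitted - a real app would
    register the frames before stacking.)
    """
    stack = np.mean(burst, axis=0)                 # stack -> better SNR
    return stack + sharpen * (stack - box_blur(stack))  # unsharp mask
```

Run over every 5-6 captured frames, this would produce the ~20-30fps enhanced feed described above; the "extra enhanced" mode would be the same routine with a much larger burst and a slower update interval.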

At the moment, the primary issue I see with EEVA in this department is the lack of dedicated software with these capabilities - but as long as people are interested, software support will get developed.

3 minutes ago, vlaiv said:

 

At the moment, the primary issue I see with EEVA in this department is the lack of dedicated software with these capabilities - but as long as people are interested, software support will get developed.

What about SharpCap with live stacking? I would have thought that counts as EEVA - or is this more imaging, as you are viewing a composite image rather than the actual live image?

1 minute ago, bomberbaz said:

What about SharpCap with live stacking? I would have thought that counts as EEVA - or is this more imaging, as you are viewing a composite image rather than the actual live image?

SharpCap is one example of good EEVA software for DSOs. In fact, I would say it would be my go-to software for EEVA, as it allows for both:

- live stacking, which is basically EEVA for deep sky objects

- live view from the camera, which is a very basic version of planetary stacking.

In both cases there are features that I would like to see implemented for each style of observing.

For deep sky observing I would like to see:

- integrated plate solving - after you start a live stack, I would love the image to be plate solved in the background so I could click an object and get at least basic info on it - coordinates, type, catalog name - the sort of thing you get in Stellarium when you select an object

- automatic processing of the image with advanced algorithms - there are plenty of advanced algorithms that people use when processing images - why not have them automatically applied to the live view (with a few sensible controls)? Color calibration, smart denoising / sharpening, a good automatic stretch based on noise statistics - so you always have the best image without too much noise (or a couple of presets - like nice, some noise, show it all :D ).

- sampling rate adjustments / binning with zoom and pan controls. Imagine you have a camera that is 5496 x 3672 (well, you don't have to imagine, right? :D ). You could view the image at 1374 x 918 - the most zoomed out / least magnification, but the best SNR, as the software automatically bins the image to that size. Or you could view it at 2748 x 1836 with a 1300 x 900 viewport that you can pan around - medium magnification, letting you move around and examine things in more detail. Or you could go to the full 5496 x 3672 - highest magnification - again with a 1300 x 900 window to pan around even more.

Add to that fractional binning and you can have multiple zoom levels with different levels of noise. You start zoomed out, and as the image builds up and enhances, you can start magnifying it to see smaller and smaller detail.
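The zoom-level arithmetic here is just block averaging. A minimal sketch of that software binning, under the assumption of a simple mean over pixel blocks (the function name is mine, the dimensions come from the example above):

```python
import numpy as np

def bin_image(img, factor):
    """Average factor x factor pixel blocks (software binning).

    Each output pixel is the mean of factor**2 input pixels, so random
    noise drops by roughly `factor` while the image shrinks by the same.
    """
    h, w = img.shape
    h, w = h - h % factor, w - w % factor   # crop to a clean multiple
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Zoom levels from the example: a 5496 x 3672 sensor shown at full size,
# binned 2x (2748 x 1836), or binned 4x (1374 x 918).
```

Fractional binning would instead resample onto a grid whose pitch is a non-integer multiple of the pixel size; the integer version above already shows the SNR-versus-magnification trade-off being described.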

- multiple outputs: so you can work at the laptop and have the actual live view displayed on a projector, or streamed via the internet to be shared with friends / family

- multiple input: maybe you have two telescopes side by side and you want to use one mono and one color camera at the same time

- dithering - some people have good mounts and would like to dither randomly between exposures. This is used in imaging to reduce noise, as fixed pattern noise gets spread around and shuffled. On less accurate mounts this happens naturally, since tracking is not as good and the image slowly drifts - the same goes for AZ mounts, where there is field rotation - but it is better to control the process and make it truly random.
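A dither schedule is just a small random pointing offset per sub. A toy sketch (the offset range and function name are assumptions, not any particular mount protocol):

```python
import numpy as np

def dither_offsets(n_subs, max_px=5.0, seed=None):
    """One random (dx, dy) pointing offset, in pixels, per exposure.

    Shifting the pointing a few pixels between subs puts fixed-pattern
    noise (hot pixels, banding) on different sky pixels each time, so
    it averages out once the frames are re-aligned and stacked.
    """
    rng = np.random.default_rng(seed)
    return rng.uniform(-max_px, max_px, size=(n_subs, 2))
```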

For planetary, I have already outlined what should be incorporated - basically the lucky imaging workflow, while still retaining live feed capability and the ability to set the output fps (real time at 20-30fps, near real time every 10-20s, slow-mo every minute or two).

 

48 minutes ago, vlaiv said:

Yes, ideally you would want to do it with an application that has the ability to enhance the view for you.

As an example - the human eye "records" at around 30fps, which is equivalent to a 33ms exposure. Any seeing influence is averaged over this period and we see a blurred image (as we often do when observing visually). Ideally, a planetary EEVA application would still capture at lucky imaging speeds - more like 5-6ms rather than 33ms - and would let you choose:

- enhanced real time view

Here the app would take 5-6 successive frames, combine them (align + stack), do a bit of sharpening (a small amount, otherwise noise would be amplified significantly) and a bit of denoising - and display a live feed at the speeds we are used to visually, like 20-30fps. This would somewhat lessen the atmospheric impact but would still be considered real time.

- extra enhanced view

Here the app would take many more successive frames and do the same processing - stacking, sharpening, denoising - but "deeper", to create a more stable and better looking image, updating the view every few seconds. This would let you examine details more closely but would still allow you to follow changes in "real time" - like Jovian moon / shadow transits and similar events that happen over dozens of minutes.

Of course there would be a "rec" button that would allow you to save the raw data and process it even better afterwards.

At the moment, the primary issue I see with EEVA in this department is the lack of dedicated software with these capabilities - but as long as people are interested, software support will get developed.

Hi vlaiv, thank you for supplying that information, it’s a big help.
It seems to me though that unless we stick with eyeball-to-eyepiece observing then, as technology continues to improve, we are moving step by step further and further away from reality. What we see on screen is enhanced - yes, we can see more detail - but what actually are we looking at? (This could get a bit deep.) Seeing an object sharp and crisp in the eyepiece is far removed from that very magnified and detailed image on the screen. 
Both have their place obviously, it depends on what you want to achieve, but if it is only to observe I feel that you can’t beat the experience of looking through the eyepiece. 
We are at the stage where some members do all their observing remotely, the scope is out in the observatory and they are indoors in the warm watching the images on a screen. Yes, it does have a certain appeal, but apart from selecting your own target I feel you may as well be watching a television program about astronomical images, there being no connection between you and your telescope other than electronically.

Maybe it’s just me, I don’t know.

4 minutes ago, Moonshed said:

We are at the stage where some members do all their observing remotely, the scope is out in the observatory and they are indoors in the warm watching the images on a screen. Yes, it does have a certain appeal, but apart from selecting your own target I feel you may as well be watching a television program about astronomical images, there being no connection between you and your telescope other than electronically.

Maybe it’s just me, I don’t know.

It's not just you 😀

18 minutes ago, Moonshed said:

Hi vlaiv, thank you for supplying that information, it’s a big help.
It seems to me though that unless we stick with eyeball-to-eyepiece observing then, as technology continues to improve, we are moving step by step further and further away from reality. What we see on screen is enhanced - yes, we can see more detail - but what actually are we looking at? (This could get a bit deep.) Seeing an object sharp and crisp in the eyepiece is far removed from that very magnified and detailed image on the screen. 
Both have their place obviously, it depends on what you want to achieve, but if it is only to observe I feel that you can’t beat the experience of looking through the eyepiece. 
We are at the stage where some members do all their observing remotely, the scope is out in the observatory and they are indoors in the warm watching the images on a screen. Yes, it does have a certain appeal, but apart from selecting your own target I feel you may as well be watching a television program about astronomical images, there being no connection between you and your telescope other than electronically.

Maybe it’s just me, I don’t know.

Many people feel similarly, but the important thing to understand is that EEVA is not trying to remove / replace traditional observing. It is just another way of having fun under the stars, and if it's not your cup of tea that is perfectly fine. You should not feel pressured to try it if you don't feel like it, and many people enjoy both - visual and EEVA.

I know I do - I enjoy both imaging and visual, and am really happy to play around and think about EEVA as well. I don't consider myself primarily visual or primarily an imager or whatever - I just enjoy the activities that interest me.

14 minutes ago, vlaiv said:

Many people feel similarly, but the important thing to understand is that EEVA is not trying to remove / replace traditional observing. It is just another way of having fun under the stars, and if it's not your cup of tea that is perfectly fine. You should not feel pressured to try it if you don't feel like it, and many people enjoy both - visual and EEVA.

I know I do - I enjoy both imaging and visual, and am really happy to play around and think about EEVA as well. I don't consider myself primarily visual or primarily an imager or whatever - I just enjoy the activities that interest me.

I do share your sentiments on this subject. As I said both visual observing and watching an enhanced screen image have their place, it depends what you want to achieve. In my case I feel that as I have moved deeper into the photography side of astronomy and spent less time at the eyepiece some of the fun has gone out of it. I know it’s good to have a permanent record of beautiful night sky objects but astrophotography can be a very frustrating hobby sprinkled with moments of great satisfaction. I shall be continuing with my astrophotography because I do in the main enjoy it, but not so much as I have these past three years. For me I find my greatest enjoyment at the eyepiece and that is where I intend to spend most of my time.

We are all different, we have different likes and dislikes and I fully appreciate how other members meet their astronomical needs in various different ways and combinations of ways. It’s a hobby where we can do that, astronomy is a rapidly changing and developing field that provides an ever expanding way of observing the universe.


I am in the same camp as vlaiv here - I do not consider myself anything other than a simple astronomer. I get enjoyment from visual and EEVA.

It depends on the conditions as to what I will choose to do. Moonlight pollution makes me want to do EEVA, or maybe the Moon with my BVs. I am lucky to have a close-to-Bortle-3 site at 20 minutes' drive, so it is great for EEVA.

But on those rare occasions when visual conditions are good, the dob goes in the back of the car and I am off to my dark sky site, to get those dark-sky objects' photons falling onto my retina.

1 hour ago, Tiny Clanger said:

I feel you may as well be watching a television program about astronomical images, there being no connection between you and your telescope other than electronically.

 

EEVA, IMHO, also includes "near real time stacking", be it software or hardware stacking. Either way it "electronically enhances" the visual astronomy, which is why it's called EEVA.

Using an eyepiece or looking at a screen, we are both using a common input method - our eyes.

Don't forget many EEVA users use simple mobile AZ setups "in the field", just like visual observers, so in my book we (EEVA users) are just as "connected".

How the image is projected onto the eye is irrelevant.

 

3 minutes ago, stash_old said:

 

EEVA, IMHO, also includes "near real time stacking", be it software or hardware stacking. Either way it "electronically enhances" the visual astronomy, which is why it's called EEVA.

Using an eyepiece or looking at a screen, we are both using a common input method - our eyes.

Don't forget many EEVA users use simple mobile AZ setups "in the field", just like visual observers, so in my book we (EEVA users) are just as "connected".

How the image is projected onto the eye is irrelevant.

 

Wasn't me who said what you quoted

1 hour ago, Moonshed said:

Hi vlaiv, thank you for supplying that information, it’s a big help.
It seems to me though that unless we stick with eyeball-to-eyepiece observing then, as technology continues to improve, we are moving step by step further and further away from reality. What we see on screen is enhanced - yes, we can see more detail - but what actually are we looking at? (This could get a bit deep.) Seeing an object sharp and crisp in the eyepiece is far removed from that very magnified and detailed image on the screen. 
Both have their place obviously, it depends on what you want to achieve, but if it is only to observe I feel that you can’t beat the experience of looking through the eyepiece. 
We are at the stage where some members do all their observing remotely, the scope is out in the observatory and they are indoors in the warm watching the images on a screen. Yes, it does have a certain appeal, but apart from selecting your own target I feel you may as well be watching a television program about astronomical images, there being no connection between you and your telescope other than electronically.

Maybe it’s just me, I don’t know.

I said

1 hour ago, Tiny Clanger said:

It's not just you 😀

No idea how you managed that - when I click 'quote' it inserts what the poster added, not any quote they included within their post. You need to address Moonshed.

1 hour ago, stash_old said:

Using an eyepiece or looking at a screen, we are both using a common input method - our eyes.

Yes we are, but I don't understand what your point is. For instance, I could equally say it doesn't matter whether you are looking at the sat nav screen when driving or through the windscreen - we are using a common input method, our eyes. The sat nav screen is only a digital representation of the road ahead; it is not the road ahead, which is only seen through the windscreen.

 

1 hour ago, stash_old said:

How the image is projected onto the eye is irrelevant.

 

Again, I fail to understand what you are saying here - of course it matters how the image is projected onto the eye. I would say there is a massive difference between seeing the Grand Canyon when I am actually standing there, where the image projected onto my eye is light reflected from the Grand Canyon itself, and having those images projected onto my eye from a TV screen in Norfolk.

It’s the same difference between looking at a screen and looking through an eyepiece. The screen is an electronic digital representation of the photons falling onto a camera’s sensor chip, whereas looking through the eyepiece means seeing the actual, real photons coming from the object, not an electronic representation of them.

All that aside, as I have already said there are numerous ways we can observe the universe and technology is discovering different ways all the time, it is up to the observer which method they prefer to use, they all have their uses. My only point is that looking through the eyepiece is seeing the actual, real object, whereas looking at a screen indoors is far removed from that.


What it’s not, for me, is night vision - night vision is just another eyepiece in my eyepiece case that can be used in the same way as my Televue, Explore Scientific eyepieces etc.

Virtually all the discussion in this section is about EAA which uses cameras and computers.

Edited by GavStar
1 minute ago, GavStar said:

What it’s not, for me, is night vision - night vision is just another eyepiece in my eyepiece case that can be used in the same way as my Televue, Explore Scientific eyepieces etc.

Well, at first it was just called EAA :D and at that time I would have agreed with you, but it was subsequently renamed EEVA - precisely to include night vision devices, which are electronic in nature. In the same way that you watch a computer screen - or the small display in devices like the eVscope - with a night vision device you are looking at a phosphor screen and not the actual target.

Does it require electricity to be able to observe (not for tracking - for the actual observation)? Then it is EEVA :D

10 minutes ago, vlaiv said:

Well, at first it was just called EAA :D and at that time I would have agreed with you, but it was subsequently renamed EEVA - precisely to include night vision devices, which are electronic in nature. In the same way that you watch a computer screen - or the small display in devices like the eVscope - with a night vision device you are looking at a phosphor screen and not the actual target.

Does it require electricity to be able to observe (not for tracking - for the actual observation)? Then it is EEVA :D

So is using a Quark for solar visual observing EEVA, since it needs electricity? Defining an approach purely on the basis of whether it uses electricity is a poor definition IMO, since the various methods are just so different.

Edited by GavStar
13 minutes ago, GavStar said:

What it’s not, for me, is night vision - night vision is just another eyepiece in my eyepiece case that can be used in the same way as my Televue, Explore Scientific eyepieces etc.

Virtually all the discussion in this section is about EAA which uses cameras and computers.

Curious as to what you mean by a night vision eyepiece that you class as just another eyepiece.

4 minutes ago, GavStar said:

So is using a Quark for solar visual observing EEVA since it needs electricity?

Not in the same sense - using dew shields that are electrically powered is not EEVA either.

You are still hit by the original photons from the target - regardless of the fact that the mica filter needs to be kept at a certain temperature in order to operate on-band.

1 minute ago, vlaiv said:

Not in the same sense - using dew shields that are electrically powered is not EEVA either.

You are still hit by the original photons from the target - regardless of the fact that the mica filter needs to be kept at a certain temperature in order to operate on-band.

Ah, now you are changing it from the use of electricity to original photons etc...

