
Buzzard75

Members
Rank: Star Forming
Location: New Bern, NC
  1. I don't claim to speak on behalf of Unistellar; all I know is what they've communicated through emails and Kickstarter updates. They just started deliveries to the EU and North America in December for their Kickstarter backers (that campaign ended in 2017), which is their first obligation. I know they are having issues getting certification for countries outside of those regions, but they are working through it as they can. Indiegogo backers would come after Kickstarter backers, as that was their second round of crowdfunding, and they said as much on Kickstarter. Personal opinion: if they can't deliver to particular crowdfunding backers for some reason like country certification, I wouldn't expect them to just let product sit in their warehouse or wherever; they should move on to the next person on the list. As for how many spots they have left for June delivery, I can't speak to that, how it fits in with their production schedule, or what it means for crowdfunding fulfilment. I would imagine that as certification issues are resolved for each country, the backers of their Kickstarter and Indiegogo campaigns will get priority over orders placed on their website. And if those issues can't be resolved for whatever reason, then they should be issuing refunds. Again, to be clear, none of that is an official line from Unistellar.
  2. Thanks for the compliment! It's one of my favorites that I've done so far too.
  3. Agreed. I had to dig into the support documentation on the website to find the details when I was considering backing it a couple of years ago. They probably should have defined it up front. In any case, that's the logic behind the claim. And in case anyone else is curious, a difference of 5 magnitudes equates to a factor of 100 in brightness, so a mag 10 star will appear 100 times brighter than a mag 15 star (a quick calculation follows below). I honestly haven't done any testing to confirm their calculations under my own conditions.
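For anyone who wants the general relation rather than the single mag 10 vs mag 15 example, the conversion is a brightness ratio of 100^(Δm/5), equivalently 10^(0.4·Δm). A minimal sketch in Python:

```python
def brightness_ratio(m_faint: float, m_bright: float) -> float:
    """Apparent brightness ratio between two stars, given their magnitudes.

    A difference of 5 magnitudes corresponds to a factor of 100 in flux,
    so the ratio is 100 ** (delta_m / 5), equivalently 10 ** (0.4 * delta_m).
    """
    delta_m = m_faint - m_bright
    return 100 ** (delta_m / 5)

print(brightness_ratio(15, 10))  # 100.0 -- a mag 10 star is 100x brighter than a mag 15 star
print(brightness_ratio(12, 10))  # ~6.31 -- a 2-magnitude difference
```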
  4. A few comments on recent points. I'm not going to go back and quote everyone and address each individual.

     Price: I paid $1299, plus shipping and import fees. While the eVscope presented great value for the money at that price, I saw it as an opportunity to fund development of a project that aligned with my interests and had great potential for the future of amateur astronomy. That should be the main goal of a Kickstarter, not just getting a discount on something. That said, who would want to fund something and get nothing in return? Had it been $3000+, I would have passed, so I completely understand where those people are coming from. There wasn't anything else like it on the market at the time, though, and they had to start somewhere; initial startup and development costs are always high. More competitors will enter this sector of astronomical devices, and manufacturers will continue developing and fine-tuning their processes. All of that should lead to lower prices, increased capabilities, and a wider variety of products for consumers.

     Tripod: There's nothing wrong with the tripod. It's perfectly stable, albeit very tall. I probably wouldn't take it out in 10mph winds, but I wouldn't bother taking any of my other equipment out either. The legs spread pretty far to give it a good stable base. As was mentioned, you don't HAVE to extend the legs all the way; you can leave it sitting closer to the ground for more stability. I've actually done this myself a few times, more because I didn't need to look through the eyepiece viewer than anything else, but it did help with image integration. If the software detects too much vibration in the captured data, it will drop the frame rather than integrate it, which tells me the software may be looking at eccentricity or HFD values, or both.

     Image Quality: Some of the images on Unistellar's website are renderings. Some of the images there, though, including some on the Kickstarter campaign main page and updates and ones being posted on Twitter, are indeed directly from the scope. When I received my eVscope and tried to save an image, it was watermarked with "beta" all over. I don't know if that was just for the beta test or if it had always been like that during development; it's possible that's the reason a lot of the images they posted were taken looking through the eyepiece rather than saved from the app. I honestly don't know. As for the actual quality of what they have posted: they are the developers and have been working with this technology for years, while I've only had mine for a few months. They have had a lot more time to fine-tune their device, and for all I know they are using different processing algorithms for some of those images. In addition, I'm sure some of the images they posted are from light accumulations totaling 15-20 minutes or possibly more, whereas most of the images I posted were on the order of a few minutes. We all know how the SNR improves the more time you spend on a target. I also fully admit that my focus and collimation may not have been perfect. Again, over time, my instrument will also be fine-tuned to provide the best image quality possible.

     Light Pollution: They have used the device in places like San Francisco and Las Vegas and provided images to that effect. Yes, they showed some of the brightest objects in the night sky, as those are obviously the most impressive and will show the most detail. Would you not do the same if you were taking a device like this to the public? The images I have shown are taken from my driveway in a suburban area. According to the latest data available on lightpollutionmap.info, I live in a Bortle 5/6 area. I suspect it's much worse than that, though, and probably closer to a 7. On an EXCEPTIONALLY good night, I can see the brightest stars in the constellations. I can't see the Milky Way at all, anywhere in the sky, straight up or on the horizon. I can detect some haze where M42 is, pick out individual stars in M45 with averted vision, and barely detect M31 with averted vision. Under normal conditions, I can't see any of that: M45 is a fuzzy mess and M31 is nowhere to be found. I can see Polaris and the two stars at the end of the bowl of the Little Dipper; the other mag 4-5 stars are completely invisible. These are the conditions under which I've been capturing my data with the eVscope.

     100x More Powerful: This one definitely needs to be taken with a grain of salt, or at least understood a bit better; it is certainly a bold claim. Based on what I've read and been told, the claim originated from some of the initial testing of the eVscope prototypes. The story among the developers is that a colleague told them they could see more with the eVscope than they could VISUALLY with their 1-meter scope. Now, 1 meter is pretty large, and that statement was anecdotal with no scientific evidence behind it. So, as any good scientist would do, they went and got the data. They determined the limiting magnitude of a star detectable by an eVscope under conditions in San Francisco, then calculated the limiting magnitude VISUALLY detectable by an average observer under the same conditions with a similar-sized scope. They converted those magnitudes to flux values representing apparent brightness, and the flux ratio showed that the eVscope was in fact very close to the 100x claim (the arithmetic is sketched below). People just need to understand that they are comparing a digital light accumulation and stacking process to a purely VISUAL one. I think that's the part that's lost on a lot of people and, without doing the research, unfortunately makes great fodder for denigrating this product.
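To make the "100x" arithmetic concrete: converting limiting magnitudes to flux uses the same relation as in the earlier post, so the claim amounts to a limiting-magnitude gain of about 5. The numbers below are purely illustrative (Unistellar hasn't published the exact San Francisco figures, as far as I know):

```python
# Hypothetical limiting magnitudes -- illustrative only, not Unistellar's published data.
visual_limit = 12.0    # faintest star an average observer sees visually in a similar-sized scope
evscope_limit = 17.0   # faintest star the eVscope detects after stacking, same conditions

# Pogson relation: flux ratio = 10 ** (0.4 * magnitude difference)
power_ratio = 10 ** (0.4 * (evscope_limit - visual_limit))
print(power_ratio)  # 100.0 -- a gain of 5 magnitudes is exactly the "100x more powerful" claim
```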
  5. Welcome to the forum, Tom! Thank you and glad you were able to find the review.
  6. I've done several Kickstarters, but never one of this magnitude, so I was a bit hesitant about it. The partnership with SETI is what sold me on it. Outreach is exactly what I had in mind when I backed it, and that will be its primary use. I would love to see one of the big-name astronomy companies come up with a competing product as well.
  7. As you've outlined, and as I provided in my review and the subsequent comments, it definitely has pros and cons. I hope I didn't come across as a shill for the product. I definitely wouldn't recommend it to everyone, and I actually cautioned a fellow club member who was interested in buying one. I wanted it to be clear that this isn't as capable as a dedicated EEVA or astrophotography setup, but also that it isn't intended to be. I've definitely seen better images from EEVA setups and produced some myself. You absolutely need to fully understand what its capabilities and limitations are. The price is certainly one of the biggest cons. Since you're in Europe, your price would actually be in euros (€3000 is roughly £2550), still more than I paid in USD. You can certainly get a lot of nice gear for that price, but that's when you have to weigh all the pros and cons of this against all the pros and cons of a dedicated setup. As I said, had I been required to pay full price, I most likely would have passed. As for being an unfinished product, the scope itself is done; there will be no further changes to the hardware. The app software and scope firmware are still in work and probably always will be, as is the nature of software. About the only features that haven't been fully implemented, or that I haven't had the chance to take part in, are the citizen science projects and raw data processing. The raw data processing is really what interests me: I want to see how stacking and processing software on my PC compares to the stacking and processing done on the scope itself. For comparison, this is 20 minutes worth of live-stacked data that I captured with my astrophotography setup using LRGB filters (15x20s per filter, live stacked and calibrated with darks) and then combined and processed on my PC. It puts the 4-minute eVscope processed image to shame, but I want to see what I can do with the raw data myself.
  8. This is my hope. As I said, it is one of the first of many to come, and as with all technology, it should become more affordable over time. If you look at the price of all the individual components that make it up, it probably is overpriced, as Geoff suggests. It's certainly not 'new' technology by any means, but the form factor definitely is, and integrating all of it into a package like this comes with a price attached, both monetarily and otherwise. The downside to packaging everything together like this is that if one thing goes wrong (say your sensor stops working, the battery goes out, or the mount fails), the whole thing dies, and you can't just swap out a single component for a new one. That's where EAA setups certainly have the advantage. Pros and cons to everything, I suppose, and the eVscope is no exception. As a Kickstarter backer who received a discount for taking a risk, I am happy with the price I paid and think it was well worth the investment. Had I been asked to put down a deposit at full price, I honestly would have passed.
  9. It is certainly that. I will admit the images I posted may not be the most flattering display of its capabilities; they were a bit rushed, as you can see from the integration times. The longer you observe an object, the better it will get, obviously. It is good for public outreach, though. I can't even count the number of times I've tried to point out a galaxy or nebula in the eyepiece to someone, and they say they think they see it but aren't sure, or say they don't see anything at all. This will let them actually see it, and I can have a screen in my hand to point it out to them and show them any structure that may be visible in the object. Can you do the same with an EAA setup? Of course, but as I said, it won't be this compact or easy to set up and use.
  10. Right on with the FOV; it is about half a degree (a quick check of the geometry follows below). Those images were captured using the app. They haven't implemented it yet, but there is supposed to be a feature that lets you download the raw images and process them yourself. And yes, those images are exactly how it appears on your mobile device and in the eyepiece. There are some settings you can tweak for brightness and contrast, as well as exposure length and camera gain.
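Since the FOV question comes up a lot, here's a quick sanity check on the half-degree figure. It assumes the IMX224's roughly 1280 x 960 array of 3.75 µm pixels (about 4.8 mm x 3.6 mm) behind the 450 mm focal length; the pixel count is approximate, so treat the output as a ballpark:

```python
import math

focal_length_mm = 450.0
pixel_um = 3.75
width_px, height_px = 1280, 960   # approximate IMX224 active array

sensor_w_mm = width_px * pixel_um / 1000   # ~4.8 mm
sensor_h_mm = height_px * pixel_um / 1000  # ~3.6 mm

# Field of view from sensor size and focal length: 2 * atan(size / (2 * focal length))
fov_w_deg = math.degrees(2 * math.atan(sensor_w_mm / (2 * focal_length_mm)))
fov_h_deg = math.degrees(2 * math.atan(sensor_h_mm / (2 * focal_length_mm)))
print(f"{fov_w_deg:.2f} x {fov_h_deg:.2f} degrees")  # ~0.61 x 0.46 -- about half a degree
```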
  11. Unistellar started shipping eVscope units to Kickstarter backers a couple of weeks ago. I was fortunate enough to be selected as one of the beta testers. Now that it's released and the app is available on Google Play and iTunes, I thought it was time to provide a review. I won't talk too much about the beta test itself, but I will give you details about my own personal experience with the eVscope over the last few months and my thoughts on its construction.

     I received my eVscope at the beginning of October and was lucky enough to be able to use it the same day. When was the last time you received a piece of astronomy equipment and the skies were clear enough to use it the same day? In fact, I had two clear nights in a row. On my first viewing, I was honestly rather unimpressed and somewhat disappointed in the performance. The mount had some trouble tracking and I was getting star trails; I could see the stars drifting through the live view. There was also a considerable amount of noise in the stacked image, in the form of light pollution that should have been processed out. I wrote up my report and sent it to Unistellar. A few days later I received an email saying they had taken the feedback from me and other testers and made some improvements to the software and firmware. All I had to do was update the app, and the next time I connected to the eVscope it would be updated. To my surprise, it helped tremendously: I started seeing stacked images much closer to what I would have expected.

     Over the course of the next couple of months I continued to make observations and provide my feedback. In turn, Unistellar continued to make improvements to the app and the eVscope firmware and provided more information on how to calibrate the hardware to get a better image. In addition to a large focus knob on the back of the scope, there are collimation screws to better align the primary mirror to the sensor. That's all the collimation and adjustment that's required, and it's very easy to do.

     During the couple of months I had the eVscope for beta testing, I took it to a star party our club was hosting one evening. This would be the first time anyone other than me and my immediate family had seen the device. About 450 people showed up that night; a crowd that size was a bit unexpected and overwhelming. Per the NDA I signed, I wasn't supposed to show the eVscope to anyone or even talk about it or the beta testing, so I chose not to show it to the general public. However, my club members knew I had it and knew all about it, as I had been talking about it for a couple of years. I wanted to get their impressions and feedback so I could pass those along to Unistellar too; they were likely to be more objective than I was, since I was invested in it. I waited until all the guests left before I pulled it out and set it up for a handful of members. I also really wanted to try it out at that location, as that is the primary place it will get used. All of the members who had the opportunity to look at the images were impressed by them. Again, the eVscope was performing as I had originally expected it to, and as the club members had expected based on what I had told them about the product.

     There are those who will say this is just a gimmick and it's not the same as the photons falling directly on your eye. I would say to those people that this device is not for you. It is also not for dedicated astrophotographers who want highly detailed, vibrant images of objects using data collected over several hours; I have an entire setup dedicated to astrophotography myself, and I fully admit it is much more capable of pulling out detail than the eVscope. Arnaud said in the promotional videos that he was disappointed when he looked through his telescope and could only see faint fuzzy objects with no detail. That is who this telescope is for: the people who want to see more detail than a faint fuzzy spot, but don't want to deal with a full astrophotography setup and spend hours getting the data. We want to spend 10-15 minutes looking at one object and move on to something else.

     Some would also say that looking at an image projected on an OLED screen in a viewer is not the same as looking through an eyepiece. I would tell those people they are incorrect. I got the same reactions out of experienced club members looking through the viewer that we get out of the general public when they look through an eyepiece for the first time. It still provides a sense of wonder and a personal connection with the object you're viewing, certainly more so than looking at a computer or mobile device screen. I think the decision to include an OLED screen viewer was the correct one.

     Is it expensive? Absolutely. I backed the Kickstarter, so I received mine for significantly less than retail, but that saving was not without risk. So, what do you get for the retail price of $3000 USD? You get a sturdy, lightweight tripod with a bubble level, a mount, a 4.5" telescope, a camera, a power supply, a computer, and image integration and processing software with plate-solving capability. I've said it before, many times, but if you were to build a similar system, you could easily spend a couple of thousand dollars. What you won't get for that, though, is something as easily transportable and simple to set up. The tripod, telescope, and small toolkit all fit in a backpack, and the whole thing weighs around 30lbs. Setting it up couldn't be simpler either: you set up the tripod, put the telescope/mount on top of it, turn it on, connect to your phone over Wi-Fi, and you're up and running in less than five minutes. There's no polar alignment or star alignment required, as it plate solves at the push of a button. This system really is about as easy as it gets to set up and use, but that obviously comes at a cost. And while the idea of a backpack is nice, I opted to put everything into a Pelican Air 1615 case, which brings the weight closer to 40-45lbs.

     Could the imaging quality be improved? Probably; everything can always be improved. The question, though, is how. They could have used a larger optical tube, but that comes at the cost of portability, which is a key feature of this product. The sensor used is the Sony IMX224LQR. It is very sensitive and has extremely low read noise, both of which are obviously critical in this application. They could have used a larger sensor or one with smaller pixels, but why? A larger sensor would certainly give a wider field of view, but the field is already fairly wide with a focal length of 450mm; I can fit the entire Moon in the field of view. A slightly wider field would take in all of Andromeda and the Orion Nebula, but those are some of the largest objects we would look at, and the extra field would be wasted on smaller things like planetary nebulae or other galaxies.

     As for the pixel size, 3.75µm is neither small nor large and is a perfectly reasonable choice. With this focal length it gives a sampling ratio of 1.72"/pixel (worked out below), which is neither oversampled nor undersampled, except possibly in some of the best seeing conditions. A slightly smaller pixel size of 2.9µm or even 2.4µm would give slightly better detail, but would also require much more accurate tracking. Speaking of tracking, how accurate does it need to be? It is an alt-az mount, and the exposures are currently software-limited to about four seconds. With exposures that short, the tracking doesn't need to be accurate enough to compare with an autoguided system. I honestly think that for the size of the device and the length of the exposures being used, the chosen sensor is about as optimal as it gets; you're going to be hard pressed to find a sensor with smaller pixels that is more sensitive and has such low read noise.

     Overall, I would say I'm satisfied with how it currently performs, but I fully admit there is room for improvement. Unistellar continues to listen to its users and take feedback into consideration on how to make the product more capable than it already is. I know they value my input and that of others, as is evidenced by the improvements they've already made to the device software and firmware. I look forward to all the improvements they have planned, and to all the innovations that will be made in this area of the field in the future. This device is just one of the first of many to follow.
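For reference, that 1.72"/pixel figure drops straight out of the standard plate-scale formula: scale = 206.265 x pixel size (µm) / focal length (mm). A minimal sketch in Python:

```python
def pixel_scale(pixel_size_um: float, focal_length_mm: float) -> float:
    """Image scale in arcseconds per pixel: 206.265 * pixel size (um) / focal length (mm)."""
    return 206.265 * pixel_size_um / focal_length_mm

print(pixel_scale(3.75, 450))  # ~1.72"/px, the sampling quoted above
print(pixel_scale(2.9, 450))   # ~1.33"/px -- finer sampling, but more demanding on tracking
print(pixel_scale(2.4, 450))   # ~1.10"/px
```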
  12. If you're looking for help for your specific issue, you might be better off creating your own topic in the Beginners or Observing sections. However, the only piece of advice I can give you is that your focuser may not have enough backfocus (outward travel) to reach focus using your barlow. If you have one, you may want to try an extension ring on your focuser, put the barlow in that, and then try to achieve focus again with your eyepieces.
  13. If anyone saw my post on the Imaging Tips and Tricks board, you would know that I had some problems the other night with my rig. I was apparently out of focus and didn't find out until I went to process the data and had holes in my dimmer stars. Well, that got fixed the next night. I was fortunate enough to have two nights in a row that I could image; that almost never happens around here. But it was extremely cold both nights, with temps dropping to around -5°C. I about froze my toes off. I also had some issues with PixInsight while processing that I just couldn't quite figure out at first, until I realized my error. M45 is a very tricky object to image and process. I've seen other images of M45 with much larger sweeping clouds, and it makes my image feel a bit inadequate. Then I realize those images were taken with a telescope that has 16x the aperture, an f/ratio twice as fast, and at least 50% more integration time. So I'll take what I can get with my little RedCat. Tech specs (the integration arithmetic is checked below):
      • ZWO ASI183MM Pro
      • ZWO EFW with 1.25" ZWO LRGB filters
      • WO RedCat 51
      • iOptron CEM40EC
      • 15x180s each filter (3 hours total integration), unguided
      • Calibrated with darks, flats, and dark flats
      • Captured in APT and processed in PixInsight, with some minor level adjustments in Photoshop
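The 3-hour figure in the specs is just subs x sub length x filters; a trivial check, but the same arithmetic works for planning any filter run:

```python
filters = ["L", "R", "G", "B"]  # one channel per LRGB filter
subs_per_filter = 15
sub_length_s = 180

total_s = len(filters) * subs_per_filter * sub_length_s
print(total_s / 3600)  # 3.0 -- hours of total integration, matching the specs above
```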
  14. My donuts have been eaten! Lesson learned: don't trust the stretch in APT. I put the image in PixInsight and it was much more obvious that something was wrong. The stretch in APT is so noisy it was hard to tell that anything was amiss.
  15. I have a friend/club member who has the AF3. He hasn't had much time to use it. I wanted his thoughts on it before I pulled out my wallet, but I will probably end up with one at some point; manually focusing a helical focuser is somewhat of a pain. I'm going to try again tonight since we will have even better seeing conditions than last night. I'll just have to be more careful, I guess, and be sure to check the stretched data as it's coming in. I've just never had what I thought was good focus, only to find holes in the stars when I start processing. The whole situation is just very strange to me. I'd be thankful for any other tips or things to look out for and check.