About noah4x4 (Advanced Member, Rank: Star Forming, Community Reputation: 384 Excellent)
  1. Picking up from the previous post.... The question for me is whether to pursue longer exposures, more stacked frames, or both. I have been using an Atik Horizon CMOS for around a year. It is awesome when using short stacks on Hyperstar for EAA "observing": I need no wedge, no polar alignment routine, and I see stuff that I would never see through an eyepiece even with an enormous Dobsonian. My local light pollution is pants, and "EAA" using this high resolution, large sensor CMOS camera has given me much joy.

However, my growing conclusion is that, if we exclude NV and 'Video', the vast majority of EAA participants are actually frustrated astrophotographers who will probably not be satisfied unless their "observations" ultimately produce 'images' that mirror the quality of the images we generally see posted in the forums. After all, adding 'darks on the fly' and similar processing has become routine practice to gain an incremental advantage. I now find myself in that group, and my frustration is generally due to poor seeing and light pollution. My exposures have hence grown progressively longer and my stacks numerically greater, up to the limit that field rotation permits on an Alt-Az mount (under 30 seconds), and my 'imaging' seems to have reached a plateau.

Hence, in the last week I have reverted to using my wedge and polar alignment, with the intent of increasing my typical subs from 20 seconds to 60 seconds or longer (but still on Hyperstar). So far clouds have frustrated me, but I have no doubt that tripling the length of my exposures must boost things. I obviously don't know for sure, but I suspect Hyperstar on a GEM or wedge using 60 to 120 second exposures may be where I want to be. As Hyperstar at f/2 equates to exposures of 28x that of regular long exposures, maybe this is where more of us will eventually head, albeit traditional CCD will survive.

But it does appear to me that "EAA" has moved on, with a trend towards far longer exposures than the 5s to 10s where its pioneers started. The blurring of lines between EAA and AP probably now means it's all AP (except for NV). But I don't think that matters; it's just an inevitable progression.
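A rough sanity check of the "28x" figure: light-gathering speed scales with the inverse square of the focal ratio, so a quick sketch like the one below reproduces it. The f/10.5 native focal ratio is my own illustrative assumption (a typical SCT before Hyperstar conversion); substitute your telescope's actual figure.

```python
# Exposure-equivalence sketch: speed gain goes as (native_f / fast_f)^2.
# All figures are illustrative assumptions, not measured values.

def speed_gain(native_f: float, hyperstar_f: float = 2.0) -> float:
    """Exposure-time ratio between native focal ratio and Hyperstar f/2."""
    return (native_f / hyperstar_f) ** 2

def equivalent_exposure(hyperstar_seconds: float, native_f: float) -> float:
    """Native-focal-ratio exposure collecting the same light per pixel."""
    return hyperstar_seconds * speed_gain(native_f)

print(speed_gain(10.5))               # 27.5625, i.e. roughly the "28x"
print(equivalent_exposure(60, 10.5))  # 1653.75: a 60 s f/2 sub ~ 28 min at f/10.5
```

So a 60 second Hyperstar sub is, in light-gathering terms, close to half an hour of conventional long-exposure imaging, which is why the EAA/AP line blurs so quickly.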
  2. I take my grab & go Nexstar SE4 to Tenerife, as inside the caldera of Mt Teide is one of the best locations I have ever found on this planet. Its steep lip even shields glare on the horizon. It's a long drive up in a hired, underpowered Fiat Panda, but quicker going down. But beware.... On my first visit, after three peaceful, idyllic hours, I heard the roar of dozens of motorcycles at midnight and saw a cloud of dust heading in my direction. Heck, was it the Santa Cruz Hells Angels? Mad Max? Wacky Races? I have never packed my gear so fast.... But I didn't need to. Almost every night there is an organised tour party of quad-bikes that comes up the mountain to "see the stars" and enjoy the desert-like experience. They will stop, perhaps chat with me and my wife and show some curiosity, and always be respectfully gone in ten minutes, as the organising leader of the group is very polite. But the first time, it's a bit unnerving.
  3. I use a 'tool box' like this to help speed up getting scope + camera + autofocuser connected to my indoor 'Mission Control'. Inside a regular (£10) 'Stanley' plastic toolbox I have built a wooden cradle to which I have affixed a power source (Tracer 22Ah), my autofocuser controller, an Intel NUC computer, etc. Three cables then run from the Intel NUC up to my OTA inside the white cable wrap (camera power, camera USB3 and the focuser's USB2 cable). The Intel NUC depicted is also connected to my scope using a SkyPortal external WiFi accessory, and I control the scope using Celestron CPWI for Windows PCs. Then, using a second (USB external) wireless adapter plugged into the scope-side NUC, I control this 'Tool Box' and the devices within it from indoors using Windows Remote Desktop over a wireless network.

Whilst my set-up is 'end to end' wireless, embracing 4K UHD over a 12 metre remote control distance, the basic 'Tool Box' design concept is possibly also of help to those who travel to dark sky sites and need a similar highly portable, pre-assembled 'gadget box'. It could contain merely your power sources and any other loose bits you need to transport. Perhaps its neatest feature is that the spaghetti of cables connecting my power source, focuser and computer is pre-assembled and neatly hidden inside the wooden cradle in the 'Tool Box', below the devices. I could easily run an autoguider cable from it too.

Previously I had an obsession with hanging everything off my scope, and nearly hanged myself as I had so many trailing cables. Then the 'penny' dropped as regards how best to do this. But there was an interim alternative, details of which I post below, as it too might be of assistance... Below is an earlier prototype where I had instead affixed the (then three-sided) wooden 'cradle' to the tripod legs. This photograph highlights how easily the cabling to the OTA can be neatly managed in the cable tidy wrap.

But this affixed-to-scope design merely added weight to the tripod/mount, and as I wanted to revert to my wedge, that overall weight became prohibitive. That was how I came up with the 'device cradle in a toolbox' idea. But both concepts have some merit, notably cable management.
  4. Much depends on equipment and accurate user data inputs. My Celestron Evolution or Nexstar SE4, if auto-aligned using StarSense with data sourced from GPS, has superb GoTo accuracy. It will consistently centre my targets even in a 12.5mm illuminated cross-hair reticle EP.
  5. Sorry to say this, and I am not saying it is impossible, but I think you have to be realistic and accept that the Celestron Astromaster 130EQ has one of the least suitable mounts for doing astrophotography with a bulky DSLR. I suspect that you will suffer from wind-shake and general instability; balance may be the least of the physical challenges. You might be OK with some short stacked exposures of the Moon, Jupiter and the Orion Nebula etc., but merely keeping your target in your FOV might be very difficult. I owned its predecessor, and even with cheap, lightweight Celestron webcams it was very difficult to achieve more than I have described. My advice is give it a go, have a bit of fun experimenting, but don't commit more money unless it is to a superior mount. It is unfortunate, but this hobby can be very expensive if you aspire to do astrophotography, and the first requirement is a quality mount.
  6. The computer at the scope is running Windows 10 Professional. Hence you reduce or disable RemoteFX compression ONLY on that machine, as it is acting as the primary 'server' device. The computer running indoors is simply replicating the screen output of the primary device (which is running 'headless') with its own keyboard and mouse; it hence performs the role of a dumb terminal. But even if simply receiving and redrawing 4K UHD screen data, it requires a data transfer rate exceeding the artificial restriction of 10 Mbps. However, at 1080p HD no amendment might be necessary. Frankly, I would initially try without adjusting RemoteFX compression, as if your camera isn't demanding it may run fine on normal Windows Remote Desktop settings. However, if you suffer any lag or stutter, try it before investing in greater computing or network power.
  7. Upgrading to Win 10 Professional is easy: you pay your money to Microsoft and it auto-updates. Windows 10 Professional gives you so much more control. See my tip about RemoteFX compression in the previous post.
  8. Some relevant information I learned about this month... A reason why high resolution (16 megapixel) cameras often stutter and suffer lag over Windows Remote Desktop (or similar) is 'RemoteFX Compression'. You might have a 433 Mbps wireless network, or an even faster Cat 6 cable LAN, and it won't let more than 10 Mbps through the system; 4K UHD screen data requires far more than this. The reason it exists is that in commercial networks, or across the Internet, it is designed to prevent any single user from throttling the wider network with excessive data. But if you are strictly confined to your home network you don't need this degree of compression, and if you reduce or disable it your network will become turbocharged.

Not only can I now run my 'end to end' 4K UHD system wirelessly, I reckon that I could now achieve that using lesser computing power and certainly lesser network capacity, hence saving considerable money. Lower specification compute sticks are therefore potentially more viable, but do check their memory and storage capabilities, as each individual 16 megapixel frame tends to exceed 48 MB. I still favour an Intel NUC (and you would need Iris 640 graphics for 4K UHD). But this tip will speed up data flow on any system, including 1080p HD.

To learn how to fully manage RemoteFX compression and its environment, visit this document: https://docs.microsoft.com/en-us/windows-server/administration/performance-tuning/role/remote-desktop/virtualization-hosts

Below is the key text for the purposes of reducing or disabling RemoteFX compression (note this applies to Windows 10 Professional etc.):

RemoteFX data compression: Microsoft RemoteFX compression can be configured by using Group Policy under Computer Configuration > Administrative Templates > Windows Components > Remote Desktop Services > Remote Desktop Session Host > Remote Session Environment > Configure compression for RemoteFX data. Three values are possible:
  • Optimized to use less memory: consumes the least amount of memory per session, but has the lowest compression ratio and therefore the highest bandwidth consumption.
  • Balances memory and network bandwidth: reduced bandwidth consumption while marginally increasing memory consumption (approximately 200 KB per session).
  • Optimized to use less network bandwidth: further reduces network bandwidth usage at a cost of approximately 2 MB per session. If you want to use this setting, you should assess the maximum number of sessions and test to that level with this setting before you place the server in production.

You can also choose not to use a RemoteFX compression algorithm. Choosing not to use one will use more network bandwidth, and it is only recommended if you are using a hardware device that is designed to optimize network traffic. Even if you choose not to use a RemoteFX compression algorithm, some graphics data will still be compressed.

I was so frustrated by lag and stutter that I decided to completely disable the RemoteFX compression algorithm, but others tell me that the three less aggressive values (bullet points above) will also work great. Just don't do it over the Internet or your office network! My system now flies irrespective of the screen data load I am transmitting using Remote Desktop. Sorry, I don't know how to do this in TeamViewer or VNC, or even whether it is possible.
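To see why the 10 Mbps cap hurts so badly with a 16 megapixel camera, here is a back-of-envelope sketch. The 24 bits per pixel and the uncompressed-transfer assumption are mine, for illustration only; real RDP traffic is at least partially compressed.

```python
# Frame size and transfer time arithmetic, assuming 24-bit uncompressed
# frames. Illustrative figures only.

def frame_megabytes(megapixels: float, bits_per_pixel: int = 24) -> float:
    """Raw frame size in MB for a given sensor resolution."""
    return megapixels * 1e6 * bits_per_pixel / 8 / 1e6

def transfer_seconds(frame_mb: float, link_mbps: float) -> float:
    """Time to move one frame over a link of the given speed."""
    return frame_mb * 8 / link_mbps

frame = frame_megabytes(16)           # 48.0 MB per 16 MP frame
print(transfer_seconds(frame, 10))    # 38.4 s at the 10 Mbps RemoteFX cap
print(transfer_seconds(frame, 433))   # under 1 s on a 433 Mbps wireless link
```

So even with generous compression, a 10 Mbps ceiling leaves a big-sensor camera many times short of real-time refresh, while the same wireless hardware uncapped is comfortably fast enough. That is the whole story behind the stutter.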
  9. This might be controversial having read the posts so far, but the question for me is: given that 'imagers' already have their own (multiple) forums, where should Electronically Assisted Astronomers, hence pure 'observers', go? Why shouldn't there be an exclusive forum with an emphasis on observing other than through an eyepiece?

The CN EAA Forum has recently become overwhelmed by NV (fair enough) and, because there are no integration or exposure time limits, by CCD/CMOS astrophotographers who argue that 20 minutes of total integration time is 'near live' observing because it comprises 80 x 15s stacked frames, hence "short" exposures. The truth is that, allowing for discarded 'bad' frames, it is probably nearer 30 minutes of activity. Then they add 'darks on the fly' and other camera processing, yet post-processing is prohibited. Then they post 'images' to demonstrate their prowess as astrophotographers, and it has all become too competitive and critical of images, with negligible discussion about 'observing'. That forum has hence become far more about astronomy as an art form than as a science, and many pure EAA observers have, as Bruce suggests, stopped contributing. I haven't, because I am hoping to see future solutions. As others have commented, the rules are restrictive and counterproductive; there are no winners, just confusion about forum aims and objectives.

We don't want this SGL forum to suffer similarly. Those of us who are genuinely interested in Electronically Assisted Astronomy and rarely save our work as an image now feel reluctant to post anything other than equipment discussions. I can have a decent observation of the Horsehead on screen within 30 seconds using Hyperstar, but I might as well go and watch football if I need to wait a further 30 minutes to 'develop' it. I don't publish my results, as that has become a competition for best pictures, and I am not into photography. I use "EAA" simply to defeat light pollution.

However, I respect that 'imaging' can be a natural by-product of observing. But why can't 'imagers' remain in the existing imaging forums? In summary, EAA embracing Video and NV is truly 'live', and that could embrace EAA using CCD/CMOS strictly as a 'near live' observing discipline. But anything else is either long exposure or long integration, short exposure astrophotography. My view is that solutions lie in creating SINGLE exposure and MULTIPLE (stacked) exposure sub-forums under 'Imaging'. Then create NV, Video and EAA sub-forums within "Electronically Assisted Astronomy", with the emphasis there on observing, but stacked CCD/CMOS can feature as an observation tool. Equipment discussions can feature in either. Then everybody has a suitable forum to accommodate their interests. Just my two cents.
  10. Stuck behind a tractor for 20 minutes. Then you safely overtake it, go round a long bend at a safe 40 MPH, spot a 30 MPH sign, brake sensibly and progressively, then within 20 yards get caught by a speed gun operator hiding in a hedge: 34 MPH, £100 fine + 3 points. My last visit to Kelling Heath. Be warned, speed restrictions are ruthlessly enforced in Norfolk to support the local economy.
  11. I bought a 7" x 720p 12v camera field monitor for exactly the same diagnostic purposes and learned a costly lesson: it was a waste of money. The monitor at the scope should NOT have a resolution below that of your display device 'indoors'. Here is why... If you connect (say) two computers using Windows Remote Desktop, the host computer's resolution (now limited by the diagnostics monitor) will dictate your 'indoor' screen resolution, and that will also become your default resolution. So even if you disconnect the lower resolution monitor, you will be stuck in that lower mode until you once again connect a higher resolution monitor and reset the display settings. Likewise, if you set NUC A at 4K UHD and NUC B at 4K UHD then both will run at that resolution; however, if either is merely 1080p HD then that will be the limit. (BTW, the secret to getting 4K UHD camera screen data across a wireless network is to disable RemoteFX compression in Group Policy, but I digress.) So, in summary, if your primary display is a fairly standard 1080p HD, don't buy a lower resolution monitor for diagnostics. But if it is 4K UHD there is no inexpensive solution; I now temporarily take my (indoor) 4K monitor to the scope if I need diagnostics, as it is easier and cheaper to run a 50ft AC power cable to it than any other solution.
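The lowest-display-wins behaviour described above can be summarised with a toy model. This is purely an illustrative sketch of the rule of thumb, not any actual RDP API.

```python
# Toy model: a Remote Desktop session is clamped to the lowest-resolution
# display in the chain, and that clamp can persist as the host's default.

from typing import Tuple

Resolution = Tuple[int, int]

UHD_4K = (3840, 2160)
HD_1080P = (1920, 1080)
FIELD_MONITOR_720P = (1280, 720)

def session_resolution(host: Resolution, client: Resolution) -> Resolution:
    """The session runs at the smaller of the two displays (by pixel count)."""
    return min(host, client, key=lambda r: r[0] * r[1])

# Both NUCs at 4K: the session stays at 4K.
print(session_resolution(UHD_4K, UHD_4K))              # (3840, 2160)
# Plug a 720p field monitor into the scope-side host: everything drops to 720p.
print(session_resolution(FIELD_MONITOR_720P, UHD_4K))  # (1280, 720)
```

In other words, the cheap field monitor does not just degrade the view at the scope; it drags the whole remote session down with it, which is the costly lesson.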
  12. Caution when driving on the North Norfolk rural roads approaching Kelling Heath. Almost all of these long rural roads are strictly 30 MPH, and do expect speed guns or cameras around every corner. North Norfolk is notorious for vigorous speed enforcement.
  13. Why is that not helpful to the OP, who said he is planning to buy a mount? If his original choice can't meet his needs because he is unable to see Polaris, his best option is to choose another mount that does offer a solution.
  14. I echo this. I can't see Polaris due to obstructions, but Celestron mounts have a feature called All Star Polar Align (ASPA). It works brilliantly with just the HC, or StarSense-aided. It has also been added to Celestron's new CPWI software, now in beta test.
  15. Don't follow the instructions printed in older SE manuals that describe a 'Wedge Align' if using a Nexstar + HC. That process has been replaced by the 'All Star Polar Align' (ASPA), for which your start position is with the OTA aimed South at its index marks and the mount leaning North, set at your latitude if using a Celestron wedge. I wonder if that is your problem: is your home-made "latitude scale" correctly calibrated? Or are you simply trying to target Polaris using the redundant 'Wedge Align'? An ASPA permits a polar alignment even where Polaris is not visible.

Assuming your home-made wedge replicates a Celestron wedge and you can roughly set it to your correct latitude, you first need to perform an EQ North Align (follow the instructions in the HC). Then perform a Polar Align. Then, when mount and wedge are physically polar aligned (with a Celestron wedge, you twiddle the knobs here to remove the "Polar Alignment Error"), reset the OTA to its start position, recycle the power, and conclude with a further EQ North Align.

I think crucial to success will be how you calibrate the angle to set your latitude; I am not sure how you do that with a home-made wedge. Even with a proper Celestron wedge I found the margins for error quite slim (I now use Hyperstar, as I became so frustrated with my wedge, albeit I mastered it).

P.S. Your OTA is upside down. Your mount needs to be rotated 180° and the OTA correctly affixed.