Everything posted by GalaxyGael

  1. Looks like a new Vixen flatfield scope is being announced soon. Live feed here. And some images from the booth. Image from T Yoshida attending CP+2023. Just north of 4700 EUR, or £4200. I like the look of it; I wonder how it performs and what the image circle and other features are.
  2. Oh yes, I used to a lot. I lived in a fairly remote area, Bortle 2-ish, and the summer sky after midnight with the Milky Way in full view never got old. I miss it terribly now that I think of it. Even here in B6/7, Orion as a constellation always gets a double take when I look up.
  3. Christmas Tree Cluster, Cone Nebula and IC 2169, among other reflection and vdB regions, in a fairly widefield (300 mm FL, 60 mm aperture) system using the Askar FRA300 quintuplet. Limited to 5.8 hrs, much less than planned, but such is the permacloud of this season here, and the scope has been changed out for spring. Captured with the ASI2600MC at -10°C and gain 100 on this one, using 178 x 120 s subs. This one was processed through APP and PS CS5. I think I will revisit the processing, as my night-time and daytime processing on such a large and low(ish) resolution screen come out so differently! Hope you all have some clear skies.
  4. That could be it. I was going to ask if you had an IR-cut filter installed (but maybe your L filter has UV/IR cuts?). Curious: are the broader, more diffuse shadows likely from IR, with the banding from the iffy connection?
  5. Seems like a good idea, but is it safe at night when there is no one, or fewer people, around? Luckily there are likely not going to be too many flying their planes at night, because they can end up coming down in an unintended and uncontrolled way!
  6. For very widefield (depending on sensor size) the FRA300 is nice and easy to use. Good color correction for imaging, a flat field, and wide (300 mm) at f/5. It is light, with a well-thought-out long dovetail and handle/guide scope placement if you need it. I enjoyed using it for a few months and got a good sample. Off to the classifieds for a new camera. I found adding an Astrozap dew shield removed the need for heated options and improved stray light control too. The Askar focusers are solid, no sag.
  7. I am in the same camp when it comes to choice, at least. Aside from what I consider to be the main focus for my images and how they are processed, the big enjoyment is the act of processing and of course (I can only speak for myself) the use of my own gear to image. That could easily go down a rabbit hole of remote vs home etc., and I'm not referring to that, but in my case, or in remote cases, it is the ability to choose what we do, and to spend the time and effort to take and enjoy our own images of something that has a million versions in existence already. We also have the chance, with careful dedication, to find faint and new things out there. I'm on the fence on whether computational approaches could tease out 'anomalies' across many, many images to pick up common but oft-ignored data that could be something new; I doubt it, unless we image enough that we have the data to make the discovery anyway. AI does this frequently with scientific data sets, correlations, interpolations etc. that many people miss, given the brute-force way it can work through millions of options rather quickly. But maybe that is not really relevant for us. We will always have the option to enjoy taking our own images, I think, and that's the big enjoyment and emotional factor in the rollercoaster ride of imaging, but AI and networking algorithms that can link, predict and act on existing archival data will be able to do what I wrote and more (sparked by a recent showcase from colleagues developing such tools for large-scale data mining of battery systems to predict failure modes and new chemistries from existing knowledge, which blew my mind). But, as now, it is doubtful we will apply a phone-app sepia and scratched-film filter on our images just because the option is there. And I do think a single-click option can be done; it just needs someone to have a good enough reason to implement it, and that might be its death knell before it would ever happen. Then again, there is Vaonis and others making essentially single-click all-in-one systems, but the control and application of your knowledge and experience is very limited in turnkey systems. Anyway, I just bought a 460EX out of technological nostalgia, and this thread may have turned left at Orion.
  8. I had wondered something similar. We already have it for questions, comments and prose in the form of ChatGPT. Whether AI uses neural networks and other mechanisms within itself to assess and process data in the form of an image (like Russell Croman's efforts), or whether (like Topaz) it is trained on available images and imposes that knowledge, is another aspect. In the latter case, given that we can process 10 images in 10 different ways according to taste, AI training on existing images may result in a whatever-you're-having-yourself outcome. These tools don't parse and learn from a standardized set of descriptions and processing steps. If we had these and had to log them for each image (like a FITS header, where such info was needed), in the form of final-image metadata that gave the processing steps, values etc. that converted a raw image to what is posted, then it is possible that AI could see the influence of parameters and learn accordingly. But the artistic side of astrophotography is very inconsistent in terms of how much information is provided, and there is no reason for it to be necessary unless quantitative measurements etc. are needed. Like the professional astronomers, who put up with ghosting and blooming in the 'perfectly' linear CCDs they need for photometry and other measurements, many of us take quite similar stacked FITS files and end up with something that aligns with our taste. But, to each their own. I like star fields, and I am not a fan of diminishing them, since the nebula is never the main target for me if it is part of that region of sky; others like the nebula and its details as their main interest. I am also sensitive to 'added stars' and see them immediately; balancing the stretch for starless and stars can be tricky for some targets. While their representation is a function of the optical characteristics of the scope (Airy disc for refractors, diffraction spikes for some reflectors, etc.), their color, relative size and density are very appealing to me as the headline of an image, but not at all to some people. And between us all I get to look at lots of images and see almost everything about a target. Many complex processes are now single-click to start with, and I predict it is inevitable that automatic processing will become an option very soon. We already have it on phones and PCs for pictures, and I foresee a series of single-click 'views' for a FITS that includes nebula prioritisation, a starless option, reduced stars, color saturation (the candy-crush color calibration will make a comeback :), noise reduction etc., all built in to single-click options, just like the b&w, sepia, vivid, soft, polaroid and other effects we have for photos. The end result may not be what is preferred by some, or perhaps many, but there is a wealth of existing data, often voted as excellent through awards or forum discussion, that AI can find, rank and learn from. In such a scenario, AI can track how many people liked an image, how soon they reacted to it, how many did and how soon after posting (emotional intensity), what comments were made in text, how the image stood in regard over time, and many other factors. This is easy for AI to do, and notionally we all collectively would be the parents, or the village, that raised the AI child, i.e. the auto-process of a particular target based on all available data and our reaction to it aesthetically and technically. Imagine that 😬
  9. If your secondary is concentric and collimation is good, but you see the vignette move around when you move the focuser and camera together, you might have sag or tilt in your focuser drawtube, such that the sensor is not parallel to the long secondary-to-primary optical axis of the OTA. It could also happen if your secondary is not perfectly concentric and centered on the optical and mechanical axis of the focuser, with the secondary positioned a little too far down or up the tube; that is less likely if your collimation is good. Worth checking, just in case any focuser sag already existed when you collimated.
  10. Lucky you with clear skies 😁. Murphy and his law like to come out at night... you could sneak a spacer on in the day and measure before he gets wind of it! I'm hopeless with Allen keys and the like in the dark.
  11. Yes, and if you have calipers (very useful), measure to confirm. I found that having many screw adapters adds a tiny extension from the mating of each threaded component, so I always measure from the sensor (the bottom of the black dot on the side of ASI camera bodies, for example) to the bottom of the part that eventually screws onto the flattener.
  12. Yes, if you are adding an extra filter with 2 mm thickness. 56.2 mm is typical of Takahashi scopes, but people often make the mistake of assuming you are using a normal flattener/reducer with 55 mm backspacing. They then assume that your 56.2 mm is the standard 55 mm + 1.2 mm, or in some cases 55.5 mm + 0.67 mm. That is not the case if the stated backspacing for your particular system is 56.2 mm (i.e. not 55 mm like most flattener/reducers). Is your scope a Tak with a 56.2 mm spacing? If so, and you are adding a new filter, it will increase to ~56.9 mm as you wrote. EDIT: I should add that if you are not using a Takahashi scope where the flattener backspacing is 56.2 mm, and you are using a 0.8x reducer whose backfocus depends on focal length, e.g. https://www.teleskop-express.de/shop/product_info.php/language/en/info/p5965_TS-Optics-REFRAKTOR-0-79x-2--ED-Reducer-Korrektor-fuer-Apo-und-EDs.html, then you just need to add 0.67 mm to the recommended value and fine-tune from there. This all assumes that the manufacturer backfocus values have no wiggle room, and many of them do. For the Tak, it's 56.2 mm, and you add 0.67 mm with the extra filter. For a 55 mm backfocus, increase to 55.7 mm and fine-tune from there. I don't know which reducer/flattener you are using.
  13. If the refractive index of the filter is ~1.5, then the refraction it causes at 2 mm thickness can be compensated by adding approximately 1/3 of the thickness to the backspacing: approx. 0.67 mm for a 2 mm thick filter (a quick sketch of this arithmetic follows at the end of this list). It is normally measured at a single wavelength, but we assume the colors are coming to focus together by the time they hit the filter and that any refraction difference from green to red is negligible for a 2 mm thickness. For cameras with either IR-cut or AR-coated glass installed (sometimes, for argon-filled chambers, it is quartz glass to pass certain UV wavelengths, with a refractive index of ~1.55 measured at a similar wavelength), compensation would also be made. But I did read somewhere that for ASI cameras, say a 2600MC with the IR-cut filter installed as a working example, the 17.5 mm backspacing to the sensor is the optical backspacing, i.e. the effect of the filter is already included in that value. You only need to compensate with additional backspacing if you are adding additional filters into the optical train in that example.
  14. That's a handsome view. The odd few panels each year would amount to quite the view as a 5-year, spare-time project. This turned out so well, what could go wrong?
  15. My guess is a new type of R&P focuser option, removing the moving-mirror capability and the potential for image shift etc. Sharpstar's homage to this scope, the SCA260, uses a beefy focuser with a corrector installed, though not the same optical setup of course. Maybe Tak are on the verge of a modernization of focusers and the optical design of their refractors, given the small-pixel cameras at relatively high sampling rates that in some cases require a user to add a flattener, with backfocus to respect, to a scope designed as a Petzval. That is pure conjecture on my part of course. Maybe they might be changing the three-pronged spider vanes on the 180, although with the advent of JWST images with 6 spikes, that was a missed advertising campaign right there. Or, while the on-axis capability of the big Mewlons is stunning, there is 'scope' for a modification of the Dall-Kirkham design with a reducer to possibly allow for DSO imaging... even more speculation on my part.
  16. The GEM45 has been flawless and very light to handle. Mine is the 'transitional' model, to borrow a phrase from the watch community, where the saddle is from the GEM45G and includes two 12 V 3 A 2.1 mm power outputs and a USB port. It has the space for the iGuider, but none installed; one could be added too.
  17. Here are a couple of images that I found. What a great scope; a tinge of seller's remorse just occurred! After the first rig image, I got a shortened USB cable so I could power and connect the camera to the GEM45 saddle, and power the EAF and guide cam from the main camera. Only one cable coming from the mount to the laptop.
  18. Cheers, that scope gave that pinpoint star appearance.
  19. I think I do, will have a look. Possibly one on the mount, maybe at dusk though, another of the scope and imaging train that I have on the PC. Will post it in a while for you.
  20. IC 342, the Hidden Galaxy, way out there in the cirrus soup, looking like it would through your school jumper. The exposure captured some of the dusty cirrus, and given that the galaxy was relatively small in the FOV, I kept the 'hidden' appearance. Imaged in 203 x 120 s exposures (~6.75 h) using my former (sob) Tak E-130D and an ASI2600MC Pro at gain 0, -10°C. I think that was the last image I took using that scope. All on an iOptron GEM45 with an OAG/290MM Mini. Captured sometime around Oct 20, 2022, but it's been a cloudy winter here, and it's snowing this evening. CS everyone.
  21. Thanks Lee. I was happy with this, and it was the first time I used the 90 mm triplet for imaging. I always saw a small person with a scarf in this target; the description is a tag-on from my experience as a 6-year-old. That included swims on rainy days because 'the water feels warm in the rain', or so I was told.
  22. This was from August 2022, when I took a shine to my 12-year-old 90 mm f/6.6 (the FPL-53 Tecnosky carbon fibre with the wonderfully figured lens, similar to the SVR-90T from Stellarvue and others) and brought it out for a spin. Reduced to 480 mm at f/5.3 with an ASI2600MC Pro in tow, I spent about 16 h in 480 x 120 s subs under a 97% moon, but I think the moon was on the other side of the house on those nights. Barnard 150: some call it a seahorse; I see a young boy with his dad's scarf getting blown away on the beach when he was made to go walking, because that's what you do when granny's house is 50 m from the beach and it took 3.5 hours driving with 4 kids in a Ford Escort to get there 😆 Stacked in APP and finished in PS CS5. CS all.
  23. Mak-Newts are wonderful. I always fancied the 190MN, as you get 1000 mm at f/5.3, right at the limit for 3.76 µm pixel cameras, for my skies anyway (see the sampling sketch at the end of this list). I still have my Intes MN56; a tiny 29 mm secondary, but an exquisite mirror and mirror cell. There are a couple of Comet Hunters available again, but they might need a mirror baffle and a focuser upgrade for imaging. Another nice underdog dwarf galaxy capture too 👍
  24. I missed this Ciaran, but the star reduction really does help in this one, very nice. For APP, one good tip to bring out any faint background color before getting to your star-rich first version is to reduce the Sat. Th. (saturation threshold) slider to about 5% rather than the default. I find that in deep broadband images with good integration it brings up the background nebulosity some, and it does this without affecting highlights; you might have some reds in there maybe. EDIT: oops, I just re-read that you were shooting LRGB, not OSC, so this APP technique is only useful if you combined in APP itself.
  25. Good to see they published it open access.
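
Following up on posts 12 and 13 above, here is a minimal sketch of the filter backspacing arithmetic, assuming a plane-parallel filter of thickness t and refractive index n, where the focus shift is roughly t × (1 − 1/n). The function names are my own, and the numbers (2 mm thickness, n ≈ 1.5, native backspacings of 56.2 mm and 55 mm) are just the ones quoted in the posts; treat it as an illustration to fine-tune from, not gospel.

```python
# Rough helper for adding a plane-parallel filter to the imaging train.
# Assumption: the focus shift of a thin filter is ~ t * (1 - 1/n);
# always fine-tune the final spacing on real stars.

def filter_shift_mm(thickness_mm: float, n: float = 1.5) -> float:
    """Approximate extra backspacing needed for a filter of the given thickness."""
    return thickness_mm * (1.0 - 1.0 / n)

def required_backspacing_mm(native_mm: float, filter_thickness_mm: float, n: float = 1.5) -> float:
    """Native flattener/reducer backspacing plus the filter compensation."""
    return native_mm + filter_shift_mm(filter_thickness_mm, n)

if __name__ == "__main__":
    print(f"2 mm filter shift: {filter_shift_mm(2.0):.2f} mm")                     # ~0.67 mm
    print(f"Tak 56.2 mm spacing -> {required_backspacing_mm(56.2, 2.0):.2f} mm")    # ~56.87 mm
    print(f"Generic 55 mm spacing -> {required_backspacing_mm(55.0, 2.0):.2f} mm")  # ~55.67 mm
```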
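
And on post 23's "1000 mm at f/5.3, right at the limit for 3.76 µm pixel cameras", a small sketch of the usual image-scale arithmetic: scale ≈ 206.265 × pixel size (µm) / focal length (mm), compared against a rough Nyquist-style target of half the seeing FWHM. The 2-arcsecond seeing figure is only an assumed example, not something from the post.

```python
# Image scale and a rough sampling check for a given pixel size and focal length.
# Assumption: a Nyquist-style rule of thumb, aiming for an image scale around
# half the seeing FWHM; the seeing value below is an example, not a measurement.

def image_scale_arcsec_per_px(pixel_um: float, focal_length_mm: float) -> float:
    """Arcseconds per pixel: 206.265 * pixel size (um) / focal length (mm)."""
    return 206.265 * pixel_um / focal_length_mm

if __name__ == "__main__":
    scale = image_scale_arcsec_per_px(3.76, 1000.0)   # e.g. 3.76 um pixels on a 190 Mak-Newt
    seeing_fwhm = 2.0                                  # assumed seeing FWHM in arcseconds
    print(f"Image scale: {scale:.2f} arcsec/px")       # ~0.78 arcsec/px
    print(f"Nyquist-ish target for {seeing_fwhm}\" seeing: {seeing_fwhm / 2:.2f} arcsec/px")
```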