Everything posted by rotatux

  1. Nice... I don't know that (Android) app; I use Stellarium to plan my framing. What is it?
  2. As Ken reminded us, the longer focal length requires more precision in tracking. I may just as well retry my MAK on small objects, as mine is somewhat shorter than Ken's. Yes, aperture and/or focal ratio (which is itself based on aperture). 80/400 is f/5, so you would have to stop your 135 down past f/5 (e.g. f/5.6 and smaller) for the scope to put more photons on the sensor than the lens. Which you probably won't do, as most 135s are good enough from f/3.5 or f/4 (I dream of finding an f/2 one :-P).
  3. BTW, is there such a thing as Alt-Az guiding? Field rotation aside, it could be a way to achieve longer subs. Or, said otherwise, to achieve subs as long as field rotation allows without being bothered by balancing or tracking errors. Thinking about it, that would severely impact transportability and ease of setup though (add a laptop, guide scope + cam, cables, etc.), but I wonder whether anyone has done it.
  4. Pretty obvious indeed. IMO this helps answer a long-standing question about the equivalence between various apertures. In all cases it's agreed that the bigger the (true) aperture, the more photons you get and the less exposure time you need. And the longer the focal length, the more magnification and the fewer photons (*) you get, so the more exposure time you need. True aperture is focal length / focal ratio, so 150mm for your scope and 48mm (f/2.8) or 39mm (f/3.5) for your lens. (*) Actually it's fewer photons per pixel, since the same quantity of photons flowing through the same aperture is just spread over a wider surface. The question is by how much those two factors combine and compensate each other, and which one wins. I've read many articles about it. Some say / demonstrate the captured light level per pixel is proportional to D²/F (or F/R², R being the focal ratio F/D), others say / demonstrate it's proportional to only D/F (= 1/R). If the former were right, your scope (150²/750 = 30) should capture more light in the same sub length than your lens (135/2.8² = 17.2). Your shots seem to show the latter is right, i.e. 1/5 < 1/3.5 or 1/2.8. Of course post-processing varies; you should really compare straight-out-of-sensor RAWs, or JPEGs processed strictly equally. But it also correlates with my own experience as a wide-field imager.
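
    A quick numeric sketch (Python, values taken from this post) of the two competing models, in case anyone wants to plug in their own scope and lens:

        # Model A: light per pixel proportional to D^2/F (equivalently F/R^2)
        # Model B: light per pixel proportional to D/F = 1/R
        def light_models(focal_mm, fratio):
            aperture = focal_mm / fratio               # true aperture D in mm
            return aperture**2 / focal_mm, 1.0 / fratio

        scope = light_models(750, 5.0)    # 150/750 Newtonian, f/5
        lens  = light_models(135, 2.8)    # 135mm lens at f/2.8

        print("Model A (D^2/F): scope %.1f vs lens %.1f" % (scope[0], lens[0]))  # 30.0 vs 17.2
        print("Model B (1/R)  : scope %.2f vs lens %.2f" % (scope[1], lens[1]))  # 0.20 vs 0.36
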
  5. Good luck. A fixed observatory certainly relieves the burden of EQ setup, since it's mostly done once. Hoping your observatory gets enough clear skies to enjoy it.
  6. So yours is fully conformant. With the SWCC mine gives 565 or 590 depending on the spacing (shortest or longest, about 18mm between the two). Pretty cool actually, as it gives more FoV on demand for my smaller sensor.
  7. Maybe it's actually loading the JPEG "thumbnail" embedded in the RAW, rather than the RAW itself? It's a feature my "ORF" RAW format has, so other RAW formats may have it too.
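
    Not what that viewer uses of course, but as an illustration that the preview really does sit inside the RAW, here is a small sketch with the rawpy library (the file name is made up):

        import rawpy

        with rawpy.imread("P1210001.ORF") as raw:         # hypothetical file name
            try:
                thumb = raw.extract_thumb()               # the preview stored by the camera
            except rawpy.LibRawNoThumbnailError:
                print("no embedded thumbnail in this RAW")
            else:
                if thumb.format == rawpy.ThumbFormat.JPEG:
                    with open("preview.jpg", "wb") as out:
                        out.write(thumb.data)             # raw JPEG bytes, no demosaicing involved
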
  8. I'm afraid you must not have measured the right stars :-) In my Stellarium, Propus and Tejat are 1.865° apart, so the corresponding distance on the sensor should be on the order of tan(1.865°) x 650mm = 21.17mm. That's way off your measurement, so either you missed the right stars in your image or you are measuring a resized image (maybe the on-screen display rather than true image pixels?). Apart from that I agree with the formula. The /2 division on both sides isn't really necessary because of small angles.
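
    The same check in a few lines of Python, so the measurement can be redone with other star pairs:

        import math

        focal_mm       = 650.0    # approximate focal length used above
        separation_deg = 1.865    # Propus - Tejat separation per Stellarium

        on_sensor_mm = math.tan(math.radians(separation_deg)) * focal_mm
        print("expected distance on sensor: %.2f mm" % on_sensor_mm)   # ~21.17 mm
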
  9. Hi all, my first post in this long-followed thread. A year ago it finished convincing me and I bought a 130PDS, and I don't regret it. These days I essentially use it for Alt-Az imaging, though I also have an EQ (but don't use it often). Incidentally I used some of my past months' images to check and calibrate the pixel resolution with different combinations of filters, CC position and oculars (I also do projection imaging sometimes). It's pretty easy: knowing the sensor's dimensions and number of pixels, and reference angular distances between stars (thanks to Stellarium's measure tool and the catalogs), you just apply some basic trigonometry. For reference I also calculated the resolution with no additional optics (the cam barely adapted to the OTA), and I found 632.5mm ± 1.3mm. This is also confirmed indirectly by other optical combinations. I know it's normal for the actual focal length to vary a bit from the theoretical / advertised specification; mine is off by ~2.7%. Is that a normal value, and what are your own values?
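
    For anyone who wants to try the same calibration, a sketch of the trigonometry (the sensor figures and the measured pixel distance below are placeholder values, not my real numbers):

        import math

        sensor_width_mm = 17.3      # placeholder: a Four Thirds sensor
        pixels_across   = 4032      # placeholder pixel count
        pixel_pitch_mm  = sensor_width_mm / pixels_across

        measured_px     = 3700.0    # placeholder: star-to-star distance measured in the image
        separation_deg  = 1.865     # catalogue separation from Stellarium's measure tool

        focal_mm = (measured_px * pixel_pitch_mm) / math.tan(math.radians(separation_deg))
        print("calibrated focal length: %.1f mm" % focal_mm)
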
  10. You are on the right track, maybe just not aggressive enough in your stretching. Look at what I get with only an 8-bit stretch of your JPEG; this was with only the levels black point and mid point. So you must be able to get far more from your 16-bit image. The main idea is: 1/ stretch with levels (gamma / mid point), or brightness/contrast, or curves, then 2/ trim the background with the levels black point. Ok, now unless I'm wrong you will see some apparent (strong?) vignetting, so you will need flats. But keep that for after you are comfortable with processing.
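
    Not Gimp itself, but a small numpy sketch of what the black point and mid point actually do, if that helps demystify the stretch:

        import numpy as np

        def levels_stretch(img, black_point, mid_gamma):
            # img as floats in 0..1; clip the black point, then lift mid-tones with a gamma curve
            out = np.clip((img - black_point) / (1.0 - black_point), 0.0, 1.0)
            return out ** (1.0 / mid_gamma)    # mid_gamma > 1 brightens the mid-tones

        # e.g. a strong first stretch, then trim the background:
        # result = levels_stretch(levels_stretch(data, 0.0, 2.2), 0.05, 1.0)
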
  11. Exactly, that was the "saturation limit" part of my reasoning. Though a physicist would argue the sensor's pixels keep the same "full well" capacity whatever the ISO; only the maximum representable electronic level drops when ISO rises. I've read many similar ones, but maybe not this one. I learned that many of those articles have to be taken with care as there are often gotchas in the reasoning, sometimes small, sometimes big, so you need to stay awake. Always interesting, gonna read it, thanks Ian.
  12. My go too. Not as stretched as others', and I tried to calibrate the color and get back some core (failed at the latter). I saw the core is not totally saturated but didn't find the right tool to get it back. Most stars are saturated too, so I had to resort to background stars for B-V calibration. Was fun, thanks again for letting us play. And Nige... WOW, you got the core back and nice nebula color changes.
  13. Not so poor! That RGB did your image much good, with nice star colors (even if it lacks a bit of blue) and a delightful, delicate color change between red, pink and orange in the nebula. So it's not all red after all ;-) Thanks for that superb image. I knew this whole region is rich in dust and features, but seeing it all revealed like this blows me away. Don't mind the blurring too much; some images (including this one) are better appreciated as a whole rather than by going into the details.
  14. Thanks for pointing that out, Ian. It was already discussed in the thread (which I have read once entirely) so I was aware of it, and it's actually one of the triggers that made me come back to Alt-Az imaging. But it's also a matter of precision: if you get one bit (literally) of signal out of each sub, it would take 256 subs to gain 8 bits of precision (not taking noise into account) and have something to stretch -- maybe fewer bits would do, I take 8 as a known reference. Also, depending on sub exposure, you could get less than a full bit of signal (lol! I like quantum physics), needing even more subs. There's also the matter of image depth through the imaging software chain: if I get subs with 1 bit of signal each it means they sit at position 12, so getting 8 bits of precision requires the software to process at least up to 20 bits. That's where I may be hitting Regim's 16-bit depth limit (of images). On M42 all was fine, but on the Rosette and Horsehead I feel I'm at the stretching limit. With the number of subs I take, the remaining noise should be near level 0 or 1. I stop stretching when noise becomes significant, but at the same time I understand there's no more signal to come because I've used all the bits within the available depth. Feels like a dead end -- though I'm in the process of trying to "pre-scale" the images, which *will* saturate more stars but will hopefully allow stacking to expose more bits of signal.
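
    A toy demonstration (Python, with a made-up noise level) of why stacking many barely-exposed subs still recovers a faint level -- the noise acts as dither, and averaging N subs buys roughly log2(N) extra bits:

        import numpy as np

        rng = np.random.default_rng(0)
        true_level = 0.3                     # a faint signal, well below one ADU here
        n_subs     = 256

        subs = [np.round(true_level + rng.normal(0.0, 0.5)) for _ in range(n_subs)]
        print("each sub quantised to 0 or 1-ish, stack gives %.3f" % np.mean(subs))
        # log2(256) = 8: the stack carries ~8 more bits than a single sub,
        # provided the software keeps enough depth to store them
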
  15. You must be right, though I could not find the specific response curve for my filter. When I look through it in daylight, everything seems red and blue, something like a mix between the two eyes of red-blue 3D glasses. At first I thought it would remove essentially yellow and a bit of red, but it's actually narrower and removes green as well (confirmed by looking at tree foliage and grass, which turn brown or dark red). So Regim's calibration must be trying to compensate for the lack of green in stars, and the image ends up with too much green. Sounds correct. Now, what's the best and most practical way to get back a correct green channel? If I play with the balance to raise the green level, I may end up with the same green cast as auto-calibration, since there's not enough green to start with (but I will try). Capture green in another session, like you do with L+R+G+B(+Ha)? Apart from the inconvenience of multiple sessions and the number of subs, I would have to do it with a narrow green filter to be efficient -- if such a thing exists -- and that would somewhat negate the use of a UHC filter. Or synthesize a fictitious green channel from the two others? That would be wrong, as the red+green/blue ratio is precisely what changes from star to star and enters the definition of the B-V index. Back home to think and try :-P
  16. Thanks. I hoped some shots I've seen had real colors rather than mapped ones, but maybe not after all. As Ken reminds us, the Rosette is very bright in red. I just wonder then why the sensor captured 2 different colors. That's a matter of camera. Actually I've tested 800, 1000, 1250 and 1600. With only 12-bit depth and 30% quantum efficiency (taken from sensorgen), most if not all deep sky (not cluster) subjects' data just ends up below the first value above zero with short exposures. Said otherwise, lower ISOs just don't give me enough data to stretch within my exposure limits. So I need to amplify more than all of you 14-bit sensor owners, and 2500 is my camera's maximum analog amplification (3200 and above is digital). It would be different if my mount accepted 40-50s subs. Good news is my read noise is about the same at every ISO, so I prefer to always use the same ISO level and vary the length and number of subs. Bad news is my noise is quite high, so I need about 80-100 subs to tame it. BTW don't be fooled by the DR as indicated at sensorgen.info, it's for a single shot: if you consider that stacking removes the noise, you get the full DR for any analog-scaled ISO from a sensor-only point of view. What matters then is to avoid saturating the wanted parts of your subject (you may saturate some subs for HDR to get the faint parts), and hence to match the subject's brightness range to your camera by adapting the exposure. It's good if you have the choice to adapt the exposure by lowering ISO in addition to varying sub length; I don't think I have that choice.
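
    Purely illustrative numbers (the flux and the per-ISO gains below are invented, not measured values for my camera) to show why a faint flux can round to almost nothing at the lower ISOs:

        photon_flux = 0.6      # photons / pixel / second from a faint nebula (assumed)
        qe          = 0.30     # quantum efficiency
        exposure_s  = 25.0
        electrons   = photon_flux * qe * exposure_s     # ~4.5 electrons collected

        for iso, gain_e_per_adu in [(800, 2.0), (1600, 1.0), (2500, 0.6)]:   # assumed gains
            print("ISO %4d -> %.1f ADU above bias" % (iso, electrons / gain_e_per_adu))
        # at the lowest gain only a couple of ADU remain, which easily quantises
        # to 0 or 1 in a 12-bit file once bias and noise are subtracted
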
  17. Just so my horrible Alnitak shot shown previously doesn't go unfollowed, this is my 2nd try at Alnitak and its nebula friends. Much better I hope, though still not on par. Capture: 101 good of 123 lights x 25s x 2500iso, 30 NG darks, Olympus E-PM1 with Skywatcher 130PDS on Celestron Nexstar SLT, TS-UHC filter. Thanks to many more subs, there's much more detail. However I had difficulty with color calibration: I had to revert to the semi-manual version, as the fully automatic one gave green stars rather than blue. Again I suspect the UHC filter to be the cause, as if it were ripping out some information essential to the calibration, but I've no idea what yet. On my calibrated displays this results in white/grey rather than blue stars, including Alnitak itself (and some nebula too), but I'm happy I eventually managed to avoid green stars. It gives me a headache: on one hand the filter would allow me up to at least 30s subs (if not limited by mount accuracy) and let me catch many targets, while without it I am limited to 8-10s subs by light pollution and unable to catch the Rosette or Horsehead. On the other hand the filter seriously reduces Regim's ability to auto-calibrate colors. And I love good (correctly) colored stars...
  18. I feel there's a lot more detail than what appears, so it's a good shot! At the same time the core is quite saturated... Typical of my own early stretches with Gimp levels. You could try using Colors->Curves rather than ->Levels so that you can avoid stretching the brightest parts and only affect the low-brightness ones. If you feel like it, you could also try 16+-bit aware imaging software, starting with Gimp 2.9 since you are a Gimp user -- though I have yet to try it myself. This will help recover faint details.
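
    Not the Curves tool itself, but a sketch of a curve-shaped stretch (asinh) that lifts the faint parts strongly while leaving an already-bright core almost untouched:

        import numpy as np

        def soft_stretch(img, strength=100.0):
            # img as floats in 0..1; strong lift in the shadows, nearly flat near white
            return np.arcsinh(strength * img) / np.arcsinh(strength)

        # a faint pixel at 0.01 goes to ~0.17, a bright one at 0.8 only to ~0.96
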
  19. From the album: Alt-Az / NoEQ DSO challenge

    Try 2 at all the nebulas around Alnitak, from the heavily light-polluted skies of home around Paris, but with a UHC filter. Much better than try 1, thanks to the number of subs, but still hard to bring out, I suppose because of limited sensor depth (or is it LP?). However some color is missing; I had to use manual B-V calibration in Regim to get something barely acceptable. Capture: 101 good of 123 lights x 25s x 2500iso, 30 NG darks, Olympus E-PM1 with Skywatcher 130PDS on Celestron Nexstar SLT, Skywatcher ComaCorr and TS-UHC filter. Processing: Regim, Fotoxx Date: 2017-01-21 Place: suburbs 10km from Paris, France.

    © Fabien COUTANT

  20. From the album: Alt-Az / NoEQ DSO challenge

    Capture: 7 good + 7 average from 28 lights x 30s x 2500iso, 30 NG darks, Olympus E-PM1 with Skywatcher 130PDS on Celestron Nexstar SLT, TS UHC filter. Processing: Regim, Fotoxx

    © Fabien COUTANT

  21. A good surprise: I already got my Internet back at home, so here is the image I talked about. It's 7 good + 7 average lights x 30s x 2500iso and 30 NG darks (from another subject in the same session), the usual Alt-Az train + my new TS UHC filter. Ignore the noise... This is without any color calibration, a direct colormap from the sensor. Usually I get really weird star colors and a B-V calibration is required, but with the new filter I think the colors are already very good. I had a doubt when seeing those very red stars, but I checked their B-V index and the color is plainly justified (there's one at 2.30!). Only the stars of the cluster are cyan when they should be pale blue, and the brightest of the cluster should be orange rather than white (or whatever). Does that mean there's too much green? I can't believe it; where would it come from, and why only those stars? I also perceive a noticeable color transition from red around the edges to blue or grey in the center, which I think is nice, if not physically real. A thing I rarely notice in most other shots. However I find the stars are too saturated at 30s x 2500iso, which is why I only did 20s for the real session. I just tried a quick B-V but it was not good: most stars are fine but the nebula turns all red, losing the central blue/grey. I hope the big stack will turn out better.
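
    For the curious, one way to sanity-check such a B-V value is Ballesteros' (2012) approximation from B-V to effective temperature (a sketch, not what Regim does internally):

        def bv_to_temperature(bv):
            # Ballesteros (2012) approximation of effective temperature from B-V
            return 4600.0 * (1.0 / (0.92 * bv + 1.7) + 1.0 / (0.92 * bv + 0.62))

        print("%.0f K" % bv_to_temperature(2.30))   # ~2900 K: deep orange-red, so that star really is red
        print("%.0f K" % bv_to_temperature(0.0))    # ~10100 K: blue-white
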
  22. If you can't add weight at the back, you could try removing some at the top: attach a few (helium) balloons to the finder, for example. Er, I admit it could look like a joke but it might work...
  23. I also had another session yesterday at dinner time. It's getting more and more confirmed that the SLT tracks better (to the east at least) when back-heavy: I can steadily achieve good keep rates up to 20-25 seconds (30s seems impossible). Maybe it was designed for refractor / Cassegrain type scopes -- focal plane and eyepiece/optics at the back -- since I got it with a MAK. I first had a quick try at M1 before it got out of my view, but could not frame it adequately -- I just can't remember which optical direction each command arrow gives, apart from the axis, so with 5s framing subs I can't manage to shift the subject correctly in the image, given there are no reference stars and the subject is deeply buried in LP. Gave up on this one this time. Then had a first test at the Cone nebula too, about 25-40 x 20s subs. Will process it later to see what comes out. Over the past few days I had a go at processing my first Rosette test. It's interesting because with only 14 x 30s subs (at <30% keep rate!) it shows some color differentiation in the nebula straight out of the sensor. I want to show you and get thoughts on a matter of star colors, but I have no more Internet at home (I'm switching providers, which will take a few days), so no more B-V color calibration, and posting will have to wait. Anyway, given what I already got, I decided to have a true go at it yesterday. So now there's also ~120 x 20s of Rosette waiting for processing (which will take longer).
  24. I prefer the 2nd version, it being more contrasty, especially in faint zones and details (such as the dust cloud "fingers" around the center). Maybe you have applied too much noise reduction: the background is almost flat, when I would expect some grain... or is it the JPEG? BTW, about your TV calibration problem: is the Spyder actually calibrating the driver / graphics card in your Windows PC, or the TV itself? My TV is calibrated (though I don't use it for PC) but it's a built-in feature, not done with an external device. Conversely, calibrating my PCs involved selecting an existing ICC color profile by eye and then making minor adjustments, but that acts on the color lookup tables in the graphics card's rendering stage (CRTC for VGA/D-SUB, or whatever it is for HDMI/DVI).