

Posts posted by pipnina

  1. 5 hours ago, Mr Spock said:

    Film has had its day. It's like advocating the use of a horse and cart on a transport forum :biggrin:

    And yet people still partake in horse races and take cart rides as a tourist and holiday activity. Sometimes things are done not because they're practical, but because we enjoy them or are interested in the methodology!

    I personally quite like how tactile the film shots I've done have been, even though my attempts at astro film haven't been so successful!

  2. 4 minutes ago, Louis D said:

    I checked an old astrophotography film exposure reference, and the exposure formulas were as follows:

    Standard Exposure Formula:

    t (seconds) = f**2 / (A * B)

    where f is the f-ratio, A is the ISO film speed, and B is the relative brightness of the object.  B would generally come from astrophotography tables.

    The exposure time then needs to be corrected for the film's reciprocity failure as follows:

    t (corrected) = [(t + 1) ** (1/p)] - 1

    where p is the Schwarzschild exponent. 0.7 is a typical value for non-hypersensitized film.

    If I've put that reciprocity failure equation into the Desmos calculator correctly, the 0.7 reciprocity factor is very extreme!

    [Attached: Desmos plot of corrected exposure vs. metered exposure]

    I believe I've set it up so that the x-axis is the metered exposure and the y-axis is the corresponding corrected exposure.

     

    In theory, modern films might handle reciprocity failure a lot better than those of old, if 0.7 was typical back then. The worst reciprocity film I've used so far has been Rollei Infrared (line chart below), and the best so far is Velvia 100, which only publishes a table (also below).

    Fuji Acros II 100 claims only 1/2 stop of correction between 120 and 1000 seconds, and no correction up to 120 seconds!

    [Attached: Rollei Infrared reciprocity chart and Velvia 100 reciprocity table]
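
    Just to make those formulas concrete, here's a minimal Python sketch of the standard exposure calculation and the Schwarzschild correction as quoted above. The example numbers (f/5.6, ISO 100, a relative brightness B of 0.01 and p = 0.7) are purely illustrative assumptions, not values from any real table.

    # Sketch of the quoted film exposure formulas:
    #   t = f^2 / (A * B)              (standard exposure)
    #   t_corr = (t + 1)^(1/p) - 1     (Schwarzschild reciprocity correction)

    def metered_exposure(f_ratio, iso, brightness):
        """Standard exposure time in seconds: t = f^2 / (A * B)."""
        return f_ratio ** 2 / (iso * brightness)

    def reciprocity_corrected(t, p=0.7):
        """Apply the reciprocity correction with Schwarzschild exponent p."""
        return (t + 1.0) ** (1.0 / p) - 1.0

    # Illustrative values only: f/5.6 lens, ISO 100 film, assumed brightness B = 0.01
    t = metered_exposure(5.6, 100, 0.01)
    print(f"metered:   {t:.0f} s")
    print(f"corrected: {reciprocity_corrected(t):.0f} s")
    # A 360 s metered exposure gets dramatically longer at p = 0.7:
    print(f"360 s metered -> {reciprocity_corrected(360.0):.0f} s corrected")

    Running numbers like these through it shows the same blow-up the Desmos plot does at p = 0.7.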

  3. I've received the scans for my Velvia shoot; the film itself should be with me in the post at some point this week.

    Sadly the space images are very dark, despite my attempts to meter them correctly!

    I suspect the meter in the Canon AE-1 Program cannot handle the dark conditions and gave me a reading several stops brighter than I actually needed. This is the frame I took in Cygnus (Deneb and Sadr at left and upper centre), intentionally over-exposed by 1 stop. The attached JPEG is heavily boosted so the stars can be seen clearly. It was taken under Bortle 5 skies with a 6-minute exposure, on Velvia 100 at f/5.6. I note the stars are creating odd shadows too; I don't know if this is due to the film stock itself... or because I let the film sit at room temperature for too long before getting it processed! None of the more normal photos I took had this issue.

    Will have to see what (if anything) is visible on the film itself when I receive it.

    [Attached: the heavily boosted Cygnus frame]

  4. 28 minutes ago, Shimrod said:

    I switched to a digital SLR in 2006 after I came back from a holiday and realised that the cost of film plus development was around the same price as a Canon EOS 450. I don't miss carrying 30+ films in a little lead pouch either so they didn't get damaged by x-ray machines at airports!

    I made some attempts at astrophotography with my film camera but soon gave up as it was an expensive and frustrating exercise - no idea what if anything you had captured and spending £8 (film and development) to find out it was nothing!

    Of course, some of the ISO1600 film was also so grainy as to be almost unusable.

    Indeed, higher-ISO films are quite unsightly. See my Orion image above, which was captured on Ilford Delta 3200!

    Even ISO 400 is a bit questionable; here's a (slightly underexposed) ISO 400 image I took on Rollei Infrared.

    And here's an ISO 200 shot (a Fuji colour stock I got in 2018, can't remember which type exactly); the grain is noticeable here too!

    [Attached: the Rollei Infrared and Fuji colour scans]

  5. 1 hour ago, fwm891 said:

    That's interesting. I have a 5x4 format aerial camera lens that I set-up years ago to do night sky work with but it's never really been used...

    You can buy 5x4 and larger sheets from various companies, including Kodak and Fujifilm, as well.

    Velvia 100 is available in 5x4, but you will be paying around 140 quid for a 20-sheet box! Then you have to pay for development...

  6. 1 hour ago, jjohnson3803 said:

    I was casually wondering a while ago who even sells film anymore, much less does commercial processing.

    I do recall seeing something somewhere on how to build your own hypering chamber for cheap. 

    Most cities have one or more shops dedicated to just film photography!

    There are two in my small West Country city, and I visited one on my Germany trip too, sat right up against the Kölner Dom.

    Your selection of films is limited, however, and may shrink further: Fujifilm seems to be having trouble getting raw materials for their existing formulas, and some of their best stock (Velvia 100) is actually banned in the USA for containing certain chemicals the agencies there aren't happy with.

    You can still get film developed, scanned and printed at Boots pharmacies (only specific stores though) but by the sounds of things they aren't as good as many non-chain options.

  7. 8 minutes ago, rwg said:

    I've been waiting for someone to build a sensor that does this *at the pixel level*. Just like we have microlenses and (for colour cameras) filters on each pixel now, imagine a camera where each pixel had a prism arrangement splitting the photons between 3 separate detection sites... One camera, one-shot colour, no 66% loss of photons.

    It might seem far-fetched, but the sort of features built into camera pixels now were pretty far fetched only 15 or 20 years back.

    cheers,

    Robin

    Cameras now have microlenses, and now that you mention it, I recall a special kind of camera (a light-field design, I think) that uses a similar trick to bypass the need for perfect focus: the lens brings different sub-pixels beneath it to different focal planes, allowing software-controlled focus in post-production.

     

    I recall as well a type of camera used in professional telescopes which might do EXACTLY what you describe: they act as per-pixel spectrometers, but I think they only work over a narrow range of wavelengths around the central wavelength observed, or have some other restriction. If they were perfect, the pros would use them instead of dichroics!

     

  8. 19 minutes ago, ollypenrice said:

    When did cost ever deter astrophotographers??? :grin:  If it did, there wouldn't be any...

    This is, of course, a very cute idea. The theory is compelling. If the focus could be adjusted at each individual wavelength (as it would be) the objective would not need to be so well corrected either.

    I wonder what the professionals make of it. They must have thought about it.

    Olly

    Back in the 90s it was seemingly the only way to produce colour digital cameras and camcorders of any quality, albeit they used 1/3"-sized sensors and the units still cost $3000 in the case of this Sony camcorder:

    This is a revised model from 2003 which seems to have used the same 3CCD system but likely lower noise / higher efficiency sensors. It seems reasonably well corrected.

    The limited information I have found suggests it can result in reflections and limits the system to a slower f-ratio due to the light path length, but we astrophotographers often run at f/5 or slower, which most "normal" photographers would consider quite slow these days.

    Technicolor used a two-way beam-splitting prism for their 3-colour camera from the 30s to the 50s (the camera that shot the now-famous Wizard of Oz film), so the idea goes back some way, although in that case it was not to maximise light use but to allow colour cinematography to exist at all:

    I do note that the JWST NIRCam instrument seems to use a dichroic beamsplitter (so a two-channel prism, most likely? The exact details are a bit vague) to let the instrument observe one filter at longer wavelengths and another filter at shorter wavelengths at the same time.


    Why they didn't extend it to 3, 4, or even more simultaneously operating cameras is anyone's guess... But being a space telescope, I would have to bet it comes down to launch payload weight and, following that, cost.

    But quite frankly, with 10.5 billion dollars you can probably make almost any idea work, so perhaps this isn't too indicative of the tech's feasibility haha.

  9. 9 hours ago, dan_adi said:

    Interesting idea. It should work, but the cost will be prohibitive. You will need 3 cameras, and possibly some way to fine-tune the focus for each camera separately, ideally with an electronic focuser.

    For a while I used an on-axis guider from Innovation Foresight that uses a dichroic mirror to split the light into visible and near-infrared.

    With the main focuser I would focus the visible spectrum and with a helical manual focuser the NIR.

    As you can imagine, the manual focuser is not so precise.

    Also the weight would increase.

    Also such a device would eat up a lot of backfocus. My ONAG if I recall correctly needed around 60 mm.

    So overall, 3 cameras, 3 focusers, backfocus problems, more weight... 

     

    The extra focusers are not necessarily a problem, as helical focusers similar to those used for OAGs could serve two of the three cameras; these can be accurate if used in conjunction with a Bahtinov mask for the initial setup.

    All focusing after that can be carried out with just the main focuser, as the three cameras would be appropriately spaced to the three focal planes, and thermal expansion in that area would be minimal relative to that of the path from the main objective to the main focuser.

    As for cost, this should be cheaper than dual- or triple-mounting telescopes on a single mount, or having three separate setups, and it was clearly cheap enough to use in prosumer equipment back in the 90s, so I feel it could be quite viable in theory!

    Backfocus could be an issue though, yes; it would probably have to be used either in place of an OAG or as part of a combined OAG-prism unit.

  10. I don't know about historic interest, but I have been trying to capture stuff on my dad's old Canon AE-1 Program and various films.

    I made an attempt on ISO 400 Rollei Infrared, but despite 6-minute-plus exposures there was little to see, as the reciprocity failure of the film is very high.

    I then made an attempt on Ilford Delta 3200, which promised less reciprocity failure and OK sensitivity for H-alpha, but the extreme grain proved unworkable.

    I am now waiting for my roll of Fujifilm Velvia 100 to be developed, which only needs 2/3 of a stop of reciprocity correction at 8 minutes as per the spec sheet, and SHOULD have high sensitivity to H-alpha. However, the film was also used for my Germany holiday snaps, and I wasn't able to stop the people at various airports from running it through their X-ray machines 😕. Velvia is expensive stuff too, at 22 quid for the roll and then another 20 for development! More than a pound per snap!

    To top it all off, nowhere I can find will give me anything better than an 8-bit TIFF scan (and these images are as I was provided them... in processed JPEG form 🤮).

    To say film astro is challenging would be an understatement!

    [Attached: two example scans as provided by the lab]

  11. 4 hours ago, iwols said:

    Hi, just been using LRGB filters and now going to try narrowband with a Ha filter and an OIII filter on the Elephant Trunk. Will these two filters be enough? Not tried narrowband before. Thanks, and what should I expect? Cheers

    OIII and Ha alone on the trunk should produce a nice image :D

    Many go further and add SII, but HOO is still very nice on many targets like the Trunk nebula.

    By and large I think you can expect something similar to other nebulae imaged in HOO, where the hydrogen looks reddish-pink and the oxygen adds a pale blue hue to certain regions.

    I am not very good at HOO processing myself (for some people there seems to be some wizardry involved that gives it that extra edge), but I think you'll probably be pleased with your results if you go for HOO here!
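
    If it helps to picture the mapping, here's a minimal sketch of a basic HOO combination (R = Ha, G = OIII, B = OIII) using numpy. It assumes you already have two aligned, stretched master frames scaled to the 0-1 range; the loader and file names are hypothetical.

    import numpy as np

    def hoo_rgb(ha, oiii):
        """Basic HOO palette: red from Ha, green and blue from OIII.
        Both inputs are assumed aligned and stretched to the 0-1 range."""
        rgb = np.dstack([ha, oiii, oiii]).astype(np.float32)
        return np.clip(rgb, 0.0, 1.0)

    # Usage sketch (hypothetical loader and file names):
    # ha = load_stretched("ha_master.fits")
    # oiii = load_stretched("oiii_master.fits")
    # rgb = hoo_rgb(ha, oiii)   # Ha-dominated regions come out red-pink, OIII pale blue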

  12. I personally used water and cotton balls

    Rinse with water to remove the worst of the dirt non-abrasively. Then I use wet cotton balls, wiping once with each side of the ball in one direction, to remove the last of the dust.

    Any blobs of water left will leave minerals behind so take dry cotton balls and gently touch them to any drops to soak them up.

    Cleaned up my newt mirrors nicely at least.

    Your cleaning method seems to have worked really well! I love seeing a nice, clean mirror.

  13. Yes this should work!

    I have used Home versions of Windows since XP, and AFAIK all of them have been able to remote into each other with the built-in Microsoft tool :D

    I haven't used Windows for a few years now though (for more than a few minutes, or for more than using a web browser), and whenever I do I am filled with rage, so sadly I can't be much more help haha.

  14. https://en.wikipedia.org/wiki/Dichroic_prism

    I read about the Technicolor 3-colour process and how it used three black-and-white films running through the camera at once: a prism split the light into magenta and green, capturing green on one film, and blue and red on two films that ran sandwiched against each other, where one film had a filter in its backing to separate the red and blue.

    I saw this and realised my old obsessive thoughts about minimising telescope photon waste could be compatible with the same approach.

    As we know, if we shoot RGB images we lose roughly 2/3 of the light entering our scope, either to Bayer filters (which knock red and blue down to about 1/4 efficiency and green to 1/2) or to the RGB dichroic filters in mono-camera systems... But what if we could use 3 sensors and a prism to get closer to 90%+ utilisation across this range? It seems that at the slower speeds (f/5 and above?) we tend to see in a typical astro scope, this could work. Back in the early days of digital sensors, this setup (3CCD cameras) was used to maximise resolution and quality in early digital cameras and camcorders, with results that looked fine colour-wise but were seriously held back by the noisy, low-efficiency sensors.

    But now our sensors are super efficient, our ability to make these prisms hasn't disappeared, and astro seems like the perfect modern use for the technology! Instead of buying three mono cameras, three telescopes and (potentially) three mounts, we could get the same result with three cameras, ONE telescope and ONE mount!
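
    As a very rough sanity check on those numbers, here's a quick Python sketch comparing broadband photon utilisation for a Bayer sensor against a hypothetical 3-way prism. It assumes idealised filters, an even split of light between the R, G and B bands, and a made-up 95% per-channel prism transmission.

    # Rough photon-utilisation comparison (ignores QE, reflections and band overlap).
    bands = {"R": 1 / 3, "G": 1 / 3, "B": 1 / 3}   # assume an even split of incoming light

    # Bayer CFA: 1/4 of photosites are red, 1/2 green, 1/4 blue; each band's photons
    # landing on the wrong-coloured filters are simply absorbed.
    bayer_fraction = {"R": 0.25, "G": 0.50, "B": 0.25}
    bayer_total = sum(bands[b] * bayer_fraction[b] for b in bands)

    # 3-way prism: every photon is steered to its matching sensor; assume a
    # hypothetical 95% optical transmission per channel.
    prism_total = sum(bands[b] * 0.95 for b in bands)

    print(f"Bayer utilisation: {bayer_total:.0%}")   # roughly a third of the light
    print(f"Prism utilisation: {prism_total:.0%}")   # ~95% under these assumptions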

    In theory you could even expand on this by having not just an RGB prism but an SHO prism too, and capture narrowband images at insane speed for your preferred telescope type.

    Who knows, maybe with some design tweaking it would even be possible to create a very complex prism to allow the simultaneous capture of all 6 common astro bands at once haha.

    Does anyone know more about these? I have become a little fascinated...

  15. I had not noticed this in winter, but now that I am doing shorter narrowband imaging stints during twilight months I have seen my focus slipping through the night.

    Due to the good weather I have kept my scope outside, and I suspect that, as it is a carbon-reinforced plastic construction of some thickness, it is absorbing a great deal of heat from the sun during the day. At night I will autofocus at the start of my session; however, I have noticed that my focus is quite poor by night's end. Last night I set it up to refocus every 45 minutes (3 focus runs total, as I forgot to ask it to focus at the start, oops!).

    This resulted in a total change of 55 ZWO EAF steps between the first and last focus runs, a rather considerable change. And since this is on an R&P focuser, I don't believe the focuser itself is moving between runs. Could it just be that the tube is getting very hot in the day and slowly cooling off at night, leading to dramatic thermal contraction?

    Curious if others experience this issue, or if there is a remedy to this that doesn't involve air conditioning the scope during the day to match the forecast night's temps haha.
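
    For what it's worth, here's a back-of-the-envelope sketch of the thermal contraction idea (dL = alpha * L * dT) in Python. The tube length, temperature swing and expansion coefficients are assumed round numbers for illustration, not measurements of my scope.

    # Rough focus shift from thermal contraction: dL = alpha * L * dT
    tube_length_mm = 650.0   # assumed tube length
    delta_T = 15.0           # assumed day-to-night temperature drop, deg C

    # Typical linear expansion coefficients, per deg C:
    cte_per_degC = {
        "carbon composite": 2e-6,    # layup-dependent, can be near zero
        "aluminium": 23e-6,
    }

    for material, alpha in cte_per_degC.items():
        shift_microns = alpha * tube_length_mm * delta_T * 1000.0
        print(f"{material:>16}: ~{shift_microns:.0f} um of length change")

    Since the critical focus zone at moderate f-ratios is only of the order of tens of microns, even a modest contraction could plausibly show up as the drift I'm seeing; how that converts to EAF steps depends on the focuser's microns-per-step figure.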

  16. My anecdotal evidence would suggest that better PA usually results in better guiding, but only to a point. Sub-arcminute alignment is what I aim for, and I often get 0.4 to 0.6" total RMS on my belt-modded HEQ5 with an overweight (unbalanced) load.

    I sometimes forget to polar align... and it ends up 2-3 degrees out. I might still get sort-of-OK guiding, but not for the 1.2"/pixel scale I image at (about 1-1.2" RMS is common then, plus I'm likely to see some field rotation through the night).

  17. 3 hours ago, Paul M said:

    We were that poor when I was a boy, we used to carve out turnips at Halloween and put a candle inside.

    Ok, fair enough, pumpkins probably weren't invented back then.. 🤣 

    Since pumpkins (an American veg) are not native to Europe, it actually was traditional to celebrate All Saints' Day (IIRC four days of Halloween back then) by giving out food to the poor (sweet cakes, or apples and pears if you didn't have those) and by putting candles (very expensive at the time) into hollowed-out turnips to "keep the bad spirits away".

    So actually you were practising the original British tradition :) 

    • Like 1
  18. 18 hours ago, Avocette said:

    One option might be to avoid kstars-bleeding which is always the cutting edge version. I have only had experience of the so-called ‘stable’ versions, which occasionally still have issues, but may work for you. 

    I got an email from the KStars bug tracker, as I had added to another person's report on this bug: a developer has marked it as fixed in Git, and the fix should be released in 3.6.5, which is great!

  19. 13 minutes ago, Avocette said:

    One option might be to avoid kstars-bleeding which is always the cutting edge version. I have only had experience of the so-called ‘stable’ versions, which occasionally still have issues, but may work for you. 

    The thing is, these kstars-bleeding packages give me 3.6.4, which reports itself as a stable release in the About tab. 😕

  20. With so many equipment (mostly software) bugs recently, I needed a win of some sort. A big mosaic might not have been the best idea, as I have never made one successfully before, yet somehow all the panels captured properly, AND I wrangled PixInsight into producing the mosaic correctly the next day!

    I thought I'd need to spend ages wrestling the images to be the same brightness, but the merge mosaic feature (once tuned a bit) seems to have pulled it off without me having to bash my head against a wall using linear fit.

    Well chuffed; now I wish I had more imaging time to make it look clean when you zoom in haha.

    [Attached: the finished Cygnus Ha mosaic]

    Captured with an HEQ5, a 130/650 mm triplet, a RisingCam 571 mono and a 3 nm Chroma Ha filter; processed in PixInsight and then tweaked in RawTherapee. 10 panels, with a single 10-minute sub per panel.

     

  21. KStars generally worked quite well when I picked it up early last year, and there have been a few cool features added since. But when they started adding the optical train system (as good as it could be for multi-scope control), things started getting very dodgy.

    On my laptop a while back the optical trains developed a mind of their own: deciding that my scope did not in fact have a reducer and refusing to let me save the change to 0.8x, and crashing during autofocus capture. Now even trying to edit the optical trains, which I MUST use on current KStars versions, crashes the program, making it impossible to use at all.

    This happens on my EndeavourOS (Arch) computer, as well as my Ubuntu 20.04 laptop and my Lenovo ThinkCentre running Ubuntu MATE 22.04. I get KStars and INDI with the following commands:

    sudo apt-add-repository ppa:mutlaqja/ppa
    sudo apt-get update
    sudo apt-get install indi-full kstars-bleeding

    And on Endeavour I used just "yay -S kstars"

    Meanwhile, the INDI driver for my RisingCam 571 seems to have updated, and now KStars and INDI cannot control the offset: it is always 0, which gives very suboptimal results in narrowband images (a lot of 0 ADU pixels in 2-minute Ha subs for the brief time I had it working last night; the Lenovo had KStars working very briefly...).

     

    Is this a known issue with the most recent branches? Should I be looking to go back to the previous version packaged with Ubuntu 22.04? And is there any way at all to solve this ToupTek driver issue? Of all the problems I have had to deal with, I was not expecting this one 😕

    Many thanks to anyone who can help

  22. If the sensor is moving, then that implies the collimation is being thrown off at the focuser, perhaps via slop?

    If that's rigid, I dunno why your flats would be unbalanced...

    It is possible, though, that the flat panel being close to the aperture is letting stray light into the focuser; this could potentially cause a visible issue...

    However, the only other possible cause I can think of is that the flats are over-correcting. If your flats and lights are collected with an offset, try calibrating with the input pedestal set to match the ADU value of the offset (the median pixel value in a bias frame); a quick sketch of that is below.
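
    As a rough illustration of what I mean by the pedestal value, here's a small Python sketch that estimates it from a bias frame and adds it back during a simplified calibration; the file names are hypothetical and it assumes astropy is installed. The input pedestal setting mentioned above does the equivalent job inside your calibration software.

    # Estimate the camera offset (pedestal) from a bias frame, then keep the
    # calibrated light from clipping at zero by adding it back before flat division.
    import numpy as np
    from astropy.io import fits

    bias = fits.getdata("bias_master.fits").astype(np.float32)
    pedestal = float(np.median(bias))   # the ADU value of the offset
    print(f"pedestal ~ {pedestal:.0f} ADU")

    light = fits.getdata("light.fits").astype(np.float32)
    dark = fits.getdata("dark_master.fits").astype(np.float32)
    flat = fits.getdata("flat_master.fits").astype(np.float32)

    # Simplified calibration: subtract the dark, add the pedestal back so no
    # pixels go negative, then divide by the normalised flat.
    calibrated = (light - dark + pedestal) / (flat / np.median(flat))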

  23. Is this on the 200P?

    I didn't have any issues with my 130-PDS in this regard; however, I struggled endlessly to get my mirrors to stay put on my TS-PHOTON 8".

    My conclusion has been that smaller mirrors can be held securely far more easily, and even the simple step up to 8" can make a world of difference to the rigidity required.

    Out of interest, have you tried using a laser collimator, turning the scope in the mount, and seeing if the laser's return point moves as the scope flips around? This revealed a lot of things on my Photon.
