Martin Meredith

Advanced Members

Community Reputation: 1,556 Excellent

About Martin Meredith

  • Rank: Sub Dwarf

  1. Martin Meredith

    SLL and SX Trius 694 / Filter Wheel - summer objects

    Hi Greg. Good to see you here again. I think it's been a slow year for many of us... You're getting some excellent results with the Trius. I love the FOV, especially with the 3" Borg (e.g. fitting M31 in a single field). I have a 77mm Borg that I bought for EAA but have been having so much fun with my 8" Newt that I haven't fully got round to testing it, but your results have inspired me to have a go at the next opportunity (although with my Lodestar the FOV won't quite be so impressive). Martin
  2. Martin Meredith

    Advice before my trip

    Hi. I don't think very many of us bother with guiding, so if you're looking to reduce the complexity of your setup, that is what I would consider dropping. You can get great results without it. That said, the 290 mini would work pretty well as your main EAA camera for some objects. Best of luck. Martin
  3. Martin Meredith


    This is sad news. I hope Maurice realised what an enormous influence he had on a great many of us through his innovations and encouragement to not let any obstacle get in the way of observing. I'm glad he himself was able to pursue astronomy throughout so many decades even in the heavy light pollution of London. I'm not sure I would have got into (and become obsessed by) EAA without Maurice and his promotion of the humble Lodestar for many years (often alone, and in the face of skeptics). He was quick to encourage new observers and to promote the search for obscure and fascinating objects. His love of the sky, both near and ultra-deep, shone through in his posts. I will miss him. Martin
  4. Martin Meredith

    A quick midsummer session

    Thanks Shirva and Neil. The data is all squirrelled away automatically (each object viewed gets its own folder of fits along with some metadata) so it can be reprocessed easily enough. Nothing is lost and deletion is made relatively difficult... Martin
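For anyone curious how that kind of automatic per-object filing might work, here is a minimal Python sketch (not the tool's actual code; the function and file names are made up for illustration, and plain data files stand in for FITS). Each call drops a sub into a folder named after the object, alongside a metadata file:

```python
import json
import os
import tempfile

def save_sub(root, obj_name, sub_bytes, metadata):
    """Store one sub in a per-object folder, alongside session metadata.
    A real EAA tool would write FITS files (e.g. via astropy.io.fits);
    plain bytes keep this sketch dependency-free."""
    folder = os.path.join(root, obj_name)
    os.makedirs(folder, exist_ok=True)
    # Number subs sequentially within the object's folder.
    n = len([f for f in os.listdir(folder) if f.endswith(".dat")])
    with open(os.path.join(folder, f"sub_{n:04d}.dat"), "wb") as f:
        f.write(sub_bytes)
    with open(os.path.join(folder, "metadata.json"), "w") as f:
        json.dump(metadata, f)

root = tempfile.mkdtemp()
save_sub(root, "M64", b"...pixel data...", {"exposure_s": 15, "date": "2017-06-21"})
save_sub(root, "M64", b"...pixel data...", {"exposure_s": 15, "date": "2017-06-21"})
print(sorted(os.listdir(os.path.join(root, "M64"))))
```

Because nothing is ever overwritten, reprocessing an old observation is just a matter of pointing the tool back at the object's folder.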
  5. Martin Meredith

    A quick midsummer session

    Hi Rob. No problem with feature requests at this stage. They're probably a lot easier to implement while things are still flexible... Switching to display the full frame ought to be feasible. I have no plans to integrate with the maps (9G is just too much to do a lot with) but I probably will at some point include an auto-annotate option which will use some subset of this data. I should also say that I don't plan to implement native captures at present, nor polar alignment/mount control/plate-solving. Nor do I see the software as being useful for sensors with huge numbers of pixels or at very fast rates in the first instance (due to the processing/memory load). The tool will occupy a different feature space from others, at least to start with. Martin
  6. Martin Meredith

    A quick midsummer session

    Hi David. I had to look up skeuomorphic (as you can see I am not a GUI designer...). I read it is making a comeback though, and I think what I'm doing fits more with the modern approach (lack of shadows, excessive borders etc.). At least I hope so, as I prefer minimalism myself.

    Regarding the eyepiece/camera lens style view, the way it came about was a desire to allow non-ugly free rotation of the image, since I do a lot of star-hopping using charts, and it is a lot less disruptive to orient the display than the camera. But having implemented it, many more advantages became apparent, e.g. reducing the visual impact of certain aberrations (tilt, coma, vignetting), and providing an unobtrusive yet extensive canvas for display controls. And I find that the circular aperture has a strong connection with the idea of observing, while a rectangular canvas does not. Having said that, you can easily see the entire rectangular image by unzooming, and you can pan with touch or with the mouse to get access to the entire image -- no pixels are lost.

    Re the motivations for a new tool and new features... Programming being the art of the possible, I guess I'm mainly interested in seeing how I can streamline my own workflow while incorporating new functionality. I firmly believe that nearly all AP processes can be incorporated into a live system, and that we are at an early stage of doing so in current EAA software. One piece of design philosophy that won't be apparent in the screenshots is that no decision in the tool is irreversible. Should I have used darks? Would I have been better with median stack combination? Why did I allow that blurred sub to get added to the stack? What would have happened if I'd used different alignment parameters? Is it failing to stack because the first frame had poor reference stars?

    The tool actually maintains the complete stack internally and allows the user to view and edit it, or realign from any given sub (this is done via the part of the ring at the base of the screen). It will in the near future allow a full reprocess when anything is changed, even in the middle of a stacking session. The challenge is to do this in the simplest possible way without interfering with what should primarily be an observing experience.

    As for other functionality, it currently supports organisation and reprocessing of previous observations, observing lists, session and object logging, automated dark and flat library acquisition and application, gradient removal, hot pixel removal, 7 or 8 parameterised stretch families, and various stack combination options including percentile-based approaches. It will at some point support one-button-press RGB and LRGB captures, a range of colour models, G2V and other colour calibration schemes, plus image annotation. I've only ever used SLL so I don't know how much of this is novel. My criterion is a selfish one: is this something I would find useful?

    The whole thing will be open source (Python + Kivy). It is good to have options (esp. on a Mac)! Martin
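The percentile-based stack combination mentioned above is easy to sketch with numpy (an illustrative stand-in, not the tool's implementation). Percentile 50 is the median; moving away from 50 trades outlier rejection against noise:

```python
import numpy as np

def combine_stack(subs, method="mean", percentile=50.0):
    """Combine a list of aligned subs (2-D arrays) into one frame.
    Percentile combination generalises the median (percentile=50) and
    rejects outliers such as satellite trails or blurred subs."""
    stack = np.stack(subs, axis=0)          # shape: (n_subs, h, w)
    if method == "mean":
        return stack.mean(axis=0)
    if method == "percentile":
        return np.percentile(stack, percentile, axis=0)
    raise ValueError(f"unknown method {method!r}")

rng = np.random.default_rng(0)
subs = [rng.normal(100, 5, (4, 4)) for _ in range(9)]
subs[3][2, 2] = 10_000                      # simulate a satellite trail hit
mean_img = combine_stack(subs, "mean")
med_img = combine_stack(subs, "percentile", 50)
print(mean_img[2, 2], med_img[2, 2])        # mean is badly skewed, median is not
```

Because the complete stack is kept in memory, switching between mean and percentile combination after the fact is just a recomputation over the same array.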
  7. Martin Meredith

    A quick midsummer session

    Thanks DP, Roel & Rob. The inversion I find so useful that I've 'promoted' it to a top-level command (double clicking the image). And I've developed a new stretch that is mainly used with inversion.

    Gradient removal has a nice side-effect of background estimation, so it is possible to automatically set the black point, at least for a starting guess. The other indispensable control is the short line sticking out at 9 o'clock, which is like a ratchet, and makes very fine adjustments to the black point up and down. That's when I notice the gradient removal most. Setting the correct black point is a nightmare with a strong stretch and a gradient.

    I've not started on the multispectral side yet, but I'd hope the same principles behind the gradient removal approach will work there too. v1 will be mono only as I want to do a thorough job on the colour side later.

    Actually, nearly all the mono processing side is complete. It is possible to run an entire session just using the interface as you can see in the screenshots, along with the occasional (hideable) widget. The aim is to have a distraction-free interface to allow concentration on the DSOs at hand. It is mainly the offline mode that is taking time at present (organising previous captures, reloading/processing etc). But it is getting there slowly. Then I have to figure out how to make a distributable application.... Martin
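To illustrate that background-estimation side-effect, here is a crude numpy sketch (my own toy example, not the tool's algorithm): fit a plane to the image, subtract it, and use the residual background level as a first guess for the black point. Real gradient removal would use sigma-clipping or higher-order models to avoid being biased by stars:

```python
import numpy as np

def estimate_background_plane(img):
    """Least-squares fit of a plane a*x + b*y + c to the image -- a crude
    stand-in for gradient removal."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    return (A @ coeffs).reshape(h, w)

rng = np.random.default_rng(1)
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
gradient = 0.5 * xx + 0.2 * yy + 30          # sky gradient plus pedestal
img = gradient + rng.normal(0, 1, (h, w))    # synthetic "sky" frame

bg = estimate_background_plane(img)
flattened = img - bg
# With the gradient gone, a low percentile of the residual is a natural
# starting guess for the black point.
black_point = np.percentile(flattened, 10)
print(float(flattened.mean()), float(black_point))
```

Once the gradient is out of the way, that automatic black point only needs small ratchet-style adjustments rather than constant chasing across the field.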
  8. Martin Meredith

    A quick midsummer session

    Thanks Bruce. I did take some flats, but that part of the tool is in the middle of a rewrite so I didn't use them. I'll construct a master and see how much difference they make. Certainly the sensor is very dusty at the moment but it doesn't seem to show on these images. There is also (automatic, but tweakable) gradient removal used on these images, which does make quite a difference when the stretch is extreme. Thank you too, JAO. Video works pretty well in most conditions (compared to visual, where the moon stops play), although the quality will vary. But you will be able to see a lot of detail in most DSOs even with the moon out. Most of the shots above were taken in the same quarter of the sky as the moon, in fact. Inversion of the image really helps when there is LP/moonlight, as the noise becomes less noticeable and you can push things further to pull out details. You can always do narrowband when the moon is out too. Martin
  9. Martin Meredith

    A quick midsummer session

    Hard to believe that the nights are drawing in again starting next week, after what has been a lousy start to the year here... Just time for a quick session last night, the first for about 5 weeks. Here's a series of screenshots from the live session. Object details are on the screenshots apart from the first, which is M104. The entire session used StarlightLive as a capture engine running non-stop, and my software picked up the FITS from a monitored folder. Quattro 8" f4, Lodestar X2 mono, alt-az mount, no filters. No darks nor flats applied, just bad pixel removal. Exposure length is calculated automatically but I'm still working on making this robust (hence the ??).

    This M12 was based on 5s subs.

    Here's the Black Eye Galaxy (M64) shown inverted. Exposure is actually 7 x 15s. And non-inverted, 11 x 15s.

    This is a galaxy pairing I'd not come across before: NGC 5846 and 5850 in Virgo, near the border with Serpens. The distances are quite different (88 vs 125 million light-years). This is quite a long exposure (37 x 15s) as I wanted to see the faint structure around the barred spiral emerge (I gave this one a really strong stretch, much easier to tolerate when the image is inverted). The elliptical is actually two ellipticals, NGC 5846A being the smaller of the two; the distance estimates for the pair of ellipticals also suggest no clear relationship (88, 113).

    Finally, a few of Arp's peculiar galaxies: Arp 49 in Bootes (21 x 15s), Arp 72 in Serpens, and Arp 336 in Ursa Major (10 x 30s), the so-called Helix Galaxy. This is a polar ring galaxy where the galaxy is surrounded by a ring of gas nearly at right angles. You can just about make out the ring in this shot. High clouds were coming and going throughout the session and this one seems to have been affected more than most.

    Thanks for looking. Martin
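The monitored-folder arrangement described above (one program writes FITS files, another picks them up) can be sketched in a few lines of Python. This is an illustrative polling version, not the actual software, and empty files stand in for FITS; a real tool might instead use an OS file-watch API:

```python
import os
import tempfile

def poll_new_files(folder, seen, suffix=".fits"):
    """Return files that appeared in `folder` since the last poll.
    A live tool would call this in a loop and feed each new sub to
    the stacker."""
    current = {f for f in os.listdir(folder) if f.endswith(suffix)}
    new = sorted(current - seen)
    seen |= current          # remember everything we've now seen
    return new

watch = tempfile.mkdtemp()
seen = set()
processed = []
for name in ["sub_001.fits", "sub_002.fits"]:
    # Simulate the capture engine dropping a sub into the folder.
    open(os.path.join(watch, name), "wb").close()
    processed += poll_new_files(watch, seen)
print(processed)
```

The nice property of this decoupling is that any capture engine works unmodified, as long as it writes FITS files to a folder.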
  10. Martin Meredith

    Starlight Live 3.3 and below now CRASHES on startup!

    All I can say is that v3.3 still works for me as robustly as ever on El Capitan v10.11.6 on my 2015 MacBook Air -- not that there have been many occasions to use it recently. Martin
  11. Martin Meredith

    converting sqm mag to number of photons

    I've always found this article by Herbert Raab on point sources to be useful for these types of calculations. There are some example calculations in the appendix. Martin
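To get a rough feel for these calculations, here is a small Python sketch. The zero point below (about 1000 photons s^-1 cm^-2 per angstrom at V = 0, over a roughly 880 A wide V band) is an approximate textbook figure of the kind used in Raab's article, and atmospheric and optical losses are ignored, so treat the numbers as order-of-magnitude only:

```python
import math

# Approximate V-band zero point: ~1000 photons s^-1 cm^-2 A^-1 at V = 0,
# integrated over a ~880 A bandpass.
PHOTONS_V0 = 1000 * 880        # photons s^-1 cm^-2 for a V = 0 source

def photon_rate(mag, aperture_cm):
    """Photons per second collected from a V-magnitude `mag` source by an
    unobstructed aperture of diameter `aperture_cm` (losses ignored)."""
    area = math.pi * (aperture_cm / 2) ** 2
    return PHOTONS_V0 * 10 ** (-0.4 * mag) * area

# For sky brightness, read the SQM value as the magnitude of one square
# arcsecond of sky: e.g. SQM 21.0 through a 20 cm aperture gives roughly
# one photon per second per square arcsecond.
rate = photon_rate(21.0, 20.0)
print(round(rate, 2))
```

The 10^(-0.4 m) factor is the whole story: every 2.5 magnitudes costs a factor of ten in photons, which is why sub length matters so much more under bright skies.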
  12. Martin Meredith

    Flats problem

    Thanks Oddsocks. I learned a lot too from that explanation. Martin
  13. Martin Meredith

    Astronomy heaven plus a black eye

    Looks like a good session! I really like your image of NGC 4559. There's something about its inclination that makes it appear to just hang there in the star field. It's also great to see the effect of different stretches. I'm finding that log does a good job on globulars. At some point I'll try to put together a variety of stretches (with the black point held constant) for M3 that I captured recently. BTW, reading the title I feared you'd misjudged the distance to the eyepiece or something...

    It's great when the temperature and clear nights come together, isn't it? Yesterday was looking perfect over here and I was all set up on the grass when a single large cloud hovered over the region I was hoping to observe (deep down in Hydra -- possibly will have to wait until next year now to mop up the last few Hicksons). All I had to show for the evening was a mosquito bite. Martin
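Since different stretches are being compared above, here is a minimal log stretch in numpy (an illustrative formula, not any particular program's implementation; the `scale` parameter is made up). It lifts faint values strongly while compressing the bright end, which is why it suits globulars with their blazing cores and faint halos:

```python
import numpy as np

def log_stretch(img, black_point=0.0, scale=1000.0):
    """Simple logarithmic stretch mapping [black_point, max] to [0, 1].
    `scale` controls how aggressively faint values are lifted."""
    x = np.clip(img - black_point, 0, None)
    x = x / x.max() if x.max() > 0 else x
    return np.log1p(scale * x) / np.log1p(scale)

img = np.linspace(0, 1, 5)     # five equally spaced input levels
print(np.round(log_stretch(img), 3))
```

Note how the second level (25% of full scale) already lands near 0.8 of the output range: almost all the display contrast goes to the faint end.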
  14. Martin Meredith

    Flats problem

    Did you also use a master bias or flat dark? If not, you can get this kind of over-correction effect. Or: it does look as though the dust may have moved slightly, since the artefacts are darker on one side and lighter on the other. Martin
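To see why a missing bias or flat dark causes mis-correction, here is a toy numpy example of the standard calibration chain (illustrative frame values throughout, not anyone's real data): the uncorrected pedestal left in the flat dilutes its real variations, so dust shadows are no longer fully divided out:

```python
import numpy as np

def calibrate(light, dark, flat, flat_dark):
    """Standard calibration: dark-subtract the light, then divide by the
    dark-subtracted, normalised master flat."""
    flat_c = flat - flat_dark
    flat_norm = flat_c / flat_c.mean()
    return (light - dark) / flat_norm

# Toy frames: one dusty pixel attenuates light to 80%; the flat frames
# carry a 2000 ADU bias/dark pedestal.
truth = np.full((4, 4), 1000.0)
vignette = np.ones((4, 4)); vignette[1, 1] = 0.8
dark = np.full((4, 4), 50.0)
flat_dark = np.full((4, 4), 2000.0)
light = truth * vignette + dark
flat = 20000.0 * vignette + flat_dark

good = calibrate(light, dark, flat, flat_dark)
bad = calibrate(light, dark, flat, np.zeros((4, 4)))   # flat dark omitted
# `good` is uniform; `bad` still shows a residual dust artefact.
print(round(float(good.std()), 6), round(float(bad.std()), 2))
```

The same arithmetic also explains the dust-has-moved case: dividing by a flat whose shadows are in the wrong place subtracts brightness on one side of each artefact and adds it on the other.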
  15. I'm guessing you're not on a Mac? If you were, Time Machine does exactly what you want: https://support.apple.com/en-us/HT201250 I simply plug in a hard disk every now and then and forget about it (it also runs wirelessly if you have a Time Capsule). Not only that, it makes recovering files a doddle, so much so that I no longer worry about versioning any code I write -- I just delve into the backup, choose a date, and see the entire file system as it was on that date. Super reliable. Maybe there is something similar for other OSs. Martin
