Everything posted by teoria_del_big_bang

  1. No problem, that's what SGL is about and believe me I have had more than my fair share of help from these good members 🙂 Steve
  2. I think I am right in what I am saying here. To group images together so you can use WBPP to calibrate images over several nights, and group them so images from one night use the flats from that night, you need to use keywords. Just to recap, below is basically what was in one of my replies to another thread (the one you refer to in the OP), with some added comments to help. (There are also two small Python sketches after this list; the first illustrates the grouping idea.)
     I have data on NGC 7000 for 3 different nights. I take flats after each session or night and I make sure the flats go in the same directory as the lights for each night. Now I used to just name the directories with the date, such as 22-09-13 *** This I have had to change to use this grouping, as I will demonstrate below ***. I use the format Year-Month-Day; that way the directories are easy to list in order, as Windows file manager looks at the year first, then month and finally day. If you use Day-Month-Year they cannot be sorted chronologically by the filename, only by the date attributes, but those can change when you transfer or change any data so they are not reliable. Incidentally, in the main directory of the target I also keep one of my images without a crop, to use for framing up any future sessions if I return to that target to add further data.
     So in each of these directories I have lights and flats for each night, with further sub-directories for filters and exposure times. You also have to add the darks, and bias if using one; I often like to keep a copy of the masters (not all the raw frames) in the same directory, then they all load at once. So when it comes to using WBPP I can just select the directory "NGC_7000_North_America".
     Without us doing anything WBPP will group both flat and light frames by filter and exposure time, as it takes these values from the FITS header. But it will not differentiate them by date to group into the 3 different sessions. Now why WBPP is not clever enough to do that I am not sure, as the date is also in the header, albeit as a date and time. That may not be too useful anyway, as the date usually changes at midnight and a session normally spans two days. In NINA you can select an option not to change the date at midnight, so that would actually work if WBPP were changed to group automatically by date, but as it stands it does not do this; and even if it were added in future, that is still no good for old data with dates already in the headers.
     So we have Flats:- And Lights:- As you can see we have all the 300S Ha lights together for the 3 sessions. But that's not what we want, because we want the flats from each night to be made into separate master flats, and then the correct master used on the correct lights for that night. So we need a keyword to further group these flats and lights. To do this we use keywords and enter the keyword(s) into WBPP. I use the keyword "DATE". But with the directories named as I had them this does nothing, because WBPP cannot find a keyword called DATE, so nothing changes; it is still grouping all frames from the 3 nights together.
     Sorry for all that waffling, you probably already knew it, but to use a keyword, that keyword, in this case "DATE", must either be in the FITS header or in the file path. It will first look in the header for the keyword, but there isn't one; there are some date-related keywords as shown below, but none called just "DATE", and these are no good anyway as they also include the time.
     After looking in the header it will look in the file path, not just the filename but the whole path. In this instance the file paths are like below:-
     Z:\DATA_and_PROJECTS\LIGHTS\DSOs\QHY268M\NGC_7000_North_America\2022-09-13\LIGHT\HA\300.00\2022-09-13_23-17-18_HA_-10.00_300.00s_0000.xisf
     Z:\DATA_and_PROJECTS\LIGHTS\DSOs\QHY268M\NGC_7000_North_America\2022-09-16\LIGHT\HA\300.00\2022-09-16_23-27-55_HA_-10.00_300.00s_0000.xisf
     Z:\DATA_and_PROJECTS\LIGHTS\DSOs\QHY268M\NGC_7000_North_America\2022-09-19\LIGHT\HA\300.00\2022-09-20_02-21-26_HA_-10.00_300.00s_0012.xisf
     Now although I do have all the frames grouped under three different directories that are dates, as it stands WBPP does not know these are what we want to group them by. So I have to add the keyword I want to use, "DATE", to those directories, followed by an underscore. The underscore is very important. (The second sketch after this list shows a one-off rename that adds the prefix to existing session directories.) So with the renamed directories we get this:- And now when we load the same "NGC_7000_North_America" directory it will group them, as we have the keyword DATE enabled. So that's how WBPP uses keywords.
     Now what does that mean for how you organise your directories? Well mine are basically as shown in this post, with the change of adding "DATE_" + the actual date, and then making sure all the flats and lights for that night are within that directory somewhere. So I have the format:-
     CCD ---- TARGET ---- DATE_YEAR-MONTH-DAY ---- FRAME TYPE (Flat, Light) ---- FILTER ---- EXPOSURE ---- FRAME NAME
     In NINA, to do this automatically I have the following in options:-
     \$$CAMERA$$\$$TARGETNAME$$\date_$$DATEMINUS12$$\$$IMAGETYPE$$\$$FILTER$$\$$EXPOSURETIME$$\$$DATETIME$$_$$FILTER$$_$$SENSORTEMP$$_$$EXPOSURETIME$$s_$$FRAMENR$$
     So you can use any name you want as a keyword, and it can be either in the actual filename or in any of the directories, as WBPP looks at the whole file path. In actual fact I have only used the "DATE_" keyword and not found a need for anything else. Of course it doesn't have to be a date; you could call them "Session1", "Session2" etc. I guess if you had some old image files without everything in the headers then grouping them under your own keywords could be a way round that too. I hope that helps, and sorry if it is a bit long winded, I do waffle on somewhat as I get older 🤣 Steve
  3. I can answer this better when I get home from work and look at my directory structure. Basically I would say yes, that level of organisation is fine, but you need to add an underscore after the keyword you want to use, as PI uses the underscore as a marker to look for the keywords. Let me get back to you later and explain better. Steve
  4. Great image and I love the subtle colours 🙂 Steve
  5. Very nice image, love the colours. What's the issue with the flats ? Maybe somebody with a similar setup can help here ? Steve
  6. That's such bad luck and I really feel your pain. But I take my hat off to you, and others, that do get out away from what seems to be ever increasing LP near our homes. I keep telling myself to get out there with my imaging but, apart from a few UK holidays in remote cottages and a washout at Kelling last year, I have not got off my backside yet 🙂 I hope that irons out all the mistakes you can make and the next venture is more rewarding, keep trying 🙂 Steve
  7. Just seen it on Astrobin and that really is a great image 🙂 Steve
  8. It is, and we have all been there believe me. Guiding took me a fair while to get my head round, but once the penny drops and you get your setup right it just works, every time (well, nearly 🙂 ). Once it is running there are routines in PHD2 you can run so it learns what might help and will suggest you change some parameters, but first just get it working; you can come back and ask then what to do to improve. Good luck. I just looked out and the clouds are clearing, so my lens cap is also off now and I am starting to get aligned on my target. Steve
  9. One thing that also confuses me is how, without visible stars, you managed to get it calibrated. Normally when using it for the first time it will lock on a star and then move the scope in all directions by several pixels to see the star move, so it knows the orientation of the guide camera and, when the scope moves one way, which way it then sees the star move. I don't want to complicate things yet till you get it seeing stars and get the stars in focus, as that is nothing to do with your screen full of noise, but it needs to happen once you do 🙂 Steve
  10. I won't insult you by asking if the lens cap was removed 🙂 Difficult to say exactly what is wrong from that photo, but the first thing that strikes me is that the slider shown below, ringed in red, appears to be fully to the right on yours, and if I have the one on mine anywhere near half way then I have an almost white screen. So I would move this to the left and then move it right slowly till you have the stars showing up. Below is where my slider is (ignore the fact there are no stars, my lens cap is on as it's still passing clouds here). Keep the frame rate between 2 and 4 seconds, I think that is what most people use. Make sure the guide camera has connected correctly and that PHD2 has not connected to your imaging ccd by mistake. If that doesn't help then maybe explain what camera and guidescope you have and post some screenshots of your PHD2 settings.
      Also, if this is your first time imaging, I found it helped not to bother with guiding to start with and take shortish frames, say 30 to 60 seconds if your tracking allows, until you get some sort of images you can stack. Going this route and trying to get the longest exposures you can without guiding and without noticeable trailing means your setup regarding PA, tracking etc is as good as it can be, and then the guiding is not having to do lots of work, just give that extra nudge to allow longer exposures. PHD will not fix bad tracking or bad PA, so you may be able to get at least 1 min subs without guiding and maybe even 2 or 3 minutes; I think I managed to tune mine up to 5 minutes without too many issues. Then get the guiding working and ramp it up to 5 mins or longer. Just a thought ? Steve
  11. Managed from around 7pm till fog engulfed the area about 2am on Monday night.
      Nebula: SII 600S x 10, Ha 300S x 12, OIII 600S x 11
      Stars: Red 30 sec x 12, Grn 30 sec x 12, Blu 30 sec x 12
      Not spent too long on processing this so colours may be a bit too much, just wanted to see what the data was like; it probably doesn't need it, but I imaged the same target last night so will concentrate on the image with all the data. As usual any suggestions how to improve my final image would be helpful 🙂 Steve
  12. Very nice, bring on some more clear nights I say 🙂 Steve
  13. Ah, found it. Not one of my best images actually, now I look back; done in my early days, and I think the seeing was not great, as I only managed 2 frames at 240 sec of R, G and B and 2 frames at 60 seconds of RGB for the core, and I think 60 s was far too much for the core, so 30 seconds would probably be better. If you use THIS blending technique then you probably don't need very long exposures for the rest, as you will be able to stretch that more without affecting the core. For what it is worth, my meagre image: https://stargazerslounge.com/topic/347340-first-attempt-at-combining-different-exposures-another-m42/#comment-3776318 Steve
  14. Nice 1st attempt, far better than mine that's for sure. The only way I tamed the core was to use short exposures and longer exposures, then blend the core, cut from the short exposures, over the main image made from the longer exposures. This was done in Photoshop, although that is not my main processing software, so I had to just follow a tutorial I found online. I am not sure what sort of exposure times I used, I think pretty short for the core; I will look and see if I can find the info. Steve
  15. Insanity ????? I am now about 4 years into AP, so take the 1st year as a learning curve (the steep bit) and then 3 years properly, and it's not a bad idea, if you get the clear nights, to get a fair few targets under your belt, with enough data to get some images, develop your processing skills and, like you say, see what's out there that suits your FOV on whatever gear you have. Then you can go back to the ones that suit your setups and improve, maybe a better framing, more data and so on, and hopefully you see how you have improved. Well, that's my theory anyway, we will see 😁 Steve
  16. Nice image Adrian 🙂 , more data would be great, I hope you get the clear skies. I don't know about you, but I have had 2 out of 3 nights this week that started out looking really good, almost like the very cold but very clear winter nights we used to get not too long ago. The 1st night I got up at 2 AM to check everything and it was completely fogged out, so scope in and back to bed; last night was almost the same at the same time, but this time snow. Luckily the snow had just started and the scope was not pointing directly up to the heavens, so all was okay, I just froze to death taking the scope and electrics in dressed in PJs and a dressing gown 🙂 I am almost thinking I should just stay up till about one or two and then end the session; I can't seem to leave kit out all night anymore, really need an automated obsy. Steve
  17. The term AI kind of baffles me too. As I see it, human intelligence really develops more through the mistakes it makes than the things it does correctly, although in many ways we never seem to learn 😂 But I will leave that argument there for all our sakes 🙂
      Nice analysis, but thinking about it further, that is not quite true to what some people maybe want. Maybe it is more that they want to make an instrument but get a musician to play it for them, or they want to grow their own vegetables and rear a cow (heaven forbid) but then get a chef to create the meal. And, whilst it is not what you or I want to do, if you are only interested in the data acquisition, then I guess I see no reason not to use some wonderful script. Like I said, in PI there is one already that does it all; I tried it the once on some old data of mine and the results were acceptable, thankfully not quite as good as my attempt (well, that was my opinion 🙂 ). It would be a fair fight if, when the image was shared, it had accompanying text to say processed in full with the PI DoItAll script, or whatever was used, but there is nothing to say they have to.
      What would be a tragedy is if somehow the script basically plate-solved your data and then retrieved other data to enhance the image or determine what should be there. If the process is a pure mathematical algorithm then fair enough; you will never get a silk purse, just the sow's ear, if the data set is not up to much. Personally, I am in your camp and want to grow my own and then take it to the table myself, and also blow my own trumpet (something I am often accused of 🙂 ), but each to their own. The thread has taken a bit of a detour from the OP's question, now isn't that unusual ? Steve
  18. Just looking at this, how do you set NINA to refocus after a flip? Mine has just flipped but did everything perfectly, recentered and all that, but no refocus. Do you have to use the DIY Meridian Flip plug-in ? Steve
  19. Cheats! How very dare you 🤣 Fantastic image, well done. Nothing better than being able to get out with some new gear, and nothing worse than first getting it then looking out every night at clouds, rain and whatever else we get thrown at us for weeks on end. Steve
  20. I know some people really hate the processing side of this hobby; some love doing it and will willingly take on the task of having a go at processing somebody's data to help them out and show what can be done with their data. I am in between at the moment. I used to hate processing because I just couldn't do it like some others who are so good at it, but with more practice I am jumping the fence a bit and getting to like it. But really, do we want auto post processing ? I am all for auto pre processing, as that is purely mathematical to get the best results, and I guess some time, not too far away, in PI for sure there will be scripts to do all the post processing (there already is one, and it actually does a fair job of it if that's all you want), but do we all want the same results from our data ? Or at least nearly the same, as some people will always have better gear than you, better darker skies, more time available and so on, but essentially the results would all start to look very generic. Or do we still want that artistic side to it, to interpret our data as we see fit ?
      I am no authority on this by a long chalk, but I still maintain there are no natural, true to life astro images of any DSO taken from Earth. Yes, we all comment about others' images and often say that's so natural looking, normally, as you say, when the colours are not so garish and in your face, but even LRGB, to my way of thinking, is not really natural; we could never see what we process images to show, so many stars, and colour. NB images are certainly not natural colours, not even the stars, unless we falsify the colours by colour calibration of some sort. But we look at so many images that we perceive what the proper colours are, or what they should be, and we even expect what many of the DSOs should look like with an SHO palette, which is totally unnatural, and who's to say we all expect to see the same colours.
      I have not been on SGL that long, but when I joined I remember seeing quite a lot of mono images, and to me, starting out having seen all those wonderful HST images, I wondered what the heck that was all about, but really, are they not more natural images ? I think we find them maybe not as interesting, and maybe that's why we do not see so many these days. I don't really know what is right, just what I think, but I guess we are not all the same and want different things from what we are doing, and that's probably how it should be 🙂 Steve
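
Referring back to post 2 above: below is a minimal Python sketch, purely my own illustration rather than WBPP's actual code, of how a "DATE_" token followed by a value anywhere in a file path can be picked out and used to bundle frames into per-session groups. The paths and the group_token helper are made up for the example.

```python
import re
from collections import defaultdict

def group_token(path: str, keyword: str = "DATE") -> str | None:
    """Return the value following 'KEYWORD_' in the path, or None if absent."""
    match = re.search(rf"{re.escape(keyword)}_([^\\/]+)", path)
    return match.group(1) if match else None

# Hypothetical frame paths following the DATE_<Year-Month-Day> directory convention.
frames = [
    r"Z:\NGC_7000_North_America\DATE_2022-09-13\LIGHT\HA\300.00\frame_0000.xisf",
    r"Z:\NGC_7000_North_America\DATE_2022-09-13\FLAT\HA\flat_0001.xisf",
    r"Z:\NGC_7000_North_America\DATE_2022-09-16\LIGHT\HA\300.00\frame_0000.xisf",
]

# Frames sharing the same DATE value fall into the same group, so each night's
# flats can be matched with that night's lights.
groups = defaultdict(list)
for frame in frames:
    groups[group_token(frame)].append(frame)

for session, files in sorted(groups.items()):
    print(session, "->", len(files), "frame(s)")
```

WBPP does the equivalent matching internally once the keyword is entered, as described in the post; the sketch is only there to show why "DATE_" plus the value has to appear somewhere in the path.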
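
For the renaming step in post 2 (putting "DATE_" in front of each existing Year-Month-Day session directory), a hypothetical one-off helper might look like the second sketch below. Again this is my own sketch, not part of NINA or PixInsight; the root path is taken from the example paths in the post and would need changing for your own layout.

```python
import re
from pathlib import Path

# Root taken from the example paths in the post; adjust for your own layout.
target_root = Path(r"Z:\DATA_and_PROJECTS\LIGHTS\DSOs\QHY268M\NGC_7000_North_America")

# Plain Year-Month-Day directory names, e.g. "2022-09-13".
session_pattern = re.compile(r"^\d{4}-\d{2}-\d{2}$")

for child in target_root.iterdir():
    if child.is_dir() and session_pattern.match(child.name):
        renamed = child.with_name(f"DATE_{child.name}")
        print(f"{child.name} -> {renamed.name}")
        child.rename(renamed)  # comment this line out first for a dry run
```

New sessions then pick up the prefix automatically from the date_$$DATEMINUS12$$ part of the NINA file pattern quoted in post 2.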