
jakeybob

New Members
  • Content Count: 24
  • Joined
  • Last visited

Community Reputation

9 Neutral

About jakeybob

  • Rank
    Nebula

Profile Information

  • Gender
    Male
  • Location
    Glasgow, Scotland
  1. At first glance this looks like an unfortunate double typo to me: 1 Mpc should be 3.26 million LY, and 6,715 m should be 67,150 m. It's a weird way of phrasing it, even if the numbers were correct. To be fair, km/sec/Mpc is barking mad, even as astronomical units go, so it's easy to slip up on.
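For anyone who wants to sanity-check that megaparsec-to-light-year conversion, a quick back-of-the-envelope in Python (the constants are standard values, not from the post):

```python
# Check that 1 Mpc is about 3.26 million light years.
MPC_IN_KM = 3.0857e19   # kilometres in one megaparsec
LY_IN_KM = 9.4607e12    # kilometres in one light year

ly_per_mpc = MPC_IN_KM / LY_IN_KM
print(f"1 Mpc = {ly_per_mpc:.3e} ly")   # ~3.26 million light years
```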
  2. There are ways to work around the uncertainty principle to a certain extent. For instance, you can observe a "quantum non-demolition" observable like momentum as opposed to position. The uncertainty principle states that Δx·Δp ≥ ħ/2 (give or take some factors of 2, depending on convention). So, imagine you are trying to measure some quantum system: if you measure x (the position) you necessarily impart some momentum p. This means that when you take your next position measurement the object will have moved unexpectedly, due to the momentum you imparted by observing it. If you can directly sense the momentum instead, then all that happens is you impart a change in position, and this change in position has no effect on successive measurements of the momentum. http://en.wikipedia.org/wiki/Quantum_nondemolition_measurement You can also do some funky things using "squeezed" light. This light has less noise in one quadrature (e.g. amplitude or phase, which are analogous to position and momentum in terms of the uncertainty principle) at the expense of more noise in the other, whereas in "regular" light the noise is split evenly between the two. This allows you to make, for example, extra-sensitive measurements of light phase using an interferometer. This has been done with success in table-top experiments, and has been employed on a large scale in the GEO600 gravitational wave interferometer ( http://arxiv.org/pdf/1109.2295v1.pdf ), with similar squeezing and QND techniques likely to be implemented in future detectors in order to make measurements below the Standard Quantum Limit set by the uncertainty principle. No laws are being broken here; it's more that the "standard" limit doesn't apply to special cases.
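As an illustrative aside (not from the post): a Gaussian wavepacket saturates the bound, with Δx·Δp coming out at exactly ħ/2. A minimal numerical sketch in natural units, with the grid size and width chosen arbitrarily:

```python
import numpy as np

hbar = 1.0                      # natural units
sigma = 0.7                     # wavepacket width (arbitrary choice)
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]

# normalised Gaussian wavefunction (a minimum-uncertainty state)
psi = (1.0 / (2 * np.pi * sigma**2))**0.25 * np.exp(-x**2 / (4 * sigma**2))

# position uncertainty from the probability density |psi|^2
prob_x = np.abs(psi)**2 * dx
mean_x = np.sum(x * prob_x)
delta_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x))

# momentum uncertainty via the FFT, which approximates the Fourier transform
p = 2 * np.pi * hbar * np.fft.fftfreq(x.size, d=dx)
prob_p = np.abs(np.fft.fft(psi))**2
prob_p /= prob_p.sum()
mean_p = np.sum(p * prob_p)
delta_p = np.sqrt(np.sum((p - mean_p)**2 * prob_p))

print(delta_x * delta_p, hbar / 2)   # the two numbers agree closely
```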
  3. Also, this link has a nice graphic depicting the increase in resolution compared to WMAP. http://sci.esa.int/science-e/www/object/index.cfm?fobjectid=51554
  4. Yup. I find it extremely bizarre that Maxwell in particular isn't better known in his own (or indeed any other) country. I think most physicists would put him in the top three (certainly the top five) physicists of all time, but most people haven't heard of him.
  5. "we are going to multiply Jupiter's orbital period in years by factors that contain Pi to determine if a consistent relationship between Jupiter's orbital period and the orbital periods of all of the planets as well as the large asteroid Vesta can be established" This says it all really. Another way of saying it would be "I multiplied pi by random fractions until I got within a few percent of the accepted values, and because it has pi in it it looks significant." At first I thought this would at least have some theory for where the factors come from - a numerical progression, a power series or whatever. But nope, as far as I can tell, it's just pick and choose. At least the Titius-Bode law has a way of generating the data (with a much better fit to reality, Neptune aside). http://en.wikipedia.org/wiki/Titius%E2%80%93Bode_law
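By contrast, the Titius-Bode law at least generates its predictions from a single formula, a = 0.4 + 0.3·2ⁿ AU. A quick sketch comparing it with rounded accepted semi-major axes (the actual values here are my own lookups, not from the post):

```python
# Titius-Bode rule: a = 0.4 + 0.3 * 2**n AU, with n = -inf for
# Mercury and then n = 0, 1, 2, ... moving outward.
bodies = [  # (name, n, accepted semi-major axis in AU, rounded)
    ("Mercury", float("-inf"), 0.39), ("Venus", 0, 0.72),
    ("Earth", 1, 1.00), ("Mars", 2, 1.52), ("Ceres", 3, 2.77),
    ("Jupiter", 4, 5.20), ("Saturn", 5, 9.58), ("Uranus", 6, 19.22),
]
predictions = {name: 0.4 + 0.3 * 2.0**n for name, n, _ in bodies}
for name, n, actual in bodies:
    print(f"{name:8s} predicted {predictions[name]:6.2f} AU, actual {actual:6.2f} AU")
```

(Neptune, as the post notes, is the glaring miss: the n = 7 slot predicts 38.8 AU, closer to Pluto.)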
  6. I was wondering this too! As far as I can tell the main advantages to webcams are their price, and (with some of them) the ease of modifying the firmware etc. Of course, the price advantage is nullified if you already own a DSLR. The small sensor size is often cited as an advantage of webcams, but I'm not entirely sure why. The individual pixel size of my DSLR sensor is essentially the same as the Xbox Live cam I've been using recently, and if anything using a larger sensor would make my life easier as the planet would take much longer to move across the field. One other factor may be the somewhat black-box nature of DSLR movie modes. The sensor in my DSLR is 4,288x2,848 pixels, but the video is output at 720p. Who knows what sort of interpolation/noise reduction is going on in there? If it were possible to get a DSLR to output a "native" avi file then it should be, as far as I can tell, as good as or better than a webcam. Of course, cropping/editing that mahooosive avi might not be too much fun. Also, it should be possible (and potentially preferable) to use the DSLR to take bursts of still photos instead of movies. These should be full res and, in RAW format, relatively free from weird post-processing effects. So long as the mirror flipping didn't cause the mount to shake too much this should be perfectly doable, but it wouldn't be able to achieve the ~30fps that some webcams can happily do.
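The pixel-size comparison above is easy to check roughly. A sketch assuming a ~23.6 mm wide APS-C sensor for the DSLR and a small 1/4-inch VGA sensor for the webcam (both assumed figures, for illustration only):

```python
# Rough pixel-pitch comparison; sensor widths are assumed, not measured.
def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
    """Pixel pitch in micrometres, ignoring inter-pixel gaps."""
    return sensor_width_mm * 1000 / horizontal_pixels

dslr = pixel_pitch_um(23.6, 4288)   # APS-C DSLR, 4,288 px wide as in the post
webcam = pixel_pitch_um(3.6, 640)   # assumed 1/4" webcam sensor at VGA res
print(f"DSLR ~{dslr:.1f} um/px, webcam ~{webcam:.1f} um/px")
```

Under those assumptions both come out at roughly 5.5 µm per pixel, which supports the "essentially the same" point.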
  7. I think there are many ways to interpret this one. This is how I would have answered it if asked by my (5 year old) niece. I'd have said "you barely have to move at all". Simply stand and look at Polaris (or indeed any circumpolar star). A "fixed" star would fit the definition of "the same point in space" as far as most people would mean it, and you can face it 24 hours a day without moving an inch! To be fair, at this point my niece would probably have gone "<SIGH!> That's not what I MEEEAN Bob!" and flounced off.
  8. A good way to visualize RA-DEC coordinates is to turn on the RA-DEC grid in Stellarium by pressing the "E" key, then speed up time a bit by tapping "L" a few times. You can see that the coordinate grid rotates with the stars, and all the "longitude" (right ascension really) lines meet at the celestial pole (i.e. near Polaris if you're in the northern hemisphere). You can also turn on the ALT-AZ grid by pressing "A" and see the difference between the two systems. RA-DEC is useful for defining the positions of objects in an absolute way. For instance, if I discovered an asteroid or something it wouldn't be very useful if I told you its position in terms of altitude and azimuth, because it would appear in a different position in the sky depending on your location and what time of night you chose to look for it. RA-DEC coordinates can always be translated to any observer's local coordinates.
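The "translated to any observer's local coordinates" step can be sketched in a few lines. This is a hypothetical helper, not from the post; it assumes the local sidereal time is already known (computing that is a separate exercise) and it ignores refraction:

```python
import math

def radec_to_altaz(ra_h, dec_deg, lat_deg, lst_h):
    """RA and LST in hours, Dec and latitude in degrees -> (alt, az) in degrees.
    Azimuth is measured from north through east."""
    ha = math.radians((lst_h - ra_h) * 15.0)   # hour angle: 15 degrees per hour
    dec, lat = math.radians(dec_deg), math.radians(lat_deg)
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    alt = math.asin(sin_alt)
    # east and north components of the horizontal direction vector
    y = -math.cos(dec) * math.sin(ha)
    x = math.sin(dec) * math.cos(lat) - math.cos(dec) * math.sin(lat) * math.cos(ha)
    return math.degrees(alt), math.degrees(math.atan2(y, x)) % 360.0

# e.g. a star on the meridian (LST == RA) sits due south, at an altitude
# of 90 - (latitude - declination) degrees for a mid-northern observer
print(radec_to_altaz(5.0, 20.0, 55.9, 5.0))
```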
  9. I bought the Cheshire eyepiece linked to by estwing above from FLO. Works great, but you'll also need some instructions on how to use it. There are lots of guides to collimation using a Cheshire online, but I found these two the most useful: http://www.astro-baby.com/collimation/astro%20babys%20collimation%20guide.htm http://www.astronomyhints.com/collimation.html
  10. Yes, as Burger mentions, you can't really see anything except a massive blur. You can point it at a light, and then cover it up to make sure it goes from a big white blur to a big black blur, but that's about it. But seeing this reminded me of a weird effect I noticed. Once I modded the cam, I plugged it into SharpCap (indoors) to see if it was still working and there was loads and loads of banding. After thinking on it for a minute, I figured it might be 50Hz noise from the lights. Thankfully it was, and no banding appears on anything taken under natural light. But I wonder if I did something when modding the camera that caused this? I can't remember if the effect was there *before* I modded it. Maybe it's just an artefact of choosing a non-native framerate or something.
  11. It wasn't attached; I unscrewed the 1.25'' section of my focus tube, exposing the thread that would attach to a T-ring adapter (if I had one), and just tried to hold the camera tight against it. I adjusted the focus by looking at the live view on the camera's LCD. It was tricky to hold the camera still and in focus, while also not moving the telescope and also depressing the shutter release, so only about 10% of the shots were usable, and most of those were from a single good burst of shots.
  12. Nice photo! Also thanks for the thought on shutter speed and ISO. I hadn't considered that making the shutter faster than the seeing would be something to think about. The seeing was pretty good when I took the shot so I guess I lucked out. I'll keep it in mind for future snaps.
  13. Yes, I always shoot in RAW (in general use as well as astronomy) as it gives you that extra latitude when making adjustments. I selected the "good" frames by eye (it was pretty obvious, as they were all either *terrible* or *passable*). As JamesF mentions above, I used PIPP (which can handle RAW files) to do cropping/centring and conversion to TIFF format. I then used Registax6 to stack the TIFFs and apply some wavelets. After that I used Aperture and Pixelmator for some contrast/levels adjustments.
  14. The settings I used were: ISO: 200, Shutter: 1/320. I took the photo at prime focus with no lens in place, so no aperture setting to speak of, and the camera was in Manual mode (indeed my camera will only release the shutter in manual mode if you don't have a lens attached). A shutter speed of 1/200 looked ok too, but 1/100 blew out lots of pixels. I meant to experiment with different ISO settings but forgot. ISO200 is the lowest "normal" setting my camera has; below that it goes into "Lo1, Lo2" etc, and I've never been too sure what they actually do to the gain at an electronic level so I ignore them. I was actually quite surprised at how *long* the shutter speeds were. Given how bright the moon is (especially in a scope) I thought <1/1000 speeds would be needed. Also, thanks for the comments folks!
  15. I tried jamming my Nikon D5000 into my scope (SW 200P) for the first time the other night to get a shot of the moon. This is the end result; eleven shots stacked/processed in Registax then curves etc adjusted in Aperture and Pixelmator. I took >100 photos but only eleven were decent, probably because I was waggling the camera about like a mad thing. If nothing else this experiment has convinced me to invest in a T-ring adapter. I'm pretty pleased with it though, might set it as my wallpaper. I wonder how much extra detail would even be visible with more shots in the stack...? I feel like it's verging on being "over-processed" as it is.
