
Astronomy dashboard


cjdawson


I'm thinking about a project for a Raspberry Pi. On a recent trip to Kielder Observatory I saw they had some dashboard screens that I liked the look of. I know they were running from Raspberry Pis because I saw them. I'd like to make something similar myself.

 

I'm hoping there are people here with loads of Pi experience who can offer some useful pointers.

 

Firstly, I want to do this on top of Raspbian. My initial thought is to split the project into two parts.
 

Part 1

will be a cron job or something similar that will scrape information from websites and store the bits locally.

I've got no experience with scraping websites.
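
Just to show the sort of thing I have in mind for part 1, I imagine the fetch-and-store bit looks roughly like this; the URLs and paths are only placeholders, not sources I've actually picked yet:

<?php
// Sketch of the cron-driven fetch: grab a feed and an image and keep
// local copies for the display pages. URLs and paths are placeholders.
$sources = [
    'https://example.com/space-weather.json' => '/var/www/html/data/space-weather.json',
    'https://example.com/sky-chart.png'      => '/var/www/html/data/sky-chart.png',
];

foreach ($sources as $url => $target) {
    $data = @file_get_contents($url);
    if ($data !== false) {
        file_put_contents($target, $data);   // only overwrite on a successful fetch
    }
}

Something like that would then just run from crontab every few minutes.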

 

Part 2

will be a web browser running in full-screen mode. The browser will display pages in rotation, like a billboard.

For fun, as I have a Pi 4, I'd like to have two displays each showing a different page. This can come later though.

 

 

Does anyone have any ideas about where I can look to get started on this?


Scraping data from websites isn't always simple, depending on how it is generated in the first place.  If it's all embedded as HTML then it's not that hard to do using Perl, PHP, Python or any number of other programming languages.  You could quite easily use that to generate static pages with a bit of javascript that loads each page in turn, or just replaces the current content with the current version of the page.
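
For the embedded-HTML case, a minimal PHP sketch might look something like this (the URL and element id are invented for the example):

<?php
// Fetch a page and pull out a single element by id.
// URL and id are made up; libxml warnings are silenced because
// real-world HTML is rarely perfectly well-formed.
$html = @file_get_contents('https://example.com/forecast');

if ($html !== false) {
    $doc = new DOMDocument();
    libxml_use_internal_errors(true);
    $doc->loadHTML($html);
    libxml_clear_errors();

    $xpath = new DOMXPath($doc);
    $nodes = $xpath->query("//*[@id='sunspot-count']");
    if ($nodes->length > 0) {
        file_put_contents('/var/www/html/data/sunspots.txt', trim($nodes->item(0)->textContent));
    }
}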

If the content of the pages you want to scrape is itself generated using javascript, say, then it can get a lot more complicated.

And obviously you should pay attention to the content of any robots.txt files for the sites you're scraping.

James


31 minutes ago, cjdawson said:

Why should I worry about robots.txt?   That's for search engines.    What I will be doing isn't a search engine, it's effectively caching data so that I can display it in the way that I want.

Strictly speaking it's not only for search engines, though they may well be the main target.  It's for any mechanical system that grabs content from a site.  For example, I believe at least one of the curl or wget applications will refuse to fetch a page from a site by default if the site disallows access to the page via robots.txt.
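
If you do want to honour it yourself, a very rough check could be something like the function below. It only looks at Disallow lines and ignores User-agent sections and Allow rules, so treat it as a starting point rather than a proper parser:

<?php
// Very naive robots.txt check: returns false if any Disallow rule
// is a prefix of the path we want to fetch.
function roughlyAllowed(string $site, string $path): bool
{
    $robots = @file_get_contents($site . '/robots.txt');
    if ($robots === false) {
        return true;                // no robots.txt found, assume it's fine
    }
    foreach (explode("\n", $robots) as $line) {
        if (preg_match('/^Disallow:\s*(\S+)/i', trim($line), $m)) {
            if (strpos($path, $m[1]) === 0) {
                return false;
            }
        }
    }
    return true;
}

var_dump(roughlyAllowed('https://example.com', '/forecast'));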

James


For a quick dashboard approach you could use Node-RED Dashboard and the many tools it provides. It runs very well on an RPi 3B+ so should be just as good on an RPi 4. You could even install Docker to control the versions of Node-RED. Here's an example of a live system running on an older RPi 3:

[attached screenshot: astroboard.png]


On reflection I think that "dashboard" isn't the right way of describing what I'm wanting. It's more like a continuous slideshow. And I'll be running it on a Pi 3 initially, maybe even my Pi 2. The Pi 4 dual-display idea is something for "later".

The idea is that it's going to be a display that shows information, then after a short delay (to be decided; something like 30 seconds or a minute) it will show the next "slide". Once it gets to the end, it'll start over.

The slides will be things like....

1. Clearoutside forecast for the next week

2. Aurora information

3. Sun spot info

4. Current Sky chart for my location

5. List of "Tonight's best" observing targets.

 

Something like that. I'm sure the content of the web page "slides" will change over time.


I'll be using Chromium on the Pi, running in full-screen mode. I've been thinking about this and, first things first, I want to build the pages that will be displayed. I'm getting close to the first one. I just need to find a place where I can get the current number of sunspots. It's proving a little tricky to find on the NOAA website.
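
Once I've found the right feed, pulling the number out should only be a few lines. The URL and field name below are stand-ins until I track down the proper NOAA source:

<?php
// Stand-in until I find the right NOAA/SWPC feed; lots of their data
// is published as JSON, which would make this part trivial.
$feed = 'https://example.com/sunspot-feed.json';            // placeholder URL

$json = @file_get_contents($feed);
if ($json !== false) {
    $data  = json_decode($json, true);
    $count = $data['sunspot_number'] ?? null;               // field name is a guess
    if ($count !== null) {
        file_put_contents('/var/www/html/data/sunspots.txt', $count);
    }
}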


I'm making progress on this; in the end I needed to split the work into two parts.

The first is a PHP script that grabs the data and does the scraping. This saves JSON files and images that will be used later. This script is going to be set up to execute as a cron job. Before I'm done with it, I'll make sure that the script stops if it does not have an internet connection.
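
The connection check can probably just be a quick reachability test at the top of the script, so a failed run doesn't overwrite the previously cached files (the host I probe is arbitrary):

<?php
// Bail out early if there's no network, so we keep the last good data
// rather than writing empty files. The test host is arbitrary.
$probe = @fsockopen('www.google.com', 443, $errno, $errstr, 5);
if ($probe === false) {
    exit(1);            // no connection; keep last run's data and try again later
}
fclose($probe);

// ... the scraping/saving code follows here ...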

 

The second part is running an Apache2 web server on the Pi, turning OFF all caching (this is deliberate) so that the updates are passed through properly. Then I'll create a page that gets queried, something like index.php. This will then serve up one of the "slide" screens, i.e. one of the HTML files that I'm creating to display each page of data. This script will keep track of which pages it can serve and which page it just served. Each time it is called, the next page is served.

This way, I can have 2/3/4 browsers requesting data and each will then display a different page. As long as the number of pages is greater than the number of displays, no two displays will have the same information on them.
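
A first stab at index.php might be no more than a counter file that remembers which slide went out last (no file locking yet, but that's fine for a handful of displays; the paths are just whatever I end up using):

<?php
// index.php - serve the "slides" in rotation. Each request gets the next
// HTML file in the folder; a counter file remembers where we got to.
$slides  = glob('/var/www/html/slides/*.html');
$counter = '/var/www/html/slides/.counter';

if (empty($slides)) {
    http_response_code(503);
    exit('No slides generated yet.');
}

$last = is_file($counter) ? (int) file_get_contents($counter) : -1;
$next = ($last + 1) % count($slides);
file_put_contents($counter, $next);

readfile($slides[$next]);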

 

This way, I can start Chromium in Kiosk mode, have it load the page and hey presto it's got a unique display.

 

By making sure that each HTML file has an auto refresh set, they will rotate happily without any extra effort.

 

How do I ensure that all pages have the refresh? Easy: I'll get index.php to insert the tag into the HTML head, or set it via the HTTP header (which is better, imo).
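
Either way it's a one-liner from index.php; sixty seconds here is just an example value:

<?php
// Option 1: send the refresh as an HTTP header (must go out before any output)
header('Refresh: 60');

// Option 2: emit the equivalent meta tag into each page's <head> instead
// echo '<meta http-equiv="refresh" content="60">';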

