Raspberry Pi Calendar:

Intro

Wandering around Goodwill I came across an older LCD picture frame, a Coby DP740. Normally I wouldn't be too interested in a low resolution screen that can only show pictures off a memory card, but this one was different: it had a Mini-USB input port. I found it around the time I was working on a photobooth for my sister's wedding, and the printer I was using for the photobooth took long enough to print that I was concerned people would think it was broken. The Coby picture frame seemed like a great way to deal with the 1-2 minute printer lag; I could connect to the picture frame over USB and upload a picture to it that said "Please Wait, now printing..."

So I got home, found a power supply that matched the picture frame, dug up a Mini-USB cable (getting hard to find those in my collection) and plugged the frame into my computer. Beyond the fact that it only had a small amount of storage space (something like 128MB), it worked as expected and showed up as an external drive. While the frame is connected to the computer it displays "Connected to Computer" on the screen, and after you disconnect it from the computer it starts the slide show.

I was very surprised when I then plugged the frame into my Raspberry Pi (which powered the photobooth) and the screen went black. I figured I had just fried the thing somehow, but I plugged it back into the PC and it worked normally... Every time I plugged the picture frame into the Pi, its screen would go black rather than saying "Connected to Computer".

After some digging I found the controller in the picture frame happened to be the same one found in some pico projectors, and Linux was identifying this picture frame as a projector rather than external memory. Well, this was even better than I had hoped: now I could just "project" any messages I wanted to the frame rather than having to worry about mounting mass storage devices, disconnecting USB, and all that craziness.

So this worked great for the photobooth, but recently I got it in my head that it would be nice to have a calendar at home that would update based on events Courtney and I had in our online calendars, and possibly also display the weather and time. A Raspberry Pi Zero and this picture frame seemed perfect for the job.

Loading the Calendar Data

Most of my projects use way too much Bash scripting and PHP, but I know them well enough by now that they're good for rapid prototyping at least. On the photobooth, to generate the graphic for the picture frame I had been using a program (webkit2png.py) to render a PHP page as a PNG file. So to keep as much the same as possible, my base was going to be a dynamically generated PHP calendar page.

The first thing I had to do was figure out how to get data from a Gmail calendar and a Microsoft Exchange calendar. They both will output a calendar feed as an ICS file. Rather than learn all about how to parse an ICS file, I did some looking to see if anyone had a nice simple PHP library that already parsed the data. After looking at some different options I settled on ics-parser. This project is over 7 years old, but it had one nice simple file for me to download and include, and I was done.
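Roughly, using it looks something like this. I'm going from memory here, so the include file name may be slightly off, and the feed URL is just a placeholder:

<?php
// Rough sketch of loading a feed with the older single-file ics-parser.
// The include file name and the feed URL below are placeholders.
require 'class.iCalReader.php';

// Gmail and Exchange both publish the calendar as an ICS feed; grab a local copy first.
file_put_contents('gmail.ics', file_get_contents('https://calendar.google.com/calendar/ical/.../basic.ics'));

$ical = new ICal('gmail.ics');
foreach ($ical->events() as $event) {
    // Each event comes back as an array keyed by the raw ICS property names.
    echo $event['SUMMARY'] . ' - ' . $event['DTSTART'] . "\n";
}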

There is a link on the project page to a newer and much more up-to-date project HERE. I didn't go with that project, partly because I liked the simple install of the older one (I had never heard of "Composer" before and it just sounded annoying for my small use case), and partly because I didn't think I would use the advanced features of the newer project. I was wrong about that and ended up recreating most of those features from scratch, so if you are looking to duplicate my work I'd highly suggest using the newer ics-parser.

Interpreting the Calendar Data

Now that I had the data loaded into a PHP array, the first thing I thought about was that I would need some way to compare dates, first to filter out anything that wasn't part of the week I wanted to display. To me it made the most sense to convert all the times to UNIX Epoch (a count of the number of seconds since January 1st 1970 in the UTC timezone, so 1 AM on May 20th 2018 UTC would be represented as 1526778000). I thought: this is a simple number, much easier to work with than a month/day/year. This was not a good idea. I should have converted them all to standard PHP DateTime objects, which handle things like a day being more or less than 24 hours (I assumed a day is 24 hours, and that's not always the case).
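As a quick sanity check, you can always turn an epoch value back into a readable UTC time with a one-liner:

<?php
// gmdate() formats a timestamp in UTC, so this prints "2018-05-20 01:00".
echo gmdate('Y-m-d H:i', 1526778000), "\n";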

Correctly converting times to UNIX Epoch was also made harder by Gmail giving me all times in UTC and Exchange giving me times in Central Time. That's another reason to use the standard PHP DateTime object: it worries about timezones so you don't have to. Still, at the time I thought it would be easier to compare plain numbers (a count of seconds) than something as complex as a date/time. Once I got the time stuff figured out, the rest of the ICS file was very easy to read (print_r() is your friend). The interesting fields I pulled out were the obvious, self-explanatory ones: the event title (SUMMARY), start time (DTSTART), and end time (DTEND). There are some other fields as well, but nothing I needed for this project.
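For what it's worth, here is roughly what normalizing the two feeds can look like using PHP's DateTime classes. The raw DTSTART strings below are my assumption of what the feeds hand back (Gmail's ending in Z for UTC, Exchange's without), not values copied from my actual calendars:

<?php
// Convert a raw ICS timestamp string to a UNIX epoch, picking the right timezone.
function icsTimeToEpoch($raw)
{
    if (substr($raw, -1) === 'Z') {
        // A trailing Z means the time is already UTC (how Gmail sends them).
        $dt = DateTime::createFromFormat('Ymd\THis\Z', $raw, new DateTimeZone('UTC'));
    } else {
        // Otherwise assume Central time (how Exchange was sending them to me).
        $dt = DateTime::createFromFormat('Ymd\THis', $raw, new DateTimeZone('America/Chicago'));
    }
    return $dt->getTimestamp();
}

echo icsTimeToEpoch('20180520T130000Z'), "\n";  // 1526821200 (1 PM UTC)
echo icsTimeToEpoch('20180520T080000'), "\n";   // 1526821200 (8 AM Central, same instant)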

Building the Calendar Layout

After a lot of back and forth I found my 800x480 picture frame could display 14 days (two rows of 7). I decided to display the previous day, the current day, and 6 days into the future, which leaves room for 4 days of weather and a little bit of space for the future. This lets me label each column with a day of the week and keep both rows on the same days. It means that "today" is always in the same spot on the screen but the days of the week are always shifting. I've found this makes sense and is easy to follow.
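For reference, here is a simplified sketch of how that fixed window of days can be built with DateTime. This isn't my exact code, and the timezone is just an example:

<?php
// Build the 8 calendar days: yesterday, today, and 6 days into the future.
$tz   = new DateTimeZone('America/Chicago');
$days = array();
for ($offset = -1; $offset <= 6; $offset++) {
    $day = new DateTime('today', $tz);
    $day->modify(sprintf('%+d days', $offset));
    $days[] = $day;
}
foreach ($days as $day) {
    echo $day->format('D n/j'), "\n";   // e.g. "Sat 5/19", used as a column label
}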

The hardest thing I ran into when coding the calendar was Daylight Saving Time (have I mentioned that I really should have used the PHP DateTime object?). To figure out which events belonged on which day, I took the current day and added or subtracted multiples of 86400 (24 hours * 60 minutes in an hour * 60 seconds in a minute). This worked great until we had a day that was less than 24 hours. With the PHP DateTime object you can say things like "-1 day" and it's smart enough to realize you don't mean 24 hours, you mean one calendar day. I rewrote this section of code to use the DateTime object, so at least my calendar doesn't jump from Saturday to Monday when the clocks spring forward on Sunday, but I was too lazy to rewrite the whole thing.
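Here's a quick sketch of the difference, using the 2018 spring-forward Sunday in US Central time as an example (these dates aren't from the real calendar code):

<?php
$tz    = new DateTimeZone('America/Chicago');
$today = new DateTime('2018-03-11 12:00:00', $tz);   // the Sunday the clocks sprang forward

// Naive math: assume every day is 86400 seconds long.
$naive = new DateTime('@' . ($today->getTimestamp() - 86400));
$naive->setTimezone($tz);
echo $naive->format('Y-m-d H:i'), "\n";   // 2018-03-10 11:00 - an hour off

// Calendar math: DateTime knows March 11th was only 23 hours long.
$aware = clone $today;
$aware->modify('-1 day');
echo $aware->format('Y-m-d H:i'), "\n";   // 2018-03-10 12:00 - one calendar day back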

It is a picture frame, after all, so my intention was to have the background of the calendar be a nice image like what Bing has on its homepage, which is why all the background colors in the calendar page are transparent.

Rendering HTML to PNG

I'm a big believer in "if it's doing its job, don't try to 'improve' it." So even though the photobooth has been used several times since my sister's wedding and I've made changes, I keep working off the same setup I started with. I haven't had to reinstall any of the software in a while, and since the photobooth is not online I don't worry about doing updates (which might break things). For the photobooth, as I mentioned before, I was using webkit2png.py, which uses QT to render a webpage. Due to security changes in the QT library I couldn't get this to work anymore.

After a bunch of hunting for something that could render a webpage to a PNG without needing a graphical interface, I found the PhantomJS project. It's way more powerful than what I need, but a simple couple-line script gets it to render the PNG, and it even correctly leaves the background transparent. My sample code (obviously everything in {} needs to be replaced with real values):

var page = require('webpage').create();
page.viewportSize = { width: 800, height: 480 };
page.open('http://{url}cal/cal.php', function(status) {
    console.log("Status: " + status);
    if (status === "success") {
        page.render('{directory}cal/output/cal.png');
    }
    phantom.exit();
});

"Projecting" an Image (or a Video)

This is where things went a little sideways (in a good way, I think). I knew from the photobooth that I can use the software am7xxx to "project" images onto the screen. am7xxx gives me two format options, JPG and NV12, and again from the photobooth I knew NV12 is required. I happened to be browsing through the documentation and noticed a usage example:

am7xxx-play -F2 -i http://download.blender.org/peach/bigbuckbunny_movies/BigBuckBunny_640x360.m4v

Just for the heck of it I ran the command, expecting the Raspberry Pi to choke on the stream. I was amazed to find that the Pi had no problem streaming that video to the picture frame. This sparked a new idea :)

Bing sometimes uses looping motion backgrounds, so wouldn't it be cool if I could have a moving background behind the calendar? Some problems to overcome immediately came to mind: the Pi Zero wasn't going to be up to doing the video processing itself, I needed a source of good looping background video, and am7xxx-play has a lag when starting playback that would ruin a seamless loop.

I have a small personal webserver that is sitting around bored most of the time, so I decided to have it do all the heavy lifting and let the Pi worry only about playing the video file. As I said, I decided to use the video loops that Bing publishes on their homepage. I found that searching Bing for "az/hprichv" got me a pretty good list of high quality, looping MP4 files to download. I'm unsure exactly how OK it is legally to use Bing's video for this, so I won't post the links, but if you are actually going to undertake this project, getting the video will be the easy part.

The videos from Bing are of course much higher resolution than my picture frame, so step one is to shrink the video file down to something more reasonable. I used ffmpeg for all my video file work; it's nice and easy to script in Linux. I did find the version of ffmpeg on my CentOS 6 server was outdated, so I ended up downloading the newest version direct from them: https://www.ffmpeg.org/download.html.

So resizing the video file is simply:

ffmpeg -i SourceVideo.mp4 -vf scale=800:480,setsar=1:1 ReducedSizeVideo.mp4

This is pretty straightforward: -i is the input and -vf is a video filter. scale=800:480 resizes the video to exactly 800x480, which will stretch or squeeze the picture if the source isn't the same aspect ratio (you'd need a crop or pad filter to avoid that), and setsar=1:1 sets the sample aspect ratio to square pixels so nothing tries to compensate for the resize later. And obviously the last file name is the output file.

Then just pop the PNG file onto the ReducedSizeVideo:

ffmpeg -y -i ReducedSizeVideo.mp4 -i RenderedPNGFile.png -filter_complex "[0:v][1:v]overlay" -b:v 20000000 OutputFile.ts

This isn't too much more complex. -y tells it to overwrite any existing files. -i is input (notice there are two input files this time). -filter_complex is for newer, more advanced filters; in this case we are telling it to overlay the second input (our PNG) on top of the first (the video). -b:v specifies the video bitrate; for some reason, if I didn't set it ffmpeg would pick a low bitrate and the video looked bad. Then finally the output file name. You'll notice I use a file ending in .ts, so the rendered file is just a data stream with no container header or (more importantly) any length information.
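On the server all of this gets glued together with a little script. Here's a simplified sketch of the idea in PHP; the paths and the render.js file name are placeholders, and it just shells out to the same PhantomJS and ffmpeg commands shown above:

<?php
// Re-render the calendar PNG and composite it onto the pre-shrunk background loop.
// All paths here are placeholders for wherever the real files live.
$script = '/var/www/cal/render.js';
$video  = '/var/www/cal/ReducedSizeVideo.mp4';
$png    = '/var/www/cal/output/cal.png';
$out    = '/var/www/cal/output/OutputFile.ts';

// Step 1: PhantomJS renders cal.php to a transparent PNG.
shell_exec('phantomjs ' . escapeshellarg($script));

// Step 2: ffmpeg overlays the PNG on the background video at a healthy bitrate.
shell_exec(sprintf(
    'ffmpeg -y -i %s -i %s -filter_complex "[0:v][1:v]overlay" -b:v 20000000 %s',
    escapeshellarg($video),
    escapeshellarg($png),
    escapeshellarg($out)
));

Something like that can just run from cron every few minutes so the .ts file always has the current calendar on it.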

One of the problems with playing back video using am7xxx-play is that there's a 2-5 second lag before playback starts. Not that big a deal, but I want it to be smooth and seamless. I could render out an hour-long video, so that I only get the lag when it starts over every hour, but that limits me to only updating the information on the picture frame once an hour, and who wants all that extra video data lying around? I remembered the old trick to kill a fax machine. Because the video files are headerless, I can use PHP to do the same thing: read out one video file after another, or just read out the same file endlessly. am7xxx-play just assumes it's a live streaming video and keeps playing the data.
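A minimal sketch of what that endless PHP stream can look like (the file name and path here are placeholders, not my exact script):

<?php
// stream.php - loop the headerless .ts file forever so am7xxx-play treats it
// as a live stream. The path is a placeholder.
set_time_limit(0);
header('Content-Type: video/MP2T');   // no Content-Length, so the "stream" never ends

$file = '/var/www/cal/output/OutputFile.ts';

while (true) {
    $fp = fopen($file, 'rb');
    if ($fp === false) {
        break;                        // nothing rendered yet, give up
    }
    while (!feof($fp)) {
        echo fread($fp, 188 * 1024);  // read in multiples of the 188-byte TS packet size
        flush();
    }
    fclose($fp);
    clearstatcache();                 // pick up a freshly rendered file on the next pass
}

The Pi then just points am7xxx-play at that script's URL, the same way it pointed at the Big Buck Bunny stream above.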

Extras (weather data/clock)