This project displays a static image on a screen driven by a Raspberry Pi and animates it, using a video created with MyHeritage Deep Nostalgia, whenever a sonar/proximity sensor detects someone nearby. The project is broken down into two parts:
- Building a Raspberry Pi powered picture frame.
- Using a Python script to read the sonar data and play the video for its duration when someone comes within a desired range.
Rather than re-invent the wheel, I've included some Instructables tutorials I found that showcase how to build the picture frame, covering a wide range of display options. Personally, I chose a simple RPi display and made a custom frame to suit my needs. My own materials list, with non-affiliated links, is included below.
Here are a few different versions I found on Instructables that walk you through building the entire frame. They are a great resource, even for a seasoned builder.
- Moving Sirius Black Wanted Poster (Harry Potter)
- This project has both static and moving components. Great resource.
- Moving Portrait
- This project uses a Kindle as the display, which is rather neat.
- Live Portrait
- This is the most similar to what I did, but it uses different components.
- Raspberry Pi
- Any version of the Raspberry Pi is technically usable, but I would recommend at least a Pi 2, as the original may be underpowered. In my case, I used a Pi 2 Model B.
- RPI Display
- Screen Dimensions: 194mm x 110mm x 20mm (including standoffs)
- Viewable screen size: 155mm x 86mm
- Sonar Sensor
- HC-SR04 Distance Sensor to detect if someone approaches the display.
- Small Breadboard (Optional)
- I used the breadboard to connect the HC-SR04 to the Pi in a neat way (see the quick wiring check after this list).
- OPTIONAL
- Picture frame mat (Optional)
- I purchased mine from Staples. I do suggest getting the uncut version so you can size it to your particular needs.
- Speaker Fabric (Optional)
- To clean up the appearance and hide the sonar sensor, I overlaid speaker fabric on top of the picture frame.
- External Power Supply
- Misc wires, ties, tubes for installation
- Wood & Tools, depending on which version of the frame you decide to build.
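Before assembling the frame, it can help to confirm that the Pi can read the sensor. Below is a minimal wiring-check sketch; it assumes the gpiozero library (preinstalled on Raspberry Pi OS) and illustrative GPIO pins, neither of which is prescribed by this project, and it assumes the HC-SR04's 5V echo line passes through a voltage divider before reaching the Pi.

```python
# Quick HC-SR04 wiring check (a sketch; pin numbers are placeholders).
from time import sleep

from gpiozero import DistanceSensor

# Hypothetical wiring: TRIG on GPIO 23, ECHO on GPIO 24 (via a voltage divider).
sensor = DistanceSensor(echo=24, trigger=23, max_distance=2.0)

while True:
    # gpiozero reports distance in metres, so convert to centimeters.
    print(f"Distance: {sensor.distance * 100:.1f} cm")
    sleep(1)
```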
`omxplayer-wrapper` (available on PyPI) is a Python wrapper for `omxplayer`, the native RPi video player. It was one of the simplest solutions to use, but it can be a bit finicky at times.
There are several variables that you have to adjust:
- `VIDEO_PATH` - the path of the video on the RPi. The first frame will be displayed as the static image.
- `DISTANCE` - the distance at which auto-play will trigger, in centimeters.
- `_SLEEP` - a delay to allow the video to be loaded into the buffer on initialization. This may be obsolete on newer Raspberry Pi models.
- `_PIN_TRIGGER` - the GPIO pin that sends the trigger pulse to the sonar sensor, telling it to take a measurement.
- `_PIN_ECHO` - the GPIO pin that reads the sensor's echo response. When a reading falls within the provided distance, the video will play.
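To show how these variables fit together, here is a minimal sketch of the playback loop. It is an illustration under assumptions, not the exact script in this repo: the video path and pin numbers are placeholders, the sensor is read with the standard RPi.GPIO trigger/echo pattern for the HC-SR04, and playback uses omxplayer-wrapper's `OMXPlayer` class.

```python
import time

import RPi.GPIO as GPIO
from omxplayer.player import OMXPlayer

# Placeholder values -- adjust them for your own wiring and video file.
VIDEO_PATH = "/home/pi/videos/portrait.mp4"  # first frame doubles as the static image
DISTANCE = 100       # trigger distance, in centimeters
_SLEEP = 5           # seconds to let omxplayer buffer the video on start-up
_PIN_TRIGGER = 7     # physical pin wired to the sensor's TRIG input
_PIN_ECHO = 11       # physical pin wired to the sensor's ECHO output


def measure_distance_cm():
    """Fire one ultrasonic pulse and return the measured distance in centimeters."""
    GPIO.output(_PIN_TRIGGER, GPIO.HIGH)
    time.sleep(0.00001)                      # 10 microsecond trigger pulse
    GPIO.output(_PIN_TRIGGER, GPIO.LOW)

    pulse_start = time.time()
    while GPIO.input(_PIN_ECHO) == 0:        # wait for the echo line to go high...
        pulse_start = time.time()
    pulse_end = pulse_start
    while GPIO.input(_PIN_ECHO) == 1:        # ...then time how long it stays high
        pulse_end = time.time()

    # Speed of sound is roughly 34300 cm/s; halve it for the round trip.
    return (pulse_end - pulse_start) * 34300 / 2


def main():
    GPIO.setmode(GPIO.BOARD)                 # physical pin numbering
    GPIO.setup(_PIN_TRIGGER, GPIO.OUT)
    GPIO.setup(_PIN_ECHO, GPIO.IN)
    GPIO.output(_PIN_TRIGGER, GPIO.LOW)
    time.sleep(2)                            # let the sensor settle

    # Load the video looped, then hold on the first frame as the "portrait".
    player = OMXPlayer(VIDEO_PATH, args=["--loop", "--no-osd"])
    time.sleep(_SLEEP)                       # give omxplayer time to buffer
    player.pause()

    try:
        while True:
            if measure_distance_cm() <= DISTANCE:
                player.play()
                time.sleep(player.duration())  # let the clip play through once
                player.set_position(0.0)       # rewind to the first frame...
                player.pause()                 # ...and hold it again
            time.sleep(0.5)
    except KeyboardInterrupt:
        pass
    finally:
        player.quit()
        GPIO.cleanup()


if __name__ == "__main__":
    main()
```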
Please see the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See LICENSE.txt for more information.
If you'd like to get in contact, the best way is probably Twitter or opening an issue.