If none of the software I try achieves what I would like, I will need another method of providing the user with information about the various aspects of the exhibit. I would therefore overlay the video stream with a separate layer that gives the info, rather than have it integrated into the video itself.
The pressure pads work by sending a signal to the computer when two conductive surfaces touch. Placed under the terrain, they could trigger the info specific to the area the rover is in. This, however, requires the terrain to be thin enough, and the rover heavy enough, to trigger the pads.
The conductive plates work by sending a signal to the computer when a wire underneath the rover makes contact with them. The biggest problem with this is that they would have to be on top of the terrain and thus visible.
These devices (magnetic reed switches) are often used in burglar alarms to detect whether a door is open, and could be modified to send a signal to the computer when a magnet underneath the rover passes above them. They could sit above or below the terrain depending on the magnet's strength; however, the activation area would be quite small, requiring the rover to be driven precisely over it.
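Whichever sensor is chosen (pressure pad, conductive plate, or reed switch), each one reaches the computer as a simple digital input, and the remaining work is mapping a triggered input to the info for that area. A minimal sketch of that mapping in Python; the pin numbers and captions here are placeholders, not values from my design:

```python
# Hypothetical mapping from a sensor's GPIO pin number to the exhibit
# information that should appear when the rover triggers that sensor.
ZONE_INFO = {
    17: "Crater rim: formed by meteorite impact.",
    27: "Dune field: shaped by Martian winds.",
    22: "Landing site: where the rover touched down.",
}

def info_for_trigger(pin):
    """Return the overlay text for a triggered sensor, or None if the
    pin is not associated with any zone."""
    return ZONE_INFO.get(pin)

# On a real Raspberry Pi this function would be called from a GPIO
# event callback; here it is just called directly for illustration.
```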
This would be a camera attached to the computer and placed high above the exhibit, detecting when the rover, which is a contrasting colour to the terrain, drives into a certain area of the exhibit. This is quite a complicated solution and requires somewhere high up to mount the camera. It would also mean re-exploring Pure Data.
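As an illustration of the colour-tracking idea (not the eventual Pure Data patch itself), the core operation is thresholding each frame for the rover's contrasting colour and taking the centroid of the matching pixels; that position can then be compared against the zone boundaries. A rough NumPy sketch, with the threshold values chosen arbitrarily for a red rover on green terrain:

```python
import numpy as np

def find_rover(frame, min_rgb=(200, 0, 0)):
    """Return the (row, col) centroid of pixels at or above min_rgb in
    every channel, or None if no pixel matches.
    frame is an H x W x 3 uint8 RGB array."""
    mask = np.all(frame >= np.array(min_rgb), axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic frame: green terrain with a single red 'rover' pixel.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:, :, 1] = 120            # green terrain everywhere
frame[10, 20] = (255, 50, 50)   # red rover at row 10, col 20
```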
The Raspberry Pi can output video over HTTP as an MJPEG stream, and it can also stream via RTSP using the H.264 codec. So far I have successfully managed to get both of these formats working.
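An MJPEG stream is essentially a sequence of complete JPEG images, each delimited by the JPEG start-of-image (0xFFD8) and end-of-image (0xFFD9) markers, so individual frames can be carved out of the raw bytes even without a full multipart-HTTP parser. A minimal sketch of that framing logic; the byte string in the test stands in for data read from the Pi's HTTP endpoint:

```python
SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def extract_jpeg_frames(buf):
    """Split a raw MJPEG byte buffer into complete JPEG frames,
    ignoring any multipart boundaries or junk between them."""
    frames = []
    start = buf.find(SOI)
    while start != -1:
        end = buf.find(EOI, start + 2)
        if end == -1:
            break  # incomplete frame at the end of the buffer
        frames.append(buf[start:end + 2])
        start = buf.find(SOI, end + 2)
    return frames
```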
The difficulty then is to pull them into a program which can manipulate them, using shape or colour recognition and chroma-keying. The programmes that I previously highlighted as potential solutions failed to work.
Manipulating video in Pure Data
Both Pure Data and Quartz Composer are effective for manipulating video from a webcam attached to the computer, but are less capable of receiving a stream over a network. Pure Data's pix_video and pix_film objects can access webcams and files, but not network streams. A patch has been created for Quartz Composer that can work with certain IP cameras, but only selected Axis camera models are compatible.
I may need to combine several pieces of software to get my project running correctly.
Scratch is a visual, tile-based programming environment that can be used to create and control animations, games, music and even external outputs. It is aimed at children because of its simplicity, but it is capable of much of what more complicated environments can do. It comes pre-loaded on the Raspberry Pi and could be used to control the rover via the GPIOs.
Pure Data is another visual programming language and environment that can control and process sound, graphics and animation, and interface with external inputs and outputs. It could be used to chroma-key the video so that it can be combined with the other info or videos.
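Chroma-keying itself is a simple per-pixel operation: wherever a frame pixel is close enough to the key colour, substitute the corresponding background pixel. Pure Data would do this inside a patch; purely as an illustration of the operation, here is a NumPy sketch with an arbitrarily chosen green key and tolerance:

```python
import numpy as np

def chroma_key(frame, background, key=(0, 255, 0), tol=60):
    """Replace pixels within tol of the key colour with the background.
    frame and background are H x W x 3 uint8 RGB arrays of equal shape."""
    diff = frame.astype(int) - np.array(key)
    mask = (np.abs(diff) <= tol).all(axis=-1)  # True where pixel ~ key
    out = frame.copy()
    out[mask] = background[mask]
    return out
```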
Yet another visual programming language, Quartz Composer is used to create various visual applications for Mac OS. It is similar to Pure Data in a lot of its capabilities, but it brings a few different possibilities because it can handle video streams better.
This idea is similar to some of the designs that have been created for the Google Lunar XPrize.
>Two large wheels and two smaller, non-powered stabilising wheels
>Body houses all the electronics
>Camera angle can be changed by changing the stabilising wheel angle