Hannah Wolfe, Media Art & Technology
R.O.V.E.R.: Reactive Observant Vacuous Emotive Robot
R.O.V.E.R. is an autonomous robot, designed and built as an interactive art installation for the MAT End of Year Show (EoYS) 2013. He navigates the space with a heat-sensing array and bump sensors and interacts with attendees. When R.O.V.E.R. recognizes that a person is present, he plays a song. Whether people remain visible and how much they are smiling affect whether he thinks they liked the song, and then he tries the next one. A genetic algorithm is run over the songs and people's reactions to evolve the most likable song. The heat sensor data and the space and people he interacted with were mapped in Processing and visualized in the space. The installation consisted of the robot exploring the space and a monitor showing the visualization. This project explores Human-Robot Interaction, Computer Vision, Physical Computing, Interactive Environments, Biomimicry, and Genetic Learning Algorithms.
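As a rough illustration of how evolving songs from audience reactions can work, here is a minimal genetic algorithm sketch in Python. It assumes songs are encoded as short lists of MIDI-style notes and that a fitness score has already been computed from the reaction (e.g., faces still visible and smile size); the note range, song length, and function names are illustrative assumptions, not R.O.V.E.R.'s actual code.

```python
import random

NOTES = list(range(48, 72))  # MIDI-style pitch range (assumption)
SONG_LENGTH = 16             # notes per song (assumption)

def random_song():
    return [random.choice(NOTES) for _ in range(SONG_LENGTH)]

def crossover(a, b):
    """Splice two parent songs at a random cut point."""
    cut = random.randrange(1, SONG_LENGTH)
    return a[:cut] + b[cut:]

def mutate(song, rate=0.1):
    """Randomly replace some notes to keep the population varied."""
    return [random.choice(NOTES) if random.random() < rate else n for n in song]

def evolve(population, fitness):
    """fitness maps a song's index to its measured audience reaction score."""
    ranked = sorted(range(len(population)), key=fitness, reverse=True)
    parents = [population[i] for i in ranked[: len(population) // 2]]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    return parents + children
```

Over repeated generations, songs that draw smiles keep their note patterns in the population while the rest are replaced.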
[Video: ROVER Hallway, from Hannah Wolfe on Vimeo]
[Video: ROVER Visualization, from Hannah Wolfe on Vimeo]
Goals: The minimal required outcome was for the Arduino to send heat data to an off-board computer via XBee, for the Arduino to steer the wheels using the heat data, and for a visualization to show face tracking, eigenface data, and how much people were smiling. The reach goal was a mapping visualization of the space. All of these goals were met, though the real-time face-tracking visualization was deemed not interesting enough to show in the End of Year Show.
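For a sense of what the laptop side of that XBee link might look like, below is a minimal Python sketch that reads packets as serial lines (using the pyserial library). The port name, baud rate, and the comma-separated framing of two wheel velocities plus 64 temperatures are assumptions for illustration; the project's actual wire format may differ.

```python
import serial  # pyserial; the XBee radio shows up as an ordinary serial port

# Port name, baud rate, and line framing are illustrative assumptions.
xbee = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)

while True:
    line = xbee.readline().decode('ascii', errors='ignore').strip()
    if not line:
        continue
    fields = line.split(',')
    # Assumed framing: left velocity, right velocity, then 64 temperatures
    if len(fields) == 2 + 64:
        left, right = float(fields[0]), float(fields[1])
        temps = [float(f) for f in fields[2:]]
        # ...insert (left, right, temps) into the MySQL database here...
```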
[Image: Facial recognition and smile detection visualization with eigenface breakdown]
Visualization Design: The visualization maps the last 3,000 data points. For each point, the 64-pixel temperature sensor array is displayed. The left and right wheel velocities were converted into a trajectory, based on the time between data points, as sketched below. If a face was seen at a point, a circle is drawn next to the heat grid, with a radius based on how many faces were visible. In the upper left corner is a larger real-time grid of the most recent heat sensor data, with a red circle next to it if a face was seen. The visualization is updated each frame with the newest data.
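The wheel-velocity-to-trajectory step is standard differential-drive dead reckoning. Here is a minimal Python sketch, assuming velocities in meters per second and an approximate wheel base for the iRobot Create; the constant and function name are placeholders, not the visualization's actual code.

```python
import math

WHEEL_BASE = 0.26  # approximate distance between the Create's wheels, in meters

def step(x, y, heading, v_left, v_right, dt):
    """Advance a differential-drive pose (x, y, heading) by one time step dt."""
    v = (v_left + v_right) / 2.0             # forward velocity
    omega = (v_right - v_left) / WHEEL_BASE  # angular velocity
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += omega * dt
    return x, y, heading
```

Integrating this step over each pair of logged wheel velocities, with dt taken from the timestamps between data points, yields the drawn path.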
R.O.V.E.R. Design: R.O.V.E.R. was inspired by the loneliness and emptiness of the hallways of Elings Hall and the need for companionship. He also draws on the idea of Pavlov's dog, which learns through classical conditioning. I viewed R.O.V.E.R. as a puppy or alien trying to learn how to interact and communicate with people, so I sewed him a furry felt covering. Many attendees of the show wanted to pet and touch R.O.V.E.R. for this reason. A future version of this project would include touch sensors in the fur so R.O.V.E.R. could use that contact as another input. R.O.V.E.R.'s structural design was driven by technical needs, and its flaws were embraced to make him more lifelike and comedic. The design was intentionally simple and ambiguous so that viewers could project their own ideas onto the piece. R.O.V.E.R. was initially conceived in the TransLab, where I repurposed a remote-control car and had it followed by the tracking system; the directions sent to the car and the spatialized sound depended on where the car was in the space. I wanted to take the project outside the TransLab and its tracking system and let it roam the hallways.
Technical Details: R.O.V.E.R.'s onboard brain is an Arduino Mega 2560. It is connected to an iRobot Create, from which it receives bump sensor data and to which it sends drive directions and songs. The Arduino reads one infrared thermopile, a 16x4-pixel heat camera (Melexis MLX90620), to track body heat. There is also a GoPro with a Wi-Fi BacPac taking a photo every 5 seconds. The Arduino is connected wirelessly to a laptop via a pair of XBees and sends the drive directions and the heat sensor array. On the laptop, a Python script takes the drive directions and heat sensor data and inserts them into a MySQL database. The same script pulls the most recent photo from the GoPro's Wi-Fi network and runs a Haar cascade on it to detect faces; if any are found, it runs a second Haar cascade to check for smiles and measure how large they are. This data is also saved to the MySQL database. If a face is detected, a song is sent to the Arduino via XBee. A Processing application running on the laptop pulls the XBee and facial recognition data from the database and visualizes a map of the data.
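Below is a sketch of the two-stage face-then-smile pass described above, using OpenCV's Python bindings and its stock Haar cascade files; the detection parameters are typical values, not necessarily the ones the project used.

```python
import cv2

# Stock OpenCV cascade files; the project's exact models are an assumption.
face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
smile_cascade = cv2.CascadeClassifier('haarcascade_smile.xml')

def detect(photo_path):
    """Return (face_count, largest_smile_area) for one GoPro still frame."""
    gray = cv2.cvtColor(cv2.imread(photo_path), cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    largest_smile = 0
    for (x, y, w, h) in faces:
        # Only search for smiles inside each detected face region
        roi = gray[y:y + h, x:x + w]
        smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
        for (sx, sy, sw, sh) in smiles:
            largest_smile = max(largest_smile, sw * sh)
    return len(faces), largest_smile
```

Running the smile cascade only inside detected face regions keeps false positives down and directly yields a per-face smile size for the database.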
Future Work: I would be interested in integrating facial recognition so that R.O.V.E.R. could learn preferences for specific people. I would like R.O.V.E.R. to detect when his battery is low and charge himself. I would look into using streaming video instead of still frames for face detection, give the camera a longer battery life or charge it through the Create, and mount the camera higher so it sits closer to face level. I would also look for a better way to show the world from his point of view; this could be done by putting an accelerometer in the head and having the heat data or video sway the way he would see it. Lighting was an issue, which could be fixed by making R.O.V.E.R.'s head translucent and glowing from inside, illuminating the space around him. Finally, I would like to run further tests to see what tune R.O.V.E.R.'s genetic algorithm settles on.
Literature Review: