An AI-powered museum guide that can sit on your shoulder. Intended as a demonstration of how physical movement can complement AI capabilities.
More process photos are featured in my Lab Notebook. All original models are on OnShape.
The code cannot yet be shared, as the project is still in the process of being published; any updates on that process will be posted here.
Process
PuppetGuide is a research project led by Suibi Che-Chuan Weng and Krithik Ranjan in ATLAS's ACME Lab. My role covered the physical electronics and fabrication. An ESP8266 drives the servo, stepping through angle values stored in a queue. A separate Python script extracts amplitude values from an MP3 file, then averages, interpolates, and maps them into servo positions. A book on the art of Karakuri provided inspiration for translating the servo's rotation into a wider variety of movement.
On the aesthetic side, the goals were for the puppet to appear to speak along with its audio and to give a visible indication that it is listening. To achieve this, the puppet was designed as a bat with big ears that raise and lower depending on its current state. Fabrication mixed 3D prints with upholstery foam and reached a rough prototype before my internship ended.
Final Presentation
At the end of the internship program, I gave a presentation on my contribution to the project. Below is the Zoom recording, including my final slides.