An expanded cinema experiment that connects a participant's brain activity (both voluntary and involuntary) to camera controls via an open-source brain-computer interface. The system is currently under development; when finished, it will operate in lieu of a traditional cinematographer, with the user's brainwaves dictating the final composition.
With this tool I hope to establish a new mode of cinematic production that allows the participant/actor's affect to be transcribed directly into the image itself. Instead of a portrayal of emotions seen on video, we see the neural state itself, as the system weaves emotional output into the construction of the image. Research has been ongoing since 2014 across a range of platforms ...
Prototype 002.??.15 (under construction)
In Spring 2015, with support from the Coca-Cola Critical Difference for Women Graduate Student Grant, I purchased OpenBCI, an open-source bio-sensing controller. This summer, at The Laboratory Residency, I learned how the system works through trial and error and by experimenting with programming examples from the OpenBCI community. At present I am working on a fully integrated system for capturing and manipulating video on the fly, using Max/MSP & Jitter for image control and Arduino for motor control (camera position). Because Max/MSP & Jitter can integrate with Kinect, there may also be interesting possibilities for brain-to-3D imaging on the fly. Check back in the fall for updates.
ORIGINAL Prototype 001.11.14
My first prototype was developed at MediaLab-Prado’s Interactivos?14 workshop in Madrid, Spain. There I worked in collaboration with Gregory Hough and Salud Lopez to develop Biofeedback Cinema Prototype 001.11.14. The initial prototype connected the participant's brain activity to camera position and camera focus. We developed the system across the NeuroSky MindWave EEG headset, Raspberry Pi (running Python 3 scripts with OpenCV), and Arduino platforms.
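The control logic of such a system can be sketched in Python. This is a hypothetical reconstruction rather than the actual prototype code: it assumes the headset reports a 0-100 "attention" value (as the NeuroSky MindWave's eSense meter does) and omits the serial I/O to the headset and the Arduino.

```python
# Hypothetical sketch of a brainwave-to-camera control loop.
# Assumes a 0-100 attention reading from the NeuroSky MindWave;
# serial I/O to the headset and Arduino is omitted.

def attention_to_servo_angle(attention, min_angle=0, max_angle=180):
    """Map a 0-100 attention reading to a servo angle (camera position)."""
    attention = max(0, min(100, attention))  # clamp out-of-range readings
    return min_angle + (max_angle - min_angle) * attention / 100.0

def attention_to_focus_step(attention, steps=10):
    """Quantize attention into one of `steps` discrete focus positions."""
    attention = max(0, min(100, attention))
    return min(steps - 1, int(attention / 100.0 * steps))

# Example: a moderately concentrated participant
print(attention_to_servo_angle(75))  # 135.0
print(attention_to_focus_step(75))   # 7
```

In the actual build, values like these would be written over a serial link to the Arduino, which drives the servo and focus mechanism.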
For a complete overview of this first prototype, as well as step-by-step instructions to build your own, visit the Instructable online.
Prototype 001.11.14 on display at The Ohio State University, Columbus, OH
To bring the camera into focus, a participant concentrates while reading; a small monitor displays the results. The image focuses relative to the participant's own level of focus, based on alpha brainwaves.
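One way to approximate this focus effect in software is to map the attention level to the size of a Gaussian blur kernel: low concentration yields a heavy blur, high concentration a sharp frame. This is an illustrative sketch under that assumption, not the prototype's actual script; the `max_kernel` value is arbitrary.

```python
def attention_to_blur_kernel(attention, max_kernel=31):
    """Lower attention -> larger (odd) Gaussian kernel -> blurrier image.

    An attention of 100 returns 1 (no blur); 0 returns max_kernel.
    """
    attention = max(0, min(100, attention))  # clamp to the 0-100 range
    k = int(round((1 - attention / 100.0) * (max_kernel - 1)))
    k = k if k % 2 == 1 else k + 1  # OpenCV requires an odd kernel size
    return max(1, k)

# With OpenCV, each frame would then be blurred like so:
#   k = attention_to_blur_kernel(current_attention)
#   frame = cv2.GaussianBlur(frame, (k, k), 0)
```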
A small webcam mounted on a servo motor moves based on involuntary brainwaves.
Prototype 001.11.14 demo on display at MediaLab-Prado in Madrid, Spain
Participants had fun attempting to bring the camera into focus using their alpha brainwaves.
Choreographer and dancer Salud Lopez performs with the system at the Interactivos?14 exhibition
Collaborators Greg and Salud working hard to bring Prototype 001.11.14 to life at Interactivos?14. Greg's Python experience, specifically his knowledge of OpenCV, made this prototype successful.
Amazing support from MediaLab-Prado's Fab Lab staff, who designed and laser-cut the housing for Biofeedback Cinema Prototype 001.11.14.
Salud drawing connections between physical movements of the body and brainwave frequency patterns.