Augmented Reality Interaction with the Pausch Bridge research - May 2013

Programmer

Augmented Reality Interaction with the Pausch Bridge was an undergraduate research project I did in Spring 2013 at Carnegie Mellon University, advised by Kayvon Fatahalian. It won two awards at the 2013 Meeting of the Minds undergraduate research symposium: the 1st place Alcoa Undergraduate Research Award and the 1st place Frank-Ratchye STUDIO for Creative Inquiry Award.

The project allows users to control the bridge lighting by touching the bridge's light panels on an iPad screen. The system uses OpenCV's 2D feature-matching functions to detect feature points in the live camera frame and match them to feature points in a reference image of the bridge. Once the points have been matched, OpenCV's findHomography function estimates a transform from screen coordinates to reference-image coordinates. From there, the system looks up the transformed point in a second image that maps each coordinate to a panel ID. If a panel is hit, the app sends an HTTP GET request to a proxy server connected to the bridge's Pharos lighting controller, and the proxy relays the proper command to the controller.
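As a rough sketch of that pipeline, the C++ below uses OpenCV's modern API. The function names mapTouchToReference and lookupPanel, the choice of ORB as the feature detector, and the panel-map encoding (one byte per pixel, 0 meaning "no panel") are illustrative assumptions, not the project's actual code.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Map a touch point in the camera frame to reference-image coordinates.
// frame and reference are assumed to be grayscale CV_8U images.
// Returns true if a homography could be estimated.
bool mapTouchToReference(const cv::Mat& frame, const cv::Mat& reference,
                         const cv::Point2f& touch, cv::Point2f& refPoint) {
    // Detect and describe features in both images (ORB chosen here for
    // illustration; any OpenCV 2D feature detector/descriptor would do).
    cv::Ptr<cv::ORB> orb = cv::ORB::create();
    std::vector<cv::KeyPoint> kpFrame, kpRef;
    cv::Mat descFrame, descRef;
    orb->detectAndCompute(frame, cv::noArray(), kpFrame, descFrame);
    orb->detectAndCompute(reference, cv::noArray(), kpRef, descRef);
    if (descFrame.empty() || descRef.empty()) return false;

    // Match descriptors between the camera frame and the reference image.
    cv::BFMatcher matcher(cv::NORM_HAMMING);
    std::vector<cv::DMatch> matches;
    matcher.match(descFrame, descRef, matches);
    if (matches.size() < 4) return false;  // homography needs >= 4 pairs

    // Collect matched point pairs and estimate the frame-to-reference
    // homography, rejecting outlier matches with RANSAC.
    std::vector<cv::Point2f> srcPts, dstPts;
    for (const cv::DMatch& m : matches) {
        srcPts.push_back(kpFrame[m.queryIdx].pt);
        dstPts.push_back(kpRef[m.trainIdx].pt);
    }
    cv::Mat H = cv::findHomography(srcPts, dstPts, cv::RANSAC);
    if (H.empty()) return false;

    // Transform the touched screen point into reference-image coordinates.
    std::vector<cv::Point2f> in{touch}, out;
    cv::perspectiveTransform(in, out, H);
    refPoint = out[0];
    return true;
}

// Look up the panel ID at a reference-image coordinate. panelMap is a
// single-channel image whose pixel values are panel IDs (0 = no panel).
int lookupPanel(const cv::Mat& panelMap, const cv::Point2f& refPoint) {
    cv::Point p(cvRound(refPoint.x), cvRound(refPoint.y));
    if (p.x < 0 || p.y < 0 || p.x >= panelMap.cols || p.y >= panelMap.rows)
        return 0;
    return panelMap.at<uchar>(p);
}
```

With a panel ID in hand, the app would then issue a GET along the lines of http://proxy.example/setPanel?id=42&color=ff0000 (a hypothetical URL format, not the project's actual endpoint) to the proxy, which translates it into a command for the Pharos controller.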

This project was made possible by a SURG Grant for the 2013 Spring semester at Carnegie Mellon University.

Abstract

Augmented Reality Interaction with the Pausch Bridge seeks to provide a virtual interface on mobile devices that allows anyone to point their camera at the bridge and change the lighting, creating new methods of interaction with the lighting system and opening up the bridge lighting to a larger part of the campus community. Using this interface, I developed a virtual finger-painting application in which a user viewing the bridge through an iPad viewfinder can change the lighting by swiping a finger across the display. Changes are reflected in real time both on the viewfinder and on the bridge.

Poster

2013 Meeting of the Minds Poster (.pdf)