The Software Behind the Pandemic
Posted on: Monday, February 14, 2011
by Wendy Levy, BAVC Creative Director
One of the highlights for me at Sundance this year was Lance Weiler's Pandemic 1.0 at New Frontiers and all over Park City - an immersive, issue-driven treasure hunt that extends story worlds and real worlds, human and computer interaction, conscious and intentional game play. I got a chance to talk to Mark Harris, the developer who wrote the Mission Control software that makes it all possible. Before I could distill everything he told me (between Mark and Lance, I got the best docent tour of the project I could ever have hoped for!), Mark sent me his recent blog post from Desperate Comfort, which gives a great perspective. We've been talking about and designing interfaces for story/data projects for a while now at the Producers Institute, and I love when the talk gets real and someone builds something where a body is immersed in a narrative experience, a mind plays a game, and an event brings people together to make change in the world. Check it out:
Lance Weiler's Pandemic 1.0 was an immersive experience at the Sundance 2011 Film Festival, involving mobile phones, a real-world scavenger hunt for storyworld objects, NFC, Stickybits, Twitter, Gowalla, Facebook, and viral videos created by Sabi Films. I wrote the Mission Control software that ties all of this together.
Mission Control uses the various interactions from the sources above to actually drive the story forward. The story moves according to mass user interactions, which is perfect for a story about the spread of a virus, because people's actions can both spread and retract the virus in the story. Some actions help, some hurt. In Pandemic 1.0, the story is then expressed through a series of data visualizations by Jeff Clark.
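To make the idea concrete, here is a minimal sketch of how mass interactions might drive a virus counter. The event names and weights are invented for illustration; the actual Mission Control internals were not published in this post.

```python
# Hypothetical sketch: each type of user interaction either spreads or
# retracts the fictional virus. These names and weights are made up.
ACTION_EFFECTS = {
    "checkin": +3,        # a location check-in spreads the virus
    "tweet": +1,          # a tweet spreads it slightly
    "object_found": -5,   # recovering a storyworld object helps
    "water_bottle": -10,  # a healing item retracts it strongly
}

def apply_interactions(infected, events):
    """Fold a batch of user interactions into the infection count."""
    for event in events:
        infected += ACTION_EFFECTS.get(event, 0)
    return max(infected, 0)  # the count never drops below zero

# One batch of crowd activity nudges the story state up or down.
infected = apply_interactions(100, ["checkin", "tweet", "object_found"])
```

The point is only that no single user controls the plot: the aggregate of many small actions is what moves the story, and the visualizations render that aggregate state.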
I also integrated the Mission Control software with a Microsoft Surface table programmed and provided by VectorForm. Visitors could use the Surface table to map the spread of the virus on a global level. VectorForm also developed a mobile application, deployed on 50 new Google Nexus S phones, which encouraged users to engage in a number of tasks, such as answering morality questions and taking photos of themselves to include in the exhibit. I provided the back-end services needed to capture all of this data from the phones - videos, photos, geolocation data, and morality answers - and to fold it into the story the Mission Control software was telling. All of the phone interactions fed into the story and affected the progress of the virus as well. Some objects, such as the phones or special branded Pandemic water bottles, could be brought to Mission Control, set on the Surface table, and recognized by the Mission Control software. For instance, laying a phone on the Surface table would display all the media shot on that phone, and a water bottle would heal a certain number of people affected by the virus and change the entire room to alert you of the goal you'd just achieved - all affecting the story in real time.
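The object-on-table behavior described above amounts to a dispatch from recognized object types to story effects. A hypothetical sketch, with invented handler names and a made-up heal amount (the real system's tag recognition and internals were not published):

```python
# Hypothetical sketch of object-triggered interactions on a Surface-style
# table. Object types, handlers, and the heal amount are illustrative only.

def show_phone_media(state, phone_id):
    """Placing a phone displays the media captured on that device."""
    return [m for m in state["media"] if m["phone"] == phone_id]

def heal_with_bottle(state, _tag):
    """A branded water bottle heals a fixed number of infected people."""
    state["infected"] = max(state["infected"] - 10, 0)
    return state["infected"]

# Map each recognized object type to its effect on the exhibit.
OBJECT_HANDLERS = {
    "phone": show_phone_media,
    "water_bottle": heal_with_bottle,
}

def on_object_placed(state, obj_type, tag):
    """Called when the table recognizes a physical object."""
    handler = OBJECT_HANDLERS.get(obj_type)
    return handler(state, tag) if handler else None
```

Keeping the handlers in a table like this is one plausible way to let new physical objects be wired into the story without touching the recognition layer.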
The main goal of this software is to tell a story through data. If some major event really occurred in the world, we would piece together the story of that event through news, Twitter, blogs, user videos, and so on. We would have to take in all of these sources and put the story together ourselves. This software mimics that kind of storytelling in a fictional world. Of course, what's nice is that the same software can be used to tell a real (documentary) story in the same way: it can track the progress of a story and express that story through any kind of media.
I’ve been interested in this form of storytelling for some time, and even my upcoming feature film THE LOST CHILDREN makes use of it to some degree. So I was excited to have such a widely-viewed platform on which to launch this software.
I will be doing a more involved write-up soon, detailing how I plan on using this same software to tell many kinds of stories.
More links about Pandemic 1.0:
Mission Control Photos by Elaine Zelker.
Here are some of the data visualizations Jeff created for the project: