MIT Invents A Magic Lens That Combines All Your Screens Into A Single Experience
When we need to send a command or a file from our smartphone to our laptop, we do it through menus. But imagine if we didn’t have to drag a file into a folder or pick an option from a menu just to move something between devices. What if we could use our iPhones as a magic lens instead, so that our smartphones could seamlessly interact with what’s happening on the screens of our laptops and tablets?
This is the vision for THAW, the latest project to emerge from the MIT Media Lab. THAW can see and interact with what’s happening on other screens, allowing your iPhone to download a song from your laptop just by looking at it through the camera, or to literally “pick up” a video game you were playing on your television and seamlessly continue playing it on your smartphone. And that’s just for starters.
It’s the result of an internal collaboration between Philipp Schoessler of MIT’s Tangible Media Group, which also birthed a revolutionary, shapeshifting display you can reach through and touch, and Sang-won Leigh of the Fluid Interfaces Group. The former group spends its time trying to make the digital world more physical; the latter works to find new ways to make the physical world more digital. Sometimes partners, sometimes rivals, the two are like yin and yang.
“We live in an increasingly digital world, but that world is fractured between many screens and interfaces,” Schoessler says. “The question we wanted to try to answer with THAW was how can we combine these computer interfaces and screens into a single seamless experience.”
THAW works by placing a color grid on the monitor and using the iPhone’s camera as a way to detect what part of the monitor it’s hovering over, similar to the optical sensor on a computer mouse. But THAW is much smarter than your average mouse: it can see what’s underneath it, and use the phone’s screen to interact with it. To avoid distracting users, the team devised a way to hide the color grid everywhere except directly underneath the iPhone camera.
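The core idea of a color grid is easy to illustrate: if each screen position is painted with a color that encodes its coordinates, then sampling the color under the camera lets you invert that mapping to recover where the phone is. Here is a minimal sketch of that principle in Python. It is not THAW’s actual implementation; the encoding scheme, function names, and screen resolution are assumptions for illustration, using one color channel per axis.

```python
# Sketch of color-grid position tracking: the host screen renders a
# gradient where the red channel encodes x and the blue channel
# encodes y. Sampling the color under the phone's camera and
# inverting the mapping recovers approximate screen coordinates.
# Illustrative only -- not the THAW team's actual scheme.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed host display resolution

def encode_color(x, y):
    """Color the screen would show at pixel (x, y)."""
    r = round(x / (SCREEN_W - 1) * 255)  # red channel encodes x
    b = round(y / (SCREEN_H - 1) * 255)  # blue channel encodes y
    return (r, 0, b)

def decode_position(color):
    """Recover approximate screen coordinates from a sampled color."""
    r, _, b = color
    x = round(r / 255 * (SCREEN_W - 1))
    y = round(b / 255 * (SCREEN_H - 1))
    return (x, y)
```

Note that with only 8 bits per channel this naive scheme resolves just 256 positions per axis (roughly 7-8 pixels of error on a 1080p display), which is one reason a real system would use a finer-grained grid pattern rather than a single smooth gradient.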
Although THAW’s position-detecting mechanism for the iPhone is relatively simple, the interactions it enables are surprisingly complex. A lifelong fan of Super Mario Bros., Leigh programmed a simple video game that shows how THAW can be used in a variety of ways that put the likes of the Microsoft Kinect and Nintendo Wii to shame. Those gaming systems can see you, sure, but what they can’t do is see the other screens in your life. And that opens a lot of gameplay possibilities.
In Leigh’s game, the goal is to move a polar bear to a flag at the end of the level, but each level has a different mechanism. In one level, you might have to cross a pit full of spikes by using your iPhone as a physical platform for your bear to jump across; in the next, you might capture the bear in your iPhone, physically shake the device, then shoot him over to an island across the world like a champagne cork.
The concept of THAW is inherently interactive: smartphones overlaid upon tablets overlaid upon laptops overlaid upon desktops. That’s why Schoessler and Leigh think that the first use of technology like THAW that people see in the wild will probably be in video-game development. But imagine sharing a file by “snapping” a picture of it on your laptop screen with your iPhone, then pointing the phone at someone across the room and flicking the file over to them. THAW, or technology like it, could make that possible.
“I’d love to see this become the next Bump,” Schoessler says, alluding to the popular file-sharing app that allowed users to share contact info, files, and photos just by bumping their devices together. Bump was purchased by Google last year.
As for Leigh, his dream is very different: he’d love his game-design hero, Shigeru Miyamoto, to use THAW to make the next cross-platform Mario game. “And I hope Nintendo asks me along for the ride,” he laughs.