http://www.wired.com/2015/12/mits-reali ... d-reality/
BY NOW, WE can probably all agree that the term “smart object” is a bit of a misnomer. The connected devices in our homes, while not dumb per se, aren’t exactly geniuses either. They sit there with the limited functionality assigned to them straight out of the box—a smoke detector’s a smoke detector, a toaster’s a toaster. “You buy a radio or something, and that’s as good as it gets,” says Valentin Heun. “You can’t change how it works.”
Heun is a researcher at the MIT Media Lab’s Fluid Interfaces Group, where he’s building an app that he hopes will effectively turn the one-dimensional objects in our lives into a library of parts whose functionalities can be mixed and matched via an augmented reality interface. If that description sounds confusing, that’s because it sort of is. The app’s name, Reality Editor, hints at its wild ambition, which is, plainly put, to reprogram the physical world.
[Image: Open Hybrid]
Heun describes Reality Editor as a “digital screwdriver that allows you to fix the things around you.” The app, which doesn’t work with out-of-the-box consumer products just yet, runs on an open-source platform called Open Hybrid that allows a tablet or phone to map a virtual interface directly onto an object using augmented reality. Using Open Hybrid, your smartphone not only learns the various parts and functionalities of a given object, it also enables you to add that functionality to another, different object.
For instance, using Open Hybrid, a user could break down a toaster into modular components (a slider, a timer, a heating mechanism) and then, using Reality Editor, reassign any of those functionalities to a totally different object simply by drawing a line from one component to the next in the app. A more concrete example: If you wanted your food processor to have a timer, all you’d have to do is look at the two appliances through the Reality Editor interface and drag a line from the toaster’s timer knob to the food processor’s motor; the two would then automatically be connected over the Open Hybrid server.
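To make the idea of “drawing a line” a little more concrete, here is a minimal sketch, in TypeScript, of how such a link might be represented and routed by a local hub. The names here (HybridObject, Hub, the “timer” and “motor” points) are illustrative assumptions for this article, not the actual Open Hybrid API.

    // Hypothetical sketch only: these classes and point names are illustrative,
    // not the real Open Hybrid API.
    type Value = number;
    type Listener = (value: Value) => void;

    // Each connected object exposes named I/O points (e.g. "timer", "motor").
    class HybridObject {
      private inputs = new Map<string, Listener[]>();

      constructor(public readonly name: string) {}

      // Register what an input point does when a value arrives.
      onInput(point: string, fn: Listener): void {
        const fns = this.inputs.get(point) ?? [];
        fns.push(fn);
        this.inputs.set(point, fns);
      }

      // Deliver a value to one of this object's input points.
      receive(point: string, value: Value): void {
        for (const fn of this.inputs.get(point) ?? []) fn(value);
      }

      // Report a new value on one of this object's output points.
      publish(point: string, value: Value, hub: Hub): void {
        hub.route(this.name, point, value);
      }
    }

    // The hub stores the links the user drew in the AR interface and routes values.
    class Hub {
      private links: { from: string; target: HybridObject; toPoint: string }[] = [];

      // "Drawing a line" in Reality Editor becomes one routing entry here.
      link(fromObj: HybridObject, fromPoint: string, toObj: HybridObject, toPoint: string): void {
        this.links.push({ from: `${fromObj.name}/${fromPoint}`, target: toObj, toPoint });
      }

      route(objName: string, point: string, value: Value): void {
        for (const l of this.links) {
          if (l.from === `${objName}/${point}`) l.target.receive(l.toPoint, value);
        }
      }
    }

    // Usage: give the food processor a timer by borrowing the toaster's.
    const hub = new Hub();
    const toaster = new HybridObject("toaster");
    const foodProcessor = new HybridObject("foodProcessor");

    foodProcessor.onInput("motor", (seconds) => console.log(`run motor for ${seconds}s`));
    hub.link(toaster, "timer", foodProcessor, "motor");

    toaster.publish("timer", 30, hub); // -> "run motor for 30s"

The point of the sketch is that the “program” lives entirely in the hub’s table of links, which is exactly the part Reality Editor turns into a visual gesture.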
Now, you could argue that using a virtual interface to control the tangible objects around you isn’t a particularly compelling idea on its own. We use digital interfaces to control the connected objects in our homes all the time. That’s the problem, says Heun. As we add more “smart” objects to our lives, we become increasingly detached from their physicality. For every connected doorknob and smart bulb installed, there’s an app with dropdown menus to go with it. “It becomes very crowded and very complex,” he says. Poking around in apps to dim a light doesn’t make your life easier; in fact, compared with the ease of flipping a switch, it makes it harder.
Controlling objects through their physical interfaces is the ideal, but it’s hard to build in enough functionality. “To put a lot of functionality on physical objects can look very ugly and cost a lot of money,” he says. Using Reality Editor to reassign functionality from one object to another could not only declutter a physical interface, it could also ultimately free us from the shackles of our smartphones, instead letting us rely on the physical world to manipulate our devices.
In his video, Heun shows how a technology like this could be used in an “if this, then that” situation. By connecting a lamp, a chair, and his car to the Open Hybrid network, Heun programmed a sequence of events intended to streamline the process of leaving work. We watch as Heun stands up from his chair and walks out the door. The lamp automatically flicks off, which in turn triggers his car to start. In the car, the temperature is just right and the radio is set to his favorite station. Everything works just so, because he programmed it that way. It’s a compelling vision for the future, where the objects around us know us so well that we simply have to exist for them to do our bidding. And indeed, the big promise of smart objects is that over time, as the ecosystem grows, our devices will passively learn enough about our habits to predict (and perhaps encourage) certain behaviors. Products like Nest rely on artificial intelligence to detect and interpret human behavior and then make choices on our behalf. What Heun is proposing is a reassignment of control.
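For readers who think in code, that leaving-work sequence boils down to a chain of output-to-input links, sketched below in the same hypothetical style; the object and point names (chair/occupied, lamp/power, and so on) are assumptions for illustration, not details taken from Heun’s demo.

    // Hypothetical sketch only: object and point names are made up for illustration.
    type Handler = (value: boolean) => void;
    const links = new Map<string, Handler[]>();

    // "Drawing a line" from an output point to some downstream behavior.
    function link(outputPoint: string, handler: Handler): void {
      const handlers = links.get(outputPoint) ?? [];
      handlers.push(handler);
      links.set(outputPoint, handlers);
    }

    // An object reports a new value on one of its output points.
    function publish(outputPoint: string, value: boolean): void {
      for (const handler of links.get(outputPoint) ?? []) handler(value);
    }

    // chair/occupied -> lamp/power: the lamp follows the chair's occupancy.
    link("chair/occupied", (occupied) => {
      console.log(occupied ? "lamp on" : "lamp off");
      publish("lamp/power", occupied);
    });

    // lamp/power -> car/ignition: when the lamp goes off, start the car.
    link("lamp/power", (lampOn) => {
      if (!lampOn) {
        console.log("starting car: set cabin temperature, tune to favorite station");
      }
    });

    // Heun stands up and walks out the door:
    publish("chair/occupied", false); // -> "lamp off", then the car starts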
Interestingly, Heun’s app assumes that users want more control over the objects in their lives, which isn’t always the case. Smart objects are attractive if only because they cater to our laziness; we don’t think about them simply because we don’t want to. Reality Editor requires active participation, at least at first. And maybe that’s not such a bad thing. Heun believes that by making choices about how we want our devices to work, we’re tapping into the innately human capacity for muscle memory. His vision is that eventually the digital world will blend into the physical; no longer will our smartphones be a buffer between the two. Reality Editor, then, isn’t a solution so much as a tool to get us to the point where we can naturally and seamlessly interact with both the digital and physical worlds around us.