An article in today’s Boston Globe discusses new gadgets being developed to let residents of virtual worlds, such as Second Life, stay connected while on the go. As virtual worlds continue to become more realistic and accessible from more locations, 3DWalkthroughs.com predicts they will take over as the social networking platform of choice.
According to the article, motion sensors will let people inhabit their avatars while out in the real world, and even let their virtual-world friends see what they are up to. We are keeping a close eye on the development of the various applications that will emerge from this technology.
Gadgets may help merge virtual reality with real life
By Mark Baard, Globe Correspondent | July 9, 2007
That hipster you always see talking into his Bluetooth headset might soon be able to use a similar device to leap into Second Life without even stepping out of line at Trader Joe’s.
The company behind Second Life, Linden Lab, hopes to introduce hand-held and wearable systems that act as gateways between the real and virtual worlds. Linden Lab and other virtual-world companies are also developing versions that run on existing mobile phones.
Researchers at a recent virtual worlds conference at MIT said that special eyewear, display “badges,” and speakers worn about the neck will allow us to live more fully through our avatars — those idealized versions of ourselves that typically boast better proportions than the saggy originals.
Second Lifers wearing the gadgets will be able to attend “in-world” parties and gallery openings, whether they are sucking down beers at Cornwall’s or stuck in Fenway traffic. Motion detectors and other sensors in the devices will also show your virtual mates what you are up to in the real world.
It might sound like public safety officers will need to shift focus away from the risks associated with driving while chatting on cellphones to the inherent dangers of operating in two realities at the same time. But conference participants said such concerns are premature.
“It’s like you’re not going to be allowed to be in a virtual world while driving in the real world,” said Robert Sutor, vice president of open source and standards at IBM.
Linden Lab vice president Joe Miller described one of the early products that will bridge the two worlds as a wearable box that creates a “3D sound field” that allows the wearer to hear voices from his virtual world without completely shutting out the real people around him.
The prototype speaker device presented to Linden recently by a developer “is not ready for prime time yet but it’s working pretty well,” said Miller, speaking at “Virtual Worlds: Where Business, Society, Technology & Policy Converge,” sponsored by MIT and IBM.
Linden is encouraging open source developers to create client software for mobile devices. And Blizzard Entertainment, creator of the online multiplayer game World of Warcraft, is hiring developers with experience in Symbian and Adobe Flash Lite for its mobile interface initiative.
Conference participants said cellphones are likely to be the first mobile devices to create two-way connections between real and virtual reality.
“The idea of cell phone as sensor has started to catch on in the sensor network community,” Joseph Paradiso, leader of the Responsive Environments Group at the MIT Media Lab, wrote in an e-mail last week. “They’re much heavier platforms than usually seen in sensor networks, but they are certainly ubiquitous!”
ResEnv has produced a prototype “tricorder” — inspired by the information-synthesizing gadget from “Star Trek” — that gathers data from real-world surroundings and translates that information into virtual desks and chairs.
In a video at the ResEnv website, media.mit.edu/resenv, grad students demonstrate how the tricorder’s sensors can detect someone swiveling in a desk chair and typing on a computer keyboard. The device can also show the user what is happening in the virtual space he or she is helping to create.
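The demo described above amounts to a classification problem: turn a window of raw sensor readings into an activity label the avatar can mirror in-world. The toy sketch below is not Linden Lab or MIT code; every name and threshold in it is a hypothetical illustration of the general idea.

```python
# Toy sketch only: maps a window of raw sensor readings to an activity
# label, in the spirit of the ResEnv "tricorder" demo. The sensor names
# ("chair_gyro", "keyboard") and thresholds are invented for illustration.

def classify_activity(readings):
    """Classify a window of (sensor, value) readings.

    Returns a label an avatar could mirror in the virtual space.
    """
    by_sensor = {}
    for sensor, value in readings:
        by_sensor.setdefault(sensor, []).append(value)

    # Crude thresholds stand in for real activity recognition.
    if max(by_sensor.get("chair_gyro", [0])) > 0.5:
        return "swiveling in chair"
    if len(by_sensor.get("keyboard", [])) >= 3:
        return "typing"
    return "idle"

# Three keypress events in the window -> the avatar appears to be typing.
events = [("keyboard", 1), ("keyboard", 1), ("keyboard", 1)]
print(classify_activity(events))  # → typing
```

A real pipeline would replace the thresholds with trained models and stream the resulting labels to the virtual-world server, which is the "pipes of sensor data" problem Paradiso describes below.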
It will take some retooling before virtual worlds can accommodate all of the data streaming from ubiquitous sensors.
“We’re talking with Linden Lab [about creating] more efficient pipes of sensor data into their environment,” said Paradiso. “I can certainly stream video, but I can’t efficiently input diverse sensor data.”