Science & Technology

Shared reality enriches human-computer interaction

Imagine arriving late to class, ears burning from the cold, and stepping over puddles and bags until you finally find a seat. As the professor lectures away, a student several rows down raises his hand. The professor, unaware, continues the lecture. Eventually, the student lowers his hand.

For most McGill students, this scenario isn’t hard to imagine. Professors might not always notice the hands that go up at the top or far sides of a lecture hall, and it’s not their fault. They’re only human, after all.

What if there were a way to detect a raised hand in a large crowd, or to recognize when students are confused during a lecture? These possibilities are being explored at McGill’s Shared Reality Lab.

“A prototype system that we developed years back was used to automatically detect if a student raised his or her hand up for a question,” Dr. Jeremy Cooperstock, Director of the Shared Reality Lab, said. “Since it was in a large class setting, the system would then inform the instructor by raising a flag on the screen about who has their hand up, and what part of the room they’re in.”

Experiments at the Shared Reality Lab involve the use of virtualized reality techniques and advanced networking protocols to give users a strong sense of co-presence—the feeling of being together in a shared physical environment. This is accomplished using a number of screens, cameras, projectors, and microphones, along with a high-fidelity vibrosensory system.

“This high fidelity and low latency type of communication gives people the ability to feel like they are engaged in coordinated synchronous activity with those far away, while in the comfort of McGill,” Cooperstock explained.

The lab was one of the first research groups in the world to support a distributed music performance. Jazz students from Montreal and Stanford performed together simultaneously using technology developed by the Shared Reality Lab. They could see and hear each other almost as clearly as if they were physically in the same space.

cim.mcgill.ca

The lab also looks at how to immerse users in a different kind of environment by rendering physical sensations, such as the feeling of walking on different ground surfaces like snow, gravel, or sand.

“And they can experience that even though they are physically in a laboratory environment, walking on floor tiles,” Cooperstock explained.

The third dimension of the lab’s work involves sensory substitutions for those unable to experience a certain aspect of the everyday world around them. By working with the blind community, the Shared Reality Lab looks for ways to convey the visual experience of the world by providing the information usually available to vision through audio, Cooperstock explained. The lab has also demonstrated new technologies for Android and iPhone devices that give users a continuous audio description of the points of interest around them while walking outside.

Along with developing sensory substitutions, the lab has created different applications to help Music and Medical students at McGill. For musical training, the lab developed a simulator known as “Open Orchestra” which has received significant recognition.

“The simulator was [developed] to give classical and jazz musicians the experience of rehearsing with the rest of their band or the orchestra around them,” Cooperstock explained. “It wasn’t a live performance scenario. Rather, it was a rich multimodal experience of what it feels like, looks like, and sounds like to be sitting in, for example, the second violinist seat in a 30-seat orchestra and playing along with the different musicians, while seeing and hearing the conductor at the same time.”

Similarly, a training system was developed to prepare McGill medical students for emergency response scenarios. The project, funded by CANARIE’s Network-Enabled Platforms (NEP) program, was completed in 2010. It used medical mannequins that mimic physiological functions, such as blood flow to the heart, and taught students through scenarios in which they had to assess and treat realistic cases, such as a patient injured in a car accident.

Through this new approach to human-computer interactions, the Shared Reality Lab offers a glimpse into the future of virtualized reality, which could change the way we interact with people around the world. One day, attending the opera could involve listening to different opera singers in different time zones, standing on different stages in a worldwide opera house located in cyberspace.
