
If you were a tree

Virtual reality opens up new possibilities for storytelling and engagement. That belief is what seeded our VR film projects TreeSense and Tree.

By Xin Liu and Yedan Qian
In June of last year, we started a research project on understanding and developing body ownership illusion in virtual reality (VR). A couple of months later, our research grew into a VR narrative film we called TreeSense. Then, in October, we started collaborating with movie directors to push the project further and to develop a hyper-realistic VR film, called Tree.
All this work is based on our belief that virtual reality offers new ways of storytelling and engagement. By systematically altering the body's sensory signals, including vision, touch, motor control, and proprioception, we can train the brain to inhabit different entities. Virtually experiencing a different state of being can heighten empathy and deepen appreciation for others' realities.
To showcase the power of this embodied storytelling method, we decided to create a sensory-enhanced VR film in which a person inhabits a tree — indeed becomes a tree — seeing and feeling their arms as branches and their body as the trunk. The audience members collectively share the experience of being a tree throughout its life cycle — from a seed rising through the dirt, to sprouting branches and growing to full size, until finally the tree is destroyed by fire. We called this project TreeSense.
TreeSense is a sensory VR system that transforms a person into a tree, from a seedling to its full-size form, to its final destiny. Credit: Fluid Interfaces group.

TreeSense cultivates Tree

While we were developing TreeSense, we met VR film directors Milica Zec and Winslow Porter when they spoke to our class at the Media Lab. At the time, and by an odd coincidence, they were also developing a similar story about “being” a tree. It was a very exciting bit of serendipity, and we decided to work together in order to transform our ideas into the comprehensive VR film experience, Tree. In the Fluid Interfaces group, we took charge of the design and construction of the tactile experiences that are represented throughout Tree, while artist Jakob Steensen designed its stunning hyper-realistic visuals.
In both short films, TreeSense and Tree, each viewer wears a VR headset and becomes immersed in a virtual forest. But the projects differ in their tactile technologies. In TreeSense, we rely on electrical muscle stimulation (EMS) to provide visuotactile feedback. We designed a series of EMS signals by varying combinations of pulse amplitude, pulse width, current frequency, and electrode placement. The exciting part of EMS is that we can explore uncommon bodily sensations, such as feeling electricity underneath the skin or fingers moving involuntarily. Audience members can actually sense their branches growing or getting poked by a bird. This use of EMS is decidedly experimental.
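To give a flavor of what designing such signals involves, the parameter space can be sketched as a small data structure. The field names, values, and safety ceilings below are illustrative placeholders, not our actual hardware settings:

```python
from dataclasses import dataclass

@dataclass
class EMSPulse:
    """One EMS stimulation pattern; fields and limits are illustrative."""
    amplitude_ma: float    # pulse amplitude, in milliamps
    width_us: int          # pulse width, in microseconds
    frequency_hz: float    # pulse repetition frequency, in hertz
    electrodes: str        # placement label, e.g. "forearm_flexor"

# Hypothetical safety ceilings; real limits depend on the hardware,
# the electrode placement, and the individual wearer.
MAX_AMPLITUDE_MA = 20.0
MAX_WIDTH_US = 400

def clamped(p: EMSPulse) -> EMSPulse:
    """Return a copy of the pattern with amplitude and width capped."""
    return EMSPulse(
        amplitude_ma=min(p.amplitude_ma, MAX_AMPLITUDE_MA),
        width_us=min(p.width_us, MAX_WIDTH_US),
        frequency_hz=p.frequency_hz,
        electrodes=p.electrodes,
    )

# A "branch growing" sensation might combine a moderate amplitude
# with a narrow pulse on the forearm (values invented for illustration):
grow = clamped(EMSPulse(amplitude_ma=12.0, width_us=250,
                        frequency_hz=50.0, electrodes="forearm_flexor"))
```

In practice each sensation is a hand-tuned combination of these parameters, validated on real users, with hard caps enforced in hardware as well as software.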
In Tree, we follow a more conventional and mature approach because the project has to work reliably for hundreds of people when it's demonstrated at public events such as film festivals. Mostly we use commercial products, such as a SubPac for body vibration. We also customized localized vibration points on the arm, and built a floor with subwoofers so users can actually feel the rumble of a thunderstorm. Whenever there's thunder, lightning, or fire in the VR, people also experience how those phenomena "touch" the tree. We've also used an air mover to simulate the wind, and added real-life heating in the space to simulate the fire that threatens the "tree person."
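Conceptually, this setup reduces to a mapping from virtual events to physical output channels and intensities. A minimal sketch, with event names, channel names, and intensity values invented for illustration:

```python
# Each VR event drives one or more physical output channels.
# Channel names and intensity values (0.0-1.0) are hypothetical.
HAPTIC_CUES = {
    "thunder":      [("floor_subwoofer", 1.0), ("subpac", 0.8)],
    "bird_landing": [("arm_vibe_left", 0.3)],
    "wind_gust":    [("air_mover", 0.6)],
    "fire":         [("heater", 0.9), ("subpac", 0.5)],
}

def cues_for(event: str) -> list[tuple[str, float]]:
    """Look up the physical cues to fire for a VR event (empty if unknown)."""
    return HAPTIC_CUES.get(event, [])
```

The real system layers timing and fade curves on top of such a mapping, but the core design question is the same: which physical channel, at what intensity, for each virtual event.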
This graphic shows the haptic hardware we use to create the virtual reality tree experiences of movement, air, temperature, and smell. Credit: Xin Liu and Yedan Qian
We see the film, Tree, as a polished extension of our project, TreeSense. The VR movie was shown outside the academic environment and experienced by hundreds of viewers at 2017 festivals, including Sundance, Tribeca, and TED: The Future You. At the Tribeca Film Festival, the immersive VR experience of Tree also included scents: the audience literally smelled the dirt, the rainforest, and the fire.
So far, the presentations at film festivals and elsewhere have been very well received. We've seen audiences coming out of the VR film experience in tears. It's been exhilarating for us to witness the power of body sensations in this new form of storytelling. People have told us they really felt like the tree and found its destruction terrifying and emotional. Meanwhile, TreeSense has just received an interaction prize in this year's Core77 Design Awards, and it will be shown at Dubai Design Week in the fall.
This gif from TreeSense shows a bird interacting with a tree, as seen and felt by the viewer. Credit: Xin Liu and Yedan Qian

Challenges and expectations

Both our projects — Tree and TreeSense — experiment with new methodologies for immersive participatory storytelling by leveraging new technologies such as VR and tactile feedback mechanisms. We aim to evoke believable bodily experiences through electrical muscle stimulation, vibration, temperature, and scents to unlock a higher level of realism. As an audience member said after feeling immersed in Tree, "You know it's not real but your body really believes it!"
The design of multi-sensory experiences is a complicated process of composition and choreography. We constantly have to make sure the experiences are perfectly synced, both in timing and in intensity. The headset already delivers high-fidelity visuals and audio; adding tactile elements that enhance the experience without distracting or disturbing the user is not trivial.
In Tree, we use multitrack bass audio for each body-vibration zone, so a person can feel the thunder, the disturbance of the forest fire, or a bird landing on a branch. The whole tactile experience is digitally controlled with Max/MSP and an Arduino, which communicate with the Unreal Engine through the Open Sound Control (OSC) protocol. The tactile, olfactory, and temperature feedback in real life is precisely synced with the visual experience inside the Oculus headset. We went through many iterations to match the virtual visual details with the intensity, texture, and timing of the physical sensations.
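OSC itself is a simple binary format, typically carried over UDP: a null-padded address string, a type-tag string, then big-endian arguments. As a minimal sketch of how a game-engine event might be packed into an OSC message (the address and intensity value are hypothetical; a production setup would more likely use an existing OSC library rather than hand-rolled encoding):

```python
import struct

def _pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all big-endian float32s."""
    msg = _pad(address.encode())
    msg += _pad(("," + "f" * len(args)).encode())  # type tags, e.g. ",f"
    for a in args:
        msg += struct.pack(">f", a)
    return msg

# Hypothetical cue: tell the haptics patch that thunder hit at intensity 0.8.
packet = osc_message("/haptic/thunder", 0.8)
# To send over UDP:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, port))
```

On the receiving side, a Max/MSP patch listening on the same port can route each address to the corresponding subwoofer, vibration, fan, or heater channel.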
Credit: Xin Liu and Yedan Qian
We believe that TreeSense and Tree are tapping into more senses to help people connect with the narrative. We expect that the intimate, visceral, and emotional VR experience also has a real-world impact in that it helps users to develop a personal and immediate identification with the natural environment and the need to protect it.
In that sense, Tree is an example of a multisensory VR platform that has vast potential outside the research world. We’re convinced of the technology’s potential value for diverse purposes in many areas, such as telecommunication, active learning, and even medical applications. We are actively exploring these possibilities now.

Xin Liu is a research assistant in the Fluid Interfaces group at the MIT Media Lab. She graduates with a master’s degree in August. Yedan Qian is a visiting student in Fluid Interfaces.
Acknowledgments:
  • Advisor: Pattie Maes, head of the Fluid Interfaces research group at the MIT Media Lab.
  • Collaborators: VR filmmakers Milica Zec and Winslow Porter.
This post was originally published on the Media Lab website.
