Tutorials

If you have any questions, do not hesitate to contact the person indicated for the respective session. Please use the form at the bottom of the page to register for the sessions.

Mobile VR with Google Cardboard

Date: Wednesday, July 27th
Time: 18:30 - 20:00
Location: University Department of Psychiatry and Psychotherapy, Room 101, 1st Floor (Calwerstraße 14)
Contact: Philipp Schröder (philipp.schroeder[ at ]uni-tuebingen.de)
Registration required: yes
Maximum number of participants: 20

Currently, most virtual reality setups require a rather powerful computer with a graphics card capable of handling the 3D rendering. While there are ongoing attempts to develop stand-alone HMDs, one already available solution for mobile VR is Google Cardboard. The viewer works with most modern smartphone operating systems (Android version 4.1 or higher, iOS version 8.0 or higher). The tutorial covers the setup of the Cardboard viewer and gives an introduction to application development and deployment for mobile VR.
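
To give a flavour of the development part, the following is a minimal sketch of a gaze-and-trigger interaction in Unity C#. It assumes a camera rig set up for stereo rendering (e.g., via the Cardboard SDK); the class and parameter names are illustrative, and the SDK ships its own input and reticle components. The sketch relies only on the fact that the Cardboard button presses the touch screen, which Unity reports as mouse button 0.

    using UnityEngine;

    // Minimal gaze-and-trigger interaction, attached to the camera object.
    // Names and thresholds are illustrative, not part of the Cardboard SDK.
    public class GazeTrigger : MonoBehaviour
    {
        public float maxDistance = 10f;  // reach of the gaze ray in metres

        void Update()
        {
            if (Input.GetMouseButtonDown(0))  // Cardboard trigger / screen tap
            {
                Ray gaze = new Ray(transform.position, transform.forward);
                RaycastHit hit;
                if (Physics.Raycast(gaze, out hit, maxDistance))
                {
                    Debug.Log("Selected: " + hit.collider.name);
                }
            }
        }
    }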

VREX: A Unity® Toolbox for Creating 3D Virtual Reality Experiments

Date: Wednesday, July 27th
Time: 18:30 - 20:00
Location: University Department of Psychiatry and Psychotherapy, Room 102A, 3rd Floor (Calwerstraße 14)
Contact: Madis Vasser (madis.vasser[ at ]ut.ee)
Registration required: yes
Maximum number of participants: 20

Virtual reality as an experimental tool is becoming more and more accessible to researchers in different fields. Yet creating virtual worlds involves technologies most experimentalists are not familiar with. At the Virtual Neuroscience Lab of the University of Tartu, we developed VREX, an open-source, intuitive Unity toolbox that significantly simplifies the creation of VR experiments, mainly for psychological studies. The current version includes the following features:

  • Graphical user interface for building experiments
  • Procedural generation of different (interconnected) rooms
  • Automatic furnishing at the touch of a button
  • Manual object placement and custom model import
  • A menu system for creating and storing experiments with different stages
  • Support for change detection experiments:
    • changing an object's visibility, colour, or location
  • Support for memory studies:
    • logging all objects visited by a participant and presenting them later for recall
  • Spatial audio integration
  • Support for various locomotion systems and the Oculus Rift HMD

The toolbox includes template experiments and commented C# source code. Several tutorial videos are available. Madis Vasser will present the current version of the toolbox, demonstrate some of its features, and discuss possible future applications. Example pictures can be obtained here. A video sample of a change detection study created with the VREX toolbox can be found here. Supplementary material will be available on the Materials page.
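
As a plain-Unity illustration of the kind of manipulation the toolbox automates (this is not the actual VREX API), a change-detection manipulation could toggle an object's visibility or recolour it at the moment of change:

    using UnityEngine;

    // Illustrative change-detection manipulation in plain Unity C#.
    // VREX exposes such manipulations through its GUI; this is NOT
    // the VREX API, only a sketch of what happens under the hood.
    public class ChangeObject : MonoBehaviour
    {
        public Renderer target;                  // object to manipulate
        public Color changedColour = Color.red;  // illustrative change colour

        // Called by the experiment logic at the moment of change.
        public void ApplyChange(bool hide)
        {
            if (hide)
                target.enabled = false;                   // visibility change
            else
                target.material.color = changedColour;    // colour change
        }
    }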

Virtual Reality and Eye Tracking in Experimental Psychology Research (SMI and VTplus)

Date: Friday, July 29th
Time: 9:30 - 12:00
Location: University Department of Psychiatry and Psychotherapy, Room 102, 3rd Floor (Calwerstraße 14)
Contact: Stefanie Gehrke (marcom[ at ]smi.de)
Registration required: yes
Maximum number of participants: -

In this joint session, VTplus and SMI will present their hardware and software solutions for psychological research in virtual reality – with a special focus on the application of eye tracking in VR. The tutorial will cover the implementation of an experimental setup with the CyberSession simulation software provided by VTplus. Further, SMI will give an overview of their integrated eye tracking system for head-mounted displays – e.g., for the Oculus Rift DK2 and the Samsung Gear VR. A sample application of both technologies will be discussed: the use of eye tracking to control the direction of attention in an anxiety-related virtual reality exposure session. A detailed description of the scenario can be found here. The tutorial consists of three parts and will cover the following points:

  • Introduction to experimental psychology research in virtual environments (VTplus)
    • VR components, features and possible fields of application: HMDs, projection, tracking, eye tracking, and interaction in VR setups
    • Introduction to CyberSession: data acquisition and experiment control, with a short presentation of implemented VR paradigms
    • Generation of virtual environments with the Valve Source SDK
    • Case study: Eye tracking to control gaze direction and visual attention during fear exposure therapy in VR (eye tracking integrated in the Oculus DK2 powered by SMI)
  • Application of eye tracking in virtual reality (SMI)
    • Introduction to the use of eye tracking in virtual environments: Demonstration of SMI Eye Tracking integration for the Oculus Rift DK2 and the Samsung Gear VR
    • Creation of a virtual environment with eye tracking using the SMI Plug-in for Unity and the SMI Mobile Eye Tracking HMD based on a Samsung Gear VR
  • Demo / Hands-on
    • Demonstration of selected virtual environments
    • Hands-on with the eye tracking equipment

The session will be hosted by Daniel Gromer (VTplus) and Martin Pötter (SMI).
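
To illustrate the gaze-contingent logic of the case study, the following generic Unity C# sketch accumulates dwell time on a fear-relevant object. CyberSession and the SMI plug-in expose gaze data through their own interfaces; the GetGazeDirection() helper below is a hypothetical placeholder for the vendor API and simply falls back to the head direction.

    using UnityEngine;

    // Generic sketch of gaze-contingent attention measurement.
    // The vendor eye tracking interfaces are not reproduced here.
    public class GazeDwellLogger : MonoBehaviour
    {
        public Transform fearStimulus;   // e.g. the spider in an exposure scene
        private float dwellTime = 0f;    // accumulated looking time in seconds

        // Hypothetical placeholder: replace with the eye tracker's gaze ray.
        Vector3 GetGazeDirection()
        {
            return Camera.main.transform.forward;  // head direction as fallback
        }

        void Update()
        {
            Ray gaze = new Ray(Camera.main.transform.position, GetGazeDirection());
            RaycastHit hit;
            if (Physics.Raycast(gaze, out hit) && hit.transform == fearStimulus)
            {
                dwellTime += Time.deltaTime;  // accumulate time on target
            }
        }
    }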

Setting up an Experiment with the Unity® Engine

Date: Friday, July 29th
Time: 9:00 - 12:00
Location: University Department of Psychiatry and Psychotherapy, Room 104, 2nd Floor (Calwerstraße 14)
Contact: Johannes Lohmann (johannes.lohmann[ at ]uni-tuebingen.de)
Registration required: yes
Maximum number of participants: 20

In order to set up a virtual environment, an engine is required that handles the 3D rendering. Further, realistic object interactions require some kind of physics simulation. Various game engines fulfill these requirements, for instance the Unity game engine or the Unreal Engine 4. Both engines are available in free versions and allow the integration of the Oculus Rift HMD. In this tutorial, we will look at how to implement a psychological experiment with the Unity engine, using the programming language C#. The tutorial is primarily intended for people without previous experience with Unity and addresses the following topics:

  • Getting familiar with the Unity editor
  • Taking a look at the programming interface
  • Scripting a basic trial controller (see the sketch below)
  • Handling data collection and data output
  • Using speech recognition for response time collection
  • Integrating the Oculus Rift
  • Deploying an experiment as an executable file

The Unity project as well as installation instructions can be obtained from the Materials page. Please install Unity on your machine before the tutorial.
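
As a preview of the trial controller and data output topics, the following is a minimal sketch of the pattern the tutorial will develop; the trial structure, response key, and output format are illustrative.

    using System.IO;
    using UnityEngine;

    // Minimal trial controller with response time logging.
    public class TrialController : MonoBehaviour
    {
        public int numTrials = 10;     // total number of trials, illustrative
        private int currentTrial = 0;
        private float trialStart;      // onset time of the current trial
        private StreamWriter log;

        void Start()
        {
            log = new StreamWriter("results.csv");  // output file, illustrative
            log.WriteLine("trial,rt");
            StartTrial();
        }

        void StartTrial()
        {
            trialStart = Time.time;
            // stimulus presentation for the current trial would go here
        }

        void Update()
        {
            if (currentTrial >= numTrials) return;   // experiment finished
            if (Input.GetKeyDown(KeyCode.Space))     // illustrative response key
            {
                float rt = Time.time - trialStart;   // response time in seconds
                log.WriteLine(currentTrial + "," + rt);
                currentTrial++;
                if (currentTrial < numTrials)
                    StartTrial();
                else
                    log.Close();
            }
        }
    }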

Combining VR and Motion Capture with Unity®

Date: Friday, July 29th
Time: 13:00 - 16:00
Location: University Department of Psychiatry and Psychotherapy, Room 104, 2nd Floor (Calwerstraße 14)
Contact: Johannes Lohmann (johannes.lohmann[ at ]uni-tuebingen.de)
Registration required: yes
Maximum number of participants: 20

To allow natural user interactions in immersive virtual reality scenarios, some form of online motion capture is necessary. While full-scale optical and inertial tracking systems are expensive, more affordable solutions exist. The Microsoft Kinect sensor is an example of a reliable and inexpensive optical tracking system. For hand tracking in particular, an even smaller sensor system is available: the Leap Motion sensor. A free Unity integration exists for this sensor, and the developers are currently working on improving its integration into HMD setups. In this tutorial, we will take a look at the sensor and integrate it into an experimental setup. Please note that we only have five sensors available, which restricts the number of participants in this workshop. The following topics are addressed in this tutorial:

  • Setting up the Leap Motion sensor
  • Unity integration and the Leap assets
  • Position checks and data acquisition in an experimental setup
  • Combination of hand tracking and the Oculus Rift
  • Troubleshooting and best practices

A sample Unity project as well as installation instructions and download links can be obtained from the Materials page. Please install Unity and the Leap assets on your machine before the tutorial.
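
As a preview of the position check topic, the following sketch uses the Leap Motion C# API to test whether a hand rests at a start position before trial onset. The start position and tolerance are illustrative, and the exact API may differ between SDK versions.

    using Leap;
    using UnityEngine;

    // Sketch of a start-position check with the Leap Motion C# API,
    // assuming the Leap assets are in the project. Leap coordinates
    // are in millimetres, with the origin at the sensor.
    public class StartPositionCheck : MonoBehaviour
    {
        private Controller controller = new Controller();
        public float tolerance = 20f;                             // mm
        private Vector startPosition = new Vector(0f, 200f, 0f);  // above sensor

        // Returns true if any tracked hand rests at the start position.
        public bool HandAtStart()
        {
            Frame frame = controller.Frame();  // most recent tracking frame
            foreach (Hand hand in frame.Hands)
            {
                if (hand.PalmPosition.DistanceTo(startPosition) < tolerance)
                    return true;
            }
            return false;
        }
    }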

Developing Epic VR Experiments with the Unreal Engine

Date: Friday, July 29th
Time: 12:30 - 14:00
Location: University Department of Psychiatry and Psychotherapy, Room 102, 3rd Floor (Calwerstraße 14)
Contact: Madis Vasser (madis.vasser[ at ]ut.ee)
Registration required: yes
Maximum number of participants: 20

The combination of the Unity game engine with the Oculus Rift HMD is a common setup for virtual reality experiments. But other combinations of engines and HMDs are viable as well and offer some advantages over the Oculus & Unity setup. Only recently, HTC released its Vive HMD. Compared to the Oculus, the HTC Vive has a larger tracking area; further, its API is open source, allowing more thorough control via custom software. With respect to the engine, the Unreal Engine 4 offers an alternative to Unity. Compared to Unity, the Unreal engine allows more complex and detailed visualizations; further, it provides a visual scripting system. The tutorial features a thorough comparison of the Unity & Oculus and the Unreal & HTC Vive setups. The following points will be covered:

  • Introduction of the VR group at Tartu University, ongoing projects and experimental setups
  • Comparison of Unity & Oculus and the Unreal & HTC Vive setups
  • General design principles: Visual quality, gamification, and nausea-free experiences
  • Setting up a VR scene with the Unreal editor
  • Visualization capabilities of the Unreal engine
  • Built-in VR optimizations within the Unreal engine
  • Visual scripting via Blueprints
  • The VR editor: Building VR inside VR

Supplementary material will be available on the Materials page.

Demo Session, Social and Spatial Cognition Group, Max Planck Institute for Biological Cybernetics

Date: Friday, July 29th
Time: 14:30 - 15:30
Location: Max Planck Institute for Biological Cybernetics Tübingen, Department of Perception, Cognition and Action (Spemannstr. 38)
Contact: Stephan de la Rosa (delarosa[ at ]tuebingen.mpg.de)
Registration required: yes
Maximum number of participants: 20

We offer a tour through some of the virtual reality facilities of the Department of Perception, Cognition and Action of the Max Planck Institute for Biological Cybernetics. We will show both the technical and the experimental setups used in the Social and Spatial Cognition group for examining human behavior under naturalistic conditions. This includes head-mounted displays, large-screen displays, and interactive stereo displays. The tour is scheduled for about an hour.

Demo Session, Space & Body Perception Group, Max Planck Institute for Biological Cybernetics

Date: Friday, July 29th
Time: 15:30 - 16:30
Location: Max Planck Institute for Biological Cybernetics Tübingen, Cyberneum (Spemannstr. 44)
Contact: Anne Thaler (anne.thaler[ at ]tuebingen.mpg.de)
Registration required: yes
Maximum number of participants: 20

Our research group uses virtual reality technology as a tool to investigate space and body perception in ecologically valid scenarios. We will present demos of our experiments on body size perception, the influence of emotion on space perception, and the contribution of body shape and walking motion to perceived attractiveness and self-confidence. The tour is scheduled for about an hour.

Registration is closed.