Workshop 1

Brain-Computer Interfaces in Virtual Environments

 

Date/Place/Time

September 9th
At the conference venue.
From 16:00 to 19:30 (with coffee break)
 

Organization Committee

Daniel

Daniel Pérez-Marcos, IDIBAPS (Institut d’Investigacions Biomèdiques August Pi i Sunyer) Daniel Pérez-Marcos was born in Alicante, Spain, in 1977. He received the M.S. degree in Industrial Electronics and Control Systems Engineering from the Valencia Technical University in 2001, and the Dr.-Ing. (Ph.D.) degree in Biomedical Engineering from the Technische Universität Ilmenau, Germany, in 2007. He was a postdoctoral researcher at the Instituto de Neurociencias de Alicante, Universidad Miguel Hernández-CSIC, from 2006 to 2008. Since 2008, he has been a scientific researcher at the IDIBAPS in Barcelona, Spain. Currently, he is a visiting researcher at the Experimental Virtual Environments Lab for Neuroscience and Technology (EVENT-Lab). His major research interests include biomedical signal processing, brain-computer interfaces, self-regulation of electrical brain processes, body perception and virtual reality.

Mel

Mel Slater, Facultad de Psicología, Universitat de Barcelona-ICREA Mel Slater became an ICREA Research Professor in Barcelona in January 2006, and is in the Faculty of Psychology at the University of Barcelona. He became Professor of Virtual Environments at University College London in 1997 in the Department of Computer Science. He was a UK EPSRC Senior Research Fellow from 1999 to 2004 at UCL, during which time he worked on the virtual light field approach to computer graphics, and he set up the virtual reality Cave system at UCL. Twenty-two of his PhD students have obtained their PhDs since 1989. In 2005 he was awarded the Virtual Reality Career Award by IEEE Virtual Reality ‘in recognition of seminal achievements in engineering virtual reality’. He currently leads the PRESENCCIA European Integrated Project funded under the 6th Framework Future and Emerging Technologies programme, and has recently been awarded a Senior ERC grant entitled ‘Transcending Reality: Activating Virtual Environment Responses through Sensory Enrichment (TRAVERSE)’. In Barcelona he co-leads the Experimental Virtual Environments lab for Neuroscience and Technology (www.event-lab.org). His main work is in the area of understanding how people respond to experiences in immersive virtual environments and associated technology in computer graphics and human-computer interaction.

Mavi

María V. Sánchez-Vives, IDIBAPS-ICREA María V. Sánchez-Vives received her MD and PhD in Neuroscience at the University of Alicante (Spain). She was a postdoctoral fellow at Rockefeller University and Yale University. She was Profesora Titular of Physiology at the Universidad Miguel Hernández and a researcher at the Instituto de Neurociencias de Alicante (Universidad Miguel Hernández-Consejo Superior de Investigaciones Científicas) from 2000 to 2007. Dr. Sánchez-Vives is currently an ICREA Research Professor at the IDIBAPS (Institut d’Investigacions Biomèdiques August Pi i Sunyer) in Barcelona, Spain, and co-leads the Experimental Virtual Environments lab for Neuroscience and Technology (www.event-lab.org).

Christoph

Christoph Guger, g.tec Guger Technologies OEG Christoph Guger studied biomedical engineering at the University of Technology Graz and Johns Hopkins University, Baltimore, USA. He developed one of the first real-time brain-computer interfaces during his PhD at the University of Technology Graz, Austria. In 1999 he founded g.tec Guger Technologies OEG to develop and sell products for biosignal acquisition and analysis. Currently g.tec is involved in 5 EC projects: Presenccia, Synthetic Forager, Renachip, SM4all and RGS.

Chris

Christoph Groenegress, Facultad de Psicología, Universitat de Barcelona Christoph Groenegress was born in Werther, Germany, in 1976. He holds a BSc in Artificial Intelligence from Manchester University and an MRes in Computer Vision, Image Processing, Graphics and Simulation from UCL. He is currently completing his PhD at the University of Barcelona under the supervision of Professor Mel Slater. He was a research assistant at the Fraunhofer Institute for Media Communication, where he worked on Mixed Reality interfaces for use in public spaces. Before moving to Barcelona he worked on virtual out-of-body illusions at UCL with Professor Mel Slater. His current focus is on whole-body interaction in virtual environments, including BCIs and physiological measurements.

Mar

Mar Gonzalez holds BSc degrees in Technical Computer Engineering and Multimedia Engineering, and was distinguished with the best academic record award 2008 at La Salle Engineering School, URL (Ramon Llull University), Spain. During her bachelor studies she worked as a research programmer in the university’s ARC Group (Architecture, Representation and Computation), and as a teaching assistant for the Digital Image Processing course at the Communication and Signal Processing Department. She is currently enrolled in the Biomedical Engineering MSc offered jointly by UB (University of Barcelona) and UPC (Polytechnic University of Catalonia), and is a member of the EVENT-Lab (Experimental Virtual Environments Lab for Neuroscience and Technology) at UB. Previously, she was a researcher at CREB-UPC (Centre de Recerca en Enginyeria Biomèdica), where she developed a 3D cognitive rehabilitation virtual environment for the Neurorehabilitation Hospital Institut Guttmann. Recently she obtained a grant to develop her MSc final project on BCI at the Institute of Neural Engineering at Tsinghua University, Beijing, where she will be moving in February 2010.

Aim and Description

The aim of this workshop is to learn how to use a brain-computer interface (BCI) to communicate with a virtual environment (VE) and control virtual objects through thought. In general terms, a BCI is a system that enables communication between the brain and a computer. BCI systems are based on electrical brain signals and do not require the use of peripheral nerves or muscles for communication. One particular technique for generating movement in virtual environments is the direct brain control of virtual objects by means of a BCI. In recent years, several VR-based BCI applications have been developed, e.g., for the control of a virtual apartment [1-3], walking in a VE [4], making avatars walk [5-6], or moving parts of one’s virtual body [7]. In this workshop, we will use several electrical brain potentials recorded over different brain areas to compete in a virtual race by controlling the movement of an avatar, and to navigate inside a virtual apartment and manipulate its contents.

The workshop will be divided into two parts. The first part will be devoted to the theoretical aspects of the technology. Several talks from experts will introduce participants to the field of brain-computer interface research and its different techniques, approaches and applications. An introduction to virtual reality will be given as well. Experts will explain not only the features of these technologies but also point to the areas where BCI and VR converge. The second part of the workshop will be mainly practical. Participants will have the opportunity to test the technology and learn about the main steps required for setting up such a system. There will be three applications using different brain signals:

  1. Virtual Olympics: Two participants compete in a virtual race, each controlling their own avatar with thoughts through a motor-imagery based BCI.
  2. SmartHome: In this application, participants can navigate within a virtual apartment, open doors, switch on the TV, and so on. This application is based on a P300 BCI.
  3. SSVEP application: Controlling a robot with thoughts. An SSVEP (Steady-State Visual Evoked Potential) based application where subjects can select and move virtual objects. In this application, flickering visual stimuli with different frequencies are presented. When subjects gaze at a particular target, changes in the power spectrum of the recorded EEG are analysed: the dominant frequency corresponds to the frequency of the attended stimulus (a minimal sketch of this frequency-detection step is given after this list).
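
As an illustration of the last step of the SSVEP application, the following is a minimal sketch (in Python, using NumPy and SciPy) of how the attended stimulus frequency could be identified from the power spectrum of a single EEG channel. The function name, parameter values and the use of Welch's method here are illustrative assumptions made for this page, not the actual software used in the workshop demonstrations.

    # Minimal SSVEP frequency-detection sketch (illustrative assumption,
    # not the workshop software). 'eeg' is a single-channel EEG segment
    # sampled at 'fs' Hz, recorded while the subject gazes at one of
    # several stimuli flickering at known frequencies.
    import numpy as np
    from scipy.signal import welch

    def detect_ssvep_target(eeg, fs, stim_freqs, band=0.5):
        """Return the stimulus frequency with the strongest EEG response."""
        freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)   # power spectral density
        scores = []
        for f in stim_freqs:
            # Sum the power in a narrow band around the stimulus frequency
            # and its second harmonic, where SSVEP responses typically appear.
            mask = (np.abs(freqs - f) <= band) | (np.abs(freqs - 2 * f) <= band)
            scores.append(psd[mask].sum())
        return stim_freqs[int(np.argmax(scores))]

    # Example: three targets flickering at 8, 10 and 12 Hz, 4 s of EEG at 256 Hz
    fs = 256
    eeg = np.random.randn(4 * fs)                        # placeholder signal
    print("Attended stimulus:", detect_ssvep_target(eeg, fs, [8.0, 10.0, 12.0]), "Hz")

In a real system this decision would be taken continuously on short sliding windows, and the selected frequency would be mapped to a command such as selecting or moving a virtual object.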

References

[1] Bayliss JD., “Use of the evoked potential P3 component for control in a virtual apartment”, IEEE Trans Neural Syst Rehabil Eng., 11(2):113-6, 2003.
[2] Christoph Guger, Clemens Holzner, Christoph Groenegress, Guenter Edlinger, Mel Slater. Goal-Oriented Control with Brain-Computer Interface. (To be published in) HCI.
[3] Christoph Guger, Clemens Holzner, Christoph Groenegress, Guenter Edlinger and Mel Slater. Control of a smart home with a brain-computer interface. Proceedings of the 4th International Brain-Computer Interface Workshop and Training Course 2008, September 18-21:339-342, 2008.
[4] Pfurtscheller G, Leeb R, Keinrath C, Friedman D, Neuper C, Guger C, Slater M., “Walking from thought”, Brain Res., 1071(1):145-52, 2006.
[5] D. Perez-Marcos, M.V. Sanchez-Vives, M. Slater. Virtual reality-based brain-computer interfaces as a science dissemination tool. Real Actions in Virtual Environments Workshop, March 4th 2009, Barcelona.
[6] D. Pérez-Marcos, M. Slater, M.V. Sanchez-Vives. Emerging Technologies and Education: Brain-Computer Interfaces for Science Popularization. European Future Technologies Conference, 21-23 April 2009, Prague.
[7] D. Pérez-Marcos, M. Slater, M.V. Sanchez-Vives. Inducing a virtual hand ownership illusion through a brain-computer interface, Neuroreport (in press).

Intended Audience

This workshop is addressed to anyone interested in new communication pathways between brain and machine. Since BCI application fields extend from medicine and rehabilitation to business and entertainment, the audience is expected to be multidisciplinary. Basic programming knowledge will be a plus, but is not essential, since most BCI companies already offer user-friendly software packages.

Table of Contents

  • PART 1: THEORY
    1. Introduction to BCI
    2. Non-invasive BCI: Main techniques and applications
    3. Invasive BCI: Main techniques and applications
    4. Introduction to virtual reality
    5. BCI in virtual environments
  • PART 2: APPLICATIONS
    1. Virtual Olympics
    2. SmartHome
    3. SSVEP application