Workshop on Mixed Reality and Interactions at IRT b<>com, Rennes

Friday, July 22nd 2016

From 9am to 12pm

IRT b<>com

1219 avenue des Champs Blancs, 35510 Cesson-Sévigné


We are pleased to invite you to a workshop on Mixed Reality and Interactions organized at IRT b<>com.

Please register at https://workshop-realite-mixte-et-interactions.eventbrite.fr

Laurence Nigay

Laurence Nigay is a full professor of Computer Science at Université Grenoble Alpes and head of the Engineering Human-Computer Interaction (EHCI) research group. Her research focuses on the engineering of Human-Computer Interaction (HCI). Her interests include both the ergonomic and the software design aspects of HCI. Her primary motivation is to develop ways of making interaction techniques more usable. In particular, her research centres on new interaction techniques for multimodal and Augmented Reality (AR) user interfaces, such as menu techniques, fusion mechanisms, and service/component-based approaches to the development of multimodal and AR interfaces.

She will first present the Engineering Human-Computer Interaction (EHCI) research group, one of the 24 research teams of the Grenoble Informatics Laboratory (LIG). EHCI is primarily concerned with the software aspects of Human-Computer Interaction. Its mission is to define new concepts, models, and software tools for designing, implementing, and evaluating interaction techniques that are effective, usable, and enjoyable. The group has extensive experience in software architecture for interactive systems, multimodal and mixed-reality interaction, and context-aware, distributed, and migratory user interfaces.

Next, Laurence will focus on distant-pointing techniques, where users move through a physical-digital world, or mixed interactive space. Within this interactive space, users need to interact remotely, whether to manipulate digital objects on a distant screen or physical objects. She will present three pointing techniques for interacting within a ubiquitous environment:

  • The first remotely points at a digital target, targeting interaction in the operating theatre;
  • The other two point at physical objects: one through direct 3D gestures (targeting light control at home), the other based on augmented-reality technology on a mobile phone (targeting maintenance operations on production machines).


Torsten Kuhlen

Torsten Kuhlen is a full professor at RWTH Aachen University, where he heads the Virtual Reality and Immersive Visualization Group.

His talk will give a broad overview of research and service activities in the VR group at RWTH Aachen University. While the group's basic research focuses on the design and development of 3D, multimodal interaction methods for virtual environments, including haptic and audio interfaces, its application fields comprise production technology, simulation science, medicine, life science, neuroscience, psychology, and more. As part of the RWTH Computing Center, the group's major goal has always been to make its research work available to scientific partners, allowing them to explore complex technical and physical phenomena in an intuitive way. The talk will give examples of basic research as well as applications.


Guillaume Moreau

Guillaume Moreau holds a PhD from Rennes University (1998). He is currently a full professor at École Centrale de Nantes, France, where he heads the Computer Science and Mathematics Department. For twenty years he has worked in the field of virtual and augmented reality, including several projects with industry. His topics are perception in virtual/mixed environments and large-scale tracking for AR.

In this talk, he will present the advantages of textureless tracking for Augmented Reality, along with his recent work extending the Random Dot Markers defined by Uchiyama. Tracking without textures means that the only cue available for matching model features to image features is the spatial layout of the features, which makes the approach much less dependent on lighting conditions, for example. Uchiyama and Marchand claimed it could be used for "augmenting everything". In their LGC algorithm (ISMAR 2015), points are used as features and neighbourhood relationships are used for matching. LGC can be applied to augment paper maps or construction plans in the building industry. Another application of LGC is fast and precise projector calibration.
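The core idea of matching by spatial layout alone can be illustrated with a small sketch. Note that this is not the LGC algorithm itself, only a toy analogue under assumed conventions: each point is described by the angular gaps between its nearest neighbours, a descriptor that is invariant to rotation and translation and, crucially, uses no texture at all.

```python
# Toy illustration of texture-free point matching: describe each point by the
# spatial layout of its neighbours, then match descriptors between two views.
# This is NOT Uchiyama's LGC algorithm, just a minimal sketch of the idea.
import math

def neighbour_descriptor(points, i, k=4):
    """Describe point i by the sorted angular gaps between its k nearest
    neighbours. The gaps are cyclically shifted so the largest comes first,
    which makes the descriptor invariant to rotation and translation."""
    px, py = points[i]
    neigh = sorted(
        (j for j in range(len(points)) if j != i),
        key=lambda j: (points[j][0] - px) ** 2 + (points[j][1] - py) ** 2,
    )[:k]
    angles = sorted(math.atan2(points[j][1] - py, points[j][0] - px)
                    for j in neigh)
    n = len(angles)
    gaps = [(angles[(m + 1) % n] - angles[m]) % (2 * math.pi)
            for m in range(n)]
    start = gaps.index(max(gaps))          # canonical starting gap
    return tuple(gaps[start:] + gaps[:start])

def match_points(model, image, k=4):
    """Match each model point to the image point whose neighbourhood
    descriptor is closest (brute-force nearest descriptor)."""
    mdesc = [neighbour_descriptor(model, i, k) for i in range(len(model))]
    idesc = [neighbour_descriptor(image, j, k) for j in range(len(image))]
    return {
        i: min(range(len(idesc)),
               key=lambda j: sum((a - b) ** 2
                                 for a, b in zip(mdesc[i], idesc[j])))
        for i in range(len(mdesc))
    }
```

Because the descriptor depends only on relative positions, a rotated and translated copy of the point set matches back to itself point for point; real systems like LGC add hashing and geometric verification on top of this kind of layout cue to make the matching fast and robust.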

Finally, he will give a very brief overview of his work in progress in Augmented Reality.
