ISMAR 2017, the premier conference for Augmented Reality (AR) and Mixed Reality (MR), will be held in beautiful Nantes, France.
Responding to the recent explosion of commercial and research activity related to AR and MR, ISMAR has continued to expand its scope over the past several years. ISMAR 2017 will cover the full range of technologies encompassed by the MR continuum, from interfaces in the real world to fully immersive experiences. This range goes far beyond the traditional definition of AR, which focused on precise 3D tracking, visual display, and real-time performance.
We specifically invite contributions from areas such as Computer Graphics, Human-Computer Interaction, Psychology, Computer Vision, Optics, and, in particular, VR, showing how these areas contribute to advancing AR/MR technology.
This year, we continue to have an open call for selecting Program Committee members, in the hope that this further increases transparency and widens the committee's scope.
- Submission: 15 March 2017 (all deadlines: 23:59 AoE)
- Final notification: 8 June 2017
- Camera-ready version: 10 July 2017
There is only one paper submission category, ranging from 4 to 10 pages (excluding references). Papers ready for journal publication will be directly published in a special issue of IEEE Transactions on Visualization and Computer Graphics (TVCG). Other accepted papers will be published in the ISMAR proceedings. Papers will be assessed on the contribution they make per page, rather than on length alone.
- All accepted papers will be orally presented at the ISMAR conference.
- All accepted papers will have the opportunity to be presented as a demo.
- All accepted papers will have the opportunity to be presented as a poster.
- All accepted papers will be archived in the IEEE Xplore digital library.
Detailed submission and review guidelines will be posted on the conference webpage (http://www.ismar17.org). Poster submissions will be accepted as usual with a submission date to be announced later.
Topics of Interest
All topics relevant to AR and MR are of interest. These include, but are not limited to:
- Information Presentation
- Mediated and diminished reality
- Multisensory rendering, registration, and synchronization
- Photorealistic and non-photorealistic rendering
- Real-time and non-real-time interactive rendering
- Visual, aural, haptic, and olfactory augmentation
- Acquisition of 3D video and scene descriptions
- Calibration and registration (of sensing systems)
- Location sensing technologies (of any kind, including non-real-time)
- Projector-camera systems
- Sensor fusion
- Smart spaces
- Touch, tangible and gesture interfaces
- Video processing and streaming
- Visual mapping
- Wearable sensors, ambient-device interaction
- Display hardware, including 3D, stereoscopic, and multi-user
- Live video stream augmentation (e.g., in robotics and broadcast)
- Wearable actuators and augmented humans
- Wearable and situated displays (e.g., eyewear, smart watches, pico-projectors)
- User Experience Design
- Collaborative interfaces
- Technology acceptance and social implications
- Therapy and rehabilitation
- Usability studies and experiments
- Virtual analytics and entertainment
- VR simulations of AR/MR
- Human Performance and Perception
- Interaction techniques
- Learning and training
- Multimodal input and output
- Perception of virtual objects
- System Architecture
- Content creation and management
- Distributed and collaborative architectures
- Online services
- Real-time performance issues
- Scene description and management issues
- Wearable and mobile computing
- Art, cultural heritage, education and training
- Automotive and aerospace
- Entertainment, broadcast
- Industrial, military, emergency response
- Health, wellbeing, and medical
- Personal information systems
- Visual effects / video processing
Wolfgang Broll, Holger Regenbrecht, J. Edward Swan II
ISMAR 2017 Science & Technology Program Co-Chairs