Hilton San Francisco, Union Square
San Francisco, California, USA
8 - 12 February 2015
Conference 9392
The Engineering Reality of Virtual Reality 2015
Monday - Tuesday 9 - 10 February 2015
Conference Sessions
At A Glance
1: Session 1
2: Session 2
3: Session 3
4: Session 4
Plenary Session and Society Award Presentations
Interactive Paper Session
Symposium Demonstration Session
Plenary Session and Conference Award Presentations
Important Dates
Abstract Due:
15 August 2014

Author Notification:
6 October 2014

Manuscript Due Date:
12 January 2015

Conference Committee
Conference Chairs
  • Margaret Dolinsky, Indiana Univ. (United States)
  • Ian E. McDowall, Fakespace Labs, Inc. (United States)

Monday 9 February
Session 1
Monday 9 February 2015
8:30 AM - 10:10 AM
  • Session Chair:
  • Ian E. McDowall, Fakespace Labs, Inc. (United States)
Game-day football visualization experience on dissimilar virtual reality platforms
Paper 9392-1
Author(s): Vijay K. Kalivarapu, Anastacia MacAllister, Anthony Civitate, Melynda T. Hoover, Iowa State Univ. (United States); Phillip Thompkins, Jesse Smith, Univ. of Maryland, Baltimore County (United States); Janae Hoyle, Tufts Univ. (United States); Eliot Winer, Shubang Sridhar, Jonathan Schlueter, Gerrit Chernoff, Iowa State Univ. (United States)
College football recruiting is a competitive process in which athletic administrations try to gain an edge by bringing recruits to a home game to highlight the atmosphere unique to campus. This is not always possible, however, since most recruiting happens in the off-season. Instead, recruiters convey the game experience through video recordings and visits to football facilities such as an empty stadium. While these substitutes give a general idea of a game, they cannot let a recruit fully imagine playing while being cheered on by a crowd of 50,000 people. To address this challenge and improve the recruitment process, the Iowa State University athletic department and the Virtual Reality Applications Center teamed up to build an alternative to the game-day experience using the world's highest-resolution six-sided VR environment, the C6, and a portable VR system. The portable system provides an experience similar to the C6, only in a recruit's living room. Building an immersive game-day experience is one challenge; quantifying its effectiveness over traditional recruiting methods is another. This paper presents the techniques used in developing the immersive and portable VR environments, followed by validation of the work through formal user studies quantifying immersion and presence.
archAR: an archaeological augmented reality experience
Paper 9392-2
Author(s): Bridgette Wiley, Jürgen P. Schulze, Univ. of California, San Diego (United States)
We present an application for Android phones or tablets called “archAR” that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD’s Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm’s Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to “zoom” into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.
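The touch-to-popup interaction described in the abstract amounts to hit-testing a touch point against the plotted artifact points. A minimal sketch in Python (the artifact record layout and the selection radius are assumptions for illustration, not details from the paper):

```python
import math

def nearest_artifact(touch_xy, artifacts, radius=0.05):
    """Return the plotted artifact nearest the touch point, or None.

    artifacts is a list of dicts with 'x'/'y' positions in the
    image-target plane plus whatever metadata the popup displays
    (type, photographs, material descriptions).
    """
    best, best_dist = None, radius
    for art in artifacts:
        dist = math.hypot(art["x"] - touch_xy[0], art["y"] - touch_xy[1])
        if dist < best_dist:
            best, best_dist = art, dist
    return best
```

In the real application the touch would first be unprojected from screen space into the dig-site model's coordinate frame; the sketch assumes that step has already happened.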
Photorealistic 3D omni-directional stereo simulator
Paper 9392-3
Author(s): Dirk Reiners, Carolina Cruz-Neira, Univ. of Arkansas at Little Rock (United States)
Most existing aircraft and vehicle simulators are derived from jet simulators, which do not address the training needs of aircraft or vehicles, such as helicopters, that must operate at much closer distances to objects in the environment. We present an innovative approach to omnidirectional stereo in spherical environments that requires no user-location technology. Our approach provides high-quality rendering and true stereo separation at the pixel level throughout the entire image. The innovative features of our image generator are an accurate 360-degree surround stereoscopic immersive display and unprecedented visual quality at near real-time performance rates. Specifically, our research results include a real-time ray tracer incorporating omnidirectional stereoscopic display and rendering algorithms to display small-detail visuals, specifically power lines. Our prototype shows that it is possible to adapt ray tracing algorithms to run at real-time interactive speeds, incorporating 360-degree 3D stereo to allow immersive operation in close proximity to objects such as radio towers, other vehicles, or an urban landscape. Our work enables the development of a new class of simulators for the maneuvers helicopters require, with increased capabilities for training in situations that are currently not possible in existing simulators due to limited perception and visual quality.
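The core of an omnidirectional-stereo ray tracer like the one described is the per-pixel ray setup: each ray's origin is offset perpendicular to its viewing azimuth, so stereo separation holds in every direction rather than only along one view axis. A minimal sketch (the equirectangular pixel mapping and the 65 mm interpupillary distance are illustrative assumptions, not figures from the paper):

```python
import math

def ods_ray(col, row, width, height, ipd=0.065, eye=+1):
    """Ray origin and direction for one pixel of an omni-directional
    stereo (ODS) panorama.

    col/row index a width x height equirectangular image; eye is +1
    for the left eye, -1 for the right. Returns (origin, direction).
    """
    theta = (col + 0.5) / width * 2.0 * math.pi - math.pi   # azimuth
    phi = math.pi / 2.0 - (row + 0.5) / height * math.pi    # elevation
    d = (math.cos(phi) * math.sin(theta),
         math.sin(phi),
         -math.cos(phi) * math.cos(theta))                  # unit view direction
    # Eye origins lie on a circle of diameter ipd around the viewer;
    # the offset is horizontal and perpendicular to the viewing azimuth.
    r = eye * ipd / 2.0
    o = (-r * math.cos(theta), 0.0, -r * math.sin(theta))
    return o, d
```

Tracing one such ray per pixel yields true stereo separation across the full 360-degree field, which is what lets the display work without tracking the user's location.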
Composing a model of outer space through virtual experiences
Paper 9392-4
Author(s): Julieta C. Aguilera, Adler Planetarium & Astronomy Museum (United States)
This paper frames issues of trans-scalar perception in visualization, reflecting on the limits of the human senses, particularly those which are related to space, and showcases planetarium shows, presentations, and exhibit experiences of spatial immersion and interaction in real time.
How to avoid simulation sickness in virtual environments during user displacement
Paper 9392-5
Author(s): Andras Kemeny, Renault Technocentre (France), Ecole Nationale Supérieure d'Arts et Métiers (France); Florent Colombet, Thomas Denoual, THEORIS (France)
Driving simulation and virtual reality (VR) share the same technologies for visualization, head-movement tracking, and 3D vision, as well as similar difficulties when rendering the displacements of the observer in virtual environments, especially when these displacements are carried out using driver commands such as steering wheels, joysticks, and nomad devices. High values of transport delay (the time lag between an action and the corresponding rendering cues) and the visual-vestibular conflict (due to the discrepancies perceived by the human visual and vestibular systems when driving or displacing using a control device) induce the so-called simulation sickness. While the visual transport delay can be efficiently reduced using a high frame rate, the visual-vestibular conflict is inherent to VR when motion platforms are not used. In order to study the impact of displacements on simulation sickness, we have tested various driving scenarios in Renault's 5-sided ultra-high-resolution CAVE. First results indicate that low-speed displacements without longitudinal and lateral accelerations are well accepted, and that the worst case corresponds to rotational displacements in highly detailed graphical environments. These results will be used in optimization techniques at Arts et Metiers ParisTech for motion-sickness reduction in virtual environments for industrial, research, educational, or gaming applications.
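The transport-delay point can be made concrete with a back-of-the-envelope calculation: the visual latency contributed by the render pipeline is its depth in frames divided by the frame rate, which is why a higher frame rate directly reduces this component of simulation sickness. (The three-frame pipeline below is an illustrative assumption, not a figure from the paper.)

```python
def transport_delay_ms(pipeline_frames, fps):
    """Visual transport delay (ms) contributed by a render pipeline
    that is pipeline_frames deep and refreshes at fps frames/second."""
    return pipeline_frames / fps * 1000.0

# A 3-frame pipeline at 60 Hz contributes 50 ms of delay; doubling
# the frame rate to 120 Hz halves that contribution to 25 ms.
```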
Session 2
Monday 9 February 2015
10:50 AM - 12:30 PM
  • Session Chair:
  • Margaret Dolinsky, Indiana Univ. (United States)
Development of simulation interfaces for evaluation task with the use of physiological data and virtual reality applied to a vehicle simulator
Paper 9392-6
Author(s): Mateus R. Miranda, Diana G. Domingues, Alessandro Oliveira, Cristiano J. Miosso, Carla Silva Rocha Aguiar, Thiago Bernardes, Henrik Costa, Luiz Oliveira, Univ. de Brasília (Brazil)
This paper, with applications in modeling simulation games and collecting experimental data, describes an experimental platform for evaluating immersive games. The proposed platform is embedded in an immersive environment, a virtual reality CAVE, and consists of a base frame with actuators providing three degrees of freedom, a sensor-array interface, and physiological sensors. Physiological data on breathing, galvanic skin resistance (GSR), and pressure in the driver's hand, together with a subjective questionnaire, were collected during the experiments. This work also presents the theoretical background of a project spanning software engineering, biomedical engineering, and creative technologies. The case study involves the evaluation of a vehicular simulator. Because the integration of simulation software with the immersion system directly affects the actions of the driver, the evaluation correlated the analysis of physiological data obtained beforehand, at rest, with data obtained during the simulation with and without movements of the simulator, as well as with the responses to the subjective questionnaire.
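The correlation step described above can be sketched as a plain Pearson coefficient between a physiological trace recorded at rest and the same driver's trace during simulation. A pure-stdlib sketch (the abstract does not specify the actual analysis pipeline, so this is only an illustration of the comparison):

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length samples, e.g. a
    GSR trace at rest vs. the same driver's trace while driving."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den
```

A coefficient near +1 indicates the two recordings move together; values near zero suggest the simulation condition changed the driver's physiological response.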
An indoor augmented reality mobile application for simulation of building evacuation
Paper 9392-7
Author(s): Sharad Sharma, Shanmukha Jerripothula, Bowie State Univ. (United States)
Augmented reality enables people to remain connected with the physical environment they are in and invites them to look at the world from new and alternative perspectives. There has been increasing interest in emergency-evacuation applications for mobile devices, and nearly all smartphones today are Wi-Fi and GPS enabled. In this paper, we propose a novel emergency-evacuation system that helps people safely evacuate a building in an emergency situation. It further enhances knowledge and understanding of where the exits are in the building and of safe evacuation procedures. The system was tested using matrix-based markers read by a standard mobile camera. We show how the application displays a 3D model of the building using markers and a web camera. The system gives a visual representation of the building in 3D space, allowing people to see where its exits are through a smartphone or laptop. Pilot studies conducted with the system show its partial success and demonstrate the effectiveness of the application for emergency evacuation.
Programmable immersive peripheral environmental system (PIPE)
Paper 9392-8
Author(s): Chauncey E. Frend, Michael J. Boyles, Indiana Univ. (United States)
Improved virtual environment (VE) design requires new tools and techniques that enhance user presence. Despite being relatively sparsely studied and implemented, environmental devices (e.g. those that provide wind, warmth, or vibration) used within the context of virtual reality (VR) increase presence. Many previously created peripheral environmental devices (PEDs) are not sufficiently prescriptive for the research or development community and suffer from a steep development or software-integration learning curve. In this paper, we introduce a peripheral environmental device control system called the PIPE. The PIPE enhances user presence while lowering the barrier to entry for engineers and designers. The system is low cost, requires little hardware setup time, and promotes easy programming and integration into existing VEs using the Unity development engine. VR systems often strive to present VEs that resemble the real world, but developers are limited by VR systems that cannot simulate environmental conditions. The PIPE better equips developers to use existing VR systems with a broader range of environmental effects.
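A control system of this kind ultimately maps in-engine environmental state to device intensities. A hypothetical sketch of that mapping layer (the function name and the linear wind-to-duty-cycle mapping are my assumptions for illustration; PIPE's actual API is not given in the abstract):

```python
def fan_duty(wind_speed_ms, max_speed_ms=20.0):
    """Map a virtual-environment wind speed (m/s) to a fan PWM duty
    cycle in [0, 1], clamping values outside the device's range."""
    return max(0.0, min(1.0, wind_speed_ms / max_speed_ms))
```

Keeping this mapping in one small layer is what lets a VE scripted in an engine like Unity drive wind, warmth, or vibration hardware without the developer touching device-level code.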
Explorations in dual-view, co-located VR
Paper 9392-9
Author(s): Silvia P. Ruzanka, Benjamin C. Chang, Rensselaer Polytechnic Institute (United States)
One of the major features of projection-based VR is that it allows multiple users to share both the same virtual space and the same physical space. However, user-centered stereoscopy means that only one user actually has an accurate view of the scene, and in general only one user at a time can interact. Using modified polarized projection, we developed a system that gives two co-located users independent monoscopic views and independent interaction, using head-tracking and multi-screen panoramic projection. Giving users the ability to actively collaborate or explore different spaces simultaneously opens up new possibilities for VEs. We present prototype interaction designs, game designs, and experimental artworks based on this paradigm, pointing toward future VR designs that are physically co-located and allow for co-presence, fostering innovative collaborations.
From CAVEWoman to VR diva: breaking the mold
Paper 9392-10
Author(s): Carolina Cruz-Neira, Univ. of Arkansas at Little Rock (United States)
One of the main ground-breaking developments in the history of virtual reality (VR) was the creation of the CAVE virtual reality system. It was first introduced in the early 90s and is still one of the key technologies that define the field. What is not so well known outside the core research communities is that this technology was primarily conceived, designed, implemented, and put to work outside the research labs by a woman, the author of this paper. After the development of the CAVE, her work expanded to spread the use of VR technology across a wide range of disciplines, from deeply scientific and engineering areas to rigorous humanities applications to creative art experiences. Being a woman, she brings a pragmatic perspective on what VR is, how the supporting tools and technologies need to be designed to simplify its use, and how to enable unexpected groups to explore VR technology as a new medium. This paper presents a set of truly interdisciplinary VR projects made possible by a strong technical expertise rooted in a pragmatic feminine interpretation of the technology and its capabilities. Examples of these projects include turning a low-power wireless embedded-software scientific project into a dance performance, using religious-studies fieldwork on 15th-century Indian religious rituals as a testbed for a distributed-computing software architecture, and extending a scripting-language framework to support storytelling to document the sad events of 9/11. The paper also discusses the successes and struggles in gaining acceptance and credibility as a VR researcher with this unconventional approach to the technology.
Lunch Break 12:30 PM - 2:00 PM
Session 3
Monday 9 February 2015
2:00 PM - 3:20 PM
  • Session Chair:
  • Ian E. McDowall, Fakespace Labs, Inc. (United States)