VR/AR Showcase 2018


New for 2018, the VR/AR Showcase will take place on IT Summit Workshop Day
Wednesday, November 7, 2018
Student Union North Ballroom

Experience virtual reality, augmented reality, 360 video, and more. Consult with campus innovators about teaching and research applications.

Advances in technology offer new and exciting ways to deliver content, engage students, and better ensure comprehension across disciplines. Virtual, augmented, and mixed reality create deeply immersive experiences for research and training and could greatly enhance any field of study in the sciences, from medicine to engineering. They can also enrich the experience of the liberal arts and social sciences.

Meet the professionals and learn about some of the innovative teaching and research applications currently being developed in education and on the UA campus.

Schedule

9:00 a.m.–12:15 p.m. – Open Demo

12:15–12:30 p.m. – DoD/DSRI Opportunities in VR Presentation

12:30 p.m. – Lunch is served

1:00–2:15 p.m. – Virtual Reality and Education Panel Discussion

  • Moderator: Ash Black, Global Initiatives
  • Panelists:
      • Lila Bozgeyikly, College of Social and Behavioral Sciences
      • Bryan Carter, College of Humanities
      • Arne Ekstrom, College of Social and Behavioral Sciences
      • Joe Farbrook, College of Fine Arts
      • Hong Hua, College of Optical Sciences
      • Sam Rodriguez, College of Fine Arts

2:15–2:30 p.m. – Break

2:30–4:00 p.m. – Open Demo

Lunch

Lunch will be provided for the first 100 guests who register for lunch with the panel discussion. Register via EventBrite.

REGISTER NOW!

Presentations

Department of Defense/Defense and Security Research Institute Opportunities in VR Presentation

Chris Fox, Research, Discovery and Innovation/DSRI
12:15-12:30 p.m.

At the University of Arizona’s Defense & Security Research Institute, we promote the unique capabilities of UA researchers across a range of disciplines to foster investment and collaboration with the nation's aerospace and defense economy.

Lunch and Panel

12:30 p.m.–2:15 p.m.

Lunch will be available only for the first 100 people who sign in on the day of the event and request a lunch ticket.

Virtual Reality and Education Panel Discussion

Moderator:
Ash Black, Global Initiatives

Panelists:
Lila Bozgeyikly, College of Social and Behavioral Sciences
Bryan Carter, College of Humanities
Arne Ekstrom, College of Social and Behavioral Sciences
Joe Farbrook, College of Fine Arts
Hong Hua, College of Optical Sciences
Sam Rodriguez, College of Fine Arts

Open Demo Sessions

9:00 a.m.–12:30 p.m. and 2:30–4:00 p.m.

 

Memory Leaks

Joe Farbrook, College of Fine Arts

Memory Leaks is a virtual reality art installation investigating memes, media narratives, and cultural mythology. Inspired by the influx of virtual content leaking into the physical world as well as the human psyche, the installation magnifies this phenomenon in a fully immersive VR experience. The conceptual underpinning is to use gross enlargement and collaged cinematics to dramatize current shifts in attention, trains of thought, dreams, time, time spent, communication, social life, emotions, expectations, etc. It offers an investigation into the relationship between the democratization of digital technology and its prevailing emergence as the de facto form of communication.

 

Tangiball: A Tangible Virtual Reality Ball Game

Ren Bozgeyikly, College of Social and Behavioral Sciences

This demo showcases enhanced virtual reality interaction through real-world extensions of virtual objects. Users will interact with a tangible ball in a video game setting and see its movements replicated in the virtual world in real time. This form of interaction offers them the unique experience of playing with a real ball in a virtual world.

 

VR in Architecture, Planning and Landscape Architecture

Lucas Guthrie, College of Architecture

Learn how students at the College of Architecture, Planning and Landscape Architecture are using Revit and virtual reality technology to experience their designs in a new way. With the ability to "walk around" in their designs, students are able to fine-tune their models and present their vision.

 

A Virtual Factory, Medical Surgery, and Emergency Evacuation Based on the Unity Game Engine

Son Young-Jun, College of Engineering

In this VR demo, you will stand in the middle of a modern manufacturing factory and observe the operation of various equipment, such as CNC machines, robots, automated guided vehicles, and an overhead crane. This demo will be used by students in manufacturing courses at UA. The second demo covers VR/AR-based training and planning for medical surgery, developed in collaboration with researchers at Banner UMC. The third demo covers emergency evacuation scenarios. For all applications, the Unity game engine was used to create highly realistic environments.

 

Leaning into Tomorrow: Immersive Student Engagement Experiences with the Center for Digital Humanities and Tech.Global

Bryan Carter, College of Humanities

Tech.Global and the Center for Digital Humanities collaborate on a multitude of interesting 360, AR, and VR projects designed to move UA faculty research agendas forward and put our students in the spotlight. Please come by to explore our current projects including Lingyin Temple VR (Dr. Wu and Dr. Welter - China), Justice 360 (Dr. Kiser, Dr. Beita - Costa Rica), and our African Dance VR Gallery (Dr. Praise - CGI/Motion Capture).

 

Box Cat: Escape XRtist

Sam Rodriguez, College of Fine Arts

"Box Cat: Escape XRtist" is an insight to our relationship with technology and ourselves. In this virtual reality/live experience, an animatronic (MAHT) will play the role of the artist (SR). The machine will play the role of an artificial intelligence (Mac). Participants will play the role of human beings. MAHT is an unknown/ not-well-known artist. As his “debut” performance piece he deliberately puts himself into a “coma” and urges participants to come and experience his “waking dream” where his friend, an artificial intelligence, resides. MAHT is hooked to a heart monitor and two participants must interact with his mental and physical forms. Two participants, one wearing a virtual reality headset and a second person monitoring an animatronic that is connected to the virtual game, will illustrate how we interact with one another as human beings and how we interact with technology. The presentation discussing this would cover topics such as human/human and human/tech relationships as well as our fears and power structure with technology as a tool or as a companion.

 

Virtual Tour of Whipple Observatory

Dallan Porter, MMT Observatory, Department of Astronomy and Steward Observatory

Whipple Observatory, located near Amado, Arizona, on Mt. Hopkins, is an astronomical observatory owned and operated by the Smithsonian Astrophysical Observatory (SAO). The MMT Observatory is a large optical telescope at the summit of Mt. Hopkins and is a joint facility of the SAO and the University of Arizona. The MMT's 6.5-meter mirror is the first of the large mirrors made at the mirror lab on the UA campus under the football stadium. This virtual reality tour will take visitors from the Whipple Observatory's basecamp visitor center all the way to the summit, offering breathtaking views of all of the telescopes, the Santa Rita Mountains, and the entire southern Arizona landscape. The tour concludes with an amazing Arizona sunset and a view of the night sky from the chamber of the MMT.

How virtual and altered reality can help us to understand the neural basis of human spatial navigation

Arne Ekstrom, College of Social and Behavioral Sciences

Devices such as head-mounted displays and omnidirectional treadmills offer enormous potential for gaming and networking-related applications. However, their use in experimental psychology and cognitive neuroscience has so far been relatively limited. One of the clearest applications of such novel devices in experimental psychology is the study of human spatial navigation, historically an understudied area compared to more experimentally constrained studies in rodents. Here, we present several experiments the lab has recently conducted using VR/AR and discuss in detail how we overcame some of the obstacles involved and the new insights we can gain into how humans navigate. First, we show how VR/AR provides novel insight into how we learn large-scale environments with enriched body-based input, which is typically difficult to study in the real world. Second, we discuss novel findings from simultaneous wireless scalp EEG recordings and ambulation on an omnidirectional treadmill.

VR Studio and iSpace

Anthony Sanchez, UA Libraries

At the university library, we have been actively creating and iterating on a model for operating a VR service space while supporting a vision of discovery and informal learning within VR environments. We have hosted events and workshops to cultivate learning communities around virtual and augmented reality tools in an effort to have a greater impact on the campus curriculum. This has also led us to rethink job responsibilities and staffing models, including hiring students to host drop-in hours and programming around VR tools and activities. Advances in technology are transforming education, offering new and exciting ways to deliver content, engage students, and better ensure comprehension across disciplines. Virtual, augmented, and mixed realities create deeply immersive experiences for research and teaching and could greatly enhance any field of study in the sciences, from medicine to engineering. They can also enrich the experience of the liberal arts and social sciences.

 

Hangzhou Buddhist Culture VR

Feng Chen and Albert Welter, College of Humanities

TBD