Augmented Reality: What it Can and Cannot Do
Please join us for this exciting new webinar on August 14, 2019, at 3:00 PM EDT.
Sponsored by the HFES PPTG
Use the link below to register: https://zoom.us/webinar/register/WN_1bCYSEBJSQevDd3tt_NTuA
Augmented Reality (AR), or the perceptual enhancement of the real world by computer generated information, was once a technology looking for a problem. As hardware has advanced, so has the utility of AR. However, AR can still cause problems if not implemented correctly. The panelists will discuss lessons learned from successful and unsuccessful applications of AR, as well as where they think the technology is going.
Below you will find more information about each of our distinguished panel participants.
Dr. Paul Havig, US Air Force
Paul Havig is a Senior Engineering Research Psychologist at the Air Force Research Laboratory at Wright-Patterson Air Force Base. He received his bachelor's degree in Psychology from the University of California, San Diego, in 1989, and his MS and PhD in Experimental Psychology from the University of Texas at Arlington in 1995 and 1997, respectively. He came to Wright-Patterson as a contractor out of graduate school in 1997 and switched to civilian service in 2001. From 2001 to 2019 he was in the Battlespace Visualization Branch; under the new organizational structure, he is now in the Sensory Systems Branch of the 711th Human Performance Wing. His virtual-environment experience has been mostly in augmented reality (helmet-mounted display symbology for aircraft), but he has recently been involved in both mixed and virtual reality work.
Dr. Kelly Hale, Design Interactive, Inc.
Dr. Kelly S. Hale is Senior Vice President of Technical Operations at Design Interactive, Inc., a woman-owned small business focused on human-systems integration. In this role, Dr. Hale drives the company in advancing human-systems integration solutions in performance augmentation, biosignature analytics, and extended reality, built using human-centered system design. She has over 15 years of experience in human-systems integration research and development across the areas of augmented cognition, multimodal interaction, training sciences, and virtual and augmented reality environments. She received her BSc in Kinesiology (Ergonomics Option) from the University of Waterloo in Ontario, Canada, and her master's degree and PhD in Industrial Engineering, with a focus on Human Factors Engineering, from the University of Central Florida.
Jeff Cowgill, Marxent
Jeffrey Cowgill, Jr. has over 20 years of experience in developing and creating virtual environments, virtual reality, mixed reality, augmented reality, and simulations for research and training, retail product visualization, and sales solutions. His interests include computer graphics, human-machine interfaces, software and website design, and modeling and simulation. Mr. Cowgill spent 17 years developing applications to support scientific research at Wright State University, mostly focused on human performance, spatial audio, and visual and auditory search. Currently, he is the Director of Software Development at Marxent Labs, LLC, a world leader in AR and VR software, offering the 3D Cloud platform for retailers and manufacturers investing in an enterprise-wide, omnichannel 3D content strategy.
Dr. So Young Kim, NASA JPL
Dr. So Young Kim is a Senior Lead Designer at NASA JPL. Currently, she is a Product Owner for IMCE's integrated systems engineering tool suite, Computer-Aided Engineering for Space System Architecture (CAESAR). Previously, she was the UX lead on ProtoSpace, a 3D CAD visualization product suite for spacecraft design. Before joining JPL, she worked at General Electric's Global Research Center, shaping and leading programs for designing next-generation user experience systems such as the Future Flight Deck and the Future Power Plant. Her focus has been on applying natural user interface technologies and intelligent system support to human-machine collaboration and communication. Her specialty is Human-System Integration and, in particular, Human-Machine Teaming. She received her Ph.D. in Aerospace Engineering from the Georgia Institute of Technology, Atlanta, GA, with a focus on Human Factors and Cognitive Engineering. Prior to that, she received an M.S. in Aerospace Engineering from the Georgia Institute of Technology and a B.S. in Electrical and Electronic Engineering from Chung-Ang University in Seoul, South Korea.
Ken Mayer, Ford Motor Company
Ken Mayer is a Human Factors Research Supervisor at Ford Motor Company. He leads a team of human factors research scientists and interaction designers enabling and creating intuitive, emotionally resonant user experiences. For the last five years, members of his team have been exploring the role of augmented reality in the automotive domain. He holds an MSIE from the University of Michigan and a BSIE from California Polytechnic State University, San Luis Obispo.
Dr. Rebecca Grier, Ford Motor Company (Moderator)
Human factors researcher Dr. Rebecca Grier has a history of working on the development of disruptive technologies (e.g., smartphones, streaming video, augmented reality). Currently, she researches the user experience of autonomous vehicles (AVs) at Ford Motor Company. She came to Ford from the Institute for Defense Analyses, where she helped to set policy on the test and evaluation of systems for the U.S. Department of Defense. Earlier, she was the human systems integration lead for Aegis modernization at the U.S. Naval Sea Systems Command, as well as lead scientist for human-automation interaction and interface design at Aptima, Inc. She has also worked at SBC TRI (now known as AT&T Labs). Dr. Grier has a Ph.D. and M.A. in human factors/experimental psychology from the University of Cincinnati and a B.S. in psychology from Loyola University Chicago.
The webinar will be structured as a traditional HFES panel event, wherein panelists will speak first and then take questions from attendees.
Don’t forget to register! https://zoom.us/webinar/register/WN_1bCYSEBJSQevDd3tt_NTuA