Augmented Reality - Research

Augmented Reality (AR) is an interactive experience of a real-world environment in which the objects that reside in the real world are "augmented" by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. The overlaid sensory information can be constructive (adding to the natural environment) or destructive (masking the natural environment), and it is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one. Augmented reality is related to two largely synonymous terms: mixed reality and computer-mediated reality.


The primary value of augmented reality is that it brings components of the digital world into a person's perception of the real world, and does so not as a simple display of data, but through the integration of immersive sensations that are perceived as natural parts of an environment. The first functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the Virtual Fixtures system developed at the U.S. Air Force's Armstrong Laboratory in 1992. 


Augmented reality is used to enhance natural environments or situations and offer perceptually enriched experiences. With the help of advanced AR technologies such as computer vision and object recognition, information about the user's surrounding real world becomes interactive and digitally manipulable. Information about the environment and its objects is overlaid on the real world. This information can be virtual or real, for example seeing other real, sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space.
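
To make that alignment concrete, the sketch below shows the basic geometry involved: a point with a known 3D position is projected through a simple pinhole camera model into 2D display coordinates, which is the step that keeps a virtual overlay registered to the physical object it annotates. The function, its parameters, and the numbers are illustrative assumptions, not part of any Pegasus system.

```python
import numpy as np

def project_to_display(point_world, cam_position, cam_rotation, focal_px, display_center):
    """Project a 3D world-space point into 2D display (pixel) coordinates.

    Illustrative pinhole model: cam_rotation is a 3x3 world-to-camera rotation
    matrix (camera frame: x right, y down, z forward), focal_px is the focal
    length in pixels, display_center is (cx, cy). Returns None when the point
    is behind the viewer.
    """
    # Transform the world point into the camera/display frame.
    p_cam = cam_rotation @ (np.asarray(point_world, dtype=float)
                            - np.asarray(cam_position, dtype=float))
    if p_cam[2] <= 0:          # behind the camera: nothing to draw
        return None
    # Pinhole projection: scale x and y by depth, then shift to the pixel origin.
    u = focal_px * p_cam[0] / p_cam[2] + display_center[0]
    v = focal_px * p_cam[1] / p_cam[2] + display_center[1]
    return (float(u), float(v))

# Example: a navigation point 120 m ahead and 15 m left of the viewer.
print(project_to_display([-15.0, 0.0, 120.0],
                         cam_position=[0.0, 0.0, 0.0],
                         cam_rotation=np.eye(3),
                         focal_px=900.0,
                         display_center=(640.0, 360.0)))
```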


As the technology continues to evolve, Pegasus Aerospace is heavily invested in research into Augmented Reality capabilities, systems, and integration.

Pegasus Advanced HUD provides a unique turnkey solution addressing the Jet-Wing pilot's need to properly visualize terrain, navigation, traffic (ADS-B), instrument, weather, and airspace information, with access to vital safety procedures and protocols. Pilots of conventional aircraft constantly scan their instruments for safety checks and confirmations. This is especially necessary for the Jet-Wing, whose pilot does not have a cockpit in which all of the pertinent data is presented.


Using glass technologies similar to those of Osterhout Design Group, Epson Moverio, Recon Jet, Microsoft HoloLens, and other head-mounted displays, Pegasus Advanced HUD is the first to bring Augmented Reality to pilots, providing unparalleled 3D, 360° situational awareness in the cockpit, regardless of visibility or time of day. Pegasus Advanced HUD is the missing link between pilot and Jet-Wing.
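
One ingredient of 360° situational awareness is knowing where a point of interest sits relative to where the pilot is currently looking, so the display can cue traffic or waypoints even when they are outside the forward field of view. The sketch below is a minimal illustration of that idea, assuming a great-circle bearing calculation and a head tracker that reports yaw relative to the aircraft heading; the function name and inputs are hypothetical, not the Pegasus Advanced HUD implementation.

```python
import math

def relative_bearing(own_lat, own_lon, own_heading_deg, head_yaw_deg,
                     tgt_lat, tgt_lon):
    """Bearing to a target relative to where the pilot is looking.

    own_heading_deg is the aircraft heading; head_yaw_deg is the head-tracker
    yaw relative to that heading. Returns degrees in [-180, 180), where 0
    means "straight ahead in the current view".
    """
    # Great-circle initial bearing from own position to the target.
    phi1, phi2 = math.radians(own_lat), math.radians(tgt_lat)
    dlon = math.radians(tgt_lon - own_lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    bearing = math.degrees(math.atan2(y, x)) % 360.0

    # Remove aircraft heading and head orientation, then wrap to [-180, 180).
    return (bearing - own_heading_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# Traffic roughly due east of a north-facing pilot who is looking 30° right:
print(round(relative_bearing(30.40, -86.50, 0.0, 30.0, 30.40, -86.40), 1))
```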

System Specs

  • Self-contained smart glasses or custom designed helmet HUD glass

  • All-around 360° display capability

  • 3D Augmented Reality

  • Artificial horizon

  • Airspeed

  • Altitude

  • Glide Ratio

  • Heading

  • Engine Status
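
As a rough illustration of how the parameters above might travel together, the sketch below groups them into a single per-frame record and derives a glide ratio from airspeed and vertical speed (using airspeed as a stand-in for ground speed). The field names, units, and conversion are illustrative assumptions, not an actual Pegasus data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HudFrame:
    """One frame of flight data rendered as HUD symbology (illustrative only)."""
    pitch_deg: float           # artificial horizon
    roll_deg: float
    airspeed_kt: float
    altitude_ft: float
    vertical_speed_fpm: float  # negative = descending
    heading_deg: float
    engine_ok: bool

    def glide_ratio(self) -> Optional[float]:
        """Horizontal distance covered per unit of altitude lost.

        Converts knots to ft/min (1 kt ≈ 101.27 ft/min) and divides by the
        sink rate. Returns None while climbing or in level flight.
        """
        if self.vertical_speed_fpm >= 0:
            return None
        horizontal_fpm = self.airspeed_kt * 101.27
        return horizontal_fpm / -self.vertical_speed_fpm

frame = HudFrame(pitch_deg=2.0, roll_deg=-5.0, airspeed_kt=90.0,
                 altitude_ft=3500.0, vertical_speed_fpm=-600.0,
                 heading_deg=270.0, engine_ok=True)
print(f"Glide ratio ≈ {frame.glide_ratio():.1f}:1")
```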


VFR navigation features

  • Airports

  • Navigation points

  • Cities, villages

  • Airspaces

  • Own Flight Plan

  • Terrain

  • ADS-B traffic


IFR navigation features

  • Airways

  • SIDs and STARs

  • Approaches
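
A simple way to think about the VFR and IFR feature sets above is as overlay layers that are enabled together depending on the flight rules in use. The sketch below is a hypothetical grouping for illustration only; the layer names do not correspond to a real Pegasus configuration.

```python
# Hypothetical layer groupings mirroring the VFR and IFR feature lists above.
VFR_LAYERS = {"airports", "navigation_points", "cities", "airspaces",
              "flight_plan", "terrain", "adsb_traffic"}
IFR_LAYERS = {"airways", "sids_stars", "approaches"}

def active_layers(mode: str) -> set:
    """Return the overlay layers to render for the selected flight rules."""
    return VFR_LAYERS if mode == "VFR" else VFR_LAYERS | IFR_LAYERS

print(sorted(active_layers("IFR")))
```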

Actual view through PAHUD

Advanced Head-Up Display
