Augmented Reality Sandtable


The Augmented Reality Sandtable (ARES) is an interactive, digital sand table that uses augmented reality (AR) technology to create a 3D battlespace map. It was developed by the Human Research and Engineering Directorate (HRED) at the Army Research Laboratory (ARL) to combine the positive aspects of traditional military sand tables with the latest digital technologies, with the aim of better supporting soldier training and offering new possibilities for learning.[1] It uses a projector to display a topographical map onto the sand in a regular sandbox, along with a motion sensor that tracks changes in the layout of the sand and adjusts the computer-generated terrain display accordingly.[2][3]

An ARL study conducted in 2017 with 52 active-duty military personnel (36 males and 16 females) found that participants who used ARES spent less time setting up the table than participants who used a traditional sand table. ARES also produced a lower perceived workload score, as measured by the NASA Task Load Index (NASA-TLX), than the traditional sand table. However, there was no significant difference between the two groups in post-test knowledge scores for recreating the visual map.[4]

Development

The ARES project was one of 25 ARL initiatives in development from 1995 to 2015 that focused on visualizing spatial data on virtual or sand table interfaces.[1][5] It was developed by HRED's Simulation and Training Technology Center (STTC), with Charles Amburn as the principal investigator.[1] Collaborators on ARES included Dignitas Technologies, Design Interactive (DI), the University of Central Florida's Institute for Simulation and Training, and the U.S. Military Academy at West Point.[6]

ARES was designed largely as a tangible user interface (TUI), in which digital information is manipulated using physical objects such as a person's hand. It was constructed from commercial off-the-shelf components, including a projector, a laptop, an LCD monitor, and Microsoft's Xbox Kinect sensor, together with government-developed ARES software. With the projector and Kinect sensor both mounted above the sandbox and facing down onto its surface, the projector provides a digital overlay over the sand while the Kinect sensor scans the surface of the map to detect user gestures inside the boundaries of the sandbox.[1]
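The basic sensing-and-projection idea can be illustrated with a short sketch. The Python example below is not the government-developed ARES software; it is a minimal illustration that assumes a downward-facing depth sensor (such as a Kinect) whose depth frame is converted into a color-mapped elevation image with contour lines, which a projector could then display on the sand. The sensor-to-table distance, contour interval, and simulated depth frame are illustrative assumptions.

    # Minimal sketch (not the ARES software): turn a depth frame from a
    # downward-facing sensor into an elevation overlay with contour lines.
    import numpy as np
    import matplotlib.pyplot as plt

    SENSOR_TO_TABLE_MM = 1000.0  # assumed distance from the sensor to the flat sand surface

    def depth_to_height(depth_mm: np.ndarray) -> np.ndarray:
        """Convert raw depth readings (mm from the sensor) into sand height above the table."""
        height = SENSOR_TO_TABLE_MM - depth_mm
        return np.clip(height, 0.0, None)  # readings below the baseline are treated as flat

    def render_overlay(height_mm: np.ndarray, interval_mm: float = 20.0) -> None:
        """Draw a color-mapped elevation image with contour lines at a fixed interval."""
        levels = np.arange(0.0, height_mm.max() + interval_mm, interval_mm)
        plt.imshow(height_mm, cmap="terrain", origin="upper")
        plt.contour(height_mm, levels=levels, colors="black", linewidths=0.5)
        plt.axis("off")
        plt.show()  # a deployed system would route this image to the projector instead

    if __name__ == "__main__":
        # Simulated depth frame: a smooth mound of sand plus sensor noise.
        y, x = np.mgrid[0:240, 0:320]
        mound = 150.0 * np.exp(-(((x - 160) / 60.0) ** 2 + ((y - 120) / 50.0) ** 2))
        depth = SENSOR_TO_TABLE_MM - mound + np.random.normal(0.0, 2.0, size=mound.shape)
        render_overlay(depth_to_height(depth))

In a live setup the rendered overlay would also need to be geometrically calibrated to the projector so that the projected contours align with the physical sand; the sketch omits that calibration step.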

During development, researchers explored the possibility of incorporating ideas such as multi-touch surfaces, 3D holographic displays, and virtual environments. However, budget restrictions limited the implementation of such ideas.[5]

In September 2014, during the Modern Day Marine exhibition in Quantico, Virginia, researchers from ARL showcased ARES for the first time.[7]

Uses

According to a 2015 technical report by ARL scientists, ARES has the following capabilities.[1]

  • Images, maps, and videos can be projected onto the sand table from a top-down point of view in real time.
  • Terrain and scenarios created in ARES can be imported into different simulation applications, such as Virtual Battlespace 3 (VBS3) and One Semi-Automated Forces (OneSAF).
  • Military symbols and graphics can be created, labeled, and placed onto the sand table to set up different scenarios.
  • Visual aids such as color schemes and contour lines can be used to guide users in shaping the sand to replicate a previously saved 3D terrain.
  • Users can navigate the different menus available in ARES with their hand, which functions as a mouse pointer.
  • The ARES sensor can detect and track the user's hands to identify where the user is pointing on the sand table.
  • A web camera can be used to communicate with other users and provide a top-down view of the sand table to collaborators.
  • AR-based tablet apps and specially made note cards can be used to project images of different vehicles onto the terrain.

References

  1. Amburn, Charles; Vey, Nathan; Boyce, Michael; Mize, Jerry (October 2015). "The Augmented REality Sandtable (ARES)". The US Army Research Laboratory.
  2. "Microsoft's Kinect aids in 'augmented reality sand' mapping tool for Marines, Army". Marine Corps Times. September 23, 2014. Retrieved August 2, 2018.
  3. Mufson, Beckett (November 5, 2014). "Design Digital Terrain with the Army's Projection-Mapped Sandtable". Vice Creators. Retrieved August 2, 2018.
  4. Hale, Kelly; Riley, Jennifer; Amburn, Charles; Vey, Nathan (January 1, 2018). "Evaluation of Augmented REality Sandtable (ARES) during Sand Table Construction". US Army Research Laboratory via Defense Technical Information Center.
  5. Garneau, Christopher; Boyce, Michael; Shorter, Paul; Vey, Nathan; Amburn, Charles (February 1, 2018). "The Augmented Reality Sandtable (ARES) Research Strategy". US Army Research Laboratory via Defense Technical Information Center.
  6. Glass, Dolly (December 18, 2014). "Army and Marines research sand table technology". Team Orlando. Retrieved August 2, 2018.
  7. Hedelt, Carden (September 24, 2014). "New Sand Table Technology Featured at Modern Day Marine". CHIPS. Retrieved August 2, 2018.