Mark Billinghurst - "Future Directions for Augmented Reality" (Part 1)


  • 1.Future Directions for Augmented Reality Mark Billinghurst
  • 2.1968 – Sutherland/Sproull’s HMD
  • 3.https://www.youtube.com/watch?v=NtwZXGprxag
  • 4.Star Wars - 1977
  • 5.Augmented Reality • Combines Real and Virtual Images • Both can be seen at the same time • Interactive in real-time • The virtual content can be interacted with • Registered in 3D • Virtual objects appear fixed in space Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
  • 6.2008 - CNN https://www.youtube.com/watch?v=v7fQ_EsMJMs
  • 7.Augmented Reality Applications
  • 8.Augmented Reality in 2018 • Large growing market • $1.2B USD in 2016, $3B in 2017 • Many available devices • HMD, phones, tablets, HUDs • Robust developer tools • Vuforia, ARToolKit, Unity, Wikitude, etc • Large number of applications • > 250K developers, > 100K mobile apps • Strong research/business communities • ISMAR, AWE conferences, AugmentedReality.org, etc
  • 9.AR Revenue Projections • > $80 Billion by 2021, > 3x VR Revenue (Digi-Capital)
  • 10.Future directions
  • 11.Key Enabling Technologies • Combines Real and Virtual Images → Display Technology • Interactive in real-time → Interaction Technologies • Registered in 3D → Tracking Technologies
  • 12.Display Technology • Past • Bulky Head mounted displays • Current • Handheld, lightweight head mounted • Future • Projected AR • Wide FOV see through • Retinal displays • Contact lens
  • 13.Wide FOV See-Through (3+ years) • Waveguide techniques • Thin, wider FOV • Socially acceptable • Pinlight Displays • LCD panel + point light sources • 110 degree FOV • Lumus DK40 • Maimone, A., Lanman, D., Rathinavel, K., Keller, K., Luebke, D., & Fuchs, H. (2014). Pinlight displays: wide field of view augmented reality eyeglasses using defocused point light sources. In ACM SIGGRAPH 2014 Emerging Technologies (p. 20). ACM.
  • 14.https://www.youtube.com/watch?v=P407DFm0PFQ
  • 15.Retinal Displays (5+ years) • Photons scanned into eye • Infinite depth of field • Bright outdoor performance • Overcome visual defects • True 3D stereo with depth modulation • Microvision (1993-) • Head mounted monochrome • MagicLeap (2013-)
  • 16.Contact Lens (10-15+ years) • Contact Lens only • Unobtrusive • Significant technical challenges • Power, data, resolution • Babak Parviz (2008) • Contact Lens + Micro-display • Wide FOV • Socially acceptable • Innovega (innovega-inc.com) • http://spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens/
  • 17.Interaction Technology • Past • Limited interaction • Viewpoint manipulation • Present • Screen based, simple gesture • Tangible interaction • Future • Natural gesture, Multimodal • Intelligent Interfaces • Physiological/Sensor based Interaction
  • 18.Natural Gesture (2-5 years) • Freehand gesture input • Depth sensors for gesture capture • Rich two handed gestures • E.g. Microsoft Research Hand Tracker • 3D hand tracking, 30 fps, single sensor • Commercial Systems • Meta, HoloLens, Oculus, Leap Motion, Intel, etc. • Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Kim, D., Leichter, I., ... & Izadi, S. (2015, April). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proc. CHI (Vol. 8). (See the hand-tracking gesture sketch after this list.)
  • 19.https://www.youtube.com/watch?v=LblxKvbfEoo
  • 20.Multimodal Input (5-10+ years) • Combine gesture and speech input • Gesture good for qualitative input • Speech good for quantitative input • Support combined commands • "Put that there" + pointing • E.g. HIT Lab NZ multimodal input • 3D hand tracking, speech, multimodal fusion • Completing tasks faster with MMI, fewer errors • Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with Augmented-Reality Interfaces. IEEE Computer Graphics and Applications, (1), 77-80. (See the multimodal fusion sketch after this list.)
  • 21.HIT Lab NZ Multimodal Input
  • 22.Tracking Technology • Past • Location based, marker based, magnetic/mechanical • Present • Image based, hybrid tracking • Future • Ubiquitous • Model based • Environmental Tracking
  • 23.Environmental Tracking (1-3+ yrs) • Environment capture • Use depth sensors to capture scene & track from model • InfiniTAM (www.robots.ox.ac.uk/~victor/infinitam/) • Real time scene capture on mobiles, dense or sparse capture • (See the model-tracking sketch after this list.)
  • 24.InfiniTAM Demo https://www.youtube.com/watch?v=47zTHHxJjQU
  • 25.Ubiquity6 - AR Cloud Tracking https://www.youtube.com/watch?v=LxQY_7COzQg
  • 26.Wide Area Outdoor Tracking (5+ yrs) • Process • Combine panoramas into point cloud model (offline) • Initialize camera tracking from point cloud • Update pose by aligning camera image to point cloud • Accurate to 25 cm, 0.5 degree over very wide area • Ventura, J., & Hollerer, T. (2012). Wide-area scene mapping for mobile visual tracking. In Mixed and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 3-12). IEEE. (See the pose-from-point-cloud sketch after this list.)
  • 27.Wide Area Outdoor Tracking
  • 28.Social Acceptance • People don’t want to look silly • Only 12% of 4,600 adults would be willing to wear AR glasses • 20% of mobile AR browser users experience social issues • Acceptance more due to Social than Technical issues
  • 29.Example: TAT Augmented ID
  • 30.TAT Augmented ID
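Hand-tracking gesture sketch (referenced from slide 18). A depth-sensor hand tracker of the kind the slide lists typically outputs 3D joint positions per frame; a minimal recognizer built on top of such output can flag a pinch when the thumb and index fingertips come close. This is an assumed sketch, not code from the talk: the joint names, data layout, and 3 cm threshold are illustrative choices.

```python
# Minimal sketch (assumed, not from the talk): pinch detection on top of
# 3D hand-joint positions produced by a depth-sensor hand tracker.
import math

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinch(joints, threshold_m=0.03):
    """Report a pinch when thumb tip and index tip are closer than ~3 cm."""
    return distance(joints["thumb_tip"], joints["index_tip"]) < threshold_m

# Example frame from a hypothetical tracker running at 30 fps:
frame = {"thumb_tip": (0.02, 0.11, 0.40), "index_tip": (0.03, 0.12, 0.41)}
print(is_pinch(frame))   # True: the fingertips are about 1.7 cm apart
```

A real gesture set would add per-joint confidence, temporal smoothing, and two-handed combinations, but the per-frame decision stays this simple once reliable joint positions are available.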
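Multimodal fusion sketch (referenced from slide 20). The "Put that there" example requires the system to bind each deictic word to whatever the user was pointing at when the word was spoken. The sketch below is an assumed timestamp-based fusion, not the HIT Lab NZ implementation; the event format and word list are illustrative.

```python
# Minimal sketch (assumed design): bind deictic words in a speech command
# to the pointing gesture closest in time to when each word was spoken.
from dataclasses import dataclass

@dataclass
class PointingEvent:
    time: float          # seconds since the command started
    target: str          # object or location the pointing ray hit

def closest_pointing(events, t):
    """Pick the pointing event nearest in time to a spoken word."""
    return min(events, key=lambda e: abs(e.time - t))

def fuse(speech_words, pointing_events):
    """Replace each deictic word with whatever was pointed at as it was said."""
    resolved = []
    for word, t in speech_words:
        if word in ("that", "there", "this", "here"):
            resolved.append(closest_pointing(pointing_events, t).target)
        else:
            resolved.append(word)
    return " ".join(resolved)

# "Put that there" plus two pointing gestures:
speech = [("put", 0.0), ("that", 0.4), ("there", 1.1)]
gestures = [PointingEvent(0.45, "red cube"), PointingEvent(1.05, "table corner")]
print(fuse(speech, gestures))   # "put red cube table corner"
```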
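Model-tracking sketch (referenced from slide 23). Dense systems such as InfiniTAM track the camera by aligning each new depth frame against the scene model they are building. Production systems run projective point-to-plane ICP against a volumetric model; the toy below assumes correspondences are already known and uses a single Kabsch/SVD rigid alignment to illustrate the "track from the captured model" step.

```python
# Toy sketch (assumed simplification of dense model tracking): recover the
# rigid motion that aligns the current frame's points to the stored model.
import numpy as np

def rigid_align(src, dst):
    """Return R (3x3) and t (3,) minimising ||R @ src + t - dst|| (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic model points and a frame seen from a slightly moved camera:
model = np.random.rand(100, 3)
angle = np.deg2rad(5)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
frame = model @ R_true.T + np.array([0.10, 0.00, 0.02])

R, t = rigid_align(model, frame)
print(np.allclose(R, R_true), np.round(t, 3))   # True [ 0.1  0.    0.02]
```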
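Pose-from-point-cloud sketch (referenced from slide 26). In the Ventura & Hollerer pipeline, the online step aligns the live camera image to the offline point-cloud model; one standard way to express that alignment is a PnP solve from 2D-3D matches. The sketch below uses OpenCV's solvePnP as a stand-in for the paper's own method, with synthetic points, matches, and camera intrinsics; it is an illustration of the idea, not the published system.

```python
# Sketch (assumed stand-in): camera pose from 2D image features matched
# against an offline 3D point-cloud model, solved as a PnP problem.
import numpy as np
import cv2

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])   # assumed intrinsics
dist = np.zeros(5)

# Synthetic "point cloud model" and a ground-truth camera pose.
cloud = np.random.uniform(-2, 2, (50, 3))
cloud[:, 2] += 6                                  # keep points in front of the camera
rvec_true = np.array([[0.05], [0.10], [0.00]])
tvec_true = np.array([[0.30], [-0.10], [0.50]])

# The 2D observations a feature matcher would supply (here: exact projections).
img_pts, _ = cv2.projectPoints(cloud, rvec_true, tvec_true, K, dist)

# Pose update: align the camera image to the point cloud.
ok, rvec, tvec = cv2.solvePnP(cloud, img_pts, K, dist)
print(ok, np.round(tvec.ravel(), 3))   # True [ 0.3 -0.1  0.5]
```

In the wide-area setting the hard part is the matching itself and keeping drift within the reported 25 cm / 0.5 degree bounds; the pose solve shown here is the routine final step.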