An Experiment with ARCore

Last year Google’s ARCore SDK came out for supported Android phones. It brought features previously lacking from other augmented reality applications: it does more than just put 3D objects in front of a camera view; it tracks the phone’s position through an arbitrary 3D space.
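That tracking is exposed as a per-frame camera pose. Here’s a minimal Kotlin sketch against the ARCore Java API of what reading it looks like; the `logDevicePosition` wrapper is just my illustration, not anything from a real app:

```kotlin
import android.util.Log
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Called once per rendered frame. ARCore re-estimates the device's
// world-space pose as the user walks around; no markers required.
fun logDevicePosition(session: Session) {
    val frame = session.update()
    val camera = frame.camera
    if (camera.trackingState == TrackingState.TRACKING) {
        val p = camera.pose
        // Translation components are meters in ARCore's world frame,
        // which persists for as long as tracking is maintained.
        Log.d("ARTour", "device at (${p.tx()}, ${p.ty()}, ${p.tz()})")
    }
}
```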

That got me thinking that we could use it to satisfy something a lot of people around here have been asking for: AR tours. I’ve been asked several times about the possibility of making an AR guide for the campus.

My first experiment was set around the VizLab itself.

There are a few things going on around the lab space: 3D models aligned with real-world features, information markers located near those features, and Wednesday, our AR tour guide, whose dialogue is driven by the user’s proximity to different parts of the lab.
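Wednesday’s proximity triggers are nothing exotic: just a distance check between the tracked device pose and a list of named spots in the lab. A rough sketch of the idea, where `DialogueZone` and `dialogueFor` are hypothetical names of my own, not the actual app’s code:

```kotlin
import com.google.ar.core.Pose
import kotlin.math.sqrt

// Hypothetical zone type: a named spot in the lab with a trigger radius
// (all coordinates in ARCore world space, meters).
data class DialogueZone(
    val name: String,
    val x: Float, val y: Float, val z: Float,
    val radiusMeters: Float,
    val line: String
)

// Return the dialogue line for the first zone the user is standing in, if any.
fun dialogueFor(devicePose: Pose, zones: List<DialogueZone>): String? =
    zones.firstOrNull { zone ->
        val dx = devicePose.tx() - zone.x
        val dy = devicePose.ty() - zone.y
        val dz = devicePose.tz() - zone.z
        sqrt(dx * dx + dy * dy + dz * dz) <= zone.radiusMeters
    }?.line
```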

All of this is possible because ARCore tracks the phone’s position in space. The first, and greatest, challenge is anchoring the virtual scene to the real-world space. This is done by having the user pick two points: one for the origin, and one for the +X axis direction. Once those are set, the virtual space aligns itself with the real one. Future versions will experiment with different ways of finding and attaching those anchor points in a more automagic fashion.
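To make that alignment step concrete, here’s roughly the math involved, sketched in Kotlin with ARCore’s Pose type. The function name is mine, and it assumes the virtual scene shares ARCore’s gravity-aligned +Y up axis, so only a heading rotation and a translation are needed:

```kotlin
import com.google.ar.core.Pose
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

// Build a Pose that maps virtual-scene coordinates into the real room,
// given the two user-picked points (both in ARCore world coordinates).
fun buildRoomAlignment(origin: Pose, plusX: Pose): Pose {
    val dx = plusX.tx() - origin.tx()
    val dz = plusX.tz() - origin.tz()
    // Angle about +Y that carries the virtual +X axis onto the picked
    // direction (ARCore is +X right, +Y up, -Z forward).
    val heading = atan2(-dz, dx)
    // Quaternion for a rotation of `heading` about the Y axis.
    val rotation = Pose.makeRotation(0f, sin(heading / 2), 0f, cos(heading / 2))
    val translation = Pose.makeTranslation(origin.tx(), origin.ty(), origin.tz())
    // Rotate the virtual scene first, then move it onto the origin point.
    return translation.compose(rotation)
}
```

With that pose in hand, alignment.transformPoint(floatArrayOf(x, y, z)) carries any virtual-scene coordinate into real-room coordinates, and inverse() goes the other way.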
