Nreal, the Chinese creator of highly anticipated augmented reality smartglasses, announced a December 1st launch of its Nreal Light in Japan with KDDI. Following a successful introduction in Korea with LG U+ this summer, pre-orders for Nreal Light can now be placed on Japanese telco partner KDDI’s online store. The Nreal Light is compatible with 5G smartphones including the Sony Xperia 5 II and Samsung Galaxy Note20 Ultra.
The new glasses must be tethered to an Android phone, which acts as their controller. The Nreal Light went on sale in Korea on August 21 for just under $600, or $295 when bundled with a Galaxy Note20 from the LG U+ network in Korea.
Behind the Scenes: Microsoft’s Principal Researcher Eyal Ofek speaking about technical and social perspectives of XR
About this Event
The XR Bootcamp Open Lecture Series continues with Microsoft’s Principal Researcher Eyal Ofek!
Agenda:
Virtual Reality (VR) & Augmented Reality (AR) pose challenges and opportunities from both a technical and a social perspective. Digital, rather than physical, objects can now change our understanding of the world around us. It is a unique opportunity to change reality as we sense it.
Microsoft researchers are looking for new possibilities to extend our abilities when we are not bound by our physical limitations, enabling superhuman abilities on the one hand, and leveling the playing field for people with physical limitations on the other.
Dr. Ofek will describe efforts to design VR & AR applications that adjust to the user’s uncontrolled environment, enabling continuous use during work and leisure across a wide variety of environments. He will also review efforts to extend rendering to new capabilities such as haptic rendering.
His lecture will be followed by a Q&A session where you can ask all your questions about the topic.
Lead Instructor:
Eyal Ofek is a principal researcher at the Microsoft Research lab in Redmond, WA. His research interests include Augmented Reality (AR)/Virtual Reality (VR), haptics, interactive projection mapping, and computer vision for human-computer interaction. He is also the Specialty Chief Editor of Frontiers in Virtual Reality for the area of Haptics and an Associate Editor of IEEE Computer Graphics and Applications (CG&A).
Prior to joining Microsoft Research, he obtained his Ph.D. at the Hebrew University of Jerusalem and founded a couple of companies in computer graphics, including one behind a successful drawing and photo-editing application and another that developed the world’s first time-of-flight video cameras, which became a basis for the HoloLens depth camera.
This event is part of the Global XR Bootcamp event:
The Global XR Bootcamp 2020 will be the biggest community-driven, FREE, online Virtual, Augmented and Mixed Reality event in the world! Join us on YouTube or AltspaceVR for a 24-hour live stream with over 50 high-quality talks, panels and sessions. Meet your fellow XR enthusiasts in our Community Zone, and win amazing prizes, from vouchers to XR hardware.
The founder of a U.K.-based company called Envisics believes that he’s found the perfect use case for real-life augmented reality holograms.
However, the biggest reason in-car AR could succeed is this: It solves a problem that actually exists. AR headsets could wind up being the biggest thing since the smartphone.
ForceBot is a four-year project to develop an exoskeleton for commercial and enterprise applications using HaptX’s microfluidic touch-feedback technology to simulate virtual objects. The NSF grant will be distributed among the participating companies, each contributing individual components to ForceBot, and the resulting IP will then be used for commercial products.
Beyond thrilled to finally share a sneak peek of our Facebook partnership with Ray-Ban! Our first smart glasses will launch next year, and that’s just the beginning… The future will be a classic and it’s coming in 2021 😎 pic.twitter.com/l9992ZQGoy
Like all of 8th Wall’s WebAR capabilities, projects created using Curved Image Targets work across iOS and Android devices with an estimated reach of nearly 3 billion smartphones, and can be immediately experienced with the tap of a link or by scanning a QR code.
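For readers curious what a browser-based image-target project looks like in practice, here is a minimal sketch of reacting to 8th Wall tracking events via its camera pipeline module API. The target name ('soda-can'), the canvas id, and the event payload handling are illustrative assumptions, not details from the announcement; a real project also registers 8th Wall's standard rendering pipeline modules, which are omitted here for brevity.

```typescript
// Minimal sketch (assumptions noted above): the 8th Wall web script, loaded with a
// valid app key, exposes the global XR8 object in the browser.
declare const XR8: any

// Called when a target is first detected or its pose updates.
const showContent = ({detail}: any) => {
  console.log('image target found/updated:', detail)
}

// Called when tracking of a target is lost.
const hideContent = ({detail}: any) => {
  console.log('image target lost:', detail)
}

XR8.addCameraPipelineModule({
  name: 'curved-target-demo',
  // Subscribe to the image-target lifecycle events emitted by the XR engine.
  listeners: [
    {event: 'reality.imagefound', process: showContent},
    {event: 'reality.imageupdated', process: showContent},
    {event: 'reality.imagelost', process: hideContent},
  ],
})

// Restrict tracking to a hypothetical curved target uploaded in the project console.
XR8.XrController.configure({imageTargets: ['soda-can']})

// Start the camera feed on a canvas element ('camerafeed' is an assumed id).
XR8.run({canvas: document.getElementById('camerafeed') as HTMLCanvasElement})
```

Because the experience runs in the mobile browser, this is the whole distribution story: the user taps a link or scans a QR code, the page loads, and tracking starts, with no app install involved.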