
metaverse definition

What the metaverse will (and won’t) be, according to 28 experts

The metaverse (hopefully) won’t be the virtual world of ‘Snow Crash’ or ‘Ready Player One.’ It will likely be something more complex, diverse, and wild.

The metaverse concept clearly means very different things to different people. What exists right now is a series of embryonic digital spaces, such as Facebook’s Horizon, Epic Games’ Fortnite, Roblox’s digital space for gaming and game creation, and the blockchain-based digital world Decentraland, all of which have clear borders, different rules and objectives, and differing rates of growth.

TIFFANY ROLFE

different layers of realities that we can all be experiencing, even in the same environment or physical space. We’re already doing that with our phones to a certain extent—passively in a physical environment while mentally in a digital one. But we’ll see more experiences beyond your phone, where our whole bodies are fully engaged, and that’s where the metaverse starts to get interesting—we genuinely begin to explore and live in these alternate realities simultaneously.

RONY ABOVITZ, FOUNDER, MAGIC LEAP

Xverse

It will have legacy parts that look and feel like the web today, but it will have new nodes and capabilities that will look and feel like the Ready Player One Oasis (amazing gaming worlds), immersion leaking into our world (like my Magicverse concept), and every imaginable permutation of these. I feel that the Xverse will have gradients of sentience and autonomy, and we will have the emergence of synthetic life (things Sun and Thunder is working on) and a multitude of amazing worlds to explore. Building a world will become something everyone can do (like building a webpage or a blog) and people will be able to share richer parts of their external and inner lives at incredibly high-speed across the planet.

YAT SIU, COFOUNDER AND EXECUTIVE CHAIRMAN OF GAMING AND BLOCKCHAIN COMPANY ANIMOCA BRANDS

Reality will exist on a spectrum ranging from physical to virtual (VR), but a significant chunk of our time will be spent somewhere between those extremes, in some form of augmented reality (AR). Augmented reality will be a normal part of daily life. Virtual companions will provide information, commentary, updates and advice on matters relevant to you at that point in time, including your assets and activities, in both virtual and real spaces.

TIMONI WEST, VP OF AUGMENTED AND VIRTUAL REALITY, UNITY:

I think we can all agree our initial dreams of a fully immersive, separate digital world are not only unrealistic, but maybe not what we actually want. So I’ve started defining the metaverse differently to capture the zeitgeist: we’re entering an era where every computer we interact with, big or small, is increasingly world-aware. They can recognize faces, voices, hands, relative and absolute position, velocity, and they can react to this data in a useful way. These contextually aware computers are the path to unlocking ambient computing: where computers fade from the foreground to the background of everyday, useful tools. The metaverse is less of a ‘thing’ and more of a computing era. Contextual computing enables a multitude of new types of interactions and apps: VR sculpting tools and social hangouts, self-driving cars, robotics, smart homes.

SAM HAMILTON, HEAD OF COMMUNITY & EVENTS FOR BLOCKCHAIN-BASED METAVERSE CREATOR THE DECENTRALAND FOUNDATION

NITZAN MEKEL-BOBROV, CHIEF AI OFFICER, EBAY

as carbon is to the organic world, AI will be both the matrix that provides the necessary structural support and the material from which digital representation will be made. Of all the ways in which AI will shape the form of the metaverse, perhaps most essential is the role it will play in the physical-digital interface. Translating human actions into digital input–language, eye movement, hand gestures, locomotion–these are all actions which AI companies and researchers have already made tremendous progress on.

HUGO SWART, VICE PRESIDENT AND GM OF XR, QUALCOMM

Qualcomm views the metaverse as an ever-present spatial internet complete with personalized digital experiences that spans the physical and virtual worlds, where everything and everyone can communicate and interact seamlessly.

IBRAHIM BAGGILI, FOUNDING DIRECTOR, CONNECTICUT INSTITUTE OF TECHNOLOGY AT UNIVERSITY OF NEW HAVEN

As an active researcher in the security and forensics of VR systems, should the metaverse come into existence, we should explore and hypothesize the ways it will be misused.

CHITRA RAGAVAN, CHIEF STRATEGY OFFICER AT BLOCKCHAIN DATA ANALYTICS COMPANY ELEMENTUS 

I picture [the metaverse] almost like The Truman Show. Only, instead of walking into a television set, you walk into the internet and can explore any number of different realities

JOHN HANKE, CEO OF POKÉMON GO CREATOR NIANTIC

We imagine the metaverse as reality made better, a world infused with magic, stories, and functionality at the intersection of the digital and physical worlds.

CAROLINA ARGUELLES NAVAS, GLOBAL PRODUCT MARKETING, AUGMENTED REALITY, SNAP

Rather than building the “metaverse,” a separate and fully virtual reality that is disconnected from the physical world, we are focused on augmenting reality, not replacing it. We believe AR–or computing overlaid on the world around us–has a smoother path to mass adoption, but will also be better for the world than a fully virtual world.

URHO KONTTORI, COFOUNDER AND CTO OF AR/VR HEADSET MAKER VARJO

In the reality-based metaverse, we will be able to more effectively design products of the future, meet and collaborate with our colleagues far away, and experience any remote place in real-time.

CATHERINE ALLEN, CEO OF IMMERSIVE TECH RESEARCH CONSULTANCY LIMINA IMMERSIVE

I prefer to think of the metaverse as simply bringing our bodies into the internet.

BRANDS IN THE METAVERSE

VISHAL SHAH, VP OF METAVERSE, FACEBOOK

The metaverse isn’t just VR! Those spaces will connect to AR glasses and to 2D spaces like Instagram. And most importantly, there will be a real sense of continuity where the things you buy are always available to you.

SAYON DEB, MANAGER, MARKET RESEARCH, CONSUMER TECHNOLOGY ASSOCIATION

At its core will be a self-contained economy that allows individuals and businesses to create, own or invest in a range of activities and experiences.

NANDI NOBELL, SENIOR ASSOCIATE AT GLOBAL ARCHITECTURE AND DESIGN FIRM CALLISONRTKL

the metaverse experience can be altered from the individual’s point of view and shaped or curated by any number of agents—whether human or A.I. In that sense, the metaverse does not have an objective look beyond its backend. In essence, the metaverse, together with our physical locations, forms a spatial continuum.

NICK CHERUKURI, CEO AND FOUNDER OF MIXED REALITY GLASSES MAKER THIRDEYE

The AR applications of the metaverse are limitless and it really can become the next great version of the internet.

SAM TABAR, CHIEF STRATEGY OFFICER, BITCOIN MINING COMPANY BIT DIGITAL

It seems fair to predict that the actual aesthetic of any given metaverse will be determined by user demand. If users want to exist in a gamified world populated by outrageous avatars and fantastic landscapes then the metaverse will respond to that demand. Like all things in this world the metaverse will be market driven

+++++++++++++++
more on the metaverse in this IMS blog
https://blog.stcloudstate.edu/ims?s=metaverse

LiDAR data in AR & Apple Car

Apple researching how to use compressed LiDAR data in AR & ‘Apple Car’

https://appleinsider.com/articles/21/07/15/apple-researching-how-to-use-compressed-lidar-data-in-ar-apple-car/amp/

a vehicle equipped with a LIDAR system, a 3-D camera, or a 3-D scanner may include the vehicle’s direction and speed in a point cloud captured by the LIDAR system, the 3-D camera, or the 3-D scanner
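For illustration only, here is a minimal sketch of the idea described in the filing: a lidar frame that carries the capturing vehicle’s heading and speed alongside the raw points, so later processing (compression, AR overlay) can account for the vehicle’s own motion. All class and field names below are hypothetical, not Apple’s.

from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical illustration: a LiDAR frame that embeds the capturing vehicle's
# direction and speed next to the point data, so downstream AR/compression
# steps can compensate for ego-motion when merging scans.
@dataclass
class LidarFrame:
    timestamp: float                      # seconds since epoch
    heading_deg: float                    # vehicle direction of travel, degrees
    speed_mps: float                      # vehicle speed, meters per second
    points: List[Tuple[float, float, float]] = field(default_factory=list)  # x, y, z in meters

    def motion_per_frame(self, frame_interval_s: float = 0.1) -> float:
        """Distance the vehicle moves between frames, useful when aligning scans."""
        return self.speed_mps * frame_interval_s


frame = LidarFrame(timestamp=1626345600.0, heading_deg=92.5, speed_mps=13.4)
frame.points.append((1.2, -0.4, 0.1))
print(f"Vehicle moves ~{frame.motion_per_frame():.2f} m between 10 Hz frames")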

++++++++++++++++++++++++
more on LIDAR in this IMS blog
https://blog.stcloudstate.edu/ims?s=lidar

AR for Remote Access and Skill Retention

Manufacturers set the pace in the Augmented Reality race

Vuforia® Expert Capture Technology and Microsoft’s HoloLens glasses were used to create a virtual guide hosted in the cloud and then accessed by engineers in a number of factories across the UK

Industry has been searching for some time for an answer to an ageing workforce and the worrying scenario of traditional engineering skills being potentially lost forever.

AR can be used to record skills as engineers are performing them, saving them in the Cloud for generations to come – almost like a virtual technical library.

Importantly, these instructions can be delivered at the point of use, which has been proven to speed up learning.

+++++++++++++++++++++
more on AR in this IMS blog
https://blog.stcloudstate.edu/ims?s=Augmented+reality

Apple headset

Apple’s upcoming mixed reality headset will reportedly weigh less than an iPhone

Ming-Chi Kuo says it will weigh less than 150 grams

A weight of 150 grams would make Apple’s headset lighter than the Oculus Quest 2 (503 grams), Microsoft’s HoloLens 2 (645 grams), and the Valve Index (809 grams). It would be lighter than Google’s Daydream View, a fabric VR headset designed to hold your phone, which weighed 220 grams. The headset could even be lighter than your iPhone, given that the standard iPhone 12 weighs 164 grams.

The headset, codenamed “N301,” may also have 8K displays, eye-tracking technology, and more than a dozen cameras to both track your hand movements and capture footage that can be displayed inside the headset.

++++++++++++++++++
https://blog.stcloudstate.edu/ims/2021/03/25/tech-cycle-ar-vr/

campus wide infrastructure for immersive

Cabada, E., Kurt, E., & Ward, D. (2021). Constructing a campus-wide infrastructure for virtual reality. College & Undergraduate Libraries, 0(0), 1–24. https://doi.org/10.1080/10691316.2021.1881680

As an interdisciplinary hub, academic libraries are uniquely positioned to serve the full lifecycle of immersive environment needs, from development through archiving of successful projects. As an informal learning environment that is discipline neutral and high traffic, the academic library can serve as a clearinghouse for experimentation and transmission of best practices across colleges.

these foundational questions:
1. What VR infrastructure needs do faculty and researchers have?
2. Where is campus support lagging?
3. What current partnerships exist?
4. What and where is the campus level of interest in VR?
As marketing for workshops and programs can be challenging, particularly for large institutions, data was collected on where workshop participants learned about Step Into VR. The responses show that users learned of the workshops in a variety of ways, with email (41%) as the most cited method (Figure 4). These marketing emails were sent through distributed listservs that reached nearly the entire campus population. Facebook was called out specifically and represented the second largest marketing method at 29%, with the library website, friends, instructors, and digital signage representing the remaining marketing channels.
While new needs continue to emerge, the typical categories of consultation support observed include:
• Recommendations on hardware selection, such as choosing the best VR headset for viewing class content
• Guidance on developing VR applications that incorporate domain-specific curricular content
• Support for curricular integration of VR
• Recommendations on 360 capture media and equipment for documenting environments or experiences, such as the GoPro Fusion and Insta360 One X
• Advice on editing workflows, including software for processing and rendering of 360 content
p. 9
While many library patrons understand the basic concepts of recording video on a camera, 360 cameras present a large divergence from this process in several primary ways. The first is that a 360 camera captures every direction at once, so there is no inherent “focus,” and no side of a scene that is not recorded. This significantly changes how someone might compose a video recording, and also adds complexity to post-production, including how to orient viewers within a scene. The second area of divergence is that many of these devices, especially the high-end versions, record each lens to a separate data file or memory card, and these files need to be combined, or “stitched,” at a later time using software specific to the camera. A final concern is that data files for high-resolution 3D capture can be huge, requiring both large amounts of disk space and high-end processors and graphics cards for detailed editing to occur. For example, the Insta360 Pro 2 has 6 sensors all capable of data recording at 120 Mbps for a grand total of 720 Mbps. This translates into 43.2 gigabytes of data for every minute of recording.
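For context, a quick back-of-the-envelope calculation of those figures: 720 Mbps sustained for one minute comes to 43.2 gigabits, or roughly 5.4 gigabytes, so the per-minute storage volume depends on whether the quoted figure is read in bits or bytes.

# Back-of-the-envelope storage estimate for the Insta360 Pro 2 figures quoted above.
sensors = 6
mbps_per_sensor = 120                                  # megabits per second, per lens
total_mbps = sensors * mbps_per_sensor                 # 720 Mbps aggregate

seconds = 60
gigabits_per_minute = total_mbps * seconds / 1000      # 43.2 gigabits
gigabytes_per_minute = gigabits_per_minute / 8         # ~5.4 gigabytes

print(f"{total_mbps} Mbps total -> {gigabits_per_minute:.1f} gigabits "
      f"(~{gigabytes_per_minute:.1f} gigabytes) per minute of recording")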

3D scanning iPhone lidar

3D scanning with the iPhone lidar

https://albn.medium.com/3d-scanning-with-the-iphone-lidar-8cbd723fc9ab

The latest generation of iPhones (12 Pro and 12 Pro Max) comes equipped with a back-facing lidar camera.

Six go-to scanning apps, all of which come with direct export to Sketchfab:

• Scaniverse
• Polycam
• 3D scanner app
• Record3D
• SiteScape
• Everypoint
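Many of these apps can typically also export a local point-cloud or mesh file in addition to pushing to Sketchfab. As a minimal sketch, assuming the scan is exported as an ASCII PLY point cloud, a plain-Python reader might look like this (the file name is hypothetical):

def read_ascii_ply_points(path):
    """Minimal reader for an ASCII PLY point cloud (x, y, z as the first three vertex properties)."""
    with open(path, "r") as f:
        lines = f.read().splitlines()

    vertex_count = 0
    body_start = 0
    # Parse the header: grab the vertex count and find where the data begins.
    for i, line in enumerate(lines):
        if line.startswith("element vertex"):
            vertex_count = int(line.split()[-1])
        elif line.strip() == "end_header":
            body_start = i + 1
            break

    points = []
    # Read exactly vertex_count vertex lines; x, y, z are the first three values.
    for line in lines[body_start:body_start + vertex_count]:
        values = line.split()
        points.append(tuple(float(v) for v in values[:3]))
    return points


# Hypothetical file name; substitute whatever your scanning app exports.
points = read_ascii_ply_points("room_scan.ply")
print(f"Loaded {len(points)} points")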

++++++++++++++
more on 3D scanning in this IMS blog
https://blog.stcloudstate.edu/ims?s=3d+scanning

video recording lectures

https://www.chronicle.com/article/when-this-is-all-over-keep-recording-your-lectures

“Passively watching a recording is not as good as being an active participant in class, so these videos are a supplement, not a substitute. I’ll keep posting the videos as long as you keep coming to class.”

My note: I wonder if the instructor uses the “VideoQuiz” option in MediaSpace/Kaltura and places questions at important points in their video lecture recording, thus making the experience more engaging.

 

+++++++++++
more on vodcast in this IMS blog
https://blog.stcloudstate.edu/ims?s=vodcast
more on lecture capture in this IMS blog
https://blog.stcloudstate.edu/ims?s=video+lecture

AR the new knowledge management

Augmented Reality: The New Knowledge Management

https://www.forbes.com/sites/tomdavenport/2021/02/05/augmented-reality-the-new-knowledge-management/

Taqtile had a compelling vision for using the HoloLens for digital transformation for industrial frontline workers. The goal was to democratize expertise and make “everyone an expert.”

Taqtile’s content platform is called Manifest. It’s an enterprise platform for knowledge capture and reuse for industrial workers—a tool for structuring the “checklist” items for a particular task. It’s unlike anything we saw in the KM era. Manifest procedures contain instructions, photos, videos, pointers, and the like. If that’s not enough, it can also contact experts in real time—as with the BP Virtual Teamwork system.
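The article does not detail Manifest’s internal format, but the general shape of such a knowledge-capture procedure (ordered checklist steps, each with instructions and attached media, plus an escalation path to a live expert) can be sketched roughly as follows; all names are hypothetical and are not Taqtile’s actual schema:

from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical sketch of an AR work-instruction procedure in the style the
# article describes: ordered checklist steps with instructions and media,
# plus an option to escalate to a remote expert. Not Taqtile's actual schema.
@dataclass
class Step:
    instruction: str
    photos: List[str] = field(default_factory=list)   # URLs or asset IDs
    videos: List[str] = field(default_factory=list)
    anchor_hint: Optional[str] = None                  # where in the physical scene to pin the step

@dataclass
class Procedure:
    title: str
    steps: List[Step] = field(default_factory=list)
    expert_contact: Optional[str] = None               # contacted when a step cannot be completed

    def next_step(self, completed: int) -> Optional[Step]:
        """Return the next step to display, or None when the checklist is done."""
        return self.steps[completed] if completed < len(self.steps) else None


procedure = Procedure(
    title="Pump seal replacement",
    steps=[Step(instruction="Isolate and lock out the pump"),
           Step(instruction="Remove the seal housing", photos=["housing_front.jpg"])],
    expert_contact="maintenance-lead@example.com",
)
print(procedure.next_step(completed=0).instruction)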

++++++++++++
more on AR in this IMS blog
https://blog.stcloudstate.edu/ims?s=Augmented+reality

AR HUD

CES 2021: Panasonic Brings AI-Enhanced Situational Awareness to Drivers with AR HUD

https://innotechtoday.com/panasonic-brings-ai-enhanced-situational-awareness-to-drivers-with-ar-hud/

The key features of the new AR HUD include:

  • Eye tracking technology – Projects information at the driver’s level of sight based on the driver’s eye position, eliminating a potential mismatch in the projected image when the driver moves their head (a rough geometric sketch of this correction appears after this list)
  • Advanced optics – Advanced optical design techniques provide expanded field-of-view (beyond 10 by 4 degrees) for virtual image distance of 10m or greater; detects pedestrians and objects through enhanced low light and nighttime view; tilted virtual image planes adjust visibility of objects in the driver’s field of view; embedded camera system allows discrete monitoring for the driver’s eye location.
  • AI navigation accuracy – AI-driven AR navigation technology detects and provides multi-color 3D navigation graphics that adjust with moving vehicle’s surroundings, displaying information like lane markers and GPS arrows where turns will occur and sudden changes such as collisions or cyclists in one’s path
  • Vibration control – Panasonic’s proprietary camera image stability algorithm enables AR icons to lock onto the driving environment regardless of the bumpiness of the road
  • Real-time situational awareness – Driving environment updates occur in real-time; ADAS, AI, AR environment information updates in less than 300 milliseconds
  • 3D imaging radar – Sensor-captured full 180-degree forward vision up to 90 meters and across approximately three traffic lanes
  • Compact size – Efficient compact packaging to fit any vehicle configuration
  • 4K resolution – Crisp, bright 4K resolution using advanced laser and holography technology, with static near-field cluster information and far-field image plane for AR graphic overlay
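To make the eye-tracking bullet above concrete, here is a rough sketch of the kind of correction involved: re-projecting a HUD label so it stays aligned with a real-world object as the driver’s eye position shifts. The variable names and geometry are illustrative assumptions, not Panasonic’s implementation.

# Illustrative geometry only, not Panasonic's implementation: keep a HUD label
# aligned with a real-world target by re-projecting it for the driver's current
# eye position. The virtual image plane sits a fixed distance ahead of the eye.
def hud_label_position(eye, target, image_plane_distance_m=10.0):
    """Project the target onto the virtual image plane along the eye->target ray.

    eye, target: (x, y, z) in meters, with z pointing forward from the driver.
    Returns the (x, y) position on the image plane where the label should be drawn.
    """
    ex, ey, ez = eye
    tx, ty, tz = target
    # Parameter t where the ray from the eye to the target crosses the plane
    # z = ez + image_plane_distance_m.
    t = image_plane_distance_m / (tz - ez)
    return (ex + t * (tx - ex), ey + t * (ty - ey))


# A pedestrian 30 m ahead and 2 m to the left; the driver's head then shifts 5 cm right.
print(hud_label_position(eye=(0.00, 1.2, 0.0), target=(-2.0, 1.0, 30.0)))
print(hud_label_position(eye=(0.05, 1.2, 0.0), target=(-2.0, 1.0, 30.0)))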

+++++++++++++
https://blog.stcloudstate.edu/ims?s=augmented+reality

