
global upskilling and universities

https://www.universityworldnews.com/post.php?story=20210129110449887

Upskilling for Shared Prosperity predicts that upskilling and reskilling could propel the transition to an economy where human labour is increasingly complemented and augmented – rather than replaced – by new technology, thus improving the overall quality of jobs.

Inertia in educational systems

The new Davos report points out that education systems – in particular secondary and tertiary education – must act now and embrace change if they are to play a central role in a comprehensive upskilling agenda.

Several higher education areas urgently need addressing:

• Curricula:
• Technology:
• Education providers:
• Qualifications, experiences and recognition:
• Connectivity:
• Credentialing:

Role of universities

“New arrangements – shorter, modular, part-time, with mixtures of in-person teaching and asynchronous self-directed learning – have to be developed. And to do that in a high-quality manner requires an enormous investment.”

+++++++++++
more on 4th industrial revolution in this IMS blog
https://blog.stcloudstate.edu/ims?s=industrial+revolution

AR the new knowledge management

Augmented Reality: The New Knowledge Management

https://www.forbes.com/sites/tomdavenport/2021/02/05/augmented-reality-the-new-knowledge-management/

Taqtile had a compelling vision for using the HoloLens for digital transformation of industrial frontline workers. The goal was to democratize expertise and make “everyone an expert.”

Taqtile’s content platform is called Manifest. It’s an enterprise platform for knowledge capture and reuse for industrial workers—a tool for structuring the “checklist” items for a particular task. It’s unlike anything we saw in the KM era. Manifest procedures contain instructions, photos, videos, pointers, and the like. If that’s not enough, it can also contact experts in real time—as with the BP Virtual Teamwork system.
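Taqtile has not published Manifest’s internal schema, but the core idea described above – an ordered, media-rich checklist with a fallback to a live expert – can be sketched roughly as follows. All class and field names here are illustrative guesses, not Taqtile’s actual API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a Manifest-style procedure: an ordered
# "checklist" whose steps carry mixed media, plus an escalation
# path to a human expert when the checklist is not enough.

@dataclass
class Step:
    instruction: str
    media: list = field(default_factory=list)   # photo/video/pointer URLs
    completed: bool = False

@dataclass
class Procedure:
    task_name: str
    steps: list = field(default_factory=list)
    expert_contact: str = ""  # reach an expert in real time if needed

    def next_step(self):
        """Return the first unfinished step, or None when the task is done."""
        return next((s for s in self.steps if not s.completed), None)

calibration = Procedure(
    task_name="Pump calibration",
    steps=[Step("Shut off inlet valve", media=["valve_photo.jpg"]),
           Step("Attach pressure gauge")],
    expert_contact="remote-expert@example.com",
)
calibration.steps[0].completed = True
print(calibration.next_step().instruction)  # -> Attach pressure gauge
```

The point of the sketch is the structure, not the fields: each task is captured once by an expert as a sequence of steps, then replayed by any frontline worker, with the expert contact as a live escape hatch.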

++++++++++++
more on AR in this IMS blog
https://blog.stcloudstate.edu/ims?s=Augmented+reality

Apple mixed-reality headset

Apple mixed-reality headset to have two 8K displays, cost $3,000 – The Information (via r/gadgets)

Apple’s known interest in this field has so far focused more on augmented reality (AR) than virtual reality (VR), but the recent reports point to a mixed-reality device, which would be mostly VR but including some real-world elements.

+++++++++++++
more on Apple Glass in this IMS blog
https://blog.stcloudstate.edu/ims?s=apple+glass

Apple AAPL

Apple (AAPL) is expected to launch its first virtual reality (VR) headset in 2022, a forerunner of its much-anticipated augmented reality (AR) glasses.

Along with VR features like a completely simulated 3D digital environment, the device might include limited AR functionality.

Apple’s entry will intensify competition in the VR device market, which includes devices such as Facebook’s FB Oculus Quest 2, Sony’s SNE PlayStation VR, Microsoft’s MSFT Windows Mixed Reality and HTC’s Vive and Vive Pro.

Global spending on AR and VR is expected to reach $72.8 billion in 2024, up from $12 billion in 2020, reflecting a CAGR of 54%.
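As a quick sanity check on that figure, the standard compound-annual-growth-rate formula is (end / start)^(1 / years) − 1. On the 2020-to-2024 endpoints alone it comes out a bit above the quoted 54%, which presumably rests on a slightly different base year in the underlying forecast:

```python
# CAGR = (end_value / start_value) ** (1 / years) - 1
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# $12B (2020) to $72.8B (2024) spans four growth years:
print(f"{cagr(12.0, 72.8, 4):.1%}")  # ~57% per year on these endpoints
```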

+++++++++++++
more on Apple Glass in this IMS blog
https://blog.stcloudstate.edu/ims?s=apple+glass

AR HUD

CES 2021: Panasonic Brings AI-Enhanced Situational Awareness to Drivers with AR HUD

https://innotechtoday.com/panasonic-brings-ai-enhanced-situational-awareness-to-drivers-with-ar-hud/

The key features of the new AR HUD include:

  • Eye tracking technology – Projects information at the driver’s level of sight based on eye position, eliminating potential misalignment of the projected image when the driver moves their head
  • Advanced optics – Advanced optical design techniques provide an expanded field of view (beyond 10 by 4 degrees) at a virtual image distance of 10m or greater; detects pedestrians and objects through enhanced low-light and nighttime views; tilted virtual image planes adjust visibility of objects in the driver’s field of view; an embedded camera system allows discreet monitoring of the driver’s eye location.
  • AI navigation accuracy – AI-driven AR navigation technology detects and provides multi-color 3D navigation graphics that adjust with moving vehicle’s surroundings, displaying information like lane markers and GPS arrows where turns will occur and sudden changes such as collisions or cyclists in one’s path
  • Vibration control – Panasonic’s proprietary camera image stability algorithm enables AR icons to lock onto the driving environment regardless of the bumpiness of the road
  • Real-time situational awareness – Driving environment updates occur in real-time; ADAS, AI, AR environment information updates in less than 300 milliseconds
  • 3D imaging radar – Sensor-captured full 180-degree forward vision up to 90 meters and across approximately three traffic lanes
  • Compact size – Efficient compact packaging to fit any vehicle configuration
  • 4K resolution – Crisp, bright 4K resolution using advanced laser and holography technology, with static near-field cluster information and far-field image plane for AR graphic overlay
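Panasonic has not published the algorithm behind the eye-tracking feature, but the parallax geometry it has to solve can be sketched. Because the virtual image plane (10 m, per the spec above) sits closer than the objects being annotated, a lateral shift of the driver’s eye requires a proportional shift of the projected image to keep an icon registered on the world. The numbers below are purely illustrative:

```python
# Illustrative sketch (not Panasonic's algorithm): keep a HUD icon
# registered on a world object as the driver's head moves laterally.
# By similar triangles, with the virtual image at d_img and the object
# at d_obj, an eye offset of e requires an image-plane shift of
#   e * (1 - d_img / d_obj).
def hud_icon_shift(eye_offset_m, d_img_m=10.0, d_obj_m=40.0):
    """Lateral image-plane shift (metres) that keeps the icon on the object."""
    return eye_offset_m * (1 - d_img_m / d_obj_m)

# Driver leans 5 cm sideways; cyclist 40 m ahead, virtual image at 10 m:
print(hud_icon_shift(0.05))  # 0.0375 m, i.e. the icon shifts 3.75 cm
```

Note that as the object distance approaches the image distance the required shift goes to zero, which is why HUDs push the virtual image plane as far out as the optics allow.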

+++++++++++++
https://blog.stcloudstate.edu/ims?s=augmented+reality

iLRN 2021

CALL FOR PAPERS AND PROPOSALS
iLRN 2021: 7th International Conference of the Immersive Learning Research Network
May 17 to June 10, 2021, on iLRN Virtual Campus, powered by Virbela
… and across the Metaverse!
Technically co-sponsored by the IEEE Education Society,
with proceedings to be submitted for inclusion in IEEE Xplore(r)
Conference theme: “TRANSCEND: Accelerating Learner Engagement in XR across Time, Place, and Imagination”
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Conference website: https://immersivelrn.org/ilrn2021/
PDF version of this CFP available at: https://bit.ly/3qnFYRu
The 7th International Conference of the Immersive Learning Research Network (iLRN 2021) will be an innovative and interactive virtual gathering for a strengthening global network of researchers and practitioners collaborating to develop the scientific, technical, and applied potential of immersive learning. It is the premier scholarly event focusing on advances in the use of virtual reality (VR), augmented reality (AR), mixed reality (MR), and other extended reality (XR) technologies to support learners across the full span of learning–from K-12 through higher education to work-based, informal, and lifelong learning contexts.
Following the success of iLRN 2020, our first fully online and in-VR conference, this year’s conference will once again be based on the iLRN Virtual Campus, powered by VirBELA, but with a range of activities taking place on various other XR simulation, gaming, and other platforms. Scholars and professionals working from informal and formal education settings as well as those representing diverse industry sectors are invited to participate in the conference, where they may share their research findings, experiences, and insights; network and establish partnerships to envision and shape the future of XR and immersive technologies for learning; and contribute to the emerging scholarly knowledge base on how these technologies can be used to create experiences that educate, engage, and excite learners.
Note: Last year’s iLRN conference drew over 3,600 attendees from across the globe, making the scheduling of sessions a challenge. This year’s conference activities will be spread over a four-week period so as to give attendees more opportunities to participate at times that are conducive to their local time zones.
##### TOPIC AREAS #####
XR and immersive learning in/for:
Serious Games • 3D Collaboration • eSports • AI & Machine Learning • Robotics • Digital Twins • Embodied Pedagogical Agents • Medical & Healthcare Education • Workforce & Industry • Cultural Heritage • Language Learning • K-12 STEM • Higher Ed & Workforce STEM  • Museums & Libraries • Informal Learning • Community & Civic Engagement  • Special Education • Geosciences • Data Visualization and Analytics • Assessment & Evaluation
##### SUBMISSION STREAMS & CATEGORIES #####
ACADEMIC STREAM (Refereed paper published in proceedings):
– Full (6-8 pages) paper for oral presentation
– Short paper (4-5 pages) for oral presentation
– Work-in-progress paper (2-3 pages) for poster presentation
– Doctoral colloquium paper (2-3 pages)
PRACTITIONER STREAM (Refereed paper published in proceedings):
– Oral presentation
– Poster presentation
– Guided virtual adventures
– Immersive learning project showcase
NONTRADITIONAL SESSION STREAM (1-2 page extended abstract describing session published in proceedings):
– Workshop
– Special session
– Panel session
##### SESSION TYPES & SESSION FORMATS #####
– Oral Presentation: Pre-recorded video + 60-minute live in-world discussion with others presenting on similar/related topics (groupings of presenters into sessions determined by Program Committee)
– Poster Presentation: Live poster session in 3D virtual exhibition hall; pre-recorded video optional
– Doctoral Colloquium: 60-minute live in-world discussion with other doctoral researchers; pre-recorded video optional
– Guided Virtual Adventures: 60-minute small-group guided tours of various social and collaborative XR/immersive environments and platforms
– Immersive Learning Project Showcase: WebXR space to assemble a collection of virtual artifacts, accessible to attendees throughout the conference
– Workshop: 1- or 2-hour live hands-on session
– Special Session: 30- or 60-minute live interactive session held in world; may optionally be linked to one or more papers
– Panel Session: 60-minute live in-world discussion with a self-formed group of 3-5 panelists (including a lead panelist who serves as a moderator)
Please see the conference website for templates and guidelines.
##### PROGRAM TRACKS #####
Papers and proposals may be submitted to one of 10 program tracks, the first nine of which correspond to the iLRN Houses of application, and the tenth of which is intended for papers making knowledge contributions to the learning sciences, computer science, and/or game studies that are not linked to any particular application area:
Track 1. Assessment and Evaluation (A&E)
Track 2. Early Childhood Development & Learning (ECDL)
Track 3. Galleries, Libraries, Archives, & Museums (GLAM)
Track 4. Inclusion, Diversity, Equity, Access, & Social Justice (IDEAS)
Track 5. K-12 STEM Education
Track 6. Language, Culture, & Heritage (LCH)
Track 7. Medical & Healthcare Education (MHE)
Track 8. Nature & Environmental Sciences (NES)
Track 9. Workforce Development & Industry Training (WDIT)
Track 10. Basic Research and Theory in Immersive Learning (not linked to any particular application area)
##### PAPER/PROPOSAL SUBMISSION & REVIEW #####
Papers for the Academic Stream and extended-abstract proposals for the Nontraditional Session Stream must be prepared in standard IEEE double-column US Letter format using Microsoft Word or LaTeX, and will be accepted only via the online submission system, accessible via the conference website (from which guidelines and templates are also available).
Proposals for the Practitioner Stream are to be submitted via an online form, also accessible from the conference website.
A blind peer-review process will be used to evaluate all submissions.
##### IMPORTANT DATES #####
– Main round submission deadline – all submission types welcome: 2021-01-15
– Notification of review outcomes from main submission round: 2021-04-01
– Late round submission deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions only: 2021-04-08
– Camera-ready papers for proceedings due – Full and short papers: 2021-04-15
– Presenter registration deadline – Full and short papers (also deadline for early-bird registration rates): 2021-04-15
– Notification of review outcomes from late submission round: 2021-04-19
– Camera-ready work-in-progress papers and nontraditional session extended abstracts for proceedings due; final practitioner abstracts for conference program due: 2021-05-03
– Presenter registration deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions: 2021-05-03
– Deadline for uploading presentation materials (videos, slides for oral presentations, posters for poster presentations): 2021-05-10
– Conference opening: 2021-05-17
– Conference closing: 2021-06-10
*Full and short papers can only be submitted in the main round.
##### PUBLICATION & INDEXING #####
All accepted and registered papers in the Academic Stream that are presented at iLRN 2021 and all extended abstracts describing the Nontraditional Sessions presented at the conference will be published in the conference proceedings and submitted to the IEEE Xplore(r) digital library.
Content loaded into Xplore is made available by IEEE to its abstracting and indexing partners, including Elsevier (Scopus, EiCompendex), Clarivate Analytics (CPCI–part of Web of Science) and others, for potential inclusion in their respective databases. In addition, the authors of selected papers may be invited to submit revised and expanded versions of their papers for possible publication in the IEEE Transactions on Learning Technologies (2019 JCR Impact Factor: 2.714), the Journal of Universal Computer Science (2019 JCR Impact Factor: 0.91), or another Scopus and/or Web of Science-indexed journal, subject to the relevant journal’s regular editorial and peer-review policies and procedures.
##### CONTACT #####
Inquiries regarding the iLRN 2021 conference should be directed to the Conference Secretariat at conference@immersivelrn.org.
General inquiries about iLRN may be sent to info@immersivelrn.org.

More on Virbela in this IMS blog
https://blog.stcloudstate.edu/ims?s=virbela

NuEyes AR smart Glass

NuEyes Technologies Announces New Pro 3 Augmented Reality Smart Glass Solution

https://www.newswire.com/news/nueyes-technologies-announces-new-pro-3-augmented-reality-smart-glass-21277381

The NuEyes Pro 3 Product Line delivers unparalleled clarity with 4K displays and an ultra-wide 51-degree field of view. This tethered solution weighs only 88g and comes in three versions.

This new and innovative line of AR smart glasses addresses not only low-vision and medical use cases but also training and learning, as well as enterprise and government needs.

+++++++++++++++
more on AR in this IMS blog
https://blog.stcloudstate.edu/ims?s=augmented+reality

Metaverse for XR COP

Discussion on low-end AR (Metaverse)

  1. What is AR (how is it different from VR or MR)
    https://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/
    p. 225
    “augmented reality: Bringing artificial objects into the real world – these can be as simple as a ‘heads-up display,’ like a speedometer projected onto your car’s windshield, or as complex as a seemingly real virtual creature walking across your real-world living room, casting a realistic shadow on the floor”
    https://blog.stcloudstate.edu/ims/2018/11/07/can-xr-help-students-learn/
    p. 12
    Augmented reality provides an “overlay” of some type over the real world through
    the use of a headset or even a smartphone.
    There is no necessary distinction between AR and VR; indeed, much research
    on the subject is based on a conception of a “virtuality continuum” from entirely
    real to entirely virtual, where AR lies somewhere between those ends of the
    spectrum. (Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays.”)

Augmented Reality

https://blog.stcloudstate.edu/ims/2018/10/17/vr-ar-learning-materials/

Augmented reality superimposes a digital layer on the world around us, often activated by scanning a trigger image or via GPS (think Pokemon Go!). Virtual reality takes users away from the real world, fully immersing students in a digital experience that replaces reality. Mixed reality takes augmented reality a step further by allowing the digital and real worlds to interact and the digital components to change based on the user’s environment.
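The GPS-style trigger mentioned above boils down to a geofence test: activate the overlay when the device is within some radius of a registered point of interest. A minimal sketch, with made-up campus coordinates:

```python
import math

# Sketch of GPS-triggered AR activation (the Pokemon Go-style trigger):
# fire the overlay when the user is within a radius of a registered
# point of interest. The coordinates below are hypothetical.
def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def ar_triggered(user, poi, radius_m=30.0):
    """True when the user's (lat, lon) falls inside the POI's geofence."""
    return haversine_m(*user, *poi) <= radius_m

library = (45.5530, -94.1510)  # hypothetical library entrance
print(ar_triggered((45.5531, -94.1511), library))  # True: roughly 14 m away
```

Trigger-image activation works the same way conceptually, just with an image-matching score against a registered marker instead of a distance against a registered location.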

  1. Low-end and hi-end AR
    1. Hi-end: Hololens, Google Glass, Apple Glass
      1. Unity-driven content
    2. Low-end: Metaverse
  2. What is Metaverse
        1. Metaverse studio
          https://studio.gometa.io/discover/me
        2. Metaverse app
          1. iOS: https://apps.apple.com/us/app/metaverse-experience-browser/id1159155137
          2. Android: https://play.google.com/store/apps/details?id=com.gometa.metaverse&hl=en&gl=US
        3. Gamifying Library orientation using Metaverse:
          https://mtvrs.io/GenerousJubilantEeve
          (the gateway to the Library orientation project)
          Metaverse experience through the user’s phone:

    1. Student projects using Metaverse
      https://im690group.weebly.com/
      https://mtvrs.io/PreviousImpracticalNandu
    2. Behind the scene, or how does it work
      https://studio.gometa.io/discover/me/a0cc4490-85fb-41d8-849b-bf52ac3ecb70
      YouTube materials:
      https://youtu.be/jLRR6fKtfwY
      https://youtu.be/MLeZo7X5rnA
      https://youtu.be/g9kY41OcR0Y
  3. Discussion
    1. Low-end vs hi-end AR
      1. advantages
      2. disadvantages
    2. gamify learning content with Metaverse
      https://youtu.be/2lUrs3mJSHg
    3. Discuss the following statement:
      Low-end AR (Metaverse), like low-end VR (360-degree video), has strong potential to introduce students, faculty, and staff to immersive teaching and learning.
  4. Alternatives
    1. Merge Cube: https://blog.stcloudstate.edu/ims/2020/10/21/how-to-create-merge-cube/
    2. Aero, GamAR: https://blog.stcloudstate.edu/ims/2020/12/04/augmented-reality-tools/

++++++++++++++++
more on Metaverse in this IMS blog
https://blog.stcloudstate.edu/ims?s=metaverse
