
3D scanning iPhone lidar

3D scanning with the iPhone lidar

https://albn.medium.com/3d-scanning-with-the-iphone-lidar-8cbd723fc9ab

The latest generation of iPhones (12 Pro and 12 Pro Max) comes equipped with a back-facing lidar camera.

Six go-to scanning apps, all of which come with direct export to Sketchfab:

Scaniverse

Scaniverse captures here, and get the app here.

Polycam

Polycam captures here, and get the app here.

3D scanner app

3D scanner app captures here, and get the app here.

Record3D

Record3D captures here and get the app here.

SiteScape

SiteScape captures here and get the app here.

Everypoint

Everypoint captures here, and get the app here.

++++++++++++++
more on 3D scanning in this IMS blog
https://blog.stcloudstate.edu/ims?s=3d+scanning

how to create Merge Cube objects

Pricing/subscription:

ThingLink $300
Sketchfab https://sketchfab.com/plans $79 monthly ($79 x 12 = ~$950 per year)
CoSpaces https://www.cospaces.io/edu/pricing.html $74.99 per year
Qlone ($29.99)

ready-to-go resources for Merge Cube:

apps for Merge Cube:

https://www.arvreduhub.com/merge

+++++++++++++
Merge Cube, the basics

To create Merge Cube content in CoSpaces Edu, a paid add-on is needed.

Scan an object with Qlone, export it as an OBJ file ($29.99), and place it on the Merge Cube with the object uploader app.

++++++++++++++++++
More on Merge Cube in this IMS blog
https://blog.stcloudstate.edu/ims?s=mergecube

XR anatomy

The EDUCAUSE XR (Extended Reality) Community Group Listserv <XR@LISTSERV.EDUCAUSE.EDU>

Greetings to you all! Presently, I am undertaking a master’s course in “Instruction Design and Technology” which has two components: Coursework and Research. For my research, I would like to pursue it in the field of Augmented Reality (AR) and Mobile Learning. I am thinking of an idea that could lead to collaboration among students and directly translate into enhanced learning for students while using an AR application. However, I am having a problem with coming up with an application because I don’t have any computing background. This, in turn, is affecting my ability to come up with a good research topic.

I teach gross anatomy and histology to many students of health sciences at Mbarara University, and this is where I feel I could make a contribution to learning anatomy using AR since almost all students own smartphones. I, therefore, kindly request you to let me know which of the freely available AR app authoring tools could help me in this regard. In addition, I request your suggestions regarding which research area(s) I should pursue in order to come up with a good research topic.

Hoping to hear from you soon.

Grace Muwanga, Department of Anatomy, Mbarara University, Uganda (East Africa)

++++++++++++

matthew.macvey@journalism.cuny.edu

Dear Grace, a few augmented reality tools which I’ve found are relatively easy to get started with:

For iOS, iPhone, iPad: https://www.torch.app/ or https://www.adobe.com/products/aero.html

To create AR that will work on social platforms like Facebook and Snapchat (and will work on Android, iOS) try https://sparkar.facebook.com/ar-studio/ or https://lensstudio.snapchat.com/ . You’ll want to look at the tutorials for plane tracking or target tracking https://sparkar.facebook.com/ar-studio/learn/documentation/tracking-people-and-places/effects-in-surroundings/

https://lensstudio.snapchat.com/guides/general/tracking/tracking-modes/

One limitation with Spark and Snap is that file sizes need to be small.

If you’re interested in creating AR experiences that work directly in a web browser and are up for writing some markup code, look at A-Frame AR https://aframe.io/blog/webxr-ar-module/.
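To make that concrete, here is a minimal sketch of an A-Frame scene built from TypeScript rather than static HTML markup. It assumes the A-Frame library script is already loaded on the page (which registers <a-scene>, <a-box>, etc. as custom elements); the webxr attribute values follow A-Frame's documented options, but double-check them against the version you install.

```typescript
// Minimal sketch, not production code: build an A-Frame scene from TypeScript.
// Assumes the A-Frame <script src="https://aframe.io/releases/.../aframe.min.js">
// tag is already on the page, so <a-scene>/<a-box> exist as custom elements.

function buildArScene(): void {
  const scene = document.createElement('a-scene');
  // Hint to A-Frame's WebXR integration which session features we want;
  // on AR-capable browsers the scene then exposes an AR entry button.
  scene.setAttribute('webxr', 'optionalFeatures: hit-test, local-floor');

  const box = document.createElement('a-box');
  box.setAttribute('position', '0 1.2 -1.5'); // about 1.5 m in front of the viewer
  box.setAttribute('color', '#4CC3D9');
  scene.appendChild(box);

  document.body.appendChild(scene);
}

window.addEventListener('DOMContentLoaded', buildArScene);
```

The same scene can of course be written directly as HTML markup, which is how most A-Frame tutorials present it; the script version above is only meant to show how little is needed to get a test object on screen.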

For finding and hosting 3D models you can look at Sketchfab and Google Poly. I think both have many examples of anatomy.

Best, Matt

+++++++++++

“Beth L. Ritter-Guth” <britter-guth@NORTHAMPTON.EDU>

I’ve been using Roar. They have a $99-a-year license.

++++++++++++

I have recently been experimenting with an AR development tool called Zappar, which I like because the end users do not have to download an app to view the AR content. Codes can be scanned either with the Zappar app or at web.zappar.com.

From a development standpoint, Zappar has an easy to use drag-and-drop interface called ZapWorks Designer that will help you build basic AR experiences quickly, but for a more complicated, more interactive use case such as learning anatomy, you will probably need ZapWorks Studio, which will have much more of a learning curve. The Hobby (non-commercial) license is free if you are interested in trying it out.

You can check out an AR anatomy mini-lesson with models of the human brain, liver, and heart using ZapWorks here: https://www.zappar.com/campaigns/secrets-human-body/. Even if you choose to go with a different development tool, this example might help nail down ideas for your own project.

Hope this helps,

Brighten

Brighten Jelke, Academic Assistant for Virtual Technology, Lake Forest College | bjelke@lakeforest.edu | Office: DO 233 | Phone: 847-735-5168

http://www.lakeforest.edu/academics/resources/innovationspaces/virtualspace.php

+++++++++++++++++
more on XR in education in this IMS blog
https://blog.stcloudstate.edu/ims?s=xr+education

IM 690 VR and AR lab part 2

IM 690 Virtual Reality and Augmented Reality. short link: http://bit.ly/IM690lab

IM 690 lab plan for March 3, MC 205:  Oculus Go and Quest

Readings:

  1. TAM: Technology Acceptance Model
    Read Venkatesh and Davis and sum up the importance of their model for instructional designers working with VR technologies and creating materials for users of VR technologies.
  2. UTAUT: using the theory to learn well with VR and to design a good acceptance model for end users: https://blog.stcloudstate.edu/ims/2020/02/20/utaut/
    Watch both parts of Victoria Bolotina’s presentation at the Global VR conference. How is she applying UTAUT in her research?
    Read Bracq et al. (2019); how do they apply UTAUT to their VR nursing training?

Lab work (continued):

revision from last week:
How to shoot and edit 360 videos: Ben Claremont
https://www.youtube.com/channel/UCAjSHLRJcDfhDSu7WRpOu-w
and
https://www.youtube.com/channel/UCUFJyy31hGam1uPZMqcjL_A

  1. Oculus Quest as advanced-level VR
    1. Using the controllers
    2. Confirm the Guardian boundary
    3. Using the menu

Oculus Quest main menu

    1. Watching 360 video on YouTube
      1. Switch between 2D and 360 VR
    2. Play a game

Climbing


Racketball

Instagram post shared by Beat Saber (@beatsaber): “Hell yeah, @naysy is the ultimate Beat Saber queen! 💃 #VR #VirtualReality #BeatSaber #PanicAtTheDisco”

Practice interactivity (space station)

    1. Broadcast your experience (Facebook Live)
  1. Additional (advanced) features of Oculus Quest
    1. https://engagevr.io/
    2. https://sidequestvr.com/#/setup-howto

Interactivity: communicating and working collaboratively with AltspaceVR

https://account.altvr.com/

setting up your avatar

joining a space and collaborating and communicating with other users

  1. Assignment: Group work
    1. Find one F2F and one online peer to form a group.
      Based on the questions/directions before you started watching the videos:
      – Does this particular technology fit in the instructional design (ID) frames and theories covered so far?
      – How does this particular technology fit in those frames and theories?
      – Which models and ideas from the videos seem possible for you to replicate?
      Exchange thoughts with your peers and make a plan to create a similar educational product.
    2. Post your writing in the following D2L Discussions thread
  2. Augmented Reality with HoloLens (watch videos at the computer station)
    1. Start and turn off; go through the menu

      https://youtu.be/VX3O650comM
    2. Learn gestures and voice commands
  3. Augmented Reality with Merge Cube
    1. 3D apps and software packages and their compatibility with AR
  4. Augmented Reality with a smartphone
  5. Samsung Gear 360 video camera
    1. If all other goggles and devices are busy, please feel welcome to use the camera to practice and/or work toward your final project
    2. SD card and data transfer – does your phone have an SD card slot compatible with the camera?
    3. Upload 360 images and videos to your YouTube and FB accounts
  6. Issues with XR
    1. Ethics
      1. empathy
        Peter Rubin “Future Presence”
        https://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/

+++++++++++++

Enhance your XR instructional design with other tools: https://blog.stcloudstate.edu/ims/2020/02/07/crs-loop/ (a short WebXR code sketch follows the list of links below)

https://aframe.io/

https://framevr.io/

https://learn.framevr.io/ (free learning resources for Frame)

https://hubs.mozilla.com/#/

https://sketchfab.com/ WebXR technology

https://mixedreality.mozilla.org/hello-webxr/

https://studio.gometa.io/landing
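Several of the links above (A-Frame, Frame, Mozilla Hubs, Hello WebXR) run on top of the browser's WebXR Device API. As a quick illustration of what that looks like, here is a minimal TypeScript feature check; the navigator.xr calls are part of the standard WebXR API, and the cast to any is only there because default TypeScript typings may not include them.

```typescript
// Minimal sketch of the WebXR feature check that browser-based XR tools
// perform under the hood. navigator.xr is undefined on browsers without
// WebXR support, so we bail out gracefully in that case.

async function checkXrSupport(): Promise<void> {
  // Cast to any: the WebXR typings are not part of the default TS DOM lib.
  const xr = (navigator as any).xr;
  if (!xr) {
    console.log('WebXR is not available in this browser.');
    return;
  }
  const vrOk = await xr.isSessionSupported('immersive-vr');
  const arOk = await xr.isSessionSupported('immersive-ar');
  console.log(`immersive-vr supported: ${vrOk}, immersive-ar supported: ${arOk}`);
  // An actual session must be requested from a user gesture, e.g. a button click:
  // const session = await xr.requestSession('immersive-vr');
}

checkXrSupport();
```

This is only a feature check; frameworks like A-Frame or Hubs handle session setup, rendering loops, and controllers for you, which is why they are the practical starting point for instructional projects.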

+++++++++++
Plamen Miltenoff, Ph.D., MLIS
Professor
320-308-3072
pmiltenoff@stcloudstate.edu
http://web.stcloudstate.edu/pmiltenoff/faculty/
schedule a meeting: https://doodle.com/digitalliteracy
find my office: https://youtu.be/QAng6b_FJqs

can XR help students learn

Giving Classroom Experiences (Like VR) More … Dimension

https://www.insidehighered.com/digital-learning/article/2018/11/02/virtual-reality-other-3-d-tools-enhance-classroom-experiences

at a session on the umbrella concept of “mixed reality” (abbreviated XR) here Thursday, attendees had some questions for the panel’s VR/AR/XR evangelists: Can these tools help students learn? Can institutions with limited budgets pull off ambitious projects? Can skeptical faculty members be convinced to experiment with unfamiliar technology?

All four — one each from Florida International University, Hamilton College, Syracuse University and Yale University — have just finished the first year of a joint research project commissioned by Educause and sponsored by Hewlett-Packard to investigate the potential for immersive technology to supplement and even transform classroom experiences.

“Campus of the Future” report, written by Jeffrey Pomerantz

Yale has landed on a “hub model” for project development — instructors propose projects and partner with students with technological capabilities to tap into a centralized pool of equipment and funding. (My note: this is what I suggest in my Chapter 2 of Arnheim, Eliot & Rose (2012) Lib Guides)

Several panelists said they had already been getting started on mixed reality initiatives prior to the infusion of support from Educause and HP, which helped them settle on a direction.

While 3-D printing might seem to lend itself more naturally to the hard sciences, Yale’s humanities departments have cottoned to the technology as a portal to answering tough philosophical questions.

institutions would be better served forgoing an early investment in hardware and instead gravitating toward free online products like Unity, Organon and You by Sharecare, all of which allow users to create 3-D experiences from their desktop computers.

+++++++++

“Campus of the Future” report, written by Jeffrey Pomerantz

https://library.educause.edu/~/media/files/library/2018/8/ers1805.pdf?la=en

XR technologies encompassing 3D simulations, modeling, and production.

This project sought to identify

  • current innovative uses of these 3D technologies,
  • how these uses are currently impacting teaching and learning, and
  • what this information can tell us about possible future uses for these technologies in higher education.

p. 5 Extended reality (XR) technologies, which encompass virtual reality (VR) and augmented reality (AR), are already having a dramatic impact on pedagogy in higher education. XR is a general term that covers a wide range of technologies along a continuum, with the real world at one end and fully immersive simulations at the other.

p. 6 The Campus of the Future project was an exploratory evaluation of 3D technologies for instruction and research in higher education: VR, AR, 3D scanning, and 3D printing. The project sought to identify interesting and novel uses of 3D technology.

p. 7 HP would provide the hardware, and EDUCAUSE would provide the methodological expertise to conduct an evaluation research project investigating the potential uses of 3D technologies in higher education learning and research.

The institutions that participated in the Campus of the Future project were selected because they were already on the cutting edge of integrating 3D technology into pedagogy. These institutions were therefore not representative, nor were they intended to be representative, of the state of higher education in the United States. These institutions were selected precisely because they already had a set of use cases for 3D technology available for study.

p. 9  At some institutions, the group participating in the project was an academic unit (e.g., the Newhouse School of Communications at Syracuse University; the Graduate School of Education at Harvard University). At these institutions, the 3D technology provided by HP was deployed for use more or less exclusively by students and faculty affiliated with the particular academic unit.

p. 10 definitions
there is not universal agreement on the definitions of these
terms or on the scope of these technologies. Also, all of these technologies
currently exist in an active marketplace and, as in many rapidly changing markets, there is a tendency for companies to invent neologisms around 3D technology.

A 3D scanner is not a single device but rather a combination of hardware and
software. There are generally two pieces of hardware: a laser scanner and a digital
camera. The laser scanner bounces laser beams off the surface of an object to
determine its shape and contours.
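As a rough illustration of the principle described above (not any particular vendor's pipeline), the sketch below converts one time-of-flight laser return plus the scanner's two sweep angles into a single 3D point; a real scanner repeats this for millions of points and fuses them with the digital camera's imagery. The function name and parameters are hypothetical, for illustration only.

```typescript
// Illustrative sketch only: convert one laser time-of-flight measurement and
// the scanner's two sweep angles into a 3D point (spherical -> Cartesian).
// Real scanners also calibrate offsets, filter noise, and merge many sweeps.

const SPEED_OF_LIGHT = 299_792_458; // meters per second

interface Point3D { x: number; y: number; z: number; }

function laserReturnToPoint(
  roundTripSeconds: number, // time for the pulse to reach the surface and return
  azimuthRad: number,       // horizontal scan angle
  elevationRad: number      // vertical scan angle
): Point3D {
  // The pulse travels to the object and back, so halve the path length.
  const range = (SPEED_OF_LIGHT * roundTripSeconds) / 2;
  return {
    x: range * Math.cos(elevationRad) * Math.cos(azimuthRad),
    y: range * Math.cos(elevationRad) * Math.sin(azimuthRad),
    z: range * Math.sin(elevationRad),
  };
}

// Example: a return after ~13.3 nanoseconds is roughly 2 meters away.
console.log(laserReturnToPoint(13.3e-9, 0.1, 0.05));
```

Structured-light and photogrammetry scanners recover depth differently (by triangulation rather than timing), but the end product is the same: a cloud of 3D points that software then meshes and textures.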

p. 11 definitions

Virtual reality means that the wearer is completely immersed in a computer
simulation. Several types of VR headsets are currently available, but all involve
a lightweight helmet with a display in front of the eyes (see figure 2). In some
cases, this display may simply be a smartphone (e.g., Google Cardboard); in other
cases, two displays—one for each eye—are integrated into the headset (e.g., HTC
Vive). Most commercially available VR rigs also include handheld controllers
that enable the user to interact with the simulation by moving the controllers
in space and clicking on finger triggers or buttons.

p. 12 definitions

Augmented reality provides an “overlay” of some type over the real world through
the use of a headset or even a smartphone.

In an active technology marketplace, there is a tendency for new terms to be
invented rapidly and for existing terms to be used loosely. This is currently
happening in the VR and AR market space. The HP VR rig and the HTC Vive
unit are marketed as being immersive, meaning that the user is fully immersed in
a simulation—virtual reality. Many currently available AR headsets, however, are
marketed not as AR but rather as MR (mixed reality). These MR headsets have a
display in front of the eyes as well as a pair of front-mounted cameras; they are
therefore capable of supporting both VR and AR functionality.

p. 13 Implementation

Technical difficulties.
Technical issues can generally be divided into two broad categories: hardware
problems and software problems. There is, of course, a common third category:
human error.

p. 15 the technology learning curve

The well-known diffusion of innovations theoretical framework articulates five
adopter categories: innovators, early adopters, early majority, late majority, and
laggards. Everett M. Rogers, Diffusion of Innovations, 5th ed. (New York: Simon and Schuster, 2003).

It is also likely that staff in the campus IT unit or center for teaching and learning already know who (at least some of) these individuals are, since such faculty members are likely to already have had contact with these campus units.
Students may of course also be innovators and early adopters, and in fact
several participating institutions found that some of the most creative uses of 3D technology arose from student projects

p. 30  Zeynep Tufekci, in her book Twitter and Tear Gas

definition: There is no necessary distinction between AR and VR; indeed, much research
on the subject is based on a conception of a “virtuality continuum” from entirely
real to entirely virtual, where AR lies somewhere between those ends of the
spectrum.  Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information Systems, vol. E77-D, no. 12 (1994); Steve Mann, “Through the Glass, Lightly,” IEEE Technology and Society Magazine 31, no. 3 (2012): 10–14.

For the future of 3D technology in higher education to be realized, that
technology must become as much a part of higher education as any technology:
the learning management system (LMS), the projector, the classroom. New
technologies and practices generally enter institutions of higher education as
initiatives. Several active learning classroom initiatives are currently under
way,36 for example, as well as a multi-institution open educational resources
(OER) degree initiative.37

p. 32 Storytelling

Some scholars have argued that all human communication
is based on storytelling;41 certainly advertisers have long recognized that
storytelling makes for effective persuasion,42 and a growing body of research
shows that narrative is effective for teaching even topics that are not generally
thought of as having a natural story, for example, in the sciences.43

p. 33 accessibility

The experience of Gallaudet University highlights one of the most important
areas for development in 3D technology: accessibility for users with disabilities.

p. 34 instructional design

For that to be the case, 3D technologies must be incorporated into the
instructional design process for building and redesigning courses. And for that
to be the case, it is necessary for faculty and instructional designers to be familiar
with the capabilities of 3D technologies. And for that to be the case, it may
not be necessary but would certainly be helpful for instructional designers to
collaborate closely with the staff in campus IT units who support and maintain
this hardware.

Every institution of higher
education has a slightly different organizational structure, of course, but these
two campus units are often siloed. This siloing may lead to considerable friction
in conducting the most basic organizational tasks, such as setting up meetings
and apportioning responsibilities for shared tasks. Nevertheless, IT units and
centers for teaching and learning are almost compelled to collaborate in order
to support faculty who want to integrate 3D technology into their teaching. It
is necessary to bring the instructional design expertise of a center for teaching
and learning to bear on integrating 3D technology into an instructor’s teaching. (My note: and where does this place SCSU?) Therefore,
one of the most critical areas in which IT units and centers for teaching and
learning can collaborate is in assisting instructors to develop this integration
and to develop learning objects that use 3D technology.

p. 35 For 3D technology to really gain traction in higher education, it will need to be easier for instructors to deploy without such a large support team.

p. 35 Sites such as Thingiverse, Sketchfab, and Google Poly are libraries of freely
available, user-created 3D models.

ClassVR is a tool that enables the simultaneous delivery of a simulation to
multiple headsets, though the simulation itself may still be single-user.

p. 37 data management:

An institutional repository is a collection of an institution’s intellectual output, often consisting of preprint journal articles and conference papers and the data sets behind them.49 An
institutional repository is often maintained by either the library or a partnership
between the library and the campus IT unit. An institutional repository therefore has the advantage of the long-term curatorial approach of librarianship combined with the systematic backup management of the IT unit. (My note: leaves me wondering where this puts SCSU)

Sharing data sets is critical for collaboration and increasingly the default for
scholarship. Data is as much a product of scholarship as publications, and there
is a growing sentiment among scholars that it should therefore be made public.50

++++++++
more on VR in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality+definition