Web 3.0 is the third generation of internet services, providing the underlying technology on which websites and applications run. It is expected to be powered by AI and peer-to-peer technologies such as blockchain. The key difference between Web 2.0 and Web 3.0 is that Web 3.0 focuses on using technologies like machine learning and AI to create more personalized content for each user. Web 3.0 is also expected to be more secure than its predecessors because of the decentralized systems it is built upon.
Blockchains are made up of blocks that store information. Each block has a unique “hash” that differentiates it from other blocks, and the blocks are linked together in chronological order to form a chain. The information stored in these blocks is permanent, which makes blockchains a very secure way to complete online transactions.
This is why cryptocurrencies, like Bitcoin, are built on blockchain technology.
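The hash-linking idea can be sketched in a few lines of Python. This is a toy illustration of the principle, not how Bitcoin itself is implemented, and the transaction strings are invented:

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash):
    """Each block records the hash of the block before it."""
    return {"data": data, "prev_hash": prev_hash}

# Build a tiny chain in chronological order.
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 5", block_hash(chain[-1])))
chain.append(make_block("Bob pays Carol 2", block_hash(chain[-1])))

def verify(chain):
    """Tampering with any earlier block breaks every later link."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )
```

Because each block embeds its predecessor's hash, changing any stored record invalidates every block after it, which is what makes the history effectively permanent.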
With the pandemic boosting its videoconferencing business, Zoom more than quadrupled its annual revenue from $622.7 million to $2.7 billion in the 12 months ending January 31, 2021.
Narrow AI is what we see all around us in computers today — intelligent systems that have been taught or have learned how to carry out specific tasks without being explicitly programmed how to do so.
General AI
General AI is very different and is the type of adaptable intellect found in humans, a flexible form of intelligence capable of learning how to carry out vastly different tasks, anything from haircutting to building spreadsheets or reasoning about a wide variety of topics based on its accumulated experience.
What can Narrow AI do?
There are a vast number of emerging applications for narrow AI:
Interpreting video feeds from drones carrying out visual inspections of infrastructure such as oil pipelines.
Organizing personal and business calendars.
Responding to simple customer-service queries.
Coordinating with other intelligent systems to carry out tasks like booking a hotel at a suitable time and location.
Flagging inappropriate content online, detecting wear and tear in elevators from data gathered by IoT devices.
Generating a 3D model of the world from satellite imagery… the list goes on and on.
What can General AI do?
A 2012/13 survey of four groups of experts, conducted by AI researcher Vincent C. Müller and philosopher Nick Bostrom, reported a 50% chance that Artificial General Intelligence (AGI) would be developed between 2040 and 2050, rising to 90% by 2075.
What is machine learning?
What are neural networks?
What are other types of AI?
Another area of AI research is evolutionary computation.
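The core evolutionary idea — repeatedly mutate a candidate solution and keep any mutation that scores better — can be shown in a minimal (1+1)-style loop. The target value and step size here are arbitrary illustrations:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

target = 3.14159   # the value we want evolution to find
candidate = 0.0    # starting candidate solution

for _ in range(2000):
    # Mutation: perturb the candidate with small Gaussian noise.
    mutant = candidate + random.gauss(0, 0.1)
    # Selection: keep the mutant only if it is closer to the target.
    if abs(mutant - target) < abs(candidate - target):
        candidate = mutant
```

Real evolutionary computation works on populations of candidates with crossover as well as mutation, but the mutate-and-select loop above is the essential mechanism.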
What is fueling the resurgence in AI?
What are the elements of machine learning?
As mentioned, machine learning is a subset of AI and is generally split into two main categories: supervised and unsupervised learning.
Supervised learning
Unsupervised learning
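As a rough sketch of the difference between the two categories, using only the Python standard library (the one-dimensional data, class means, and cluster logic are all illustrative):

```python
import random

random.seed(0)

# Toy 1-D data: two groups of points, around 0 and around 10.
points = [random.gauss(0, 1) for _ in range(50)] + \
         [random.gauss(10, 1) for _ in range(50)]
labels = [0] * 50 + [1] * 50  # ground-truth labels

# Supervised learning: labels are given, so fit one mean per class
# and classify new points by the nearest class mean.
mean0 = sum(p for p, l in zip(points, labels) if l == 0) / 50
mean1 = sum(p for p, l in zip(points, labels) if l == 1) / 50

def classify(x):
    """Assign x to the class whose learned mean is closer."""
    return 0 if abs(x - mean0) < abs(x - mean1) else 1

# Unsupervised learning: no labels, so discover two clusters
# from the raw points alone (a tiny 1-D k-means).
c0, c1 = min(points), max(points)
for _ in range(10):
    g0 = [p for p in points if abs(p - c0) <= abs(p - c1)]
    g1 = [p for p in points if abs(p - c0) > abs(p - c1)]
    c0, c1 = sum(g0) / len(g0), sum(g1) / len(g1)
```

The supervised half uses the labels directly; the unsupervised half recovers roughly the same two groups without ever seeing them.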
Which are the leading firms in AI?
Which AI services are available?
All of the major cloud platforms — Amazon Web Services, Microsoft Azure and Google Cloud Platform — provide access to GPU arrays for training and running machine-learning models, with Google also gearing up to let customers use its Tensor Processing Units — custom chips whose design is optimized for those same workloads.
Which countries are leading the way in AI?
It’d be a big mistake to think the US tech giants have the field of AI sewn up. Chinese firms Alibaba, Baidu, and Lenovo invest heavily in AI in fields ranging from e-commerce to autonomous driving. As a country, China is pursuing a three-step plan to turn AI into a core industry, one that was to be worth 150 billion yuan ($22bn) by the end of 2020, with the aim of becoming the world’s leading AI power by 2030.
How can I get started with AI?
While you could buy a moderately powerful Nvidia GPU for your PC — somewhere around the Nvidia GeForce RTX 2060 or faster — and start training a machine-learning model, probably the easiest way to experiment with AI-related services is via the cloud.
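Before renting GPUs, you can get a feel for the core training loop on any machine. Here is a minimal sketch that fits a line by gradient descent — the same optimization idea that underlies larger machine-learning models (the data, learning rate, and iteration count are illustrative):

```python
# Toy data generated from y = 2x + 1; the goal is to recover w = 2, b = 1.
xs = [float(i) for i in range(10)]
ys = [2.0 * x + 1.0 for x in xs]

w, b, lr = 0.0, 0.0, 0.01  # initial parameters and learning rate
for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    # Step each parameter downhill along its gradient.
    w -= lr * grad_w
    b -= lr * grad_b
```

After training, `w` and `b` converge to the true slope and intercept; cloud services scale this same loop up to millions of parameters on GPU or TPU hardware.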
Through storytelling and narrative case studies, this book proposes to provide evidence-based practices, practical strategies, administrative considerations, and management tools for K12 and post-secondary school leaders charged with implementing technology at scale. It intentionally takes a broad view across all education levels, telling stories about how large-scale technology implementations might inspire systemic changes and new collaborations. To do so, the book includes diverse voices and perspectives representing K12 and post-secondary institutions, with the goal of facilitating equitable, sustainable technology access for learner success.
++++++++++++++++++++
More on ED Leadership and Technology in this IMS blog https://blog.stcloudstate.edu/ims?s=edad+technology
Document-based questions have long been a staple of social studies classrooms.
Since the human brain is essentially wired to recognize patterns, computational thinking—somewhat paradoxically—doesn’t necessarily require the use of computers at all.
In a 2006 paper for the Association for Computing Machinery, computer scientist Jeannette Wing wrote a definition of computational thinking that used terms native to her field—even when she was citing everyday examples. Thus, a student preparing her backpack for the day is “prefetching and caching.” Finding the shortest line at the supermarket is “performance modeling.” And performing a cost-benefit analysis on whether it makes more sense to rent versus buy is running an “online algorithm.” “Computational thinking will have become ingrained in everyone’s lives when words like algorithm and precondition are part of everyone’s vocabulary,” she writes.
Three main steps:
Looking at the data: Deciding what’s worth including in the final data set, and what should be left out. What are the different tools that can help manipulate this data—from GIS tools to pen and paper?
Looking for patterns: Typically, this involves shifting to greater levels of abstraction—or conversely, getting more granular.
Decomposition: What’s a trend versus what’s an outlier to the trend? Where do things correlate, and where can causation be inferred?
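The trend-versus-outlier step above can be illustrated with standard-library Python; the readings below are invented for the example:

```python
import statistics

# Hypothetical weekly measurements with one anomalous value.
readings = [10, 12, 11, 13, 12, 45, 14, 13]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

# A simple rule: anything more than 2 standard deviations from the
# mean is treated as an outlier rather than part of the trend.
outliers = [r for r in readings if abs(r - mean) > 2 * stdev]
trend = [r for r in readings if abs(r - mean) <= 2 * stdev]
```

Here the anomalous reading of 45 is flagged as an outlier, while the remaining values are kept as the trend — the same separation a student would make by eye when examining a data set.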
Robot colleges have de-skilled instruction by paying teams of workers, some qualified and some not, to write content, while computer programs, such as learning management systems with automated instruction programs, perform instructional and management tasks.
The assumption is that managing work this way significantly reduces costs, and it does, at least in the short and medium terms. However, instructional costs are frequently replaced by marketing and advertising expenses to pitch the schools to prospective students and their families.
The business model in higher education for reducing labor power and faculty costs is not confined to for-profit colleges. Community colleges also rely on a small number of full-time faculty and armies of low-wage contingent labor.
In some cases, colleges and universities, including many brand name schools, utilize outside companies, online program managers (OPMs), to run their online programs, with OPMs like 2U taking up as much as 60 percent of the revenues.