
Russian manipulation Instagram

Russia’s election manipulation a bigger win on Instagram than on Facebook, report finds

Blockchain next election

Blockchain Disciples Have a New Goal: Running Our Next Election

Amid vote-hacking fears, election officials are jumping on the crypto bandwagon — but cybersecurity experts are sounding an alarm

At democracy’s heart lies a set of paradoxes: a delicate interplay of identity and anonymity, secrecy and transparency. To be sure you are eligible to vote and that you do so only once, the authorities need to know who you are. But when it comes time for you to mark a ballot, the government must guarantee your privacy and anonymity. After the fact, it also needs to provide some means for a third party to audit the election, while also preventing you from obtaining definitive proof of your choice, which could lead to vote selling or coercion.
Building a system that accomplishes all this at once — and does so securely — is challenging enough in the physical world. It’s even harder online, as the recent revelation that Russian intelligence operatives compromised voting systems in multiple states makes clear.
In the decade since the elusive Satoshi Nakamoto published a now-famous white paper outlining the idea behind bitcoin, a “peer-to-peer electronic cash system” based on a mathematical “consensus mechanism,” more than 1,500 new cryptocurrencies have come into being.
One helpful definition comes from Nathan Heller in the New Yorker, who compares the blockchain to a scarf knit with a single ball of yarn. “It’s impossible to remove part of the fabric, or to substitute a swatch, without leaving some trace,” Heller wrote. Typically, blockchains are created by a set of stakeholders working to achieve consensus at every step, so it might be even more apt to picture a knitting collective creating that single scarf together, moving forward only when a majority agrees that a given knot is acceptable.
Unlike bitcoin, a public blockchain powered by thousands of miners around the world, most voting systems, including Votem’s, employ what’s known as a “permissioned ledger,” in which a handful of approved groups (political parties, election observers, government entities) would be allowed to validate the transactions.
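Heller's scarf metaphor can be made concrete with a toy sketch (my illustration, not from the article): a minimal hash-chained ledger in Python, showing why editing one recorded "swatch" leaves a trace that any verifier can detect.

```python
import hashlib
import json

def block_hash(body: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Link a new block to the hash of the previous one."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)

def verify(chain: list) -> bool:
    """Re-derive every hash; any edited 'swatch' breaks the links."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, "vote: candidate A")
append_block(chain, "vote: candidate B")
assert verify(chain)
chain[0]["data"] = "vote: candidate B"  # tamper with a recorded vote
assert not verify(chain)  # the tampering is immediately detectable
```

In a permissioned ledger like Votem's, the approved validators would each run the equivalent of `verify` before accepting a new block; the sketch leaves out that consensus step entirely.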
First, there’s the issue of targeted denial-of-service (DoS) attacks, in which a hacker directs so much traffic at a server that it’s overwhelmed and ceases to function.
Although a distributed ledger itself would likely withstand such an attack, the rest of the system — from voters’ personal devices to the many servers a vote would pass through on its way to the blockchain — would remain vulnerable.
Then there’s the so-called penetration attack, like the University of Michigan incursion, in which an adversary gains control of a server and deliberately alters the outcome of an election.
While it’s true that information recorded on a blockchain cannot be changed, a determined hacker might well find another way to disrupt the process. Bitcoin itself has never been hacked, for instance, but numerous bitcoin “wallets” have been, resulting in billions of dollars in losses. In early June 2018, a South Korean cryptocurrency exchange was penetrated, causing the value of bitcoin to tumble and resulting in a loss of $42 billion in market value. So although recording the vote tally on a blockchain introduces a new obstacle to penetration attacks, it still leaves holes elsewhere in the system — like putting a new lock on your front door but leaving your basement windows open.
A blockchain is only as valuable as the data stored on it. And whereas traditional paper ballots preserve an indelible record of the actual intent of each voter, digital votes “don’t produce an original hard-copy record of any kind.”
In the end, democracy always depends on a certain leap of faith, and faith can never be reduced to a mathematical formula. The Economist Intelligence Unit regularly ranks the world’s most democratic countries. In 2017, the United States came in 21st place, after Uruguay and Malta. Meanwhile, it’s now widely believed that John F. Kennedy owed his 1960 win to election tampering in Chicago. The Supreme Court decision granting the presidency to George W. Bush rather than calling a do-over — despite Al Gore’s popular-vote win — still seems iffy. Significant doubts remain about the 2016 presidential race.
While little doubt remains that Russia favored Trump in the 2016 election, the Kremlin’s primary target appears to have been our trust in the system itself. So if the blockchain’s trendy allure can bolster trust in American democracy, maybe that’s a net positive for our national security. If someone manages to hack the system, hopefully they’ll do so quietly. Apologies to George Orwell, but sometimes ignorance really is strength.

+++++++++++
more on blockchain in this IMS blog
https://blog.stcloudstate.edu/ims?s=blockchain

eLearning Trends To Treat With Caution

4 eLearning Trends To Treat With Caution

https://elearningindustry.com/instructional-design-models-and-theories

Jumping on board with a new industry trend without sufficient planning can result in your initiative failing to achieve its objective and, in the worst case, even hindering the learning process. So which hot topics should you treat with care?

1. Virtual Reality, or VR

Ultimately, the key question to consider when adopting anything new is whether it will help you achieve the desired outcome. VR shouldn’t be incorporated into learning just because it’s a common buzzword. Before you decide to give it a go, consider how it’s going to help your learner, and whether it’s truly the most effective or efficient way to meet the learning goal.

2. Gamification

If you’re considering introducing an interactive element to your learning, don’t let this deter you—just ensure that it’s relevant to the content and will aid the learning process.

3. Artificial Intelligence, or AI

If you are confident that a trend is going to yield better results for your learners, the ROI you see may well justify the upfront resources it requires.
Again, it all comes down to whether a trend is going to deliver in terms of achieving an objective.

4. Microlearning

The theory behind microlearning makes a lot of sense: organizing content into sections so that learning can fit easily with modern-day attention spans and learners’ busy lifestyles is not a bad thing. The worry is that the buzzword, ‘microlearning’, has grown legs of its own, meaning the industry is losing sight of its founding principles.

+++++++++
more on elearning in this IMS blog
https://blog.stcloudstate.edu/ims?s=elearning

Does AI favor tyranny

Why Technology Favors Tyranny

Artificial intelligence could erase many practical advantages of democracy, and erode the ideals of liberty and equality. It will further concentrate power among a small elite if we don’t take steps to stop it.

https://www.theatlantic.com/magazine/archive/2018/10/yuval-noah-harari-technology-tyranny/568330/

YUVAL NOAH HARARI  OCTOBER 2018 ISSUE

Ordinary people may not understand artificial intelligence and biotechnology in any detail, but they can sense that the future is passing them by. In 1938 the common man’s condition in the Soviet Union, Germany, or the United States may have been grim, but he was constantly told that he was the most important thing in the world, and that he was the future (provided, of course, that he was an “ordinary man,” rather than, say, a Jew or a woman).

In 2018 the common person feels increasingly irrelevant. Lots of mysterious terms are bandied about excitedly in TED Talks, at government think tanks, and at high-tech conferences—globalization, blockchain, genetic engineering, AI, machine learning—and common people, both men and women, may well suspect that none of these terms is about them.

Fears of machines pushing people out of the job market are, of course, nothing new, and in the past such fears proved to be unfounded. But artificial intelligence is different from the old machines. In the past, machines competed with humans mainly in manual skills. Now they are beginning to compete with us in cognitive skills.

Israel is a leader in the field of surveillance technology, and has created in the occupied West Bank a working prototype for a total-surveillance regime. Already today whenever Palestinians make a phone call, post something on Facebook, or travel from one city to another, they are likely to be monitored by Israeli microphones, cameras, drones, or spy software. Algorithms analyze the gathered data, helping the Israeli security forces pinpoint and neutralize what they consider to be potential threats.

The conflict between democracy and dictatorship is actually a conflict between two different data-processing systems. AI may swing the advantage toward the latter.

As we rely more on Google for answers, our ability to locate information independently diminishes. Already today, “truth” is defined by the top results of a Google search. This process has likewise affected our physical abilities, such as navigating space.

So what should we do?

For starters, we need to place a much higher priority on understanding how the human mind works—particularly how our own wisdom and compassion can be cultivated.

+++++++++++++++
more on SCSU student philosophy club in this IMS blog
https://blog.stcloudstate.edu/ims?s=philosophy+student+club

personalized learning in the digital age

If This Is the End of Average, What Comes Next?

By Daniel T. Willingham     Jun 11, 2018

Todd Rose, the director of the Mind, Brain, and Education program at the Harvard Graduate School of Education, has emerged as a central intellectual figure behind the movement. In particular, his 2016 book, “The End of Average,” is seen as an important justification for and guide to the personalization of learning.

Consider what Rose argues against. He holds that our culture is obsessed with measuring and finding averages—averages of human ability and averages of the human body. Sometimes the average is held to be the ideal.

The jaggedness principle means that many of the attributes we care about are multi-faceted, not all of a piece. For example, human ability is not one thing, so it doesn’t make sense to talk about someone as “smart” or “dumb.” That’s unidimensional. Someone might be very good with numbers, very bad with words, about average in using space, and gifted in using visual imagery.

Since the 1930s, psychologists have debated whether intelligence is best characterized as one thing or many.

But most psychologists stopped playing this game in the 1990s. The resolution came through the work of John Carroll, who developed a third model in which abilities form a hierarchy. We can think of abilities as separate, but nested in higher-order abilities. Hence, there is a general, all-purpose intelligence, and it influences other abilities, so they are correlated. But the abilities nested within general intelligence are independent, so the correlations are modest. Thus, Rose’s jaggedness principle is certainly not new to psychology, and it’s incomplete.

The second (Context Principle) of Rose’s principles holds that personality traits don’t exist, and there’s a similar problem with this claim: Rose describes a concept with limited predictive power as having none at all. The most commonly accepted theory holds that personality can be described by variation on five dimensions

Rose’s third principle (pathways principle) suggests that there are multiple ways to reach a goal like walking or reading, and that there is not a fixed set of stages through which each of us passes.

Rose thinks students should earn credentials, not diplomas. In other words, a school would not certify that you’re “educated in computer science” but that you have specific knowledge and skills—that you can program games on handheld devices, for example. He thinks grades should be replaced by testaments of competency (my note: badges); the school affirms that you’ve mastered the skills and knowledge, period. Finally, Rose argues that students should have more flexibility in choosing their educational pathways.

=++++++++++++++++
more on personalized learning in this IMS blog
https://blog.stcloudstate.edu/ims?s=personalized+learning

data is the new oil in Industry 4.0

Why “data is the new oil” and what happens when energy meets Industry 4.0

By Nicholas Waller PUBLISHED 19:42 NOVEMBER 14, 2018

Why “data is the new oil” and what happens when energy meets Industry 4.0

At the Abu Dhabi International Petroleum Exhibition and Conference (ADIPEC) this week, the UAE’s minister of state for Artificial Intelligence, Omar bin Sultan Al Olama, went so far as to declare that “Data is the new oil.”

According to Daniel Yergin, the Pulitzer Prize-winning author, economic historian, and one of the world’s leading experts on the oil & gas sector, there is now a “symbiosis” between energy producers and the new knowledge economy. The production of oil & gas and the generation of data are now, Yergin argues, “wholly inter-dependent”.

What does Oil & Gas 4.0 look like in practice?

The greater use of automation and the collection of data have allowed an upsurge in the “de-manning” of oil & gas facilities.

Thanks to a significant increase in the number of sensors being deployed across operations, companies can monitor what is happening in real time, which markedly improves safety levels.

in the competitive environment of the Fourth Industrial Revolution, no business can afford to be left behind by not investing in new technologies – so strategic discussions are important.

+++++++++++
more on big data in this IMS blog
https://blog.stcloudstate.edu/ims?s=big+data

more on industry 4.0 in this IMS blog
https://blog.stcloudstate.edu/ims?s=industry

deep learning revolution

Sejnowski, T. J. (2018). The Deep Learning Revolution. Cambridge, MA: The MIT Press.

How deep learning―from Google Translate to driverless cars to personal cognitive assistants―is changing our lives and transforming every sector of the economy.

The deep learning revolution has brought us driverless cars, the greatly improved Google Translate, fluent conversations with Siri and Alexa, and enormous profits from automated trading on the New York Stock Exchange. Deep learning networks can play poker better than professional poker players and defeat a world champion at Go. In this book, Terry Sejnowski explains how deep learning went from being an arcane academic field to a disruptive technology in the information economy.

Sejnowski played an important role in the founding of deep learning, as one of a small group of researchers in the 1980s who challenged the prevailing logic-and-symbol based version of AI. The new version of AI Sejnowski and others developed, which became deep learning, is fueled instead by data. Deep networks learn from data in the same way that babies experience the world, starting with fresh eyes and gradually acquiring the skills needed to navigate novel environments. Learning algorithms extract information from raw data; information can be used to create knowledge; knowledge underlies understanding; understanding leads to wisdom. Someday a driverless car will know the road better than you do and drive with more skill; a deep learning network will diagnose your illness; a personal cognitive assistant will augment your puny human brain. It took nature many millions of years to evolve human intelligence; AI is on a trajectory measured in decades. Sejnowski prepares us for a deep learning future.

A pioneering scientist explains ‘deep learning’

Artificial intelligence meets human intelligence

neural networks

Buzzwords like “deep learning” and “neural networks” are everywhere, but so much of the popular understanding is misguided, says Terrence Sejnowski, a computational neuroscientist at the Salk Institute for Biological Studies.

Sejnowski, a pioneer in the study of learning algorithms, is the author of The Deep Learning Revolution (out next week from MIT Press). He argues that the hype about killer AI or robots making us obsolete ignores exciting possibilities happening in the fields of computer science and neuroscience, and what can happen when artificial intelligence meets human intelligence.

Machine learning is a very large field and goes way back. Originally, people were calling it “pattern recognition,” but the algorithms became much broader and much more sophisticated mathematically. Within machine learning are neural networks inspired by the brain, and then deep learning. Deep learning algorithms have a particular architecture with many layers that flow through the network. So basically, deep learning is one part of machine learning and machine learning is one part of AI.
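Sejnowski's nesting (deep learning inside neural networks inside machine learning inside AI) and the phrase "many layers that flow through the network" can be sketched in a few lines; the NumPy fragment below is my toy illustration with made-up layer sizes, not anything from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Simple nonlinearity applied between layers."""
    return np.maximum(0.0, x)

# A "deep" network is just several layers composed in sequence:
# each layer applies a (normally learned) linear map, then a nonlinearity.
layer_sizes = [4, 8, 8, 3]  # input -> two hidden layers -> output
weights = [rng.normal(scale=0.5, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    """Flow an input through every layer of the network."""
    for w in weights[:-1]:
        x = relu(x @ w)
    return x @ weights[-1]  # final layer left linear

out = forward(rng.normal(size=(2, 4)))  # batch of 2 inputs
print(out.shape)  # (2, 3)
```

Training would adjust `weights` from data via backpropagation; that learning step, rather than the architecture alone, is what distinguishes deep learning from earlier hand-engineered pattern recognition.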

December 2012 at the NIPS meeting, which is the biggest AI conference. There, [computer scientist] Geoff Hinton and two of his graduate students showed you could take a very large dataset called ImageNet, with 10,000 categories and 10 million images, and reduce the classification error by 20 percent using deep learning. Traditionally on that dataset, error decreases by less than 1 percent in one year. In one year, 20 years of research was bypassed. That really opened the floodgates.

The inspiration for deep learning really comes from neuroscience.

AlphaGo, the program that beat the Go champion, included not just a model of the cortex, but also a model of a part of the brain called the basal ganglia, which is important for making a sequence of decisions to meet a goal. There’s an algorithm there called temporal differences, developed back in the ’80s by Richard Sutton, that, when coupled with deep learning, is capable of very sophisticated plays that no human has ever seen before.
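Sutton's temporal-difference idea fits in a few lines. The sketch below uses the standard textbook random-walk environment (my example, not AlphaGo's actual setup): each state's value estimate is nudged toward the immediate reward plus the value of whatever state comes next.

```python
import random

random.seed(0)

# Random-walk chain: states 0..6, terminals at 0 and 6,
# reward 1 only for reaching state 6.
values = [0.0] * 7
alpha = 0.1  # learning rate

for _ in range(5000):
    s = 3  # every episode starts in the middle
    while s not in (0, 6):
        s_next = s + random.choice((-1, 1))
        reward = 1.0 if s_next == 6 else 0.0
        # TD(0): bootstrap from the successor state's current estimate.
        target = reward + (0.0 if s_next in (0, 6) else values[s_next])
        values[s] += alpha * (target - values[s])  # TD-error update
        s = s_next

# True values for states 1..5 are 1/6, 2/6, ..., 5/6.
print([round(v, 2) for v in values[1:6]])
```

The same bootstrapping principle, scaled up with a deep network standing in for the `values` table, is what couples temporal differences to deep learning in systems like AlphaGo.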

There’s a convergence occurring between AI and human intelligence. As we learn more and more about how the brain works, that’s going to reflect back in AI. But at the same time, researchers are actually creating a whole theory of learning that can be applied to understanding the brain, allowing us to analyze thousands of neurons and how their activities come about. So there’s a feedback loop between neuroscience and AI.

+++++++++++
deep learning revolution
https://blog.stcloudstate.edu/ims?s=deep+learning

Preparing Learners for 21st Century Digital Citizenship

ID2ID webinar (my notes on the bottom)

Digital Fluency: Preparing Learners for 21st Century Digital Citizenship
Eighty-five percent of the jobs available in 2030 do not yet exist.  How does higher education prepare our learners for careers that don’t yet exist?  One opportunity is to provide our students with opportunities to grow their skills in creative problem solving, critical thinking, resiliency, novel thinking, social intelligence, and excellent communication skills.  Instructional designers and faculty can leverage the framework of digital fluency to create opportunities for learners to practice and hone the skills that will prepare them to be 21st-century digital citizens.  In this session, join a discussion about several fluencies that comprise the overarching framework for digital fluency and help to define some of your own.

Please click this URL to join. https://arizona.zoom.us/j/222969448

Dr. Jennifer Sparrow, Senior Director for Teaching and Learning with Technology and Affiliate Assistant Professor of Learning, Design, and Technology at Penn State.    The webinar will take place on Friday, November 9th at 11am EST/4pm UTC (login details below)  

https://arizona.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=e15266ee-7368-4378-b63c-a99301274877

My notes:

Jennifer does NOT see phone use for learning as something to obstruct. Much as the calculator was frowned upon some 30-40 years ago, so is technology now. To this notion she added the fast-changing job market: new jobs are being created while old ones disappear (https://www.nbcnews.com/news/us-news/students-are-being-prepared-jobs-no-longer-exist-here-s-n865096)

How is digital fluency different from digital literacy? Fluency enables students to define how new knowledge can be created through technology: not only reading and writing but, by analogy with learning a language, creating poems and stories. slide 4 in https://www.slideshare.net/aidemoreto/vr-library

Communication fluency: being able to choose the correct medium. Curiosity/failure fluency. Creation fluency (makerspace: create without soldering, programming, 3D printing; PLA filament, a corn-based plastic; makers-in-residence).

Immersive fluency: 360 video, VR, and AR enable students to create new knowledge through environments beyond reality. Immersive Experiences Lab (IMEX). Design: physical vs. virtual spaces.

Data fluency: b.book, how to create my own textbook.

rubrics and sample projects to assess digital fluency.

https://er.educause.edu/articles/2018/3/digital-fluency-preparing-students-to-create-big-bold-problems

https://events.educause.edu/annual-conference/2018/agenda/ethics-and-digital-fluency-in-vr-and-immersive-learning-environments

Literacy Is NOT Enough: 21st Century Fluencies for the Digital Age (The 21st Century Fluency Series)
https://www.amazon.com/Literacy-NOT-Enough-Century-Fluencies/dp/1412987806

What is Instructional Design 2.0 or 3.0? First, deep knowledge and understanding of faculty development. Second, once faculty understand the new technology, how does this translate into reworking the curriculum? Third, the research piece: how to improve in order to be ready for the next cycle. A partnership between ID and faculty.

digital transformation online professional education

Sharpen the digital transformation strategy for your business.

Enroll today in Digital Transformation: From AI and IoT to Cloud, Blockchain, and Cybersecurity

https://professionalonline1.mit.edu/digital-transformation/index.php

PROGRAM FEES: $2,300
STARTS ON: November 28, 2018
2 months, online; 6-8 hours per week

A Digital Revolution Is Underway.

In a rapidly expanding digital marketplace, legacy companies without a clear digital transformation strategy are being left behind. How can we stay on top of rapid—and sometimes radical—change? How can we position our organizations to take advantage of new technologies? How can we track and combat the security threats facing all of us as we are swept forward into the future?

Who is this Program for?

  • Professionals in traditional companies poised to implement strategic change, as well as entrepreneurs seeking to harness the opportunities afforded by new technologies, will learn the fundamentals of digital transformation and secure the necessary tools to navigate their enterprise to a digital platform.
  • Participants come from a wide range of industries and include C-suite executives, business consultants, corporate attorneys, risk officers, marketing, R&D, and innovation enablers.

Your Learning Journey

This online program takes you through the fundamentals of the digital technologies transforming our world today. Led by MIT faculty at the forefront of data science, participants will learn the history and application of transformative technologies such as blockchain, artificial intelligence, cloud computing, IoT, and cybersecurity, as well as the implications of employing—or ignoring—digitalization.

Brochure_MIT_PE_DigitalTransformation_17_Oct_18_V20-1w4qpjv

