Industrial revolutions are momentous events. By most reckonings, there have been only three. The first was triggered in the 1700s by the commercial steam engine and the mechanical loom. The harnessing of electricity and mass production sparked the second, around the start of the 20th century. The computer set the third in motion after World War II.
It was Henning Kagermann, the head of the German National Academy of Science and Engineering (Acatech), who gave the fourth its name in 2011, when he used the term Industrie 4.0 to describe a proposed government-sponsored industrial initiative.
The term Industry 4.0 refers to the combination of several major innovations in digital technology.
These technologies include advanced robotics and artificial intelligence; sophisticated sensors; cloud computing; the Internet of Things; data capture and analytics; digital fabrication (including 3D printing); software-as-a-service and other new marketing models; smartphones and other mobile devices; platforms that use algorithms to direct motor vehicles (including navigation tools, ride-sharing apps, delivery and ride services, and autonomous vehicles); and the embedding of all these elements in an interoperable global value chain, shared by many companies from many countries.
Companies that embrace Industry 4.0 are beginning to track everything they produce from cradle to grave, sending out upgrades for complex products after they are sold (in the same way that software has come to be updated). These companies are learning mass customization: the ability to make products in batches of one as inexpensively as they could make a mass-produced product in the 20th century, while fully tailoring the product to the specifications of the purchaser.
Three aspects of digitization form the heart of an Industry 4.0 approach.
• The full digitization of a company’s operations
• The redesign of products and services
• Closer interaction with customers
Making Industry 4.0 work requires major shifts in organizational practices and structures. These shifts include new forms of IT architecture and data management, new approaches to regulatory and tax compliance, new organizational structures, and — most importantly — a new digitally oriented culture, which must embrace data analytics as a core enterprise capability.
As Klaus Schwab put it in his recent book The Fourth Industrial Revolution (World Economic Forum, 2016), “Contrary to the previous industrial revolutions, this one is evolving at an exponential rather than linear pace.… It is not only changing the ‘what’ and the ‘how’ of doing things, but also ‘who’ we are.”
This great integrating force is gaining strength at a time of political fragmentation — when many governments are considering making international trade more difficult. It may indeed become harder to move people and products across some national borders. But Industry 4.0 could overcome those barriers by enabling companies to transfer just their intellectual property, including their software, while letting each nation maintain its own manufacturing networks.
More on the Internet of Things in this IMS blog: http://blog.stcloudstate.edu/ims?s=internet+of+things
A study, the “Why We Post” project, has just been published by nine anthropologists led by Daniel Miller of University College London. The researchers worked independently for 15 months at locations in Brazil, Britain, Chile, China (one rural and one industrial site), India, Italy, Trinidad and Tobago, and Turkey.
In rural China and Turkey social media were viewed as a distraction from education. But in industrial China and Brazil they were seen to be an educational resource. Such a divide was evident in India, too. There, high-income families regarded them with suspicion but low-income families advocated them as a supplementary source of schooling. In Britain, meanwhile, they were valued not directly as a means of education, but as a way for pupils, parents and teachers to communicate.
How would you answer if you were a participant in this study? How do you see social media? Do you see it differently than before?
On a recent visit in 2015, I found the social media landscape dramatically changed, again. Facebook began actively steering reading practices through changes in 2013 to the News Feed algorithm, which determines content in the site’s central feature. That year, Facebook announced an effort to prioritize “high quality content,” defined as timely, relevant, and trustworthy—and not clickbait, memes, or other viral links. This policy, along with changing practices in sharing news content generally, meant that current events could unfold on and through social media.
How much of your news do you acquire through social media? Do you trust the information you acquire through social media? #FakeNews – have you explored this hashtag? What is your take on fake news?
Meaning management:
Anthropologists and other culturally sensitive analysts take complex bits of data and develop a higher-order sense of them. Information and meaning work at cross purposes. In managing meaning, context is everything, while in managing information, context is error and noise. When we give our social listening projects to information specialists, we lose an appreciation of context and with it the ability to extract the meanings that provide insight for our companies and brands.
Meaning management also involves a deeper appreciation of social listening as a component of a broader meaning-making system, rather than as, simply, a data source to be exploited.
How do you perceive meaning management? Do you see yourself as a professional with the ability to collect, analyze, and interpret such data for your company?
Twitter is updating its top search results so that tweets will be ranked based on relevance instead of by time, bringing those search results in line with what users have experienced on their timeline for the past 10 months. Based on early trials, the company claims there has been more engagement in search results and tweets, with more time spent using the service.
In February, Twitter announced that it was shaking up the user timeline for everyone. Previously, it had offered the algorithmic change as an opt-in program, but it became mandatory a month later. The effort was intended to make the service more appealing to new and casual users.
Twitter’s search was broken into various categories, such as most popular or the latest (a live stream), and segmented by people, photos, videos, and more. Now it appears that there’s one more signal being used to algorithmically control how at least the top tweets are shown to you.
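Twitter has not published the signals behind its relevance ranking, but the general idea can be sketched with a toy scorer. Every weight and the decay constant below are invented for illustration; they are not Twitter's actual parameters.

```python
import time

def relevance_score(tweet, now, half_life_hours=24.0):
    """Toy relevance: engagement damped by an exponential time decay.
    The weights and half-life are illustrative, not Twitter's signals."""
    engagement = tweet["likes"] + 2 * tweet["retweets"] + 3 * tweet["replies"]
    age_hours = (now - tweet["posted_at"]) / 3600.0
    decay = 0.5 ** (age_hours / half_life_hours)
    return (1 + engagement) * decay

def rank_search_results(tweets, now):
    # A recency-only sort would order by posted_at; relevance blends in
    # engagement, so older viral tweets can outrank fresh quiet ones.
    return sorted(tweets, key=lambda t: relevance_score(t, now), reverse=True)

now = time.time()
tweets = [
    {"text": "old but viral", "likes": 900, "retweets": 300, "replies": 50,
     "posted_at": now - 48 * 3600},
    {"text": "fresh, little engagement", "likes": 2, "retweets": 0, "replies": 0,
     "posted_at": now - 600},
]
ranked = rank_search_results(tweets, now)
# The older, highly engaged tweet ranks first under this scorer.
```

This is the essential trade-off the article describes: "latest" ordering and "top" ordering are just two extremes of how heavily the time-decay term is weighted.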
More about Twitter in this IMS blog.
Alice and Bob have figured out a way to have a conversation without Eve being able to overhear, no matter how hard she tries.
They’re artificial intelligence algorithms created by Google engineers, and their ability to create an encryption protocol that Eve (also an AI algorithm) can’t hack is being hailed as an important advance in machine learning and cryptography.
Martin Abadi and David G. Andersen explained in a paper published this week that their experiment is intended to find out if neural networks—the building blocks of AI—can learn to communicate secretly.
As Abadi and Andersen wrote, “instead of training each of Alice and Bob separately to implement some known cryptosystem, we train Alice and Bob jointly to communicate successfully and to defeat Eve without a pre-specified notion of what cryptosystem they may discover for this purpose.”
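The shape of that joint training objective can be sketched in miniature. The three "parties" below are trivial placeholder functions, not the paper's learned neural networks, and the penalty term is simplified; the point is only to show how Alice and Bob's loss rewards defeating Eve.

```python
import math
import random

random.seed(0)
N = 16  # bits per message, encoded as -1.0 / +1.0 as in the paper

def bits(n):
    return [random.choice([-1.0, 1.0]) for _ in range(n)]

# Placeholder "networks" -- the real experiment trained neural nets;
# these stand-ins exist only to show the structure of the losses.
def alice(plaintext, key):
    return [math.tanh(p + k) for p, k in zip(plaintext, key)]   # ciphertext

def bob(ciphertext, key):
    return [math.tanh(c - k) for c, k in zip(ciphertext, key)]  # has the key

def eve(ciphertext):
    return [math.tanh(c) for c in ciphertext]                   # no key

def l1(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

p, k = bits(N), bits(N)
c = alice(p, k)

loss_eve = l1(p, eve(c))        # Eve is trained to minimize this
# Alice and Bob are trained jointly: reconstruct the plaintext AND
# push Eve toward random guessing (an L1 error of 1.0 on +/-1 bits).
loss_alice_bob = l1(p, bob(c, k)) + (1.0 - loss_eve) ** 2
```

In the actual experiment, gradient descent on these two competing objectives is what lets Alice and Bob discover an encryption scheme that was never specified in advance.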
The same story in German coverage (translated):

Google's AI independently develops encryption

Google researchers Martin Abadi and David G. Andersen of the deep-learning project “Google Brain” have developed, or rather have had their systems develop, a new encryption method. The researchers tasked several neural networks with setting up eavesdropping-proof communication.
W3Schools – Fantastic set of interactive tutorials for learning different languages. Their SQL tutorial is second to none. You’ll learn how to manipulate data in MySQL, SQL Server, Access, Oracle, Sybase, DB2 and other database systems.
Treasure Data – The best way to learn is to work towards a goal. That’s what this helpful blog series is all about. You’ll learn SQL from scratch by following along with a simple, but common, data analysis scenario.
10 Queries – This course is recommended for the intermediate SQL-er who wants to brush up on his/her skills. It’s a series of 10 challenges coupled with forums and external videos to help you improve your SQL knowledge and understanding of the underlying principles.
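For a quick taste of the data manipulation these tutorials cover, here is a minimal, self-contained example using Python's built-in sqlite3 module (SQLite rather than MySQL or Oracle, but the basic query syntax carries over):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("ana", 40.0), ("ben", 15.5), ("ana", 9.5)])

# A typical beginner exercise: aggregate, filter on the aggregate, sort.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > 20
    ORDER BY total DESC
""").fetchall()
# rows == [('ana', 49.5)]
```

GROUP BY, HAVING, and ORDER BY are exactly the constructs the W3Schools SQL tutorial drills, so this doubles as a checkpoint for what you should be able to write after working through it.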
TryR – Created by Code School, this interactive online tutorial system is designed to step you through R for statistics and data modeling. As you work through their seven modules, you’ll earn badges that mark your progress and help you stay on track.
Leada – If you’re a complete R novice, try Leada’s introduction to R. In their 1 hour 30 min course, they’ll cover installation, basic usage, common functions, data structures, and data types. They’ll even set you up with your own development environment in RStudio.
Advanced R – Once you’ve mastered the basics of R, bookmark this page. It’s a fantastically comprehensive style guide to using R. We should all strive to write beautiful code, and this resource (based on Google’s R style guide) is your key to that ideal.
Swirl – Learn R in R – a radical idea certainly. But that’s exactly what Swirl does. They’ll interactively teach you how to program in R and do some basic data science at your own pace. Right in the R console.
Python for beginners – The Python website actually has a pretty comprehensive and easy-to-follow set of tutorials. You can learn everything from installation to complex analyses. It also gives you access to the Python community, who will be happy to answer your questions.
PythonSpot – A complete list of Python tutorials to take you from zero to Python hero. There are tutorials for beginners, intermediate and advanced learners.
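To give a sense of where beginner tutorials like these lead, here is the kind of small first analysis you might write with nothing but the standard library (the readings are made-up sample data):

```python
from statistics import mean, median

temperatures = [21.5, 23.1, 19.8, 24.6, 22.0, 20.3, 25.2]  # a week of readings

summary = {
    "count": len(temperatures),
    "mean": round(mean(temperatures), 2),
    "median": median(temperatures),
    "range": max(temperatures) - min(temperatures),
}
# summary["mean"] == 22.36, summary["median"] == 22.0
```

Once this feels comfortable, the jump to the data mining books and courses below is mostly a matter of scale and tooling, not new concepts.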
Read all about it: data mining books
Data Jujitsu: The Art of Turning Data into Product – This free book by DJ Patil gives you a brief introduction to the complexity of data problems and how to approach them. He gives nice, understandable examples that cover the most important thought processes of data mining. It’s a great book for beginners but still interesting to the data mining expert. Plus, it’s free!
Data Mining: Concepts and Techniques – The third (and most recent) edition will give you an understanding of the theory and practice of discovering patterns in large data sets. Each chapter is a stand-alone guide to a particular topic, making it a good resource if you’re not into reading in sequence or you want to know about a particular topic.
Mining of Massive Datasets – Based on the Stanford Computer Science course, this book is often cited by data scientists as one of the most helpful resources around. It’s designed at the undergraduate level with no formal prerequisites. It’s the next best thing to actually going to Stanford!
Hadoop: The Definitive Guide – As a data scientist, you will undoubtedly be asked about Hadoop. So you’d better know how it works. This comprehensive guide will teach you how to build and maintain reliable, scalable, distributed systems with Apache Hadoop. Make sure you get the most recent edition to keep up with this fast-changing service.
Online learning: data mining webinars and courses
DataCamp – Learn data mining from the comfort of your home with DataCamp’s online courses. They have free courses on R, Statistics, Data Manipulation, Dynamic Reporting, Large Data Sets and much more.
Coursera – Coursera brings you all the best University courses straight to your computer. Their online classes will teach you the fundamentals of interpreting data, performing analyses and communicating insights. They have topics for beginners and advanced learners in Data Analysis, Machine Learning, Probability and Statistics and more.
Udemy – With a range of free and paid-for data mining courses, you’re sure to find something you like on Udemy, no matter your level. There are 395 courses in the area of data mining alone! All their courses are uploaded by other Udemy users, meaning quality can fluctuate, so make sure you read the reviews.
CodeSchool – These courses are handily organized into “Paths” based on the technology you want to learn. You can do everything from build a foundation in Git to take control of a data layer in SQL. Their engaging online videos will take you step-by-step through each lesson and their challenges will let you practice what you’ve learned in a controlled environment.
Udacity – Master a new skill or programming language with Udacity’s unique series of online courses and projects. Each class is developed by a Silicon Valley tech giant, so you know that what you’re learning will be directly applicable to the real world.
Treehouse – Learn from experts in web design, coding, business and more. The video tutorials from Treehouse will teach you the basics and their quizzes and coding challenges will ensure the information sticks. And their UI is pretty easy on the eyes.
Learn from the best: top data miners to follow
John Foreman – Chief Data Scientist at MailChimp and author of Data Smart, John is worth a follow for his witty yet poignant tweets on data science.
DJ Patil – Author and Chief Data Scientist at The White House OSTP, DJ tweets everything you’ve ever wanted to know about data in politics.
Nate Silver – He’s Editor-in-Chief of FiveThirtyEight, a blog that uses data to analyze news stories in Politics, Sports, and Current Events.
Andrew Ng – As the Chief Data Scientist at Baidu, Andrew is responsible for some of the most groundbreaking developments in Machine Learning and Data Science.
Bernard Marr – He might know pretty much everything there is to know about Big Data.
Gregory Piatetsky – He’s the author of popular data science blog KDNuggets, the leading newsletter on data mining and knowledge discovery.
Christian Rudder – As the Co-founder of OKCupid, Christian has access to one of the most unique datasets on the planet, and he uses it to give fascinating insight into human nature, love, and relationships.
Dean Abbott – He’s contributed to a number of data blogs and authored his own book on Applied Predictive Analytics. At the moment, Dean is Chief Data Scientist at SmarterHQ.
Practice what you’ve learned: data mining competitions
Kaggle – This is the ultimate data mining competition platform. The world’s biggest corporations offer big prizes for solving their toughest data problems.
Stack Overflow – The best way to learn is to teach. Stack Overflow offers the perfect forum for you to prove your data mining know-how by answering fellow enthusiasts’ questions.
TunedIT – With a live leaderboard and interactive participation, TunedIT offers a great platform to flex your data mining muscles.
DrivenData – You can find a number of nonprofit data mining challenges on DrivenData. All of your mining efforts will go towards a good cause.
Quora – Another great site to answer questions on just about everything. There are plenty of curious data lovers on there asking for help with data mining and data science.
Meet your fellow data miner: social networks, groups and meetups
Facebook – As with many social media platforms, Facebook is a great place to meet and interact with people who have similar interests. There are a number of very active data mining groups you can join.
LinkedIn – If you’re looking for data mining experts in a particular field, look no further than LinkedIn. There are hundreds of data mining groups ranging from the generic to the hyper-specific. In short, there’s sure to be something for everyone.
Meetup – Want to meet your fellow data miners in person? Attend a meetup! Just search for data mining in your city and you’re sure to find an awesome group near you.
Yochai Benkler explains: “The various formats of the networked public sphere provide anyone with an outlet to speak, to inquire, to investigate, without need to access the resources of a major media organization.”
Democratic bodies are typically elected in periods of three to five years, yet citizen opinions seem to fluctuate daily and sometimes these mood swings grow to enormous proportions. When thousands of people all start tweeting about the same subject on the same day, you know that something is up. With so much dynamic and salient political diversity in the electorate, how can policy-makers ever reach a consensus that could satisfy everyone?
At the same time, it would be a grave mistake to discount the voices of the internet as something that has no connection to real political situations.
What happened in the UK was not only a political disaster, but also a vivid example of what happens when you combine the uncontrollable power of the internet with a lingering visceral feeling that ordinary people have lost control of the politics that shape their lives.
Polarization as a driver of populism
People who have long entertained right-wing populist ideas, but were never confident enough to voice them openly, are now in a position to connect to like-minded others online and use the internet as a megaphone for their opinions.
The resulting echo chambers tend to amplify and reinforce our existing opinions, which is dysfunctional for a healthy democratic discourse. And while social media platforms like Facebook and Twitter generally have the power to expose us to politically diverse opinions, research suggests that the filter bubbles they sometimes create are, in fact, exacerbated by the platforms’ personalization algorithms, which are based on our social networks and our previously expressed ideas. This means that instead of creating an ideal type of a digitally mediated “public agora”, which would allow citizens to voice their concerns and share their hopes, the internet has actually increased conflict and ideological segregation between opposing views, granting a disproportionate amount of clout to the most extreme opinions.
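That amplification claim can be illustrated with a deliberately simplified "bounded confidence" simulation (the window size and update rate below are invented for illustration): when users only hear opinions close to their own, the population settles into separated clusters rather than converging on a shared middle.

```python
def step(opinions, window):
    """Synchronous update: each user moves 30% of the way toward the
    average of opinions within `window` of their own (their bubble),
    ignoring everyone outside it."""
    new = []
    for me in opinions:
        bubble = [o for o in opinions if abs(o - me) <= window]
        new.append(me + 0.3 * (sum(bubble) / len(bubble) - me))
    return new

# 101 opinions evenly spaced on a -1 .. +1 ideological axis.
opinions = [i / 50.0 - 1.0 for i in range(101)]
spread_before = max(opinions) - min(opinions)  # 2.0

for _ in range(50):
    opinions = step(opinions, window=0.3)

spread_after = max(opinions) - min(opinions)
# Groups more than `window` apart never interact again, so instead of
# one consensus the population freezes into distinct opinion clusters.
```

The mechanism is the point: no actor here is malicious, yet restricting each user's inputs to a narrow similarity window is enough to produce durable ideological segregation.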
The disintegration of the general will
In political philosophy, the very idea of democracy is based on the principle of the general will, which was proposed by Jean-Jacques Rousseau in the 18th century. Rousseau envisioned that a society needs to be governed by a democratic body that acts according to the imperative will of the people as a whole.
There can be no doubt that a new form of digitally mediated politics is a crucial component of the Fourth Industrial Revolution: the internet is already used for bottom-up agenda-setting, empowering citizens to speak up in a networked public sphere, and pushing the boundaries of the size, sophistication and scope of collective action. In particular, social media has changed the nature of political campaigning and will continue to play an important role in future elections and political campaigns around the world.
More on the impact of technology on democracy in this IMS blog:
At a Ford Foundation conference dubbed Fairness by Design, officials, academics and advocates discussed how to address the problem of encoding human bias in algorithmic analysis. The White House recently issued a report on the topic to accelerate research into the issue.
U.S. CTO Megan Smith said the government has been “creating a seat for these techies,” but that training future generations of data scientists to tackle these issues depends on what we do today. “It’s how did we teach our children?” she said. “Why don’t we teach math and science the way we teach P.E. and art and music and make it fun?”
“Ethics is not just an elective, but some portion of the main core curriculum.”
DARPA’s holographic imaging system hopes to show objects behind a wall or around a corner – Eraser anyone?
04/28/2016 – 18:21 Kim Cobb
SMU’s Lyle School of Engineering will lead a multi-university team funded by the Defense Advanced Research Projects Agency (DARPA) to build a theoretical framework for creating a computer-generated image of an object hidden from sight around a corner or behind a wall.
The core of the proposal is to develop a computer algorithm to unscramble the light that bounces off irregular surfaces to create a holographic image of hidden objects.
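The project's algorithm is not public, but one common way to frame "unscrambling" scattered light is as a linear inverse problem: if scattering mixes the hidden scene through a known (calibrated) matrix, recovery amounts to inverting that mixing. A deliberately tiny two-pixel sketch:

```python
# Hidden "scene" (two pixels) and an assumed, pre-calibrated mixing
# ("transmission") matrix describing how the wall scrambles light.
scene = [0.8, 0.3]
A = [[2.0, 1.0],
     [1.0, 3.0]]

# Forward model: the sensor sees only the scrambled mixture.
measured = [A[0][0] * scene[0] + A[0][1] * scene[1],
            A[1][0] * scene[0] + A[1][1] * scene[1]]

# Recovery: invert the 2x2 mixing matrix analytically.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
recovered = [( A[1][1] * measured[0] - A[0][1] * measured[1]) / det,
             (-A[1][0] * measured[0] + A[0][0] * measured[1]) / det]
# recovered == [0.8, 0.3]
```

The hard research problems are everything this sketch assumes away: the real mixing is enormous, only partially known, and corrupted by noise, which is why the team needs a theoretical framework rather than a textbook matrix inverse.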
Similar technologies are pursued by Microsoft HoloLens, as reported in this IMS blog entry:
Are you ready to deal with “denial of sleep” attacks? Those are attacks using malicious code, propagated through the Internet of Things, aimed at draining the batteries of your devices by keeping them awake.
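One mitigation sketch (an illustration, not a specific vendor feature): give the device a wake budget, so a flood of malicious wake requests cannot keep it awake indefinitely and drain the battery.

```python
class WakeBudget:
    """Rate-limit radio wake-ups: honor at most `max_wakes` wake
    requests per `per_seconds` sliding window; refuse the rest."""

    def __init__(self, max_wakes, per_seconds):
        self.max_wakes = max_wakes
        self.per_seconds = per_seconds
        self.wake_times = []

    def allow_wake(self, now):
        # Drop wake events that have aged out of the window.
        self.wake_times = [t for t in self.wake_times
                           if now - t < self.per_seconds]
        if len(self.wake_times) < self.max_wakes:
            self.wake_times.append(now)
            return True
        return False  # stay asleep: budget exhausted

budget = WakeBudget(max_wakes=5, per_seconds=60)
results = [budget.allow_wake(now=t) for t in range(10)]  # 10 requests in 10 s
# Only the first 5 are honored; the rest are refused.
```

The trade-off is latency for legitimate traffic, which is exactly the kind of policy decision IoT designers will have to make explicitly.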
Security. Threats extend well beyond denial of sleep: “The IoT introduces a wide range of new security risks and challenges to the IoT devices themselves, their platforms and operating systems, their communications, and even the systems to which they’re connected.”
Analytics. IoT will require a new approach to analytics. “New analytic tools and algorithms are needed now, but as data volumes increase through 2021, the needs of the IoT may diverge further from traditional analytics,” according to Gartner.
Device (Thing) Management. IoT things that are not ephemeral — that will be around for a while — will require management like every other device (firmware updates, software updates, etc.), and that introduces problems of scale.
Low-Power, Short-Range IoT Networks. Short-range networks connecting IoT devices will be convoluted. There will not be a single common infrastructure connecting devices.
Low-Power, Wide-Area Networks. Current solutions are proprietary, but standards will come to dominate.
Processors and Architecture. Designing devices with an understanding of those devices’ needs will require “deep technical skills.”
Operating Systems. There’s a wide range of systems out there that have been designed for specific purposes.
Event Stream Processing. “Some IoT applications will generate extremely high data rates that must be analyzed in real time.”
Platforms. “IoT platforms bundle many of the infrastructure components of an IoT system into a single product.”
Standards and Ecosystems. As IoT devices proliferate, new ecosystems will emerge, and there will be “commercial and technical battles between these ecosystems” that “will dominate areas such as the smart home, the smart city and healthcare.”
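The event stream processing item above is the most concrete of these trends. As a minimal illustration of analyzing a high-rate stream in real time, a sliding-window average can be maintained in constant time per event rather than re-scanning stored data:

```python
from collections import deque

class SlidingAverage:
    """Sliding-window mean over the last `window` readings,
    updated in O(1) per incoming event."""

    def __init__(self, window):
        self.window = window
        self.values = deque()
        self.total = 0.0

    def add(self, value):
        self.values.append(value)
        self.total += value
        if len(self.values) > self.window:
            self.total -= self.values.popleft()
        return self.total / len(self.values)

avg = SlidingAverage(window=3)
readings = [10.0, 20.0, 30.0, 40.0]
averages = [avg.add(r) for r in readings]
# averages == [10.0, 15.0, 20.0, 30.0]
```

Production stream processors add partitioning, fault tolerance, and out-of-order handling on top of this basic incremental pattern, which is why Gartner flags the area as needing new tools.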
In November 2015, the Open University released the latest edition of its ‘Innovating Pedagogy’ report, the fourth installment of an annual educational technology and teaching techniques forecast. While the timelines and publishing interval may remind you of the Horizon Report, the methodology for gathering the trends is different.
The NMC Horizon Team uses a modified Delphi survey approach with a panel of experts.
10 Innovative Pedagogy Trends from the 2015 Edition:
Crossover Learning: Connecting formal and informal learning, including recognition of diverse, informal achievements with badges.
Learning through Argumentation: To fully understand scientific ideas and effectively participate in public debates students should practice the kinds of inquiry and communication processes that scientists use, and pursue questions without known answers, rather than reproducing facts.
Incidental Learning: A subset of informal learning, incidental learning occurs through unstructured exploration, play and discovery. Mobile technologies can support incidental learning. An example is the app and website iSpot Nature.
Context-based Learning: Mobile applications and augmented reality can enrich the learners’ context. An example is the open source mobile game platform ARIS.
Computational Thinking: The skills that programmers apply to analyze and solve problems are seen as an emerging trend. An example is the programming environment Scratch.
Learning by Doing Science with Remote Labs: A collection of accessible remote labs is iLab.
Embodied Learning: Involving the body is essential for some forms of learning; the trend explores how physical activities can influence cognitive processes.
Adaptive Teaching: Intelligent tutoring systems are computer applications that analyse data from learning activities to provide learners with relevant content and to sequence learning activities based on prior knowledge.
Analytics of Emotions: As techniques for tracking eye movements, emotions and engagement have matured over the past decade, the trend forecasts opportunities for emotionally adaptive learning environments.
Stealth Assessment: In computer games the player’s progress gradually changes the game world, setting increasingly difficult problems through unobtrusive, continuous assessment.
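The adaptive teaching and stealth assessment trends share one core loop, which can be sketched in a few lines: continuously estimate the learner's skill from their answers and pick the next task to match. The update rule and all numbers below are invented for illustration, not taken from any real tutoring system.

```python
def update_skill(skill, task_difficulty, correct, rate=0.5):
    """Nudge the skill estimate up after a correct answer, down after
    an incorrect one, scaled by how surprising the outcome was."""
    if correct:
        return skill + rate * max(0.0, task_difficulty - skill + 1.0)
    return skill - rate * max(0.0, skill - task_difficulty + 1.0)

def next_task(skill, tasks):
    # Stealth assessment idea: pick the task whose difficulty is
    # closest to the current estimate, so assessment is unobtrusive.
    return min(tasks, key=lambda d: abs(d - skill))

tasks = [1.0, 2.0, 3.0, 4.0, 5.0]   # difficulty levels on offer
skill = 1.0                          # start with a low estimate

for correct in [True, True, True, False]:   # simulated answers
    task = next_task(skill, tasks)
    skill = update_skill(skill, task, correct)
```

Real systems replace this toy rule with statistical learner models, but the loop of "estimate, select, observe, re-estimate" is the same one that lets a game world set increasingly difficult problems without ever pausing for a test.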
6 Themes of Pedagogical Innovation
Based upon a review of previous editions, the report tries to categorize pedagogical innovation into six overarching themes:
“What started as a small set of basic teaching methods (instruction, discovery, inquiry) has been extended to become a profusion of pedagogies and their interactions. So, to try to restore some order, we have examined the previous reports and identified six overarching themes: scale, connectivity, reflection, extension, embodiment, and personalisation.”
Delivering education at massive scale.
Connecting learners from different nations, cultures and perspectives.
Fostering reflection and contemplation.
Extending traditional teaching methods and settings.
Recognizing embodied learning (explore, create, craft, and construct).
Creating a personalized path through educational content.
Follow these links to blog posts and EdITLib resources to further explore selected trends:
Interested in the Innovating Pedagogy report? Read our review of the 2014 edition, and reflect on which trends are closest to becoming common practice.