Artificial intelligence (AI) and machine learning are no longer fantastical prospects seen only in science fiction. Products like Amazon Echo and Siri have brought AI into many homes, but Kelly Calhoun Williams, an education analyst for the technology research firm Gartner Inc., cautions that there is a clear gap between the promise of AI and the reality of AI.
Artificial intelligence is a broad term used to describe any technology that emulates human intelligence, such as by understanding complex information, drawing its own conclusions and engaging in natural dialog with people.
Machine learning is a subset of AI in which the software can learn or adapt like a human can. Essentially, it analyzes huge amounts of data and looks for patterns in order to classify information or make predictions. The addition of a feedback loop allows the software to “learn” as it goes by modifying its approach based on whether the conclusions it draws are right or wrong.
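The feedback loop described above can be sketched in a few lines of code. The following is an illustrative toy example (a perceptron-style classifier with made-up data, not taken from any product mentioned in this article): the program predicts, checks whether it was right or wrong, and modifies its weights only when it made a mistake.

```python
# Minimal feedback-loop "learning" sketch: a perceptron that adjusts its
# weights whenever its prediction is wrong (illustrative toy example).

def train(samples, labels, epochs=20, lr=0.1):
    """samples: list of feature vectors; labels: 0/1 targets."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            error = y - pred          # feedback: 0 if right, +/-1 if wrong
            if error:                 # modify the approach only on mistakes
                w = [wi + lr * error * xi for wi, xi in zip(w, x)]
                b += lr * error
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Learn a simple pattern: label is 1 when the first feature dominates.
X = [[2, 0], [3, 1], [0, 2], [1, 3]]
y = [1, 1, 0, 0]
w, b = train(X, y)
```

Real systems use far larger models and data, but the loop is the same: predict, compare against the known answer, adjust.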
AI can process far more information than a human can, and it can perform tasks much faster and with more accuracy. Some curriculum software developers have begun harnessing these capabilities to create programs that can adapt to each student’s unique circumstances.
GoGuardian, a Los Angeles company, uses machine learning technology to improve the accuracy of its cloud-based Internet filtering and monitoring software for Chromebooks. (My note: that smells of Big Brother.) Instead of blocking students’ access to questionable material based on a website’s address or domain name, GoGuardian’s software uses AI to analyze the actual content of a page in real time to determine whether it’s appropriate for students. (My note: privacy.)
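The shift from domain-based to content-based filtering can be illustrated with a deliberately simple sketch. GoGuardian’s actual classifier is proprietary; the word lists, weights and threshold below are entirely made up, and real products use trained models rather than keyword scores. The point is only that the decision is made from the page’s text, not its URL.

```python
# Illustrative sketch of content-based (rather than domain-based) filtering:
# score the words on a page against a weighted list of flagged terms.
# The terms, weights and threshold here are invented for the example.

FLAGGED_TERMS = {"gambling": 3, "casino": 3, "violence": 2, "weapon": 2}

def page_score(text):
    """Sum the weights of any flagged terms appearing in the page text."""
    return sum(FLAGGED_TERMS.get(word, 0) for word in text.lower().split())

def is_appropriate(text, threshold=4):
    """Decide on the page's actual content, not its address."""
    return page_score(text) < threshold
```

A real classifier would also handle context (a health page mentioning “violence” is not a violent page), which is exactly where machine learning replaces fixed word lists.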
The use of AI in schools also raises serious privacy concerns. It requires an increased focus not only on data quality and accuracy, but also on the responsible stewardship of this information. “School leaders need to get ready for AI from a policy standpoint,” Calhoun Williams said. For instance: What steps will administrators take to secure student data and ensure the privacy of this information?
James Dixon, the CTO of Pentaho, is credited with coining the term “data lake.” He uses the following analogy:
“If you think of a datamart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.”
A data lake holds data in an unstructured way; there is no hierarchy or organization among the individual pieces of data. It holds data in its rawest form: not processed, not analyzed. Additionally, a data lake accepts and retains all data from all data sources and supports all data types; schemas (the way the data is structured in a database) are applied only when the data is ready to be used.
What is a data warehouse?
A data warehouse stores data in an organized manner with everything archived and ordered in a defined way. When a data warehouse is developed, a significant amount of effort occurs during the initial stages to analyze data sources and understand business processes.
Data lakes retain all data—structured, semi-structured and unstructured/raw—and it’s possible that some of the data in a data lake will never be used. A data warehouse, by contrast, includes only data that is processed (structured), and only the data necessary for reporting or to answer specific business questions.
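The contrast above is often summarized as “schema on read” (lake) versus “schema on write” (warehouse). A minimal sketch, using made-up JSON records and field names, shows the difference: the lake keeps every record as-is and interprets it only at query time, while the warehouse admits only records that fit its fixed schema.

```python
import json

# Toy contrast between a data lake (store raw, apply schema on read) and a
# data warehouse (apply schema on write). Records and fields are invented.

raw_events = [                       # the "lake": everything lands as-is
    '{"user": "a1", "action": "login", "ts": 1}',
    '{"user": "b2", "action": "purchase", "amount": 9.99, "ts": 2}',
    '{"sensor": "t-04", "reading": 21.5}',   # different shape, still kept
]

def lake_query(records, wanted_action):
    """Schema on read: interpret each raw record only when it is used."""
    results = []
    for record in records:
        data = json.loads(record)
        if data.get("action") == wanted_action:
            results.append(data)
    return results

def warehouse_load(records):
    """Schema on write: keep only records matching the fixed schema."""
    table = []
    for record in records:
        data = json.loads(record)
        if {"user", "action", "ts"} <= data.keys():
            table.append((data["user"], data["action"], data["ts"]))
    return table
```

Note that the sensor reading survives in the lake for some future analysis, but never makes it into the warehouse table at all.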
Since a data lake lacks structure, it’s relatively easy to make changes to models and queries.
Data scientists are typically the ones who access the data in data lakes because they have the skill set to do deep analysis.
Since data warehouses are more mature than data lakes, the security for data warehouses is also more mature.
Between the “dumb” fixed algorithms and true AI lies the problematic halfway house we’ve already entered with scarcely a thought and almost no debate, much less agreement as to aims, ethics, safety, best practice. If the algorithms around us are not yet intelligent, meaning able to independently say “that calculation/course of action doesn’t look right: I’ll do it again”, they are nonetheless starting to learn from their environments. And once an algorithm is learning, we no longer know to any degree of certainty what its rules and parameters are. At which point we can’t be certain of how it will interact with other algorithms, the physical world, or us. Where the “dumb” fixed algorithms – complex, opaque and inured to real time monitoring as they can be – are in principle predictable and interrogable, these ones are not. After a time in the wild, we no longer know what they are: they have the potential to become erratic. We might be tempted to call these “frankenalgos” – though Mary Shelley couldn’t have made this up.
Twenty years ago, George Dyson anticipated much of what is happening today in his classic book Darwin Among the Machines. The problem, he tells me, is that we’re building systems that are beyond our intellectual means to control. We believe that if a system is deterministic (acting according to fixed rules, this being the definition of an algorithm) it is predictable – and that what is predictable can be controlled. Both assumptions turn out to be wrong.

“It’s proceeding on its own, in little bits and pieces,” he says. “What I was obsessed with 20 years ago that has completely taken over the world today are multicellular, metazoan digital organisms, the same way we see in biology, where you have all these pieces of code running on people’s iPhones, and collectively it acts like one multicellular organism.

“There’s this old law called Ashby’s law that says a control system has to be as complex as the system it’s controlling, and we’re running into that at full speed now, with this huge push to build self-driving cars where the software has to have a complete model of everything, and almost by definition we’re not going to understand it. Because any model that we understand is gonna do the thing like run into a fire truck ’cause we forgot to put in the fire truck.”
Walsh believes this makes it more, not less, important that the public learn about programming, because the more alienated we become from it, the more it seems like magic beyond our ability to affect. When shown the definition of “algorithm” given earlier in this piece, he found it incomplete, commenting: “I would suggest the problem is that algorithm now means any large, complex decision making software system and the larger environment in which it is embedded, which makes them even more unpredictable.” A chilling thought indeed. Accordingly, he believes ethics to be the new frontier in tech, foreseeing “a golden age for philosophy” – a view with which Eugene Spafford of Purdue University, a cybersecurity expert, concurs. Where there are choices to be made, that’s where ethics comes in.
Our existing system of tort law, which requires proof of intention or negligence, will need to be rethought. A dog is not held legally responsible for biting you; its owner might be, but only if the dog’s action is thought foreseeable.
As we wait for a technological answer to the problem of soaring algorithmic entanglement, there are precautions we can take. Paul Wilmott, a British expert in quantitative analysis and vocal critic of high frequency trading on the stock market, wryly suggests “learning to shoot, make jam and knit”.
The venerable Association for Computing Machinery has updated its code of ethics along the lines of medicine’s Hippocratic oath, to instruct computing professionals to do no harm and consider the wider impacts of their work.
Privacy researcher Lukasz Olejnik disagreed, noting that the change carried large ramifications for the affected users. “Moving around one and a half billion users into other jurisdictions is not a simple copy-and-paste exercise,” he said.
Facebook To Offer Users Opt-Outs That Comply With New European Privacy Rules
As we reported earlier this week, a federal judge in California ruled that Facebook could be sued in a class-action lawsuit brought by users in Illinois who say the social media company improperly used facial recognition on uploaded photographs.
THERE ARE HACKABLE security flaws in software. And then there are those that don’t even require hacking at all—just a knock on the door, and asking to be let in. Apple’s macOS High Sierra has the second kind.
Kaspersky Lab advised those who do not use anti-virus products to restrict execution of certain files (C:\Windows\infpub.dat, C:\Windows\cscc.dat) and shut down the Windows Management Instrumentation (WMI) service. (My note: like letting the wolf into the shed with the sheep.)
The source of the attack remained undetermined, but earlier this month the head of Microsoft, Brad Smith, pinned the blame for it on North Korea, which allegedly used cyber tools or weapons that were stolen from the National Security Agency in the United States. The top executive, however, did not provide evidence to back his claims.
New ransomware attack hits Russia and spreads around globe
Based on my experience in Tallinn, we will see companies become more transparent in how they deal with cyber attacks. After a massive cyber attack in 2007, for example, the Estonian government reacted in the right way.
3. Use parental controls. Check the safety controls on all of the Android and Apple devices that your family uses. On the iPhone, you can tap Settings > General > Restrictions and create a password that allows you to enable or disable apps and phone functions. On Android devices, you can turn on Google Play Parental Controls by going into the Google Play Store settings.
4. Friend and follow your children on social media. Whether it’s musical.ly, Instagram or Twitter, chances are that your children use some form of social media. If you have not already, then create an account and get on their friends list.
5. Explore, share and celebrate.
6. Be a good digital role model.
7. Set ground rules and apply sanctions. Just like chore charts or family job lists, consider using a family social media or internet safety contract. These contracts establish ground rules for when devices may be used, what children should and should not be doing on them, and what sanctions apply for breaches of the family contract. A simple internet search for “family internet contract” or “family technology contract” will produce a wealth of available ideas and resources to help you implement rules and sanctions revolving around your family’s technology use. A good example of a social media contract for children can be found at imom.com/printable/social-media-contract-for-kids/.
Managing Your Digital Footprint
Your digital footprint, according to dictionary.com, is “one’s unique set of digital activities, actions, and communications that leave a data trace on the internet or on a computer or other digital device and can identify the particular user or device.” Digital footprints can be either passive or active. The passive digital footprint is created without your consent and is driven by the sites and apps that you visit. The data from a passive digital footprint can reveal your internet history, IP address and location, and is stored in files on your device without your knowledge. An active digital footprint is more easily managed by the user. Data from an active digital footprint includes social media postings, information sharing, online purchases and activity usage.
Search for yourself online.
Check privacy settings.
Use strong passwords.
Maintain your device.
Think before you post.
Keep These Apps on Your Radar
Afterschool (minimum age 17) – The Afterschool App was rejected twice from the major app stores due to complaints from parents and educators. It is a well-known app that promotes cyberbullying, sexting and pornography, and is filled with references to drugs and alcohol.
Blue Whale (minimum age 10) – IF YOU FIND THIS APP ON YOUR CHILD’S DEVICE, DELETE IT. It is a suicide challenge app that attempts to prod children into killing themselves.
BurnBook (minimum age 18) – IF YOU FIND THIS APP ON YOUR CHILD’S DEVICE, DELETE IT. It is a completely anonymous app for posting text, photos, and audio that spread rumors about other people. It is notorious for cyberbullying.
Calculator% (minimum age 4) – IF YOU FIND THIS APP ON YOUR CHILD’S DEVICE, DELETE IT. This is one of hundreds of “secret” calculator apps. This app is designed to help students hide photos and videos that they do not want their parents to see. This app looks and functions like a calculator, but to open the hidden vault, students enter a “.”, a 4-digit passcode, and then a “.” again.
KIK (minimum age 17) – This is a communications app that allows anyone to be contacted by anyone, completely bypassing the device’s contacts list.
Yik Yak (minimum age 18) – This app is a location-based bulletin board (most commonly used around schools). Posts are anonymous, so anyone can pretend to be anyone they want. Many schools across the country have encountered cyberbullying and cyberthreats originating from this app.
StreetChat (minimum age 14) – StreetChat is a photo-sharing board for middle school, high school and college-age students. Members do not need to be a student in the actual school and can impersonate students in schools across the country. It promotes cyberbullying through anonymous posts and private messaging.
ooVoo (minimum age 13) – IF YOU FIND THIS APP ON YOUR CHILD’S DEVICE, DELETE IT. ooVoo is one of the largest video and messaging apps. Parents should be aware that ooVoo has been used by predators to contact underage children. The app allows users to video chat with up to twelve people at one time.
Wishbone (girls) & Slingshot (boys) (minimum age 13) – Both are comparison apps that allow users to create polls, including ones that are not appropriate for children. Many of the users create polls to shame and cyberbully other children, plus there are inappropriate ads and videos that users are forced to watch via the app’s advertising engine.
Texas Teen May Be Victim in ‘Blue Whale Challenge’ That Encourages Suicide
Isaiah Gonzalez, 15, found hanging from his closet after an apparent suicide, as allegedly instructed by macabre online game
Nationally, the Associated Press reports that educators, law enforcement officers and parents have raised concerns about the challenge, though these two back-to-back deaths mark the first allegations in the United States about deaths directly linked to the online game. Internationally, suicides in Russia, Brazil, and half a dozen other countries have already been linked to the challenge.