In 2018 we witnessed a clash of titans as government and tech companies collided on privacy issues around collecting, culling and using personal data. From GDPR to Facebook scandals, many tech CEOs were defending big data, its use, and how they’re safeguarding the public.
1. Companies will face increased pressure about the data AI-embedded services use.
2. Public concern will lead to AI regulations. But we must understand this tech too.
In 2018, the National Science Foundation invested $100 million in AI research, with special support in 2019 for developing principles for safe, robust and trustworthy AI; addressing issues of bias, fairness and transparency of algorithmic intelligence; developing deeper understanding of human-AI interaction and user education; and developing insights about the influences of AI on people and society.
This investment was dwarfed by DARPA—an agency of the Department of Defense—and its multi-year investment of more than $2 billion in new and existing programs under the “AI Next” campaign. A key area of the campaign includes pioneering the next generation of AI algorithms and applications, such as “explainability” and common sense reasoning.
Roskomnadzor has also exerted pressure on Google to remove certain sites on Russian searches.
Director of National Intelligence Dan Coats told Congress last month that Russia, as well as other foreign actors, will increasingly use cyber operations to “threaten both minds and machines in an expanding number of ways—to steal information, to influence our citizens, or to disrupt critical infrastructure.”
Interview with Christopher Loss, one of the editors.
What role does technology play in some of the convergences that occur or are happening?
There’s a great essay in the collection by June Ahn, which deals with the idea of technology as a key mediating source and mechanism for the creation of various kinds of convergences between and among different sectors (my note: K12 and higher ed).
Americans like to see themselves as among the best in the world in education. But lately, I think, education leaders have been looking abroad for ideas. What can we learn from countries that do have closer links between K-12 and higher ed?
Under the Dewey Decimal System that revolutionized and standardized book shelving starting in 1876, nonfiction essentially already gets the genrefication treatment with, for example, Music located in the 780s and Paleontology in the 560s. Yet most fiction is shelved in one big clump alphabetized by author’s last name.
Many librarians say the “search hurdle” imposed by Dewey classification (a system originally designed for adults) significantly reduces the odds of a child finding something new they’re likely to enjoy. In a genrefied library, on the other hand, a young reader standing near a favorite book need only stick out a hand to find more like it. (It’s a bit like the analog version of Amazon’s recommendation feature: “Customers who bought this item also bought.”)
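The “customers who bought this item also bought” feature can be sketched as simple item co-occurrence counting: for a given title, count which other titles most often appear in the same checkout histories. The titles and checkout data below are hypothetical, purely for illustration.

```python
from collections import Counter

# Hypothetical checkout histories: each set is one reader's borrowed books.
checkouts = [
    {"Dog Man", "Captain Underpants", "Wimpy Kid"},
    {"Dog Man", "Captain Underpants"},
    {"Dog Man", "Smile"},
]

def also_borrowed(title, histories, n=2):
    # Count how often other titles appear alongside the given one.
    counts = Counter()
    for history in histories:
        if title in history:
            counts.update(history - {title})
    return [t for t, _ in counts.most_common(n)]

print(also_borrowed("Dog Man", checkouts))
# "Captain Underpants" co-occurs twice, so it ranks first.
```

A genrefied shelf achieves roughly the same effect physically: books that tend to be enjoyed together sit next to each other.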
The Dewey-loyal also oppose genrefication in principle for, interestingly enough, the same reason others support it: self-sufficiency. Sure, they argue, kids might be better able to find a book independently in their school library, but what happens when they go to the public one? When they get to high school?
The debate has led to compromise positions. Some leave books for older students in the Dewey arrangement while genrefying for younger ones. Other librarians rearrange middle readers and young adult books but leave picture books shelved by author since it can be unclear how to categorize a story about a duck driving a tractor.
Remember that a blockchain is an immutable, sequential chain of records called Blocks. They can contain transactions, files or any data you like, really. But the important thing is that they’re chained together using hashes.
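The chaining idea can be shown in a few lines: each block records the hash of its predecessor, so altering any earlier block invalidates every block after it. This is a minimal sketch (plain dictionaries, SHA-256 over canonical JSON), not any particular blockchain’s format.

```python
import hashlib
import json

def hash_block(block):
    # Hash the block's canonical JSON representation.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def new_block(data, prev_block=None):
    # Each block stores the hash of its predecessor, chaining them together.
    return {
        "data": data,
        "prev_hash": hash_block(prev_block) if prev_block else "0" * 64,
    }

genesis = new_block({"tx": "alice->bob:5"})
second = new_block({"tx": "bob->carol:2"}, genesis)

# The chain verifies while the data is intact...
assert second["prev_hash"] == hash_block(genesis)

# ...but tampering with the first block breaks the link.
genesis["data"]["tx"] = "alice->bob:500"
assert second["prev_hash"] != hash_block(genesis)
```

Real blockchains add timestamps, Merkle trees, and consensus rules on top, but the hash link is what makes the record effectively immutable.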
Twenty years have passed since renowned Harvard Professor Larry Lessig coined the phrase “Code is Law”, suggesting that in the digital age, computer code regulates behavior much like legislative code traditionally did. These days, the computer code that powers artificial intelligence (AI) is a salient example of Lessig’s statement.
Good AI requires sound data. One of the principles, some would say the organizing principle, of privacy and data protection frameworks is data minimization. Data protection laws require organizations to limit data collection to the extent strictly necessary and retain data only so long as it is needed for its stated goal.
Preventing discrimination – intentional or not.
When is a distinction between groups permissible or even merited and when is it untoward? How should organizations address historically entrenched inequalities that are embedded in data? New mathematical theories such as “fairness through awareness” enable sophisticated modeling to guarantee statistical parity between groups.
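Statistical parity, the notion behind “fairness through awareness,” can be checked with a simple measurement: compare the rate of favorable outcomes across groups. The function and toy data below are a hedged sketch of that measurement, not an implementation of the full fairness-through-awareness framework (which constrains the model itself, not just its outputs).

```python
def statistical_parity_difference(outcomes, groups, group_a, group_b):
    # Difference in favorable-outcome rates between two groups.
    # A value near 0 suggests statistical parity; the sign shows
    # which group is favored.
    def rate(g):
        selected = [o for o, grp in zip(outcomes, groups) if grp == g]
        return sum(selected) / len(selected)
    return rate(group_a) - rate(group_b)

# Toy example: 1 = favorable decision, 0 = unfavorable.
outcomes = [1, 0, 1, 1, 0, 1, 0, 0]
groups   = ["a", "a", "a", "a", "b", "b", "b", "b"]

print(statistical_parity_difference(outcomes, groups, "a", "b"))  # 0.75 - 0.25 = 0.5
```

A gap this large would flag the decision process for review; what counts as an acceptable gap, and whether parity is even the right criterion, remains a policy question rather than a mathematical one.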
Assuring explainability – technological due process. In privacy and freedom of information frameworks alike, transparency has traditionally been a bulwark against unfairness and discrimination. As Justice Brandeis once wrote, “Sunlight is the best of disinfectants.”
Deep learning means that iterative computer programs derive conclusions for reasons that may not be evident even after forensic inquiry.
Yet even with code as law and a rising need for law in code, policymakers do not need to become mathematicians, engineers and coders. Instead, institutions must develop and enhance their technical toolbox by hiring experts and consulting with top academics, industry researchers and civil society voices. Responsible AI requires access to not only lawyers, ethicists and philosophers but also to technical leaders and subject matter experts to ensure an appropriate balance between economic and scientific benefits to society on the one hand and individual rights and freedoms on the other.
For abstainers, breaking up with Facebook freed up about an hour a day, on average, and more than twice that for the heaviest users.
Research led by Ethan Kross, a professor of psychology at the University of Michigan, has found that high levels of passive browsing on social media predict lowered moods, compared to more active engagement.
TAILS is an acronym for “The Amnesic Incognito Live System.”
TAILS is a highly secure operating system (and a host of cool applications) designed to be booted from a DVD or USB thumb drive. This not only makes TAILS easy to transport, but also ensures that TAILS can be booted and instantly useful from nearly any PC, Mac, or Chromebook. TAILS is built on Linux, a name you might recognize because it’s a popular, free, and open-source operating system that’s been available since 1991. TAILS, in particular, runs on a variant of Linux known as “Debian,” which became available in 1996.
Third and most importantly, when set up correctly, TAILS helps ensure that all of your communications — email, web browsing, chat, and more — are encrypted, made anonymous, and then routed in such a way that it’s extremely difficult to detect or trace them.
Whonix’s unique approach to offering such well-regarded security is the creative use of two virtual machines (or VMs) running in tandem on one host computer. One of these VMs is known as the Gateway while the other is known as the Workstation.
Compared to TAILS, Whonix provides only a few free, open-source applications, and those need fairly extensive setup. The list includes:
The Tor Browser, for safe web browsing
Firefox, for less secure web browsing
Icedove, for emailing, secured by the Enigmail extension to encrypt and authenticate emails using a well-known and secure protocol called “OpenPGP”
HexChat, for internet chats
VLC, to open and view virtually any video or media file
Whether the NYC police angle is true or not (it’s being hotly disputed), Facebook and Google are thinking along lines that follow the whims of the Chinese government.
SenseTime and Megvii won’t just be worth $5 billion; they will be worth many times that in the future. This is because pervasive facial-recognition data harvesting is the future of consumerism and capitalism, and in some places (think Asia), a central tenet of social order.
China has already ‘won’ the trade war because it’s winning the race to innovation. America doesn’t properly regulate Amazon, Microsoft, Google or Facebook; that stunts both innovation and ethics in technology, and the West is now forced to copy China just to keep up.