Dollar Photo for stock images. They are closing down as of April 15th. Does anyone use another vendor that is comparable? We loved the fact that we could prepay for credits ($1 per image) rather than pay for a monthly subscription.
Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares. Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.
We are a nation with a tradition of reining in monopolies, no matter how well intentioned the leaders of these companies may be. Mark’s power is unprecedented and un-American.
It is time to break up Facebook.
America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances.
More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable.
Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective.
From our earliest days, Mark used the word “domination” to describe our ambitions, with no hint of irony or humility.
Facebook’s monopoly is also visible in its usage statistics. About 70 percent of American adults use social media, and a vast majority are on Facebook products. Over two-thirds use the core site, a third use Instagram, and a fifth use WhatsApp. By contrast, fewer than a third report using Pinterest, LinkedIn or Snapchat. What started out as lighthearted entertainment has become the primary way that people of all ages communicate online.
The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved.
The News Feed algorithm reportedly prioritized videos created through Facebook over videos from competitors, like YouTube and Vimeo. In 2013, Twitter introduced a video network called Vine that featured six-second videos. That same day, Facebook blocked Vine from hosting a tool that let its users search for their Facebook friends while on the new network. The decision hobbled Vine, which shut down four years later.
Unlike Vine, Snapchat wasn’t interfacing with the Facebook ecosystem; there was no obvious way to handicap the company or shut it out. So Facebook simply copied it. (Copyright law does not extend to the abstract concept itself.)
As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon). Meanwhile, there has been plenty of innovation in areas where there is no monopolistic domination, such as in workplace productivity (Slack, Trello, Asana), urban transportation (Lyft, Uber, Lime, Bird) and cryptocurrency exchanges (Ripple, Coinbase, Circle).
The choice is mine, but it doesn’t feel like a choice. Facebook seeps into every corner of our lives to capture as much of our attention and data as possible and, without any alternative, we make the trade.
Just last month, Facebook seemingly tried to bury news that it had stored tens of millions of user passwords in plain text format, which thousands of Facebook employees could see. Competition alone wouldn’t necessarily spur privacy protection — regulation is required to ensure accountability — but Facebook’s lock on the market guarantees that users can’t protest by moving to alternative platforms.
Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished. Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values. The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection.
As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns.
Mark may never have a boss, but he needs to have some check on his power. The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.
We Don’t Need Social Media
The push to regulate or break up Facebook ignores the fact that its services do more harm than good
Hughes joins a growing chorus of former Silicon Valley unicorn riders who’ve recently had second thoughts about the utility or benefit of the surveillance-attention economy their products and platforms have helped create. He is also not the first to suggest that government might need to step in to clean up the mess they made.
Nick Srnicek, author of the book Platform Capitalism and a lecturer in digital economy at King’s College London, wrote last month, “[I]t’s competition — not size — that demands more data, more attention, more engagement and more profits at all costs.”
This guide (available as a PDF here and a Google Doc here) offers some explanations of how to avoid copyright infringement by using media that you can legally re-use for classroom projects, including blog posts, web pages, videos, slideshows, and podcasts. The guide also includes 21 places to find media to use in classroom projects.
Evelyn Berezin, a computer scientist who designed the world’s first word processor, has died at the age of 93.
As she explained in an oral history interview, she was having trouble finding work in the physics field, so she started asking about computers — having barely even heard of them.
It wasn’t easy being a woman in the industry. In 1960, Berezin says she was offered a job at the New York Stock Exchange, as a vice president managing the computer system that handled their communications. But then the offer was retracted by the board of directors.
In 2006, Berezin was inducted into the Long Island Technology Hall of Fame, and she joined the Women In Technology Hall of Fame in 2011. In 2015, she became a fellow at the Computer History Museum.
One of the issues dividing the two main parliamentary blocs is whether there should be a cap on profit margins for publicly funded private schools.
The Swedish school system has received considerable international attention in recent years due to alarming test scores in the OECD’s international PISA study.
Segregation is one of the most serious social problems facing Sweden and many other wealthy nations.
A recent report (the English title would be “A Nation Divided – School Choice and Segregation in Sweden”) that I have co-authored for the Stockholm-based think tank Arena Idé shows that well-educated and Swedish-born families increasingly opt out of schools where the children have parents with lower educational attainment and an immigrant background. We also show that this “white flight” in municipalities throughout Sweden has been amplified by school choice and other reforms introduced in the early 1990s, whereby publicly financed private schools are allowed to compete with municipal schools for school vouchers allotted to each individual student.
The results in our study should be viewed in light of two recent reports: one from the OECD and another from UNICEF, both highlighting the inequality in the Swedish school system.
The results make it painfully clear that the Swedish school system effectively works against the very idea that schools should level the playing field for students from all backgrounds and give every child equal opportunity. Even after the rise of right-wing populism in Sweden, our established political parties have proven themselves unable, or unwilling, to rein in the highly unregulated Swedish school market.
Sejnowski, T. J. (2018). The Deep Learning Revolution. Cambridge, MA: The MIT Press.
How deep learning, from Google Translate to driverless cars to personal cognitive assistants, is changing our lives and transforming every sector of the economy.
The deep learning revolution has brought us driverless cars, the greatly improved Google Translate, fluent conversations with Siri and Alexa, and enormous profits from automated trading on the New York Stock Exchange. Deep learning networks can play poker better than professional poker players and defeat a world champion at Go. In this book, Terry Sejnowski explains how deep learning went from being an arcane academic field to a disruptive technology in the information economy.
Sejnowski played an important role in the founding of deep learning, as one of a small group of researchers in the 1980s who challenged the prevailing logic-and-symbol based version of AI. The new version of AI Sejnowski and others developed, which became deep learning, is fueled instead by data. Deep networks learn from data in the same way that babies experience the world, starting with fresh eyes and gradually acquiring the skills needed to navigate novel environments. Learning algorithms extract information from raw data; information can be used to create knowledge; knowledge underlies understanding; understanding leads to wisdom. Someday a driverless car will know the road better than you do and drive with more skill; a deep learning network will diagnose your illness; a personal cognitive assistant will augment your puny human brain. It took nature many millions of years to evolve human intelligence; AI is on a trajectory measured in decades. Sejnowski prepares us for a deep learning future.
Buzzwords like “deep learning” and “neural networks” are everywhere, but so much of the popular understanding is misguided, says Terrence Sejnowski, a computational neuroscientist at the Salk Institute for Biological Studies.
Sejnowski, a pioneer in the study of learning algorithms, is the author of The Deep Learning Revolution (out next week from MIT Press). He argues that the hype about killer AI or robots making us obsolete ignores exciting possibilities happening in the fields of computer science and neuroscience, and what can happen when artificial intelligence meets human intelligence.
Machine learning is a very large field and goes way back. Originally, people were calling it “pattern recognition,” but the algorithms became much broader and much more sophisticated mathematically. Within machine learning are neural networks inspired by the brain, and then deep learning. Deep learning algorithms have a particular architecture with many layers through which data flows. So basically, deep learning is one part of machine learning, and machine learning is one part of AI.
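That layered picture can be sketched in a few lines of plain NumPy. This is a generic illustration, not anything from the interview or book: the layer sizes and random weights are arbitrary, and the network is untrained — it only shows data flowing through a stack of layers.

```python
import numpy as np

def relu(x):
    # Simple nonlinearity applied between layers
    return np.maximum(0.0, x)

def forward(x, layers):
    """Pass input x through a stack of (weight, bias) layers in order."""
    for w, b in layers:
        x = relu(w @ x + b)
    return x

rng = np.random.default_rng(0)
# An illustrative stack: input of size 4, two hidden layers of 8, output of 2
sizes = [4, 8, 8, 2]
layers = [(rng.standard_normal((m, n)) * 0.1, np.zeros(m))
          for n, m in zip(sizes, sizes[1:])]

out = forward(rng.standard_normal(4), layers)
print(out.shape)  # (2,)
```

A “deeper” network is just a longer `sizes` list; training would adjust the weights, which this sketch leaves random.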
In December 2012, at the NIPS meeting, which is the biggest AI conference, [computer scientist] Geoff Hinton and two of his graduate students showed you could take a very large dataset called ImageNet, with 10,000 categories and 10 million images, and reduce the classification error by 20 percent using deep learning. Traditionally on that dataset, error decreases by less than 1 percent in one year. In one year, 20 years of research was bypassed. That really opened the floodgates.
The inspiration for deep learning really comes from neuroscience.
AlphaGo, the program that beat the Go champion, included not just a model of the cortex but also a model of a part of the brain called the basal ganglia, which is important for making a sequence of decisions to meet a goal. There’s an algorithm there called temporal differences, developed back in the ’80s by Richard Sutton, that, when coupled with deep learning, is capable of very sophisticated plays that no human has ever seen before.
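Sutton’s temporal-difference idea can be shown with a minimal TD(0) value-learning loop. This is a textbook-style sketch on an invented toy problem (a five-state random walk), not AlphaGo’s implementation; the learning rate and episode count are arbitrary choices for illustration.

```python
import random

# Toy chain of 5 states with terminals at each end: start in the middle,
# step left or right at random; reaching the right end pays reward 1.
N_STATES = 5
ALPHA, GAMMA = 0.1, 1.0

V = [0.0] * (N_STATES + 2)  # value estimates, terminal states included

random.seed(0)
for _ in range(2000):
    s = N_STATES // 2 + 1  # start in the middle of the chain
    while 0 < s < N_STATES + 1:
        s2 = s + random.choice((-1, 1))
        r = 1.0 if s2 == N_STATES + 1 else 0.0
        # TD(0): nudge V(s) toward the bootstrapped target r + gamma * V(s')
        V[s] += ALPHA * (r + GAMMA * V[s2] - V[s])
        s = s2

print([round(v, 2) for v in V[1:-1]])
```

The learned values should increase from left to right, approximating each state’s probability of eventually reaching the rewarded end — the same “learn from the difference between successive predictions” principle that, scaled up with deep networks, drives systems like AlphaGo.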
There’s a convergence occurring between AI and human intelligence. As we learn more and more about how the brain works, that’s going to reflect back in AI. But at the same time, AI researchers are creating a whole theory of learning that can be applied to understanding the brain, allowing us to analyze thousands of neurons and how their activities emerge. So there’s this feedback loop between neuroscience and AI.
The Library of Congress launched the National Screening Room, which currently offers about 300 videos. The videos are digital copies of films made in the 19th and 20th centuries. You can browse the collection by date, location of filming, and subject. You can also search for videos that are parts of other LOC collections. All of the videos in the National Screening Room can be viewed online or downloaded as MP4 files.
While employers increasingly demand that new hires have college degrees, the transcripts supporting those hard-earned credentials are no longer the most informative tool students have to exhibit their skills.
An estimated 1 in 5 institutions issue digital badges, which can be posted to social media, stored on digital portfolios and displayed by other specially designed platforms. When clicked on, the badge lists a range of skills a student has demonstrated beyond grades.
“The reason they’re taking off in higher education is most employers are not getting the information they need about people emerging from higher ed with the previous tools we’ve been using,” says Jonathan Finkelstein, founder and CEO of the widely used badging platform Credly. “The degree itself doesn’t get to the level of describing particular competencies.”
For instance, a Notre Dame student who goes on a trip to Ecuador to build bridges can earn a badge for mastering the calculations involved in the construction, says G. Alex Ambrose, associate program director of e-portfolio assessment at the Indiana university’s Kaneb Center for Teaching & Learning.
Students can be pretty certain when they have passed calculus or creative writing, but they don’t always recognize when they’ve excelled in demonstrating soft skills such as critical thinking, communication and work ethic, says MJ Bishop, director of the University System of Maryland’s William E. Kirwan Center for Academic Innovation.
Badges have been most popular in the school of education—including with student teachers who, in turn, have created badges for the elementary and secondary classrooms where they’ve apprenticed, says Anna Catterson, the university’s educational technology director.
The campus library is another badging hotspot. Students there have earned microcredentials for research, 3D printing and other skills. These badges are being shared on LinkedIn and other platforms to obtain internships and scholarships.
The university runs faculty training sessions on badging and has established a review process for when faculty submit ideas for microcredentials.
One pothole to avoid is trying to create a schoolwide badge that’s standardized across a wide range of courses or majors. This can force the involvement of committees that can bog down the process, so it’s better to start with skills within single courses, says Ambrose at Notre Dame.
When creating a badge, system faculty have to identify a business or industry interested in that credential.
Badges that have the backing of a college or university are more impressive to job recruiters than are completion certificates from skill-building websites like Lynda.com.
Students won’t be motivated to earn a badge that’s a stock blue ribbon downloaded off the internet. Many institutions put a lot of work into the design, and this can include harnessing expertise from the marketing department and graphic designers.
Between the “dumb” fixed algorithms and true AI lies the problematic halfway house we’ve already entered with scarcely a thought and almost no debate, much less agreement, as to aims, ethics, safety and best practice. If the algorithms around us are not yet intelligent, meaning able to independently say “that calculation/course of action doesn’t look right: I’ll do it again”, they are nonetheless starting to learn from their environments. And once an algorithm is learning, we no longer know to any degree of certainty what its rules and parameters are. At which point we can’t be certain of how it will interact with other algorithms, the physical world, or us. Where the “dumb” fixed algorithms – complex, opaque and inured to real-time monitoring as they can be – are in principle predictable and interrogable, these ones are not. After a time in the wild, we no longer know what they are: they have the potential to become erratic. We might be tempted to call these “frankenalgos” – though Mary Shelley couldn’t have made this up.
Twenty years ago, George Dyson anticipated much of what is happening today in his classic book Darwin Among the Machines. The problem, he tells me, is that we’re building systems that are beyond our intellectual means to control. We believe that if a system is deterministic (acting according to fixed rules, this being the definition of an algorithm) it is predictable – and that what is predictable can be controlled. Both assumptions turn out to be wrong.

“It’s proceeding on its own, in little bits and pieces,” he says. “What I was obsessed with 20 years ago that has completely taken over the world today are multicellular, metazoan digital organisms, the same way we see in biology, where you have all these pieces of code running on people’s iPhones, and collectively it acts like one multicellular organism.

“There’s this old law called Ashby’s law that says a control system has to be as complex as the system it’s controlling, and we’re running into that at full speed now, with this huge push to build self-driving cars where the software has to have a complete model of everything, and almost by definition we’re not going to understand it. Because any model that we understand is gonna do the thing like run into a fire truck ’cause we forgot to put in the fire truck.”
Walsh believes this makes it more, not less, important that the public learn about programming, because the more alienated we become from it, the more it seems like magic beyond our ability to affect. When shown the definition of “algorithm” given earlier in this piece, he found it incomplete, commenting: “I would suggest the problem is that algorithm now means any large, complex decision-making software system and the larger environment in which it is embedded, which makes them even more unpredictable.” A chilling thought indeed. Accordingly, he believes ethics to be the new frontier in tech, foreseeing “a golden age for philosophy” – a view with which Eugene Spafford of Purdue University, a cybersecurity expert, concurs. Where there are choices to be made, that’s where ethics comes in.
our existing system of tort law, which requires proof of intention or negligence, will need to be rethought. A dog is not held legally responsible for biting you; its owner might be, but only if the dog’s action is thought foreseeable.
As we wait for a technological answer to the problem of soaring algorithmic entanglement, there are precautions we can take. Paul Wilmott, a British expert in quantitative analysis and vocal critic of high frequency trading on the stock market, wryly suggests “learning to shoot, make jam and knit.”
The venerable Association for Computing Machinery has updated its code of ethics along the lines of medicine’s Hippocratic oath, to instruct computing professionals to do no harm and consider the wider impacts of their work.