It's not just you. Your rude coworker is actually ruining the company. Watch the full TED Talk here: http://t.ted.com/iUNgsJE
Posted by TED on Wednesday, January 2, 2019
Encrypted chat app Telegram reverses stance, bans 78 ISIS accounts
Telegram is an encrypted chat service that lets users create anonymous channels that can be followed by hundreds of users.
In addition to Telegram, Twitter and YouTube have also removed ISIS-affiliated content, with hacker organization Anonymous having taken down more than 6,000 Twitter accounts following the Paris attacks.
Meanwhile, Telegram said it only takes steps against confirmed ISIS channels. “For example, if criticizing the government is illegal in a country, Telegram won’t be a part of such politically motivated censorship,” the company said. “While we do block terrorist (e.g. ISIS-related) bots and channels, we will not block anybody who peacefully expresses alternative opinions.”
More on this topic in this IMS blog:
Mobbing in the library workplace: What it is and how to prevent it
Facebook Has Been Profiting From Boogaloo Ads Promoting Civil War And Unrest
Derived from the name of a 1984 movie, the term “Boogaloo” covers a range of extremists, including some believed to be violent.
more on Facebook in this IMS blog
more on civil disobedience in this IMS blog
QAnon believers are in a civil war over a shadowy figure calling himself “Santa Claus.” His fans call themselves “elves,” and will do anything to get on his “Nice List.” https://t.co/NOMk5YSYRj
— Will Sommer (@willsommer) August 30, 2019
Nearly 200 colleges face federal civil rights investigations, opened in 2019, into whether they are accessible to and communicate effectively with people with disabilities.
As a result, colleges are rolling out social media accessibility standards and training communications staff members to take advantage of built-in accessibility tools in platforms including YouTube, Facebook and Twitter.
For example, last fall, a blind man filed 50 lawsuits against colleges whose websites he said didn’t work with his screen reader. And on August 21, in Payan v. Los Angeles Community College District, the Federal District Court for the Central District of California ruled that Los Angeles Community College failed to provide a blind student with “meaningful access to his course materials” via MyMathLab, software developed by Pearson, in a timely manner.
YouTube and Facebook have options to automatically generate captions on videos posted there, while Twitter users with access to its still-developing Media Studio can upload videos with captions. Users can provide alt text, descriptive text that screen readers announce in place of an image, through Facebook, Twitter, Instagram and Hootsuite.
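A quick way to see what the alt-text requirement means in practice is to audit markup for images that lack it. The minimal sketch below uses only Python's standard library; the HTML snippet is illustrative, not taken from any of the platforms mentioned above.

```python
# Sketch of an alt-text audit: scan HTML for <img> tags whose alt
# attribute is missing or empty, the kind of check an accessibility
# review of a web page might start with. Stdlib only.
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects the src of every <img> tag lacking non-empty alt text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)  # attrs arrives as (name, value) pairs
            if not attr_map.get("alt"):
                self.missing_alt.append(attr_map.get("src", "(no src)"))

# Illustrative markup: one image with alt text, one without
snippet = (
    '<img src="campus.jpg" alt="Students walking on campus">'
    '<img src="banner.png">'
)
auditor = AltTextAuditor()
auditor.feed(snippet)
print(auditor.missing_alt)  # images flagged for missing alt text
```

A screen reader has nothing to announce for the flagged image, which is exactly the gap the social media guidelines above are trying to close.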
California State University at Long Beach, for instance, advises posting main information first and hashtags last to make messages clear for people using screen readers. The University of Minnesota calls for indicating whether hyperlinks point to [AUDIO], [PIC], or [VIDEO]. This summer, leaders at the College of William & Mary held a training workshop for the institution’s communications staff in response to an Office for Civil Rights investigation.
an online website accessibility center.
more on social media in education in this IMS blog
New York’s Lockport City School District is using public funds from a Smart Schools bond to help pay for a reported $3.8 million security system that uses facial recognition technology to identify individuals who don’t belong on campus.
The Lockport case has drawn the attention of national media, the ire of many parents, and criticism from the New York Civil Liberties Union, among other privacy groups.
The Future of Privacy Forum (FPF), a nonprofit think tank based in Washington, D.C., published an animated video that illustrates the possible harm that surveillance technology can cause to children and the steps schools should take before making any decisions, such as identifying specific goals for the technology and establishing who will have access to the data and for how long.
A few days later, the nonprofit Center for Democracy and Technology, in partnership with New York University’s Brennan Center for Justice, released a brief examining the same topic.
My note: the same considerations were relayed to the SCSU SOE dean in regard to the purchase of Promethean boards and their installation in the SOE building without discussion with the faculty who work with technology. This information was also shared with the dean: https://blog.stcloudstate.edu/ims/2018/10/31/students-data-privacy/
more on surveillance in education in this IMS blog
To RSVP ahead of time, or to jump straight in at 2 pm EST this Thursday, click here:
On Thursday, February 21st, from 2-3 pm EST, we’ll be joined by Marc Prensky, creator of “Civilization-level Alternative Education.”
Coiner of the term “Digital Native” and author of seven books and over 100 essays, Marc has spoken in over 40 countries, and his writings have been translated into a dozen languages. He currently promotes a new civilization-level alternative in global education, championing an emerging new “Real-World-Impact Education” paradigm that more directly benefits students and the world in which they live.
Previously in his career Marc taught French, mathematics and music and headed an alternative school in New York City, worked as a consultant at the Boston Consulting Group (and was its first Product Development Director), and founded and ran a computer game company. Marc holds an MBA degree from Harvard, with distinction, and a Master of Arts in Teaching degree from Yale.
We’ll discuss how his new education plan would work in practice. What would it take to get there from here?
more on Future Trends in this IMS blog
Law is Code: Making Policy for Artificial Intelligence
Jules Polonetsky and Omer Tene January 16, 2019
Twenty years have passed since renowned Harvard Professor Larry Lessig coined the phrase “Code is Law”, suggesting that in the digital age, computer code regulates behavior much like legislative code traditionally did. These days, the computer code that powers artificial intelligence (AI) is a salient example of Lessig’s statement.
- Good AI requires sound data. One of the principles, some would say the organizing principle, of privacy and data protection frameworks is data minimization. Data protection laws require organizations to limit data collection to the extent strictly necessary and retain data only so long as it is needed for its stated goal.
- Preventing discrimination – intentional or not.
When is a distinction between groups permissible or even merited and when is it untoward? How should organizations address historically entrenched inequalities that are embedded in data? New mathematical theories such as “fairness through awareness” enable sophisticated modeling to guarantee statistical parity between groups.
- Assuring explainability – technological due process. In privacy and freedom of information frameworks alike, transparency has traditionally been a bulwark against unfairness and discrimination. As Justice Brandeis once wrote, “Sunlight is the best of disinfectants.”
- Deep learning means that iterative computer programs derive conclusions for reasons that may not be evident even after forensic inquiry.
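The statistical parity mentioned under the discrimination point above has a concrete form: a decision system satisfies it when each group receives positive decisions at the same rate. The sketch below checks that condition on illustrative data; the numbers are invented for the example, not drawn from any real system.

```python
# Minimal sketch of a statistical-parity check: compare the rate of
# positive decisions (e.g., loan approvals) across two groups.

def positive_rate(decisions):
    """Fraction of decisions that are positive (1)."""
    return sum(decisions) / len(decisions)

def parity_gap(group_a, group_b):
    """Absolute difference in positive-decision rates between groups.

    A gap of 0 means exact statistical parity.
    """
    return abs(positive_rate(group_a) - positive_rate(group_b))

# Illustrative outcomes: 1 = approved, 0 = denied
group_a = [1, 1, 0, 1, 0, 1, 1, 0]  # 5/8 approved
group_b = [1, 0, 0, 1, 0, 0, 1, 0]  # 3/8 approved

gap = parity_gap(group_a, group_b)
print(f"parity gap: {gap:.3f}")  # 0.250 here
```

In practice a regulator or auditor would set a tolerance on this gap; the "fairness through awareness" line of work goes further, constraining the model itself so that similar individuals are treated similarly rather than only equalizing group-level rates after the fact.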
Yet even with code as law and a rising need for law in code, policymakers do not need to become mathematicians, engineers and coders. Instead, institutions must develop and enhance their technical toolbox by hiring experts and consulting with top academics, industry researchers and civil society voices. Responsible AI requires access not only to lawyers, ethicists and philosophers but also to technical leaders and subject matter experts, to ensure an appropriate balance between economic and scientific benefits to society on the one hand and individual rights and freedoms on the other.
more on AI in this IMS blog