Two decades ago, as Apple’s operations chief, Mr. Cook spearheaded the company’s entrance into China, a move that helped make Apple the most valuable company in the world and made him the heir apparent to Steve Jobs. Apple now assembles nearly all of its products and earns a fifth of its revenue in the China region. But just as Mr. Cook figured out how to make China work for Apple, China is making Apple work for the Chinese government.
Mr. Cook often talks about Apple’s commitment to civil liberties and privacy. But to stay on the right side of Chinese regulators, his company has put the data of its Chinese customers at risk.
Your weekly reminder that Oracle’s PeopleSoft shapes vast aspects of the lives of faculty, students and staff in universities in many countries around the world… https://t.co/06Ajvyc7um
U.S. Policy on China May Move from ‘America First’ to America & Co.
A tech entrepreneur in the State Department is using network theory to counter Chinese pressure.
According to Krach, the Clean Network includes 180 telecom companies and 50 national governments that represent two-thirds of the world’s gross domestic product. Although that’s impressive, not all countries are equally committed.
The task of forming networks to counter China’s influence has been made easier by China itself, which has frightened and angered trading partners with its “wolf warrior” diplomacy, a newly belligerent pursuit of China’s national interests.
The Clean Network is to China what George Kennan’s “long telegram” [PDF] of 1946 was to the Soviet Union, wrote David Fidler, adjunct senior fellow for cybersecurity and global health at the Council on Foreign Relations, in a blog post in October.
But trade deals alone are not enough, says Martijn Rasser, a senior fellow at the Center for a New American Security. For instance, they wouldn’t stop China from exporting its surveillance technology to countries such as Venezuela and Uganda, where it’s been used to target political activists, he says.
Algorithmic proctoring software has been around for several years, but its use exploded as the COVID-19 pandemic forced schools to quickly transition to remote learning. Proctoring companies cite studies estimating that between 50 and 70 percent of college students will attempt some form of cheating, and warn that cheating will be rampant if students are left unmonitored in their own homes.
Like many other tech companies, they also balk at the suggestion that they are responsible for how their software is used. While their algorithms flag behavior that the designers have deemed suspicious, these companies argue that the ultimate determination of whether cheating occurred rests in the hands of the class instructor.
As more evidence emerges about how the programs work, and fail to work, critics say the tools are bound to hurt low-income students, students with disabilities, students with children or other dependents, and other groups who already face barriers in higher education.
“Each academic department has almost complete agency to design their curriculum as far as I know, and each professor has the freedom to design their own exams and use whatever monitoring they see fit,” Rohan Singh, a computer engineering student at Michigan State University, told Motherboard.
After students approached faculty members at the University of California, Santa Barbara, the faculty association sent a letter to the school’s administration raising concerns about whether ProctorU would share student data with third parties. In response, a ProctorU attorney threatened to sue the faculty association for defamation and violating copyright law (because the association had used the company’s name and linked to its website). He also accused the faculty association of “directly impacting efforts to mitigate civil disruption across the United States” by interfering with education during a national emergency, and said he was sending his complaint to the state’s Attorney General.
Here is a link to a community discussion of this and similar software:
Investment continues to flow to ed tech, with $803 million injected during the first six months of the year, according to the industry news website EdSurge. But half of that went to just six companies, including the celebrity tutorial provider MasterClass, the online learning platform Udemy and the school and college review site Niche.
From the outside, the ed-tech sector may appear as if “there’s a bonanza and it’s like the dot-com boom again and everybody’s printing money,” said Michael Hansen, CEO of the K-12 and higher education digital learning provider Cengage. “That is not the case.”
Even if they want to buy more ed-tech tools, meanwhile, schools and colleges are short on cash. Expenses for measures to deal with Covid-19 are up, while budgets are expected to be down.
Analysts and industry insiders now expect a wave of acquisitions as already-dominant brands like these seek to corner even more of the market by snatching up smaller players that provide services they don’t.
++++++++++++++++
Tech-based contact tracing could put schools in murky privacy territory
A white paper from the Surveillance Technology Oversight Project (STOP) suggests the use of contact tracing technology by schools could erode student privacy and may not be effective in preventing the spread of coronavirus.
Despite the pandemic, schools still must conform to the Family Educational Rights and Privacy Act (FERPA) and other laws governing student privacy. Districts can disclose information to public health officials, for example, but information can’t be released to the general public without written consent from parents.
The Safely Reopen Schools mobile app is one tool available for automating contact tracing. The idea is that if two mobile phones are close enough to connect via Bluetooth, the phone owners are close enough to transmit the virus. The app includes daily health check-ins and educational notifications, but no personal information is exchanged between the phones, and the app won’t disclose who tested positive.
Colleges are also using apps to help trace and track students’ exposure to coronavirus. In August, 20,000 participants from the University of Alabama at Birmingham were asked to test the GuideSafe mobile app, which will alert them if they’ve been in contact with someone who tested positive for COVID-19. The app determines the proximity of two people through cell phone signal strength. If someone reports they contracted the virus, an alert will be sent to anyone who has been within six feet of them for at least 15 minutes over the previous two weeks.
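The alerting rule described above (within six feet of a person who later tests positive, for at least 15 minutes, over the previous two weeks) can be sketched in a few lines. This is an illustrative model only, not GuideSafe’s actual implementation; the `Contact` record and its fields (an anonymized ID, encounter start time, duration, and a distance estimated from Bluetooth signal strength) are assumptions made for the sketch.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Contact:
    other_id: str       # anonymized identifier broadcast over Bluetooth
    start: datetime     # when the encounter began
    minutes: float      # duration of the encounter, in minutes
    distance_ft: float  # distance estimated from signal strength (RSSI)

def exposed_contacts(contacts, reported_at,
                     max_distance_ft=6.0, min_minutes=15.0, window_days=14):
    """Return the anonymized IDs to alert: anyone who was within
    max_distance_ft for at least min_minutes (cumulative) during the
    window_days before a positive test was reported."""
    cutoff = reported_at - timedelta(days=window_days)
    minutes_by_id = {}
    for c in contacts:
        if c.start >= cutoff and c.distance_ft <= max_distance_ft:
            minutes_by_id[c.other_id] = minutes_by_id.get(c.other_id, 0.0) + c.minutes
    return {cid for cid, total in minutes_by_id.items() if total >= min_minutes}

# Example: two short nearby encounters with "a" add up past the threshold;
# "b" is outside the two-week window; "c" was never within six feet.
now = datetime(2020, 9, 1)
log = [
    Contact("a", now - timedelta(days=2), 10.0, 4.0),
    Contact("a", now - timedelta(days=2, hours=1), 8.0, 5.0),
    Contact("b", now - timedelta(days=20), 60.0, 3.0),
    Contact("c", now - timedelta(days=1), 30.0, 10.0),
]
print(exposed_contacts(log, now))  # {'a'}
```

Note that the sketch accumulates encounter minutes per contact, so repeated brief encounters can cross the 15-minute threshold; whether real apps aggregate this way varies by implementation, which is part of why critics question their accuracy.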
Critics of the technology claim these apps aren’t actually capable of contact tracing and could undermine manual efforts to do so.
Digital ethics, which I define simply as “doing the right thing at the intersection of technology innovation and accepted social values.”
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, written by Cathy O’Neil in early 2016, continues to be relevant and illuminating. O’Neil’s book revolves around her insight that “algorithms are opinions embedded in code,” in distinct contrast to the belief that algorithms are based on—and produce—indisputable facts.
Safiya Umoja Noble’s book Algorithms of Oppression: How Search Engines Reinforce Racism
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power
Instructure has made it clear, in its own language, that it views the student data it has aggregated as one of its chief assets, although it has also insisted that it does not use that data improperly. My note: “improperly” is relative and requires defining.
Yet an article published in the Virginia Journal of Law and Technology, titled “Transparency and the Marketplace for Student Data,” pointed out that there is “an overall lack of transparency in the student information commercial marketplace and an absence of law to protect student information.” As such, some students at the University of California are concerned that — despite reassurances to the contrary — their institution’s new financial relationship with Thoma Bravo will mean their personal data can be sold or otherwise misused.
Once surveillance capabilities such as an encryption backdoor for the “good guys” are available, it’s just a matter of time until the “good guys” turn bad or abuse their power.
By stressing that tech companies would have to decrypt sensitive information only after a court issues a warrant, the three senators believe they can swing public opinion in favor of this encryption backdoor law.