corporate surveillance

Behind the One-Way Mirror: A Deep Dive Into the Technology of Corporate Surveillance
BY BENNETT CYPHERS DECEMBER 2, 2019

https://www.eff.org/wp/behind-the-one-way-mirror

Corporations have built a hall of one-way mirrors: from the inside, you can see only apps, web pages, ads, and yourself reflected by social media. But in the shadows behind the glass, trackers quietly take notes on nearly everything you do. These trackers are not omniscient, but they are widespread and indiscriminate. The data they collect and derive is not perfect, but it is nevertheless extremely sensitive.

In real-time ad auctions, a data-snorting company can simply make low bids to ensure it never wins, while pocketing the user data contained in every bid request for nothing. This is a flaw in the implied deal where you trade data for benefits.
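A minimal sketch of how that flaw works, assuming a simplified OpenRTB-style exchange (the endpoint, log file and payload shape here are invented for illustration): the bidder answers every auction with a token bid it can never win, yet keeps the user data that arrived in the bid request.

```python
# Illustrative sketch only -- not any real company's code. A bid endpoint
# that lowballs every auction while logging the user data in each request.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class LowballBidder(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        bid_request = json.loads(self.rfile.read(length))
        # The bid request itself carries the prize: device, location, user IDs.
        with open("harvested.jsonl", "a") as log:
            log.write(json.dumps(bid_request.get("device", {})) + "\n")
        # Bid a token amount that will effectively never win the auction.
        body = json.dumps({"seatbid": [{"bid": [{"price": 0.0001}]}]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), LowballBidder).serve_forever()
```

The exchange pays nothing out, but every auction the bidder “loses” leaves it with another record of who you are and where you were.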

You can limit what you give away by blocking tracking cookies. Unfortunately, you can still be tracked by other techniques. These include web beacons, browser fingerprinting and behavioural data such as mouse movements, pauses and clicks, or sweeps and taps.
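To see why blocking cookies is not enough, here is a minimal sketch of server-side fingerprinting: a handful of ordinary request attributes, hashed together, is often distinctive enough to re-identify a visitor with no cookie at all. The signal list is deliberately simplified.

```python
# Minimal fingerprinting sketch: combine a few request signals into a
# stable identifier. Real trackers use many more signals (canvas rendering,
# fonts, screen size, plugins), which makes the hash far more unique.
import hashlib

def fingerprint(headers: dict, ip: str) -> str:
    signals = [
        headers.get("User-Agent", ""),
        headers.get("Accept-Language", ""),
        headers.get("Accept-Encoding", ""),
        ip,
    ]
    return hashlib.sha256("|".join(signals).encode()).hexdigest()[:16]

print(fingerprint(
    {"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)",
     "Accept-Language": "en-US,en;q=0.9",
     "Accept-Encoding": "gzip, deflate, br"},
    "203.0.113.7",
))
```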

The EU’s GDPR (General Data Protection Regulation) was a baby step in the right direction. BOWM (“Behind the One-Way Mirror”) also mentions Vermont’s data broker law, the Illinois Biometric Information Privacy Act (BIPA) and next year’s California Consumer Privacy Act (CCPA).

Tor Browser, the original anti-surveillance browser, is a heavily modified version of Firefox’s Extended Support Release (ESR).

Most other browsers are now, like Chrome, based on Google’s open source Chromium. Once enough web developers started coding for Chrome instead of for open standards, it became arduous and expensive to sustain alternative browser engines. Chromium-based browsers now include Opera, Vivaldi, Brave, the Epic Privacy Browser and next year’s new Microsoft Edge.

+++++++++++++
more on surveillance in this IMS blog
https://blog.stcloudstate.edu/ims?s=surveillance

deepfake Zao

https://www.theguardian.com/technology/2019/sep/02/chinese-face-swap-app-zao-triggers-privacy-fears-viral

Released on Friday, the Zao app went viral as Chinese users seized on the chance to see themselves act out scenes from well-known movies using deepfake technology, which has already prompted concerns elsewhere over potential misuse.

As of Monday afternoon it remained the top free download in China, according to the app market data provider App Annie.

Concerns over deepfakes have grown since the 2016 US election campaign, which saw wide use of online misinformation, according to US investigations.

In June, Facebook’s chief executive, Mark Zuckerberg, said the social network was struggling to find ways to deal with deepfake videos, saying they may constitute “a completely different category” of misinformation than anything faced before.

++++++++++
more on deepfake in this IMS blog
https://blog.stcloudstate.edu/ims?s=deepfake

surveillance technology and education

https://www.edsurge.com/news/2019-06-10-is-school-surveillance-going-too-far-privacy-leaders-urge-a-slow-down

New York’s Lockport City School District is using public funds from a Smart Schools bond to help pay for a reported $3.8 million security system that uses facial recognition technology to identify individuals who don’t belong on campus.

The Lockport case has drawn the attention of national media, the ire of many parents, and criticism from the New York Civil Liberties Union, among other privacy groups.

The Future of Privacy Forum (FPF), a nonprofit think tank based in Washington, D.C., published an animated video that illustrates the possible harm that surveillance technology can cause to children and the steps schools should take before making any decisions, such as identifying specific goals for the technology and establishing who will have access to the data and for how long.

A few days later, the nonprofit Center for Democracy and Technology, in partnership with New York University’s Brennan Center for Justice, released a brief examining the same topic.

My note: the same considerations were relayed to the SCSU SOE dean regarding the purchase of Promethean and its installation in the SOE building without discussion with the faculty, who work with technology. This information was also shared with the dean: https://blog.stcloudstate.edu/ims/2018/10/31/students-data-privacy/

++++++++++++
more on surveillance in education in this IMS blog
https://blog.stcloudstate.edu/ims?s=surveillance+education

bluetooth and surveillance

https://www.nytimes.com/interactive/2019/06/14/opinion/bluetooth-wireless-tracking-privacy.html

Recent reports have noted how companies use data gathered from cell towers, ambient Wi-Fi, and GPS. But the location data industry has a much more precise, and unobtrusive, tool: Bluetooth beacons.

Most people aren’t aware they are being watched with beacons, but the “beacosystem” tracks millions of people every day. Beacons are placed at airports, malls, subways, buses, taxis, sporting arenas, gyms, hotels, hospitals, music festivals, cinemas and museums, and even on billboards.

Companies like Reveal Mobile collect data from software development kits inside hundreds of frequently used apps. In the United States, another company, inMarket, covers 38 percent of millennial moms and about one-quarter of all smartphones, and tracks 50 million people each month. Other players have similar reach.

What is an S.D.K.? A Software Development Kit is code that’s inserted into an app and enables certain features, like activating your phone’s Bluetooth sensor. Location data companies create S.D.K.s, and developers insert them into their apps, creating a conduit for recording and storing your movement data.
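In spirit, the beacon-facing part of such an S.D.K. boils down to very little code. The sketch below uses the real cross-platform `bleak` library (`pip install bleak`) to scan for nearby Bluetooth advertisements; the collector endpoint and payload shape are invented for illustration, and a real S.D.K. would upload silently rather than print.

```python
# Sketch of what a location S.D.K. effectively does: scan for nearby BLE
# beacons and package the sightings for a collector. Endpoint is hypothetical.
import asyncio
import json
import time

from bleak import BleakScanner  # real library: pip install bleak

COLLECTOR = "https://collector.example.com/sightings"  # hypothetical endpoint

async def report_beacons() -> None:
    devices = await BleakScanner.discover(timeout=5.0)
    sightings = [
        {"address": d.address, "name": d.name, "seen_at": time.time()}
        for d in devices
    ]
    # A real S.D.K. would POST this in the background; printing keeps the
    # sketch self-contained and visible.
    print(json.dumps({"endpoint": COLLECTOR, "sightings": sightings}, indent=2))

asyncio.run(report_beacons())
```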

Beacons are also being used for smart cities initiatives. The location company Gimbal provided beacons for LinkNYC kiosks that provoked privacy concerns about tracking passers-by. Beacon initiatives have been started in other cities, including Amsterdam (in partnership with Google), London and Norwich.

Familiar tech giants are also players in the beacosystem. In 2015, Facebook began shipping free Facebook Bluetooth beacons to businesses for location marketing inside the Facebook app. Leaked documents show that Facebook worried that users would “freak out” and spread “negative memes” about the program. The company recently removed the Facebook Bluetooth beacons section from their website.

Not to be left out, in 2017, Google introduced Project Beacon and began sending beacons to businesses for use with Google Ads services. Google uses the beacons to send the businesses’ visitors notifications that ask them to leave photos and reviews, among other features. And last year, investigators at Quartz found that Google Android can track you using Bluetooth beacons even when you turn Bluetooth off in your phone.

Companies collecting micro-location data defend the practice by arguing that users can opt out of location services. They maintain that consumers embrace targeted ads because they’re more relevant.

You can download an app like Beacon Scanner and scan for beacons when you enter a store. But even if you detect the beacons, you don’t know who is collecting the data.

See The Times’s guide on how to stop apps from tracking your location. For Android users, the F-Droid app store hosts free and open-source apps that do not spy on users with hidden trackers.

++++++++++++++++++++
More on surveillance in this IMS Blog
https://blog.stcloudstate.edu/ims?s=surveillance

data driven education

https://www.kqed.org/mindshift/45396/whats-at-risk-when-schools-focus-too-much-on-student-data

The U.S. Department of Education emphasizes “ensuring the use of multiple measures of school success based on academic outcomes, student progress, and school quality.”

We are starting to hear more about what might be lost when schools focus too much on data. Here are five arguments against the excesses of data-driven instruction.

1) Decreased Motivation

An overemphasis on measurement can act as stereotype threat, threatening students’ sense of belonging, which is key to academic motivation.

2) Helicoptering

A style of overly involved “intrusive parenting” has been associated in studies with increased levels of anxiety and depression when students reach college.

3) Commercial Monitoring and Marketing

The National Education Policy Center releases annual reports on commercialization and marketing in public schools. In its most recent report in May, researchers there raised concerns about targeted marketing to students using computers for schoolwork and homework.

Companies like Google pledge not to track the content of schoolwork for the purposes of advertising. But in reality these boundaries can be a lot more porous.

4) Missing What Data Can’t Capture

5) Exposing Students’ “Permanent Records”

In the past few years several states have passed laws banning employers from looking at the credit reports of job applicants.
Similarly, for young people who get in trouble with the law, there is a procedure for sealing juvenile records
Educational transcripts, unlike credit reports or juvenile court records, are currently considered fair game for gatekeepers like colleges and employers. These records, though, are getting much more detailed.

education algorithms

https://www.edsurge.com/news/2016-06-10-humanizing-education-s-algorithms

Edtech tools increasingly rely on predictive algorithms to better target students’ individual learning needs.

Personalized learning is a lofty aim, however you define it. To truly meet each student where they are, we would have to know their most intimate details, or discover them through their interactions with our digital tools. We would need to track their moods and preferences, their fears and beliefs…perhaps even their memories.

There’s something unsettling about capturing users’ most intimate details. Any prediction model based on historical records risks typecasting the very people it is intended to serve. Even if models can overcome the threat of discrimination, there is still an ethical question to confront – just how much are we entitled to know about students?

We can accept that tutoring algorithms, for all their processing power, are inherently limited in what they can account for. This means steering clear of mythical representations of what such algorithms can achieve. It may even mean giving up on personalization altogether. The alternative is to pack our algorithms to suffocation at the expense of users’ privacy. This approach does not end well.

There is only one way to resolve this trade-off: loop in the educators.

Algorithms and data must exist to serve educators
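What “looping in the educators” can mean in practice is shown in this hedged sketch: the model only nominates students for review, and the final judgment is explicitly a human’s. Every name and threshold here is invented for illustration.

```python
# Human-in-the-loop sketch: the algorithm suggests, the educator decides.
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    predicted_risk: float  # output of some upstream model, between 0 and 1

def triage(students: list, threshold: float = 0.7) -> list:
    """Flag students the model thinks need attention -- as suggestions only."""
    flagged = [s for s in students if s.predicted_risk >= threshold]
    for s in flagged:
        print(f"Suggest review: {s.name} (risk={s.predicted_risk:.2f}) -- educator decides")
    return flagged

triage([Student("A", 0.85), Student("B", 0.40), Student("C", 0.72)])
```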

 

++++++++++++
more on algorithms in this IMS blog
blog.stcloudstate.edu/ims?s=algor

ARLD 2019

Paul Goodman

Technology is a branch of moral philosophy, not of science

The process of making technology is design

Design is a branch of moral philosophy, not of science

 

System design reflects the designer’s values and the cultural context

Andreas Orphanides

 

Fulbright BOYD

 

A Bulgarian professor of Byzantine history: anything less than 200 years old is politics, not history

 

Access, privacy, equity – values for the professional organization ARLD.

 

Mike Monteiro

This is how bad design makes it out into the world: not due to malicious intent, but with no intent at all

 

Cody Hanson

Our expertise, our service ethic, and our values remain our greatest strengths. But for us to have the impact we seek in the lives of our users, we must encode our services and our values into the software

Ethical design.

Design interprets the world to create useful objects. Ethical design closes the loop, imagining how those objects will affect the world.

 

A good science fiction story should be able to predict not the automobile but the traffic jam. – Frederik Pohl

Victor Papanek: The designer’s social and moral judgment must be brought into play long before she begins to design.

 

We need to fear the consequences of our work more than we love the cleverness of our ideas. – Mike Monteiro

Analytics

Qualitative and quantitative data – librarians love data: usage, ILL, course reserves – QQLM.

IDEO – the goal of design research isn’t to collect data, it is to synthesize information and provide insight and guidance that leads to action.

Google Analytics: the trade-off. Besides privacy concerns, sometimes data and analytics are the only thing we can see.

Frank Chimero – remove a person’s humanity and she is just a curiosity, a pinpoint on a map, a line in a list, an entry in a database. A person turns into a granular bit of information.

Gale Analytics on Demand – similar to the keynote speaker at Macalester LibTech 2019. https://www.facebook.com/InforMediaServices/posts/1995793570531130?comment_id=1995795043864316&comment_tracking=%7B%22tn%22%3A%22R%22%7D

personas

By designing for yourself or your team, you are potentially building discrimination right into your product. – Erika Hall

Search algorithms.

What is relevance? It is whatever the ranking algorithm says it is – and relevant for whom (which patron)? Crummy searches.
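“Relevance” is a design decision, not a fact of nature. A toy TF-IDF ranker makes the point: every constant below – how terms are split, how rarity is weighted – is a choice someone made on the patron’s behalf. The example corpus is invented.

```python
# Toy TF-IDF ranking sketch: relevance is whatever the formula says it is.
import math
from collections import Counter

docs = [
    "privacy and surveillance in libraries",
    "library course reserves usage data",
    "surveillance capitalism and data privacy",
]

def rank(query: str, docs: list) -> list:
    tokenized = [d.split() for d in docs]       # choice: whitespace tokens
    n = len(docs)

    def idf(term: str) -> float:                # choice: smoothed log weighting
        df = sum(term in words for words in tokenized)
        return math.log((n + 1) / (df + 1)) + 1

    scores = [sum(Counter(words)[t] * idf(t) for t in query.split())
              for words in tokenized]
    return sorted(zip(scores, docs), reverse=True)

for score, doc in rank("privacy data", docs):
    print(f"{score:.2f}  {doc}")
```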

Reckless associations – made by humans or computers – can do very real harm, especially when they appear in supposedly neutral environments.

Donna Lanclos and Andrew Asher: ethnography should be core to the business of the library.

Technology as an information ecology: we and our tools co-evolve. Be prepared to start asking questions to see the effect of our design choices.

Ethnography of the library: touchpoint tours – a student gives the librarians a tour or draws a map of the library, giving a sense of what spaces they use and what is important to them. “Ethnographish.”

Q from the audience: if instructors warn against Google and Wikipedia and steer students to the library and its databases, how do you now warn about the perils of database bias? A: Put out fires, and systematically try to build this into existing initiatives – the biannual magazine, and as many other places as possible.

data interference

APRIL 21, 2019 Zeynep Tufekci

Think You’re Discreet Online? Think Again

Because of technological advances and the sheer amount of data now available about billions of other people, discretion no longer suffices to protect your privacy. Computer algorithms and network analyses can now infer, with a sufficiently high degree of accuracy, a wide range of things about you that you may have never disclosed, including your moods, your political beliefs, your sexual orientation and your health.

There is no longer such a thing as individually “opting out” of our privacy-compromised world.

In 2017, the newspaper The Australian published an article, based on a leaked document from Facebook, revealing that the company had told advertisers that it could predict when younger users, including teenagers, were feeling “insecure,” “worthless” or otherwise in need of a “confidence boost.” Facebook was apparently able to draw these inferences by monitoring photos, posts and other social media data.

In 2017, academic researchers, armed with data from more than 40,000 Instagram photos, used machine-learning tools to accurately identify signs of depression in a group of 166 Instagram users. Their computer models turned out to be better predictors of depression than humans who were asked to rate whether photos were happy or sad and so forth.
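The general technique is unsettlingly simple. Below is a hedged sketch in the spirit of that study, on synthetic data: the real work used 40,000+ photos and richer features, but even a plain classifier over crude color statistics (the study reported that depressed users’ photos skewed bluer, grayer and darker) can separate the groups.

```python
# Sketch only: synthetic "photo features" [hue, saturation, brightness],
# not the study's real data or model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
healthy = rng.normal([0.55, 0.60, 0.65], 0.08, size=(200, 3))
depressed = rng.normal([0.60, 0.45, 0.50], 0.08, size=(200, 3))  # bluer, grayer, darker

X = np.vstack([healthy, depressed])
y = np.array([0] * 200 + [1] * 200)

model = LogisticRegression().fit(X, y)
print("training accuracy:", model.score(X, y))
```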

Computational inference can also be a tool of social control. The Chinese government, having gathered biometric data on its citizens, is trying to use big data and artificial intelligence to single out “threats” to Communist rule, including the country’s Uighurs, a mostly Muslim ethnic group.

+++++++++++++

Zeynep Tufekci and Seth Stephens-Davidowitz: Privacy is over

https://www.centreforideas.com/article/zeynep-tufekci-and-seth-stephens-davidowitz-privacy-over

+++++++++++

Zeynep Tufekci writes about security and data privacy for the NY Times and about disinformation’s threat to democracy for WIRED.

++++++++++
more on privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

Gen Z and social media

Under Employers’ Gaze, Gen Z Is Biting Its Tongue On Social Media

April 13, 2019, 5:00 AM ET

https://www.npr.org/2019/04/13/702555175/under-employers-gaze-gen-z-is-biting-its-tongue-on-social-media

The oldest members of Generation Z are around 22 years old — now entering the workforce and adjusting their social media accordingly. They are holding back from posting political opinions and personal information in favor of posting about professional accomplishments.

Only about 1 in 10 teenagers say they share personal, religious or political beliefs on social media, according to a recent survey from Pew Research Center.

70 percent of employers and recruiters say they check social media during the hiring process, according to a survey conducted by CareerBuilder

Generation Z, nicknamed “iGen,” is the post-millennial generation responsible for ‘killing’ Facebook and for the rise of TikTok.

Curricula like Common Sense Education’s digital citizenship program are working to educate the younger generation on how to use social media, something the older generations were never taught.

Some users are regularly cleaning up — “re-curating” — their online profiles. Cleanup apps, like TweetDelete, automate the purge of older posts.

Gen Zers also use social media in more ephemeral ways than older generations — Snapchat stories that disappear after 24 hours, or Instagram posts that they archive a couple of months later.

Gen Zers already use a multitude of strategies to make sure their online presence is visible only to whom they want: they set their accounts to private, change their profile names or handles, even make completely separate “fake” accounts.

+++++++++++++++
more on social media in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+media

and privacy
https://blog.stcloudstate.edu/ims?s=privacy

philosophy technology

McMullan, T. (2018, April 26). How Technology Got Under Our Skin – Featured Stories. Retrieved April 2, 2019, from Medium website: https://medium.com/s/story/how-technology-got-under-our-skin-cee8a71b241b

anthropocene

Like the circle-bound symmetry of Leonardo Da Vinci’s Vitruvian Man, the meat and bones of the human race are the same in 2018 as they were in 1490. And yet, we are different.

Michael Patrick Lynch, writer and professor of philosophy at the University of Connecticut.
“The digital revolution is more like the revolution brought on by the written word. Just as the written word allowed us to time-travel — to record our thoughts for others, including ourselves, to read in the future — so the internet has allowed for a kind of tele-transportation, breaking down barriers of space and physical limitation and connecting us across the globe in ways we now take for granted, as we do the written word.”

In the book Self-Tracking, authors Gina Neff, a sociology professor at Oxford University, and Dawn Nafus, a research scientist at Intel, describe this phenomenon as a shuffling between physical signs and observed recordings: “The data becomes a ‘prosthetic of feeling.’” Advocates of this “prosthetic of feeling” argue that self-tracking can train people to recognize their own body signals, tuning the senses to allow for a greater grasp of biological rhythms.

But what if the body-as-data is exploited by the state, or by an insurance company that can predict when you’ll get diabetes, or a data analytics firm that can use it to help sway elections? The Chinese government is going so far as to plan a social credit score for its citizens by 2020, giving each of the country’s 1.3 billion residents a reputation number based on economic and social status. What is particularly subtle about all this is that, like a scientific épistémè, our way of thinking is perhaps unconsciously guided by the configurations of knowledge these new technologies allow. We don’t question it.

Hannah Knox. Computational machines are “shaping what we expect it means to be a human”, Knox wrote for the Corsham Institute’s Observatory for a Connected Society.

Facebook goads us to remember past moments on a daily basis, the stacked boxes of tape in Beckett’s play replaced with stacks of servers in remote data centers in northern Sweden. “There is reasonable evidence that [the internet] has reduced our internal memory ability,” says Phil Reed, a professor of psychology at Swansea University.

Moderate tech use correlated with positive mental health, according to a paper published in Psychological Science by Andrew Przybylski of Oxford and Netta Weinstein at Cardiff University, who surveyed 120,000 British 15-year-olds.

Again, the crucial question is one of control. If our way of thinking is changed by our intimacy with these technologies, then is this process being directed by individuals, or the ledgers of private companies, or governments keen on surveilling their citizens? If we conceive of these systems as extensions of our own brains, what happens if they collapse?

Brain-machine interfaces (BMI) are coming on in leaps and bounds, with companies like Neuralink and CTRL-Labs in the United States exploring both surgical and noninvasive processes that allow computers to be controlled directly by signals from the brain. It’s a field that involves fundamentally changing the relationship between our minds, bodies, and machines. Kevin Warwick, emeritus professor at Coventry University, is a pioneer in implant technology.

+++++++++
more on philosophy in this IMS blog
https://blog.stcloudstate.edu/ims?s=philosophy
