
surveillance technology and education

https://www.edsurge.com/news/2019-06-10-is-school-surveillance-going-too-far-privacy-leaders-urge-a-slow-down

New York’s Lockport City School District is using public funds from a Smart Schools bond to help pay for a reported $3.8 million security system that uses facial recognition technology to identify individuals who don’t belong on campus.

The Lockport case has drawn the attention of national media, the ire of many parents and criticism from the New York Civil Liberties Union, among other privacy groups.

the Future of Privacy Forum (FPF), a nonprofit think tank based in Washington, D.C., published an animated video that illustrates the possible harm that surveillance technology can cause to children and the steps schools should take before making any decisions, such as identifying specific goals for the technology and establishing who will have access to the data and for how long.

A few days later, the nonprofit Center for Democracy and Technology, in partnership with New York University’s Brennan Center for Justice, released a brief examining the same topic.

My note: the same considerations were relayed to the SCSU SOE dean in regard to the purchase of Promethean and its installation in the SOE building without discussion with the faculty who work with technology. This information was also shared with the dean: https://blog.stcloudstate.edu/ims/2018/10/31/students-data-privacy/

++++++++++++
more on surveillance in education in this IMS blog
https://blog.stcloudstate.edu/ims?s=surveillance+education

bluetooth and surveillance

https://www.nytimes.com/interactive/2019/06/14/opinion/bluetooth-wireless-tracking-privacy.html

Recent reports have noted how companies use data gathered from cell towers, ambient Wi-Fi, and GPS. But the location data industry has a much more precise, and unobtrusive, tool: Bluetooth beacons.

Most people aren’t aware they are being watched with beacons, but the “beacosystem” tracks millions of people every day. Beacons are placed at airports, malls, subways, buses, taxis, sporting arenas, gyms, hotels, hospitals, music festivals, cinemas and museums, and even on billboards.

Companies like Reveal Mobile collect data from software development kits inside hundreds of frequently used apps. In the United States, another company, inMarket, covers 38 percent of millennial moms and about one-quarter of all smartphones, and tracks 50 million people each month. Other players have similar reach.

What is an S.D.K.? A Software Development Kit is code that’s inserted into an app and enables certain features, like activating your phone’s Bluetooth sensor. Location data companies create S.D.K.s and developers insert them into their apps, creating a conduit for recording and storing your movement data.

Beacons are also being used for smart cities initiatives. The location company Gimbal provided beacons for LinkNYC kiosks that provoked privacy concerns about tracking passers-by. Beacon initiatives have been started in other cities, including Amsterdam (in partnership with Google), London and Norwich.

Familiar tech giants are also players in the beacosystem. In 2015, Facebook began shipping free Facebook Bluetooth beacons to businesses for location marketing inside the Facebook app. Leaked documents show that Facebook worried that users would “freak out” and spread “negative memes” about the program. The company recently removed the Facebook Bluetooth beacons section from their website.

Not to be left out, in 2017, Google introduced Project Beacon and began sending beacons to businesses for use with Google Ads services. Google uses the beacons to send the businesses’ visitors notifications that ask them to leave photos and reviews, among other features. And last year, investigators at Quartz found that Google Android can track you using Bluetooth beacons even when you turn Bluetooth off in your phone.

Companies collecting micro-location data defend the practice by arguing that users can opt out of location services. They maintain that consumers embrace targeted ads because they’re more relevant.

You can download an app like Beacon Scanner and scan for beacons when you enter a store. But even if you detect the beacons, you don’t know who is collecting the data.
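For a sense of what such a scanner does under the hood, here is a minimal sketch in Python using the open-source bleak library (the library choice and output format are my own; any nearby BLE advertiser, beacon or not, will show up):

# pip install bleak
import asyncio
from bleak import BleakScanner

async def scan_for_beacons(seconds: float = 5.0) -> None:
    # Listen for BLE advertisements; Bluetooth beacons are simply
    # devices that broadcast such advertisements on a fixed interval.
    devices = await BleakScanner.discover(timeout=seconds)
    for d in devices:
        # All you see is a hardware address and (sometimes) a name --
        # nothing about who collects sightings of this beacon.
        print(d.address, d.name)

asyncio.run(scan_for_beacons())

The output underlines the article’s point: you can see that beacons are present, but learn nothing about who harvests the sightings.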

The Times’s guide explains how to stop apps from tracking your location. For Android users, the F-Droid app store hosts free and open-source apps that do not spy on users with hidden trackers.

++++++++++++++++++++
More on surveillance in this IMS Blog
https://blog.stcloudstate.edu/ims?s=surveillance

break up Facebook

https://nyti.ms/2LzRzwq

Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares. Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.

We are a nation with a tradition of reining in monopolies, no matter how well intentioned the leaders of these companies may be. Mark’s power is unprecedented and un-American.

It is time to break up Facebook.

America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances.

More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable.

Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective.

American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.

From our earliest days, Mark used the word “domination” to describe our ambitions, with no hint of irony or humility.

Facebook’s monopoly is also visible in its usage statistics. About 70 percent of American adults use social media, and a vast majority are on Facebook products. Over two-thirds use the core site, a third use Instagram, and a fifth use WhatsApp. By contrast, fewer than a third report using Pinterest, LinkedIn or Snapchat. What started out as lighthearted entertainment has become the primary way that people of all ages communicate online.

The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved.

The News Feed algorithm reportedly prioritized videos created through Facebook over videos from competitors, like YouTube and Vimeo. In 2012, Twitter introduced a video network called Vine that featured six-second videos. That same day, Facebook blocked Vine from hosting a tool that let its users search for their Facebook friends while on the new network. The decision hobbled Vine, which shut down four years later.

Unlike Vine, Snapchat wasn’t interfacing with the Facebook ecosystem; there was no obvious way to handicap the company or shut it out. So Facebook simply copied it. (Copyright law does not extend to the abstract concept itself.)

As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon). Meanwhile, there has been plenty of innovation in areas where there is no monopolistic domination, such as in workplace productivity (Slack, Trello, Asana), urban transportation (Lyft, Uber, Lime, Bird) and cryptocurrency exchanges (Ripple, Coinbase, Circle).

The choice is mine, but it doesn’t feel like a choice. Facebook seeps into every corner of our lives to capture as much of our attention and data as possible and, without any alternative, we make the trade.

Just last month, Facebook seemingly tried to bury news that it had stored tens of millions of user passwords in plain text format, which thousands of Facebook employees could see. Competition alone wouldn’t necessarily spur privacy protection — regulation is required to ensure accountability — but Facebook’s lock on the market guarantees that users can’t protest by moving to alternative platforms.

Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished. Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values. The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection.

As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns.

Mark may never have a boss, but he needs to have some check on his power. The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.

++++++++++++++++++++

We Don’t Need Social Media

The push to regulate or break up Facebook ignores the fact that its services do more harm than good

Colin Horgan, May 13, 2019

https://onezero.medium.com/we-dont-need-social-media-53d5455f4f6b

Hughes joins a growing chorus of former Silicon Valley unicorn riders who’ve recently had second thoughts about the utility or benefit of the surveillance-attention economy their products and platforms have helped create. He is also not the first to suggest that government might need to step in to clean up the mess they made.

Nick Srnicek, author of the book Platform Capitalism and a lecturer in digital economy at King’s College London, wrote last month, “[I]t’s competition — not size — that demands more data, more attention, more engagement and more profits at all costs.”

 

++++++++++++++++++++
more on Facebook in this IMS blog
https://blog.stcloudstate.edu/ims?s=facebook

education algorithms

https://www.edsurge.com/news/2016-06-10-humanizing-education-s-algorithms

predictive algorithms to better target students’ individual learning needs.

Personalized learning is a lofty aim, however you define it. To truly meet each student where they are, we would have to know their most intimate details, or discover them through their interactions with our digital tools. We would need to track their moods and preferences, their fears and beliefs…perhaps even their memories.

There’s something unsettling about capturing users’ most intimate details. Any prediction model based on historical records risks typecasting the very people it is intended to serve. Even if models can overcome the threat of discrimination, there is still an ethical question to confront – just how much are we entitled to know about students?

We can accept that tutoring algorithms, for all their processing power, are inherently limited in what they can account for. This means steering clear of mythical representations of what such algorithms can achieve. It may even mean giving up on personalization altogether. The alternative is to pack our algorithms to suffocation at the expense of users’ privacy. This approach does not end well.

There is only one way to resolve this trade-off: loop in the educators.

Algorithms and data must exist to serve educators

 

++++++++++++
more on algorithms in this IMS blog
blog.stcloudstate.edu/ims?s=algor

ARLD 2019


Paul Goodman

Technology is a branch of moral philosophy, not of science

The process of making technology is design

Design is a branch of moral philosophy, not of science

 

System design reflects the designer’s values and the cultural context

Andreas Orphanides

 

Fulbright BOYD

 

A Bulgarian professor of Byzantine history: anything from the last 200 years is politics, not history

 

Access, privacy, equity: values for the professional organization ARLD.

 

Mike Monteiro

This is how bad design makes it out into the world: not due to malicious intent, but with no intent at all.

 

Cody Hanson

Our expertise, our service ethic, and our values remain our greatest strengths. But for us to have the impact we seek in the lives of our users, we must encode our services and our values into the software.

Ethical design.

Design interprets the world to create useful objects. Ethical design closes the loop, imagining how those objects will affect the world.

 

A good science fiction story should be able to predict not the automobile, but the traffic jam. – Frederik Pohl

Victor Papanek: The designer’s social and moral judgement must be brought into play long before she begins to design.

 

We need to fear the consequences of our work more than we love the cleverness of our ideas. – Mike Monteiro

Analytics

Qualitative and quantitative data – librarians love data: usage, ILL, course reserves – QQML.

IDEO – the goal of design research isn’t to collect data, it is to synthesize information and provide insight and guidance that leads to action.

Google Analytics: the trade-off. Besides the privacy concerns, sometimes data and analytics are the only thing we can see.

Frank Chimero – remove a person’s humanity and she is just a curiosity, a pinpoint on a map, a line in a list, an entry in a database. A person turns into a granular bit of information.

Gale Analytics on Demand – similar to the keynote speaker at Macalester LibTech 2019. https://www.facebook.com/InforMediaServices/posts/1995793570531130?comment_id=1995795043864316&comment_tracking=%7B%22tn%22%3A%22R%22%7D

personas

By designing for yourself or your team, you are potentially building discrimination right into your product. – Erika Hall

Search algorithms.

What is relevance? The relevance of the ranking algorithm – relevant for whom (which patron)? Crummy searches.
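To make “the relevance of the ranking algorithm” concrete, here is a minimal sketch of TF-IDF ranking with scikit-learn (the toy catalog and query are hypothetical; real discovery systems layer many more signals on top):

# pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A toy catalog: each string stands in for one record's indexed text.
records = [
    "data privacy and surveillance in schools",
    "facial recognition technology on campus",
    "history of the byzantine empire",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(records)

# Score every record against the patron's query and rank by similarity.
query_vector = vectorizer.transform(["student data privacy"])
scores = cosine_similarity(query_vector, doc_vectors).ravel()
for score, record in sorted(zip(scores, records), reverse=True):
    print(f"{score:.2f}  {record}")

Every choice in such a pipeline – tokenization, weighting, the corpus itself – encodes a judgment about what counts as relevant, made on the patron’s behalf.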

Reckless associations – made by humans or computers – can do very real harm, especially when they appear in supposedly neutral environments.

Donna Lanclos and Andrew Asher: Ethnography should be core to the business of the library.

Technology as information ecology: we co-evolve with our tools. Prepare to start asking questions to see the effect of our design choices.

Ethnography of the library: touchpoint tours – a student gives a tour to the librarians or draws a map of the library, giving a sense of what spaces they use and what is important. “Ethnographish.”

Q from the audience: if instructors warn against Google and Wikipedia and steer students to the library and its databases, how do you now warn about the perils of database bias? A: Put out the fires, and systematically try to build this into existing initiatives: the biannual magazine, and as many other places as possible.

Gen Z and social media

Under Employers’ Gaze, Gen Z Is Biting Its Tongue On Social Media

April 13, 2019, 5:00 AM ET

https://www.npr.org/2019/04/13/702555175/under-employers-gaze-gen-z-is-biting-its-tongue-on-social-media

The oldest members of Generation Z are around 22 years old — now entering the workforce and adjusting their social media accordingly. They are holding back from posting political opinions and personal information in favor of posting about professional accomplishments.

Only about 1 in 10 teenagers say they share personal, religious or political beliefs on social media, according to a recent survey from Pew Research Center.

70 percent of employers and recruiters say they check social media during the hiring process, according to a survey conducted by CareerBuilder

Generation Z, nicknamed “iGen,” is the post-millennial generation responsible for ‘killing’ Facebook and for the rise of TikTok.

Curricula like Common Sense Education’s digital citizenship program are working to educate the younger generation on how to use social media, something the older generations were never taught.

Some users are regularly cleaning up — “re-curating” — their online profiles. Cleanup apps, like TweetDelete, automate the process.
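The mechanics behind such an app are simple; here is a hedged sketch in Python with the tweepy library (the credentials are placeholders, and the standard API only reaches a limited way back into a timeline):

# pip install tweepy
from datetime import datetime, timedelta, timezone

import tweepy

# Placeholder credentials -- supply your own app's keys.
auth = tweepy.OAuth1UserHandler(
    "API_KEY", "API_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET"
)
api = tweepy.API(auth)

# Delete everything older than a year (tweepy v4 returns
# timezone-aware datetimes, so compare against an aware cutoff).
cutoff = datetime.now(timezone.utc) - timedelta(days=365)
for status in tweepy.Cursor(api.user_timeline, count=200).items():
    if status.created_at < cutoff:
        api.destroy_status(status.id)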

Gen Zers also use social media in more ephemeral ways than older generations — Snapchat stories that disappear after 24 hours, or Instagram posts that they archive a couple of months later.

Gen Zers already use a multitude of strategies to make sure their online presence is visible only to whom they want: They set their accounts to private, change their profile names or handles, even make completely separate “fake” accounts.

+++++++++++++++
more on social media in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+media

and privacy
https://blog.stcloudstate.edu/ims?s=privacy

philosophy technology

McMullan, T. (2018, April 26). How Technology Got Under Our Skin – Featured Stories. Retrieved April 2, 2019, from Medium website: https://medium.com/s/story/how-technology-got-under-our-skin-cee8a71b241b

anthropocene

Like the circle-bound symmetry of Leonardo Da Vinci’s Vitruvian Man, the meat and bones of the human race are the same in 2018 as they were in 1490. And yet, we are different.

Michael Patrick Lynch, writer and professor of philosophy at the University of Connecticut.
“The digital revolution is more like the revolution brought on by the written word. Just as the written word allowed us to time-travel — to record our thoughts for others, including ourselves, to read in the future — so the internet has allowed for a kind of tele-transportation, breaking down barriers of space and physical limitation and connecting us across the globe in ways we now take for granted, as we do the written word.”

In the book Self-Tracking, authors Gina Neff, a sociology professor at Oxford University, and Dawn Nafus, a research scientist at Intel, describe this phenomenon as a shuffling between physical signs and observed recordings: the data becomes a “prosthetic of feeling.” Advocates of this “prosthetic of feeling” argue that self-tracking can train people to recognize their own body signals, tuning the senses to allow for a greater grasp of biological rhythms. But what if the body-as-data is exploited by the state, or by an insurance company that can predict when you’ll get diabetes, or a data analytics firm that can use it to help sway elections? The Chinese government is going so far as to plan a social credit score for its citizens by 2020, giving each of the country’s 1.3 billion residents a reputation number based on economic and social status. What is particularly subtle about all this is that, like a scientific épistémè, our way of thinking is perhaps unconsciously guided by the configurations of knowledge these new technologies allow. We don’t question it.

Hannah Knox. Computational machines are “shaping what we expect it means to be a human”, Knox wrote for the Corsham Institute’s Observatory for a Connected Society.

Facebook goads us to remember past moments on a daily basis, the stacked boxes of tape in Beckett’s play replaced with stacks of servers in remote data centers in northern Sweden. “There is reasonable evidence that [the internet] has reduced our internal memory ability,” says Phil Reed, a professor of psychology at Swansea University.

Moderate tech use correlated with positive mental health, according to a paper published in Psychological Science by Andrew Przybylski of Oxford and Netta Weinstein at Cardiff University, who surveyed 120,000 British 15-year-olds. Again, the crucial question is one of control. If our way of thinking is changed by our intimacy with these technologies, then is this process being directed by individuals, or the ledgers of private companies, or governments keen on surveilling their citizens? If we conceive of these systems as extensions of our own brains, what happens if they collapse?

Brain-machine interfaces (BMI) are coming on in leaps and bounds, with companies like Neuralink and CTRL-Labs in the United States exploring both surgical and noninvasive processes that allow computers to be controlled directly by signals from the brain. It’s a field that involves fundamentally changing the relationship between our minds, bodies, and machines. Kevin Warwick, emeritus professor at Coventry University, is a pioneer in implant technology.

+++++++++
more on philosophy in this IMS blog
https://blog.stcloudstate.edu/ims?s=philosophy

Library Technology Conference 2019

#LTC2019

Intro to XR in Libraries from Plamen Miltenoff

keynote: equitable access to information

keynote speaker

https://sched.co/JAqk
The type of data: Wikipedia. The dangers of learning from Wikipedia, and how individuals can organize to mitigate some of these dangers. Wikidata, algorithms.
IBM Watson is an AI system that uses Wikipedia, with algorithms making sense of it.
YouTube debunks conspiracy theory videos by linking to Wikipedia.

Semantic relatedness: Word2Vec.
How do the algorithms work? From a large body of unstructured text, the model picks up specific words.
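A minimal sketch of that idea with gensim’s Word2Vec (the three-sentence corpus is my stand-in; real systems train on the full Wikipedia dump):

# pip install gensim
from gensim.models import Word2Vec

# A toy stand-in for a large body of unstructured text,
# already split into tokenized sentences.
sentences = [
    ["library", "provides", "access", "to", "information"],
    ["wikipedia", "is", "an", "encyclopedia", "of", "information"],
    ["algorithms", "learn", "word", "meanings", "from", "text"],
]

# Each word becomes a dense vector; words used in similar contexts
# end up with similar vectors (semantic relatedness).
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=50)
print(model.wv.most_similar("information", topn=3))

Whatever biases sit in the training text carry straight into the vectors – one route by which Wikipedia’s biases end up inside AI systems.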

Lots of AI learns about the world from Wikipedia. The neutral point of view policy: Wikipedia asks editors to present views as proportionally as possible. Wikipedia biases: 1. gender bias (only 20-30% of editors are women).

ConceptNet: debias along different demographic dimensions.

Citation analysis also gives an idea about biases: the localness of sources cited in spatial articles; structural biases.

Geolocation on Twitter by county: predictions skew toward people living in urban areas. FB wants to push more local news.

Danger (bias) #3: Wikipedia search results vs. the Wikipedia knowledge panel.

Collective action against tech: Reddit; the boycott of FB and Instagram.

Mechanical Turk (https://www.mturk.com/) – the algorithmic/human intersection.

Data labor: the primary resources these companies have are posts, images, reviews, etc.

Boycott vs. data strike (data not being made available for algorithms in the future). GDPR in the EU covers all historical data, much like the CA Consumer Privacy Act. One can do a data strike without a boycott. General vs. homogeneous (a group with shared identity) boycott.

The Wikipedia spam policy is obstructing new editors, and that hits communities such as women.

++++++++++++++++++

Twitter and Other Social Media: Supporting New Types of Research Materials

https://sched.co/JAWp

Nancy Herther, Cody Hennesy

http://z.umn.edu/

Twitter and libraries: how to access data at different levels. Methods and methodological concerns; ethical concerns; legal concerns.

TweetDeck for advanced Twitter searches. Quoting and likes are relevant but not enough; sometimes a screenshot is needed.

engagement option

Social listening platforms: Crimson Hexagon, Parse.ly, Sysomos – not yet academic platforms; they are tools to set up queries and visualizations, but the algorithms, the data samples, etc. are hard to examine. Open source tools (Urbana’s Social Media Macroscope, with SMILE – social media intelligence and learning environment) collect data from Twitter and Reddit; within the platform users can query Twitter and create trend analysis and sentiment analysis. Voxgov (a subscription service for analyzing political social media).

Graduate-level and faculty research: accessing SM data at large scale via web scraping & APIs (the Twitter APIs) – JavaScript, Python, etc. Gnip Firehose API ($); Web Scraper Chrome plugin (an easy tool; Python- and R-based tools also exist); Twint (Twitter scraper).
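As an example, a minimal Twint harvest looks roughly like this (the query and output file are placeholders; Twint scrapes the web interface rather than the official API, so it can break as Twitter changes):

# pip install twint
import twint

config = twint.Config()
config.Search = "data privacy"      # placeholder query
config.Limit = 100                  # cap the number of tweets
config.Store_csv = True             # write results to CSV
config.Output = "tweets.csv"        # placeholder output file

twint.run.Search(config)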

Facepager (open source), for those who are not Python or R coders: structure and download the data sets.

TAGS: archiving to Google Sheets, using the Twitter API. Anything older than 7 days is not available, so harvest every week.

Social Feed Manager (GW University) – Justin Littman, with Stanford. Install on a server, but it allows much more.

Legal concerns: copyright (public info, but still copyrighted). The fair use argument is strong, but you cannot publish the data; you can analyze it under fair use. Contracts supersede copyright (terms of service/use); licensed data comes through the library.

Methods: sampling concerns (Tufekci, 2014: questions for SM). SM data is a good set for studying SM, but for other fields? Not according to her. Hashtag studies: self-selection bias. Twitter as a model organism: over-represented in academic studies.

Methodological concerns: scope of access – lack of historical data. Mechanics of platform and context: retweets are not necessarily endorsements.

Ethical concerns: public info, so IRB requires no informed consent. The right to be forgotten. Anonymized data is often still traceable.
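One way to see the “still traceable” problem: count how many records share each combination of quasi-identifiers. A sketch in pandas (the columns and values are hypothetical):

# pip install pandas
import pandas as pd

# A hypothetical "anonymized" dataset: no names, but quasi-identifiers remain.
df = pd.DataFrame({
    "zip":        ["56301", "56301", "56303", "56301"],
    "birth_year": [1998,     1998,    1975,    1998],
    "gender":     ["F",      "F",     "M",     "M"],
})

# A group size of 1 means that combination is unique -- such a record
# can often be re-identified by joining against an outside data source.
group_sizes = df.groupby(["zip", "birth_year", "gender"]).size()
print(group_sizes[group_sizes == 1])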

Table discussion: digital humanities and journalism are interested, but that is too narrow. Tools are still difficult to find and operate. The context of the visuals. How to spread this across a variety of majors and classes. Controversial events are more likely to be deleted.

Takedowns, lies and corrosion: what is a librarian to do? Trolls, takedowns.

++++++++++++++

VR in the library

Crague Cook, Jay Ray

The pilot process, 2017: 3D printing; approaching and assessing success or failure. https://collegepilot.wiscweb.wisc.edu/

Development kit circulation: familiarity with the Oculus Rift resulted in fewer reservations, but also a downturn.

An experience station; cleaning up the free apps.

Question: spherical video, video 360.

Safety issues: policies? Instructional perspective: curating. WI people: user testing. Touch controllers are more intuitive than the Xbox controller. Retail Oculus Rift.

The app Sketchfab: a 3D model viewer; .obj or .stl files. Medium, Tilt Brush.

The College of Liberal Arts at the U has its VR and 3D-print setup.
Penn State (Paul, librarian; kinesiology and anatomy programs), Information Science and Technology: an immersive experiences lab for video 360.

CALIPHA, part of which is XRLibraries. Libraries equal education. Content provider LifeLiqe: a STEM library of AR and VR objects. https://www.lifeliqe.com/

+++++++++++++++++

Access for All:

https://sched.co/JAXn

Accessibility – Leah Root

Bloated code (e.g., cleaning up MS Word markup)
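A hedged sketch of such a cleanup in Python with BeautifulSoup (which attributes to strip is a judgment call; this version only drops Office-namespace tags, inline styles and classes):

# pip install beautifulsoup4
from bs4 import BeautifulSoup

def clean_word_html(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    # Remove Office-namespace tags (<o:p>, <w:...>) that Word injects.
    for tag in soup.find_all(lambda t: ":" in t.name):
        tag.unwrap()
    # Drop inline styles and classes -- the bulk of the bloat.
    for tag in soup.find_all(True):
        tag.attrs.pop("style", None)
        tag.attrs.pop("class", None)
    return str(soup)

print(clean_word_html('<p class="MsoNormal" style="margin:0"><o:p>Hi</o:p></p>'))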

In ILLiad, the doctype and language declaration help people with disabilities (a declared language, for example, lets screen readers choose the right pronunciation).

https://24ways.org/

 

+++++++++++++++++++

A Seat at the Table: Embedding the Library in Curriculum Development

https://sched.co/JAY5

Embedded librarians embed library resources.

Librarians, IT staff and IDs help faculty with course design, primarily online master courses. Concordia is GROWING, mostly because of online students.

Solve issues (putting out fires, such as the “gradebook” in BB). Librarians: research and resources experts. Librarians helping with the LMS. Broadening the definition of the library as a support hub.

Facebook’s Content Moderators

Propaganda, Hate Speech, Violence: The Working Lives Of Facebook’s Content Moderators

https://www.npr.org/2019/03/02/699663284/the-working-lives-of-facebooks-content-moderators

In a recent article for The Verge titled “The Trauma Floor: The secret lives of Facebook moderators in America,” a dozen current and former employees of one of the company’s contractors, Cognizant, talked to reporter Casey Newton about the mental health costs of spending hour after hour monitoring graphic content.

Perhaps the most surprising find from his investigation, the reporter said, was how the majority of the employees he talked to started to believe some of the conspiracy theories they reviewed.

 

++++++++++++++++
more on Facebook in this IMS blog
https://blog.stcloudstate.edu/ims?s=Facebook+privacy

 

Education and Ethics

4 Ways AI Education and Ethics Will Disrupt Society in 2019

By Tara Chklovski, Jan 28, 2019

https://www.edsurge.com/news/2019-01-28-4-ways-ai-education-and-ethics-will-disrupt-society-in-2019

In 2018 we witnessed a clash of titans as government and tech companies collided on privacy issues around collecting, culling and using personal data. From GDPR to Facebook scandals, many tech CEOs were defending big data, its use, and how they’re safeguarding the public.

Meanwhile, the public was amazed at technological advances like Boston Dynamics’ Atlas robot doing parkour, while simultaneously being outraged at the thought of our data no longer being ours and Alexa listening in on all our conversations.

1. Companies will face increased pressure about the data AI-embedded services use.

2. Public concern will lead to AI regulations. But we must understand this tech too.

In 2018, the National Science Foundation invested $100 million in AI research, with special support in 2019 for developing principles for safe, robust and trustworthy AI; addressing issues of bias, fairness and transparency of algorithmic intelligence; developing deeper understanding of human-AI interaction and user education; and developing insights about the influences of AI on people and society.

This investment was dwarfed by DARPA—an agency of the Department of Defense—and its multi-year investment of more than $2 billion in new and existing programs under the “AI Next” campaign. A key area of the campaign includes pioneering the next generation of AI algorithms and applications, such as “explainability” and common sense reasoning.

Federally funded initiatives, as well as corporate efforts (such as Google’s “What If” tool) will lead to the rise of explainable AI and interpretable AI, whereby the AI actually explains the logic behind its decision making to humans. But the next step from there would be for the AI regulators and policymakers themselves to learn about how these technologies actually work. This is an overlooked step right now that Richard Danzig, former Secretary of the U.S. Navy advises us to consider, as we create “humans-in-the-loop” systems, which require people to sign off on important AI decisions.
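As a small illustration of what “explainable” can mean in practice, here is a hedged sketch using the shap library with a scikit-learn model (the dataset and model are my stand-ins; a production humans-in-the-loop system would go far beyond this):

# pip install shap scikit-learn
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train an ordinary "black box" model.
data = load_diabetes()
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(data.data, data.target)

# SHAP attributes each prediction to the input features, giving a
# human reviewer the logic behind a single decision.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(data.data[:1])  # explain the first record
for name, value in zip(data.feature_names, shap_values[0]):
    print(f"{name}: {value:+.2f}")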

3. More companies will make AI a strategic initiative in corporate social responsibility.

Google invested $25 million in AI for Good, and Microsoft added AI for Humanitarian Action to its prior commitment. While these are positive steps, the tech industry continues to have a diversity problem.

4. Funding for AI literacy and public education will skyrocket.

Ryan Calo from the University of Washington explains that it matters how we talk about technologies that we don’t fully understand.

 

 

 
