
ed tech companies

Investment continues to flow to ed tech, with $803 million injected during the first six months of the year, according to the industry news website EdSurge. But half of that went to just six companies, including the celebrity tutorial provider MasterClass, the online learning platform Udemy and the school and college review site Niche.

From the outside, the ed-tech sector may appear as if “there’s a bonanza and it’s like the dot-com boom again and everybody’s printing money,” said Michael Hansen, CEO of the K-12 and higher education digital learning provider Cengage. “That is not the case.”

Even if they want to buy more ed-tech tools, meanwhile, schools and colleges are short on cash. Expenses for measures to deal with Covid-19 are up, while budgets are expected to be down.

Analysts and industry insiders now expect a wave of acquisitions as already-dominant brands like these seek to corner even more of the market by snatching up smaller players that provide services they don’t.

++++++++++++++++

Tech-based contact tracing could put schools in murky privacy territory

https://www.educationdive.com/news/tech-based-contact-tracing-could-put-schools-in-murky-privacy-territory/584881/

  • A white paper from the Surveillance Technology Oversight Project (STOP) suggests the use of contact tracing technology by schools could erode student privacy and may not be effective in preventing the spread of coronavirus.

Despite the pandemic, schools still must conform to the Family Educational Rights and Privacy Act (FERPA) and other laws governing student privacy. Districts can disclose information to public health officials, for example, but information can’t be released to the general public without written consent from parents.

The Safely Reopen Schools mobile app is one tool available for automating contact tracing. The idea is that if two mobile phones are close enough to connect via Bluetooth, the phone owners are close enough to transmit the virus. The app includes daily health check-ins and educational notifications, but no personal information is exchanged between the phones, and the app won’t disclose who tested positive.

Colleges are also using apps to help trace and track students’ exposure to coronavirus. In August, 20,000 participants from the University of Alabama at Birmingham were asked to test the GuideSafe mobile app, which will alert them if they’ve been in contact with someone who tested positive for COVID-19. The app determines the proximity of two people through cell phone signal strength. If someone reports they contracted the virus, an alert will be sent to anyone who has been within six feet of them for at least 15 minutes over the previous two weeks.
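The alert rule described above (within six feet for a total of at least 15 minutes during the previous two weeks) can be sketched as a simple filter over logged proximity events. This is an illustrative sketch only, not GuideSafe's actual implementation; the event format (anonymous contact ID, timestamp, estimated distance in feet, minutes of contact) is an assumption:

```python
from datetime import datetime, timedelta

def contacts_to_notify(events, report_time, max_feet=6.0,
                       min_minutes=15, lookback_days=14):
    """Anonymous contact IDs meeting the exposure rule: within
    max_feet for a cumulative min_minutes during the lookback window."""
    cutoff = report_time - timedelta(days=lookback_days)
    minutes_near = {}
    for e in events:  # each event: contact_id, time, feet, minutes
        if e["time"] >= cutoff and e["feet"] <= max_feet:
            key = e["contact_id"]
            minutes_near[key] = minutes_near.get(key, 0) + e["minutes"]
    return {cid for cid, m in minutes_near.items() if m >= min_minutes}

events = [
    {"contact_id": "A", "time": datetime(2020, 9, 1), "feet": 4, "minutes": 10},
    {"contact_id": "A", "time": datetime(2020, 9, 2), "feet": 5, "minutes": 10},
    {"contact_id": "B", "time": datetime(2020, 9, 2), "feet": 10, "minutes": 60},
    {"contact_id": "C", "time": datetime(2020, 8, 1), "feet": 3, "minutes": 60},
]
print(contacts_to_notify(events, datetime(2020, 9, 10)))  # {'A'}
```

Note that in real deployments the distance itself is only an estimate inferred from Bluetooth signal strength, which is a large part of why critics question these apps' accuracy.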

Critics of the technology claim these apps aren’t actually capable of contact tracing and could undermine manual efforts to do so.

+++++++++++++++++
more on ed tech in this IMS blog
https://blog.stcloudstate.edu/ims?s=educational+technology

Algorithmic Test Proctoring

Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education

SHEA SWAUGER ED-TECH

https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.

Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.

Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications. 

While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers do this), we see it deployed through face detection and facial recognition used by algorithmic proctoring systems.

While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies. These technologies generally function similarly and have shown a consistent inability to identify people with darker skin or even to distinguish between the faces of Chinese people. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.

As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that a software asking students to verify their identity is compromising for students who identify as trans, non-binary, or express their gender in ways counter to cis/heteronormativity.

These features and settings create a system of asymmetric surveillance and lack of accountability, things which have always created a risk for abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.

Their promotional messaging functions similarly to dog whistle politics which is commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990’s.

Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences in regards to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”

my note: I have been repeating this for years
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.

Technological Solutionism

Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”

Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people, and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.

This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.

Higher education was deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research by publishing copious literature on it, establishing endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.

+++++++++++++++++
more on privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

deepfake Zao

https://www.theguardian.com/technology/2019/sep/02/chinese-face-swap-app-zao-triggers-privacy-fears-viral

Released on Friday, the Zao app went viral as Chinese users seized on the chance to see themselves act out scenes from well-known movies using deepfake technology, which has already prompted concerns elsewhere over potential misuse.

As of Monday afternoon it remained the top free download in China, according to the app market data provider App Annie.

Concerns over deepfakes have grown since the 2016 US election campaign, which saw wide use of online misinformation, according to US investigations.

In June, Facebook’s chief executive, Mark Zuckerberg, said the social network was struggling to find ways to deal with deepfake videos, saying they may constitute “a completely different category” of misinformation than anything faced before.

++++++++++
more on deepfake in this IMS blog
https://blog.stcloudstate.edu/ims?s=deepfake

Tik Tok and cybersecurity

https://www.axios.com/tiktok-china-online-privacy-personal-data-6b251d22-61f4-47e1-a58d-b167435472e3.html

The bottom line: While the Big Tech behemoths of the U.S. are barred from making inroads in China, the inverse doesn’t apply. That could mark an opening front in the ongoing technological and economic war between the two rivals.

++++++++
more on cybersecurity in this IMS blog
https://blog.stcloudstate.edu/ims?s=cybersecurity

https://blog.stcloudstate.edu/ims?s=tik+tok

https://blog.stcloudstate.edu/ims/2018/10/31/students-data-privacy/

break up Facebook

https://nyti.ms/2LzRzwq

Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares. Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.

We are a nation with a tradition of reining in monopolies, no matter how well intentioned the leaders of these companies may be. Mark’s power is unprecedented and un-American.

It is time to break up Facebook.

America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances.

More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable.

Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective.

American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.

From our earliest days, Mark used the word “domination” to describe our ambitions, with no hint of irony or humility.

Facebook’s monopoly is also visible in its usage statistics. About 70 percent of American adults use social media, and a vast majority are on Facebook products. Over two-thirds use the core site, a third use Instagram, and a fifth use WhatsApp. By contrast, fewer than a third report using Pinterest, LinkedIn or Snapchat. What started out as lighthearted entertainment has become the primary way that people of all ages communicate online.

The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved.

The News Feed algorithm reportedly prioritized videos created through Facebook over videos from competitors, like YouTube and Vimeo. In 2013, Twitter introduced a video network called Vine that featured six-second videos. That same day, Facebook blocked Vine from hosting a tool that let its users search for their Facebook friends while on the new network. The decision hobbled Vine, which shut down four years later.

Unlike Vine, Snapchat wasn’t interfacing with the Facebook ecosystem; there was no obvious way to handicap the company or shut it out. So Facebook simply copied it. (Copyright law does not extend to the abstract concept itself.)

As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon). Meanwhile, there has been plenty of innovation in areas where there is no monopolistic domination, such as in workplace productivity (Slack, Trello, Asana), urban transportation (Lyft, Uber, Lime, Bird) and cryptocurrency exchanges (Ripple, Coinbase, Circle).

The choice is mine, but it doesn’t feel like a choice. Facebook seeps into every corner of our lives to capture as much of our attention and data as possible and, without any alternative, we make the trade.

Just last month, Facebook seemingly tried to bury news that it had stored tens of millions of user passwords in plain text format, which thousands of Facebook employees could see. Competition alone wouldn’t necessarily spur privacy protection — regulation is required to ensure accountability — but Facebook’s lock on the market guarantees that users can’t protest by moving to alternative platforms.

Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished. Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values. The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection.

As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns.

Mark may never have a boss, but he needs to have some check on his power. The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.

++++++++++++++++++++

We Don’t Need Social Media

The push to regulate or break up Facebook ignores the fact that its services do more harm than good

Colin Horgan, May 13, 2019

https://onezero.medium.com/we-dont-need-social-media-53d5455f4f6b

Hughes joins a growing chorus of former Silicon Valley unicorn riders who’ve recently had second thoughts about the utility or benefit of the surveillance-attention economy their products and platforms have helped create. He is also not the first to suggest that government might need to step in to clean up the mess they made.

Nick Srnicek, author of the book Platform Capitalism and a lecturer in digital economy at King’s College London, wrote last month, “[I]t’s competition — not size — that demands more data, more attention, more engagement and more profits at all costs.”

 

++++++++++++++++++++
more on Facebook in this IMS blog
https://blog.stcloudstate.edu/ims?s=facebook

data interference

APRIL 21, 2019 Zeynep Tufekci

Think You’re Discreet Online? Think Again

Because of technological advances and the sheer amount of data now available about billions of other people, discretion no longer suffices to protect your privacy. Computer algorithms and network analyses can now infer, with a sufficiently high degree of accuracy, a wide range of things about you that you may have never disclosed, including your moods, your political beliefs, your sexual orientation and your health.

There is no longer such a thing as individually “opting out” of our privacy-compromised world.

In 2017, the newspaper The Australian published an article, based on a leaked document from Facebook, revealing that the company had told advertisers that it could predict when younger users, including teenagers, were feeling “insecure,” “worthless” or otherwise in need of a “confidence boost.” Facebook was apparently able to draw these inferences by monitoring photos, posts and other social media data.

In 2017, academic researchers, armed with data from more than 40,000 Instagram photos, used machine-learning tools to accurately identify signs of depression in a group of 166 Instagram users. Their computer models turned out to be better predictors of depression than humans who were asked to rate whether photos were happy or sad and so forth.
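The general approach of such studies (classifying users from photo-level features) can be illustrated with a toy nearest-centroid classifier. The features here, mean brightness and saturation, echo the kind of color features reported as predictive, but all numbers and labels below are invented for illustration and this is not the researchers' actual model:

```python
import math

def train_centroids(samples):
    """samples: list of (features, label); returns one centroid per label."""
    sums, counts = {}, {}
    for feats, label in samples:
        s = sums.setdefault(label, [0.0] * len(feats))
        for i, v in enumerate(feats):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def predict(centroids, feats):
    """Label of the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda lab: math.dist(centroids[lab], feats))

# Toy data: (mean brightness, mean saturation) per photo, on a 0-1 scale.
photos = [([0.80, 0.70], "not_depressed"), ([0.75, 0.65], "not_depressed"),
          ([0.30, 0.20], "depressed"), ([0.35, 0.25], "depressed")]
model = train_centroids(photos)
print(predict(model, [0.32, 0.22]))  # a dark, desaturated photo -> 'depressed'
```

The point of the sketch is simply that once behavior is reduced to measurable features, sensitive labels can be inferred without the user ever disclosing them.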

Computational inference can also be a tool of social control. The Chinese government, having gathered biometric data on its citizens, is trying to use big data and artificial intelligence to single out “threats” to Communist rule, including the country’s Uighurs, a mostly Muslim ethnic group.

+++++++++++++

Zeynep Tufekci and Seth Stephens-Davidowitz: Privacy is over

https://www.centreforideas.com/article/zeynep-tufekci-and-seth-stephens-davidowitz-privacy-over

+++++++++++

Zeynep Tufekci writes about security and data privacy for NY Times, disinformation’s threat to democracy for WIRED

++++++++++
more on privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

Gen Z and social media

Under Employers’ Gaze, Gen Z Is Biting Its Tongue On Social Media

April 13, 2019, 5:00 AM ET

https://www.npr.org/2019/04/13/702555175/under-employers-gaze-gen-z-is-biting-its-tongue-on-social-media

The oldest members of Generation Z are around 22 years old — now entering the workforce and adjusting their social media accordingly. They are holding back from posting political opinions and personal information in favor of posting about professional accomplishments.

Only about 1 in 10 teenagers say they share personal, religious or political beliefs on social media, according to a recent survey from Pew Research Center.

70 percent of employers and recruiters say they check social media during the hiring process, according to a survey conducted by CareerBuilder.

Generation Z, nicknamed “iGen,” is the post-millennial generation responsible for ‘killing’ Facebook and for the rise of TikTok.

Curricula like Common Sense Education’s digital citizenship program are working to educate the younger generation on how to use social media, something the older generations were never taught.

Some users are regularly cleaning up — “re-curating” — their online profiles with cleanup apps like TweetDelete.

Gen Zers also use social media in more ephemeral ways than older generations — Snapchat stories that disappear after 24 hours, or Instagram posts that they archive a couple of months later.

Gen Zers already use a multitude of strategies to make sure their online presence is visible only to who they want: They set their account to private, change their profile name or handle, even make completely separate “fake” accounts.

+++++++++++++++
more on social media in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+media

and privacy
https://blog.stcloudstate.edu/ims?s=privacy

Library Technology Conference 2019

#LTC2019

Intro to XR in Libraries from Plamen Miltenoff

keynote: equitable access to information

keynote speaker

https://sched.co/JAqk
the type of data: wikipedia. the dangers of learning from wikipedia. how individuals can organize to mitigate some of these dangers. wikidata, algorithms.
IBM Watson is an AI system that uses wikipedia, with algorithms making sense of it.
conspiracy-theory videos on youtube are debunked by using wikipedia.

semantic relatedness, Word2Vec
how do the algorithms work: they take a large body of unstructured text and pick out specific words.
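In rough terms, Word2Vec assigns each word a vector so that words appearing in similar contexts end up with similar vectors, and "semantic relatedness" is then the cosine of the angle between two vectors. A minimal sketch with made-up toy vectors (real embeddings have 100-300 dimensions learned from a large corpus):

```python
import math

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = (math.sqrt(sum(a * a for a in u)) *
            math.sqrt(sum(b * b for b in v)))
    return dot / norm

# Invented 3-d vectors purely for illustration.
vec = {"king":    [0.90, 0.80, 0.10],
       "queen":   [0.85, 0.75, 0.20],
       "cabbage": [0.10, 0.20, 0.90]}

# Related words point in similar directions:
print(cosine(vec["king"], vec["queen"]) > cosine(vec["king"], vec["cabbage"]))  # True
```

Because the vectors are learned from human-written text, they also absorb that text's biases, which is the point the keynote makes about AI systems trained on Wikipedia.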

lots of AI learns about the world from wikipedia. the neutral point of view policy: Wikipedia asks editors to present views as proportionally as possible. Wikipedia biases: 1. gender bias (only 20-30% of editors are women).

conceptnet. debias along different demographic dimensions.

citation analysis also gives an idea about biases: localness of sources cited in spatial articles; structural biases.

geolocation on Twitter by County. predicting the people living in urban areas. FB wants to push more local news.

danger (biases) #3: wikipedia search results vs. wikipedia knowledge panel.

collective action against tech: Reddit, boycott for FB and Instagram.

Mechanical Turk https://www.mturk.com/  algorithmic / human intersection

data labor: posts, images, reviews etc. are the primary resources these companies have.

boycott, data strike (data not being made available for algorithms in the future). GDPR in the EU (covering all historical data) is like the CA Consumer Privacy Act. One can do a data strike without a data boycott. general vs. homogeneous (a group with shared identity) boycott.

the wikipedia SPAM policy is obstructing new editors, and that hits communities such as women.

++++++++++++++++++

Twitter and Other Social Media: Supporting New Types of Research Materials

https://sched.co/JAWp

Nancy Herther, Cody Hennesy

http://z.umn.edu/

twitter libraries: how to access at different levels. methods and methodological concerns. ethical concerns, legal concerns.

tweetdeck for advanced Twitter searches. quoting; likes are relevant, but not enough; sometimes a screenshot.

engagement option

social listening platforms: Crimson Hexagon, Parse.ly, Sysomos – not yet academic platforms; tools to set up queries and visualization, but the algorithms, the data samples etc. are difficult to assess. open source tools: Urbana's Social Media Macroscope, with SMILE (social media intelligence and learning environment) to collect data from twitter and reddit; within the platform one can query Twitter and create trend analysis and sentiment analysis. Voxgov (subscription service for analyzing political social media).
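A minimal, lexicon-based version of the sentiment trend analysis such platforms offer might look like the sketch below. The word lists and posts are invented for illustration, and commercial platforms use far more sophisticated models:

```python
POSITIVE = {"great", "love", "good", "happy"}
NEGATIVE = {"bad", "hate", "awful", "sad"}

def sentiment_by_day(posts):
    """posts: list of (date_str, text); returns {date: net sentiment score},
    counting positive words as +1 and negative words as -1."""
    trend = {}
    for day, text in posts:
        words = text.lower().split()
        score = (sum(w in POSITIVE for w in words)
                 - sum(w in NEGATIVE for w in words))
        trend[day] = trend.get(day, 0) + score
    return trend

posts = [("2019-03-20", "I love this conference great talks"),
         ("2019-03-20", "bad coffee"),
         ("2019-03-21", "awful wifi and sad lines")]
print(sentiment_by_day(posts))  # {'2019-03-20': 1, '2019-03-21': -2}
```

Even this toy version shows why the session flags the platforms' opacity: the result depends entirely on which words the (usually hidden) lexicon counts.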

graduate level and faculty research: accessing SM large-scale data; web scraping & APIs; Twitter APIs; JavaScript, Python etc. Gnip Firehose API ($); Web Scraper Chrome plugin (easy tool; Python- and R-based options also exist); Twint (Twitter scraper).

Facepager (open source), if you are not a Python or R coder: structure and download the data sets.

TAGS: archiving to google sheets, uses the twitter API. anything older than 7 days is not available, so harvest every week.
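The weekly-harvest pattern that TAGS relies on amounts to merging each new batch into a running archive, deduplicated by tweet ID, since the search API's 7-day window means each weekly pull overlaps the last. A sketch (the tweet record format is an assumption):

```python
def merge_harvest(archive, new_batch):
    """Append only tweets whose IDs aren't already archived.
    archive / new_batch: lists of dicts with at least an 'id' field."""
    seen = {t["id"] for t in archive}
    for tweet in new_batch:
        if tweet["id"] not in seen:
            archive.append(tweet)
            seen.add(tweet["id"])
    return archive

archive = [{"id": 1, "text": "from last week"}]
weekly = [{"id": 1, "text": "from last week"},   # overlap with last pull
          {"id": 2, "text": "new this week"}]
print(len(merge_harvest(archive, weekly)))  # 2
```

Skipping a week leaves a permanent gap in the archive, which is the practical reason the session stresses harvesting on schedule.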

social feed manager (GW University) – Justin Littman, with Stanford. Install on a server, but it allows much more.

legal concerns: copyright (public info, but not beyond what is copyrighted). the fair use argument is strong, but one cannot publish the data; one can analyze it under fair use. contracts supersede copyright (terms of service/use); licensed data through the library.

methods: sampling concerns (Tufekci, 2014: questions for SM). SM data is a good set for studying SM, but other fields? not according to her. hashtag studies: self-selection bias. twitter as a model organism: over-represented data in academic studies.

methodological concerns: scope of access – lack of historical data. mechanics of platform and context: retweets are not necessarily endorsements.

ethical concerns. public info – IRB no informed consent. the right to be forgotten. anonymized data is often still traceable.
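The classic demonstration that "anonymized data is often still traceable" is a linkage attack: joining de-identified records to a public list on shared quasi-identifiers such as ZIP code, birthdate, and sex. A toy sketch with invented records:

```python
def reidentify(anon_records, public_records,
               keys=("zip", "birthdate", "sex")):
    """Link 'anonymized' records back to names via shared quasi-identifiers."""
    index = {tuple(p[k] for k in keys): p["name"] for p in public_records}
    matches = {}
    for r in anon_records:
        name = index.get(tuple(r[k] for k in keys))
        if name:
            matches[name] = r
    return matches

anon = [{"zip": "56301", "birthdate": "1990-05-01", "sex": "F",
         "diagnosis": "flu"}]       # name removed, so 'anonymous'
public = [{"name": "Jane Doe", "zip": "56301",
           "birthdate": "1990-05-01", "sex": "F"}]
print(list(reidentify(anon, public)))  # ['Jane Doe']
```

A handful of such attributes is often enough to single out most individuals, which is why stripping names alone does not satisfy the ethical concerns raised here.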

table discussion: digital humanities and journalism are interested, but too narrow. tools are still difficult to find and operate. context of the visuals. how to spread around a variety of majors and classes. controversial events are more likely to be deleted.

takedowns, lies and corrosion: what is a librarian to do? trolls, takedowns.

++++++++++++++

VR in the library

Crague Cook, Jay Ray

the pilot process. 2017. 3D printing, approaching and assessing success or failure.  https://collegepilot.wiscweb.wisc.edu/

development kit circulation. familiarity with the Oculus Rift resulted in less reservation; a downturn also.

An experience station. clean up free apps.

question: spherical video, video 360.

safety issues: policies? instructional perspective: curating. WI people: user testing. touch controllers are more intuitive than the Xbox controller. retail Oculus Rift.

apps: Sketchfab 3D model viewer; obj or stl files. Medium, Tiltbrush.

College of Liberal Arts at the U has their VR, 3D print set up.
Penn State (Paul, librarian; kinesiology, anatomy programs), Information Science and Technology. immersive experiences lab for video 360.

Califa is part of XRLibraries. libraries equal education. content provider LifeLiqe: a STEM library of AR and VR objects. https://www.lifeliqe.com/

+++++++++++++++++

Access for All:

https://sched.co/JAXn

accessibility: Leah Root

bloat code (e.g. cleaning up MS Word code)

ILLiad: Doctype and Language declarations help people with disabilities.

https://24ways.org/

 

+++++++++++++++++++

A Seat at the Table: Embedding the Library in Curriculum Development

https://sched.co/JAY5

embedded librarian: embed library resources.

librarians, IT staff, and IDs (instructional designers) help faculty with course design, primarily online master courses. Concordia is GROWING, mostly because of online students.

solve issues (putting down fires, such as “gradebook” on BB). Librarians : research and resources experts. Librarians helping with LMS. Broadening definition of Library as support hub.

Digital Destruction of Democracy

The Digital Destruction of Democracy

ANYA SCHIFFRIN JANUARY 21, 2019

https://prospect.org/article/digital-destruction-democracy

Anya Schiffrin is an adjunct faculty member at the School of International and Public Affairs at Columbia University. She worked in Hanoi from 1997 to 1999 as the bureau chief of Dow Jones Newswires.
Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics
By Yochai Benkler, Robert Faris, & Hal Roberts
Oxford University Press
Benkler, a Harvard law professor and well-known theorist of the digital age, and his colleagues have produced an authoritative tome that includes multiple taxonomies and literature reviews as well as visualizations of the flow of disinformation.
clickbait fabricators
white supremacist and alt-right trolls
a history of the scholarship on propaganda, reminding the reader that much of the discussion began in the 1930s.
Benkler’s optimistic 2007 book, The Wealth of Networks, predicted that the Internet would bring people together and transform the way information is created and spread. Today, Benkler is far less sanguine and has become one of the foremost researchers of disinformation networks.
Fox News, Breitbart, The Daily Caller, InfoWars, and Zero Hedge
As a result, mainstream journalists repeat and amplify the falsehoods even as they debunk them.
There is no clear line, they argue, between Russian propaganda, Breitbart lies, and the Trump victory. They add that Fox News is probably more influential than Facebook.
after George Soros gave a speech in January 2018 calling for regulation of the social media platforms, Facebook hired a Republican opposition research firm to shovel dirt at him.
The European Union has not yet tried to regulate disinformation (although they do have codes of practice for the platforms), instead focusing on taxation, competition regulation, and protection of privacy. But Germany has strengthened its regulations regarding online hate speech, including the liability of the social media platforms.
disclosure of the sources of online political advertising. It’s a bit toothless because, just as with offshore bank accounts, it may be possible to register which U.S. entity is paying for online political advertising, but it’s impossible to know whether that entity is getting its funds from overseas. Even the Honest Ads bill was too much for Facebook to take.

++++++++++++
more on the issues of digital world and democracy in this IMS blog
https://blog.stcloudstate.edu/ims/2019/02/19/facebook-digital-gangsters/

School Safety and Student Wellbeing

CALL FOR CHAPTER PROPOSALS
Proposal Submission Deadline: February 12, 2019
Leveraging Technology for the Improvement of School Safety and Student Wellbeing
A book edited by Dr. Stephanie Huffman, Dr. Stacey Loyless, Dr. Shelly Allbritton, and Dr. Charlotte Green (University of Central Arkansas)

Introduction
Technology permeates all aspects of today’s school systems. An Internet search on technology in schools can generate millions of website results. The vast majority of these websites (well over 8,000,000 results for one simple search) focuses on advice, activities, and uses of technology in the classroom. Clearly teaching and learning with technology dominates the literature and conversations on how technology should or could be used in classroom settings. A search on school safety and technology can produce more than 3,000,000 results with many addressing technological tools such as video cameras, entry control devices, weapon detectors, and other such hardware. However, in recent times, cyberbullying appears to dominate the Internet conversations in references to school safety. With an increase in school violence in the past two decades, school safety is a fundamental concern in our nation’s schools. Policy makers, educators, parents, and students are seeking answers in how best to protect the physical, emotional, and social well-being of all children.

 

Objective of the Book
The proposed edited book covers the primary topic of P-12 school safety and the use of technology for fostering an environment in which all students can be academically successful and thrive as global citizens. School safety is defined as the physical, social, and emotional well-being of children. The book will comprise empirical, conceptual, and case-based (practical application) research that crafts an overall understanding of the issues in creating a “safe” learning environment and the role technology can and should play: one where a student’s well-being is valued and protected from external and internal entities, equitable access is treasured as a means for facilitating the growth of the whole student, and policies, practices, and procedures are implemented to build a foundation that transforms the culture and climate of the school into an inclusive, nurturing environment.

 

Target Audience
The target audience is leadership and education scholars, leadership practitioners, and technology coordinators. This book will be used as a collective body of work for the improvement of K-12 schools and as a tool for improving leadership and teacher preparation programs. School safety is a major concern for educators. Technology has played a role in creating unsafe environments for children; however, it is also an avenue for addressing the challenges of school safety.

Recommended topics include, but are not limited to, the following:

Section I – Digital Leadership

  • Technology as a Climate and Cultural Transformation Tool
    • School Leadership in the Digital Age: Building a Shared Vision for all Aspects of Learning and Teaching
  • Ensuring Equity within a “One to One” Technology Framework
    • Infrastructure within Communities
    • Accessible WiFi for Low SES Students
    • Developing Culturally Responsive Pedagogy
  • Professional Development for School Leaders

Section II – Well Being

  • Social Media and School Safety: Inputs and Outputs
    • Tip lines: Crime, Bullying, Threats
    • Communication and Transparency
    • Platform for Social Justice
  • Teaching Strategies to Promote Healthy Student Interactions in Cyberspace (Digital Citizenship?)
    • Building Capacity and Efficacy, Platform to lower incidence of Cyber-Bullying, Boosting Instructional Engagement
  • Literacy and Preparedness for the Influence and Consequence of Digital Media Marketing Campaigns directed toward Children, Adolescents, and Teens.
  • Pioneering Innovative Technology Program in Curriculum: Fostering “Belonging” beyond Athletics & Arts.

Section III- Infrastructure Safety

  • Campus/Facility Safety and Security
    • Rural Schools vs. Urban Schools
    • Digital A/V Systems
    • Background Check – Visitor Registration (i.e. Raptor)
  • Network Security Systems and Protocols
    • User Filtering and Monitoring
    • Firewalls
  • Policy
    • Appropriate use policies
    • Digital Citizenship
    • Web development policy
    • Privacy
    • Intellectual Property & Copyright

Section IV – Academic Success

  • Professional Development for Classroom Teachers
    • Pedagogical Integration of Technology
    • Instructional Coaching for Student Engagement
    • Increase Rigor with Technology
    • Competence in the Blended/Hybrid/Flipped Classroom
  • Technology to enhance learning for all
    • Assistive Technology
    • Accessibility issues
    • Internet access for Low SES Students in the Blended/Hybrid/Flipped Classroom
  • Personal Learning Design
    • Differentiation for Student Efficacy
    • Strategies for Increasing Depth of Knowledge
    • Design Qualities for Enhanced Engagement

Submission Procedure
Researchers and practitioners are invited to submit on or before February 12, 2019, a chapter proposal of 1,000 to 2,000 words clearly explaining the purpose, methodology, and a brief summary of the findings of his or her proposed chapter. Authors will be notified by March 12, 2019 about the status of their proposals and sent chapter guidelines. Full chapters are expected to be submitted by June 12, 2019, and all interested authors must consult the guidelines for manuscript submissions at http://www.igi-global.com/publish/contributor-resources/before-you-write/ prior to submission. See Edited Chapter Template. All submitted chapters will be reviewed on a double-blind review basis. Contributors may also be requested to serve as reviewers for this project.

Note: There are no submission or acceptance fees for manuscripts submitted to this book publication, Leveraging Technology for the Improvement of School Safety and Student Wellbeing. All manuscripts are accepted based on a double-blind peer review editorial process.

All proposals should be submitted through the eEditorial Discovery® online submission manager. USE THE FOLLOWING LINK TO SUBMIT YOUR PROPOSAL: https://www.igi-global.com/publish/call-for-papers/call-details/3709

Publisher
This book is scheduled to be published by IGI Global (formerly Idea Group Inc.), an international academic publisher of the “Information Science Reference” (formerly Idea Group Reference), “Medical Information Science Reference,” “Business Science Reference,” and “Engineering Science Reference” imprints. IGI Global specializes in publishing reference books, scholarly journals, and electronic databases featuring academic research on a variety of innovative topic areas including, but not limited to, education, social science, medicine and healthcare, business and management, information science and technology, engineering, public administration, library and information science, media and communication studies, and environmental science. For additional information regarding the publisher, please visit http://www.igi-global.com. This publication is anticipated to be released in 2020.

Important Dates
February 12, 2019: Proposal Submission Deadline
March 12, 2019: Notification of Acceptance
June 12, 2019: Full Chapter Submission
August 10, 2019: Review Results Returned
August 10, 2019: Final Acceptance Notification
September 7, 2019: Final Chapter Submission

Inquiries can be forwarded to
Dr. Stephanie Huffman
University of Central Arkansas
steph@uca.edu or 501-450-5430
