Educational Psychology & Technology
Scooped by Roxana Marachi, PhD

School Ditches Online Learning Program After Parents Revolt // AP News 


"CHESHIRE, Conn. (AP) — The fast-growing online platform was built with help from Facebook engineers and designed to help students learn at their own speed. But it’s been dropped because parents in this Connecticut suburb revolted, saying there was no need to change what’s worked in a town with a prized reputation for good schools.

The Summit Learning program, developed by a California charter school network, has signed up over 300 schools to use its blend of technology with go-at-your-own-pace personalized learning. 


Cheshire school administrators and some parents praised the program, but it faced criticism from others who said their children were spending too much time online, some content was inappropriate, and students were not getting enough direct guidance."... 

Educational Psychology & Technology
This curated collection includes news, resources, and research related to Educational Psychology and/or Technology. The page also serves as a research tool to organize online content; the grey funnel-shaped icon at the top allows searching by keyword. Related collections cover research specific to tech, screen time, and health/safety concerns; the next wave of privatization involving technology intersections with "Pay For Success" and "Social Impact Bonds"; and additional Educator Resources [links to external sites].
Rescooped by Roxana Marachi, PhD from Screen Time and Tech Safety Research!

Health and Safety Research Gaps in Policies and Practices Integrating Emerging Technologies for Young Children 

Links are as follows in order of the slides: 


The Silicon Valley Billionaires Remaking America's Schools 


Dr. Catherine Steiner-Adair
Clinical Psychologist and Research Associate at Harvard Medical School 


Video link may be viewed at: 


Carter B, Rees P, Hale L, Bhattacharjee D, Paradkar MS. Association Between Portable Screen-Based Media Device Access or Use and Sleep Outcomes: A Systematic Review and Meta-analysis. JAMA Pediatr. 2016 Oct 31. doi: 10.1001/jamapediatrics.2016.2341. [Epub ahead of print] 


Screen Time Hurts More Than Kids' Eyes 


New Media Consortium / Consortium for School Networking Horizon Report 


"American Revolution 2.0: How Education Innovation is Going to Revitalize America and Transform the U.S. Economy" 


"Preschool Is Good for Poor Kids But It's Expensive, So Utah Is Offering It Online" // The Washington Post 


Philanthropy Roundtable: "Blended Learning: A Wise Giver's Guide to Supporting Tech-Assisted Learning" (organization formerly chaired by B. DeVos)  


CyberCharters Have Overwhelming Negative Impact 


Ma, J., van den Heuvel, M., Maguire, J., Parkin, P., Birken, C. (2017). Is handheld screen time use associated with language delay in infants? Presented at the Pediatric Academic Societies Meeting, San Francisco, CA.  


Jonathan Rochelle’s GSV/ASU PRIMETIME Keynote Speech pitching Google Cardboard for children in schools as proxy for actual field trips: 


Scientists Urge Google to Stop Untested Microwave Radiation of Children's Eyes and Brains with Virtual Reality Devices in Schools //  Asus product manual 


Telecom Industry Liability and Insurance Information 


National Association for Children and Safe Technology - iPad Information 


For infant/pregnancy related safety precautions, please visit 


194 Signatories (physicians, scientists, educators) on Joint Statement on Pregnancy and Wireless Radiation 


Article screenshot from France: "Portables. L'embrouille des ondes électromagnétiques" ("Cell Phones: The Electromagnetic Wave Muddle")


Wireless Phone Radiation Risks and Public Policy 


"Show The Fine Print" 


Scientist petition calls for greater protective measures for children and pregnant women, cites need for precautionary health warnings, stronger regulation of electromagnetic fields, creation of EMF free zones, and media disclosures of experts’ financial relationships with industry when citing their opinions regarding the safety of EMF-emitting technologies. Published in European Journal of Oncology 


International Agency for Research on Cancer Classifies Radiofrequency Electromagnetic Fields as Possibly Carcinogenic to Humans (2011)


For more research on sources of funding, see: 


Maryland State Children’s Environmental Health and Protection Advisory Council // Public Testimony


"Until now, radiation from cell towers has not been considered a risk to children, but a recent study raises new questions about possible long-term, harmful effects." 


For further reading, please see the Captured Agency report published by Harvard's Center for Ethics 


Updates/posts/safety information on Virtual Reality: 


Environmental Health Trust Virtual Reality Radiation Absorption Slides 


Healthy Kids in a Digital World: 


National Association for Children and Safe Technology 


Doctors’ Letters on WiFi in Schools // 154-page compilation 


Insurance and Liability Disclaimers/Information from Telecom Companies 


Most of the documents and articles embedded within the presentation above are searchable/accessible on the following page:

The document above is a PDF with live links, provided for easier access. To download the original file, please click on the title or arrow above; it is a large file and may take several minutes.  


The Structural Consequences of Big Data-Driven Education // Elana Zeide 


"Educators and commentators who evaluate big data-driven learning environments focus on specific questions: whether automated education platforms improve learning outcomes, invade student privacy, and promote equality. This article puts aside separate unresolved—and perhaps unresolvable—issues regarding the concrete effects of specific technologies. It instead examines how big data-driven tools alter the structure of schools’ pedagogical decision-making, and, in doing so, change fundamental aspects of America’s education enterprise. Technological mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. In this overview, I highlight three significant structural shifts that accompany school reliance on data-driven instructional platforms that perform core school functions: teaching, assessment, and credentialing.

First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers’ academic autonomy, obscure student evaluation, and reduce parents’ and students’ ability to participate or challenge education decision-making. Third, big data-driven tools define what ‘counts’ as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination. In contrast to the public and heated debates that accompany textbook choices, schools often adopt education technologies ad hoc. Given education’s crucial impact on individual and collective success, educators and policymakers must consider the implications of data-driven education proactively and explicitly.


Keywords: big data; personalized learning; competency-based education; smart tutors; learning analytics; MOOCs

Suggested Citation:

Zeide, Elana, The Structural Consequences of Big Data-Driven Education (June 23, 2017). Big Data, Vol 5, No. 2 (2017): 164-172. Available at SSRN:" 


Shortlink to download:



Facebook pays teens to install VPN that spies on them // TechCrunch


"Desperate for data on its competitors, Facebook has been secretly paying people to install a “Facebook Research” VPN that lets the company suck in all of a user’s phone and web activity, similar to Facebook’s Onavo Protect app that Apple banned in June and that was removed in August. Facebook sidesteps the App Store and rewards teenagers and adults to download the Research app and give it root access to network traffic in what may be a violation of Apple policy so the social network can decrypt and analyze their phone activity, a TechCrunch investigation confirms.

Facebook admitted to TechCrunch it was running the Research program to gather data on usage habits.

Since 2016, Facebook has been paying users ages 13 to 35 up to $20 per month plus referral fees to sell their privacy by installing the iOS or Android “Facebook Research” app. Facebook even asked users to screenshot their Amazon order history page. The program is administered through beta testing services Applause, BetaBound and uTest to cloak Facebook’s involvement, and is referred to in some documentation as “Project Atlas” — a fitting name for Facebook’s effort to map new trends and rivals around the globe.

Seven hours after this story was published, Facebook told TechCrunch it would shut down the iOS version of its Research app in the wake of our report. But on Wednesday morning, an Apple spokesperson confirmed that Facebook violated its policies, and it had blocked Facebook’s Research app on Tuesday before the social network seemingly pulled it voluntarily (without mentioning it was forced to do so)."


For full post, please visit:

Parenting IRL's curator insight, January 31, 1:50 PM
well this is just great....not

Hacking Fortnite // Check Point Research


Research by: Alon Boxiner, Eran Vaknin and Oded Vanunu
January 16th, 2019


"Played in a virtual world, players of ‘Fortnite’, the massively popular game from game developer Epic Games, are tasked with testing their endurance as they battle for tools and weapons that will keep them secure and the ‘last man standing’.

In the last few weeks, however, Check Point Research discovered multiple vulnerabilities in Epic Games’ online platform that could have allowed a threat actor to take over the account of any game player, view their personal account information, purchase V-bucks, Fortnite’s virtual in-game currency and eavesdrop on and record players’ in-game chatter and background home conversations.


Created by Epic Games, an American video game developer, Fortnite is the game responsible for almost half of their $5bn-$8bn estimated value. With such a meteoric rise in fortune, it is no surprise then that the game had already attracted the attention from cyber criminals who set out to con unsuspecting players.

These scams previously took the role of deceiving players into logging into fake websites that promised to generate Fortnite’s ‘V-Buck’ in-game currency, a commodity that can usually only be acquired through the official Fortnite store or by earning them in the game itself. These sites prompt players to enter their game login credentials, as well as personal information like name, address and credit card details, and are spread via social media campaigns that claim players can “earn easy cash” and “make quick money”.

Our team’s research, however, relied on a far more sophisticated and sinister method that did not require the user to hand over any login details whatsoever. By discovering a vulnerability found in some of Epic Games’ sub-domains, an XSS attack was permissible with the user merely needing to click on a link sent to them by the attacker. Once clicked, with no need even for them to enter any login credentials, their Fortnite username and password could immediately be captured by the attacker."...
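Check Point's write-up above does not publish the vulnerable code, but the class of bug it describes (a reflected XSS, where attacker-controlled input in a crafted link ends up executing in the victim's browser and can exfiltrate tokens) can be sketched generically. The function and payload below are entirely hypothetical illustrations, not Epic Games' actual code:

```python
import html

def render_profile_page(username: str, escape: bool = True) -> str:
    """Toy page renderer that reflects a user-supplied value into HTML.

    If the reflected value is not escaped, markup smuggled in via a
    crafted link (e.g. a <script> tag) runs in the victim's browser,
    which is how an XSS of the kind described above can capture
    credentials or session tokens without any login prompt.
    """
    safe = html.escape(username) if escape else username
    return f"<html><body><h1>Welcome, {safe}!</h1></body></html>"

# Hypothetical attacker payload carried in a link parameter.
payload = '<script>sendToAttacker(document.cookie)</script>'

vulnerable = render_profile_page(payload, escape=False)
patched = render_profile_page(payload)  # default: escaped

# The unescaped page contains live script; the escaped one does not.
print("<script>" in vulnerable)  # True
print("<script>" in patched)     # False
```

The standard mitigation, as the sketch suggests, is to escape or sanitize every reflected value before it reaches the page.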

For full post, please visit: 


For link to video, see: 



Don't Do It, Gavin Newsom // Inside Higher Ed Blog [quote below from exit interview w/Gov. Brown published in Politico 1/6/19]


By John Warner

"Governor Newsom, please don't spend $10 million on a system that tracks students from kindergarten all the way into the work force."...


For full post, see here: 




The Datafication of Discipline: ClassDojo, Surveillance, and a Performative Classroom Culture // Manolev, Sullivan, and Slee (2018); Learning Media and Technology  



"This paper critically examines the ways in which ClassDojo is altering the disciplinary landscape in schools through the datafication of discipline and student behaviour. ClassDojo is one of the most popular and successful educational technologies and is used internationally. It is a school-based social media platform that incorporates a gamified behaviour-shaping function, providing school communities with a centralised digital network in which to interact. We argue that ClassDojo’s datafying system of school discipline intensifies and normalises the surveillance of students. Furthermore, it creates a culture of performativity and serves as a mechanism for behaviour control."



For a link to the journal article and download access, see: 


AI Now Report 2018 // AI Now Institute 

To download, click on title above or here:  


What is Data Exploitation? // Privacy International

To view video on YouTube, see:


For questions related to the potential for Data Exploitation with "Smart Cities" projects, see: 


Under Surveillance: Capitalism in the Digital Age // Radio Open Source


"Yes, Virginia, the world did change direction in the late summer of 2001, and it’s been changing us ever since. 9/11 had everything to do with it, but it was also the panicky season of the dot-com bust, when little Google, in fear of death, morphed from search service to data mining from its users. Our government, post 9/11, was ready to compromise privacy and underwrite a new science of surveillance—the object was to know everything about everybody. And here we are, not two decades later: Google is a trillion-dollar company, in an industry that knows more than we know about ourselves, and sells it. Omni-analyst Shoshana Zuboff argues we are being re-purposed for a new age of mankind.


Shoshana Zuboff is a business school professor and scholar with a Theory of Pretty Much Everything about our American condition in 2019. Unlike most theories of everything, this one is simple enough to remember. It’s also complex and researched enough to feel critically intelligent, not to say: plausible. The theory, in two words, is Surveillance Capitalism, the big business of social-network companies (think: Google, Facebook, Apple) who sift the signals from your phones and laptops to know, moment to moment, your heart’s desire and then sell it to you. Add a fashionable ideology of markets, a culture of consumer comfort, and the force of wealth—and the rest is details. Our disquieting modern condition is not in your mind. It’s in our lopsided landscape, as our guest Shoshana Zuboff maps it in her stunning big book, The Age of Surveillance Capitalism."...


For original post, see: 

For direct link to audio, see: 




Educator Toolkit for Teacher and Student Privacy // Parent Coalition for Student Privacy 


To download, click on title above or here: 


Health Research Gaps in the Marketing and Promotion of Emerging Educational Technologies // (Marachi, 2018) Presented at the Digital Media and Developing Minds Conference, New York 

To download poster, click on title above. For resource collections related to the research and including many of the references cited, see:


St. Paul Schools, Ramsey County End Youth Data-Sharing Agreement // Twin Cities


By Frederick Melo

"After nearly five years of preparation, a three-way agreement between St. Paul, Ramsey County and the St. Paul Public Schools to share youth data and target young people with services even before a crisis emerges will be dissolved amid withering criticism and community pushback.


The goal was at once simple and sweeping. Rather than wait until a young person is arrested or in crisis, public officials would be able to predict which families were most likely to slip through the cracks based on zip codes, incomes, truancy numbers, race and other indicators.

Ramsey County began exploring the potential of “predictive analytics” in 2014, and the city, county and school district signed a joint powers agreement allowing them to share youth data between them in 2018.


That agreement will be canceled."...


For full story, please see: 


'It's All About Controlling Students': Researchers Call Out Popular App


By Henrietta Cook

"It’s one of the world’s most popular education apps and used in more than half of Australian primary schools. Students who sign up to Class Dojo are assigned a cartoonish monster and awarded points when they do the right thing. Teachers deduct points – which are often displayed on interactive white boards at the front of the classroom – when children act out. But a new paper by University of South Australia researchers says Class Dojo promotes an archaic approach to discipline and likens it to China's social credit system.


“If we look at the fundamental basis of how China’s social credit system works, we see similarities with Class Dojo,” the study's lead author Jamie Manolev said.


“They both rely heavily on surveillance, rewards and punishments to reinforce behaviour and convert behaviour data into a score. Those scores are being used to determine what happens to students or citizens.”


Mr Manolev, who is also a primary school teacher, said some schools were using Dojo scores to decide whether students could participate in recreational activities. A student might miss out if too many points are deducted for late homework or talking out of turn.


He said the app was popular because it provided teachers with a “quick behavioural fix”.


But according to Mr Manolev, the technology sends a damaging message that children should behave appropriately just because they might receive a reward.


On the flip side, he said it sends a message that a student shouldn’t misbehave because they might be punished.

This, the researchers said, erodes self-motivation and the development of self-regulation.


“This carrot and stick approach to discipline is all about controlling students through the use of rewards and punishment as opposed to educating students about what good behaviour looks like," Mr Manolev said.


The researchers are also concerned displaying the scores in front of the class encourages unhealthy competition between children."...


For full post, please see: 


Canadians Are Rightly Worried About Invasion of Privacy in Smart Cities // The Conversation Canada 



The Ethics of Virtual Reality Technology: Social Hazards and Public Policy Recommendations // Spiegel (2018) Science and Engineering Ethics



"This article explores four major areas of moral concern regarding virtual reality (VR) technologies. First, VR poses potential mental health risks, including Depersonalization/Derealization Disorder. Second, VR technology raises serious concerns related to personal neglect of users’ own actual bodies and real physical environments. Third, VR technologies may be used to record personal data which could be deployed in ways that threaten personal privacy and present a danger related to manipulation of users’ beliefs, emotions, and behaviors. Finally, there are other moral and social risks associated with the way VR blurs the distinction between the real and illusory. These concerns regarding VR naturally raise questions about public policy. The article makes several recommendations for legal regulations of VR that together address each of the above concerns. It is argued that these regulations would not seriously threaten personal liberty but rather would protect and enhance the autonomy of VR consumers." 


Veillance and Transparency: A Critical Examination of Mutual Watching in the Post-Snowden, Big Data Era // Big Data & Society

Guest Editors:
Prof. Vian Bakir, School of Creative Studies & Media, Bangor University

Prof. Martina Feilzer, School of Social Sciences, Bangor University

Dr. Andrew McStay, School of Creative Studies & Media, Bangor University

"This special theme proposes that we live in a techno-cultural condition of increased and normalised transparency through various veillant forces (Steve Mann’s term for mutual watching) ranging from surveillant organizations and states to sousveillant individuals. Our papers address the technical, social, economic, political, legal, ethical and cultural implications of this situation.

-     On Ethics, Values and Norms: What, if anything, can or should we do about practices of watching that operate without informed consent or adequate processes of accountability in the post-Snowden, Big Data era?

-     On Regulation, Power, Resistance and Social Change:
 Are existing mechanisms of regulation and oversight able to deal with nation-states’ desire for transparency of their citizens, or is resistance required from other quarters?


-     On Representation, Discourse and Public Understanding: What socio-cultural discourses and practices on veillance and transparency prevail; how do they position the sur/sous/veillant subject; and do they adequately educate and engage people on abstract veillance practices?


Introduction to Special Theme Veillance and transparency: A critical examination of mutual watching in the post-Snowden, Big Data era
Vian Bakir, Martina Feilzer, Andrew McStay
Big Data & Society, March 2017, 10.1177/2053951717698996

Original Research Articles

Shareveillance: Subjectivity between open and closed data
Clare Birchall
Big Data & Society, November 2016, 10.1177/2053951716663965 

Empathic media and advertising: Industry, policy, legal and citizen perspectives (the case for intimacy)
Andrew McStay
Big Data & Society, November 2016, 10.1177/2053951716666868

Reluctant activists? The impact of legislative and structural attempts of surveillance on investigative journalism

Anthony Mills and Katharine Sarikakis
Big Data & Society, November 2016, 10.1177/2053951716669381 

Algorithmic paranoia and the convivial alternative

Dan McQuillan
Big Data & Society, November 2016, 10.1177/2053951716671340

Towards Data Justice? The ambiguity of anti-surveillance resistance in political activism

Lina Dencik, Jonathan Cable and Arne Hintz
Big Data & Society, November 2016, 10.1177/2053951716679678

The machine that ate bad people: The ontopolitics of the precrime assemblage

Peter Mantello
Big Data & Society, December 2016, 10.1177/2053951716682538


The Snowden Archive-in-a-Box: a year of traveling experiments in outreach and education
Evan Light
Big Data & Society, November 2016, 10.1177/2053951716666869

Conceptualising the Right to Data Protection in an Era of Big Data

Yvonne McDermott
Big Data & Society, January 2017, 10.1177/2053951716686994

Big Data is a Big Lie without little data: Humanistic Intelligence as a Human Right

Steve Mann
Big Data & Society, February 2017, 10.1177/2053951717691550

Crowd-Sourced Intelligence Agency: Prototyping counterveillance

Jennifer Gradecki, Derek Curry
Big Data & Society, February 2017, 10.1177/2053951717693259

Tracing You: How transparent surveillance reveals a desire for visibility
Benjamin Grosser 
Big Data & Society, February 2017, 10.1177/2053951717694053

Early Career Researcher Essay

Liberal Luxury: Decentering Snowden, Surveillance and Privilege
Piro Rexhepi
Big Data & Society, November 2016, 10.1177/2053951716679676


For original announcement, please see: 


The Seductive Diversion of ‘Solving’ Bias in Artificial Intelligence // Medium


By Julia Powles and Helen Nissenbaum
"The rise of Apple, Amazon, Alphabet, Microsoft, and Facebook as the world’s most valuable companies has been accompanied by two linked narratives about technology. One is about artificial intelligence — the golden promise and hard sell of these companies. A.I. is presented as a potent, pervasive, unstoppable force to solve our biggest problems, even though it’s essentially just about finding patterns in vast quantities of data. The second story is that A.I. has a problem: bias.

The tales of bias are legion: online ads that show men higher-paying jobs; delivery services that skip poor neighborhoods; facial recognition systems that fail people of color; recruitment tools that invisibly filter out women. A problematic self-righteousness surrounds these reports: Through quantification, of course we see the world we already inhabit. Yet each time, there is a sense of shock and awe and a detachment from affected communities in the discovery that systems driven by data about our world replicate and amplify racial, gender, and class inequality.


Serious thinkers in academia and business have swarmed to the A.I. bias problem, eager to tweak and improve the data and algorithms that drive artificial intelligence. They’ve latched onto fairness as the objective, obsessing over competing constructs of the term that can be rendered in measurable, mathematical form. If the hunt for a science of computational fairness was restricted to engineers, it would be one thing. But given our contemporary exaltation and deference to technologists, it has limited the entire imagination of ethics, law, and the media as well.
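The "competing constructs" of fairness mentioned above really are rendered in measurable, mathematical form; a minimal sketch of one of the simplest, demographic parity (the gap in favorable-outcome rates between groups), illustrates what that looks like. The data and group labels below are invented for illustration only:

```python
def demographic_parity_gap(outcomes, groups):
    """Absolute difference in positive-outcome rates between the two
    groups present in `groups`. Outcomes are 1 (favorable) or 0."""
    rates = {}
    for g in set(groups):
        selected = [o for o, grp in zip(outcomes, groups) if grp == g]
        rates[g] = sum(selected) / len(selected)
    a, b = rates.values()
    return abs(a - b)

# Toy model decisions for two hypothetical groups, A and B:
# group A receives the favorable outcome 75% of the time, group B 25%.
outcomes = [1, 1, 0, 1, 0, 0, 1, 0]
groups   = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(demographic_parity_gap(outcomes, groups))  # 0.5
```

As the article argues, a system can drive this gap toward zero while the surveillance and classification machinery underneath remains untouched, which is precisely why the authors call the metric-chasing a diversion.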

There are three problems with this focus on A.I. bias. The first is that addressing bias as a computational problem obscures its root causes. Bias is a social problem, and seeking to solve it within the logic of automation is always going to be inadequate.


Second, even apparent success in tackling bias can have perverse consequences. Take the example of a facial recognition system that works poorly on women of color because of the group’s underrepresentation both in the training data and among system designers. Alleviating this problem by seeking to “equalize” representation merely co-opts designers in perfecting vast instruments of surveillance and classification.

When underlying systemic issues remain fundamentally untouched, the bias fighters simply render humans more machine readable, exposing minorities in particular to additional harms.


Third — and most dangerous and urgent of all — is the way in which the seductive controversy of A.I. bias, and the false allure of “solving” it, detracts from bigger, more pressing questions. Bias is real, but it’s also a captivating diversion.

What has been remarkably underappreciated is the key interdependence of the twin stories of A.I. inevitability and A.I. bias. Against the corporate projection of an otherwise sunny horizon of unstoppable A.I. integration, recognizing and acknowledging bias can be seen as a strategic concession — one that subdues the scale of the challenge. Bias, like job losses and safety hazards, becomes part of the grand bargain of innovation.

The reality that bias is primarily a social problem and cannot be fully solved technically becomes a strength, rather than a weakness, for the inevitability narrative. It flips the script. It absorbs and regularizes the classification practices and underlying systems of inequality perpetuated by automation, allowing relative increases in “fairness” to be claimed as victories — even if all that is being done is to slice, dice, and redistribute the makeup of those negatively affected by actuarial decision-making.

In short, the preoccupation with narrow computational puzzles distracts us from the far more important issue of the colossal asymmetry between societal cost and private gain in the rollout of automated systems. It also denies us the possibility of asking: Should we be building these systems at all?"...


For full post, see: 


Outsourcing the Classroom to Ed Tech & Machine Learning: Why Parents and Teachers Should Resist // Leonie Haimson, Audrey Watters, Peter Greene; 2018 Network for Public Education National Conference 

To download, click on title or arrow above. 


"Originally presented by Leonie Haimson of Class Size Matters at the October 2018 NPE Conference, this powerpoint provides a comprehensive overview of ed tech that can be used in teacher and/or parent/teacher presentations." 




Kids Shouldn’t Have to Sacrifice Privacy for Education [Op-Ed] // The New York Times


By Dipayan Ghosh and Jim Steyer 

"This year, the media has exposed — and the government, including through guidance issued by the F.B.I., has begun to address — a string of harms to individual privacy by the technology sector’s leading firms. But policymakers must intervene specifically to protect the most precious and vulnerable people in our society: children. Their behavioral data is continuously suctioned up by technology firms through tablets, smartphones and computers and is at risk of being misused.


For many American children, going to school means handing over personal data. The Summit “personalized learning” educational tool — a platform for online lessons and assessments that was developed by a charter school network with the help of Facebook engineers and is backed by the Chan Zuckerberg Initiative — has been criticized for asking parents to consent to sharing their children’s personal data, including their names, internet activity and grades. Google has vastly expanded its reach into America’s schools as more than half of students use its Gmail and Docs apps, and a majority of mobile devices shipped to schools are Chromebooks. Should the tremendous amounts of data underlying the operation of these kinds of services get into the wrong hands, our children’s futures could be at stake.


Concerns over illegitimate sharing of and access to student data have been raised by parent groups, consumer watchdogs, and privacy advocates, many of whom have begun public awareness campaigns and legal battles. They’re rightly worried, for example, about the fairness of college admissions processes that rely on student data profiles shared by personalized learning companies. Similarly, parents are concerned about the dispensation of financial awards including scholarships that are influenced by data that children have provided in surveys. In some cases the information doesn’t include just things like grades and test scores but also covers categories like race, religion, address and whether they have “impairments” like H.I.V. or depression.


In 2014, inBloom, a nonprofit that offered to warehouse and manage student data for public school districts, announced that it would shut down, after parents objected to 400 categories of information, including children’s reasons for absences and sensitive family relationships, being included in its database.


Beyond the collection of our children’s education-related data, we don’t want industry behemoths to profile our children and target them with advertisements and shady content. We don’t want children to fear that anything they say or do online could be used against them someday. And we don’t want technology companies to compete for our children’s attention just so that they can claim their loyalties when they come of age to join a social media network, choose an email provider or purchase their first cellphones. It is not just about the protection of data; it is a matter of letting children learn and grow without concern about how their early preferences, talents and habits could shape the opportunities they have in the future. But laws in the United States offer students very little digital privacy or security protection against the wiles of the industry.


Wherever the rules are muddy for the industry, we should make them resoundingly clear in such a way that protects our children and, implicitly, our national interest. If we have learned anything about Silicon Valley this year, it is that we cannot sit back and wait for the industry to voluntarily act on behalf of children; our government must intervene before more harm comes to them.


Tech companies should not be permitted to collect data on children and profile them using their personal data without a parent or guardian’s meaningful consent to the data collection. In an educational setting, in which children might be denied the ability to participate in certain learning activities if their parents object, meaningful consent is often simply not possible. They should not be allowed to market products to a child based on inferences about the child’s behaviors, preferences, beliefs or interests, and they should not be allowed to sell or share a child’s personal data to a third party under any circumstances."...


Rescooped by Roxana Marachi, PhD from Screen Time and Tech Safety Research!

Chinese Facial Recognition Company Left Database of People's Locations Exposed // CNET

"A Chinese facial recognition company left its database exposed online, revealing information about millions of people, a security researcher discovered.

SenseNets, a company based in Shenzhen, China, offers facial recognition technology and crowd analysis, which the company boasted in a promotional video could track people across cities and pick them out in large groups.


But the company failed to protect that database with a password, Victor Gevers, a Dutch security researcher with the GDI Foundation, discovered Wednesday. The database contained more than 2.5 million records on people, including their ID card number, their address, birthday, and locations where SenseNets' facial recognition has spotted them.


From the last 24 hours alone, there were more than 6.8 million locations logged, Gevers said. Anyone would be able to look at these records and track a person's movements based on SenseNets' real-time facial recognition."...


For full post, see:  

Scooped by Roxana Marachi, PhD!

Privacy, Poverty, and Big Data: A Matrix of Vulnerabilities For Poor Americans // Madden, Gilman, Levy, Marwick (2017). Washington University Law Review

"This Article examines the matrix of vulnerabilities that low-income people face as a result of the collection and aggregation of big data and the application of predictive analytics. On one hand, big data systems could reverse growing economic inequality by expanding access to opportunities for low-income people. On the other hand, big data could widen economic gaps by making it possible to prey on low-income people or to exclude them from opportunities due to biases entrenched in algorithmic decision-making tools. New kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, may have particularly negative impacts on the poor. This Article reports on original empirical findings from a large, nationally-representative telephone survey with an oversample of low-income American adults, and highlights how these patterns make particular groups of low-status Internet users uniquely vulnerable to various forms of surveillance and networked privacy-related problems. In particular, a greater reliance on mobile connectivity, combined with lower usage of privacy-enhancing strategies, may contribute to various privacy and security-related harms. The Article then discusses three scenarios in which big data—including data gathered from social media inputs—is being aggregated to make predictions about individual behavior: employment screening, access to higher education, and predictive policing. Analysis of the legal frameworks surrounding these case studies reveals a lack of legal protections to counter digital discrimination against low-income people. In light of these legal gaps, the Article assesses leading proposals for enhancing digital privacy through the lens of class vulnerability, including comprehensive consumer privacy legislation, digital literacy, notice and choice regimes, and due process approaches. 
As policymakers consider reforms, the Article urges greater attention to impacts on low-income persons and communities." 

Scooped by Roxana Marachi, PhD!

A Failure to “Do No Harm”: India’s Aadhaar Biometric ID Program and its Inability to Protect Privacy in Relation to Measures in Europe and the U.S. // Dixon (2016), Journal of Technology Science

"It is important that digital biometric identity systems be used by governments with a Do No Harm mandate, and the establishment of regulatory, enforcement and restorative frameworks ensuring data protection and privacy needs to transpire prior to the implementation of technological programs and services. However, when, and where large government bureaucracies are involved, the proper planning and execution of public service programs very often result in ungainly outcomes, and are often qualitatively not guaranteeable. Several important factors, such as the strength of the political and legal systems, may affect such cases as the implementation of a national digital identity system. Digital identity policy development, as well as technical deployment of biometric technologies and enrollment processes, may all differ markedly, and could depend in some part at least, on the overall economic development of the country in question, or political jurisdiction, among other factors. This article focuses on the Republic of India’s national digital biometric identity system, the Aadhaar, for its development, data protection and privacy policies, and impact. Two additional political jurisdictions, the European Union, and the United States are also situationally analyzed as they may be germane to data protection and privacy policies originated to safeguard biometric identities. Since biometrics are foundational elements in modern digital identity systems, expression of data protection policies that orient and direct how biometrics are to be utilized as unique identifiers are the focus of this analysis. As more of the world’s economies create and elaborate capacities, capabilities and functionalities within their respective digital ambits, it is not enough to simply install suitable digital identity technologies; much, much more - is durably required. 
For example, both vigorous and descriptive means of data protection should be well situated within any jurisdictionally relevant deployment area, prior to in-field deployment of digital identity technologies. Toxic mixes of knowledge insufficiencies, institutional naïveté, political tomfoolery, cloddish logical constructs, and bureaucratic expediency must never overrun fundamental protections for human autonomy, civil liberties, data protection, and privacy."...


For full article, please see: 


Scooped by Roxana Marachi, PhD!

"What is Summit and why should parents and students be concerned about its use?" // Parent Coalition for Student Privacy 

Available for download here: 


Linked from the following post on student protest of online platform: 

Scooped by Roxana Marachi, PhD!

O.K., Google: How Much Money Have I Made for You Today? // The New York Times

By Jennifer Szalai

"A friend of mine says that whenever he walks into someone’s home he’s tempted to yell out, “Hey, Alexa,” or “O.K., Google,” and order 50 pizzas, just to see if there’s a device listening in on whatever gossip he planned to dish out next.


Shoshana Zuboff would undoubtedly get the joke, but she probably wouldn’t laugh. In “The Age of Surveillance Capitalism,” she warns against mistaking the soothing voice of a personal digital assistant for “anything other than the exploitation of your needs.” The cliché that “if you’re not paying for it, you’re the product” isn’t alarming enough for her. She likens the big tech platforms to elephant poachers, and our personal data to ivory tusks. “You are not the product,” she says. “You are the abandoned carcass.”


O.K., Zuboff, tell me more. It’s a testament to how extraordinarily intelligent her book is that by the time I was compared to an elephant carcass, I resisted the urge to toss it across the room. Zuboff, a professor emerita of Harvard Business School and the author of “In the Age of the Smart Machine” (1988), has a dramatic streak that could come off as simply grandiose if she didn’t so painstakingly make her case. She says we’re living through such “a bold and unprecedented shift in capitalist methods” that even as we encounter the occasional story about Facebook allowing its corporate clients to read users’ private messages or the software in Google’s Street View cars scraping unencrypted information from people’s homes, the American public doesn’t yet grasp the new dispensation in its entirety.


So many people take care to calibrate their privacy settings just so, sharing certain things with friends and keeping other things hidden, while their data still gets collected and shared among apps for possible monetization now or later. Google and Facebook might not call to mind the belching smoke stacks and child laborers of the Industrial Revolution, but Zuboff argues that they’re run by people who have turned out to be just as ruthless and profit-seeking as any Gilded Age tycoon. Instead of mining the natural landscape, surveillance capitalists extract their raw material from human experience."...


For full post, please see: 

Scooped by Roxana Marachi, PhD!

Beyond Knowledge Ventriloquism and Echo Chambers: Raising the Quality of the Debate in Teacher Education // Teachers College Record 

By Kenneth Zeichner and Hilary G. Conklin 

Background/Context: For over two decades, there has been a steady call for deregulating U.S. teacher education, closing down allegedly poor quality college and university programs, and creating greater market competition. In response to this call to disrupt the dominance of colleges and universities in teacher education, and because of the policies and funding allocations of the U.S. Education Department and private foundation funding, non-university providers of teacher education have proliferated in certain areas of the country. A critical aspect of the current call for greater deregulation and market competition in teacher education has been the declaration that university teacher education has failed. While there is no dispute about the need for improvements in the dominant college and university system of teacher education, it is also important to critically evaluate the warrants for the value of programs that critics claim should replace college and university programs.


Purpose: The focus of this paper is to illustrate how research has been misrepresented to support policies and programs that would simultaneously reduce the role of colleges and universities in preparing U.S. teachers and support the expansion of the role of non-university providers. We also examine the print news media’s role in uncritically reproducing a narrative of failure about university teacher education and promoting the success of new non-university programs—attention that has served to inflate the public perception of these organizations and programs beyond what is warranted by the available evidence.


Research Design: Four cases are presented that illustrate the efforts to manufacture a narrative of the failure of colleges and universities in preparing teachers, and to construct a narrative of success for the non-university programs that have been funded to replace them. The authors use the concepts of echo chambers and knowledge ventriloquism to show how this process operates.


Conclusions/Recommendations: Following the presentation of the cases, specific recommendations are offered for raising the quality of the debates about the future of U.S. teacher education. These include greater transparency in the process of reform, better communication between researchers and stakeholders, using research that has been vetted to inform the debates, and genuinely exploring different policy options for teacher education.


For full article, please see: 

Scooped by Roxana Marachi, PhD!

Disrupted Childhood: The Cost of Persuasive Design // 5Rights
