Educational Psychology & Technology
Scooped by Roxana Marachi, PhD

How to Foster Grit, Tenacity and Perseverance: An Educator’s Guide // MindShift


"How can we best prepare children and adolescents to thrive in the 21st century? This question is at the heart of what every educator attempts to do on a daily basis. Apart from imparting content of knowledge and facts, however, it’s becoming clear that the “noncognitive competencies” known as grit, perseverance, and tenacity are just as important, if not more so, in preparing kids to be self-sufficient and successful."

Educational Psychology & Technology
This curated collection includes news, resources, and research related to Educational Psychology and/or Technology. The page also serves as a research tool to organize online content. The grey funnel-shaped icon at the top allows for searching by keyword. For research more specific to tech and screen time, please see: For additional Educator Resources, please visit [Links to an external site].
Rescooped by Roxana Marachi, PhD from Screen Time, Wireless, and EMF Research!

Health and Safety Research Gaps in Policies and Practices Integrating Emerging Technologies for Young Children 

Links are as follows in order of the slides: 


The Silicon Valley Billionaires Remaking America's Schools 


Dr. Catherine Steiner-Adair
Clinical Psychologist and Research Associate at Harvard Medical School 


Video link may be viewed at: 


Carter B, Rees P, Hale L, Bhattacharjee D, Paradkar MS. Association Between Portable Screen-Based Media Device Access or Use and Sleep Outcomes: A Systematic Review and Meta-analysis. JAMA Pediatr. 2016 Oct 31. doi: 10.1001/jamapediatrics.2016.2341. [Epub ahead of print] 


Screen Time Hurts More Than Kids' Eyes 


New Media Consortium / Consortium for School Networking Horizon Report 


"American Revolution 2.0: How Education Innovation is Going to Revitalize America and Transform the U.S. Economy" 


"Preschool is Good For Children But It's Expensive So Utah Is Offering It Online" m/local/education/preschool-is-good-for-poor-kids-but-its-expensive-so-utah-is-offering-it-online/2015/10/09/27665e52-5e1d-11e5-b38e-06883aacba64_story.html  


Philanthropy Roundtable's "Blended Learning: A Wise Giver's Guide to Supporting Tech-Assisted Learning" (formerly chaired by B. DeVos)  


CyberCharters Have Overwhelming Negative Impact 


Ma, J., van den Heuvel, M., Maguire, J., Parkin, P., Birken, C. (2017). Is handheld screen time use associated with language delay in infants? Presented at the Pediatric Academic Societies Meeting, San Francisco, CA.  


Jonathan Rochelle’s GSV/ASU PRIMETIME Keynote Speech pitching Google Cardboard for children in schools as proxy for actual field trips: 


Scientists Urge Google to Stop Untested Microwave Radiation of Children's Eyes and Brains with Virtual Reality Devices in Schools //  Asus product manual 


Telecom Industry Liability and Insurance Information 


National Association for Children and Safe Technology - iPad Information 


For infant/pregnancy related safety precautions, please visit 


194 Signatories (physicians, scientists, educators) on Joint Statement on Pregnancy and Wireless Radiation 


Article screenshot from France: "Portables. L'embrouille des ondes électromagnétiques" ("Cell phones: The electromagnetic wave muddle")


Wireless Phone Radiation Risks and Public Policy 


"Show The Fine Print" 


Scientist petition calls for greater protective measures for children and pregnant women, cites need for precautionary health warnings, stronger regulation of electromagnetic fields, creation of EMF free zones, and media disclosures of experts’ financial relationships with industry when citing their opinions regarding the safety of EMF-emitting technologies. Published in European Journal of Oncology 


International Agency for Research on Cancer Classifies Radiofrequency Electromagnetic Fields as Possibly Carcinogenic to Humans (2011)


For more on source of funding research, see: and 


Maryland State Children’s Environmental Health and Protection Advisory Council // Public Testimony


"Until now, radiation from cell towers has not been considered a risk to children, but a recent study raises new questions about possible long-term, harmful effects." 


For further reading, please see Captured Agency report published by Harvard’s Center for Ethics  or 


Updates/posts/safety information on Virtual Reality: 


Environmental Health Trust Virtual Reality Radiation Absorption Slides 


Healthy Kids in a Digital World: 


National Association for Children and Safe Technology 


Doctors’ Letters on WiFi in Schools // 154-page compilation 


Insurance and Liability Disclaimers/Information from Telecom Companies 


Most of the documents and articles embedded within the presentation above are searchable/accessible on the following page:

The document above is a PDF with live links, which are provided above for easier access. To download the original file, please click on the title or arrow above. It is a large file, so it may take several minutes.  


Big Data Meets Big Brother as China Moves To Rate Its Citizens // Wired


By Rachel Botsman
"The Chinese government plans to launch its Social Credit System in 2020. The aim? To judge the trustworthiness – or otherwise – of its 1.3 billion residents.

On June 14, 2014, the State Council of China published an ominous-sounding document called "Planning Outline for the Construction of a Social Credit System". In the way of Chinese policy documents, it was a lengthy and rather dry affair, but it contained a radical idea. What if there was a national trust score that rated the kind of citizen you were?


Imagine a world where many of your daily activities were constantly monitored and evaluated: what you buy at the shops and online; where you are at any given time; who your friends are and how you interact with them; how many hours you spend watching content or playing video games; and what bills and taxes you pay (or not). It's not hard to picture, because most of that already happens, thanks to all those data-collecting behemoths like Google, Facebook and Instagram or health-tracking apps such as Fitbit. But now imagine a system where all these behaviours are rated as either positive or negative and distilled into a single number, according to rules set by the government. That would create your Citizen Score and it would tell everyone whether or not you were trustworthy. Plus, your rating would be publicly ranked against that of the entire population and used to determine your eligibility for a mortgage or a job, where your children can go to school - or even just your chances of getting a date.
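The mechanics described above, behaviours rated as positive or negative and distilled into a single number according to centrally set rules, can be illustrated with a minimal sketch. Everything here (the behaviours, weights, and baseline) is a hypothetical invention for illustration; the actual Social Credit System's scoring rules have not been published.

```python
# Toy sketch only: behaviours rated positive or negative and distilled
# into one number per centrally defined rules. The rule set below is
# entirely hypothetical; it does not reflect any real scoring system.

WEIGHTS = {
    "pays_bills_on_time": +30,     # flat bonus if bills/taxes are paid
    "hours_gaming_per_week": -2,   # penalty applied per hour
    "friend_average_score": +0.1,  # bonus per point of friends' mean score
}

def citizen_score(profile: dict) -> float:
    """Distil rated behaviours into a single number (illustrative)."""
    score = 500.0  # arbitrary baseline
    score += WEIGHTS["pays_bills_on_time"] * profile["pays_bills_on_time"]
    score += WEIGHTS["hours_gaming_per_week"] * profile["hours_gaming_per_week"]
    score += WEIGHTS["friend_average_score"] * profile["friend_average_score"]
    return score

profile = {"pays_bills_on_time": 1,
           "hours_gaming_per_week": 10,
           "friend_average_score": 600}
print(citizen_score(profile))
```

The point of the sketch is how reductive the distillation is: once disparate behaviours are forced through one weighted sum, the resulting number can be ranked against the whole population and gate access to mortgages, jobs, or schools, exactly the uses the article describes.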

A futuristic vision of Big Brother out of control? No, it's already getting underway in China, where the government is developing the Social Credit System (SCS) to rate the trustworthiness of its 1.3 billion citizens. The Chinese government is pitching the system as a desirable way to measure and enhance "trust" nationwide and to build a culture of "sincerity". As the policy states, "It will forge a public opinion environment where keeping trust is glorious. It will strengthen sincerity in government affairs, commercial sincerity, social sincerity and the construction of judicial credibility."

Others are less sanguine about its wider purpose. "It is very ambitious in both depth and scope, including scrutinising individual behaviour and what books people are reading. It's Amazon's consumer tracking with an Orwellian political twist," is how Johan Lagerkvist, a Chinese internet specialist at the Swedish Institute of International Affairs, described the social credit system. Rogier Creemers, a post-doctoral scholar specialising in Chinese law and governance at the Van Vollenhoven Institute at Leiden University, who published a comprehensive translation of the plan, compared it to "Yelp reviews with the nanny state watching over your shoulder"....


For full post, please see: 


Google’s High-Altitude Internet Balloon Crashes in Kenya //  Hindustan Times


 "A high altitude balloon that was launched by Google to provide high-speed internet in the remote parts of the earth under "Project Loon", has crashed in a Kenyan farm, a report said on Saturday.

The balloon, a part of a 10-balloon batch, was deployed for testing in Nakuru, Nanyuki, Nyeri and Marsabit in July 2017. It crashed at Nthambiro in Meru on Friday night, media reported.
Some residents complained of headaches after they gathered around the device to get a glimpse of it.
“The device from project loon indicates it fell after its expiry period of six months. No one is yet to claim the device,” Igembe South OCPD Jane Nyakeruma was quoted as saying.
Earlier this year, Google announced that it was “years closer” to delivering internet to remote parts of the world using high-flying balloons.
Researchers at Google’s Project Loon -- part of the company’s X research lab -- said it was now able to use machine learning to predict weather systems, meaning the firm has a greater control over where its balloons go, making it possible to focus on a specific region, rather than circumnavigating the globe, BBC reported.
Under the project, the firm suspended a network of huge balloons that beam down connectivity.
The balloons float in the stratosphere around 11 miles high. By raising or lowering altitude, the balloons can be caught in different weather streams, changing direction."
For original post, see: 


The following incidents have been recorded on the Wikipedia page for Project Loon: 


See also: 

International Coalition Objects to Google's Project Loon 




Intel Was Aware of the Chip Vulnerability When Its CEO Sold Off $24 Million in Company Stock // Business Insider



EdTech Coverage, The Hype Cycle, and Media Complicity // Alexander Russo (via Phi Delta Kappan)


"Another insider’s concerns about the media hype cycle – and some great edtech story leads for 2018.

By Alexander Russo

Policymakers, administrators, and educators have been experimenting with technology for roughly 30 years, going back to the days of computer-assisted instruction, connecting schools to “the information superhighway,” and One Laptop Per Child. By necessity, education journalists have been along for the ride. (Twenty years ago this past summer, the magazine then known as The Atlantic Monthly published The Computer Delusion, warning the American public about the dangers of bringing computers into schools and expecting much academic improvement.)

But the past few years have been especially confusing and contentious when it comes to technology and schools, in response to the backlash against the school reform movement, growing concerns about the dominance of the “big five” tech companies (Apple, Alphabet/Google, Amazon, Facebook, and Microsoft*), and edtech advocates’ tireless efforts to fix education with apps and gizmos."...


For full post, please visit: 


A Cute Toy Just Brought a Hacker Into Your Home // New York Times


"Researchers recently found the Furby Connect’s Bluetooth connection could be hijacked by hackers, letting them turn on the doll’s microphone and speak to children." Image credit: Tony Cenicola/The New York Times 


By Sheera Frenkel

SAN FRANCISCO — My Friend Cayla, a doll with nearly waist-length golden hair that talks and responds to children’s questions, was designed to bring delight to households. But there’s something else that Cayla might bring into homes as well: hackers and identity thieves.

Earlier this year, Germany’s Federal Network Agency, the country’s regulatory office, labeled Cayla “an illegal espionage apparatus” and recommended that parents destroy it. Retailers there were told they could sell the doll only if they disconnected its ability to connect to the internet, the feature that also allows in hackers. And the Norwegian Consumer Council called Cayla a “failed toy.”


The doll is not alone. As the holiday shopping season enters its frantic last days, many manufacturers are promoting “connected” toys to keep children engaged. There’s also a smart watch for kids, a droid from the recent “Star Wars” movies and a furry little Furby. These gadgets can all connect with the internet to interact — a Cayla doll can whisper to children in several languages that she’s great at keeping secrets, while a plush Furby Connect doll can smile back and laugh when tickled.


But once anything is online, it is potentially exposed to hackers, who look for weaknesses to gain access to digitally connected devices. Then once hackers are in, they can use the toys’ cameras and microphones to potentially see and hear whatever the toy sees and hears. As a result, according to cybersecurity experts, the toys can be turned to spy on little ones or to track their location.


“Parents need to be aware of what they are buying and bringing home to their children,” said Javvad Malik, a researcher with cybersecurity company AlienVault. “Many of these internet-connected devices have trivial ways to bypass security, so people have to be aware of what they’re buying and how secure it is.” 




California Health Officials Release Guidance For Limiting Exposure to Cellphone Radiation // EduResearcher 


"The California Department of Public Health has recently issued guidance for reducing exposure to radiation emitted from cell phones. An emphasis within the document includes children’s heightened vulnerabilities to cumulative hazards of long term exposure. The document was originally drafted in 2009 by the Department of Public Health’s Division of Environmental and Occupational Disease Control and underwent numerous revisions, yet remained hidden from public view until now (for more on the lawsuit that led the Sacramento Superior Court to order the release of the draft documents, see here). 


The recommendations outlined by the California Department of Public Health are similar to those issued by the Connecticut Department of Public Health in May, 2015. Below is the official press release issued on December 13th, 2017:


“SACRAMENTO – As smartphone use continues to increase in the U.S., especially among children, the California Department of Public Health (CDPH) today issued guidance for individuals and families who want to decrease their exposure to the radio frequency energy emitted from cell phones."...


For full post, see: 


​Chromageddon Comedown: Educators Are Wary After Thousands of Google Devices Fail // EdSurge News

By Sydney Johnson
"Kelly Dumont is an education technologist for Canyons School District in Sandy, Utah. On December 5, his district was one of many around the country that experienced a massive Chromebook error that caused hundreds of thousands of devices to temporarily disconnect from the internet.


Dumont, who is the elementary edtech team lead in his Utah district, believes that all 20,000 of the devices in CSD were affected by the issue, which was resolved by Google later that day. Now that the devices are back online, however, he and other educators question whether the error was a one-off glitch, or a foreshadowing for other unexpected issues in the cloud-based technology.

“This was temporarily catastrophic,” Dumont tells EdSurge.


On a support page set up to assist educators in re-connecting their devices, Google attributes the error to an “invalid network policy” that was pushed out to devices. The error caused devices to lose “connectivity to passphrase-protected WiFi networks configured through admin policies.”


Mass policy updates like the one that affected devices last week are not uncommon—though the effect last week is believed to be a first. “We issue network policy updates on a regular basis, which varies based on admin updates and device boots,” a representative from Google wrote in an email.


Dumont is aware that these types of updates are routinely pushed through the cloud-based system. But the event leaves him wary nonetheless.


“It’s an eye-opening situation because we are trusting the cloud so much for everything, and we rely heavily on Google services,” he says. “Google Apps for Education is our baseline for 1:1. The potential to have 20,000 devices go down at once is concerning.” Adam Henderson, the director of technology systems at Nassau County Schools in Florida, shared similar concerns with EdSurge last week while the issue was still unraveling."... 


Lurkers, Creepers, and Virtuous Interactivity: From Property Rights to Consent and Care as a Conceptual Basis For Privacy Concerns and Information Ethics // First Monday


By D.E. Wittkower
"Exchange of personal information online is usually conceptualized according to an economic model that treats personal information as data owned by the persons these data are ‘about.’ This leads to a distinct set of concerns having to do with data ownership, data mining, profits, and exploitation, which do not closely correspond to the concerns about privacy that people actually have. A post-phenomenological perspective, oriented by feminist ethics of care, urges us to figure out how privacy concerns arrive in fundamentally human contexts and to speak to that, rather than trying to convince people to care about privacy as it is juridically conceived and articulated. By considering exchanges of personal information in a human-to-human online informational economy — being friends on social networking sites — we can identify an alternate set of concerns: consent, respect, lurking, and creepiness. I argue that these concerns will provide a better guide to both users and companies about prudence and ethics in information economies than the existing discourse around ‘privacy.’



1. Introduction
2. Methodology
3. Privacy and the property-based understanding of personal information
4. Personal Information re-conceived in the context of relationships
5. The online-emergent virtue of interactivity
6. Being a lurker
7. Being a creeper
8. Broader applications


For full post, click on title above or here: 


For-Profit Online College Faces Lawsuit For Allegedly Misleading Students // SF Examiner





Teach Like They're Data: Max Ventilla's Extractive AltSchool Platform, Personalization, and Profit // Long View on Education


"“Paradoxically, the corporate sector responsible for deindustrialization and cutting wages while receiving tax breaks that starved cities of revenue is now repositioned as beneficent donor, if not savior [of education]. The global billionaires who accumulated unfathomable wealth at the expense of most of the world’s people are now our benefactors and leaders.” – Pauline Lipman & Cristen Jenkins

“The technology always extracts. … Before computers, it was fossil fuels. The idea that you can pull free physical work out of the ground, that was a really good trick, and it resulted in all of these exponential curves. But now we’re discovering how to pull free mental work out of the ground. That’s going to be an equivalent, huge trick over the next 50 years.” – Max Ventilla


Platforms and Classrooms 

Max Ventilla, former head of personalisation at Google, has announced that his start-up, AltSchool, will be closing some locations and focusing on developing and marketing its software platform. Marketing itself as a site of ‘hyperpersonalisation’, AltSchool has received $175 million in venture capital as it pursues Ventilla’s for-profit dream to reshape education. According to Bloomberg, Ventilla sent a letter to parents informing them of the upcoming changes:


“Ventilla said AltSchool will only run classrooms near the main offices in San Francisco and New York. ‘We know this is tough news that will have a big impact on your family,’ Ventilla said. But the moves are needed, he wrote, given AltSchool’s ‘strategy, path to growth and finances.’ Ventilla told Bloomberg that the company had long planned to prioritize selling technology to other schools. He said it’s happening earlier than anticipated because of demand. For outside schools, the company charges about $150 to $500 annually per student for its technology, depending on the size of the institution.”

Ventilla’s platform aims to use the same kind of personalisation technology that Google, Facebook, and Netflix use to recommend results and experiences to us. A kind of “mass customisation”, this approach relies on collecting data about individual users based on our likes and clicks, and creating algorithms to use larger patterns in their massive data sets to offer us custom results which we can ‘choose’ from. As Audrey Watters argues, personalisation sounds progressive – we’ll offer you individually tailored experiences! – but it’s more about delivering ‘content’ like customised Facebook ads.

In an interview, Ventilla says that “we start with a representation of each child”, and even though “the vast majority of the learning should happen non-digitally”, the child’s habits and preferences get converted into data, “a digital representation of the important things that relate to that child’s learning, not just their academic learning but also their non-academic learning. Everything logistic that goes into setting up the experience for them, whether it’s who has permission to pick them up or their allergy information. You name it.”


And just like Netflix matches us to TV shows, “If you have that accurate and actionable representation for each child, now you can start to personalize the whole experience for that child. You can create that kind of loop you described where because we can represent a child well, we can match them to the right experiences.”
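The loop Ventilla describes, build a digital representation of each child, then match that representation to "the right experiences," is the same profile-and-rank pattern behind Netflix-style recommenders. A minimal sketch follows; the feature names, activities, and similarity measure (cosine) are all hypothetical choices for illustration, not AltSchool's actual platform code.

```python
# Illustrative sketch of a "representation -> matching" loop:
# represent a learner as a feature vector, then rank candidate
# activities by cosine similarity, Netflix-style. All names and
# numbers here are hypothetical inventions for illustration.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# hypothetical feature order: [reading, maths, hands_on, group_work]
child = [0.9, 0.3, 0.8, 0.2]

activities = {
    "poetry_workshop": [1.0, 0.0, 0.2, 0.5],
    "robotics_build":  [0.1, 0.7, 1.0, 0.6],
    "silent_reading":  [1.0, 0.1, 0.0, 0.0],
}

# rank activities by similarity to the child's profile
ranked = sorted(activities,
                key=lambda name: cosine(child, activities[name]),
                reverse=True)
print(ranked)
```

The sketch also makes the critiques in this piece concrete: everything the system can "personalise" on must first be captured as data about the child, and the ranking reflects whatever features and weights the platform's designers chose to encode.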


After watching a video of what a day in their middle school looks like, I was struck by the contrasting realities of the classroom and the platform. Each class has a maximum of 24 students with two teachers and the life of the classroom looks vibrant: the teachers conference with students about their work, an expert from Stanford helps the students carry out a design project, and the space of the school looks comfortable and inviting.


Yet, the bright spaces and two teachers per class are not the business model – the platform – that Ventilla wants to sell, because “teachers are expensive.”

The flexibility that AltSchool offers to wealthy parents in the expensive neighbourhoods of its lab schools in San Francisco and Brooklyn won’t ‘scale’ when the platform is sold. In its lab schools, AltSchool allows flexible pick-up and drop-off times for parents via their smartphones, and even accommodates surprise vacations to Hong Kong should they arise. According to the New York Times, “A tablet with your child’s lesson plans would go with you, and he or she could study and work wherever you are. AltSchool’s plan, ultimately, after years of data-keeping, self-assessment and reassessment, is to take its best practices and technological innovations to the universe of public schools.”

The same NYT article contrasts AltSchool with the “boot-camp model of so many of the city’s charter schools, where learning can too easily be divorced from pleasure, and fear rather than joy is the operative motivator.” But what will AltSchool – the platform – look like when it is exported to public schools, where the cost of teachers and space matter? Given that “AltSchool’s losses are piling up as it spends at a pace of about $40 million per year,” it’s not hard to imagine that the more desirable aspects of AltSchool’s flexibility will only be available for purchase by the wealthy.

As one example of how the implementation of the platform might carry negative consequences in public schools, consider AltSchool’s use of cameras for surveillance. According to Business Insider, “Cameras are also mounted at eye level for kids, so teachers can review successful lessons and ‘the steps leading up to those ‘ah-ha’ moments,’ head of school Kathleen Gibbons said. Some children use them as confessionals, sharing their secrets with the camera.”"...


For full post, please see: 


Regulating Blockchain and Distributed Ledger Identity // K(NO)W Identity Conference

"As governments and companies move from research and development to deployment of distributed ledgers and blockchains for identity, they will need to consider the legal and regulatory implications of, and risks presented by, these transformative technologies. In this session we’ll consider different approaches to regulation, and ask the fundamental question of whether self-sovereign identity and regulation can co-exist. Filmed at the 2017 K(NO)W Identity Conference - Washington, D.C."

Juan Llanos - Senior Advisor, One World Identity
Steve Ehrlich - Lead Analyst for Emerging Technologies, Spitzberg Partners
Alan Cohen - Of Counsel, Steptoe & Johnson LLP
Jamie Smith - Global Chief Communications and Marketing Officer, Bitfury Group
Andrea Tinianow - Director, Global Delaware 




YouTube Is Addressing Its Massive Child Exploitation Problem


By Charlie Warzel

"Across YouTube, an unsettling trend has emerged: Accounts are publishing disturbing and exploitative videos aimed at and starring children in compromising, predatory, or creepy situations — and racking up millions of views.


BuzzFeed News has found a number of videos, many of which appear to originate from eastern Europe, that feature young children, often in revealing clothing, placed in vulnerable scenarios. In many instances, they're restrained with ropes or tape and sometimes crying or in visible distress. In other videos, the children are kidnapped, or made to 'play doctor' with an adult. The videos frequently include gross-out themes like injections, eating feces, or needles. Many come from YouTube 'verified' channels and have tens of millions of views. After BuzzFeed News brought these videos to the attention of YouTube, they were removed.


In recent weeks, YouTube has faced criticism after reports about unsettling animated videos and bizarre content, which were aimed at children using family-friendly characters, and some of which was not caught by YouTube Kids’ filters. All of the videos reviewed by BuzzFeed News were live-action, ostensibly set up by adults and featuring children. Taken together, they make up a vast, disturbing, and wildly popular universe of videos that — until recently — existed unmoderated by the platform."...


For full post, please see: 


Automating Inequality (by Virginia Eubanks) // Macmillan


"The State of Indiana denies one million applications for healthcare, food stamps and cash benefits in three years—because a new computer system interprets any mistake as “failure to cooperate.” In Los Angeles, an algorithm calculates the comparative vulnerability of tens of thousands of homeless people in order to prioritize them for an inadequate pool of housing resources. In Pittsburgh, a child welfare agency uses a statistical model to try to predict which children might be future victims of abuse or neglect.

Since the dawn of the digital age, decision-making in finance, employment, politics, health and human services has undergone revolutionary change. Today, automated systems—rather than humans—control which neighborhoods get policed, which families attain needed resources, and who is investigated for fraud. While we all live under this new regime of data, the most invasive and punitive systems are aimed at the poor.


In Automating Inequality, Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile.


The U.S. has always used its most cutting-edge science and technology to contain, investigate, discipline and punish the destitute. Like the county poorhouse and scientific charity before them, digital tracking and automated decision-making hide poverty from the middle-class public and give the nation the ethical distance it needs to make inhumane choices: which families get food and which starve, who has housing and who remains homeless, and which families are broken up by the state. In the process, they weaken democracy and betray our most cherished national values.


This deeply researched and passionate book could not be more timely." 


iPhones and Children Are a Toxic Pair, Say Two Big Apple Investors // Wall Street Journal


By David Benoit

"The iPhone has made Apple Inc. and Wall Street hundreds of billions of dollars. Now some big shareholders are asking at what cost, in an unusual campaign to make the company more socially responsible.


A leading activist investor and a pension fund are saying the smartphone maker needs to respond to what some see as a growing public-health crisis of youth phone addiction.


Jana Partners LLC and the California State Teachers’ Retirement System, or Calstrs, which control about $2 billion of Apple shares, sent a letter to Apple on Saturday urging it to develop new software tools that would help parents control and limit phone use more easily and to study the impact of overuse on mental health.

The Apple push is a preamble to a new several-billion-dollar fund Jana is seeking to raise this year to target companies it believes can be better corporate citizens. It is the first instance of a big Wall Street activist seeking to profit from the kind of social-responsibility campaign typically associated with a small fringe of investors.


Adding splash, rock star Sting and his wife, Trudie Styler, will be on an advisory board along with Sister Patricia A. Daly, a nun who successfully fought Exxon Mobil Corp. over environmental disclosures, and Robert Eccles, an expert on sustainable investing.


The Apple campaign would be unusual for an activist like Jana, which normally urges companies to make financial changes. But the investors believe that Apple’s highflying stock could be hurt in coming decades if it faces a backlash and that proactive moves could generate goodwill and keep consumers loyal to Apple brands.


“Apple can play a defining role in signaling to the industry that paying special attention to the health and development of the next generation is both good business and the right thing to do,” the shareholders wrote in the letter, a copy of which was reviewed by The Wall Street Journal. “There is a developing consensus around the world including Silicon Valley that the potential long-term consequences of new technologies need to be factored in at the outset, and no company can outsource that responsibility.”


Obsessive teenage smartphone usage has sparked a debate among academics, parents and even the people who helped create the iPhone. 


Some have raised concerns about increased rates in teen depression and suicide and worry that phones are replacing old-fashioned human interaction. It is part of a broader re-evaluation of the effects on society of technology companies such as Google and Inc. and social-media companies like Facebook Inc. and Snapchat owner Snap Inc., which are facing questions about their reach into everyday life.


Apple hasn’t offered any public guidance to parents on how to manage children’s smartphone use or taken a position on at what age they should begin using iPhones.


Apple and its rivals point to features that give parents some measure of control. Apple, for instance, gives parents the ability to choose which apps, content and services their children can access.


The basic idea behind socially responsible investing is that good corporate citizenship can also be good business. Big investors and banks, including TPG, UBS Group AG and Goldman Sachs Group Inc., are making bets on socially responsible companies, boosting what they see as good actors and avoiding bad ones. Big-name activists increasingly view bad environmental, social or governance policies as red flags. Jana plans to go further, putting its typical tools to work to drive change that may not immediately pay off.


Apple is an ambitious first target: The combined Jana-Calstrs stake is relatively small given Apple’s nearly $900 billion market value. Still, in recent years Apple has twice faced activists demanding it pare its cash holdings, and both times the company ceded some ground.


Chief Executive Tim Cook has led Apple’s efforts to be a more socially responsible company, for instance on environmental and immigration issues, and said in an interview with the New York Times last year that Apple has a “moral responsibility” to help the U.S. economy.


Apple has shown willingness to use software to address potentially negative consequences of phone usage. Amid rising concerns about distracted driving, the company last year updated its software with a “do not disturb while driving” feature, which enables the iPhone to detect when someone is behind the wheel and automatically silence notifications.


The iPhone is the backbone of a business that generated $48.35 billion in profit in fiscal 2017. It helped turn Apple into the world’s largest publicly listed company by market value, and anticipation of strong sales of its latest model, the iPhone X, helped its stock rise 50% in the past year. Apple phones made up 43% of U.S. smartphones in use in 2016, according to comScore, and an estimated 86 million Americans over age 13 own an iPhone.


Jana and Calstrs are working with Jean M. Twenge of San Diego State University, who chronicled the problem of what she has dubbed the “iGen” in a book that was previewed in a widely discussed article in the Atlantic magazine last fall, and with Michael Rich of Harvard Medical School and Boston Children’s Hospital, known as “the mediatrician” for his work on the impact of media on children.


The investors believe both the content and the amount of time spent on phones need to be tailored to youths, and they are raising concern about the public-health effects of failing to act. They point to research from Ms. Twenge and others about a “growing body of evidence” of “unintentional negative side effects,” including studies showing concerns from teachers. That is one reason Calstrs was eager to support the campaign, according to the letter.


The group wants Apple to help find solutions to questions like what is optimal usage and to be at the forefront of the industry’s response—before regulators or consumers potentially force it to act.


The investors say Apple should make it easier and more intuitive for parents to set up usage limits, which could head off any future moves to proscribe smartphones.


The question is “How can we apply the same kind of public-health science to this that we do to, say, nutrition?” Dr. Rich said in an interview. “We aren’t going to tell you never go to Mickey D’s, but we are going to tell you what a Big Mac will do and what broccoli will do.”


—Tripp Mickle and Betsy Morris contributed to this article. Write to David Benoit at"


For full post, see: 

Scooped by Roxana Marachi, PhD!

Industry Giants Fail to Tackle Child Labour Allegations In Cobalt Battery Supply Chains // Amnesty International


Are smartphone and electric vehicle companies doing enough to cut human rights abuses out of their cobalt supply chains?


  • "Survey of electronics and car companies shows major blind spots in supply chains
  • Apple is the industry leader for responsible cobalt sourcing – but the bar is low
  • Microsoft, Lenovo and Renault have made least progress

Major electronics and electric vehicle companies are still not doing enough to stop human rights abuses entering their cobalt supply chains, almost two years after an Amnesty International investigation exposed how batteries used in their products could be linked to child labour in the Democratic Republic of Congo (DRC), the organization said today.


A new report, Time to Recharge, ranks industry giants including Apple, Samsung Electronics, Dell, Microsoft, BMW, Renault and Tesla on how much they have improved their cobalt sourcing practices since January 2016. It finds that while a handful of companies have made progress, others are still failing to take even basic steps like investigating supply links in the DRC."...


For full story, please see: 


Scooped by Roxana Marachi, PhD!

Six Ways (And Counting) That Big Data Systems Are Harming Society // The Conversation


By Joanna Redden

"There is growing consensus that with big data comes great opportunity, but also great risk.


But these risks are not getting enough political and public attention. One way to better appreciate the risks that come with our big data future is to consider how people are already being negatively affected by uses of it. At Cardiff University’s Data Justice Lab, we decided to record the harms that big data uses have already caused, pulling together concrete examples of harm that have been referenced in previous work so that we might gain a better big picture appreciation of where we are heading.


We did so in the hope that such a record will generate more debate and intervention from the public into the kind of big data society, and future, we want. The following examples are a condensed version of our recently published Data Harm Record, a running record, to be updated as we learn about more cases.

1. Targeting based on vulnerability

With big data comes new ways to socially sort with increasing precision. By combining multiple forms of data sets, a lot can be learned. This has been called “algorithmic profiling” and raises concerns about how little people know about how their data is collected as they search, communicate, buy, visit sites, travel, and so on.


Much of this sorting goes under the radar, although the practices of data brokers have been getting attention. In her testimony to the US Congress, World Privacy Forum’s Pam Dixon reported finding data brokers selling lists of rape victims, addresses of domestic violence shelters, sufferers of genetic diseases, sufferers of addiction and more.

2. Misuse of personal information

Concerns have been raised about how credit card companies are using personal details like where someone shops or whether or not they have paid for marriage counselling to set rates and limits. One study details the case of a man who found his credit rating reduced because American Express determined that others who shopped where he shopped had a poor repayment history.


This event, in 2008, was an early big data example of “creditworthiness by association” and is linked to ongoing practices of determining value or trustworthiness by drawing on big data to make predictions about people.

3. Discrimination

As corporations, government bodies and others make use of big data, it is key to know that discrimination can happen and is happening – both unintentionally and intentionally. This can happen as algorithmically driven systems offer, deny or mediate access to services or opportunities to people differently.


Some are raising concerns about how new uses of big data may negatively influence people’s ability to get housing or insurance – or to access education or get a job. A 2017 investigation by ProPublica and Consumer Reports showed that minority neighbourhoods pay more for car insurance than white neighbourhoods with the same risk levels. ProPublica also shows how new prediction tools used in courtrooms for sentencing and bonds “are biased against blacks”. Others raise concerns about how big data processes make it easier to target particular groups and discriminate against them."...


For full post, see: 

Scooped by Roxana Marachi, PhD!

Google Internet Balloon Crashes in Meru, Kenya


By David Muchui

"A Google high-altitude balloon meant to provide high-speed internet in Kenya crashed at Nthambiro in Meru on Friday night, causing a scare among residents. Residents gathered in a miraa farm where the balloon-powered device fell to get a glimpse of it before police officers took it away.


The device from Google's balloon-powered high-speed internet service known as Project Loon is part of 10 balloons deployed in July 2017 for testing in Nakuru, Nanyuki, Nyeri and Marsabit.


Some residents expressed concern, complaining of headaches after approaching the device. Igembe South OCPD Jane Nyakeruma said no injuries or damages were reported as a result of the crash.


"The device from Project Loon indicates it fell after its expiry period of six months. No one is yet to claim the device," Ms Nyakeruma said.


Super Pressured


According to the developer, Project Loon is a global network of high-altitude balloons which ascend like weather balloons until they reach the stratosphere, where they sail about 20 km (65,000 feet) above the earth.


"The Loon balloons are super pressured, allowing them to last much longer. Loon balloons are also unique in that they can sail the wind to travel where they need to go. They can coordinate with other balloons as a flock and their electronics are entirely solar powered," it states."...


For main post, see here: 


Scooped by Roxana Marachi, PhD!

Consumer and Privacy Groups Demand Action on Toys that Spy on Children // CCFC


One Year after Complaint to Federal Trade Commission, Dangerous Toys Are Still on the Market

WASHINGTON—December 18, 2017— Consumer and privacy groups are calling on the Federal Trade Commission (FTC) and companies that sell dangerous internet-connected toys and smartwatches to act to protect children from serious safety and security threats they pose. One year ago, advocacy groups filed a complaint with the FTC about two internet-connected toys, My Friend Cayla and i-Que Intelligent Robot, which capture, record, and analyze what children say and respond to them. The complaint alleged that the manufacturer of these products, Genesis Toys, and the technology provider, Nuance Communications, unfairly and deceptively collect, use, and share audio files of children's voices without providing adequate notice or obtaining verified parental consent, and fail to prevent strangers and predators from covertly eavesdropping on children's private conversations, creating a risk of stalking and physical danger. 


Several major retailers have ceased sales of the toys in response, with the exception of Amazon.


The advocacy groups worked in concert with the Norwegian Consumer Council (NCC), whose research uncovered the problems with Cayla and i-Que, in bringing the danger of these toys to the attention of the FTC and retailers.  In response, Germany has banned the toys as spying devices, and French authorities have demanded information from Genesis and Nuance on the threat posed to children.  Major U.S. retailers like Target and ToysRUs responded by stopping sales of the toys. Walmart informed the groups that it would stop selling the toys, but they were listed on the company’s website this fall until one of the groups, USPIRG, highlighted the dangers of My Friend Cayla in its annual Trouble in Toyland report.


Amazon has taken no action to stop sales of the toys on its site, despite repeated requests from the advocates. The FTC has announced no action in response to the complaint.  


More recently, in October, advocacy groups sent a letter to the Federal Trade Commission asking it to act to protect kids from the danger of smartwatches which are marketed to allow parents to track the location of and stay in touch with very young children.  NCC research showed that the watches, sold in the U.S. under the brands Caref and SeTracker, actually put children at risk—they are unreliable, data is stored unsafely, and they can easily be overtaken by a hacker who might prey upon the child. These products also remain for sale on Amazon, despite the advocates’ request that the company cease sales, and the FTC has yet to respond to the advocates about their concerns.


U.S. groups calling on the FTC and Amazon to take action today to protect children from these dangerous products are the Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Consumer Action, Consumer Federation of America, Consumer Watchdog, EPIC, Public Citizen’s Commercial Alert Program, and USPIRG.

“Children share intimate details about themselves with their dolls and toys,” said CCFC’s Executive Director, Josh Golin. “My Friend Cayla and i-Que are unsafe devices which put that sensitive information at risk. We applaud the retailers which have stopped selling these toys, and we urge Amazon to put children’s welfare first and do the same.”


“Neither My Friend Cayla nor i-Que Intelligent Robot should be on anyone’s holiday shopping list,” said Susan Grant, Director of Consumer Protection and Privacy at CFA. “Parents should be able to count on responsible retailers and the federal government to keep products that threaten their children’s privacy and security from continuing to be sold.” 


"This year, the state PIRGs added Cayla, as a representative of all interconnected toys and apps targeted at young kids, to our 32nd annual Trouble In Toyland list of potentially hazardous toys," said U.S. PIRG Consumer Program Director Ed Mierzwinski. "Parents and toy-givers need to understand that privacy-invasive toys pose real threats to children, just as toys that pose choking or ingestion hazards or contain excessive levels of toxic lead and other chemicals do."


“Products that connect to the internet are concerning at the best of times, because these products can collect data about ourselves and our daily lives and it’s difficult to control how that data is used by companies. Toys that connect to the internet and track and collect data about kids should be banned from store shelves,” said Linda Sherry, Director of National Priorities for Consumer Action.


“The FTC’s failure to respond to our complaints is appalling.  It shows how empty Acting Chair Maureen Ohlhausen’s promises on another issue to protect Internet privacy now that net neutrality rules have been rescinded are likely to be,” said John M. Simpson, Consumer Watchdog’s Privacy and Technology Project Director. “Responsible retailers won’t sell these invasive toys and responsible advertising companies — unlike Google — won’t advertise them.”


“Connected toys raise serious privacy concerns,” said Marc Rotenberg, President of EPIC.


“Kids should play with their toys and their friends, and not with surveillance devices dressed as dolls.”



Scooped by Roxana Marachi, PhD!

Dr. Catherine Steiner-Adair on Screen Time Concerns Related to Child Development 

"Dr. Catherine Steiner-Adair is a Clinical Psychologist, Consultant, Speaker, and Author of The Big Disconnect: Protecting Childhood and Family Relationships in the Digital Age. This video clip is from a longer talk she gave in Framingham, MA on June 10th, 2015.


For viewers interested in additional information and research related to screen time concerns, see slides from the NAACP Conference on the ESSA and Civil Rights in Education in San Jose, CA, August 19th, 2017: 'Health and Safety Research Gaps in Policies and Practices Integrating Emerging Technologies for Young Children.'"


To view video on YouTube:  

Scooped by Roxana Marachi, PhD!

School Ditches Online Learning Program After Parents Revolt // AP News 


"CHESHIRE, Conn. (AP) — The fast-growing online platform was built with help from Facebook engineers and designed to help students learn at their own speed. But it’s been dropped because parents in this Connecticut suburb revolted, saying there was no need to change what’s worked in a town with a prized reputation for good schools.

The Summit Learning program, developed by a California charter school network, has signed up over 300 schools to use its blend of technology with go-at-your-own-pace personalized learning. 


Cheshire school administrators and some parents praised the program, but it faced criticism from others who said their children were spending too much time online, some content was inappropriate, and students were not getting enough direct guidance."... 

Scooped by Roxana Marachi, PhD!

Data Mining Program Designed to Predict Child Abuse Proves Unreliable, DCFS Says // Chicago Tribune


"The Illinois Department of Children and Family Services is ending a high-profile program that used computer data mining to identify children at risk for serious injury or death after the agency's top official called the technology unreliable.


"We are not doing the predictive analytics because it didn't seem to be predicting much," DCFS Director Beverly "B.J." Walker told the Tribune.

The $366,000 Rapid Safety Feedback program was central to reforms promised by Walker's predecessor, George Sheldon, who took office in 2015 following a series of child deaths and other problems.


Two Florida firms — the nonprofit Eckerd Connects and its for-profit partner, Mindshare Technology — mined electronic DCFS files and assigned a score of 1 to 100 to children who were the subject of an abuse allegation to the agency hotline. The algorithms rated the children's risk of being killed or severely injured during the next two years, according to DCFS public statements.


But caseworkers were alarmed and overwhelmed by alerts as thousands of children were rated as needing urgent protection. More than 4,100 Illinois children were assigned a 90 percent or greater probability of death or injury, according to internal DCFS child-tracking data released to the Tribune under state public records laws.


And 369 youngsters, all under age 9, got a 100 percent chance of death or serious injury in the next two years, the Tribune found.


At the same time, high-profile child deaths kept cropping up with little warning from the predictive analytics software, DCFS officials told the Tribune."... 

Scooped by Roxana Marachi, PhD!

Five Risks Posed by the Increasing Misuse of Technology in Schools // EdSurge 


"At any given moment in the day, I am attached to my cellphone, my iPad or my computer. As a writer, I was an early convert to the computer. I began writing on a TRS-80 from Radio Shack in 1983 on wonderful writing software called WordPerfect, which has mysteriously disappeared. I had two TRS-80s, because one of them was always in repair. I love the computer for many reasons. I no longer had to white out my errors; I no longer had to retype an entire article because of errors. My handwriting is almost completely illegible. The computer is a godsend for a writer and editor.


I have seen teachers who use technology to inspire inquiry, research, creativity and excitement. I understand what a powerful tool it is.


But it is also fraught with risk, and the tech industry has not done enough to mitigate the risks.

Risk One: The Threat to Student Privacy

Risk one is the invasion of student privacy, utilizing data collected by tech companies when students are online. The story of inBloom is a cautionary tale. Funded in 2014 with $100 million from the Gates Foundation and the Carnegie Corporation, inBloom intended to collect massive amounts of personally identifiable student data and use it to “personalize” learning to each student.

Parents became alarmed by the plan to put their children’s data into a cloud and mobilized in communities and states to stop inBloom. They were not nearly as impressed by the possibilities of data-driven instruction as the entrepreneurs promoting inBloom. The parents won. State after state dropped out, and inBloom collapsed.


Though inBloom is dead, the threat to student privacy is not. Every time a student makes a keystroke, an algorithm somewhere is collecting information about that student. Will his or her data be sold? The benefit to entrepreneurs and corporations is clear; the benefit to students is not at all clear.

Risk Two: The Proliferation of 'Personalized Learning'

“Personalized learning” and “competency-based education” are both euphemisms for computer adaptive instruction. Again, a parent rebellion is brewing, because parents want their children taught by a human being, not a computer. They fear that their children will be mechanized, standardized, subjected to depersonalized instruction, not “personalized learning.” While many entrepreneurs are investing in software to capture this burgeoning industry, there is still no solid evidence that students learn more or better when taught by a computer.

Risk Three: The Extensive Use of Technology for Assessment

Technology is highly compatible with standardized testing, which encourages standardized questions and standardized answers. If the goal of learning is to teach creativity, imagination, and risk-taking, assessment should encourage students to be critical thinkers, not accepting the conventional wisdom, not checking off the right answer. Furthermore, the ability of computers to judge essays is still undeveloped and may remain so. Professor Les Perelman at MIT demonstrated that computer-graded essays can get high scores for gibberish and that computers lack the “intelligence” to reason or understand what matters most in writing.

Risk Four: The Cyber Charter School

Most such virtual schools, or cyber charters, are operated for profit; the largest of them is a chain called K12 Inc., which is listed on the New York Stock Exchange. Its executives are paid millions of dollars each year. Its biggest initial investor was the junk bond king Michael Milken. Numerous articles in publications such as the New York Times and the Washington Post have documented high student attrition, low teacher wages, low student test scores and low graduation rates. Yet the company is profitable.


The most controversial school in Ohio is the Electronic Classroom of Tomorrow (ECOT), whose owner makes political contributions to office-holders and has collected about $1 billion in taxpayer dollars since 2000. ECOT reputedly has the lowest graduation rate in the nation. The state of Ohio recently won a lawsuit requiring ECOT to return $60 million because of inflated enrollment figures. Studies of cyber charters have concluded that students learn very little when enrolled in them. There may be students who have legitimate reasons to learn at home online, but these “schools” should not receive the same tuition as brick-and-mortar schools that have certified teachers, custodians, libraries, the costs of physical maintenance, playgrounds, teams, school nurses and other necessities.

Risk Five: Money in Edtech

The tech industry wields its money in dubious ways to peddle its product. The market for technology is burgeoning, and a large industry is hovering around the schools, eager for their business. In November 2017, the New York Times published an exposé of the business practices of the tech industry in Baltimore County. It documented payola, influence peddling and expensive wining and dining of school officials, which resulted in nearly $300 million of spending on computers that received low ratings from evaluators and that were soon obsolescent. This, in a district that has neglected the basic maintenance of some of its buildings.


The greatest fear of parents and teachers is that the tech industry wants to replace teachers with computers. They fear that business leaders want to cut costs by replacing expensive humans with inexpensive machines that never require health care or a pension. They believe that education requires human interaction. They prefer experience, wisdom, judgment, sensibility, sensitivity and compassion in the classroom to the cold, static excellence of a machine.


I agree with them."


Diane Ravitch is a Research Professor of Education at New York University and a historian of education. She is the Founder and President of the Network for Public Education (NPE) and blogs at


Originally published here: 

Scooped by Roxana Marachi, PhD!

The Disruption Machine: What the Gospel of Innovation Gets Wrong // The New Yorker


By Jill Lepore

[Illustration by Brian Stauffer]






Scooped by Roxana Marachi, PhD!

Name+DOB+SSN = FAFSA Data Gold Mine // Krebs on Security


By Brian Krebs

"KrebsOnSecurity has sought to call attention to online services which expose sensitive consumer data if the user knows a handful of static details about a person that are broadly for sale in the cybercrime underground, such as name, date of birth, and Social Security Number. Perhaps the most eye-opening example of this is on display at, the Web site set up by the U.S. Department of Education for anyone interested in applying for federal student financial aid."...


For full post, please visit: 

Scooped by Roxana Marachi, PhD!

Smart Cities: Utopian Vision, Dystopian Reality // Privacy International (October 2017)

Published by Privacy International
"The smart city market is booming. National and local governments all over the world expect their cities to become more efficient, more sustainable, cleaner and safer by integrating technology, increasing data generation and centralizing data to provide better services. From large multinationals to small start-ups, companies want their slice of the multi-billion dollars per year pie of municipal budgets and long-term government contracts.

But do smart cities even exist? And are our cities actually getting smarter? Or are smart cities a mere pretext to collect and process more data? This piece examines the reality of the smart city market beyond the ‘smart’ marketing term and existing smart city initiatives. We also consider the consequences and significant concerns emerging in terms of privacy and other fundamental human rights."


To download full report, click title above or here: 


For main page where report is featured: 


For related post on Medium, see:  


Related Privacy 101s: 

