Educational Psychology & Technology
Scooped by Roxana Marachi, PhD onto Educational Psychology & Technology

Neuroscience & the Classroom: Making Connections - A Course for K-12 Teachers // Annenberg

"Insights drawn from neuroscience not only provide educators with a scientific basis for understanding some of the best practices in teaching, but also offer a new lens through which to look at the problems teachers grapple with every day. By gaining insights into how the brain works—and how students actually learn—teachers will be able to create their own solutions to the classroom challenges they face and improve their practice." 

 

 

For full website, click on title above or here: 
http://www.learner.org/courses/neuroscience

Educational Psychology & Technology
This curated collection includes news, resources, and research related to Educational Psychology and/or Technology. The page also serves as a research tool to organize online content. The grey funnel-shaped icon at the top allows for searching by keyword. For research more specific to tech and screen time, please see: http://bit.ly/screen_time. For additional Educator Resources, please visit http://EduResearcher.com.
Rescooped by Roxana Marachi, PhD from Screen Time, Wireless, and EMF Research

Health and Safety Research Gaps in Policies and Practices Integrating Emerging Technologies for Young Children 

Links are as follows in order of the slides: 

http://www.commercialfreechildhood.org/action/tell-fisher-price-no-ipad-bouncy-seats-infants 

 

The Silicon Valley Billionaires Remaking America's Schools 

https://www.nytimes.com/2017/06/06/technology/tech-billionaires-education-zuckerberg-facebook-hastings.html 

 

Dr. Catherine Steiner-Adair
Clinical Psychologist and Research Associate at Harvard Medical School https://childmind.org/bio/catherine-steiner-adair/ 

 

Video link may be viewed at: https://youtu.be/pjnFPo_mk6s 

 

Carter B, Rees P, Hale L, Bhattacharjee D, Paradkar MS. Association Between Portable Screen-Based Media Device Access or Use and Sleep Outcomes: A Systematic Review and Meta-analysis.JAMA Pediatr. 2016 Oct 31. doi: 10.1001/jamapediatrics.2016.2341. [Epub ahead of print] https://www.ncbi.nlm.nih.gov/pubmed/27802500?dopt=Abstract 

 

Screen Time Hurts More Than Kids' Eyes

http://www.healthline.com/health-news/screen-time-hurts-more-than-kids-eyes-101215 

 

New Media Consortium / Consortium for School Networking Horizon Report 
http://cdn.nmc.org/media/2016-nmc-cosn-horizon-report-k12-EN.pdf 

 

"American Revolution 2.0: How Education Innovation is Going to Revitalize America and Transform the U.S. Economy"  http://sco.lt/5JnF7B 

 

"Preschool is Good For Children But It's Expensive So Utah Is Offering It Online" https://www.washingtonpost.co m/local/education/preschool-is- good-for-poor-kids-but-its- expensive-so-utah-is-offering-it- online/2015/10/09/27665e52- 5e1d-11e5-b38e- 06883aacba64_story.html  

 

Philanthropy Roundtable's "Blended Learning: Wise Givers Guide to Supporting Tech-Assisted Learning"

http://www.philanthropyroundtable.org/file_uploads/Blended_Learning_Guidebook.pdf (Formerly chaired by B. DeVos)  

 

CyberCharters Have Overwhelming Negative Impact 

 

Ma, J., van den Heuvel, M., Maguire, J., Parkin, P., Birken, C. (2017). Is handheld screen time use associated with language delay in infants? Presented at the Pediatric Academic Societies Meeting, San Francisco, CA. https://www.sciencedaily.com/releases/2017/05/170504083141.htm  

 

Jonathan Rochelle’s GSV/ASU PRIMETIME Keynote Speech pitching Google Cardboard for children in schools as proxy for actual field trips: https://www.youtube.com/watch?v=YNqYMI89umE 

 

Scientists Urge Google to Stop Untested Microwave Radiation of Children's Eyes and Brains with Virtual Reality Devices in Schools  http://sco.lt/8ZY5Zp // https://drive.google.com/file/d/0B12B4w0bwyQ_bzRTSUtfb2lORXM/view 

Asus product manual: http://dlcdnet.asus.com/pub/ASUS/ZenFone/ZE550ML/e10509_ze550ml_ze551ml_em_0601.pdf

 

Telecom Industry Liability and Insurance Information 

http://sco.lt/6MrkcT 

 

National Association for Children and Safe Technology - iPad Information 

 

For infant/pregnancy related safety precautions, please visit http://BabySafeProject.org 

 

194 Signatories (physicians, scientists, educators) on Joint Statement on Pregnancy and Wireless Radiation http://sco.lt/7C2N3B 

 

Article screenshot from France: "Portables. L'embrouille des ondes électromagnétiques" ("Cell phones: the muddle over electromagnetic waves")

http://sco.lt/68rtCb

 

Wireless Phone Radiation Risks and Public Policy

http://bit.ly/wirelessradiationUCLA102215 

 

"Show The Fine Print" 
http://ShowTheFinePrint.org 

 

Scientist petition calls for greater protective measures for children and pregnant women, cites need for precautionary health warnings, stronger regulation of electromagnetic fields, creation of EMF free zones, and media disclosures of experts’ financial relationships with industry when citing their opinions regarding the safety of EMF-emitting technologies. Published in European Journal of Oncology http://sco.lt/8SDDd3 

 

International Agency for Research on Cancer Classifies Radiofrequency Electromagnetic Fields as Possibly Carcinogenic to Humans (2011)

 

For more on source of funding research, see: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1797826/ and http://ascopubs.org/doi/abs/10.1200/jco.2008.21.6366 

 

Maryland State Children’s Environmental Health and Protection Advisory Council // Public Testimony https://youtu.be/8sCV1l7IfDY?t=7m15s

 

"Until now, radiation from cell towers has not been considered a risk to children, but a recent study raises new questions about possible long-term, harmful effects."  http://sco.lt/5tm5dx 

 

For further reading, please see Captured Agency report published by Harvard’s Center for Ethics http://sco.lt/4qwS2r  or https://ethics.harvard.edu/files/center-for-ethics/files/capturedagency_alster.pdf 

 

Updates/posts/safety information on Virtual Reality:

http://www.scoop.it/t/emf-wireless-radiation?q=virtual 

 

Environmental Health Trust Virtual Reality Radiation Absorption Slides 

https://ehtrust.org/wp-content/uploads/Virtual-reality-Slides-1.pdf 

 

Healthy Kids in a Digital World:

http://commercialfreechildhood.org/healthykidsdigitalworld 

 

National Association for Children and Safe Technology http://nacst.org 

 

Doctors’ Letters on WiFi in Schools // 154-page compilation

https://drive.google.com/file/d/0B8Oub2Nx5eSLNEthQmNlb3ZGcTQ/view 

 

Insurance and Liability Disclaimers/Information from Telecom Companies https://ehtrust.org/wp-content/uploads/Telecom-10-K-Liability-and-Insurance-Companies-Slides-EHT-6-2016.pdf 

  

Most of the documents and articles embedded within the presentation above are searchable/accessible on the following page: http://bit.ly/screen_time
_______________________________

The document above is a PDF with live links; the links are also listed above for easier access. To download the original file, please click on the title or arrow above. It is a large file, so it may take several minutes to download.

 
Scooped by Roxana Marachi, PhD

Automating Inequality (by Virginia Eubanks) // Macmillan

"The State of Indiana denies one million applications for healthcare, foodstamps and cash benefits in three years—because a new computer system interprets any mistake as “failure to cooperate.” In Los Angeles, an algorithm calculates the comparative vulnerability of tens of thousands of homeless people in order to prioritize them for an inadequate pool of housing resources. In Pittsburgh, a child welfare agency uses a statistical model to try to predict which children might be future victims of abuse or neglect.

Since the dawn of the digital age, decision-making in finance, employment, politics, health and human services has undergone revolutionary change. Today, automated systems—rather than humans—control which neighborhoods get policed, which families attain needed resources, and who is investigated for fraud. While we all live under this new regime of data, the most invasive and punitive systems are aimed at the poor.

 

In Automating Inequality, Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile.

 

The U.S. has always used its most cutting-edge science and technology to contain, investigate, discipline and punish the destitute. Like the county poorhouse and scientific charity before them, digital tracking and automated decision-making hide poverty from the middle-class public and give the nation the ethical distance it needs to make inhumane choices: which families get food and which starve, who has housing and who remains homeless, and which families are broken up by the state. In the process, they weaken democracy and betray our most cherished national values.

 

This deeply researched and passionate book could not be more timely."

 

https://us.macmillan.com/automatinginequality/virginiaeubanks/9781250074317/ 

Scooped by Roxana Marachi, PhD

iPhones and Children Are a Toxic Pair, Say Two Big Apple Investors // Wall Street Journal

By David Benoit

"The iPhone has made Apple Inc. and Wall Street hundreds of billions of dollars. Now some big shareholders are asking at what cost, in an unusual campaign to make the company more socially responsible.

 

A leading activist investor and a pension fund are saying the smartphone maker needs to respond to what some see as a growing public-health crisis of youth phone addiction.

 

Jana Partners LLC and the California State Teachers’ Retirement System, or Calstrs, which control about $2 billion of Apple shares, sent a letter to Apple on Saturday urging it to develop new software tools that would help parents control and limit phone use more easily and to study the impact of overuse on mental health.

The Apple push is a preamble to a new several-billion-dollar fund Jana is seeking to raise this year to target companies it believes can be better corporate citizens. It is the first instance of a big Wall Street activist seeking to profit from the kind of social-responsibility campaign typically associated with a small fringe of investors.

 

Adding splash, rock star Sting and his wife, Trudie Styler, will be on an advisory board along with Sister Patricia A. Daly, a nun who successfully fought Exxon Mobil Corp. over environmental disclosures, and Robert Eccles, an expert on sustainable investing.

 

The Apple campaign would be unusual for an activist like Jana, which normally urges companies to make financial changes. But the investors believe that Apple’s highflying stock could be hurt in coming decades if it faces a backlash and that proactive moves could generate goodwill and keep consumers loyal to Apple brands.

 

“Apple can play a defining role in signaling to the industry that paying special attention to the health and development of the next generation is both good business and the right thing to do,” the shareholders wrote in the letter, a copy of which was reviewed by The Wall Street Journal. “There is a developing consensus around the world including Silicon Valley that the potential long-term consequences of new technologies need to be factored in at the outset, and no company can outsource that responsibility.”

 

Obsessive teenage smartphone usage has sparked a debate among academics, parents and even the people who helped create the iPhone. 

 

Some have raised concerns about increased rates in teen depression and suicide and worry that phones are replacing old-fashioned human interaction. It is part of a broader re-evaluation of the effects on society of technology companies such as Google and Amazon.com Inc. and social-media companies like Facebook Inc. and Snapchat owner Snap Inc., which are facing questions about their reach into everyday life.

 

Apple hasn’t offered any public guidance to parents on how to manage children’s smartphone use or taken a position on at what age they should begin using iPhones.

 

Apple and its rivals point to features that give parents some measure of control. Apple, for instance, gives parents the ability to choose which apps, content and services their children can access.

 

The basic idea behind socially responsible investing is that good corporate citizenship can also be good business. Big investors and banks, including TPG, UBS Group AG and Goldman Sachs Group Inc. are making bets on socially responsible companies, boosting what they see as good actors and avoiding bad ones. Big-name activists increasingly view bad environmental, social or governance policies as red flags. Jana plans to go further, putting its typical tools to work to drive change that may not immediately pay off.

 

Apple is an ambitious first target: The combined Jana-Calstrs stake is relatively small given Apple’s nearly $900 billion market value. Still, in recent years Apple has twice faced activists demanding it pare its cash holdings, and both times the company ceded some ground.

 

Chief Executive Tim Cook has led Apple’s efforts to be a more socially responsible company, for instance on environmental and immigration issues, and said in an interview with the New York Times last year that Apple has a “moral responsibility” to help the U.S. economy.

 

Apple has shown willingness to use software to address potentially negative consequences of phone usage. Amid rising concerns about distracted driving, the company last year updated its software with a “do not disturb while driving” feature, which enables the iPhone to detect when someone is behind the wheel and automatically silence notifications.

 

The iPhone is the backbone of a business that generated $48.35 billion in profit in fiscal 2017. It helped turn Apple into the world’s largest publicly listed company by market value, and anticipation of strong sales of its latest model, the iPhone X, helped its stock rise 50% in the past year. Apple phones made up 43% of U.S. smartphones in use in 2016, according to comScore, and an estimated 86 million Americans over age 13 own an iPhone.

 

Jana and Calstrs are working with Jean M. Twenge of San Diego State University, who chronicled the problem of what she has dubbed the “iGen” in a book that was previewed in a widely discussed article in the Atlantic magazine last fall, and with Michael Rich of Harvard Medical School and Boston Children’s Hospital, known as “the mediatrician” for his work on the impact of media on children.

 

The investors believe both the content and the amount of time spent on phones need to be tailored to youths, and they are raising concern about the public-health effects of failing to act. They point to research from Ms. Twenge and others about a “growing body of evidence” of “unintentional negative side effects,” including studies showing concerns from teachers. That is one reason Calstrs was eager to support the campaign, according to the letter.

 

The group wants Apple to help find solutions to questions like what is optimal usage and to be at the forefront of the industry’s response—before regulators or consumers potentially force it to act.

 

The investors say Apple should make it easier and more intuitive for parents to set up usage limits, which could head off any future moves to proscribe smartphones.

 

The question is “How can we apply the same kind of public-health science to this that we do to, say, nutrition?” Dr. Rich said in an interview. “We aren’t going to tell you never go to Mickey D’s, but we are going to tell you what a Big Mac will do and what broccoli will do.”

 

—Tripp Mickle and Betsy Morris contributed to this article. Write to David Benoit at david.benoit@wsj.com"

 

For full post, see:

http://www.paywallnews.com/business/iPhones-and-Children-Are-a-Toxic-Pair--Say-Two-Big-Apple-Investors.B1Qo3eMe4z.html 

Scooped by Roxana Marachi, PhD

Industry Giants Fail to Tackle Child Labour Allegations In Cobalt Battery Supply Chains // Amnesty International

Are smartphone and electric vehicle companies doing enough to cut human rights abuses out of their cobalt supply chains?
__________

 

  • "Survey of electronics and car companies shows major blind spots in supply chains
  • Apple is the industry leader for responsible cobalt sourcing – but the bar is low
  • Microsoft, Lenovo and Renault have made least progress

Major electronics and electric vehicle companies are still not doing enough to stop human rights abuses entering their cobalt supply chains, almost two years after an Amnesty International investigation exposed how batteries used in their products could be linked to child labour in the Democratic Republic of Congo (DRC), the organization said today.

 

A new report, Time to Recharge, ranks industry giants including Apple, Samsung Electronics, Dell, Microsoft, BMW, Renault and Tesla on how much they have improved their cobalt sourcing practices since January 2016. It finds that while a handful of companies have made progress, others are still failing to take even basic steps like investigating supply links in the DRC."...

 

For full story, please see: 
https://www.amnesty.org/en/latest/news/2017/11/industry-giants-fail-to-tackle-child-labour-allegations-in-cobalt-battery-supply-chains/ 

 

Scooped by Roxana Marachi, PhD

Six Ways (And Counting) That Big Data Systems Are Harming Society // The Conversation

By Joanna Redden

"There is growing consensus that with big data comes great opportunity, but also great risk.

 

But these risks are not getting enough political and public attention. One way to better appreciate the risks that come with our big data future is to consider how people are already being negatively affected by uses of it. At Cardiff University’s Data Justice Lab, we decided to record the harms that big data uses have already caused, pulling together concrete examples of harm that have been referenced in previous work so that we might gain a better big picture appreciation of where we are heading.

 

We did so in the hope that such a record will generate more debate and intervention from the public into the kind of big data society, and future we want. The following examples are a condensed version of our recently published Data Harm Record, a running record, to be updated as we learn about more cases.

1. Targeting based on vulnerability

With big data comes new ways to socially sort with increasing precision. By combining multiple forms of data sets, a lot can be learned. This has been called “algorithmic profiling” and raises concerns about how little people know about how their data is collected as they search, communicate, buy, visit sites, travel, and so on.

 

Much of this sorting goes under the radar, although the practices of data brokers have been getting attention. In her testimony to the US Congress, World Privacy Forum’s Pam Dixon reported finding data brokers selling lists of rape victims, addresses of domestic violence shelters, sufferers of genetic diseases, sufferers of addiction and more.

2. Misuse of personal information

Concerns have been raised about how credit card companies are using personal details like where someone shops or whether or not they have paid for marriage counselling to set rates and limits. One study details the case of a man who found his credit rating reduced because American Express determined that others who shopped where he shopped had a poor repayment history.

 

This event, in 2008, was an early big data example of “creditworthiness by association” and is linked to ongoing practices of determining value or trustworthiness by drawing on big data to make predictions about people.

3. Discrimination

As corporations, government bodies and others make use of big data, it is key to know that discrimination can and is happening – both unintentionally and intentionally. This can happen as algorithmically driven systems offer, deny or mediate access to services or opportunities to people differently.

 

Some are raising concerns about how new uses of big data may negatively influence people’s abilities to get housing or insurance – or to access education or get a job. A 2017 investigation by ProPublica and Consumer Reports showed that minority neighbourhoods pay more for car insurance than white neighbourhoods with the same risk levels. ProPublica also shows how new prediction tools used in courtrooms for sentencing and bonds “are biased against blacks”. Others raise concerns about how big data processes make it easier to target particular groups and discriminate against them."...

 

For full post, see:

https://theconversation.com/six-ways-and-counting-that-big-data-systems-are-harming-society-88660 

Scooped by Roxana Marachi, PhD

Google Internet Balloon Crashes in Meru, Kenya //

By David Muchui

A Google high-altitude balloon meant to provide high-speed internet in Kenya crashed at Nthambiro in Meru on Friday night, causing a scare among residents. Residents gathered in a miraa farm where the balloon-powered device fell to get a glimpse of it before police officers took it away.

 

The device from Google's balloon-powered high-speed internet service known as Project Loon is part of 10 balloons deployed in July 2017 for testing in Nakuru, Nanyuki, Nyeri and Marsabit.

 

Some residents expressed concern, complaining of headaches after approaching the device. Igembe South OCPD Jane Nyakeruma said no injuries or damages were reported as a result of the crash.

 

"The device from Project Loon indicates it fell after its expiry period of six months. No one is yet to claim the device," Ms Nyakeruma said.

 

Super Pressured

 

According to the developer, Project Loon is a global network of high altitude balloons which ascend like weather balloons until they reach the stratosphere where they sail about 20 km (65,000 feet) above the earth.

 

"The Loon balloons are super pressured, allowing them to last much longer. Loon balloons are also unique in that they can sail the wind to travel where they need to go. They can coordinate with other balloons as a flock and their electronics are entirely solar powered," it states."...

 

For main post, see here:

http://allafrica.com/stories/201712300019.html 

 

Scooped by Roxana Marachi, PhD

Consumer and Privacy Groups Demand Action on Toys that Spy on Children // CCFC

One Year after Complaint to Federal Trade Commission, Dangerous Toys Are Still on the Market


WASHINGTON—December 18, 2017— Consumer and privacy groups are calling on the Federal Trade Commission (FTC) and companies that sell dangerous internet-connected toys and smartwatches to act to protect children from serious safety and security threats they pose. One year ago, advocacy groups filed a complaint with the FTC about two internet-connected toys, My Friend Cayla and i-Que Intelligent Robot, which capture, record, and analyze what children say and respond to them. The complaint alleged that the manufacturer of these products, Genesis Toys, and the technology provider, Nuance Communications, unfairly and deceptively collect, use, and share audio files of children's voices without providing adequate notice or obtaining verified parental consent, and fail to prevent strangers and predators from covertly eavesdropping on children's private conversations, creating a risk of stalking and physical danger. 

 

Several major retailers have ceased sales of the toys in response, with the exception of Amazon.

 

The advocacy groups worked in concert with the Norwegian Consumer Council (NCC), whose research uncovered the problems with Cayla and i-Que, in bringing the danger of these toys to the attention of the FTC and retailers.  In response, Germany has banned the toys as spying devices, and French authorities have demanded information from Genesis and Nuance on the threat posed to children.  Major U.S. retailers like Target and ToysRUs responded by stopping sales of the toys. Walmart informed the groups that it would stop selling the toys, but they were listed on the company’s website this fall until one of the groups, USPIRG, highlighted the dangers of My Friend Cayla in its annual Trouble in Toyland report.

 

Amazon has taken no action to stop sales of the toys on its site, despite repeated requests from the advocates. The FTC has announced no action in response to the complaint.  

 

More recently, in October, advocacy groups sent a letter to the Federal Trade Commission asking it to act to protect kids from the danger of smartwatches which are marketed to allow parents to track the location of and stay in touch with very young children.  NCC research showed that the watches, sold in the U.S. under the brands Caref and SeTracker, actually put children at risk—they are unreliable, data is stored unsafely, and they can easily be overtaken by a hacker who might prey upon the child. These products also remain for sale on Amazon, despite the advocates’ request that the company cease sales, and the FTC has yet to respond to the advocates about their concerns.

 

U.S. groups calling on the FTC and Amazon to take action today to protect children from these dangerous products are the Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Consumer Action, Consumer Federation of America, Consumer Watchdog, EPIC, Public Citizen’s Commercial Alert Program, and USPIRG.

“Children share intimate details about themselves with their dolls and toys,” said CCFC’s Executive Director, Josh Golin. “My Friend Cayla and i-Que are unsafe devices which put that sensitive information at risk. We applaud the retailers which have stopped selling these toys, and we urge Amazon to put children’s welfare first and do the same.”

 

“Neither My Friend Cayla nor i-Que Intelligent Robot should be on anyone’s holiday shopping list,” said Susan Grant, Director of Consumer Protection and Privacy at CFA. “Parents should be able to count on responsible retailers and the federal government to keep products that threaten their children’s privacy and security from continuing to be sold.” 

 

"This year, the state PIRGs added Cayla, as a representative of all interconnected toys and apps targeted at young kids, to our 32nd annual Trouble In Toyland list of potentially hazardous toys," said U.S. PIRG Consumer Program Director Ed Mierzwinski. "Parents and toy-givers need to understand that privacy-invasive toys pose real threats to children, just as toys that pose choking or ingestion hazards or contain excessive levels of toxic lead and other chemicals do."

 

“Products that connect to the internet are concerning at the best of times, because these products can collect data about ourselves and our daily lives and it’s difficult to control how that data is used by companies. Toys that connect to the internet and track and collect data about kids should be banned from store shelves,” said Linda Sherry, Director of National Priorities for Consumer Action.

 

“The FTC’s failure to respond to our complaints is appalling.  It shows how empty Acting Chair Maureen Ohlhausen’s promises on another issue to protect Internet privacy now that net neutrality rules have been rescinded are likely to be,” said John M. Simpson, Consumer Watchdog’s Privacy and Technology Project Director. “Responsible retailers won’t sell these invasive toys and responsible advertising companies — unlike Google — won’t advertise them.”

 

“Connected toys raise serious privacy concerns,” said Marc Rotenberg, President of EPIC.

 

“Kids should play with their toys and their friends, and not with surveillance devices dressed as dolls.”

 

###

http://www.commercialfreechildhood.org/consumer-and-privacy-groups-demand-action-toys-spy-children 

Scooped by Roxana Marachi, PhD

Dr. Catherine Steiner-Adair on Screen Time Concerns Related to Child Development 

"Dr. Catherine Steiner-Adair is a Clinical Psychologist, Consultant, Speaker, and Author of The Big Disconnect: Protecting Childhood and Family Relationships in the Digital Age. This is a video clip is from a longer talk she gave in Framingham, MA on June 10th, 2015.

 

For viewers interested in additional information and research related to screen time concerns, see slides from NAACP Conference on the ESSA and Civil Rights in Education in San Jose, CA, August 19th, 2017. "Health and Safety Research Gaps in Policies and Practices Integrating Emerging Technologies for Young Children": http://sco.lt/58869R"

 

To view video on YouTube: https://www.youtube.com/watch?v=pjnFPo_mk6s&feature=youtu.be  

Scooped by Roxana Marachi, PhD

School Ditches Online Learning Program After Parents Revolt // AP News

"CHESHIRE, Conn. (AP) — The fast-growing online platform was built with help from Facebook engineers and designed to help students learn at their own speed. But it’s been dropped because parents in this Connecticut suburb revolted, saying there was no need to change what’s worked in a town with a prized reputation for good schools.

The Summit Learning program, developed by a California charter school network, has signed up over 300 schools to use its blend of technology with go-at-your-own-pace personalized learning. 

 

Cheshire school administrators and some parents praised the program, but it faced criticism from others who said their children were spending too much time online, some content was inappropriate, and students were not getting enough direct guidance."...

 

 


https://apnews.com/1eb22db68a974c8ca4baf9c6d75640ba/School-ditches-online-learning-program-after-parents-revolt 

Scooped by Roxana Marachi, PhD

Data Mining Program Designed to Predict Child Abuse Proves Unreliable, DCFS Says // Chicago Tribune

"The Illinois Department of Children and Family Services is ending a high-profile program that used computer data mining to identify children at risk for serious injury or death after the agency's top official called the technology unreliable.

 

"We are not doing the predictive analytics because it didn't seem to be predicting much," DCFS Director Beverly "B.J." Walker told the Tribune.

The $366,000 Rapid Safety Feedback program was central to reforms promised by Walker's predecessor, George Sheldon, who took office in 2015 following a series of child deaths and other problems.

 

Two Florida firms — the nonprofit Eckerd Connects and its for-profit partner, Mindshare Technology — mined electronic DCFS files and assigned a score of 1 to 100 to children who were the subject of an abuse allegation to the agency hotline. The algorithms rated the children's risk of being killed or severely injured during the next two years, according to DCFS public statements.

 

But caseworkers were alarmed and overwhelmed by alerts as thousands of children were rated as needing urgent protection. More than 4,100 Illinois children were assigned a 90 percent or greater probability of death or injury, according to internal DCFS child-tracking data released to the Tribune under state public records laws.

 

And 369 youngsters, all under age 9, got a 100 percent chance of death or serious injury in the next two years, the Tribune found.

 

At the same time, high-profile child deaths kept cropping up with little warning from the predictive analytics software, DCFS officials told the Tribune."...

 

http://www.chicagotribune.com/news/watchdog/ct-dcfs-eckerd-met-20171206-story.html 

Scooped by Roxana Marachi, PhD

Five Risks Posed by the Increasing Misuse of Technology in Schools // EdSurge

"At any given moment in the day, I am attached to my cellphone, my iPad or my computer. As a writer, I was an early convert to the computer. I began writing on a TRS-80 from Radio Shack in 1983 on wonderful writing software called WordPerfect, which has mysteriously disappeared. I had two TRS-80s, because one of them was always in repair. I love the computer for many reasons. I no longer had to white out my errors; I no longer had to retype an entire article because of errors. My handwriting is almost completely illegible. The computer is a godsend for a writer and editor.

 

I have seen teachers who use technology to inspire inquiry, research, creativity and excitement. I understand what a powerful tool it is.

 

But it is also fraught with risk, and the tech industry has not done enough to mitigate the risks.

Risk One: The Threat to Student Privacy

Risk one is the invasion of student privacy, utilizing data by tech companies collected when students are online. The story of inBloom is a cautionary tale. Funded in 2014 with $100 million from the Gates Foundation and the Carnegie Corporation, inBloom intended to collect massive amounts of personally identifiable student data and use it to “personalize” learning to each student.


Parents became alarmed by the plan to put their children’s data into a cloud and mobilized in communities and states to stop inBloom. They were not nearly as impressed by the possibilities of data-driven instruction as the entrepreneurs promoting inBloom. The parents won. State after state dropped out, and inBloom collapsed.

 

Though inBloom is dead, the threat to student privacy is not. Every time a student makes a keystroke, an algorithm somewhere is collecting information about that student. Will his or her data be sold? The benefit to entrepreneurs and corporations is clear; the benefit to students is not at all clear.

Risk Two: The Proliferation of 'Personalized Learning'

Personalized learning, or “competency-based education,” are both euphemisms for computer adaptive instruction. Again, a parent rebellion is brewing, because parents want their children taught by a human being, not a computer. They fear that their children will be mechanized, standardized, subjected to depersonalized instruction, not “personalized learning.” While many entrepreneurs are investing in software to capture this burgeoning industry, there is still no solid evidence that students learn more or better when taught by a computer.

Risk Three: The Extensive Use of Technology for Assessment.

Technology is highly compatible with standardized testing, which encourages standardized questions and standardized answers. If the goal of learning is to teach creativity, imagination, and risk-taking, assessment should encourage students to be critical thinkers, not accepting the conventional wisdom, not checking off the right answer. Furthermore, the ability of computers to judge essays is still undeveloped and may remain so. Professor Les Perelman at MIT demonstrated that computer-graded essays can get high scores for gibberish and that computers lack the “intelligence” to reason or understand what matters most in writing.

Risk Four: The Cyber Charter School

Most such virtual schools, or cyber charters, are operated for profit; the largest of them is a chain called K12 Inc., which is listed on the New York Stock Exchange. Its executives are paid millions of dollars each year. Its biggest initial investor was the junk bond king Michael Milken. Numerous articles in publications such as the New York Times and the Washington Post have documented high student attrition, low teacher wages, low student test scores and low graduation rates. Yet the company is profitable.

 

The most controversial school in Ohio is the Electronic Classroom of Tomorrow (ECOT), whose owner makes political contributions to office-holders and has collected about $1 billion in taxpayer dollars since 2000. ECOT reputedly has the lowest graduation rate in the nation. The state of Ohio recently won a lawsuit requiring ECOT to return $60 million because of inflated enrollment figures. Studies of cyber charters have concluded that students learn very little when enrolled in them. There may be students who have legitimate reasons to learn at home online, but these “schools” should not receive the same tuition as brick-and-mortar schools that have certified teachers, custodians, libraries, the costs of physical maintenance, playgrounds, teams, school nurses and other necessities.

Risk Five: Money in Edtech

The tech industry wields its money in dubious ways to peddle its product. The market for technology is burgeoning, and a large industry is hovering around the schools, eager for their business. In November 2017, the New York Times published an expose of the business practices of the tech industry in Baltimore County. It documented payola, influence peddling and expensive wining and dining of school officials, which resulted in nearly $300 million of spending on computers that received low ratings by evaluators and that were soon obsolescent. This, in a district that has neglected the basic maintenance of some of its buildings.

 

The greatest fear of parents and teachers is that the tech industry wants to replace teachers with computers. They fear that the business leaders want to cut costs by replacing expensive humans with inexpensive machines, that never require health care or a pension. They believe that education requires human interaction. They prefer experience, wisdom, judgment, sensibility, sensitivity and compassion in the classroom to the cold, static excellence of a machine.

 

I agree with them. "

 

Diane Ravitch is a Research Professor of Education at New York University and a historian of education. She is the Founder and President of the Network for Public Education (NPE) and blogs at dianeravitch.net.

 

Originally published here:

https://www.edsurge.com/news/2017-12-29-5-risks-posed-by-the-increasing-misuse-of-technology-in-schools 

Scooped by Roxana Marachi, PhD

The Disruption Machine: What the Gospel of Innovation Gets Wrong // The New Yorker

By Jill Lepore

[Illustration by Brian Stauffer]

... https://www.newyorker.com/magazine/2014/06/23/the-disruption-machine 

 

 

 

 

Scooped by Roxana Marachi, PhD

Name+DOB+SSN = FAFSA Data Gold Mine // Krebs on Security

By Brian Krebs

"KrebsOnSecurity has sought to call attention to online services which expose sensitive consumer data if the user knows a handful of static details about a person that are broadly for sale in the cybercrime underground, such as name, date of birth, and Social Security Number. Perhaps the most eye-opening example of this is on display at fafsa.ed.gov, the Web site set up by the U.S. Department of Education for anyone interested in applying for federal student financial aid."...

 

For full post, please visit: 

https://krebsonsecurity.com/2017/11/namedobssnfafsa-data-gold-mine/ 

Scooped by Roxana Marachi, PhD

Letter to Agora Cyber Charter School on Violations of FERPA // U.S. Department of Education 

To download, click on title or arrow above. 

Also accessible online here: https://studentprivacy.ed.gov/sites/default/files/resource_document/file/Agora%20Findings%20letter%20FINAL%2011.2.17.pdf

Scooped by Roxana Marachi, PhD

Big Data Meets Big Brother as China Moves To Rate Its Citizens // Wired

By Rachel Botsman
"The Chinese government plans to launch its Social Credit System in 2020. The aim? To judge the trustworthiness – or otherwise – of its 1.3 billion residents


On June 14, 2014, the State Council of China published an ominous-sounding document called "Planning Outline for the Construction of a Social Credit System". In the way of Chinese policy documents, it was a lengthy and rather dry affair, but it contained a radical idea. What if there was a national trust score that rated the kind of citizen you were?

 

Imagine a world where many of your daily activities were constantly monitored and evaluated: what you buy at the shops and online; where you are at any given time; who your friends are and how you interact with them; how many hours you spend watching content or playing video games; and what bills and taxes you pay (or not). It's not hard to picture, because most of that already happens, thanks to all those data-collecting behemoths like Google, Facebook and Instagram or health-tracking apps such as Fitbit. But now imagine a system where all these behaviours are rated as either positive or negative and distilled into a single number, according to rules set by the government. That would create your Citizen Score and it would tell everyone whether or not you were trustworthy. Plus, your rating would be publicly ranked against that of the entire population and used to determine your eligibility for a mortgage or a job, where your children can go to school - or even just your chances of getting a date.
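As a concrete, entirely hypothetical illustration of the mechanism described above, the short Python sketch below folds centrally weighted behaviour ratings into a single publicly rankable number. The rules, weights, and data are invented for illustration only and do not reflect the actual, largely unpublished design of the Social Credit System.

```python
# Minimal sketch of the "single trust score" idea: behaviours are rated
# positive or negative under centrally set rules and distilled into one number.
# All rules, weights, and data here are invented for illustration.

RULES = {                         # hypothetical, centrally defined weights
    "paid_bills_on_time": +10,
    "bought_approved_goods": +2,
    "hours_of_video_games": -1,   # applied per hour
    "missed_tax_payment": -25,
}

def citizen_score(behaviour_counts, base=1000):
    """Fold observed behaviour counts into a single score."""
    score = base
    for behaviour, count in behaviour_counts.items():
        score += RULES.get(behaviour, 0) * count
    return score

observed = {"paid_bills_on_time": 12, "hours_of_video_games": 30, "missed_tax_payment": 1}
print(citizen_score(observed))  # 1000 + 120 - 30 - 25 = 1065
```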

 
A futuristic vision of Big Brother out of control? No, it's already getting underway in China, where the government is developing the Social Credit System (SCS) to rate the trustworthiness of its 1.3 billion citizens. The Chinese government is pitching the system as a desirable way to measure and enhance "trust" nationwide and to build a culture of "sincerity". As the policy states, "It will forge a public opinion environment where keeping trust is glorious. It will strengthen sincerity in government affairs, commercial sincerity, social sincerity and the construction of judicial credibility."
 
 

Others are less sanguine about its wider purpose. "It is very ambitious in both depth and scope, including scrutinising individual behaviour and what books people are reading. It's Amazon's consumer tracking with an Orwellian political twist," is how Johan Lagerkvist, a Chinese internet specialist at the Swedish Institute of International Affairs, described the social credit system. Rogier Creemers, a post-doctoral scholar specialising in Chinese law and governance at the Van Vollenhoven Institute at Leiden University, who published a comprehensive translation of the plan, compared it to "Yelp reviews with the nanny state watching over your shoulder"....

 

For full post, please see: 

http://www.wired.co.uk/article/chinese-government-social-credit-score-privacy-invasion 

Scooped by Roxana Marachi, PhD

Google’s High-Altitude Internet Balloon Crashes in Kenya // Hindustan Times

 "A high altitude balloon that was launched by Google to provide high-speed internet in the remote parts of the earth under "Project Loon", has crashed in a Kenyan farm, a report said on Saturday.

 
The balloon, a part of a 10-balloon batch, was deployed for testing in Nakuru, Nanyuki, Nyeri and Marsabit in July 2017. It crashed at Nthambiro in Meru on Friday night, media reported.
 
Some residents complained of headaches after they gathered around the device to get its glimpse.
 
“The device from project loon indicates it fell after its expiry period of six months. No one is yet to claim the device,” Igembe South OCPD Jane Nyakeruma was quoted as saying.
 
Earlier this year, Google announced that it was “years closer” to deliver internet to remote parts of the world using high-flying balloons.
 
Researchers at Google’s Project Loon -- part of the company’s X research lab -- said it was now able to use machine learning to predict weather systems, meaning the firm has a greater control over where its balloons go, making it possible to focus on a specific region, rather than circumnavigating the globe, BBC reported.
 
Under the project, the firm suspended a network of huge balloons that beam down connectivity.
 
The balloons float in the stratosphere around 11 miles high. By raising or lowering altitude, the balloons can be caught in different weather streams, changing direction."
 
For original post, see:

http://www.hindustantimes.com/tech/google-s-high-altitude-internet-balloon-crashes-in-kenya/story-79Kqd7ZGJZj2TS68QzEUbN.html 

_______________________________


The following incidents have been recorded on the Wikipedia page for Project Loon: https://en.wikipedia.org/wiki/Project_Loon 

 

See also: 

International Coalition Objects to Google's Project Loon 
http://c4st.org/international-coalition-objects-to-googles-project-loon/ 

 

 

Scooped by Roxana Marachi, PhD

Intel Was Aware of the Chip Vulnerability When Its CEO Sold Off $24 Million in Company Stock // Business Insider

http://www.businessinsider.com/intel-ceo-krzanich-sold-shares-after-company-was-informed-of-chip-flaw-2018-1 

Scooped by Roxana Marachi, PhD

EdTech Coverage, The Hype Cycle, and Media Complicity // Alexander Russo (via Phi Delta Kappan)

"Another insider’s concerns about the media hype cycle – and some great edtech story leads for 2018.

By Alexander Russo

Policymakers, administrators, and educators have been experimenting with technology for roughly 30 years, going back to the days of computer-assisted instruction, connecting schools to “the information superhighway,” and One Laptop Per Child. By necessity, education journalists have been along for the ride. (Twenty years ago this past summer, the magazine then known as The Atlantic Monthly published The Computer Delusion, warning the American public about the dangers of bringing computers into schools and expecting much academic improvement.)


But the past few years have been especially confusing and contentious when it comes to technology and schools, in response to the backlash against the school reform movement, growing concerns about the dominance of the “big five” tech companies (Apple, Alphabet/Google, Amazon, Facebook, and Microsoft*), and edtech advocates’ tireless efforts to fix education with apps and gizmos."...

 

For full post, please visit:

http://www.kappanonline.org/russo-edtech-coverage-the-hype-cycle-and-media-complicity/ 

Scooped by Roxana Marachi, PhD

A Cute Toy Just Brought a Hacker Into Your Home // New York Times

"Researchers recently found the Furby Connect’s Bluetooth connection could be hijacked by hackers, letting them turn on the doll’s microphone and speak to children." Image creditTony Cenicola/The New York Times 

_____________________________________ 


By Sheera Frenkel

SAN FRANCISCO — My Friend Cayla, a doll with nearly waist-length golden hair that talks and responds to children’s questions, was designed to bring delight to households. But there’s something else that Cayla might bring into homes as well: hackers and identity thieves.

Earlier this year, Germany’s Federal Network Agency, the country’s regulatory office, labeled Cayla “an illegal espionage apparatus” and recommended that parents destroy it. Retailers there were told they could sell the doll only if they disconnected its ability to connect to the internet, the feature that also allows in hackers. And the Norwegian Consumer Council called Cayla a “failed toy.”

 

The doll is not alone. As the holiday shopping season enters its frantic last days, many manufacturers are promoting “connected” toys to keep children engaged. There’s also a smart watch for kids, a droid from the recent “Star Wars” movies and a furry little Furby. These gadgets can all connect with the internet to interact — a Cayla doll can whisper to children in several languages that she’s great at keeping secrets, while a plush Furby Connect doll can smile back and laugh when tickled.

 

But once anything is online, it is potentially exposed to hackers, who look for weaknesses to gain access to digitally connected devices. Then once hackers are in, they can use the toys’ cameras and microphones to potentially see and hear whatever the toy sees and hears. As a result, according to cybersecurity experts, the toys can be turned to spy on little ones or to track their location.

 

“Parents need to be aware of what they are buying and bringing home to their children,” said Javvad Malik, a researcher with cybersecurity company AlienVault. “Many of these internet-connected devices have trivial ways to bypass security, so people have to be aware of what they’re buying and how secure it is.”

 

 

https://www.nytimes.com/2017/12/21/technology/connected-toys-hacking.html?_r=0 

 

 

Rescooped by Roxana Marachi, PhD from Screen Time, Wireless, and EMF Research

California Health Officials Release Guidance For Limiting Exposure to Cellphone Radiation // EduResearcher

"The California Department of Public Health has recently issued guidance for reducing exposure to radiation emitted from cell phones. An emphasis within the document includes children’s heightened vulnerabilities to cumulative hazards of long term exposure. The document was originally drafted in 2009 by the Department of Public Health’s Division of Environmental and Occupational Disease Control and underwent numerous revisions, yet remained hidden from public view until now (for more on the lawsuit that led the Sacramento Superior Court to order the release of the draft documents, see here). 

 

The recommendations outlined by the California Department of Public Health are similar to those issued by the Connecticut Department of Public Health in May, 2015. Below is the official press release issued on December 13th, 2017:

 

“SACRAMENTO – As smartphone use continues to increase in the U.S., especially among children, the California Department of Public Health (CDPH) today issued guidance for individuals and families who want to decrease their exposure to the radio frequency energy emitted from cell phones."...

 

For full post, see:

https://eduresearcher.com/2017/12/18/childhealth/ 

Scooped by Roxana Marachi, PhD

Chromageddon Comedown: Educators Are Wary After Thousands of Google Devices Fail // EdSurge News

By Sydney Johnson
"Kelly Dumont is an education technologist for Canyons School District in Sandy, Utah. On December 5, his district was one of many around the country that experienced a massive Chromebook error that caused hundreds of thousands of devices to temporarily disconnect from the internet.

 

Dumont, who is the elementary edtech team lead in his Utah district, believes that all 20,000 of the devices in CSD were affected by the issue, which was resolved by Google later that day. Now that the devices are back online, however, he and other educators question whether the error was a one-off glitch, or a foreshadowing for other unexpected issues in the cloud-based technology.

“This was temporarily catastrophic,” Dumont tells EdSurge.

 

On a support page set up to assist educators in reconnecting their devices, Google attributes the error to an “invalid network policy” that was pushed out to devices. The error caused devices to lose “connectivity to passphrase-protected WiFi networks configured through admin policies.”
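For readers unfamiliar with how such admin-pushed network settings are structured, here is a minimal sketch in Python, loosely modeled on Chrome OS's public Open Network Configuration (ONC) format, showing how a single malformed field could leave a passphrase-protected network unjoinable. The exact policy behind the outage was not published, so every field value and validation rule below is an illustrative assumption, not Google's actual configuration.

```python
# Rough, illustrative sketch of an admin-pushed WiFi policy and a sanity check,
# loosely modeled on Chrome OS's Open Network Configuration (ONC) format.
# The actual policy behind the December 2017 outage was not published; field
# values and validation rules here are assumptions for illustration only.

wifi_policy = {
    "GUID": "district-wifi-1",
    "Name": "CSD-Student",
    "Type": "WiFi",
    "WiFi": {
        "SSID": "CSD-Student",
        "Security": "WPA-PSK",
        "Passphrase": "",       # an empty or missing passphrase would make
        "AutoConnect": True,    # the network unjoinable on managed devices
    },
}

def validate(policy):
    """Return a list of problems; an 'invalid network policy' would fail here."""
    problems = []
    wifi = policy.get("WiFi", {})
    if policy.get("Type") != "WiFi":
        problems.append("Type must be 'WiFi'")
    if not wifi.get("SSID"):
        problems.append("missing SSID")
    if wifi.get("Security") == "WPA-PSK" and not wifi.get("Passphrase"):
        problems.append("WPA-PSK network has no passphrase")
    return problems

print(validate(wifi_policy))  # ['WPA-PSK network has no passphrase']
```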

 

Mass policy updates like the one that affected devices last week are not uncommon—though the effect last week is believed to be a first. “We issue network policy updates on a regular basis, which varies based on admin updates and device boots,” a representative from Google wrote in an email.

 

Dumont is aware that these types of updates are routinely pushed through the cloud-based system. But the event leaves him wary nonetheless.

 

“It’s an eye-opening situation because we are trusting the cloud so much for everything, and we rely heavily on Google services,” he says. “Google Apps for Education is our baseline for 1:1. The potential to have 20,000 devices go down at once is concerning.” Adam Henderson, the director of technology systems at Nassau County Schools in Florida, shared similar concerns with EdSurge last week while the issue was still unraveling."...

 

https://www.edsurge.com/news/2017-12-13-chromageddon-come-down-educators-are-wary-after-thousands-of-google-devices-fail 

Scooped by Roxana Marachi, PhD

Lurkers, Creepers, and Virtuous Interactivity: From Property Rights to Consent and Care as a Conceptual Basis For Privacy Concerns and Information Ethics // First Monday

By D.E. Wittkower
Abstract
"Exchange of personal information online is usually conceptualized according to an economic model that treats personal information as data owned by the persons these data are ‘about.’ This leads to a distinct set of concerns having to do with data ownership, data mining, profits, and exploitation, which do not closely correspond to the concerns about privacy that people actually have. A post-phenomenological perspective, oriented by feminist ethics of care, urges us to figure out how privacy concerns arrive in fundamentally human contexts and to speak to that, rather than trying to convince people to care about privacy as it is juridically conceived and articulated. By considering exchanges of personal information in a human-to-human online informational economy — being friends on social networking sites — we can identify an alternate set of concerns: consent, respect, lurking, and creepiness. I argue that these concerns will provide a better guide to both users and companies about prudence and ethics in information economies than the existing discourse around ‘privacy.’

 

Contents

1. Introduction
2. Methodology
3. Privacy and the property-based understanding of personal information
4. Personal Information re-conceived in the context of relationships
5. The online-emergent virtue of interactivity
6. Being a lurker
7. Being a creeper
8. Broader applications

 

For full post, click on title above or here:

http://firstmonday.org/article/view/6948/5628 

more...
No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

For-Profit Online College Faces Lawsuit For Allegedly Misleading Students // SF Examiner

For-Profit Online College Faces Lawsuit For Allegedly Misleading Students // SF Examiner | Educational Psychology & Technology | Scoop.it

http://www.sfexaminer.com/profit-online-college-faces-lawsuit-allegedly-misleading-students/  

 

 

more...
No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

Teach Like They're Data: Max Ventilla's Extractive AltSchool Platform, Personalization, and Profit // Long View on Education

Teach Like They're Data: Max Ventilla's Extractive AltSchool Platform, Personalization, and Profit // Long View on Education | Educational Psychology & Technology | Scoop.it

"“Paradoxically, the corporate sector responsible for deindustrialization and cutting wages while receiving tax breaks that starved cities of revenue is now repositioned as beneficent donor, if not savior [of education]. The global billionaires who accumulated unfathomable wealth at the expense of most of the world’s people are now our benefactors and leaders.” – Pauline Lipman & Cristen Jenkins

“The technology always extracts. … Before computers, it was fossil fuels. The idea that you can pull free physical work out of the ground, that was a really good trick, and it resulted in all of these exponential curves. But now we’re discovering how to pull free mental work out of the ground. That’s going to be an equivalent, huge trick over the next 50 years.” – Max Ventilla

 

Platforms and Classrooms 

Max Ventilla, former head of personalisation at Google, has announced that his start-up, AltSchool, will be closing some locations and focusing on developing and marketing its software platform. Marketing itself as a site of ‘hyperpersonalisation’, AltSchool has received $175 million in venture capital as it pursues Ventilla’s for-profit dream to reshape education. According to Bloomberg, Ventilla sent a letter to parents informing them of the upcoming changes:

 

“Ventilla said AltSchool will only run classrooms near the main offices in San Francisco and New York. ‘We know this is tough news that will have a big impact on your family,’ Ventilla said. But the moves are needed, he wrote, given AltSchool’s ‘strategy, path to growth and finances.’ Ventilla told Bloomberg that the company had long planned to prioritize selling technology to other schools. He said it’s happening earlier than anticipated because of demand. For outside schools, the company charges about $150 to $500 annually per student for its technology, depending on the size of the institution.”


Ventilla’s platform aims to use the same kind of personalisation technology that Google, Facebook, and Netflix use to recommend results and experiences to us. A kind of “mass customisation”, this approach relies on collecting data about individual users based on our likes and clicks, and creating algorithms to use larger patterns in their massive data sets to offer us custom results which we can ‘choose’ from. As Audrey Watters argues, personalisation sounds progressive – we’ll offer you individually tailored experiences! – but it’s more about delivering ‘content’ like customised Facebook ads.

In an interview, Ventilla says that “we start with a representation of each child”, and even though “the vast majority of the learning should happen non-digitally”, the child’s habits and preferences get converted into data, “a digital representation of the important things that relate to that child’s learning, not just their academic learning but also their non-academic learning. Everything logistic that goes into setting up the experience for them, whether it’s who has permission to pick them up or their allergy information. You name it.”

 

And just like Netflix matches us to TV shows, “If you have that accurate and actionable representation for each child, now you can start to personalize the whole experience for that child. You can create that kind of loop you described where because we can represent a child well, we can match them to the right experiences.”
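[Editorial note: AltSchool's actual matching system is proprietary, but the general "profile plus matching" loop Ventilla describes can be illustrated with a toy content-based recommender: represent the learner as a vector of tag weights, score candidate activities by similarity, and surface the closest matches. The Python sketch below shows that generic pattern; all tags, weights, and activity names are invented for illustration and are not AltSchool's method.]

```python
# Toy illustration of the "represent the child, then match to experiences"
# loop described above. All tags, weights, and activities are invented;
# this is a generic content-based matcher, not AltSchool's platform.
import math


def cosine(a, b):
    """Cosine similarity between two sparse tag -> weight dictionaries."""
    shared = set(a) & set(b)
    dot = sum(a[k] * b[k] for k in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


# The "digital representation" of one child: whatever the platform has chosen
# to log, reduced to weighted tags.
learner_profile = {"fractions": 0.8, "reads_independently": 0.6, "group_work": 0.2}

# Candidate "experiences" described with the same tag vocabulary.
activities = {
    "fraction_card_game": {"fractions": 1.0, "group_work": 0.7},
    "silent_reading_block": {"reads_independently": 1.0},
    "class_debate": {"group_work": 1.0, "oral_presentation": 0.8},
}

ranked = sorted(activities,
                key=lambda name: cosine(learner_profile, activities[name]),
                reverse=True)
print(ranked)
# -> ['fraction_card_game', 'silent_reading_block', 'class_debate']
```

Even this toy version makes the critique concrete: what the platform recommends depends entirely on which signals it chooses to record about the child and how those signals are weighted.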

 

After watching a video of what a day in their middle school looks like, I was struck by the contrasting realities of the classroom and the platform. Each class has a maximum of 24 students with two teachers, and the life of the classroom looks vibrant: the teachers conference with students about their work, an expert from Stanford helps the students carry out a design project, and the space of the school looks comfortable and inviting.

 

Yet, the bright spaces and two teachers per class are not the business model – the platform – that Ventilla wants to sell, because “teachers are expensive.”

The flexibility that AltSchool offers to wealthy parents in the expensive neighbourhoods of its lab schools in San Francisco and Brooklyn won’t ‘scale’ when the platform is sold. In its lab schools, AltSchool allows flexible pick-up and drop-off times for parents via their smart phones, and even accommodates surprise vacations to Hong Kong if they should arise. According to the New York Times, “A tablet with your child’s lesson plans would go with you, and he or she could study and work wherever you are. AltSchool’s plan, ultimately, after years of data-keeping, self-assessment and reassessment, is to take its best practices and technological innovations to the universe of public schools.”

The same NYT article contrasts AltSchool with the “boot-camp model of so many of the city’s charter schools, where learning can too easily be divorced from pleasure, and fear rather than joy is the operative motivator.” But what will AltSchool – the platform – look like when it is exported to public schools where the cost of teachers and space matter? Given that “AltSchool’s losses are piling up as it spends at a pace of about $40 million per year,” it’s not hard to imagine that the more desirable aspects of AltSchool’s flexibility will only be available for purchase by the wealthy.

As one example of how the implementation of the platform might carry negative consequences in public schools, consider AltSchool’s use of cameras for surveillance. According to Business Insider, “Cameras are also mounted at eye level for kids, so teachers can review successful lessons and ‘the steps leading up to those ‘ah-ha’ moments,’ head of school Kathleen Gibbons said. Some children use them as confessionals, sharing their secrets with the camera.”"...

 

For full post, please see: 

http://www.longviewoneducation.org/teach-like-theyre-data/ 

more...
No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

Regulating Blockchain and Distributed Ledger Identity // K(NO)W Identity Conference

"As governments and companies move from research and development to deployment of distributed ledgers and blockchains for identity, they will need to consider the legal and regulatory implications of, and risks presented by, these transformative technologies. In this session we’ll consider different approaches to regulation, and ask the fundamental question of whether self-sovereign identity and regulation can co-exist Filmed at the 2017 K(NO)W Identity Conference - Washington, D.C. http://www.knowidentityconference.com

Panelists:
Juan Llanos - Senior Advisor, One World Identity
Steve Ehrlich - Lead Analyst for Emerging Technologies, Spitzberg Partners
Alan Cohen - Of Counsel, Steptoe & Johnson LLP
Jamie Smith - Global Chief Communications and Marketing Officer, Bitfury Group
Andrea Tinianow - Director, Global Delaware
 

https://www.youtube.com/watch?v=cT5hlMOKhdc 

 

 

more...
No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

YouTube Is Addressing Its Massive Child Exploitation Problem

YouTube Is Addressing Its Massive Child Exploitation Problem | Educational Psychology & Technology | Scoop.it

By Charlie Warzel

"Across YouTube, an unsettling trend has emerged: Accounts are publishing disturbing and exploitative videos aimed at and starring children in compromising, predatory, or creepy situations — and racking up millions of views.

 

BuzzFeed News has found a number of videos, many of which appear to originate from eastern Europe, that feature young children, often in revealing clothing, placed in vulnerable scenarios. In many instances, they're restrained with ropes or tape and sometimes crying or in visible distress. In other videos, the children are kidnapped, or made to 'play doctor' with an adult. The videos frequently include gross-out themes like injections, eating feces, or needles. Many come from YouTube 'verified' channels and have tens of millions of views. After BuzzFeed News brought these videos to the attention of YouTube, they were removed.

 

In recent weeks, YouTube has faced criticism after reports about unsettling animated videos and bizarre content, which were aimed at children using family-friendly characters, and some of which was not caught by YouTube Kids’ filters. All of the videos reviewed by BuzzFeed News were live-action, ostensibly set up by adults and featuring children. Taken together, they make up a vast, disturbing, and wildly popular universe of videos that — until recently — existed unmoderated by the platform."...

 

For full post, please see:

https://www.buzzfeed.com/charliewarzel/youtube-is-addressing-its-massive-child-exploitation-problem 

more...
No comment yet.