Educational Psychology & Technology
Big Hype, Hard Fall for News Corp.'s $1 Billion Ed-Tech Venture // EdWeek


By Benjamin Herold (EdWeek) 
 

"Amplify, the education division of Rupert Murdoch's company, is deeply in the red and on the auction block after its ambitious vision failed to materialize.


The global media giant News Corp. sought to push its way into the K-12 marketplace five years ago by betting big on technology. Now, despite a $1 billion investment and a steady stream of brash promises to radically disrupt the way public schools do business, the company's education division, known as Amplify, is deeply in the red and on the auction block.


Veteran observers of the fickle K-12 ed-tech market say they aren't surprised.


"There's a long history of education entrepreneurs who have crashed on the rocks because the market was not what they thought it would be," said Douglas A. Levin, a consultant on the ed-tech market and the recent head of the State Educational Technology Directors Association.


Earlier this month, it became clear that the highly publicized venture had become a financial albatross. When News Corp. announced that it would write off the education division's $371 million in losses over the past year and look to sell off Amplify, investors cheered, sending the parent company's share price up 7.5 percent, to $14.89.


Inside schools, meanwhile, the ripple effects of Amplify's striking demise promised to be minimal. A majority of the 30,000 or so tablet computers sold by the company went to a single district, and Amplify fell far short of its modest goal of getting its no-expense-spared digital curriculum into the hands of 30,000 students by the 2015-16 school year.


Experts attributed the company's lack of impact on the K-12 market to a series of miscalculations."...


For full post, click on title above or here: 
http://www.edweek.org/ew/articles/2015/08/26/big-hype-hard-fall-for-news-corps-ed-tech-venture.html 


Educational Psychology & Technology
This curated collection includes news, resources, and research related to the intersections of Educational Psychology and Technology. The page also serves as a research tool to organize online content. The grey funnel-shaped icon at the top allows for searching by keyword. For research more specific to tech, screen time, and health/safety concerns, please see http://bit.ly/screen_time. To learn about the next wave of privatization involving technology intersections with Pay For Success, Social Impact Bonds, and Results-Based Financing (often marketed with language promoting "public-private partnerships"), see http://bit.ly/sibgamble. For additional Educator Resources, please visit http://EduResearcher.com.

When "Innovation" is Exploitation: Data Ethics, Data Harms and Why We Need to Demand Data Justice // Marachi, 2019, Summer Institute of A Black Education Network at Stanford University, California 

To download the PDF, please click on the title above.

 

For more on the data brokers selling personal information from a variety of platforms, including education, please see: https://www.fastcompany.com/90310803/here-are-the-data-brokers-quietly-buying-and-selling-your-personal-information 

 

Please also visit: Parent Coalition for Student Privacy

https://www.studentprivacymatters.org/

 

See the Data Exploitation page at Privacy International

https://privacyinternational.org/video/1626/video-what-data-exploitation

 

and visit the Data Justice Lab: 

https://datajusticelab.org/

 


Once More With Feeling: 'Anonymized' Data Is Not Really Anonymous // Techdirt


By Karl Bode

"As companies and governments increasingly hoover up our personal data, a common refrain to keep people from worrying is the claim that nothing can go wrong because the data itself is "anonymized" or stripped of personal detail. But time and time again, we've noted how this really is cold comfort; given it takes only a little effort to pretty quickly identify a person based on access to other data sets. Yet most companies (including cell phone companies that sell your location data) act as if "anonymizing" your data is iron-clad protection from having it identified. It's simply not true.

 

The latest case in point: in new research published this week in the journal Nature Communications, data scientists from Imperial College London and UCLouvain found that it wasn't particularly hard for companies (or anybody else) to identify the person behind "anonymized" data using other data sets. More specifically, the researchers developed a machine learning model that was able to correctly re-identify 99.98% of Americans in any anonymised dataset using just 15 characteristics, including age, gender and marital status:

 

"While there might be a lot of people who are in their thirties, male, and living in New York City, far fewer of them were also born on 5 January, are driving a red sports car, and live with two kids (both girls) and one dog,” explained study first author Dr Luc Rocher, from UCLouvain."


And using fifteen characteristics is actually pretty high for this sort of study. One investigation of "anonymized" user credit card data by MIT found that users could be correctly "de-anonymized" 90 percent of the time using just four relatively vague points of information. Another study looking at vehicle data found that 15 minutes’ worth of data from just brake pedal use could lead researchers to choose the right driver, out of 15 options, 90% of the time.

The problem, of course, comes when multiple leaked data sets are released in the wild and can be cross referenced by attackers (state sponsored or otherwise), de-anonymized, then abused. The researchers in this new study were quick to proclaim how government and industry proclamations of "don't worry, it's anonymized!" are dangerous and inadequate:

 

"Companies and governments have downplayed the risk of re-identification by arguing that the datasets they sell are always incomplete,” said senior author Dr Yves-Alexandre de Montjoye, from Imperial’s Department of Computing, and Data Science Institute. "Our findings contradict this and demonstrate that an attacker could easily and accurately estimate the likelihood that the record they found belongs to the person they are looking for."

 

It's not clear how many studies like this we need before we stop using "anonymized" as some kind of magic word in privacy circles, but it's apparently going to need to be a few dozen more."...
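The mechanics are easy to reproduce. Below is a minimal sketch (synthetic data and arbitrarily chosen attributes, not the researchers' actual statistical model) showing how quickly a handful of "anonymized" attributes pins individual records down:

```python
# Toy illustration of re-identification risk: count how many records in an
# "anonymized" dataset are unique on just a few quasi-identifiers.
# Synthetic data; the Nature Communications study modeled 15 attributes.
import random
from collections import Counter

random.seed(0)

records = [
    (
        random.randint(18, 80),                       # age
        random.choice(["M", "F"]),                    # gender
        random.choice(["single", "married", "div"]),  # marital status
        random.randint(1, 12),                        # birth month
    )
    for _ in range(10_000)
]

counts = Counter(records)
unique = sum(1 for r in records if counts[r] == 1)
print(f"{unique / len(records):.1%} of records are unique on 4 attributes")
```

Even with four coarse attributes and ten thousand people, a meaningful share of records is already unique; every added attribute multiplies the number of possible combinations, which is why fifteen characteristics were enough to re-identify nearly everyone.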

 

For original post, please visit:

https://www.techdirt.com/articles/20190723/08540542637/once-more-with-feeling-anonymized-data-is-not-really-anonymous.shtml


The Structural Consequences of Big Data-Driven Education // Elana Zeide 


Abstract
Educators and commentators who evaluate big data-driven learning environments focus on specific questions: whether automated education platforms improve learning outcomes, invade student privacy, and promote equality. This article puts aside separate unresolved—and perhaps unresolvable—issues regarding the concrete effects of specific technologies. It instead examines how big data-driven tools alter the structure of schools’ pedagogical decision-making, and, in doing so, change fundamental aspects of America’s education enterprise. Technological mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. In this overview, I highlight three significant structural shifts that accompany school reliance on data-driven instructional platforms that perform core school functions: teaching, assessment, and credentialing.

First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers’ academic autonomy, obscure student evaluation, and reduce parents’ and students’ ability to participate or challenge education decision-making. Third, big data-driven tools define what ‘counts’ as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination. In contrast to the public and heated debates that accompany textbook choices, schools often adopt education technologies ad hoc. Given education’s crucial impact on individual and collective success, educators and policymakers must consider the implications of data-driven education proactively and explicitly.

 

Keywords: big data; personalized learning; competency-based education; smart tutors; learning analytics; MOOCs

Suggested Citation:

Zeide, Elana, The Structural Consequences of Big Data-Driven Education (June 23, 2017). Big Data, Vol 5, No. 2 (2017): 164-172. Available at SSRN: https://ssrn.com/abstract=2991794

 

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2991794 

 

Shortlink to download: http://bit.ly/EdBigData

 


How Much School Surveillance Is Too Much? // The New York Times 


By Charlie Warzel

"Back-to-school season is upon us and the future of child surveillance may soon be underway in Florida.

Last month I referenced a proposed school surveillance program in the state, where lawmakers were planning to introduce a statewide database “that would combine individuals’ educational, criminal-justice and social-service records with their social media data, then share it all with law enforcement.”

This month we know a bit more about what that program will look like, thanks to a 19-slide PowerPoint presentation released by the Florida Department of Education. Turns out, it’s quite extensive. You can look at the full presentation here and read a local news article on the report here.

A few highlights:

Officials are calling it a “safety portal” because the software will pull information from multiple databases, rather than host it all itself. Much of the information inside will be pulled from platforms like Twitter, Facebook, Instagram, YouTube, Reddit, Flickr, Google+ and Pinterest. Software deployed by the state will scan social media and general websites. It will monitor and store students’ social media posts as well as the geolocation, showing where the posts were made.

 

According to the PowerPoint, the software will monitor keywords on five topics: “gun,” “bomb,” “bullying,” “mental health” and “general,” which is vague enough to leave room for virtually any topic. Recorded information will go into the state database (portal) and school districts, and specific “Threat Assessment Teams” will get alerts if it’s determined there’s a threat.

In the age of mass school shootings, such a program may seem comforting to some parents. But the program’s sweeping collection parameters and the combination of information from multiple state databases mean that the portal will ultimately be home to a great deal of sensitive information. And the portal remains an excellent example of how programs designed to protect may have serious unintended privacy consequences.

For instance, according to the PowerPoint, the portal will not “store information about students’ race, religion, disability or sexual orientation.” However, information in the portal will contain School Environmental Safety Incident Reporting tags, which could reveal information about students who were bullied because of race, sexual orientation or disability. This type of de-anonymizing, even if accidental, is common when cross-referencing different databases.

For whatever safety it could add, automating systems to catch school shooters and other threats also increases the likelihood of technical errors. One slide from the PowerPoint notes that the portal will assess potential threats and monitored social media posts using “programmatical scoring that can help in determining relevancy of each returned record.”

The slide is vague on what exactly “relevancy” means in this case, and without algorithm transparency, it’s unclear whether the scoring system will be biased against students who may be vulnerable because of, say, a disclosed mental health issue. And since relevancy is left vague, it’s unclear how often a threat will trigger the portal’s flagging mechanisms.

Depending on algorithmic calibration, teams could be inundated with false alarms or, perhaps, not receive them at all.
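As a rough illustration of why those calibration tradeoffs are hard, here is a minimal sketch of keyword-based flagging (hypothetical posts and matching logic; the portal's actual "programmatical scoring" has not been published):

```python
# Naive keyword flagging, as described in the slides. Broad terms match
# innocuous posts, which is the false-alarm problem discussed above.
# Hypothetical example; the portal's real scoring logic is not public.
KEYWORDS = {"gun", "bomb", "bullying", "mental health"}

posts = [
    "that concert was the bomb",        # innocuous slang
    "essay on gun control due friday",  # homework
    "mental health awareness week!",    # school event
]

for post in posts:
    hits = [k for k in KEYWORDS if k in post.lower()]
    if hits:
        print(f"FLAGGED ({', '.join(hits)}): {post}")  # all three match
```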

 

But as with any database, the biggest concern is who will have access to the data of hundreds of thousands of students. Here’s how Amelia Vance, who directs the Future of Privacy Forum’s Education Privacy Project, put it in an email:

"There are over 4,000 schools in Florida; if we assume each school has a three-person minimum threat assessment team, that equals over 12,000 people who will have access to this portal. Even if Threat Assessment Teams are only at the district level, that is a minimum of 200 people for the 67 districts. With that many people able to access the system, it is highly likely that there will be multiple security vulnerabilities.”

 

Perhaps the system works flawlessly (though I’m quite skeptical). Early reports suggest that the database may have serious limitations. Still, it’s hard not to think about the impact on the students themselves. The surveillance state seems to be encroaching on every aspect of minors’ lives lately. It comes from all angles — from overeager parents, facial recognition in summer camps and, increasingly, in school.

 

And it shows no signs of letting up. Near the end of the PowerPoint presentation is a slide that says, “What’s Next?,” which includes an action item on collecting School Environmental Safety Incident Reporting data (including categories from arson to tobacco use).

 

“It is currently collected 3 times a year,” the slide reads.

 

“A weekly collection is being considered.”...

 

For full post, please visit:

https://www.nytimes.com/2019/08/27/opinion/florida-school-surveillance.html 


Editorial: The datafication of education // Learning, Media and Technology


By Juliane Jarke and Andreas Breiter
"The increasing datafication, in particular the availability of data and corresponding algorithms introduces new means to measure, capture, describe and represent social life in numbers. The education sector is one of the most noticeable domains affected by datafication, because it transforms not only the ways in which teaching and learning are organised but also the ways in which future generations (will) construct reality with and through data. The datafication of education comprises of the collection of data on all levels of educational systems (individual, classroom, school, region, state, international), potentially about all processes of teaching, learning and school management. This proliferation of data changes decision-making and opinion-forming processes of educational stakeholders such as education policy, school supervision, school authorities, teachers, students and parents. For example, data are used to improve school development, to hold schools and teachers accountable, to control access to schooling or to compare student achievements across countries.

 

Such use cases raise expectations with respect to increased transparency, accountability, service orientation and civic participation but also associated fears with respect to surveillance and control, privacy issues, power relations, and (new) inequalities (e.g., Anagnostopoulos, Rutledge, and Jacobsen 2013; Eynon 2013; Selwyn 2015; Livingstone and Sefton-Green 2016; Lupton and Williamson 2017).

 

Within the educational context, more and heterogeneous data are being generated – deliberately – for monitoring, surveillance or evaluation purposes, but also – automatically – through routine operations of a manifold of digital devices and systems (Selwyn 2015), producing ‘digital traces’ (Breiter and Hepp 2017).

 

Schools, for example, are being transformed into ‘data platforms’ in which ‘a wide range of data tracking, sensing and analytics technologies are being mobilised’ (Williamson 2015a). These digital educational data are distinct from pre-digital forms as they may be exhaustive in scope, highly detailed, and can be combined in a flexible manner and at different aggregation levels (Parks 2014). Such possibilities have always existed on a small scale, but new data infrastructures and algorithmic capabilities allow for analytics of an ‘unprecedented complexity and scope’ (Parks 2014). However, the underlying algorithms and the ways in which data are produced by data providers and statisticians, as well as the role of software companies and educational technology providers, are hardly understood (Eynon 2013; Williamson 2015b).

 

This special issue aims to shed light on the dynamics of datafication and the related transformation of education.
Contributions consider data practices that span different countries, educational fields and governance levels, from early childhood education (Bradbury, this issue) to schools (Ratner et al, this issue; Manolev et al, this issue), universities (Jones & McCoy, this issue), educational technology providers (Macgilchrist, this issue), and educational policy making and governance (Williamson & Piattoeva, this issue). In the following, we provide a brief overview of datafication in these different educational domains and subsequently reflect on the ambivalent consequences described in the contributions of this special issue."

 

Jarke, J. and Breiter, A. (2019). Editorial: the datafication of education. Learning, Media and Technology, 44, 1-6.


Students protest Zuckerberg-backed digital learning program and ask him: ‘What gives you this right?’ // Washington Post


By Valerie Strauss

"Students at a New York high school have protested in recent weeks an online education program developed with engineers working for Facebook founder Mark Zuckerberg in the latest challenge to the growing “personalized learning” movement in U.S. education.

 

More than 100 students from Brooklyn’s Secondary School for Journalism left campus during school hours last week and this week. Protest leaders sent a letter to Zuckerberg questioning his support for the Summit Learning Platform, which is being used in some 380 schools in a number of states and the District of Columbia.

 

The students said they weren’t learning on the platform and are concerned about the privacy of personal information collected on it. Their demonstration was the latest of several in a number of states, including one in a Connecticut school district, where officials ended their collaboration with Summit.

 

The free platform, which offers online lessons and assessments, was developed by a network of 11 charter schools in California and Washington known collectively as Summit Public Schools, and Facebook engineers helped develop the software. Zuckerberg and his wife, Priscilla Chan, back the learning platform with engineering support through their for-profit Chan Zuckerberg Initiative. The Summit website says the platform is a “personalized, research-backed approach” to teaching and learning.

 

“Personalized learning” — one of the reform models being promoted by the Chan Zuckerberg Initiative, philanthropists and businesses — is the latest form of what used to be known as “differentiated learning,” or, simply, learning specific to the student. Today, online programs allow students to move at their own pace.

 

Though there is no consensus definition of “personalized learning,” and though it seems to make intuitive sense to enable students to move at their own pace, in practice, this has amounted to computer-based learning programs of varying quality that require kids to sit in front of screens for a good part of the school day.

 

Kelly Hernandez, 17, a senior at the Secondary School for Journalism, said she helped organize the protest because students felt their complaints about Summit were not being heard. She said students began using it at the beginning of the school year without background information.

 

“We weren’t asked for an opinion about whether we would want to do Summit Learning,” she said. “ ‘Just use the computer. Here’s your name and password. Enjoy.’ ”

 

Akila Robinson, 17, another protest leader at the school, said she had problems logging on to Summit for two months and couldn’t get help. Another student, she said, had the same sign-on information.

 

School officials declined to comment on the protest or issues with Summit.

 

After the protest, school officials told students the program would no longer be used for juniors and seniors, but that ninth- and 10th-graders would continue using it.

 

Hernandez and Robinson sent a letter to Zuckerberg, which says in part (see the letter in full below):

"Unfortunately we didn’t have a good experience using the program, which requires hours of classroom time sitting in front of computers. Not all students would receive computers, the assignments are boring, and it’s too easy to pass and even cheat on the assessments. Students feel as if they are not learning anything and that the program isn’t preparing them for the Regents exams they need to pass to graduate. Most importantly, the entire program eliminates much of the human interaction, teacher support, and discussion and debate with our peers that we need in order to improve our critical thinking.

 

Unlike the claims made in your promotional materials, we students find that we are learning very little to nothing. It’s severely damaged our education, and that’s why we walked out in protest. . . .

Another issue that raises flags to us is all our personal information the Summit program collects without our knowledge or consent. We were never informed about this by Summit or anyone at our school, but recently learned that Summit is collecting our names, student ID numbers, email addresses, our attendance, disability, suspension and expulsion records, our race, gender, ethnicity and socio-economic status, our date of birth, teacher observations of our behavior, our grade promotion or retention status, our test scores and grades, our college admissions, our homework, and our extracurricular activities. Summit also says on its website that they plan to track us after graduation through college and beyond. Summit collects too much of our personal information, and discloses this to 19 other corporations.

 

What gives you this right, and why weren’t we asked about this before you and Summit invaded our privacy in this way?"...

 

 

For full letter and story, please see: 

https://www.washingtonpost.com/education/2018/11/17/students-protest-zuckerberg-backed-digital-learning-program-ask-him-what-gives-you-this-right/?utm_term=.9a959456b53d 


Race After Technology: Abolitionist Tools for the New Jim Code // Ruha Benjamin, Polity Books


"From everyday apps to complex algorithms, Ruha Benjamin cuts through tech-industry hype to understand how emerging technologies can reinforce White supremacy and deepen social inequity.

 

Benjamin argues that automation, far from being a sinister story of racist programmers scheming on the dark web, has the potential to hide, speed up, and deepen discrimination while appearing neutral and even benevolent when compared to the racism of a previous era. Presenting the concept of the “New Jim Code,” she shows how a range of discriminatory designs encode inequity by explicitly amplifying racial hierarchies; by ignoring but thereby replicating social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite. Moreover, she makes a compelling case for race itself as a kind of technology, designed to stratify and sanctify social injustice in the architecture of everyday life.

 

This illuminating guide provides conceptual tools for decoding tech promises with sociologically informed skepticism. In doing so, it challenges us to question not only the technologies we are sold but also the ones we ourselves manufacture."

 

For more information or to order, please visit: 

http://politybooks.com/bookdetail/?isbn=9781509526390  


Out of 43 Blockchain Startups, Zero Have Delivered Products // Futurism 


By Dan Robitzski
Weakest Link

Ask a blockchain advocate and you’ll hear all about how decentralized databases are primed to save the world. But if you ask 43 high-profile blockchain technology startups what they’ve built since raising funds, you’ll hear nothing but crickets.

At least, that’s what happened when MERL Tech, a technology research firm that monitors trends in emerging technology, reached out to blockchain companies about their work. The firm’s conclusion is that blockchain startups are promising big and delivering nothing.

Investigating The Ledger

In a report published Thursday, MERL Tech documented the ambitious whitepapers and apparently too-good-to-be-true claims of 43 blockchain startups that were featured prominently on internet searches.

MERL Tech’s team found no evidence that these companies actually delivered any sort of functional products or services — not even any sort of updates on how their work was coming along. And reaching out directly yielded nothing but radio silence.

 

A blockchain is a sort of distributed database that advocates argue will increase transparency by redistributing power. The idea is that everyone who buys in will have their own copy of anything that happens on a blockchain — for example, cryptocurrency transactions — so no centralized entity like a big bank could hold too much power.
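For readers unfamiliar with the data structure itself, the sketch below shows the core idea in miniature: each block stores a hash of its predecessor, so any edit to history is detectable. This is a teaching sketch under simplified assumptions, not code from any of the surveyed startups; production systems add consensus protocols, peer-to-peer replication, and economic incentives.

```python
# Minimal hash-chained ledger: the data structure behind "blockchain."
# Each block's hash covers its contents and the previous block's hash,
# so tampering with any past entry is detectable by re-hashing.
import hashlib
import json

def block_hash(block):
    payload = {"data": block["data"], "prev_hash": block["prev_hash"]}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

chain = [make_block("genesis", "0" * 64)]
for tx in ["alice pays bob 5", "bob pays carol 2"]:
    chain.append(make_block(tx, chain[-1]["hash"]))

chain[1]["data"] = "alice pays bob 5000"  # tamper with history...

for i, block in enumerate(chain):
    ok = block["hash"] == block_hash(block)
    print(f"block {i}: {'ok' if ok else 'TAMPERED'}")  # ...and get caught
```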

 

But many question whether or not blockchain tech actually serves a purpose. Often, blockchain seems like a buzzword that tech startups add to their pitch decks in order to attract investments.

Selection Bias

Of course, 43 individual startups do not represent the entire blockchain industry. It’s possible that MERL Tech, which likely acted in good faith, happened to investigate some bad apples or select shadier companies.

Ultimately, take this as a warning to do your due diligence before buying into a new blockchain company’s bold claims."...

 

 

For full post, see: 

https://futurism.com/tech-research-investigated-43-blockchain-startups 

 

For more on Blockchain, see: http://bit.ly/Blockchain_Files


Facebook Loses Facial Recognition Appeal, Must Face Privacy Class Action // US News

 

"By Jonathan Stempel

"(Reuters) - A federal appeals court on Thursday rejected Facebook Inc's effort to undo a class action lawsuit claiming that it illegally collected and stored biometric data for millions of users without their consent.

 

The 3-0 decision from the 9th U.S. Circuit Court of Appeals in San Francisco over Facebook's facial recognition technology exposes the company to billions of dollars in potential damages to the Illinois users who brought the case.

 

It came as the social media company faces broad criticism from lawmakers and regulators over its privacy practices. Last month, Facebook agreed to pay a record $5 billion fine to settle a Federal Trade Commission data privacy probe.

 

"This biometric data is so sensitive that if it is compromised, there is simply no recourse," Shawn Williams, a lawyer for plaintiffs in the class action, said in an interview. "It's not like a Social Security card or credit card number where you can change the number. You can't change your face."

 

Facebook said it plans to appeal. "We have always disclosed our use of face recognition technology and that people can turn it on or off at any time," a spokesman said in an email. Google, a unit of Alphabet Inc, won the dismissal of a similar lawsuit in Chicago last December.

 

The lawsuit began in 2015, when Illinois users accused Facebook of violating that state's Biometric Information Privacy Act in collecting biometric data.

 

Facebook allegedly accomplished this through its "Tag Suggestions" feature, which allowed users to recognize their Facebook friends from previously uploaded photos.

 

Writing for the appeals court, Circuit Judge Sandra Ikuta said the Illinois users could sue as a group, rejecting Facebook's argument that their claims were unique and required individual lawsuits.

 

She also said the 2008 Illinois law was intended to protect individuals' "concrete interests in privacy," and Facebook's alleged unauthorized use of a face template "invades an individual's private affairs and concrete interests."

 

The court returned the case to U.S. District Judge James Donato in San Francisco, who had certified a class action in April 2018, for a possible trial.

 

Illinois' biometric privacy law provides for damages of $1,000 for each negligent violation and $5,000 for each intentional or reckless violation.

 

Williams, a partner at Robbins Geller Rudman & Dowd, said the class could include 7 million Facebook users.

 

The FTC probe arose from the discovery that Facebook had let British consulting firm Cambridge Analytica harvest users' personal information. Facebook's $5 billion payout still requires U.S. Department of Justice approval."...

 

For full post, please visit:

https://www.usnews.com/news/top-news/articles/2019-08-08/facebook-loses-facial-recognition-technology-appeal-must-face-class-action 


High-Tech Redlining: How AI Upgrades Institutional Racism


"During the Great Depression, the federal government created the Home Owners’ Loan Corporation, which made low-interest home loans, and the Federal Housing Administration, which guaranteed mortgages made by private banks. The people running HOLC didn’t know much, if anything, about local borrowers, so they constructed “residential safety maps” that graded neighborhoods on a scale of A to D, with D neighborhoods color-coded in red to denote undesirable, high-risk areas. These “redlined” maps were also used by FHA and private businesses, and spilled over into banking, insurance, and retail stores, creating a vicious cycle of restricted services and deteriorating neighborhoods. 

 

Many private banks had their own redlined maps. In California, for example, Security First National Bank created a Los Angeles neighborhood rating system. Most neighborhoods in Central L. A. were redlined, often with explicit notations about “concentrations of Japanese and Negroes.” Boyle Heights was said to be “honeycombed with diverse and subversive elements.” Watts was redlined because it was a melting pot of not only Blacks and Japanese, but also Germans, Greeks, Italians, and Scots.

The 1968 Fair Housing Act outlawed redlining. However, in the age of Big Data, employment, insurance, and loan applications are increasingly being evaluated by data mining models that are not as overt but may be even more pernicious than color-coded maps, because they are not limited by geographic boundaries, and because their inner workings are often hidden.

 

No one, not even the programmers who write the code, knows exactly how black-box algorithms make their assessments, but it is almost certain that these algorithms directly or indirectly consider gender, race, ethnicity, sexual orientation, and the like: call it hi-tech redlining. It is not moral or ethical to penalize individuals because they share group characteristics that a black-box algorithm has found to be correlated statistically with behavior.

 

Many algorithms for evaluating job candidates identify statistical patterns in the characteristics of current employees. The chief scientist for one company acknowledged that some of the factors chosen by its software do not make sense. For example, the software found that several good programmers in its database visited a particular Japanese manga site frequently; so it decided that people who visit this site are likely to be good programmers. The chief scientist said that, “Obviously, it’s not a causal relationship,” but argued that it was still useful because there was a strong statistical correlation. This is an excruciating example of the ill-founded belief–even by people who should know better–that statistical patterns are more important than common sense.

 

The CEO also said that the company’s algorithm looks at dozens of factors, and constantly changes the variables considered important as correlations come and go. She believes that the ever-changing list of variables demonstrates the model’s power and flexibility. A more compelling interpretation is that the algorithm captures transitory coincidental correlations that are of little value. If these were causal relationships, they would not come and go. They would persist and be useful. An algorithm that uses coincidental correlations to evaluate job applicants is almost surely biased. How fair is it if a Mexican-American female does not spend time at a Japanese manga site that is popular with white male software engineers?
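The manga-site anecdote is simple to reproduce in miniature. In the sketch below (synthetic data and a hypothetical "visits the site" feature), a habit that coincidentally tracks performance in the training sample looks predictive, then collapses to a coin flip once the coincidence fades:

```python
# A coincidental feature "predicts" job performance in the training
# sample, then fails in deployment. Synthetic data, hypothetical feature.
import random

random.seed(1)

def sample(n, rate_good, rate_bad):
    rows = []
    for _ in range(n):
        good = random.random() < 0.5            # true performance
        rate = rate_good if good else rate_bad
        rows.append((random.random() < rate, good))  # (visits_site, good)
    return rows

train = sample(1000, rate_good=0.7, rate_bad=0.3)   # coincidence holds
deploy = sample(1000, rate_good=0.5, rate_bad=0.5)  # coincidence gone

def accuracy(rows):
    # "Model": predict a good programmer iff the applicant visits the site.
    return sum(visits == good for visits, good in rows) / len(rows)

print(f"training accuracy:   {accuracy(train):.0%}")   # ~70%, looks useful
print(f"deployment accuracy: {accuracy(deploy):.0%}")  # ~50%, a coin flip
```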

 

Similarly, Amazon recently abandoned an attempt to develop customized algorithms for evaluating the resumes of applicants. The algorithms were trained on the resumes of job applicants over the previous ten years, and favored people who were like the (mostly male) people Amazon had hired in the past. Candidates who went to all-women’s colleges were downgraded because men who worked at Amazon hadn’t gone to those colleges. Ditto with candidates who played on female sports teams.

 

A Chinese algorithm for evaluating loan applications looks at cell-phone usage; for example, how frequently incoming and outgoing calls are answered, and whether users keep their phones fully charged. Which of these metrics are signs of a phone user being a good credit risk; which of a bad credit risk? Any uncertainty you feel demonstrates the arbitrary nature of these markers."...

 

For full post, please visit: 

https://www.fastcompany.com/90269688/high-tech-redlining-ai-is-quietly-upgrading-institutional-racism


Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life // Ruha Benjamin, Editor

Published: June 2019
"From electronic ankle monitors and predictive-policing algorithms to workplace surveillance systems, technologies originally developed for policing and prisons have rapidly expanded into nonjuridical domains, including hospitals, schools, banking, social services, shopping malls, and digital life. Rooted in the logics of racial disparity and subjugation, these purportedly unbiased technologies not only extend prison spaces into the public sphere but also deepen racial hierarchies and engender new systems for social control. The contributors to Captivating Technology examine how carceral technologies are being deployed to classify and coerce specific populations and whether these innovations can be resisted and reimagined for more liberatory ends. Moving from traditional sites of imprisonment to the arenas of everyday life being reshaped by carceral technoscience, this volume culminates in a sustained focus on justice-oriented approaches to science and technology that blends historical, speculative, and biographical methods to envision new futures made possible.

Contributors. Ruha Benjamin, Troy Duster, Ron Eglash, Nettrice Gaskins, Anthony Ryan Hatch, Andrea Miller, Alondra Nelson, Tamara K. Nopper, Christopher Perreira, Winifred R. Poster, Dorothy E. Roberts, Lorna Roth, Britt Rusert, R. Joshua Scannell, Mitali Thakor, Madison Van Oort

 

For more information and to order, please visit:

https://www.dukeupress.edu/captivating-technology 


Louisiana Declares State Emergency After Cyberattacks On School Districts // The Hill


By Maggie Miller

"Louisiana Gov. John Bel Edwards (D) declared a state-wide emergency this week following cyberattacks on several school districts.

 

The declaration comes after three local school districts were hit by what local media have described as ransomware attacks, where hackers take over and encrypt vital cyber systems and demand a ransom to release the data.

 

The state-wide emergency declaration allows for state resources and cyber assistance to be given to these school districts from the Louisiana National Guard, the Louisiana State Police and the Office of Technology Services.

 

According to the governor’s office, the state is also cooperating with the FBI, state agencies and higher education partners. 

 

The school systems in Sabine, Morehouse and Ouachita parishes in northern Louisiana were all hit by malware attacks this week, with the governor’s office in contact with other school districts in the state to assess how far the malware has spread."...

 

For full post, please visit: 
https://thehill.com/homenews/state-watch/454928-louisiana-declares-state-emergency-after-cyber-attacks-on-school


K-12 Districts Wasting Millions by Not Using Purchased Software, New Analysis Finds // EdWeek Market Brief

 

By Michelle Davis
"A new analysis of K-12 school district spending bolsters the notion that many ed-tech products and software purchased aren’t actually used or don’t have the intended impact.

Ed-tech company Glimpse K12 studied $2 billion in school spending and found that on average, 67 percent of educational software product licenses go unused. Glimpse K12 tracked 200,000 curriculum software licenses purchased by 275 schools during the 2017-2018 school year. The analysis found educational software was the biggest source of wasted spending in K-12 districts.

 

In some districts, up to 90 percent of purchased software licenses are not being used, said Glimpse K12 co-founder Adam Pearson. The analysis estimates that overall, the districts studied are losing about $2 million on these products throughout the school year.

 

In the U.S. K-12 education marketplace, where districts spend about $8.4 billion on ed-tech software a year according to the Software and Information Industry Association, that could mean over $5.6 billion wasted annually."...
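The headline estimate is simple arithmetic on the two figures quoted above, as the short calculation below makes explicit:

```python
# Reproducing the article's estimate from its own quoted figures.
annual_spend = 8.4e9    # SIIA estimate of annual US K-12 software spending
unused_share = 0.67     # Glimpse K12's average share of unused licenses

print(f"${annual_spend * unused_share / 1e9:.2f} billion wasted per year")
# -> $5.63 billion, i.e., "over $5.6 billion"
```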

 

For full post, visit: 

https://marketbrief.edweek.org/marketplace-k-12/unused-educational-software-major-source-wasted-k-12-spending-new-analysis-finds/?cmp=soc-edit-tw


The Data Brokers Quietly Buying and Selling Your Personal Information // Fast Company

By Steven Melendez and Alex Pasternack
"It’s no secret that your personal data is routinely bought and sold by dozens, possibly hundreds, of companies. What’s less known is who those companies are, and what exactly they do.
 

Thanks to a new Vermont law requiring companies that buy and sell third-party personal data to register with the Secretary of State, we’ve been able to assemble a list of 121 data brokers operating in the U.S. It’s a rare, rough glimpse into a bustling economy that operates largely in the shadows, and often with few rules.

 

Even Vermont’s first-of-its-kind law, which went into effect last month, doesn’t require data brokers to disclose who’s in their databases, what data they collect, or who buys it. Nor does it require brokers to give consumers access to their own data or let them opt out of data collection. Brokers are, however, required to provide some information about their opt-out systems under the law–assuming they provide one.

 

If you do want to keep your data out of the hands of these companies, you’ll often have to contact them one by one through whatever opt-out systems they provide; more on that below.

The registry is an expansive alphabet soup of companies, from lesser-known organizations that help landlords research potential tenants or deliver marketing leads to insurance companies, to the quiet giants of data. Those include big names in people search, like Spokeo, ZoomInfo, White Pages, PeopleSmart, Intelius, and PeopleFinders; credit reporting, like Equifax, Experian, and TransUnion; and advertising and marketing, like Acxiom, Oracle, LexisNexis, Innovis, and KBM. Some companies also specialize in “risk mitigation,” which can include credit reporting but also background checks and other identity verification services.

 

Still, these 121 entities represent just a fraction of the broader data economy: The Vermont law only covers third-party data firms–those trafficking in the data of people with whom they have no relationship–as opposed to “first-party” data holders like Amazon, Facebook, or Google, which collect data directly from users."...

 

For full post, please see:

https://www.fastcompany.com/90310803/here-are-the-data-brokers-quietly-buying-and-selling-your-personal-information 

 

 


 

 


Examining Privacy Violations in Children's Apps // Reyes, Wijesekera, Reardon, Bar On, Razaghpanah, Vallina-Rodriguez, and Egelman, 2019

http://www.oecd.org/sti/ieconomy/workshop-protection-children-connected-world-1-2b-reyes-egelman.pdf


Fifty Attorneys General Open Antitrust Probe Into Google // CNBC


By Lauren Feiner
"Fifty attorneys general are joining an investigation into Google over possible antitrust violations, Texas Attorney General Ken Paxton, the initiative’s leader, announced Monday.

The news confirms reports last week about the bipartisan investigation into Google’s practices. The probe includes attorneys general from 48 states, the District of Columbia and Puerto Rico. California and Alabama are not involved in the probe, Paxton said at a press conference.

 

Other attorneys general at the media conference emphasized Google’s dominance in the ad market and use of consumer data."...

 

https://www.cnbc.com/2019/09/09/texas-attorney-general-leads-google-antitrust-probe.html 


Colleges are using big data to track students in an effort to boost graduation rates, but it comes at a cost // Hechinger Report


...."A third of U.S. colleges are using predictive analytics to boost graduation rates. Critics fear the algorithms may invade privacy and reinforce inequities." 

 

https://hechingerreport.org/predictive-analytics-boosting-college-graduation-rates-also-invade-privacy-and-reinforce-racial-inequities/


1.4 Million Student Social Security Numbers Found Unencrypted in Maryland


By Colin Wood

"A recent audit has revealed that the Maryland Department of Education “inappropriately” stored personal information of more than 1.4 million students and more than 230,000 teachers.

 

The report published by the Maryland General Assembly’s audit office last week found that the education department stored personally identifiable information in databases and applications in plaintext format, leaving it more susceptible to interception by bad actors. The audit also found that the department had not instituted “sufficient” malware protection, nor did it ensure that critical systems managed by third parties were protected against security risks.

 

“Specifically, we found critical servers running on outdated and no longer supported operating systems and a number of computers had not been updated with the latest releases for software products that were known to have significant security-related vulnerabilities,” the report states.

 

The audit found some of the department’s software hasn’t been updated since 2008.

 

The state’s Education Department, which in 2017 spent $7.7 billion, manages information technology systems housing student data that includes Social Security numbers, along with names of students and teachers. The auditor found that this information was not encrypted, despite the office’s recommendation to remediate this issue in its previous audit in March.

 

All state agencies in Maryland are legally required to encrypt sensitive data.
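For context on what the auditors were asking for: encrypting a sensitive field before it is stored is a few lines with a standard library. The sketch below uses the Python cryptography package purely as an illustration (not the department's actual stack), and it elides key management, which is the genuinely hard part:

```python
# Field-level encryption at rest, the practice the audit found missing.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in production: kept in a key vault,
fernet = Fernet(key)         # never stored alongside the data

ssn = "123-45-6789"          # example value, not a real SSN
token = fernet.encrypt(ssn.encode())

print(token)                           # ciphertext lands in the database
print(fernet.decrypt(token).decode())  # authorized reads use the key
```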

 

In a response to the audit, the education department agreed with the auditor’s recommendations to inventory its systems, delete all unneeded sensitive data, and to encrypt the sensitive data that remains. The department says its IT division is now working with the Maryland Department of Information Technology to complete these tasks by September 30.

 

Data left unencrypted on state and local government or school district servers has in many cases made its way into the hands of bad actors, who can infiltrate computer systems via a compromised email account or weak security on various educational software platforms. While all organizations are vulnerable to cyberattacks, the U.S. education sector ranked last out of 17 sectors for its cybersecurity capabilities, according to a report published last December by a New York-based cybersecurity research firm."...

 

For original post, please see: 

https://edscoop.com/1-4-million-student-social-security-numbers-found-unencrypted-in-maryland/

 

 


Parents raise concerns after Boulder Valley students are hit by data breach // The Denver Channel


[Selected segments of full article below]

By Meghan Lopez

"BOULDER COUNTY, Colo. -- Boulder Valley School District parents started to receive letters in the mail this week about a data breach affecting numerous students.

 

Boulder Valley was one of 13,000 districts and universities affected by a hack into the education software company Pearson PLC, a breach that exposed the data of as many as 60,000 students.

 

The information that was released was limited to the student’s first and last name and date of birth. Pearson says there is no indication that the data has been used.

 

Details of the data breach

The breach happened in November 2018; however, Pearson didn’t find out about it until this March, when it was notified by the FBI.

 

Boulder Valley Schools said it was informed that its students were affected by the breach this month and the district started sending out letters to parents last Friday, which Pearson paid for. The letters were sent out only to the students who were affected.


“The delay in them, Pearson, telling us is because there’s an ongoing FBI investigation and with that investigation only so much information could be shared,” said Andrew Moore, the district’s chief information officer.

 

Current and former students as well as homeschool families who are affiliated with the district were all affected by the breach, including Moore’s own daughter, who recently graduated college.

 

The Pearson breach happened through its AIMSweb 1.0 system, a tool used in schools to track the achievements and progress of students over time.

 

Boulder Valley had already upgraded from the version of the system that was breached, something that was planned even before the hack.

 

The district shared some data with that company through its Infinite Campus student information system.

 

“Infinite campus has every student that has been here for quite some time. There’s a lot of data in Infinite Campus, just like every other school district across the country,” Moore said.

 

Parents are required to register their child’s information into the program for the school, regardless of which school they attend. That’s why some homeschooling parents whose children didn’t use the AIMSweb 1.0 system were affected.

 

“I think any company that has a data breach needs to step up,” Moore said.

 

He is not defending Pearson, but the district is also not planning to scale back cooperation with the company."

...

 

"Prior concerns

Pearson is not the only third-party vendor the district works with and shares some student data with. According to the district’s website, it works with more than 100 vendors that have signed its data protection policy.

 

This fact has caused concern among parents like Segur, Bocquet and Cohen for years.

 

There has been so much concern about student data privacy in the district, in fact, that parents created a Facebook page solely dedicated to that topic. The group has nearly 400 members.

 

Cohen tried to opt her five children out of BVSD’s data sharing numerous times but was told by the district that the opt-out option is strictly limited to technology used in the classroom and whether or not parents want their children to have access to computers.

 

She didn’t want her children’s information to be placed into the Infinite Campus system but was told it’s not optional for students affiliated with the district. She wants the district to be more open about which vendors it is giving data to, what data is being provided and how it is being used.

 

“Not having transparency about it going to all these vendors in this way is not sufficient,” she said.

 

Cohen says she has been bringing up these data privacy concerns with the district for about five years. Bocquet and Segur have each been voicing their concerns with the district for about three years.

 

They have written the school board, filed complaints with the district and even reached out to their state lawmakers about the need to protect student privacy. They also created an online petition for something to be done.

“I think the district has kind of played fast and loose with children’s data not concerning with whether the parents want to share that data but doing what’s convenient for them without consent of the parents,” she said.

 

Segur’s interest in the issue started when her son, who was in the third grade at the time, started coming home and telling his mother about all of the games he was able to play on the computer, including some that involved violence or had chatrooms where the students could talk to complete strangers.

 

“I just assumed that the school district was doing the right thing. You send your kids to school, the schools are safe,” Segur said. “Then I kind of discovered it’s the Wild West.”

 

Since then, the district has gotten a new superintendent and changes have started to happen. Boulder Valley Schools says this summer it added new filters to keep students from using things like Netflix and Hulu and to block certain ads.

 

However, Cohen says she would like more flagging systems to be put in place and for parents to be able to opt their children out of certain things as they see fit.

 

“Every time I see a new permutation of the agreements that parents are supposed to sign or students are supposed to sign, they tend to err on the side of protecting the district against the students and most of the liabilities are shifted to students and families,” Cohen said."...

 

For full post, please visit:

https://www.thedenverchannel.com/news/local-news/parents-raise-concerns-after-boulder-valley-students-are-hit-by-data-breach


Algorithmic Systems in Education: Incorporating Equity and Fairness When Using Student Data // Center for Democracy and Technology 


By Center for Democracy and Technology

"Some K-12 school districts are beginning to use algorithmic systems to assist in making critical decisions affecting students’ lives and education. Some districts have already integrated algorithms into decision-making processes for assigning students to schools, keeping schools and students safe, and intervening to prevent students from dropping out. There is a growing industry of artificial intelligence startups marketing their products to educational agencies and institutions. These systems stand to significantly impact students’ learning environments, well-being, and opportunities. However, without appropriate safeguards, some algorithmic systems could pose risks to students’ privacy, free expression, and civil rights.

 

This issue brief is designed to help all stakeholders make informed and rights-respecting choices and provides key information and guidance about algorithms in the K-12 context for education practitioners, school districts, policymakers, developers, and families. It also discusses important considerations around the use of algorithmic systems including accuracy and limitations; transparency and explanation; and fairness and equity.

To address these considerations, education leaders and the companies that work with them should take the following actions when designing or procuring an algorithmic system:

  • Assess the impact of the system and document its intended use: Consider and document the intended outcomes of the system and the risk of harm to students’ well-being and rights.

  • Engage stakeholders early and throughout implementation: Algorithmic systems that affect students and parents should be designed with input from those communities and other relevant experts.

  • Examine input data for bias: Bias in input data will lead to bias in outcomes, so it is critical to understand and eliminate or mitigate those biases before the system is deployed.

  • Document best practices and guidelines for future use: Future users need to know the appropriate contexts and uses for the system and its limitations.

Once an algorithmic system is created and implemented, the following actions are critical to ensuring these systems are meeting their intended outcomes and not causing harm to students:

  • Keep humans in the loop: Algorithmic decision-making systems still require that humans are involved to maintain nuance and context during decision-making processes.

  • Implement data governance: Because algorithmic systems consume and produce a lot of data, a governance plan is needed to address issues like retention limits, deletion policies, and access controls.

  • Conduct regular audits: Audits of the algorithmic system can help ensure that the system is working as expected and not causing discriminatory outcomes or other unexpected harm.

  • Ensure ongoing communication with stakeholders: Regular communication with stakeholders can help the community learn about, provide feedback on, and raise concerns about the systems that affect their schools.

  • Govern appropriate uses of algorithmic systems: Using an algorithm outside of the purposes and contexts for which it was designed and tested can yield unexpected, inaccurate, and potentially harmful results.

  • Create strategies for accountability and redress: Algorithmic systems will make errors, so the educational institutions employing them will benefit from having plans and policies to catch and correct errors, receive and review reports of incorrect decisions, and provide appropriate redress to students or others harmed by incorrect or unfair decisions.

  • Ensure legal compliance: While legal compliance is not enough to ensure that algorithmic systems are fair and appropriate, they must be held to the same legal standards and processes that govern other types of decision-making, including FERPA and civil rights protections.
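
To make the audit recommendation above concrete: a minimal sketch, in Python, of one way a district could check a system's outcomes for subgroup disparities. The subgroups, records, and four-fifths threshold are illustrative assumptions, not drawn from the CDT brief:

```python
from collections import defaultdict

# Hypothetical audit log: (student_subgroup, flagged_by_the_system).
decisions = [
    ("group_a", True), ("group_a", False), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", True),  ("group_b", True),  ("group_b", False),
]

def outcome_rates(records):
    """Rate of positive (flagged) outcomes per subgroup."""
    counts = defaultdict(lambda: [0, 0])  # subgroup -> [flagged, total]
    for subgroup, flagged in records:
        counts[subgroup][0] += int(flagged)
        counts[subgroup][1] += 1
    return {g: flagged / total for g, (flagged, total) in counts.items()}

def disparity_flags(rates, threshold=0.8):
    """Flag subgroups whose rate falls below `threshold` times the highest
    subgroup's rate (the informal 'four-fifths' screening heuristic)."""
    top = max(rates.values())
    return {g: (rate / top) < threshold for g, rate in rates.items()}

rates = outcome_rates(decisions)
print(rates)                   # {'group_a': 0.25, 'group_b': 0.75}
print(disparity_flags(rates))  # {'group_a': True, 'group_b': False}
```

A screen like this only surfaces a disparity; deciding whether it reflects bias in the input data or in the model still requires the human review and stakeholder communication the brief calls for.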

 

For full post, please visit:

https://cdt.org/insight/algorithmic-systems-in-education-incorporating-equity-and-fairness-when-using-student-data/

 

To download report: 

https://cdt.org/files/2019/08/2019-08-08-Digital-Decision-making-Brief-FINAL.pdf

Scooped by Roxana Marachi, PhD

Teen Hacker Finds Glitch in School Software That Exposed Millions of Records


By Andy Greenberg

"A few short decades ago, the archetypal hacker was a bored teenager breaking into his school's network to change grades, à la Ferris Bueller. So today, when cybersecurity has become the domain of state-sponsored spy agencies and multibillion-dollar companies, it may be refreshing to know that the high school hacker lives on—as do the glaring vulnerabilities in school software.

 

At the Defcon hacker conference in Las Vegas today, 18-year-old Bill Demirkapi presented his findings from three years of after-school hacking that began when he was a high school freshman. Demirkapi poked around the web interfaces of two common pieces of software, sold by tech firms Blackboard and Follett and used by his own school. In both cases, he found serious bugs that would allow a hacker to gain deep access to student data. In Blackboard's case in particular, Demirkapi found 5 million vulnerable records for students and teachers, including student grades, immunization records, cafeteria balances, schedules, cryptographically hashed passwords, and photos.

 

Demirkapi points out that if he, then a bored 16-year-old motivated only by his own curiosity, could so easily access these corporate databases, his story doesn't reflect well on the broader security of the companies holding millions of students' personal information. "The access I had was pretty much anything the school had," Demirkapi says. "The state of cybersecurity in education software is really bad, and not enough people are paying attention to it."

5,000 Schools, 5 Million Records

Demirkapi found a series of common web bugs in Blackboard's Community Engagement software and Follett's Student Information System, including so-called SQL-injection and cross-site-scripting vulnerabilities. For Blackboard, those bugs ultimately allowed access to a database that contained 24 categories of data, everything from phone numbers to discipline records, bus routes, and attendance records—though not every school seemed to store data in every field. Only 34,000 of the records included immunization history, for instance. More than 5,000 schools appeared to be included in the data, with roughly 5 million individual records in total, including students, teachers, and other staff.
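
The article doesn't reproduce Demirkapi's exploits, but the general shape of a SQL-injection bug is well documented. A minimal sketch, assuming a toy SQLite table (the table, data, and input are invented for illustration): building a query by string concatenation lets attacker-controlled input rewrite the query, while a parameterized query treats the same input as a literal value:

```python
import sqlite3

# Toy database standing in for a student information system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT, grade TEXT)")
conn.execute("INSERT INTO students VALUES ('alice', 'A'), ('bob', 'B')")

user_input = "nobody' OR '1'='1"  # attacker-controlled value

# Vulnerable pattern: the input is spliced into the SQL string, so the
# OR clause becomes part of the query and every row comes back.
vulnerable = conn.execute(
    "SELECT * FROM students WHERE name = '" + user_input + "'"
).fetchall()
print(vulnerable)  # [('alice', 'A'), ('bob', 'B')]

# Standard fix: a parameterized query keeps the input as data, not SQL.
safe = conn.execute(
    "SELECT * FROM students WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # []
```

Parameterized queries (and output encoding, for the cross-site-scripting side) are the textbook mitigations for exactly this class of bug.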

 
 

In Follett's software, Demirkapi says he found bugs that would have given a hacker access to student data like grade point average, special education status, number of suspensions, and passwords.

 

Unlike in Blackboard's software, those passwords were stored unencrypted, in fully readable form. By the time Demirkapi had gained that level of access to Follett's software, however, he was two years into his hacking escapades and slightly better informed about legal dangers like the Computer Fraud and Abuse Act, which forbids gaining unauthorized access to a company's network. So while he says he checked the data about himself and a friend who gave him permission, to verify that the bugs led to access, he didn't explore further or enumerate the total number of vulnerable records, as he had with Blackboard. "I was a little stupider in the 10th grade," he says of his earlier explorations.
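
The storage contrast the article draws, Follett's fully readable passwords versus Blackboard's cryptographically hashed ones, comes down to roughly the following. A minimal sketch using Python's standard library; the salt size and iteration count are illustrative choices, not a production recommendation:

```python
import hashlib, os

password = "correct horse battery staple"

# Plaintext storage: anyone who can read the database reads the password.
stored_plaintext = password

# Salted, iterated hashing: the database holds only a derived value that
# is costly to reverse; the random salt defeats precomputed lookup tables.
salt = os.urandom(16)
stored_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

# Verification recomputes the derivation from the login attempt.
# (A real system would also use a constant-time comparison.)
attempt = "correct horse battery staple"
assert hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, 200_000) == stored_hash
```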

 

When WIRED reached out to Blackboard and Follett, Follett's senior vice president of technology George Gatsis expressed his thanks to Demirkapi for helping the company identify its bugs, which he says were fixed by July of 2018. "We were happy to work with Bill and grateful he was willing to work through those things with us," Gatsis says. But Gatsis also claimed that even with the security flaws he exploited, Demirkapi could never have accessed Follett data other than his own. Demirkapi counters that he "100 percent had access to other people’s data," and says he even showed Follett's engineers the password of the friend who had let him access his information.

 

For full post, please visit:

https://www.wired.com/story/teen-hacker-school-software-blackboard-follett/ 

Rescooped by Roxana Marachi, PhD from Social Impact Bonds, "Pay For Success," Results-Based Contracting, and Blockchain Identity Systems

Ethereum’s "Smart Contracts" Run on Blockchain are Full of Holes // Technology Review


By Mike Orcutt

 

"Blockchain-powered computer programs promise to revolutionize the digital economy, but new research suggests they’re far from secure.

 

Computer programs that run on blockchains are shaking up the financial system. But much of the hype around what are called smart contracts is just that. It’s a brand-new field. Technologists are just beginning to figure out how to design them so they can be relied on not to lose people’s money, and—as a new survey of Ethereum smart contracts illustrates—security researchers are only now coming to terms with what a smart-contract vulnerability even looks like.

 

Digital vending machines: The term “smart contract” comes from digital currency pioneer Nick Szabo, who coined it more than 20 years ago (and who may or may not be Satoshi Nakamoto). The basic idea, he wrote, is that “many kinds of contractual clauses (such as collateral, bonding, delineation of property rights, etc.) can be embedded in the hardware and software we deal with, in such a way as to make a breach of contract expensive (if desired, sometimes prohibitively so) for the breacher.” Szabo called physical vending machines a “primitive ancestor of smart contracts,” since they take coins and dispense a product and the correct change according to the displayed price.

 

Enter the blockchain: Today, the most common conception of a smart contract is a computer program stored on a blockchain. A blockchain is essentially a shared accounting ledger that uses cryptography and a network of computers to track assets and secure the ledger from tampering. For Bitcoin, that gives two parties who don’t know each other an ironclad guarantee that an agreed upon transfer of funds will happen as expected—that is, no one will get cheated.
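
A toy sketch of the tamper-evidence described above: each block commits to the hash of the block before it, so altering an earlier entry invalidates everything downstream. This omits what makes real blockchains work at scale (networking, consensus, signatures) and only illustrates the chaining:

```python
import hashlib, json

def block_hash(block):
    """Hash a block's full contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transaction):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "tx": transaction})

chain = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")

# Rewriting an earlier transaction is detectable: the next block's stored
# prev_hash no longer matches the recomputed hash of the altered block.
chain[0]["tx"] = "alice pays bob 500"
assert chain[1]["prev_hash"] != block_hash(chain[0])
```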

 

Smart contracts are where things get interesting. Using a smart contract, two people could create a system that withdraws funds from one person’s account—a parent’s, let’s say—and deposits them into a child’s account if and when the child’s balance falls below a certain level. And that’s just the simplest example—in theory, smart contracts can be used to program all kinds of financial agreements, from derivatives contracts to auctions to blockchain-powered escrow accounts.
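
Real Ethereum contracts are written in a language like Solidity, but the parent-and-child arrangement described above reduces to logic along these lines. A Python sketch of the rule, with invented names and thresholds, not a deployable contract:

```python
def top_up(parent_balance, child_balance, minimum=50, transfer=25):
    """If the child's balance falls below `minimum`, move `transfer` funds
    from parent to child; otherwise leave both balances unchanged.
    On a blockchain, this check would execute automatically on-chain,
    with neither party able to skip or rewrite it."""
    if child_balance < minimum and parent_balance >= transfer:
        return parent_balance - transfer, child_balance + transfer
    return parent_balance, child_balance

print(top_up(200, 40))  # (175, 65): below the threshold, so funds move
print(top_up(200, 80))  # (200, 80): above the threshold, nothing happens
```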

 

ICOs everywhere: One of the most popular applications of smart contracts has been to create new cryptocurrencies. A few of them have provided glimpses of a new kind of economy in which a purpose-made digital currency can be used for a “decentralized” service, like data storage or digital currency trading. Investor excitement over the promise of such applications has helped fuel the ICO craze, which has raised over $5 billion.

 

But hold your horses: Technologists still don’t have a full picture of what a security hole in a smart contract looks like, says Ilya Sergey, a computer scientist at University College London, who coauthored a study on the topic published last week.

Users learned this the hard way in 2016 when a hacker stole $50 million from the so-called Decentralized Autonomous Organization, which was based on the Ethereum blockchain. And in November around $150 million suddenly became inaccessible to users of the wallet service Parity, which is also rooted in Ethereum.

 

Sergey and colleagues used a novel tool to analyze a sample of nearly one million Ethereum smart contracts, flagging around 34,000 as vulnerable—including the one that led to the Parity mishap. Sergey compares the team’s work to interacting with a vending machine, as though the researchers randomly pushed buttons and recorded the conditions that made the machine act in unintended ways. “I believe that a large number of vulnerabilities are still to be discovered and formally specified,” Sergey says."...

 

For original post, please visit:

https://www.technologyreview.com/s/610392/ethereums-smart-contracts-are-full-of-holes/ 

Rescooped by Roxana Marachi, PhD from Educational Psychology & Technology

What is Data Exploitation? // Privacy International

To view video on YouTube, see: https://www.youtube.com/watch?v=8CKJtfLV6HU

 

For questions related to the potential for Data Exploitation with "Smart Cities" projects, see: https://eduresearcher.com/2018/05/02/smartcity/ 

Scooped by Roxana Marachi, PhD

Smile, Your City Is Watching You // The New York Times


By Ben Green

"Walking through the streets of New York City, you can feel the thrill of being lost in the crowd. As throngs of people filter past, each going about their days, it seems possible to blend in without being noticed.

 

But as municipalities and companies pursue the dream of “smart cities,” creating hyper-connected urban spaces designed for efficiency and convenience, this experience is receding farther and farther from reality.

 

Consider the LinkNYC kiosks installed across New York City — more than 1,700 are already in place, and there are plans for thousands more. These kiosks provide public Wi-Fi, free domestic phone calls and USB charging ports.

 

Yet the LinkNYC kiosks are not just a useful public service. They are owned and operated by CityBridge (a consortium of companies that includes investment and leadership from Sidewalk Labs — a subsidiary of Alphabet, the parent company of Google) and are outfitted with sensors and cameras that track the movements of everyone in their vicinity. Once you connect, the network will record your location every time you come within 150 feet of a kiosk.

 
  

And although CityBridge calls this information “anonymized” because it doesn’t include your name or email address — the system instead records a unique identifier for each device that connects — when millions of these data points are collected and analyzed, such data can be used to track people’s movements and infer intimate details of their lives."...
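
The re-identification risk described above follows from a simple join: a persistent device identifier links every sighting of the same phone into a trajectory, no name or email required. A minimal sketch over invented log records:

```python
from collections import defaultdict

# Hypothetical kiosk logs: (device_id, kiosk_location, time).
# No name appears anywhere, yet the identifier ties the rows together.
pings = [
    ("dev_7f3a", "8th Ave & W 23rd St", "08:02"),
    ("dev_19c2", "Broadway & W 4th St", "08:05"),
    ("dev_7f3a", "7th Ave & W 14th St", "08:31"),
    ("dev_7f3a", "6th Ave & W 8th St",  "08:54"),
]

trajectories = defaultdict(list)
for device, location, time in pings:
    trajectories[device].append((time, location))

# One "anonymized" identifier is enough to reconstruct a morning's route,
# from which home, workplace, and routines can often be inferred.
for device, route in sorted(trajectories.items()):
    print(device, sorted(route))
```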

 

For full post, please visit: 
https://www.nytimes.com/2019/06/27/opinion/cities-privacy-surveillance.html 

Scooped by Roxana Marachi, PhD

Johns Hopkins Researchers Found “Significant Problems” With Summit Platform Use in Providence Schools // E-Learning Inside


By Henry Kronk

"In May, Rhode Island Department of Education Commissioner Angelica Infante-Green invited researchers from the Johns Hopkins Institute for Education Policy to review the Providence Public School District. Part of their investigation reported on the use of Summit Learning, a free platform developed by Summit Public Schools (a charter network in the Bay Area) and the Chan Zuckerberg Initiative.

 

The Providence Public School District has not been in great shape for some time. According to scores on the Rhode Island Comprehensive Assessment System (RICAS), just 10% of Providence students scored proficient in math and 14% in language arts across grade levels. In the 2017-18 school year, 87.7% of enrolled students qualified for free or reduced-price lunch.

The district, therefore, might make for an ideal candidate for Summit Learning. When Summit CEO Diane Tavenner launched her first school in Redwood City, Calif. in 2003, the school drew its first class from a student body where 90% were poor and 90% were English learners, according to Mother Jones. Tavenner and her staff managed to graduate 100% of Summit’s first cohort.

If 2019 was anything like past years in Providence, that likely did not occur. While some Providence schools report graduation rates in the high 90s, many others fall below 70%, according to the Rhode Island Department of Education. The Providence Public School District, like many urban American education systems, deals every year with numerous issues that amount to a steep uphill struggle.

When it comes to learning and academic performance, however, Summit Learning has not helped, according to the Johns Hopkins researchers led by Dr. Jay Plasman. The team found “significant problems in the use of the Summit Learning Platform.” The researchers observed numerous occasions of plug-and-play teaching, in which the instructor set every student to work individually on the platform without leading or engaging the class.

“In one school, we did not observe a single Summit math teacher engage in whole-class or even small-group math instruction,” the researchers write. “Instead, teachers either completed work at their desks, and/or answered questions when students raised their hand. Finally, the lack of teacher surveillance of student progress in some Summit classrooms meant that students worked very slowly through the material.”

This lack of teacher engagement allowed for numerous misuses of the platform. Many students were observed spending long periods off-task. Some worked on assignments for other classes. Others watched unrelated YouTube videos.

Plasman’s team also found another trait that has been reported elsewhere: students often don’t go through the material on Summit Learning, but instead skip right to the assessment and try to guess their way through to the next section.

“To paint a picture of one Summit classroom at a given moment during our visit: Four students were working on history, one student stalled on an index screen, one stalled on a choice screen, one focused on a screen with other (non-math) content, two doing mathematics well below grade-level work, and two doing mathematics at, or close to, grade level,” the researchers write.

The team found that the platform was “almost universally disliked” among students.

 

Interviewees reported being bored, burnt-out, and uncomfortable with the amount of screen time the platform required."...

 

The Providence Journal has published the investigation in full.

 


 

For original post, please visit:

https://news.elearninginside.com/johns-hopkins-researchers-found-significant-problems-with-summit-learning-use-in-providence-schools/ 

Scooped by Roxana Marachi, PhD

Veillance and Transparency: A Critical Examination of Mutual Watching in the Post-Snowden, Big Data Era // Big Data & Society

Guest Editors:
Prof. Vian Bakir, School of Creative Studies & Media, Bangor University
Prof. Martina Feilzer, School of Social Sciences, Bangor University
Dr. Andrew McStay, School of Creative Studies & Media, Bangor University
_________________

"This special theme proposes that we live in a techno-cultural condition of increased and normalised transparency through various veillant forces (Steve Mann’s term for mutual watching) ranging from surveillant organizations and states to sousveillant individuals. Our papers address the technical, social, economic, political, legal, ethical and cultural implications of this situation.

-  On Ethics, Values and Norms: What, if anything, can or should we do about practices of watching that operate without informed consent or adequate processes of accountability in the post-Snowden, Big Data era?

-  On Regulation, Power, Resistance and Social Change: Are existing mechanisms of regulation and oversight able to deal with nation-states’ desire for transparency of their citizens, or is resistance required from other quarters?

-  On Representation, Discourse and Public Understanding: What socio-cultural discourses and practices on veillance and transparency prevail; how do they position the sur/sous/veillant subject; and do they adequately educate and engage people on abstract veillance practices?

Editorial

Introduction to Special Theme: Veillance and transparency: A critical examination of mutual watching in the post-Snowden, Big Data era
Vian Bakir, Martina Feilzer, Andrew McStay
Big Data & Society, March 2017, 10.1177/2053951717698996

Original Research Articles

Shareveillance: Subjectivity between open and closed data
Clare Birchall
Big Data & Society, November 2016, 10.1177/2053951716663965

Empathic media and advertising: Industry, policy, legal and citizen perspectives (the case for intimacy)
Andrew McStay
Big Data & Society, November 2016, 10.1177/2053951716666868

Reluctant activists? The impact of legislative and structural attempts of surveillance on investigative journalism
Anthony Mills and Katharine Sarikakis
Big Data & Society, November 2016, 10.1177/2053951716669381

Algorithmic paranoia and the convivial alternative
Dan McQuillan
Big Data & Society, November 2016, 10.1177/2053951716671340

Towards Data Justice? The ambiguity of anti-surveillance resistance in political activism
Lina Dencik, Jonathan Cable and Arne Hintz
Big Data & Society, November 2016, 10.1177/2053951716679678

The machine that ate bad people: The ontopolitics of the precrime assemblage
Peter Mantello
Big Data & Society, December 2016, 10.1177/2053951716682538

Commentaries

The Snowden Archive-in-a-Box: A year of traveling experiments in outreach and education
Evan Light
Big Data & Society, November 2016, 10.1177/2053951716666869

Conceptualising the Right to Data Protection in an Era of Big Data
Yvonne McDermott
Big Data & Society, January 2017, 10.1177/2053951716686994

Big Data is a Big Lie without little data: Humanistic Intelligence as a Human Right
Steve Mann
Big Data & Society, February 2017, 10.1177/2053951717691550

Crowd-Sourced Intelligence Agency: Prototyping counterveillance
Jennifer Gradecki, Derek Curry
Big Data & Society, February 2017, 10.1177/2053951717693259

Tracing You: How transparent surveillance reveals a desire for visibility
Benjamin Grosser
Big Data & Society, February 2017, 10.1177/2053951717694053

Early Career Researcher Essay

Liberal Luxury: Decentering Snowden, Surveillance and Privilege
Piro Rexhepi
Big Data & Society, November 2016, 10.1177/2053951716679676

 

For original announcement, please see:

https://journals.sagepub.com/page/bds/collections/veillance-and-transparency 
