Educational Psychology & Technology

Big Hype, Hard Fall for News Corp.'s $1 Billion Ed-Tech Venture // EdWeek


By Benjamin Herold (EdWeek) 
 

"Amplify, the education division of Rupert Murdoch's company, is deeply in the red and on the auction block after its ambitious vision failed to materialize.


The global media giant News Corp. sought to push its way into the K-12 marketplace five years ago by betting big on technology. Now, despite a $1 billion investment and a steady stream of brash promises to radically disrupt the way public schools do business, the company's education division, known as Amplify, is deeply in the red and on the auction block.


Veteran observers of the fickle K-12 ed-tech market say they aren't surprised.


"There's a long history of education entrepreneurs who have crashed on the rocks because the market was not what they thought it would be," said Douglas A. Levin, a consultant on the ed-tech market and the recent head of the State Educational Technology Directors Association.


Earlier this month, it became clear that the highly publicized venture had become a financial albatross. When News Corp. announced that it would write off the education division's $371 million in losses over the past year and look to sell off Amplify, investors cheered, sending the parent company's share price up 7.5 percent, to $14.89.


Inside schools, meanwhile, the ripple effects of Amplify's striking demise promised to be minimal. A majority of the 30,000 or so tablet computers sold by the company went to a single district, and Amplify fell far short of its modest goal of getting its no-expense-spared digital curriculum into the hands of 30,000 students by the 2015-16 school year.


Experts attributed the company's lack of impact on the K-12 market to a series of miscalculations."...


For full post, click on title above or here: 
http://www.edweek.org/ew/articles/2015/08/26/big-hype-hard-fall-for-news-corps-ed-tech-venture.html 


Educational Psychology & Technology
This curated collection includes news, resources, and research related to the intersections of Educational Psychology and Technology. The page also serves as a research tool to organize online content. The grey funnel-shaped icon at the top allows for searching by keyword. For research more specific to tech, screen time, and health/safety concerns, please see: http://bit.ly/screen_time. To learn about the next wave of privatization involving technology intersections with Pay For Success, Social Impact Bonds, and Results Based Financing (often marketed with language promoting "public-private partnerships"), see http://bit.ly/sibgamble. For additional Educator Resources, please visit http://EduResearcher.com.

When "Innovation" is Exploitation: Data Ethics, Data Harms and Why We Need to Demand Data Justice // Marachi, 2019, Summer Institute of A Black Education Network 

To download the PDF, please click on the title above.

 

For more on the data brokers selling personal information from a variety of platforms, including education, please see: https://www.fastcompany.com/90310803/here-are-the-data-brokers-quietly-buying-and-selling-your-personal-information 

 

Please also visit: Parent Coalition for Student Privacy

https://www.studentprivacymatters.org/

  

and visit the Data Justice Lab: 

https://datajusticelab.org/

 


Ed Tech Cashes In on the Pandemic


By Gayle Greene

When schools shut down in mid-March, teachers rose valiantly to the occasion, redoing courses so they could be taught online, figuring out ways of reaching the many students who do not have high-speed internet. About a month into the retooling, a video appeared that gave vent to the frustration many were feeling in this netherworld of cyber teaching. A sweet young woman introduces herself as “the music teacher,” and says she’s composed an ode to online teaching. After a few disarming chords on her ukulele, she lets out a primal scream. That scream made the rounds of social media, even making national news.

The transition to online teaching made everyone aware of the value of person-to-person communication. The human signals that tell a teacher how a class is reacting—the sighs, groans, snorts, giggles, eye rolls, glances, body language—are stripped away online. The teacher can’t even tell if she’s being heard. Warmth is difficult to express; rapport, trust, bonding almost impossible to build. “Kids can be hard to motivate under the best of circumstances,” says teacher blogger Steven Singer, “but try doing it through a screen.” Students say so, too: “I can’t get myself to care … I just feel really disconnected from everything.”

Ed tech companies lost no time moving in. “When the pandemic hit, right away we got a list of all these technology companies that make education software that were offering free access to their products for the duration of the coronavirus crisis,” said Gordon Lafer, political economist at the University of Oregon and a member of his local school board. “They pitch these offerings as stepping up to help out the country in a moment of crisis. But it’s also like coke dealers handing out free samples.” Marketing has become so aggressive that a school superintendent near Seattle tweeted a heartfelt appeal to vendors: “Please stop. Just stop … my superintendent colleagues and I … need to focus on our communities. Let us do our jobs.” Her plea hit a nerve, prompting a survey by the National Superintendents Roundtable that revealed “a deep vein of irritation and discontent” at the barrage of texts, emails, and phone calls, “a distraction and nuisance” when they’re trying to deal with the COVID-19 crisis. Comments on this survey ranged from “negative in the extreme” to “scathing,” and expressed concerns that these products “have not been validated” and that “free” offers conceal contracts for long-term pay.

For the past two decades, ed tech has been pushing into public schools, convincing districts to invest in tablets, software, online programs, assessment tools. Many superintendents have allowed these incursions, directing funding to technology that might have been better spent on human resources, teachers, counselors, nurses, librarians (up to $5.6 billion of school technology purchased sits unused, according to a 2019 analysis in EdWeek Market Brief). Now the pandemic has provided ed tech a “golden opportunity,” a “tailwind” (these are the terms we hear): Michael Moe, head of the venture capitalist group Global Silicon Valley, says: “We see the education industry today as the health care industry of 30 years ago.” Not a happy thought.

Ed tech proponents have long claimed that classrooms are obsolete and that online is the future of education. Frederick Hess, director of education policy studies at the Koch-funded American Enterprise Institute, urges that the United States’ $700 billion public-education budget should be spent on “a bunch of online materials—along with a device for every child and better connectivity.” Education Secretary Betsy DeVos, who has close ties with the Koch network, also sees the classroom as obsolete: “If our ability to educate is limited to what takes place in any given physical building, we are never going to meet the unique needs of every student.”...

 

They may get their way.

“Personalized” Learning Without Persons

In May, New York Gov. Andrew Cuomo announced his intention “to work with the Gates Foundation to … develop a blueprint to reimagine education in the new normal.” “The old model of everybody goes and sits in a classroom and the teacher is in front of that classroom and teaches that class, and you do that all across the city, all across the state, all these buildings, all these physical classrooms,” Cuomo said. “Why, with all the technology you have?”

 

Bill Gates has been promoting various versions of so-called personalized learning for decades. Personalized programs are different from the online teaching most teachers did this spring, in that they eliminate the need for the teacher. Their interactive software has the student interacting with the computer, not a human being. They’re called “personalized” because an algorithm based on a student’s past performance generates “learning plans” tailored to her level and interests. The student sits, encased in headphones, responding to prompts, clicking her way through preset steps to predetermined answers; she demonstrates “competencies” by passing a test, then moves on to the next task and the next test, until she receives a “digital badge.” The student is said to be in charge, to have “ownership” of her education.
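To make the competency-progression mechanics described above concrete, here is a minimal sketch in Python of such a loop. Every name, threshold, and task below is a hypothetical illustration, not the implementation of any real "personalized learning" product:

```python
# Minimal sketch of the competency-progression loop described above.
# All names, thresholds, and tasks are hypothetical illustrations,
# not the implementation of any real "personalized learning" product.

import random

MASTERY_THRESHOLD = 0.8  # assumed pass mark for a "competency"

def administer_test(task):
    """Stand-in for the click-through test; returns a score in [0, 1]."""
    return random.random()

def next_task(history, task_bank):
    """'Personalize' by averaging past scores and picking the
    closest-difficulty task from a predetermined bank."""
    level = sum(history) / len(history) if history else 0.5
    return min(task_bank, key=lambda t: abs(t["difficulty"] - level))

def run_session(task_bank, n_tasks=5):
    history, badges = [], []
    for _ in range(n_tasks):
        task = next_task(history, task_bank)
        score = administer_test(task)
        history.append(score)
        if score >= MASTERY_THRESHOLD:
            badges.append(task["competency"])  # "digital badge" awarded
    return badges

task_bank = [
    {"competency": "fractions", "difficulty": 0.3},
    {"competency": "decimals", "difficulty": 0.5},
    {"competency": "percentages", "difficulty": 0.7},
]
print(run_session(task_bank))
```

The point of the sketch is that the "personalization" is a lookup over preset steps; the student's agency is limited to inputs the algorithm already expects.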

 

Far from putting her in charge, such programs make the student what the program says she should be. Paul Emerich France, a teacher who went to work for a Silicon Valley startup as a true believer but became rapidly disillusioned, describes personalized learning as “isolating … impersonal … disembodied and disconnected, with a computer constantly being a mediator between my students and me”: It “dehumanizes the learning environment.” Personalized programs may offer students choices about where and when to do assignments and whether they want dogs or cupcakes in their math and grammar exercises, but this is trivial compared with the excitement, curiosity, and discovery that a live class can generate, which actually can put a student in charge."...

 

For full post, please visit:

https://prospect.org/api/amp/education/ed-tech-cashes-in-on-the-pandemic/ 


Data Justice and COVID-19: Global Perspectives


"Data Justice and COVID-19: Global Perspectives
Edited by: Linnet Taylor, Gargi Sharma, Aaron Martin, and Shazade Jameson

The COVID-19 pandemic has reshaped how social, economic, and political power is created, exerted, and extended through technology. Through case studies from around the world, this book analyses the ways in which technologies of monitoring infections, information, and behaviour have been applied and justified during the emergency, what their side-effects have been, and what kinds of resistance they have met."

 

https://shop.meatspacepress.com/product/data-justice-and-covid-19-global-perspectives 

 

Pdf of the book is available here:

https://ia801504.us.archive.org/25/items/data-justice-and-covid-19/Data_Justice_and_COVID-19.pdf


Software that monitors students during tests perpetuates inequality and violates their privacy // Technology Review


By Shea Swauger

"The coronavirus pandemic has been a boon for the test proctoring industry. About half a dozen companies in the US claim their software can accurately detect and prevent cheating in online tests. ExamityHonorLockProctorio,ProctorU, Respondus and others have rapidly grown since colleges and universities switched to remote classes.

 

While there’s no official tally, it’s reasonable to say that millions of algorithmically proctored tests are happening every month around the world." Proctorio told the New York Times in May that business had increased by 900% during the first few months of the pandemic, to the point where the company proctored 2.5 million tests worldwide in April alone.

I'm a university librarian and I've seen the impacts of these systems up close. My own employer, the University of Colorado Denver, has a contract with Proctorio.

It’s become clear to me that algorithmic proctoring is a modern surveillance technology that reinforces white supremacy, sexism, ableism, and transphobia. The use of these tools is an invasion of students’ privacy and, often, a civil rights violation.

If you’re a student taking an algorithmically proctored test, here’s how it works: When you begin, the software starts recording your computer’s camera, audio, and the websites you visit. It measures your body and watches you for the duration of the exam, tracking your movements to identify what it considers cheating behaviors. If you do anything that the software deems suspicious, it will alert your professor to view the recording and provide them a color-coded probability of your academic misconduct.

Depending on which company made the software, it will use some combination of machine learning, AI, and biometrics (including facial recognition, facial detection, or eye tracking) to do all of this. The problem is that facial recognition and detection have proven to be racist, sexist, and transphobic over, and over, and over again.
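As a rough illustration of the pipeline described above, the sketch below combines behavioral signals into a color-coded "suspicion" score of the kind shown to professors. The signal names, weights, and cutoffs are hypothetical assumptions; vendors' actual models are proprietary:

```python
# Hypothetical sketch of an algorithmic-proctoring flagging pipeline
# like the one described above: behavioral signals in, color-coded
# "misconduct probability" out. Signal names, weights, and cutoffs
# are illustrative assumptions, not any vendor's actual model.

WEIGHTS = {
    "face_not_detected": 0.5,  # known to fail more often on darker skin
    "gaze_off_screen": 0.2,
    "background_noise": 0.2,   # penalizes parents and shared housing
    "left_camera_view": 0.3,   # penalizes medical and caregiving needs
}

def suspicion_score(events):
    """Combine per-signal event counts into a pseudo-probability."""
    raw = sum(WEIGHTS[name] * count for name, count in events.items())
    return min(raw / 5.0, 1.0)  # crude normalization to [0, 1]

def color_code(score):
    """Render the score the way a professor-facing dashboard might."""
    if score >= 0.66:
        return "red"
    if score >= 0.33:
        return "yellow"
    return "green"

events = {"face_not_detected": 3, "gaze_off_screen": 1,
          "background_noise": 2, "left_camera_view": 0}
score = suspicion_score(events)
print(color_code(score), round(score, 2))
```

Note that every weight encodes an assumption about what "normal" test-taking looks like, which is precisely where the discriminatory effects described below can enter.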

In general, technology has a pattern of reinforcing structural oppression like racism and sexism. Now these same biases are showing up in test proctoring software that disproportionately hurts marginalized students.

A Black woman at my university once told me that whenever she used Proctorio's test proctoring software, it always prompted her to shine more light on her face. The software couldn’t validate her identity and she was denied access to tests so often that she had to go to her professor to make other arrangements. Her white peers never had this problem.

Similar kinds of discrimination can happen if a student is trans or non-binary. But if you’re a white cis man (like most of the developers who make facial recognition software), you’ll probably be fine.

Students with children are also penalized by these systems. If you’ve ever tried to answer emails while caring for kids, you know how impossible it can be to get even a few uninterrupted minutes in front of the computer. But several proctoring programs will flag noises in the room or anyone who leaves the camera’s view as nefarious. That means students with medical conditions who must use the bathroom or administer medication frequently would be considered similarly suspect.

Beyond all the ways that proctoring software can discriminate against students, algorithmic proctoring is also a significant invasion of privacy. These products film students in their homes and often require them to complete “room scans,” which involve using their camera to show their surroundings. In many cases, professors can access the recordings of their students at any time, and even download these recordings to their personal machines. They can also see each student’s location based on their IP address.

Privacy is paramount to librarians like me because patrons trust us with their data. After 9/11, when the Patriot Act authorized the US Department of Homeland Security to access library patron records in their search for terrorists, many librarians started using software that deleted a patron’s record once a book was returned.

 

Products that violate people’s privacy and discriminate against them go against my professional ethos, and it’s deeply concerning to see such products eagerly adopted by institutions of higher education.

This zealousness would be slightly more understandable if there were any evidence that these programs actually did what they claim. To my knowledge, there isn’t a single peer-reviewed or controlled study that shows proctoring software effectively detects or prevents cheating. Given that universities pride themselves on making evidence-based decisions, this is a glaring oversight.

Fortunately, there are movements underway to ban proctoring software and ban face recognition technologies on campuses, as well as congressional bills to ban the US federal government from using face recognition. But even if face recognition technology were banned, proctoring software could still exist as a program that tracks the movements of students’ eyes and bodies. While that might be less racist, it would still discriminate against people with disabilities, breastfeeding parents, and people who are neuroatypical. These products can’t be reformed; they should be abandoned.

Cheating is not the threat to society that test proctoring companies would have you believe. It doesn’t dilute the value of degrees or degrade institutional reputations, and students aren’t trying to cheat their way into being your surgeon. Technology didn’t invent the conditions for cheating and it won’t be what stops it. The best thing we in higher education can do is to start with the radical idea of trusting students. Let’s choose compassion over surveillance.

Shea Swauger is an academic librarian and researcher at the University of Colorado Denver.

 

For original post please visit:

https://www.technologyreview.com/2020/08/07/1006132/software-algorithms-proctoring-online-tests-ai-ethics/


No Data About Us Without Us [Webinar] // Dignity in Schools

"No Data About Us Without Us

How and when educators, administrators, superintendents, and school boards make decisions about students and families, budget allocation and resource distribution is undergoing a transformation with the emergence of artificial intelligence and machine learning. Districts are trying to leverage new technologies to improve efficiencies and increase effectiveness without fully understanding the cost, compromise and collateral effects."

Direct link to video on YouTube
https://www.youtube.com/watch?v=b-SzLqMZd2o 


Key Topic Timestamps

Marika Pfefferkorn segment of webinar
12:30: History/background of the St. Paul, MN data-sharing project and related community action
25:35: Algorithms, Big Data, Predictive Analytics
28:35: Student voice - "My past does not predict my future"
35:00: Cradle to Prison Algorithms
36:15: Recognizing Points of Entry

Roxana Marachi segment of webinar
44:00: No Data About Us Without Us intro
52:24: Student protest of the Zuckerberg-backed Summit digital learning platform
55:13: Behavioral data extraction in the name of SEL/trauma-informed approaches
1:10:23: Global Silicon Valley Asset Management / push for technology via privatization
1:15:12: "Online Preschool"
1:17:32: Data Justice Lab (1:21:20 on Data Harms)
1:20:09: Data Exploitation
1:22:09: Chan Zuckerberg screenshot from CDE presentation 2/22/19
1:25:40: CZI funding of data-sharing project uses "predictive analytics" with Hoonuit Early Warning & Intervention System to flag "at risk" students
1:28:56: Critical perspectives on Pay for Success, Social Impact Bonds, Impact Investing, and blockchain-related data extraction
1:30:46: Next steps / What can we do? Resources and Q&A

________________________________

For more/related information, please see: 

http://bit.ly/edpsychtech

http://bit.ly/sibgamble

 

For resources, please visit:

Parent Coalition for Student Privacy 
https://www.studentprivacymatters.org/

 

AppCensus AppSearch: Learn the privacy cost of free apps
https://search.appcensus.io/

 

To download the slides for the 2nd half of the webinar, please see: 

http://sco.lt/7EQbK4


Special Issue: “The datafication of teaching in higher education: Critical issues and perspectives” // Teaching in Higher Education 


By Ben Williamson, Sian Bayne and Suellen Shay

"Universities have been significantly disrupted by the ongoing COVID-19 pandemic, and are likely to remain in a fragile, uncertain condition for months if not years to come. The very rapid shift to online teaching seems increasingly likely to be just a first step on a long path to the expansion of digital or hybrid technologies in higher education.  For many education technology (edtech) vendors, the pandemic is not just a health crisis and an educational emergency, but a market opportunity fueled both by private capital calculations and by desperate university customers. With the very continuation of higher education teaching at stake as universities recover from the coronavirus crisis, companies providing vital digital infrastructure for distance education are attractive prospects for educational and market institutions alike.

Our special issue on The datafication of teaching in higher education was already in production as coronavirus spread around the planet. The issues confronted by many of the authors, however, anticipate discussions now occupying universities as they work out how far to increase their digital delivery, and what to do about the huge quantities of data these technologies collect about their students, staff and institutional performances. Although the use of statistics, metrics and data to measure student achievement, staff outputs and university performance is not new, as we show in the editorial introduction, digital forms of data are becoming increasingly prevalent with the widespread introduction of digital technologies for teaching and learning. Predictive learning analytics, learning management systems, online learning platforms, performance dashboards, plagiarism detection, library resource management, student experience apps, attendance monitoring, and even artificial intelligence assistants and tutors all depend on the persistent collection and analysis of data, and are part of a rapidly growing edtech sector and a multibillion dollar education market.

 

The datafication of teaching in higher education is transformative in three key ways. First, it is expanding processes of measurement, comparison and evaluation to as many functions and activities of higher education as possible, through increasingly automated systems that run on highly opaque and proprietary code and algorithms that are based on specific technical understandings of education. Second, datafication privileges performance comparison more than ever, and thereby reinforces longstanding political preoccupations with marketization and competition, as the comparative performances of students, staff, courses, programs and whole institutions are made visible for evaluation and assessment. And third, datafication fuses higher education to the business models of a global education industry, which then reshapes higher education to fit its preferred ideas about what constitutes a measurably beneficial university experience. In other words, technologies of datafication are the material embodiment of particular measurement practices, political priorities and business plans, and reshape institutions of education to fit those forms.

 

The collected papers in the special issue tease out a number of key concerns. Paul Prinsloo foregrounds issues of ‘data power’, arguing that data systems define what ‘counts’ as a good student, an effective educator, or a quality education. He raises significant questions regarding the ‘data colonialism’ of edtech companies from the Global North pushing into Global South contexts to reveal ‘truths’ about education, students and teachers. Data analytics and the dashboards that present information about students are the focus of Michael Brown, whose article identifies the role of dashboards in ‘seeing students’ and shaping educators’ pedagogical strategies. Educators, he reports, may find their normal pedagogical routines stymied by the demands of datafication, and struggle to make sense of the data presented to them by their dashboards. This, for Michaela Harrison and coauthors, raises the issue of how ‘student data subjects’ are created from data in ways that make them visible to the educator’s eye as digital traces, which they argue may result in a ‘process of (un)teaching’ rather than meaningful teacher-student interaction.

 

Learning management systems have acquired some of the most extensive databases of student information on the planet. Roxana Marachi and Lawrence Quill draw specific attention to the learning management system Canvas, arguing it enables ‘frictionless’ data transitions across K12, higher education, and workforce data through the integration of third party applications and interoperability or data-sharing across platforms. They make the important call for greater public awareness concerning the use of predictive analytics, impacts of algorithmic bias, and enactment of ethical and legal protections for users who are required to use such software platforms. Juliana Raffaghelli and Bonnie Stewart suggest that building educators’ ‘data literacy’, with an emphasis on critical, ethical and personal approaches to datafication, is an important response to the increase of algorithmic decision-making and data collection in higher education, enabling educators to make sense of the systems that shape life and learning in the twenty-first century. Extending a critical data literacies approach to a computer science classroom, Mary Loftus and Michael Madden report on an experimental teaching module where students both explore the construction of machine learning models and learn to reflect on their social consequences as ‘students who will be building the autonomous, connected systems of the future’.

 

A number of the papers examine how datafication reinforces logics of marketization and performativity. Annette Bamberger, Yifat Bronshtein and Miri Yemini, for example, argue that as social media has become central to university marketing and reputation management, techniques of datafication help produce persuasive information that can be circulated as social media marketing material in the context of competitive struggles for the international student market. Aneta Hayes and Jie Cheng then examine the shortcomings of international teaching excellence and higher education outcomes frameworks, arguing that ‘epistemic equality’ and non-discrimination should be officially considered as indicators of teaching excellence, and show how evaluating universities on epistemic equality could work in practice. Such an approach stands in contrast to the surveillance techniques of the ‘smart campus’ analysed by Michael Kwet and Paul Prinsloo, who foreground the risks of normalizing surveillance architectures on-campus, call for a ban on various forms of dataveillance, and argue for decentralized services, ‘public interest technology’ and more democratic pedagogic models.

 

Rounding out the special issue, Neil Selwyn and Dragan Gasevic stage a dialogue between critical social science and data science. They add a computational dimension to familiar social criticisms of data representativeness, reductionism and injustice, as well as exploring social tensions inherent in technical claims to data-based precision, clarity and predictability, and finally highlight opportunities for productive interdisciplinary exchange and collaboration. Their paper offers a productive way forward for research on datafication in higher education. But significant challenges remain to reimagine and reshape the role of HE in the 2020s, both during the coronavirus recovery and in the longer term. We hope the special issue helps to catalyse debate about the limits, potential and challenges of the datafied university, and about the role of datafication in higher education for the future.

Ben Williamson (University of Edinburgh), Sian Bayne (University of Edinburgh) and Suellen Shay (University of Cape Town)"

 


https://teachinginhighereducation.wordpress.com/2020/04/30/special-issue-the-datafication-of-teaching-in-higher-education-critical-issues-and-perspectives/ 


The Color of Surveillance: Monitoring of Poor and Working People // Center on Privacy and Technology // Georgetown Law


https://www.law.georgetown.edu/privacy-technology-center/events/color-of-surveillance-2019/ 


University mandates wearable COVID-19 tracker, sparking student protest


By Jeremy Horwitz
"Despite the continued spread of COVID-19 throughout the world, schools have developed reopening plans for the fall semester — some with fully remote classes, others in traditional classrooms. Having opted to open both classrooms and residence halls, Michigan’s Oakland University will require residents and staff to wear COVID-19 trackers, a move that has led to a petition drive against the mandate.

Under the new rules, which students say were quietly introduced on a student life webpage at the end of July, resident students “must wear a BioButton,” a device introduced in May to continuously monitor a person’s vital signs for 90 days. Developer BioIntelliSense says that the coin-sized, disposable device can track temperature, respiratory rate, heart rate, body position, sleep, and activity state with medical-grade accuracy, enabling monitoring of students, workers, and high-risk patients without their active participation. In theory, the technology could be useful for businesses and educational institutions interested in bringing people back into physical gathering spaces.
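To see what "monitoring without active participation" means in data terms, here is a minimal sketch of the kind of record such a wearable might stream. Field names and the reporting interval are assumptions based on the capabilities listed above, not BioIntelliSense's actual schema:

```python
# Hypothetical model of the telemetry a vital-signs wearable like the
# one described above might stream. Field names and the reporting
# interval are assumptions, not BioIntelliSense's actual schema.

from dataclasses import dataclass, asdict
from datetime import datetime, timedelta

@dataclass
class VitalSignReading:
    device_id: str         # ties every reading to one wearer
    timestamp: datetime
    skin_temp_c: float
    respiratory_rate: int  # breaths per minute
    heart_rate: int        # beats per minute
    body_position: str     # e.g., "upright", "supine"
    activity_state: str    # e.g., "resting", "walking", "sleeping"

def simulate_day(device_id, start, interval_minutes=10):
    """Generate a day's worth of passive readings for one wearer."""
    t = start
    while t < start + timedelta(days=1):
        yield VitalSignReading(device_id, t, 36.6, 14, 72,
                               "upright", "resting")
        t += timedelta(minutes=interval_minutes)

readings = list(simulate_day("biobutton-0001", datetime(2020, 8, 3)))
print(len(readings), "readings per day")  # 144 at a 10-minute interval
print(asdict(readings[0]))
```

Even at an assumed ten-minute interval, one wearer generates 144 time-stamped physiological records per day, which is the scale of continuous collection the petitioners object to.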

 

Unsurprisingly, the petitioning students object to being continuously tracked on and off campus, and they cite a collection of valid concerns, ranging from violations of personal privacy to religious objections, as well as the absence of any agreement to the new requirement. While the petition isn’t demanding a stop to the use of BioButtons, it’s asking the university to make the trackers optional for staff and students, a step that would decrease the system’s efficacy — assuming that it works as expected.

COVID-19 tracing technologies remain controversial due to a mix of privacy concerns and implementation limitations. Potential solutions such as the Apple-Google smartphone-based exposure notification system still have yet to be widely deployed in the United States, and they lack access to personal vital sign tracking, relying instead on self-reporting of positive COVID-19 diagnoses. Alternatives such as BioButton could provide earlier and more detailed warnings of possible illness, but it’s unclear whether most schools will rely upon wearable sensors or simply keep their physical classrooms closed in favor of safe remote instruction.
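The contrast drawn above can be made concrete: in the exposure-notification model, phones exchange rotating random tokens, and matching happens locally only after a user self-reports a diagnosis; there is no vital-sign channel at all. A greatly simplified sketch, with all key derivation and Bluetooth details omitted:

```python
# Greatly simplified sketch of the exposure-notification matching step
# described above. Real systems derive rotating Bluetooth identifiers
# from cryptographic daily keys; here tokens are plain random values.

import secrets

def rolling_tokens(n):
    """Stand-in for the rotating identifiers a phone broadcasts."""
    return [secrets.token_hex(16) for _ in range(n)]

alice_tokens = rolling_tokens(5)    # Alice's phone broadcasts these
bob_heard = set(alice_tokens[2:4])  # Bob's phone overheard two of them

# Nothing leaves Bob's phone. Only if Alice self-reports a positive
# test are her tokens published; Bob's phone then checks for overlap
# locally. Note the reliance on self-reporting described above.
published = set(alice_tokens)
exposed = bool(published & bob_heard)
print("Bob possibly exposed:", exposed)
```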

As of press time, the petition has over 2,200 signatures, which represents over 10% of the university’s total student population of roughly 20,000. Though some of the signatures have come from former Oakland students and people who appear to be unaffiliated with the university, the petition is likely to gather further steam as more students, staff, and members of the general public become aware of the school’s new policies."...

 

For full story, please visit:

https://venturebeat.com/2020/08/03/university-mandates-wearable-covid-19-tracker-sparking-student-protest/


Open Letter Regarding the Usage of Proctorio // University of British Columbia (UBC) July, 2020


July 3rd, 2020

"Dear President Santa Ono, Provost Andrew Szeri, Dr. Simon Bates, and Deans of UBC,

In moving to remote instruction, we recognize how the COVID-19 pandemic has created challenges surrounding the rising incidents of academic misconduct, and we commend the efforts of faculty and teaching staff to produce a quality online teaching experience. We thank all faculty and teaching staff for their unrelenting efforts to provide academic continuity under incredibly difficult circumstances.

With respect to exam invigilation, however, it has become increasingly clear that usage of Proctorio negatively impacts students’ academic performance, and students have repeatedly expressed that they are not comfortable utilizing this software. While the sudden onset of the pandemic has left instructors with few options but to use Proctorio as a test proctoring software, its continued usage is not suitable for Summer Term 2 and Fall 2020 on the grounds of unethical corporate practices, recurring technical implementation difficulties, and intrinsically discriminatory programming.

In response to a UBC student claiming that Proctorio had failed to provide support when encountering an issue with a UBC online proctored exam, Proctorio CEO Mike Olsen posted excerpts of a support chat log, generating concerns around a privacy breach. It is evident that this is not an isolated incident – this is one of many incidents of Proctorio’s poor support response and points to Proctorio’s disregard for student privacy and protection of student data. Students report not being able to access their instructors for test-related questions, and being denied access to the exam due to connectivity issues.

Additionally, Proctorio and other algorithmic test proctoring software raise concerns about discrimination against ​students based on their bodies, external surroundings, and behaviours. Algorithmic test proctoring software has been demonstrated to discriminate against people of colour, students with accessibility needs and medical conditions, trans students, students with connectivity difficulties, and students with children by flagging “abnormal” behaviours and denying access to certain groups of students.

As a result, Proctorio does not reinforce academic integrity, but instead reinforces a discriminatory exclusion and surveillance culture that is detrimental to student learning and test-taking ability.

 

In light of UBC’s commitments to equity, diversity, and inclusion and the Inclusion Action Plan’s​ Goal 4.B of “implement[ing] inclusive course design, teaching practice, and assessments,” UBC should not be subscribing to a pedagogy of punishment by investing in discriminatory surveillance practices. No student should have their grade put at risk due to biased data algorithms and technical difficulties.

While we understand that Proctorio may not have violated the letter of the law, we contend that the Proctorio CEO’s treatment of a UBC student breached the spirit of the law, as well as norms surrounding privacy. The Proctorio CEO’s actions further illustrate the wider concerns that students are deeply unsettled by Proctorio’s surveillance and have their academic futures put at risk by technical issues. In response to student concerns, Dutch universities are currently organizing an external technical audit of Proctorio. UBC should follow suit or participate as an observer in the audit process in order to mitigate harm to students.

Organizations such as the Algorithmic Justice League currently conduct algorithmic audits, within a human rights framework.

 

Past, present, and future – regardless of a pandemic – students deserve to have fair assessments conducted in good faith by instructors who treat them with a high degree of trust, respect, and dignity. There are many ways to build academic integrity and values of honesty within assessments that do not require Proctorio’s unnecessarily invasive surveillance. We recognize and applaud the guidance given in the Guiding Principles for Fall 2020 Adaptations on building academic integrity into the course beyond Proctorio and Turnitin.

However, further action must be taken in light of serious student concerns.

The University of British Columbia consistently ranks in the top 50 institutions for higher education, and employs many talented and creative faculty and staff members who are highly capable of designing alternative methods of invigilation. Examples from this past term include breakout rooms on Zoom, examination styles emphasizing applied learning outcomes, as well as invigilation provided by the Centre for Accessibility, to name a few.

We call upon the Senate, the University Administration, and the Deans to implement the following recommendations:

  1. UBC must end its relationship with Proctorio and other invasive, algorithmic remote test proctoring software. As Proctorio has shown a disregard for student privacy by releasing student support logs, we call upon UBC to end its contract with Proctorio as the extent to which Proctorio is willing to infringe upon FIPPA is uncertain;

  2. Support and provide resources to faculty in identifying and incorporating alternate final assessment methods in order to avoid Proctorio and other algorithmic exam software;

  3. If choosing to utilize remote proctoring software, UBC instructors ​must ​provide low-barrier options to opt out of using remote proctoring software and offer alternate forms of assessment;

  4. Incorporate stronger language against the use of Proctorio and remote proctoring software, as well as explanations about concerns regarding Proctorio, in the Guiding Principles for Fall 2020 Adaptations;

  5. If choosing to utilize remote proctoring software, UBC instructors must provide a clear rationale for their usage of the software (i.e. invigilation required for professional accreditation programs) and demonstrate their understanding of how it will affect students in a statement of academic integrity expectations within the course syllabi (p15, Guiding Principles for Fall 2020 Adaptations); and

  6. If continuing with the usage of Proctorio, UBC must conduct an external technical audit of Proctorio’s privacy mechanisms in order to mitigate harm to students.

We firmly oppose the use of Proctorio in subsequent academic terms and hope to hold UBC accountable to an ethical and compassionate approach to assessment and education. We hope the University will take a proactive approach by considering our calls to action, and we look forward to hearing from you regarding the implementation of these recommendations."

For full letter and list of signatories, please visit: 

https://www.ams.ubc.ca/news/open-letter-regarding-the-usage-of-proctorio/ 


Students fear for their data privacy after University of California invests in private equity firm // Salon


"A financial connection between a virtual classroom platform, the University of California system and a private equity firm is raising eyebrows"

 

By Matthew Rozsa

"College students within California's premier research university system are wondering if their privacy is safe after learning that the University of California (UC) made a commitment to invest $200 million in an investment firm that has access to vast troves of student data through a subsidiary.

The private equity investment firm, Thoma Bravo, announced in December that it was purchasing the educational software firm Instructure Inc. in an all-cash deal for roughly $2 billion, an acquisition that was completed in March. Instructure Inc. owns Canvas, a popular virtual classroom platform. Given the platform's popularity within higher education institutions, Instructure has data on grades, lectures, tests, papers and more from tens of thousands of students.

 

Within a few weeks of the news announcement of Thoma Bravo's acquisition, more than 50 people who work at colleges signed a public letter urging Thoma Bravo to issue a legally-binding statement promising that it would not abuse its newfound access to student data. Their primary concern was how the data acquired through Instructure's learning management software Canvas would be used by the company.

"We request Instructure make clear statements be made as to how they intend to legally and ethically protect current student data, future student data, and access to both under the new ownership," the letter explained. "While the Chief Legal Officer Matt Kaminer has expressed Instructure's commitment to, and taken major steps towards ensuring ethical handling of student data, there are no guarantees that prevent the private equity firm from using student data in ways we didn't intend."

Instructure has made it clear through their own language that they view the student data they aggregated as one of their chief assets, although they have also insisted that they do not use that data improperly.

 

 

Yet an article published in the Virginia Journal of Law and Technology, titled "Transparency and the Marketplace for Student Data," pointed out that there is "an overall lack of transparency in the student information commercial marketplace and an absence of law to protect student information." As such, some students at the University of California are concerned that — despite reassurances to the contrary — their institution's new financial relationship with Thoma Bravo will mean their personal data can be sold or otherwise misused.

 

"It appears that the UC may be invested — however indirectly — in the monetization of data collected from their own students," Mustafa Hussain, a PhD candidate in Informatics at the University of California, Irvine, told Salon in a statement. He also said that he was unclear as to whether the college is "aware of Thoma Bravo's acquisition of Instructure, whether they have seen it as cause for concern, how they have responded, and whether they are invested in the student data market in other ways. There's a lack of transparency here, from the UC, Thoma Bravo, and Instructure, and I feel it's cause for concern."

 

Hussain's views were echoed in a statement by graduate student workers Samiha Khalil and Jack Davies.

 

"As graduate student workers responsible for delivering 50% of all instructional hours in the University of California, we are very concerned by the persistent request that we use Canvas for our online instruction and to record our lectures," Khalil and Davies explained.

After describing how the University of California failed to communicate their rights as students and teaching assistants and how the need for virtual learning has increased during the coronavirus pandemic, they added several other accusations. They claimed that "the UC weaponized Canvas to facilitate its repression of grad workers at UCSC [University of California, Santa Cruz] fighting for a cost of living adjustment, installing a tattle-bot to have undergraduates snitch on striking workers. The UC also hired IT experts to retrieve data on grade deletion from Canvas in order to punish strikers through the student code of conduct, since deleting grades is not a violation of our contract as workers."

 

They concluded, "As the EdTech [educational technology] industry expands with the demand for online instruction, workers need to organize to resist universities, like the UC, as they seek to take advantage of this crisis to deepen worker exploitation and surveillance."

The students' concerns over surveillance and privacy are not unwarranted. Previously, the University of California used military surveillance technology to help quell the grad student strikes at UC Santa Cruz and other campuses, as Salon previously reported.

 

Salon reached out to the University of California, Thoma Bravo and Instructure about the alleged $200 million investment commitment and the accusations made by Khalil and Davies. The University of California issued a statement denying that the Canvas platform was favored and saying it was not aware of any misuse of student data.

 

"While some campuses utilize the Canvas learning management system, it has neither been used in the manner you describe nor has the University of California hired IT experts as you suggest," they wrote. Thoma Bravo and Instructure declined to comment for this story, although Instructure said that it was committed to protecting student privacy and data and referred to specific blog posts.

Private equity firms like Thoma Bravo are an innately controversial type of investment fund, which some thinkers like Robert Reich contend should be heavily regulated. Private equity firms generally purchase and restructure companies (often by laying off workers) in order to milk them for funds, often gutting them in the process and selling off their stripped assets later if or when they go bankrupt. Private equity firms are also infamous for inflating fees and expenses charged to companies in which they hold stakes. This means that private equities can ravage pension plans that have been invested in them or reap billions from investors through their fee arrangements. Because of the lack of oversight of private equity, the FBI has expressed concern that such firms could easily be used to launder money.

This type of investment fund has made the news in various unflattering ways. When Mitt Romney was the Republican presidential nominee in 2012, his work at the private investment firm Bain Capital came under scrutiny, specifically Romney's practice of charging companies millions in fees before loading them up with debt. Because the companies in Bain's portfolio were often advised to fire large numbers of workers to pay off those debts, Romney effectively created situations where their businesses needed to fire workers specifically because of the debt that he created for them. This method is known as a "leveraged buyout": Private equities find businesses with good cash flows that are experiencing problems, take over the company (either voluntarily or through a "hostile" method) with a small amount of their own money and the rest covered by bank loans, and then charge large management fees for advice on who to fire in order to pay off the loans. The majority of leveraged buyouts are backed 60 to 90 percent with borrowed money.
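The leveraged-buyout arithmetic described above is simple enough to work through. The figures below are hypothetical, chosen only to illustrate the 60-to-90-percent debt financing and fee structure the article describes:

```python
# Worked example of the leveraged-buyout mechanics described above.
# All figures are hypothetical illustrations.

purchase_price = 1_000_000_000       # $1B target company
equity_fraction = 0.20               # the firm's own money, putting the
debt_fraction = 1 - equity_fraction  # deal inside the 60-90% debt range

firm_equity = purchase_price * equity_fraction  # $200M at risk
company_debt = purchase_price * debt_fraction   # $800M loaded onto the
                                                # acquired company

annual_mgmt_fee = 0.02 * purchase_price  # assumed 2% management fee
fees_over_5_years = 5 * annual_mgmt_fee  # $100M in fees

print(f"Firm's own money at risk:    ${firm_equity:,.0f}")
print(f"Debt carried by the company: ${company_debt:,.0f}")
print(f"Fees collected over 5 years: ${fees_over_5_years:,.0f}")
# The firm recoups half its equity in fees alone, while the company
# services the debt -- the dynamic behind the layoffs described above.
```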

 

Last year Sen. Elizabeth Warren, D-Mass., proposed legislation that would prohibit private equities from stripping the companies they take over of cash, real estate and other assets, as well as hold them responsible for the debts they acquire in order to purchase them. Companies like Toys R Us, Friendly's and HCR ManorCare all went bankrupt because of private equity.


"For far too long, Washington has looked the other way while private equity firms take over companies, load them with debt, strip them of their wealth, and walk away scot-free — leaving workers, consumers, and whole communities to pick up the pieces," Warren said in a statement at the time."

 

For original story on Salon, please visit:

https://www.salon.com/2020/07/28/students-fear-for-their-data-privacy-after-university-of-california-invests-in-private-equity-firm/


Algorithms, Agency, and Respect for Persons // Rubel, Castro, and Pham, 2020 // Social Theory and Practice 

Alan Rubel, Clinton Castro, Adam Pham 


"Algorithmic systems and predictive analytics play an increasingly important role in various aspects of modern life. Scholarship on the moral ramifications of such systems is in its early stages, and much of it focuses on bias and harm. This paper argues that in understanding the moral salience of algorithmic systems it is essential to understand the relation between algorithms, autonomy, and agency. We draw on several recent cases in criminal sentencing and K–12 teacher evaluation to outline four key ways in which issues of agency, autonomy, and respect for persons can conflict with algorithmic decision-making. Three of these involve failures to treat individual agents with the respect they deserve. The fourth involves distancing oneself from a morally suspect action by attributing one’s decision to take that action to an algorithm, thereby laundering one’s agency."

 

For the full pre-production copy, please click on the title above.


New privacy bill would give parents an ‘Eraser Button’ and ban ads targeting children // The Verge

COPPA already prohibits companies like Facebook and Google from collecting personal data and location information from anyone under the age of 13 without explicit parental consent, but the senators’ new bill amending the law would extend protections to children up to age 15. However, if approved, platforms would only be able to collect the data of children aged 13 to 15 with their own consent and not that of their parents.
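The consent rule the bill would create is straightforward to state as logic. A minimal sketch, assuming only the age bands described above (function and field names are illustrative, not drawn from the bill text):

```python
# Minimal sketch of the consent logic in the proposed COPPA amendment
# as described above: under 13 requires parental consent; ages 13-15
# require the minor's own consent. Names are illustrative, not drawn
# from the bill text.

def may_collect_data(age, has_parental_consent, has_own_consent):
    """Return True if a platform may collect this user's personal data."""
    if age < 13:
        return has_parental_consent  # current COPPA rule
    if age <= 15:
        return has_own_consent       # new band added by the bill
    return True                      # 16+: outside the bill's scope

assert may_collect_data(12, True, False) is True
assert may_collect_data(12, False, True) is False
assert may_collect_data(14, False, True) is True
assert may_collect_data(14, True, False) is False  # parental consent
                                                   # alone is not enough
```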

 

https://www.theverge.com/2019/3/12/18261181/eraser-button-bill-children-privacy-coppa-hawley-markey 


Under Fire: The Rise and Fall of Predictive Policing 


https://www.acgusa.org/wp-content/uploads/2020/03/2020_Predpol_Peteranderl_Kellen.pdf


Sousveillance Capitalism // Borradaile & Reeves 2020


Glencora Borradaile
Oregon State University

Joshua Reeves
Oregon State University

Abstract

"The striking commercial success of Shoshana Zuboff’s 2019 book, The Age of Surveillance Capitalism, provides us with an excellent opportunity to reflect on how the present convergence of surveillance/capitalism coincides with popular critical and theoretical themes in surveillance studies, particularly that of sousveillance. Accordingly, this piece will first analyze how surveillance capitalism has molded the political behaviors and imaginations of activists. After acknowledging the theoretically and politically fraught implications of fighting surveillance with even more surveillance—especially given the complexities of digital capitalism’s endless desire to produce data—we conclude by exploring some of the political possibilities that lie at the margins of sousveillance capitalism (in particular, the extra-epistemological political value of sousveillance)."

 

https://ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/13920 


Australian universities investigating 'deeply concerning' hack of controversial exam software // SBS News


Personal records of 444,000 ProctorU users have reportedly been obtained in a hack and leaked online in hacker forums.

 

By Essam Al-Ghalib

"Universities across Australia have launched investigations after a controversial online exam tool was hacked and the personal records of students were stolen and leaked on the internet. ProctorU is an online tool that allows students to be remotely supervised while taking exams at home during the coronavirus crisis.

The University of Queensland student union in late April raised concerns about personal data being gathered by the ProctorU software, as well as students being filmed at home.

 

Now, personal records of 444,000 ProctorU users have reportedly been obtained in a data breach and leaked online in hacker forums.

It is not known how many Australian students or university staff were impacted by the breach, but the University of Melbourne, the University of Sydney and Swinburne University have all confirmed they are now investigating.

ProctorU's privacy policy states that it "does not use any test-taker's personal information for any purpose other than for facilitating the proctoring of online exams".

 

"We never sell personal information to third parties," the policy reads.

 

However, ProctorU's privacy policy also acknowledges that data may be transferred to a third party in the "event of a bankruptcy, merger, acquisition, reorganization, or similar transaction".

A spokesperson for the University of Sydney said the hack was "deeply concerning".

 

The university met with ProctorU's chief executive and compliance officer on Thursday to discuss the "breach of confidential data relating to users of their service".

 

"We understand the data relates to people who were registered as users of ProctorU’s services on or before 2014. We don’t believe our current students are directly impacted by this breach as we began using ProctorU’s online proctoring services in 2020, in response to the COVID19 pandemic.

 

"Any breach of security and privacy of this type is of course deeply concerning and we will continue to work with ProctorU to understand the circumstances of the breach and determine whether any follow-up actions are required on our part."

 

A University of Melbourne spokesperson characterised the hack as a "cyber-security issue", while Swinburne University said it was investigating a "data breach from a third-party provider".

 

"At this stage, we understand that only a small number of Swinburne Online students have been impacted, and have commenced our own independent investigation," a Swinburne spokesperson said.

“Swinburne Online is proactively contacting the student community to inform them of the breach and advising them to update their security details. The safety, wellbeing and privacy of Swinburne students and staff is our priority and we will continue to inform our community of any updates to this situation."

 

Some 4,000 University of Queensland students signed a petition in late April calling on the university to bin the ProctorU software. 

 

The University of Queensland said it has not been affected by the hack. UNSW, which uses the software, said no UNSW student records were included in the hacked database. 

ProctorU has been contacted for comment."

 

For full post, please visit:

https://www.sbs.com.au/news/australian-universities-investigating-deeply-concerning-hack-of-controversial-exam-software


The case of Canvas: Longitudinal datafication through learning management systems // Marachi & Quill, 2020 Teaching in Higher Education: Critical Perspectives  

Abstract
The Canvas Learning Management System (LMS) is used in thousands of universities across the United States and internationally, with a strong and growing presence in K-12 and higher education markets. Analyzing the development of the Canvas LMS, we examine 1) ‘frictionless’ data transitions that bridge K12, higher education, and workforce data, 2) integration of third party applications and interoperability or data-sharing across platforms, 3) privacy and security vulnerabilities, and 4) predictive analytics and dataveillance. We conclude that institutions of higher education are currently ill-equipped to protect students and faculty required to use the Canvas Instructure LMS from data harvesting or exploitation. We challenge inevitability narratives and call for greater public awareness concerning the use of predictive analytics, impacts of algorithmic bias, need for algorithmic transparency, and enactment of ethical and legal protections for users who are required to use such software platforms.

KEYWORDS: Data ethics, data privacy, predictive analytics, higher education, dataveillance

 https://doi.org/10.1080/13562517.2020.1739641
 
Author email contact: roxana.marachi@sjsu.edu

Researchers Raise Concerns About Algorithmic Bias in Online Course Tools // EdSurge 


By Jeffrey Young
"Awareness of the dangers of algorithmic bias in AI systems is growing. Earlier this year, a 42-year-old Detroit resident was wrongly arrested after a face-recognition system falsely matched his photo with that of an image from security camera footage. Such systems have been shown to give more false matches on photos of Black people than white peers.

Some scholars worry that AI in learning management systems used by colleges could lead to misidentifications in academic settings, by doing things like falsely tagging certain students as low-performing, which could lead their professors to treat them differently or otherwise disadvantage them.

For instance, the popular LMS Canvas had a feature that red-flagged students who turned in late work, suggesting on a dashboard shown to professors that such students were less likely to do well in the class, says Roxana Marachi, an associate professor of education at San Jose State University. Yet she imagines scenarios in which students could be misidentified, such as when students turn in assignments on time but in alternative ways (like on paper rather than in digital form), leading to false flags.

“Students are not aware that they are being flagged in these ways that their professors see,” she says.

Colleges insist that scholars be incredibly careful with data and research subjects in the research part of their jobs, but not with the tools they use for teaching. “That’s basic research ethics—inform the students about the way their data is being used,” she notes.

While that particular red flag feature is no longer used by Canvas, Marachi says she worries that colleges and companies are experimenting with learning analytics in ways that are not transparent and could be prone to algorithmic bias.
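To make the concern concrete, below is a minimal, hypothetical sketch in Python of how a naive late-work flag of the kind described above could misfire. The data model, student names, and flagging rule are all invented for illustration; they do not represent Instructure's actual code.

from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Submission:
    student: str
    due_day: int             # due date, as a day number
    lms_day: Optional[int]   # day the LMS recorded it (None = never seen online)
    on_paper: bool           # handed in physically, outside the LMS

def flag_at_risk(submissions: List[Submission]) -> Dict[str, int]:
    """Count 'late' marks per student under a naive LMS-only rule.

    The rule never consults on_paper, so a paper submission handed in on
    time still registers as missing, which is exactly the false-flag
    scenario Marachi describes above.
    """
    late_counts: Dict[str, int] = {}
    for s in submissions:
        looks_late = s.lms_day is None or s.lms_day > s.due_day
        late_counts[s.student] = late_counts.get(s.student, 0) + int(looks_late)
    return {name: n for name, n in late_counts.items() if n > 0}

records = [
    Submission("amira", due_day=10, lms_day=9, on_paper=False),  # on time, digital
    Submission("ben", due_day=10, lms_day=None, on_paper=True),  # on time, on paper
]
print(flag_at_risk(records))  # {'ben': 1}: ben is falsely flagged

A dashboard built on a signal like this would surface "ben" to the professor as at risk even though nothing was late, without the student ever knowing the flag existed.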
 

In an academic paper published recently in the journal Teaching in Higher Education: Critical Perspectives, she and a colleague call for “greater public awareness concerning the use of predictive analytics, impacts of algorithmic bias, need for algorithmic transparency, and enactment of ethical and legal protections for users who are required to use such software platforms.” The article was part of a special issue devoted to the “datafication of teaching in higher education.”

 

At a time when colleges and universities say they are renewing their commitment to fighting racism, data justice should be front and center, according to Marachi. “The systems we are putting into place are laying the tracks for institutional racism 2.0 unless we address it—and unless we put guardrails or undo the harms that are pending,” she adds.

Leaders of the LMS Canvas, which is produced by the company Instructure, insist they take data privacy seriously, and that they are working to make their policies clearer to students and professors.

Just three weeks ago the company hired a privacy attorney, Daisy Bennett, to assist in that work. She plans to write a plain-language version of the company’s user privacy policy and build a public portal explaining how data is used. And the company has convened a privacy council, made up of professors and students, that meets every two to three months to give advice on data practices. “We do our best to engage our end users and customers,” said Jared Stein, vice president of higher education strategy at Instructure, in an interview with EdSurge.

He stressed that Marachi’s article does not point to specific instances of student harm from data, and that the goals of learning analytics features are often to help students succeed. “Should we take those fears of what could go wrong and completely cast aside the potential to improve the teaching and learning experience?” he asked. “Or should we experiment and move forward?”

Marachi’s article raises concerns about a statement made at an Instructure earnings call by then-CEO Dan Goldsmith regarding a new feature:

“Our DIG initiative, it is first and foremost a platform for [Machine Learning] and [Artificial Intelligence], and we will deliver and monetize it by offering different functional domains of predictive algorithms and insights. Maybe things like student success, retention, coaching and advising, career pathing, as well as a number of the other metrics that will help improve the value of an institution or connectivity across institutions.”

Other scholars have focused on the comment as well, noting that the goals of companies sometimes prioritize monetizing features over helping students.

Stein, of Instructure, said that Goldsmith was “speaking about what was possible with data and not necessarily reflecting what we were actually building—he probably just overstated what we have as a vision for use of data.” He said he outlined the plans and strategy for the DIG initiative in a blog post, which points to its commitment to “ethical use of learning analytics.”

As to the concern about LMS and other tools leading to institutional racism? “Should we have guardrails? Absolutely.”

Competing Narratives

Marachi said she has talked with Instructure staff about her concerns, and that she appreciates their willingness to listen. But the argument she and other scholars are making is a critique of whether learning analytics is worth doing at all.

In an introductory article to the journal series on the datafication of college teaching, Ben Williamson and Sian Bayne from the University of Edinburgh, and Suellen Shay from the University of Cape Town, lay out a broad list of concerns about the prospect of using big data in teaching.

“The fact that some aspects of learning are easier to measure than others might result in simplistic, surface level elements taking on a more prominent role in determining what counts as success,” they write. “As a result, higher order, extended, and creative thinking may be undermined by processes that favor formulaic adherence to static rubrics.”


They place datafication in the context of what they see as a commercialization of higher education—as a way to fill gaps caused by policy decisions that have reduced public funding of college.

“There is a clear risk here that pedagogy may be reshaped to ensure it ‘fits’ on the digital platforms that are required to generate the data demanded to assess students’ ongoing learning,” they argue. “Moreover, as students are made visible and classified in terms of quantitative categories, it may change how teachers view them, and how students understand themselves as learners.”

And the mass movement to online teaching due to the COVID-19 pandemic makes their concerns “all the more urgent,” they add.

 

The introduction ends with a call to rethink higher education more broadly as colleges look at data privacy issues. They cite a book by Raewyn Connell, “The Good University: What Universities Do and Why It Is Time for Radical Change,” which they say “outlined a vision for a ‘good university’ in which the forces of corporate culture, academic capitalism and performative managerialism are rejected in favour of democratic, engaged, creative, and sustainable practices.”

 

Their hope is that higher education will be treated as a social and public good rather than a product."

 

Jeffrey R. Young (@jryoung) is the higher education editor at EdSurge and the producer and co-host of the EdSurge Podcast. He can be reached at jeff [at] edsurge [dot] com.

 

For original post, please visit: 

https://www.edsurge.com/news/2020-06-26-researchers-raise-concerns-about-algorithmic-bias-in-online-course-tools

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

Digital Sousveillance: A Network Analysis of the U.S. Surveillant Assemblage // Burke, 2020

Abstract

"This paper introduces a new methodological approach to the study of surveillance that I call digital sousveillance— the co-optation of digital data and the use of computational methods and techniques to resituate technologies of control and surveillance of individuals to instead observe the organizational observer. To illustrate the potential of this method, I employ quantitative network analytic methods to trace the changes in and development of the vast network of public and private organizations involved in surveillance operations in the United States—what I term the “US surveillant assemblage”—from the 1970s to the 2000s. The results of the network analyses suggest that the US surveillant assemblage is becoming increasingly privatized and that the line between “public” and “private” is becoming blurred as private organizations are, at an increasing rate, partnering with the US government to engage in mass surveillance."

 

To view/read, click on link below. 

 

Burke, Colin. 2020. Digital Sousveillance: A Network Analysis of the US Surveillant Assemblage. Surveillance & Society 18(1): 74-89. https://ojs.library.queensu.ca/index.php/surveillance-and-society/index  | ISSN: 1477-7487. © The author(s), 2020 | Licensed to the Surveillance Studies Network under a Creative Commons Attribution Non-Commercial No Derivatives license

No comment yet.
Rescooped by Roxana Marachi, PhD from Social & Emotional Learning and Critical Perspectives on SEL Related Initiatives
Scoop.it!

Psychodata: disassembling the psychological, economic, and statistical infrastructure of 'social-emotional learning' // Williamson, 2019 // Journal of Education Policy 

To download, click on title or arrow above. 

 

A blogpost summary of the article is also available with download link at: 

https://codeactsineducation.wordpress.com/2019/10/07/psychodata/ 

 

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

USENIX Security Conference 2019 // 50 Ways to Leak Your Data: An Exploration of Apps

50 Ways to Leak Your Data: An Exploration of Apps' Circumvention of the Android Permissions System

Joel Reardon, University of Calgary / AppCensus Inc.
Distinguished Paper Award Winner

"Modern smartphone platforms implement permission-based models to protect access to sensitive data and system resources. However, apps can circumvent the permission model and gain access to protected data without user consent by using both covert and side channels. Side channels present in the implementation of the permission system allow apps to access protected data and system resources without permission; whereas covert channels enable communication between two colluding apps so that one app can share its permission-protected data with another app lacking those permissions. Both pose threats to user privacy.

In this work, we make use of our infrastructure that runs hundreds of thousands of apps in an instrumented environment. This testing environment includes mechanisms to monitor apps' runtime behaviour and network traffic. We look for evidence of side and covert channels being used in practice by searching for sensitive data being sent over the network for which the sending app did not have permissions to access it. We then reverse engineer the apps and third-party libraries responsible for this behaviour to determine how the unauthorized access occurred. We also use software fingerprinting methods to measure the static prevalence of the technique that we discover among other apps in our corpus.

Using this testing environment and method, we uncovered a number of side and covert channels in active use by hundreds of popular apps and third-party SDKs to obtain unauthorized access to both unique identifiers as well as geolocation data. We have responsibly disclosed our findings to Google and have received a bug bounty for our work."
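As a conceptual illustration of the covert-channel idea in this abstract, the Python sketch below simulates two colluding "apps" sharing a permission-protected identifier through a file both can read. The file location and identifier are invented, and the real channels the paper reverse-engineered are Android-specific and considerably more subtle.

import json
import tempfile
from pathlib import Path

# A world-readable location standing in for shared storage on a device.
SHARED = Path(tempfile.gettempdir()) / "shared_sdk_cache.json"

def app_with_permission() -> None:
    """App A holds the permission, reads the identifier, and caches it."""
    device_id = "358240051111110"  # stand-in for a permission-gated IMEI read
    SHARED.write_text(json.dumps({"device_id": device_id}))

def app_without_permission() -> str:
    """App B never requests the permission, but reads A's cache anyway."""
    return json.loads(SHARED.read_text())["device_id"]

app_with_permission()
print("App B obtained:", app_without_permission())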


View the full USENIX Security '19 conference program at https://www.usenix.org/conference/usenixsecurity19/technical-sessions

 

For link to video, please visit: 

https://www.youtube.com/watch?v=twf-sgWp5bs

 

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

Frequently Asked Questions About Online Charter Schools // In The Public Interest

https://www.inthepublicinterest.org/wp-content/uploads/ITPI_OnlineChartersQA_July2020.pdf

 

For more with critical perspectives on privatization in education, please see: http://bit.ly/chart_look 

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

‘Not the Real Me’: Social Imaginaries of Personal Data Profiling // Lupton, 2020

 

"Abstract
In this article, I present findings from my Data Personas study, in which I invited Australian adults to respond to the stimulus of the ‘data persona’ to help them consider personal data profiling and related algorithmic processing of personal digitised information. The literature on social imaginaries is brought together with vital materialism theory, with a focus on identifying the affective forces, relational connections and agential capacities in participants’ imaginaries and experiences concerning data profiling and related practices now and into the future. The participants were aware of how their personal data were generated from their online engagements, and that commercial and government agencies used these data.

However, most people suggested that data profiling was only ever partial, configuring a superficial and static version of themselves. They noted that as people move through their life-course, their identities and bodies are subject to change: dynamic and emergent. While the digital data that are generated about humans are also lively, these data can never fully capture the full vibrancy, fluidity and spontaneity of human experience and behaviour. In these imaginaries, therefore, data personas are figured as simultaneously less-than-human and more-than-human. The implications for understanding and theorising human-personal data relations are discussed."

 

https://journals.sagepub.com/doi/abs/10.1177/1749975520939779 

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

Data Justice and COVID-19: Global Perspectives (Taylor, Sharma, Martin, & Jameson, 2020) 

"About the book:
In early 2020, as the COVID-19 pandemic swept the world and states of emergency were declared by one country after another, the global technology sector—already equipped with unprecedented wealth, power, and influence—mobilised to seize the opportunity. This collection is an account of what happened next and captures the emergent conflicts and responses around the world. The essays provide a global perspective on the implications of these developments for justice: they make it possible to compare how the intersection of state and corporate power—and the way that power is targeted and exercised—confronts, and invites resistance from, civil society in countries worldwide.

This edited volume captures the technological response to the pandemic in 33 countries, accompanied by nine thematic reflections, and reflects the unfolding of the first wave of the pandemic.


This book can be read as a guide to the landscape of technologies deployed during the pandemic and also be used to compare individual country strategies. It will prove useful as a tool for teaching and learning in various disciplines and as a reference point for activists and analysts interested in issues of data justice.

The essays interrogate these technologies and the political, legal, and regulatory structures that determine how they are applied. In doing so, the book exposes the workings of state technological power to critical assessment and contestation."

 

We hope that you will support the project by buying a paperback copy or donating for the pdf version on our shop. However, you may also read the full book here on Issuu, or download a pdf version (8mb, compressed) via Internet Archive.

 

Download a one-page .pdf with the book details here.

https://meatspacepress.com/#mc_signup

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

Mental Health Apps Rarely Evidence-Based


https://www.madinamerica.com/2020/07/mental-health-apps-rarely-evidence-based/

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

Technology Race Ethics and Equity in Education (TREE) Lab // Northwestern University


The TREE Lab: Examining Technology, Race, Equity, and Ethics in Education  

 

"The TREE (Technology, Race, Ethics, and Equity in Education) Lab within the School of Education and Social Policy is an NSF-funded initiative that brings together NU undergraduate students with youth and community members to jointly investigate ethical, social, and racialized dimensions of new technologies. Co-directed by SESP Assistant Professor Sepehr Vakil and McCormick Computer Science Assistant Professor of Instruction Sarah Van Wart, the TREE lab is fundamentally committed to reimagining the possibilities of technology learning. We design tools and environments that facilitate engagement with complex technologies in ways that make visible their sociopolitical and ethical dimensions and implications. We draw on a range of conceptual and methodological approaches including critical theory, learning sciences, HCI, sociocultural theory, and participatory design."

 

https://tree.northwestern.edu/

No comment yet.
Scooped by Roxana Marachi, PhD
Scoop.it!

50 Ways to Leak Your Data: An Exploration of Apps' Circumvention of the Android Permissions System // USENIX


Distinguished Paper Award Winner

Abstract

 

"Modern smartphone platforms implement permission-based models to protect access to sensitive data and system resources. However, apps can circumvent the permission model and gain access to protected data without user consent by using both covert and side channels. Side channels present in the implementation of the permission system allow apps to access protected data and system resources without permission; whereas covert channels enable communication between two colluding apps so that one app can share its permission-protected data with another app lacking those permissions. Both pose threats to user privacy.


In this work, we make use of our infrastructure that runs hundreds of thousands of apps in an instrumented environment. This testing environment includes mechanisms to monitor apps' runtime behaviour and network traffic. We look for evidence of side and covert channels being used in practice by searching for sensitive data being sent over the network for which the sending app did not have permissions to access it. We then reverse engineer the apps and third-party libraries responsible for this behaviour to determine how the unauthorized access occurred. We also use software fingerprinting methods to measure the static prevalence of the technique that we discover among other apps in our corpus.

Using this testing environment and method, we uncovered a number of side and covert channels in active use by hundreds of popular apps and third-party SDKs to obtain unauthorized access to both unique identifiers as well as geolocation data. We have responsibly disclosed our findings to Google and have received a bug bounty for our work."
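The paper also distinguishes side channels, where no colluding partner is needed because an ordinary OS interface leaks the protected data; one class of side channel reported in this line of work involves reading network information to recover MAC addresses without the corresponding permission. The Python sketch below, assuming a Linux host where /proc/net/arp is readable, parses the ARP cache for IP-to-MAC mappings. It is a desktop illustration of the idea, not the paper's Android instrumentation.

from pathlib import Path

def read_arp_cache(path: str = "/proc/net/arp") -> dict:
    """Parse IP -> MAC pairs from the kernel's ARP table, if readable."""
    arp = Path(path)
    if not arp.exists():
        return {}  # non-Linux host, or table not exposed
    table = {}
    for row in arp.read_text().splitlines()[1:]:  # skip the header row
        fields = row.split()
        if len(fields) >= 4:
            ip_address, mac_address = fields[0], fields[3]
            table[ip_address] = mac_address
    return table

# MAC addresses of nearby devices can serve as location surrogates,
# which is why the permission-gated APIs exist in the first place.
print(read_arp_cache())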

 

Authors: 

Joel Reardon, University of Calgary / AppCensus Inc.; Álvaro Feal, IMDEA Networks Institute / Universidad Carlos III Madrid; Primal Wijesekera, U.C. Berkeley / ICSI; Amit Elazari Bar On, U.C. Berkeley; Narseo Vallina-Rodriguez, IMDEA Networks Institute / ICSI / AppCensus Inc.; Serge Egelman, U.C. Berkeley / ICSI / AppCensus Inc.

 

https://www.usenix.org/conference/usenixsecurity19/presentation/reardon 

 

For video of presentation, please visit: 
https://www.youtube.com/watch?v=twf-sgWp5bs&feature=emb_title 

No comment yet.