Why the "two sides" narrative doesn't work.
The Internet is an infinite world of information and linkages; it fuels the economy, boosts world culture and promotes democracy. But it is also the nest of digital assassins who lie in wait unnoticed, biding their time to hurl verbal, visual and technological bombs that damage reputations — and who recruit others via social media to reach their goals more quickly. That's the hideous reality of online life, as described by Richard Torrenzano and Mark Davis in their book Digital Assassination: Protecting Your Reputation, Brand or Business Against Online Attacks. The book paints an accurate picture — one that brands, businesspeople, public figures and celebrities alike must take seriously if they want to thrive in today's digital age, as "this power of the new digital assassin to destroy is as powerful as YouTube but as old as civilization," the authors declare. Via jean lievens
One of the hottest developments in education technology is schools banning technology. Via EDTECH@UTRGV
EDTECH@UTRGV's curator insight,
November 2, 2022 2:36 PM
What do you think? Should we ban cell phones in school?
Cellphone bans in schools might not last much longer if teachers continue to find creative ways to utilize electronic devices in class. Via Dr. Tom D'Amico (@TDOttawa)
Laws and programs designed to benefit vulnerable groups, such as the disabled or people of color, often end up benefiting all of society. Via roula haj-ismail, Dennis Swender
The hiring environment for US tech job seekers — particularly those looking for entry-level and early career roles — has become more challenging, beyond the direct effects of the plunge in tech job postings since 2022. Postings for senior and manager-level tech jobs have dropped sharply from their earlier peak, but as of early 2025 were still closer to their pre-pandemic levels (down 19%) than standard and junior tech titles (down 34%). Experience requirements in tech job postings have also tightened: between the second quarters of 2022 and 2025, the share of tech postings looking for at least 5 years of experience rose from 37% to 42%, at the expense of postings looking for 2-4 years of experience. Tech job postings still accounted for 37% of the job applications started by tech workers in June 2025 (identified based on their Indeed profiles), slightly above the share three years earlier, highlighting how job seeker interest has held up despite the decline in opportunities.
Via Edumorfosis
Students are increasingly using AI tools to help with — and do — their homework. Here's how older online study services, students and professors are adapting.
Via Yashy Tohsaku
Two studies show that many college students are offloading higher-order thinking to AI, asking chatbots to do hard work for them. Via Ana Cristina Pratas
(Image is a Bryan Mathers sketch of my keynote.) As I mentioned in my last post, I gave a keynote at the Education After the Algorithm conference in Dublin last week. It was a thoughtful, engaging event; congrats to Eamon Costello and all involved. Via Ana Cristina Pratas
Deportation is the formal removal of foreign nationals from a country when they are found to be in violation of […]
Leader of UnidosUS decries the humanitarian and economic damage of the Trump administration's immigration raids. 'This is about our civil rights being destroyed.'
The U.S. Department of Education has developed a new smartphone app that it hopes will make the notoriously difficult Free Application for Federal Student Aid a little easier. Via EDTECH@UTRGV, Laura Gomez Trevino
Laura Gomez Trevino's curator insight,
May 26, 12:54 AM
The article states the U.S. Department of Education introduced the mobile FAFSA app, a much-needed step toward making financial aid more accessible, especially for low-income students who rely on smartphones for internet access. By doing so, the Department is helping break down barriers to college affordability. This app could transform the application process into something faster, simpler, and far more equitable.
Generally, the first step in applying for financial aid is completing the Free Application for Federal Student Aid (FAFSA). The schools you listed on the FAFSA will take that information and use it to calculate the financial aid you're eligible for. Your financial aid awards may vary from school to school based on a number… Continue Reading. Via Teresa Herrin, Laura Gomez Trevino
Laura Gomez Trevino's curator insight,
May 25, 3:12 PM
The article, titled "5 things you did not know about your financial aid award," provided, I believe, a clear overview of the first steps in applying for financial aid, culminating in completing and submitting the FAFSA. It also highlights how schools use FAFSA data to determine eligibility and award packages; awards vary depending on each institution's policies, parents' and students' income, and other factors. It is also a helpful reminder that submitting the FAFSA early and to multiple schools can be key to maximizing financial aid opportunities for students.
Multiculturalism is a threat to our freedom, not a benign model for mutual respect. It is concerned with one culture, the West, and particularly with America, which it wants to alter dramatically. Constitutional republicanism as we know it can exist only through the active participation of one united people working within the confines of the nation-state. Our current experiment with multiculturalism is dangerous because it undermines the sharing of a common culture and a common language that creates the trust quotient necessary for our republic to succeed. The U.S. must end separatism and reembrace patriotic assimilation in order to protect its national identity and create real social solidarity. Via Charles Tiayon
To normalize their ideas, extremists attempt to take a position previously considered radical and make it palatable enough for the public to get behind. By gaining even a small foothold, they can expand the outer edge of extremism while simultaneously moving toward the center. Our intense polarization makes that possible. McAleer is the author of "The Cure For Hate – A Former White Supremacist's Journey From Violent Extremism To Radical Compassion." He co-founded Life After Hate and is a founding partner of the Builders Movement. When I was a white supremacist who had infiltrated the Canadian military reserves, an officer who had spent two tours of Northern Ireland embedded in a British unit told me that the Irish Republican Army had only 75 active personnel who pulled triggers and planted bombs. Behind those combatants were 3,500 people who offered them safe houses and storage for their ammunition. Bolstering them was a much broader community of people who endorsed their efforts. Ultimately, decades of sectarian violence were perpetrated by a small group of people on each side; but it was the broader public's support that gave extremists permission to carry out their carnage. Britain's recent riots, instigated by anti-immigration protesters in cities across England following the stabbing deaths of three young girls, illustrate this point clearly. A violent eruption only spreads like wildfire when an environment of public support enables it to escalate. In the days when I was driven by an extremist agenda, our movement recognized the need and opportunity to increase broad-based support among the North American middle. To normalize extremist ideas, we attempted to take a position previously considered radical and make it palatable enough for the public to get behind. If we could repackage a concept that only 1 percent of people supported in a way that 5 percent would accept, we could expand our outer edge of extremism while simultaneously moving where the center lies. We paid close attention to public discourse in the middle, searching for signals of our efforts taking hold. Thankfully, we failed; but the lesson remains: Language of intolerance and dehumanization in the center ultimately enables radical extremism at the outer edges.
The most extremist members of society, those bent on exclusion, hatred and suffering, are ready and waiting to seize upon our words to accomplish their destructive agendas. Almost universally, violent conflicts worldwide begin with slurs to denounce another group, painting them in a derogatory light. Through the gradual process of dehumanization through rhetoric, exclusion and microaggressions, each group frames "the enemy" as an existential threat to their value system, religion, way of life, privilege, culture and so on. Lazy language that defaults to stereotypes, generalizations and name-calling creates just enough fuel to light a fire in the outer fringes. With enough tacit support from the center, a spark can give way to an inferno with enough power to sustain itself. It is essential that the majority of people in the center maintain our values and humanity. As Friedrich Nietzsche said, "Beware that, when fighting monsters, you yourself do not become a monster." We cannot lend our voices to the cause of extremism, even if we are doing so unintentionally. How we choose to show up, particularly on divisive issues, recalibrates the norms. It sends a signal to those around us that we demand better from ourselves; that we will not stoop to carelessness, fear and judgment to comfort ourselves or win favor in challenging times. When we choose our words intentionally, we help guide others to do the same. With curiosity and courage, we can halt the slide. Over this past year, I have traveled extensively throughout the United States, screening the film "The Cure For Hate – Bearing Witness To Auschwitz," and implementing an accompanying curriculum that helps high school students explore the process of othering, dehumanization and polarization (then and now). We have gone from the bluest town in the bluest county in the bluest state — Brattleboro, Vt. — to the reddest town in the reddest county in the reddest state — Rigby, Idaho. On the surface, these places seem to be worlds apart; but, when I talk with the parents of our student participants, they all express similar concerns for their children. They long for their kids to grow up safe and healthy; they want them to have access to a promising future. They have different ideas on how to reach these goals, but they start from a common place. When we adopt a mindset of "us vs. them," we ignore this space where progress toward those shared goals can happen. When we break the pattern, that's when we all stand a chance. Via Charles Tiayon
Charles Tiayon's curator insight,
August 22, 2024 12:26 AM
"To normalize their ideas, extremists attempt to take a position previously considered radical and make it palatable enough for the public to get behind. By gaining even a small foothold, they can expand the outer edge of extremism while simultaneously moving toward the center. Our intense polarization makes that possible. McAleer is the author of "The Cure For Hate – A Former White Supremacist's Journey From Violent Extremism To Radical Compassion." He co-founded Life After Hate and is a founding partner of the Builders Movement. When I was a white supremacist who had infiltrated the Canadian military reserves, an officer who had spent two tours of Northern Ireland embedded in a British unit told me that the Irish Republican Army had only 75 active personnel who pulled triggers and planted bombs. Behind those combatants were 3,500 people who offered them safe houses and storage for their ammunition. Bolstering them was a much broader community of people who endorsed their efforts. Ultimately, decades of sectarian violence were perpetrated by a small group of people on each side; but it was the broader public's support that gave extremists permission to carry out their carnage. Britain’s recent riots, instigated by anti-immigration protesters in cities across England following the stabbing deaths of three young girls, illustrate this point clearly. A violent eruption only spreads like wildfire when an environment of public support enables it to escalate. In the days when I was driven by an extremist agenda, our movement recognized the need and opportunity to increase broad-based support among the North American middle. To normalize extremist ideas, we attempted to take a position previously considered radical and make it palatable enough for the public to get behind. If we could repackage a concept that only 1 percent of people supported in a way that 5 percent would accept, we could expand our outer edge of extremism while simultaneously moving where the center lies. We paid close attention to public discourse in the middle, searching for signals of our efforts taking hold. Thankfully, we failed; but the lesson remains: Language of intolerance and dehumanization in the center ultimately enables radical extremism at the outer edges.
The most extremist members of society, those bent on exclusion, hatred and suffering, are ready and waiting to seize upon our words to accomplish their destructive agendas. Almost universally, violent conflicts worldwide begin with slurs to denounce another group, painting them in a derogatory light. Through the gradual process of dehumanization through rhetoric, exclusion and microaggressions, each group frames "the enemy" as an existential threat to their value system, religion, way of life, privilege, culture and so on. Lazy language that defaults to stereotypes, generalizations and name-calling creates just enough fuel to light a fire in the outer fringes. With enough tacit support from the center, a spark can give way to an inferno with enough power to sustain itself. It is essential that the majority of people in the center maintain our values and humanity. As Friedrich Nietzsche said, "Beware that, when fighting monsters, you yourself do not become a monster." We cannot lend our voices to the cause of extremism, even if we are doing so unintentionally. How we choose to show up, particularly on divisive issues, recalibrates the norms. It sends a signal to those around us that we demand better from ourselves; that we will not stoop to carelessness, fear and judgment to comfort ourselves or win favor in challenging times. When we choose our words intentionally, we help guide others to do the same. With curiosity and courage, we can halt the slide. Over this past year, I have traveled extensively throughout the United States, screening the film “The Cure For Hate – Bearing Witness To Auschwitz,” and implementing an accompanying curriculum that helps high school students explore the process of othering, dehumanization and polarization (then and now). We have gone from the bluest town in the bluest county in the bluest state — Battleboro, Vt. — to the reddest town in the reddest county in the reddest state — Rigby, Idaho. On the surface, these places seem to be worlds apart; but, when I talk with the parents of our student participants, they all express similar concerns for their children. They long for their kids to grow up safe and healthy; they want them to have access to a promising future. They have different ideas on how to reach these goals, but they start from a common place. When we adopt a mindset of "us vs. them," we ignore this space where progress toward those shared goals can happen. When we break the pattern, that's when we all stand a chance." #metaglossia_mundus: https://thefulcrum.us/bridging-common-ground/extremist-language
Across the Middle East, journalists, activists and others have long accused Facebook of censoring their speech. By ISABEL DEBRE and FARES AKRAM - Associated Press. (The story is accompanied by a 12-image AP photo gallery of file photos from Lebanon, Jerusalem, the West Bank, Bangladesh, India, Morocco, Gaza, Germany and Bosnia.)
DUBAI, United Arab Emirates (AP) — As the Gaza war raged and tensions surged across the Middle East last May, Instagram briefly banned the hashtag #AlAqsa, a reference to the Al-Aqsa Mosque in Jerusalem's Old City, a flash point in the conflict. Facebook, which owns Instagram, later apologized, explaining its algorithms had mistaken the third-holiest site in Islam for the militant group Al-Aqsa Martyrs Brigade, an armed offshoot of the secular Fatah party. For many Arabic-speaking users, it was just the latest potent example of how the social media giant muzzles political speech in the region. Arabic is among the most common languages on Facebook's platforms, and the company issues frequent public apologies after similar botched content removals. Now, internal company documents from the former Facebook product manager-turned-whistleblower Frances Haugen show the problems are far more systemic than just a few innocent mistakes, and that Facebook has understood the depth of these failings for years while doing little about it. Such errors are not limited to Arabic. An examination of the files reveals that in some of the world's most volatile regions, terrorist content and hate speech proliferate because the company remains short on moderators who speak local languages and understand cultural contexts. And its platforms have failed to develop artificial-intelligence solutions that can catch harmful content in different languages. In countries like Afghanistan and Myanmar, these loopholes have allowed inflammatory language to flourish on the platform, while in Syria and the Palestinian territories, Facebook suppresses ordinary speech, imposing blanket bans on common words. "The root problem is that the platform was never built with the intention it would one day mediate the political speech of everyone in the world," said Eliza Campbell, director of the Middle East Institute's Cyber Program. "But for the amount of political importance and resources that Facebook has, moderation is a bafflingly under-resourced project." This story, along with others published Monday, is based on Haugen's disclosures to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were reviewed by a consortium of news organizations, including The Associated Press. In a statement to the AP, a Facebook spokesperson said that over the last two years the company has invested in recruiting more staff with local dialect and topic expertise to bolster its review capacity around the world. But when it comes to Arabic content moderation, the company said, "We still have more work to do. ... We conduct research to better understand this complexity and identify how we can improve." In Myanmar, where Facebook-based misinformation has been linked repeatedly to ethnic and religious violence, the company acknowledged in its internal reports that it had failed to stop the spread of hate speech targeting the minority Rohingya Muslim population. The Rohingya's persecution, which the U.S. has described as ethnic cleansing, led Facebook to publicly pledge in 2018 that it would recruit 100 native Myanmar language speakers to police its platforms.
But the company never disclosed how many content moderators it ultimately hired or revealed which of the nation's many dialects they covered. Despite Facebook’s public promises and many internal reports on the problems, the rights group Global Witness said the company’s recommendation algorithm continued to amplify army propaganda and other content that breaches the company’s Myanmar policies following a military coup in February. In India, the documents show Facebook employees debating last March whether it could clamp down on the “fear mongering, anti-Muslim narratives” that Prime Minister Narendra Modi’s far-right Hindu nationalist group, Rashtriya Swayamsevak Sangh, broadcasts on its platform. In one document, the company notes that users linked to Modi’s party had created multiple accounts to supercharge the spread of Islamophobic content. Much of this content was “never flagged or actioned,” the research found, because Facebook lacked moderators and automated filters with knowledge of Hindi and Bengali. Arabic poses particular challenges to Facebook’s automated systems and human moderators, each of which struggles to understand spoken dialects unique to each country and region, their vocabularies salted with different historical influences and cultural contexts. The Moroccan colloquial Arabic, for instance, includes French and Berber words, and is spoken with short vowels. Egyptian Arabic, on the other hand, includes some Turkish from the Ottoman conquest. Other dialects are closer to the “official” version found in the Quran. In some cases, these dialects are not mutually comprehensible, and there is no standard way of transcribing colloquial Arabic. Facebook first developed a massive following in the Middle East during the 2011 Arab Spring uprisings, and users credited the platform with providing a rare opportunity for free expression and a critical source of news in a region where autocratic governments exert tight controls over both. But in recent years, that reputation has changed. Scores of Palestinian journalists and activists have had their accounts deleted. Archives of the Syrian civil war have disappeared. And a vast vocabulary of everyday words have become off-limits to speakers of Arabic, Facebook’s third-most common language with millions of users worldwide. For Hassan Slaieh, a prominent journalist in the blockaded Gaza Strip, the first message felt like a punch to the gut. “Your account has been permanently disabled for violating Facebook’s Community Standards,” the company’s notification read. That was at the peak of the bloody 2014 Gaza war, following years of his news posts on violence between Israel and Hamas being flagged as content violations. Within moments, he lost everything he’d collected over six years: personal memories, stories of people’s lives in Gaza, photos of Israeli airstrikes pounding the enclave, not to mention 200,000 followers. The most recent Facebook takedown of his page last year came as less of a shock. It was the 17th time that he had to start from scratch. He had tried to be clever. Like many Palestinians, he'd learned to avoid the typical Arabic words for “martyr” and “prisoner,” along with references to Israel’s military occupation. If he mentioned militant groups, he’d add symbols or spaces between each letter. 
Other users in the region have taken an increasingly savvy approach to tricking Facebook’s algorithms, employing a centuries-old Arabic script that lacks the dots and marks that help readers differentiate between otherwise identical letters. The writing style, common before Arabic learning exploded with the spread of Islam, has circumvented hate speech censors on Facebook’s Instagram app, according to the internal documents. But Slaieh’s tactics didn’t make the cut. He believes Facebook banned him simply for doing his job. As a reporter in Gaza, he posts photos of Palestinian protesters wounded at the Israeli border, mothers weeping over their sons’ coffins, statements from the Gaza Strip’s militant Hamas rulers. Criticism, satire and even simple mentions of groups on the company’s Dangerous Individuals and Organizations list — a docket modeled on the U.S. government equivalent — are grounds for a takedown. “We were incorrectly enforcing counterterrorism content in Arabic," one document reads, noting the current system “limits users from participating in political speech, impeding their right to freedom of expression.” The Facebook blacklist includes Gaza’s ruling Hamas party, as well as Hezbollah, the militant group that holds seats in Lebanon’s Parliament, along with many other groups representing wide swaths of people and territory across the Middle East, the internal documents show, resulting in what Facebook employees describe in the documents as widespread perceptions of censorship. “If you posted about militant activity without clearly condemning what’s happening, we treated you like you supported it,” said Mai el-Mahdy, a former Facebook employee who worked on Arabic content moderation until 2017. In response to questions from the AP, Facebook said it consults independent experts to develop its moderation policies and goes “to great lengths to ensure they are agnostic to religion, region, political outlook or ideology.” “We know our systems are not perfect," it added. The company’s language gaps and biases have led to the widespread perception that its reviewers skew in favor of governments and against minority groups. Former Facebook employees also say that various governments exert pressure on the company, threatening regulation and fines. Israel, a lucrative source of advertising revenue for Facebook, is the only country in the Mideast where Facebook operates a national office. Its public policy director previously advised former right-wing Prime Minister Benjamin Netanyahu. Israeli security agencies and watchdogs monitor Facebook and bombard it with thousands of orders to take down Palestinian accounts and posts as they try to crack down on incitement. “They flood our system, completely overpowering it,” said Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa region, who left in 2017. “That forces the system to make mistakes in Israel’s favor. Nowhere else in the region had such a deep understanding of how Facebook works.” Facebook said in a statement that it fields takedown requests from governments no differently from those from rights organizations or community members, although it may restrict access to content based on local laws. “Any suggestion that we remove content solely under pressure from the Israeli government is completely inaccurate,” it said. 
Syrian journalists and activists reporting on the country’s opposition also have complained of censorship, with electronic armies supporting embattled President Bashar Assad aggressively flagging dissident content for removal. Raed, a former reporter at the Aleppo Media Center, a group of antigovernment activists and citizen journalists in Syria, said Facebook erased most of his documentation of Syrian government shelling on neighborhoods and hospitals, citing graphic content. “Facebook always tells us we break the rules, but no one tells us what the rules are,” he added, giving only his first name for fear of reprisals. In Afghanistan, many users literally cannot understand Facebook’s rules. According to an internal report in January, Facebook did not translate the site’s hate speech and misinformation pages into Dari and Pashto, the two most common languages in Afghanistan, where English is not widely understood. When Afghan users try to flag posts as hate speech, the drop-down menus appear only in English. So does the Community Standards page. The site also doesn’t have a bank of hate speech terms, slurs and code words in Afghanistan used to moderate Dari and Pashto content, as is typical elsewhere. Without this local word bank, Facebook can’t build the automated filters that catch the worst violations in the country. When it came to looking into the abuse of domestic workers in the Middle East, internal Facebook documents acknowledged that engineers primarily focused on posts and messages written in English. The flagged-words list did not include Tagalog, the major language of the Philippines, where many of the region’s housemaids and other domestic workers come from. In much of the Arab world, the opposite is true — the company over-relies on artificial-intelligence filters that make mistakes, leading to “a lot of false positives and a media backlash,” one document reads. Largely unskilled human moderators, in over their heads, tend to passively field takedown requests instead of screening proactively. Sophie Zhang, a former Facebook employee-turned-whistleblower who worked at the company for nearly three years before being fired last year, said contractors in Facebook’s Ireland office complained to her they had to depend on Google Translate because the company did not assign them content based on what languages they knew. Facebook outsources most content moderation to giant companies that enlist workers far afield, from Casablanca, Morocco, to Essen, Germany. The firms don't sponsor work visas for the Arabic teams, limiting the pool to local hires in precarious conditions — mostly Moroccans who seem to have overstated their linguistic capabilities. They often get lost in the translation of Arabic’s 30-odd dialects, flagging inoffensive Arabic posts as terrorist content 77% of the time, one document said. “These reps should not be fielding content from non-Maghreb region, however right now it is commonplace,” another document reads, referring to the region of North Africa that includes Morocco. The file goes on to say that the Casablanca office falsely claimed in a survey it could handle “every dialect” of Arabic. But in one case, reviewers incorrectly flagged a set of Egyptian dialect content 90% of the time, a report said. Iraq ranks highest in the region for its reported volume of hate speech on Facebook. But among reviewers, knowledge of Iraqi dialect is “close to non-existent,” one document said. 
“Journalists are trying to expose human rights abuses, but we just get banned,” said one Baghdad-based press freedom activist, who spoke on condition of anonymity for fear of reprisals. “We understand Facebook tries to limit the influence of militias, but it’s not working.” Linguists described Facebook’s system as flawed for a region with a vast diversity of colloquial dialects that Arabic speakers transcribe in different ways. “The stereotype that Arabic is one entity is a major problem,” said Enam al-Wer, professor of Arabic linguistics at the University of Essex, citing the language’s “huge variations” not only between countries but class, gender, religion and ethnicity. Despite these problems, moderators are on the front lines of what makes Facebook a powerful arbiter of political expression in a tumultuous region. Although the documents from Haugen predate this year’s Gaza war, episodes from that 11-day conflict show how little has been done to address the problems flagged in Facebook’s own internal reports. Activists in Gaza and the West Bank lost their ability to livestream. Whole archives of the conflict vanished from newsfeeds, a primary portal of information for many users. Influencers accustomed to tens of thousands of likes on their posts saw their outreach plummet when they posted about Palestinians. “This has restrained me and prevented me from feeling free to publish what I want for fear of losing my account,” said Soliman Hijjy, a Gaza-based journalist whose aerials of the Mediterranean Sea garnered tens of thousands more views than his images of Israeli bombs — a common phenomenon when photos are flagged for violating community standards. During the war, Palestinian advocates submitted hundreds of complaints to Facebook, often leading the company to concede error and reinstate posts and accounts. In the internal documents, Facebook reported it had erred in nearly half of all Arabic language takedown requests submitted for appeal. “The repetition of false positives creates a huge drain of resources,” it said. In announcing the reversal of one such Palestinian post removal last month, Facebook’s semi-independent oversight board urged an impartial investigation into the company’s Arabic and Hebrew content moderation. It called for improvement in its broad terrorism blacklist to “increase understanding of the exceptions for neutral discussion, condemnation and news reporting,” according to the board's policy advisory statement. Facebook’s internal documents also stressed the need to “enhance” algorithms, enlist more Arab moderators from less-represented countries and restrict them to where they have appropriate dialect expertise. “With the size of the Arabic user base and potential severity of offline harm … it is surely of the highest importance to put more resources to the task to improving Arabic systems,” said the report. But the company also lamented that “there is not one clear mitigation strategy.” Meanwhile, many across the Middle East worry the stakes of Facebook’s failings are exceptionally high, with potential to widen long-standing inequality, chill civic activism and stoke violence in the region. “We told Facebook: Do you want people to convey their experiences on social platforms, or do you want to shut them down?” said Husam Zomlot, the Palestinian envoy to the United Kingdom, who recently discussed Arabic content suppression with Facebook officials in London. 
“If you take away people’s voices, the alternatives will be uglier.” Akram reported from Gaza City, Gaza Strip. Associated Press writers Sam McNeil in Beijing, Sheikh Saaliq in New Delhi and Barbara Ortutay in Oakland, California, contributed to this report. Via Charles Tiayon
'I often hear from those who wish to achieve a sense of belonging for every student but are worried that their initiatives will inadvertently stoke division or backlash within the community.' Learn more from Dr Emily Meadows and see how her framework can help you and your school. Via roula haj-ismail, Dennis Swender
Unity in diversity saves the day! #diversity #survival #inspiration #shorts
![]() "Abstract: A globalised world brings diversity into the classroom and internationalisation to higher education, where intercultural competence comes to the fore. Accommodating interculturality, however, extends beyond the international student cohort and includes heterogeneous domestic cultures, including Indigenous cultures. In the Australian context, historically Aboriginal and Torres Strait Islander peoples have experienced limited access to culturally appropriate health, social and educational services. Accordingly, higher education institutions can be vehicles of change in this regard. More specifically, just as many higher education providers have moved towards internationalised curriculum, there is increasing evidence and intention to introduce Indigenised curriculum where respective educators delivering indigenised curricula need to be culturally competent. Institutions therefore, are offering cultural training programs for educators delivering Indigenised curricula, where recognition of Indigenous cultural competence amongst educators would be useful. Yet, the review presented in this paper demonstrates a gap in literature regarding measurement of cultural and intercultural competence in the context of Australian higher education. To that end, an instrument specifically designed to measure educator intercultural competence in Australia as related to Aboriginal and Torres Strait Islander peoples is proposed. This instrument will enable higher education institutions to document educator Indigenous cultural competence, demonstrate the intercultural skills of their educational staff and continuously improve intercultural competency within the institution."
Navigating intercultural competence at home. Sharon Schembri. Journal of University Teaching and Learning Practice, 21(4). DOI: 10.53761/1m378720. https://www.researchgate.net/publication/394361347_Navigating_intercultural_competence_at_home #metaglossia_mundus Via Charles Tiayon
Charles Tiayon's curator insight,
August 9, 11:59 AM
"Abstract: A globalised world brings diversity into the classroom and internationalisation to higher education, where intercultural competence comes to the fore. Accommodating interculturality, however, extends beyond the international student cohort and includes heterogeneous domestic cultures, including Indigenous cultures. In the Australian context, historically Aboriginal and Torres Strait Islander peoples have experienced limited access to culturally appropriate health, social and educational services. Accordingly, higher education institutions can be vehicles of change in this regard. More specifically, just as many higher education providers have moved towards internationalised curriculum, there is increasing evidence and intention to introduce Indigenised curriculum where respective educators delivering indigenised curricula need to be culturally competent. Institutions therefore, are offering cultural training programs for educators delivering Indigenised curricula, where recognition of Indigenous cultural competence amongst educators would be useful. Yet, the review presented in this paper demonstrates a gap in literature regarding measurement of cultural and intercultural competence in the context of Australian higher education. To that end, an instrument specifically designed to measure educator intercultural competence in Australia as related to Aboriginal and Torres Strait Islander peoples is proposed. This instrument will enable higher education institutions to document educator Indigenous cultural competence, demonstrate the intercultural skills of their educational staff and continuously improve intercultural competency within the institution." Navigating intercultural competence at home
Educators play a critical role in helping students succeed with their online studies. This new list of resources curated from the websites of Ontario's 24 colleges, 25 universities and 9 Indigenous Institutes is a comprehensive guide designed to support students in their online learning journey. Via Ana Cristina Pratas
Commentary on Stephen's Web ~ Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task by Stephen Downes. Online learning, e-learning, new media, connectivism, MOOCs, personal learning environments, new literacy, and more Via Ana Cristina Pratas
What is the difference between Democrats and Republicans? This nonpartisan comparison compares and contrasts the policies and political positions of the Democratic and Republican parties on major issues such as taxes, the role of government, entitlements (Social Security, Medicare), gun control, immigration, healthcare and civil rights.
From FindLaw. By Lyle Therese A. Hilotin-Lee, J.D. | Legally reviewed by Laura Temme, Esq. | Last reviewed August 05, 2025
I created a series of sketch notes for Tiffani Bova's "What's Next" podcast, where she meets brilliant people to discuss customer experience, growth and innovation. Tiffani Bova is a Globa… Via juandoming