... designed to collect posts and information I found and want to keep available, but that are not relevant to the other topics I am curating on Scoop.it (on behalf of ASSIM):
In her Quantitative Science Studies paper, Lauranne Chaignon traces the 20-year evolution of Clarivate's Highly Cited Researchers (HCR) list: a ranking that once defined academic prestige and still shapes the Shanghai University Rankings.
Key Findings: Three Distinct Eras (2001–2023):
➡️ Database Era (2001–2011): Created by Eugene Garfield's Institute for Scientific Information (ISI) as a rich, biographical database of influential scientists
➡️ Indicator Era (2012–2018): Under Thomson Reuters and Clarivate, the list became an annual metric of research excellence, stripped of biography and tightly linked to university rankings
➡️ Integrity Crisis (2019–2023): Growing cases of self-citation, manipulation, and fake affiliations forced Clarivate to adopt "qualitative filters" and partnerships with watchdogs like Retraction Watch to exclude fraudulent entries (over 13% in 2023)
➡️ Changing purpose: Once a tool for scholarly networking, the list is now a symbol of academic reputation and competition, with major policy and funding implications
➡️ Crisis of credibility: Scandals (e.g. Saudi universities paying foreign researchers for nominal affiliations) have exposed how metrics can be gamed, threatening Clarivate's authority
The Future? After removing mathematics entirely from the 2023 list due to manipulation, Clarivate faces a critical question: can citation-based metrics still define true research influence?
Interesting insights into how a bibliometric tool became both a global status symbol and a cautionary tale about measurement, integrity, and the commercialization of scientific prestige.
Jamie Q. Roberts, University of Sydney: I don't know about you, but ever since I can remember, from my early teens, I have been bemused about the end
The editors of non-predatory scientific journals often do a good job, but without the free help of the scientific community it would not be possible. However, the huge profit these publishers make (at least 32% of their revenue) is not acceptable, as it comes from the money of taxpayers and donors. It should change…
14 GB: that's the size of the AI model I'm running… on my own computer, locally 💻
Not on OpenAI's servers, not in Google's cloud... but on my personal MacBook. And honestly, it changes quite a few things about how I use AI day to day.
Edit: To be clear, this is not an argument that research quality has declined; if anything, standards for rigor and data quality have increased in many fields. The point is about incentives: how evaluation systems influence where scholars invest their time, attention, and energy.
"Publish or perish" continues to shape academic careers, but increasingly at the expense of quality. It's been ten years since I was promoted to full professor, and I find myself re-examining what we choose to measure.
When quantity becomes the primary KPI, something important is lost.
Recent reports show that publication pressure is linked to burnout, ethical dilemmas, and rising retractions (Newsletters QS, 2025; Editors Cafe, 2025; Council Science, 2024). At the same time, major academic presses and research organizations are calling for reward systems that value mentorship, collaboration, and scientific culture just as much as publication counts (Inside Higher Ed, 2025; Nature, 2025).
The landscape is shifting (preprints, open peer review, modular formats), but none of this will matter unless universities change how scholars are evaluated and rewarded.
Most researchers already know what meaningful scholarship looks like:
• Slow, careful thinking
• Genuine collaboration
• Time to mentor and develop others
• The honesty to say, "This study needs more time."
If the goal is real impact, not just output, then: Quality must matter more than quantity. Contribution must matter more than performance metrics. This is the conversation we need to have now.
References
Council Science. (2024). The "publish or perish" mentality is fuelling research paper retractions.
Editors Cafe. (2025). Rethinking "Publish or Perish" in the Age of AI.
Inside Higher Ed. (2025). Major Academic Press Calls for "Publish or Perish" Reform.
Nature. (2025). Move beyond "publish or perish" by measuring behaviours.
Newsletters QS. (2025). How has "publish or perish" become "publish and perish" in academia?
THE INTERNET IS EATING ITSELF The Internet Is Now 52% AI-Generated Content And It's Training On Itself
New research just dropped numbers that should terrify anyone who cares about truth:
**52% of all internet content is now AI-generated.**¹ In 2022, it was 10%.
But here's where it gets insane (actually it's all insane TBH):
**74% of ALL NEW web pages contain AI-generated content.**² The internet added 3–5 billion pages monthly in 2024, most of it synthetic.³
The internet isn't just being eaten. It's being mass-produced by the thing that's eating it.
Why This Matters
Large Language Models aren't brains. They're giant stomachs. They consume everything. Digest nothing. Excrete more content, which gets consumed again: an infinite feedback loop of synthetic regurgitation.
Here's what happens when AI trains on AI:
→ Model collapse: Recursive training causes "irreversible performance decay."⁴
→ Narrowing of knowledge: Models reflect themselves, not reality
→ Death of originality: A hall of mirrors, each reflection dimmer than the last
We're replacing:
Human nuance Cultural context Real expertise Original thought The truth
With statistically probable simulations.
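The recursive-training failure mode has a simple statistical analogue. The sketch below is my own toy illustration, not the post's or Shumailov et al.'s actual experimental setup: each "generation" fits a Gaussian to samples drawn from the previous generation's fitted Gaussian, and the estimated spread drifts toward zero — the toy version of losing the tails of the original distribution, "each reflection dimmer than the last."

```python
import random
import statistics

def collapse_sim(generations=500, n=20, seed=0):
    """Toy model-collapse simulation: refit a Gaussian on samples
    drawn from the previously fitted Gaussian, generation after
    generation, and track the fitted standard deviation."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # generation 0: the "human" data distribution
    history = [sigma]
    for _ in range(generations):
        # Draw a small "synthetic corpus" from the current model ...
        data = [rng.gauss(mu, sigma) for _ in range(n)]
        # ... and refit the model on its own output.
        mu = statistics.fmean(data)
        sigma = statistics.stdev(data)
        history.append(sigma)
    return history

hist = collapse_sim()
print(f"stdev: generation 0 = {hist[0]:.3f}, generation 500 = {hist[-1]:.2e}")
```

The small sample size per generation exaggerates the effect, but the direction is not an artifact: estimation noise compounds multiplicatively, so the fitted variance decays rather than hovering around the true value.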
The Economics
Licensing real content costs billions. Synthetic data? Almost nothing. So they choose cheap scale over real knowledge. No regulation. No transparency. No tracking.
OpenAI doesn't ask permission to train on Anthropic's outputs. They just scrape the web.
Competition accelerates the collapse. Every AI company races to build bigger models.
They need more data. Synthetic data looks like a shortcut. Collectively, they're destroying the foundation their business depends on: real human knowledge.
We've Already Passed the Tipping Point
What happens when:
→ Medical information trains on synthetic medical papers?
→ Children learn history from recursive AI summaries?
→ Scientific research builds on fabricated datasets?
We don't just lose quality. We lose the ability to know what's real. The internet was humanity's collective memory. Now it's becoming humanity's collective hallucination.
The Bottom Line
LLMs are giant stomachs, not brains. They consume. They excrete. They consume again.
"The trick with technology is to avoid spreading darkness at the speed of light."
Stephen Klein | Founder & CEO, Curiouser.AI | Teaches AI Ethics at UC Berkeley
Footnotes:
¹ Graphite Research (2025). Analysis of 65,000 URLs from Common Crawl.
² Ahrefs (2025). Analysis of 900,000 newly created web pages in April 2025.
³ Common Crawl Foundation. Database adds 3–5 billion pages monthly.
⁴ Shumailov et al. (2024). "AI models collapse when trained on recursively generated data." Nature, 631, 755-759.
I discovered that a reading pack for my doctoral leadership subject contained fabricated references. Almost everything listed consisted of AI-generated citations that either didn't exist or linked to the wrong papers.
When I raised it, the provider confirmed that AI had been used and that the material was shared before human review. They also reminded me that doctoral candidates should be able to verify their own sources.
That response was so disappointing. Doctoral candidates are expected to build on verified scholarship, not correct institutional errors. I've asked to withdraw from the course because the university doesn't seem to understand why this is a serious concern and has pushed the responsibility back on me.
Distributing unverified academic material in a foundation subject is a breach of academic integrity and sets entirely the wrong ethical tone for the course.
Am I overreacting? Or is this yet another symptom of the wider issues that are undermining confidence in the sector?
My first visit to Tbilisi, Georgia for the International Conference on Medical Education has been incredible and filled with thoughtful discussions, engaged learners, and the perfect mix of local and international perspectives. Thanks to Salome Voronovi for the invitation, and always nice to see David Taylor.
A concept that really struck a chord is what I've started calling the Suitcase Paradox:
In lifelong learning or curriculum design, just like when packing for a trip, you can't keep adding new things unless you take something out first. And it all has to fit in the overhead compartment: on a plane, the luggage bin; in learning, our anatomical overhead compartment, the brain!
Healthcare professionals must continually unlearn outdated practices to make room for new evidence, new technologies, and new ways of thinking.
That's what lifelong learning, and particularly continuing professional development (CPD), is all about.
But to make it work, educators must evolve into learning facilitators, helping learners curate, adapt, and apply knowledge depending on where they are on the learning continuum.
And because healthcare doesn't happen in silos, neither should learning. Interprofessional education (IPE) brings students from different health professions together.
Interprofessional continuing education (IPCE) extends that collaboration into practice. And when itâs done right, it leads to interprofessional collaborative practice (IPCP), where the ultimate outcome is better patient care.
I even got in a mention of the Donald & Barbara Zucker School of Medicine curriculum!
Plenty more to come: I've still got a wine tour 🍷 ahead and a masterclass on lifelong learning and CPD on Monday!