Philosophy everywhere everywhen
First Law of Philosophy: For every philosopher, there is an equal and opposite philosopher. Second Law of Philosophy: They're both wrong.

The Philosophy of Data

Our ability to gather and process huge amounts of data does many things, including correcting intuitive biases and illuminating patterns of behavior.

Our brains often don’t notice subtle verbal patterns, but Pennebaker’s computers can. Younger writers use more downbeat and past-tense words, while older writers use more positive and future-tense words.

Liars use more upbeat words like “pal” and “friend” but fewer excluding words like “but,” “except” and “without.” (When you are telling a false story, it’s hard to include the things you did not see or think about.)

We think of John Lennon as the most intellectual of the Beatles, but, in fact, Paul McCartney’s lyrics had more flexible and diverse structures and George Harrison’s were more cognitively complex.
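
To make the kind of analysis Pennebaker’s programs perform concrete, here is a minimal sketch of dictionary-based category counting. It is an illustration only: the word lists are invented stand-ins, not the actual LIWC dictionaries, and the function names are mine.

```python
# Toy sketch of dictionary-based text analysis in the spirit of
# Pennebaker's word-counting programs. The category word lists are
# invented stand-ins, NOT the real LIWC dictionaries.
import re
from collections import Counter

CATEGORIES = {
    "positive": {"pal", "friend", "happy", "great"},
    "exclusion": {"but", "except", "without", "unless"},
    "past_tense": {"was", "were", "had", "did", "went"},
    "future_tense": {"will", "shall"},
}

def category_rates(text: str) -> dict:
    """Return each category's share of the text's total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1  # avoid division by zero on empty input
    counts = Counter()
    for w in words:
        for cat, vocab in CATEGORIES.items():
            if w in vocab:
                counts[cat] += 1
    return {cat: counts[cat] / total for cat in CATEGORIES}

print(category_rates("My pal went out, but my friend did not."))
```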


'Anumeric' people: What happens when a language has no words for numbers?

Numbers do not exist in all cultures. There are numberless hunter-gatherers embedded deep in Amazonia, living along branches of the world’s largest river tree. Instead of using words for precise quantities, these people rely exclusively on terms analogous to “a few” or “some.”

In contrast, our own lives are governed by numbers. As you read this, you are likely aware of what time it is, how old you are, your checking account balance, your weight and so on. The exact (and exacting) numbers we think with impact everything from our schedules to our self-esteem.

But, in a historical sense, numerically fixated people like us are the unusual ones. For the bulk of our species’ approximately 200,000-year lifespan, we had no means of precisely representing quantities. What’s more, the 7,000 or so languages that exist today vary dramatically in how they utilize numbers.

Speakers of anumeric, or numberless, languages offer a window into how the invention of numbers reshaped the human experience. In a new book, I explore the ways in which humans invented numbers, and how numbers subsequently played a critical role in other milestones, from the advent of agriculture to the genesis of writing.
Ivon Prefontaine, PhD's curator insight, April 27, 5:07 PM
Somewhere in this is a message about integrating quality and quantity.

Hierarchies have a place even in societies built on equality – Stephen C Angle, Kwame Anthony Appiah, Julian Baggini and others | Aeon Essays

The modern West has placed a high premium on the value of equality. Equal rights are enshrined in law while old hierarchies of nobility and social class have been challenged, if not completely dismantled. Few would doubt that global society is all the better for these changes. But hierarchies have not disappeared. Society is still stratified according to wealth and status in myriad ways.

On the other hand, the idea of a purely egalitarian world in which there are no hierarchies at all would appear to be both unrealistic and unattractive. Nobody, on reflection, would want to eliminate all hierarchies, for we all benefit from the recognition that some people are more qualified than others to perform certain roles in society. We prefer to be treated by senior surgeons not medical students, get financial advice from professionals not interns. Good and permissible hierarchies are everywhere around us.

Yet hierarchy is an unfashionable thing to defend or to praise. British government ministers denounce experts as out of tune with popular feeling; both Donald Trump and Bernie Sanders built platforms on attacking Washington elites; economists are blamed for not predicting the 2008 crash; and even the best-established practices of medical experts, such as childhood vaccination, are treated with resistance and disbelief. We live in a time when no distinction is drawn between justified and useful hierarchies on the one hand, and self-interested, exploitative elites on the other.

Materialism alone cannot explain the riddle of consciousness – Adam Frank | Aeon Essays

Materialism holds the high ground these days in debates over that most ultimate of scientific questions: the nature of consciousness. When tackling the problem of mind and brain, many prominent researchers advocate for a universe fully reducible to matter. ‘Of course you are nothing but the activity of your neurons,’ they proclaim. That position seems reasonable and sober in light of neuroscience’s advances, with brilliant images of brains lighting up like Christmas trees while test subjects eat apples, watch movies or dream. And aren’t all the underlying physical laws already known?

From this seemingly hard-nosed vantage, the problem of consciousness seems to be just one of wiring, as the American physicist Michio Kaku argued in The Future of the Mind (2014). In the very public version of the debate over consciousness, those who advocate that understanding the mind might require something other than a ‘nothing but matter’ position are often painted as victims of wishful thinking, imprecise reasoning or, worst of all, an adherence to a mystical ‘woo’.

It’s hard not to feel the intuitional weight of today’s metaphysical sobriety. Like Pickett’s Charge up the hill at Gettysburg, who wants to argue with the superior position of those armed with ever more precise fMRIs, EEGs and the other material artefacts of the materialist position? There is, however, a significant weakness hiding in the imposing-looking materialist redoubt. It is as simple as it is undeniable: after more than a century of profound explorations into the subatomic world, our best theory for how matter behaves still tells us very little about what matter is. Materialists appeal to physics to explain the mind, but in modern physics the particles that make up a brain remain, in many ways, as mysterious as consciousness itself.

When I was a young physics student I once asked a professor: ‘What’s an electron?’ His answer stunned me. ‘An electron,’ he said, ‘is that to which we attribute the properties of the electron.’ That vague, circular response was a long way from the dream that drove me into physics, a dream of theories that perfectly described reality. Like almost every student over the past 100 years, I was shocked by quantum mechanics, the physics of the micro-world. In place of a clear vision of little bits of matter that explain all the big things around us, quantum physics gives us a powerful yet seemingly paradoxical calculus. With its emphasis on probability waves, essential uncertainties and experimenters disturbing the reality they seek to measure, quantum mechanics made imagining the stuff of the world as classical bits of matter (or miniature billiard balls) all but impossible.

Panpsychism is crazy, but it’s also most probably true – Philip Goff | Aeon Ideas

Common sense tells us that only living things have an inner life. Rabbits and tigers and mice have feelings, sensations and experiences; tables and rocks and molecules do not. Panpsychists deny this datum of common sense. According to panpsychism, the smallest bits of matter – things such as electrons and quarks – have very basic kinds of experience; an electron has an inner life.

The main objection made to panpsychism is that it is ‘crazy’ and ‘just obviously wrong’. It is thought to be highly counterintuitive to suppose that an electron has some kind of inner life, no matter how basic, and this is taken to be a very strong reason to doubt the truth of panpsychism. But many widely accepted scientific theories are also crazily counter to common sense. Albert Einstein tells us that time slows down at high speeds. According to standard interpretations of quantum mechanics, particles have determinate positions only when measured. And according to Charles Darwin’s theory of evolution, our ancestors were apes. All of these views are wildly at odds with our common-sense view of the world, or at least they were when they were first proposed, but nobody thinks this is a good reason not to take them seriously. Why should we take common sense to be a good guide to how things really are?

No doubt the willingness of many to accept special relativity, natural selection and quantum mechanics, despite their strangeness from the point of view of pre-theoretical common sense, is a reflection of their respect for the scientific method. We are prepared to modify our view of the world if we take there to be good scientific reason to do so. But in the absence of hard experimental proof, people are reluctant to attribute consciousness to electrons.

How fashion moves philosophy forward – J Bradley Studemeyer | Aeon Ideas

The rise and fall of popular positions in the field of philosophy is not governed solely by reason. Philosophers are generally reasonable people but, as with the rest of the human species, their thoughts are heavily influenced by their social settings. Indeed, they are perhaps more influenced than thinkers in other fields, since popular or ‘big’ ideas in modern philosophy change more frequently than ideas in, say, chemistry or biology. Why?

The relative instability of philosophical positions is a result of how the discipline is practised. In philosophy, questions about methods and limitations are on the table in a way that they tend not to be in the physical sciences, for example. Scientists generally acknowledge a ‘gold standard’ for validity – the scientific method – and, for the most part, the way in which investigations are conducted is more or less settled. Falsifiability rules the scientific disciplines: almost all scientists are in agreement that, if a hypothesis isn’t testable, then it isn’t scientific. There is no counterpart to this in philosophy. Here, students and professors continue to ask: ‘Which questions can we ask?’ and ‘How can we ask, much less answer, those questions?’ There is no universally agreed-upon way in which to do philosophy.

Given that philosophy’s foundational questions and methods are still far from settled – and perhaps never will be – it’s natural that there is more flux, more volatility, in philosophy than in the physical sciences. But this volatility is not like the paradigm shifts described by the US historian of science Thomas Kuhn. A better analogy, in fact, would be changes of fashion.

This Simple Philosophical Puzzle Shows How Difficult It Is to Know Something - Facts So Romantic - Nautilus

In the 1960s, the American philosopher Edmund Gettier devised a thought experiment that has become known as a “Gettier case.” It shows that something’s “off” about the way we understand knowledge. The puzzle it raises is called the “Gettier problem,” and 50 years later, philosophers are still arguing about it. Jennifer Nagel, a philosopher of mind at the University of Toronto, sums up its appeal. “The resilience of the Gettier problem,” she says, “suggests that it is difficult (if not impossible) to develop any explicit reductive theory of knowledge.”

What is knowledge? Well, thinkers for thousands of years had more or less taken one definition for granted: Knowledge is “justified true belief.” The reasoning seemed solid: Just believing something that happens to be true doesn’t necessarily make it knowledge. If your friend says to you that she knows what you ate last night (say it’s veggie pizza), and happens to be right after guessing, that doesn’t mean she knew. That was just a lucky guess—a mere true belief. Your friend would know, though, if she said veggie pizza because she saw you eat it—that’s the “justification” part. Your friend, in that case, would have good reason to believe you ate it.

The Gettier problem is renowned because Gettier showed, using little short stories, that this intuitive definition of knowledge was flawed. His 1963 paper, titled “Is Justified True Belief Knowledge?” resembles an undergraduate assignment. It’s just three pages long. But that’s all Gettier needed to revolutionize his field, epistemology, the study of the theory of knowledge.

The “problem” in a Gettier problem emerges in little, unassuming vignettes. Gettier had his, and philosophers have since come up with variations of their own. Try this version, from the University of Birmingham philosopher Scott Sturgeon:

Suppose I burgle your house, find two bottles of Newcastle Brown in the kitchen, drink and replace them. You remember purchasing the ale and come to believe there will be two bottles waiting for you at home. Your belief is justified and true, but you do not know what’s going on.

The post-truth era of Trump is just what Nietzsche predicted

The morning of the US presidential election, I was leading a graduate seminar on Friedrich Nietzsche’s critique of truth. It turned out to be all too apt.

Nietzsche, the German counter-Enlightenment thinker of the late 19th century, seemed to suggest that objective truth – the concept of truth that most philosophers relied on at the time – doesn’t really exist. That idea, he wrote, is a relic of an age when God was the guarantor of what counted as the objective view of the world, but God is dead, meaning that objective, absolute truth is an impossibility. God’s point of view is no longer available to determine what is true.

Nietzsche fancied himself a prophet of things to come – and not long after Donald Trump won the presidency, the Oxford Dictionaries declared the international word of the year 2016 to be “post-truth”.

Indeed, one of the characteristics of Trump’s campaign was its scorn for facts and the truth. Trump himself unabashedly made any claim that seemed fit for his purpose of being elected: that crime levels are sky-high, that climate change is a Chinese hoax, that he’d never called it a Chinese hoax, and so on. But the exposure of his constant contradictions and untruths didn’t stop him. He won.

Nietzsche offers us a way of understanding how this happened. As he saw it, once we realise that the idea of an absolute, objective truth is a philosophical hoax, the only alternative is a position called “perspectivism” – the idea there is no one objective way the world is, only perspectives on what the world is like.

This might seem outlandish. After all, surely we all agree certain things are objectively true: Trump’s predecessor as president is Barack Obama, the capital of France is Paris, and so on. But according to perspectivism, we agree on those things not because these propositions are “objectively true”, but by virtue of sharing the same perspective.

When it comes to basic matters, sharing a perspective on the truth is easy – but when it comes to issues such as morality, religion and politics, agreement is much harder to achieve. People occupy different perspectives, seeing the world and themselves in radically different ways. These perspectives are each shaped by the biases, the desires and the interests of those who hold them; they can vary wildly, and therefore so can the way people see the world.
Your truth, my truth

A core tenet of Enlightenment thought was that our shared humanity, or a shared faculty called reason, could serve as an antidote to differences of opinion, a common ground that can function as the arbiter of different perspectives. Of course people disagree, but, the idea goes, through reason and argument they can come to see the truth. Nietzsche’s philosophy, however, claims such ideals are philosophical illusions, wishful thinking, or at worst a covert way of imposing one’s own view on everyone else under the pretence of rationality and truth.

Authenticity in the Age of the Fake - Issue 42: Fakes - Nautilus

The announcement of synthetic diamonds in 1955 was met with the same kind of alarm and skepticism that greeted claims to have made alchemical gold in the Middle Ages. Could these “fake” gems, created by a team at the General Electric research laboratories in Schenectady, New York, really match the genuine article? One anonymous critic from California captured a widespread suspicion in blunt terms when he wrote to GE, saying:

You can’t make real diamonds for they are nature grown. You can’t make gold; no one can. They dig gold out of the ground and also diamonds. But no one can make them with a machine. That is just a lot of bull.
Yet what if it were true that diamonds really can be manufactured? When GE revealed the discovery, the stock of the De Beers diamond cartel in South Africa, which dominated the global market, plummeted. It seemed like a rare and precious commodity was about to be supplanted by an artificial form that could be fabricated by the ton, mirroring a millennia-old concern about the devastating power of fakes. Concerns over the devaluation of gold currency led the Roman emperor Diocletian to ban alchemy in the third century, and worries about counterfeiting and debased coinage also lay behind the condemnations of the art by Pope John XXII in 1317 and of King Henry IV of England in 1403. This, though, was no alchemy: The GE diamonds were perfect chemical replicas of the real thing. Was it the end of a billion-dollar market?

The answer was no. “Fake” diamonds are cheaper, and for industrial uses they have utterly eclipsed their natural counterparts. But at the luxury end of the market—gemstones for jewelry—artificial diamonds account for only 2 percent of global sales. How come?

When it comes to luxury and exotic materials, the competition between fake and real is partly a technical, chemical affair: how to create a good imitation, and how to spot it. But, as artificial gold and diamonds show, there is a deeper level to it, which is about something very human and socially constructed: the concept and value of authenticity.

Dalai Lama: Behind Our Anxiety, the Fear of Being Unneeded

In many ways, there has never been a better time to be alive. Violence plagues some corners of the world, and too many still live under the grip of tyrannical regimes. And although all the world’s major faiths teach love, compassion and tolerance, unthinkable violence is being perpetrated in the name of religion.

And yet, fewer among us are poor, fewer are hungry, fewer children are dying, and more men and women can read than ever before. In many countries, recognition of women’s and minority rights is now the norm. There is still much work to do, of course, but there is hope and there is progress.

How strange, then, to see such anger and great discontent in some of the world’s richest nations. In the United States, Britain and across the European Continent, people are convulsed with political frustration and anxiety about the future. Refugees and migrants clamor for the chance to live in these safe, prosperous countries, but those who already live in those promised lands report great uneasiness about their own futures that seems to border on hopelessness.

Why?

A small hint comes from interesting research about how people thrive. In one shocking experiment, researchers found that senior citizens who didn’t feel useful to others were nearly three times as likely to die prematurely as those who did feel useful. This speaks to a broader human truth: We all need to be needed.

Being “needed” does not entail selfish pride or unhealthy attachment to the worldly esteem of others. Rather, it consists of a natural human hunger to serve our fellow men and women. As the 13th-century Buddhist sages taught, “If one lights a fire for others, it will also brighten one’s own way.”


Consciousness could be a side effect of 'entropy', say researchers

It's impressive enough that our human brains are made up of the same 'star stuff' that forms the Universe, but new research suggests that this might not be the only thing the two have in common.

Just like the Universe, our brains might be programmed to maximise disorder - similar to the principle of entropy - and our consciousness could simply be a side effect.

The quest to understand human consciousness - our ability to be aware of ourselves and our surroundings - has been going on for centuries. Although consciousness is a crucial part of being human, researchers still don't truly understand where it comes from, and why we have it.

But a new study, led by researchers from France and Canada, puts forward a new possibility: what if consciousness arises naturally as a result of our brains maximising their information content? In other words, what if consciousness is a side effect of our brain moving towards a state of entropy?

Entropy is basically the term used to describe the progression of a system from order to disorder. Picture an egg: when it's all perfectly separated into yolk and white, it has low entropy, but when you scramble it, it has high entropy - it's the most disordered it can be.

This is what many physicists believe is happening to our Universe. Since the Big Bang, the Universe has gradually been moving from a state of low entropy to high entropy, and because the second law of thermodynamics states that entropy can only increase in an isolated system, it could explain why the arrow of time only ever moves forwards.

So researchers decided to apply the same thinking to the connections in our brains, and investigate whether they show any patterns in the way they choose to order themselves while we're conscious.
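
As a rough mathematical anchor, entropy can be written in Boltzmann’s statistical form, where a ‘disordered’ state is simply one compatible with vastly more microscopic configurations. This is the standard textbook formulation, not an equation quoted from the study itself:

```latex
% Boltzmann's statistical definition of entropy (standard textbook
% form, not quoted from the study). \Omega counts the microscopic
% configurations compatible with the observed macroscopic state.
\[
  S = k_B \ln \Omega
\]
% The scrambled egg is "the most disordered it can be" because
% \Omega_{\mathrm{scrambled}} \gg \Omega_{\mathrm{separated}},
% hence S_{\mathrm{scrambled}} > S_{\mathrm{separated}}.
% On this analogy, a conscious brain would be one whose activity is
% compatible with more connectivity configurations (a larger \Omega).
```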

Bob Dylan's Nobel prize – and what really defines literature

Bob Dylan’s Nobel prize win, and the ensuing debate as to whether a musician should have been considered at all, is a striking comment on the seemingly glib question of what literature actually is. And with the Man Booker prize also just around the corner, how and why literature matters are topics currently animating plenty of cultural debate.

Assessing the literary merit of Dylan’s work is nothing new. Christopher Ricks, a former Professor of Poetry at the University of Oxford, published a book on Dylan back in 2003, and a Cambridge Companion to Bob Dylan was released a few years later. But others have argued that his Nobel award snubs those who write “literature” — as in, in books.

The Nobel Prize in Literature is awarded to a writer who has produced for the field of literature, in Alfred Nobel’s words, “the most outstanding work in an ideal direction”. Dylan won the prize for having “created new poetic expressions”. The UK’s former Poet Laureate Andrew Motion commented that Dylan’s songs are “often the best words in the best order”. And Professor Sara Danius, Permanent Secretary of the Swedish Academy, spoke of Dylan’s “pictorial thinking”. The week before Dylan’s win, David Szalay’s All That Man Is, shortlisted for the Man Booker, won the Gordon Burn prize. The judges said the novel “subtly changes the way you look at the contemporary world”.

But what is an “ideal direction” for literature? And how exactly does literature change our relationship with “the contemporary world”?

Is humanity losing faith in reason?

The rise in popularity of Donald Trump and the emotionally charged Brexit referendum have led many observers to proclaim that we are in an era of post-truth politics. Leave campaigner and Conservative MP Michael Gove spoke to this new reality when he declared that people “have had enough of experts”.

Truth doesn’t seem to matter so much as truthiness – the quality of something seeming to be true even if it’s false.

Have we lost faith in reason? Philosopher Julian Baggini asks this question in a well-timed and cogently argued new book, The Edge of Reason: A Rational Skeptic in an Irrational World. Baggini looks at why rationality gets a bad rap these days. In many fields, the experts have let us down. Science has arguably over-reached. And religion – something billions of people continue to hold dear – is frequently portrayed in secular society as incompatible with intellectual coherence if not sanity.

A related phenomenon is the modern penchant for reducing reason to just logic or scientific reasoning. This would seem to rule out the possibility of there being moral truths, something humanity should be slow to surrender.

As Baggini puts it, “reason is not only disinterested reason, acting independently of anything other than value-free facts and logic”. Providing today’s “Unthinkable” idea, he adds: “We cannot look to disinterested reason to provide the basis of morality.”

You say reason is under fire: in what way?

Julian Baggini: “Let me count the ways! First and most recently, the widely documented loss of faith in experts and elites assumes that having greater knowledge and experience in thinking about issues counts for nothing and could even get in the way of a superior common sense. The brain is seen as having failed us and so the gut is trusted instead.

“Second, reason is associated with a dry, scientific world view that has no place for emotion, intuition or faith. Logic is for robots, we are not robots, therefore logic is not for us, which is ironically an attempt at arguing logically.

“Third, reason is routinely dismissed as merely a means of rationalising our prejudices and instinctive beliefs. ‘We all know’ that psychology has shown that the rational mind is not in charge and that the unconscious, driven by emotion and automatic processing, rules.

The 50 Most Influential Living Philosophers | The Best Schools

Here are the 50 most influential living philosophers, actively changing our understanding of ourselves and our world. Philosophy is far from dead!


Are human rights anything more than legal conventions? – John Tasioulas | Aeon Ideas

We live in an age of human rights. The language of human rights has become ubiquitous, a lingua franca used for expressing the most basic demands of justice. Some are old demands, such as the prohibition of torture and slavery. Others are newer, such as claims to internet access or same-sex marriage. But what are human rights, and where do they come from? This question is made urgent by a disquieting thought. Perhaps people with clashing values and convictions can so easily appeal to ‘human rights’ only because, ultimately, they don’t agree on what they are talking about? Maybe the apparently widespread consensus on the significance of human rights depends on the emptiness of that very notion? If this is true, then talk of human rights is rhetorical window-dressing, masking deeper ethical and political divisions.

Philosophers have debated the nature of human rights since at least the 12th century, often under the name of ‘natural rights’. These natural rights were supposed to be possessed by everyone and discoverable with the aid of our ordinary powers of reason (our ‘natural reason’), as opposed to rights established by law or disclosed through divine revelation. Wherever there are philosophers, however, there is disagreement. Belief in human rights left open how we go about making the case for them – are they, for example, protections of human needs generally or only of freedom of choice? There were also disagreements about the correct list of human rights – should it include socio-economic rights, like the rights to health or work, in addition to civil and political rights, such as the rights to a fair trial and political participation?

But many now argue that we should set aside philosophical wrangles over the nature and origins of human rights. In the 21st century, they contend, human rights exist not in the nebulous ether of philosophical speculation, but in the black letter of law. Human rights are those laid down in The Universal Declaration of Human Rights (1948) and the various international and domestic laws that implement it. Some who adopt this line of thought might even invoke the 18th-century English philosopher Jeremy Bentham, who contemptuously dismissed the idea of natural rights existing independently of human-made laws as ‘rhetorical nonsense – nonsense upon stilts’.

Why Foucault's work on power is more important than ever – Colin Koopman | Aeon Essays

Imagine you are asked to compose an ultra-short history of philosophy. Perhaps you’ve been challenged to squeeze the impossibly sprawling diversity of philosophy itself into just a few tweets. You could do worse than to search for the single word that best captures the ideas of every important philosopher. Plato had his ‘forms’. René Descartes had his ‘mind’ and John Locke his ‘ideas’. John Stuart Mill later had his ‘liberty’. In more recent philosophy, Jacques Derrida’s word was ‘text’, John Rawls’s was ‘justice’, and Judith Butler’s remains ‘gender’. Michel Foucault’s word, according to this innocent little parlour game, would certainly be ‘power’.

Foucault remains one of the most cited 20th-century thinkers and is, according to some lists, the single most cited figure across the humanities and social sciences. His two most referenced works, Discipline and Punish: The Birth of the Prison (1975) and The History of Sexuality, Volume One (1976), are the central sources for his analyses of power. Interestingly enough, however, Foucault was not always known for his signature word. He first gained his massive influence in 1966 with the publication of The Order of Things. The original French title gives a better sense of the intellectual milieu in which it was written: Les mots et les choses, or ‘Words and Things’. Philosophy in the 1960s was all about words, especially among Foucault’s contemporaries.

In other parts of Paris, Derrida was busily asserting that ‘there is nothing outside the text’, and Jacques Lacan turned psychoanalysis into linguistics by claiming that ‘the unconscious is structured like a language’. This was not just a French fashion. In 1967 Richard Rorty, surely the most infamous American philosopher of his generation, summed up the new spirit in the title of his anthology of essays, The Linguistic Turn. That same year, Jürgen Habermas, soon to become Germany’s leading philosopher, published his attempt at ‘grounding the social sciences in a theory of language’.

If You Think You’re a Genius, You’re Crazy - Issue 46: Balance - Nautilus

When John Forbes Nash, the Nobel Prize-winning mathematician, schizophrenic, and paranoid delusional, was asked how he could believe that space aliens had recruited him to save the world, he gave a simple response. “Because the ideas I had about supernatural beings came to me the same way that my mathematical ideas did. So I took them seriously.”

Nash is hardly the only so-called mad genius in history. Suicide victims like painters Vincent Van Gogh and Mark Rothko, novelists Virginia Woolf and Ernest Hemingway, and poets Anne Sexton and Sylvia Plath all offer prime examples. Even ignoring those great creators who did not kill themselves in a fit of deep depression, it remains easy to list persons who endured well-documented psychopathology, including the composer Robert Schumann, the poet Emily Dickinson, and Nash. Creative geniuses who have succumbed to alcoholism or other addictions are also legion.

Instances such as these have led many to suppose that creativity and psychopathology are intimately related. Indeed, the notion that creative genius might have some touch of madness goes back to Plato and Aristotle. But some recent psychologists argue that the whole idea is a pure hoax. After all, it is certainly no problem to come up with the names of creative geniuses who seem to have displayed no signs or symptoms of mental illness.

How playing Wittgensteinian language-games can set us free – Sandy Grant | Aeon Ideas

We live out our lives amid a world of language, in which we use words to do things. Ordinarily we don’t notice this; we just get on with it. But the way we use language affects how we live and who we can be. We are as if bewitched by the practices of saying that constitute our ways of going on in the world. If we want to change how things are, then we need to change the way we use words. But can language-games set us free?

It was the maverick philosopher Ludwig Wittgenstein who coined the term ‘language-game’. He contended that words acquire meaning by their use, and wanted to see how their use was tied up with the social practices of which they are a part. So he used ‘language-game’ to draw attention not only to language itself, but to the actions into which it is woven. Consider the exclamations ‘Help!’ ‘Fire!’ ‘No!’ These do something with words: soliciting, warning, forbidding. But Wittgenstein wanted to expose how ‘words are deeds’, that we do something every time we use a word. Moreover, what we do, we do in a world with others.

This was not facile word-nerdery. Wittgenstein was intent on bringing out how ‘the “speaking” of language is part of an activity, or form of life’. In Philosophical Investigations (1953), he used the example of two builders. A brickie calls ‘Slab!’ and his helper brings it. What’s going on here? The helper who responds is not like a dog reacting to an order. We are humans, the ones who live together in language in the particular way that we do, a way that involves distinctive social practices.

Why Tolkien's fantastic imaginary languages have had more impact than Esperanto

JRR Tolkien began writing The Fall of Gondolin while on medical leave from the first world war, 100 years ago this month. It is the first story in what would become his legendarium – the mythology that underpins The Lord of the Rings. But behind the fiction was his interest in another epic act of creation: the construction of imaginary languages.

That same year, on the other side of Europe, Ludwik Zamenhof died in his native Poland. Zamenhof had also been obsessed with language invention, and in 1887 brought out a book introducing his own creation. He published this under the pseudonym Doktoro Esperanto, which in time became the name of the language itself.

The construction of imaginary languages, or conlangs, has a long history, dating back to the 12th century. And Tolkien and Zamenhof are two of its most successful proponents. Yet their aims were very different, and in fact point to opposing views of what language itself actually is.

Zamenhof, a Polish Jew growing up in a country where cultural and ethnic animosity was rife, believed that the existence of a universal language was the key to peaceful co-existence. Although language is the “prime motor of civilisation” he wrote, “difference of speech is a cause of antipathy, nay even of hatred, between people”. His plan was to devise something which was simple to learn, not tied to any one nation or culture, and could thus help unite rather than divide humanity.

As “international auxiliary languages” go, Esperanto has been very successful. At its peak, its speakers numbered in the millions, and although exact estimates are very difficult to make, even today up to a million people still use it. It has an expansive body of native literature, there’s a museum in China dedicated exclusively to it, while in Japan Zamenhof himself is even honoured as a god by one particular Shinto sect who use the language. Yet it never really came close to achieving his dreams of world harmony. And at his death, with World War I tearing Europe apart, the optimism he’d had for it had turned mostly to disillusion.

Extraterrestrials May Be Robots Without Consciousness - Cosmos on Nautilus

Humans are probably not the greatest intelligences in the universe. Earth is a relatively young planet and the oldest civilizations could be billions of years older than us. But even on Earth, Homo sapiens may not be the most intelligent species for that much longer.

The world Go, chess, and Jeopardy champions are now all AIs. AI is projected to outmode many human professions within the next few decades. And given the rapid pace of its development, AI may soon advance to artificial general intelligence—intelligence that, like human intelligence, can combine insights from different topic areas and display flexibility and common sense. From there it is a short leap to superintelligent AI, which is smarter than humans in every respect, even those that now seem firmly in the human domain, such as scientific reasoning and social skills. Each of us alive today may be one of the last rungs on the evolutionary ladder that leads from the first living cell to synthetic intelligence.

What we are only beginning to realize is that these two forms of superhuman intelligence—alien and artificial—may not be so distinct. The technological developments we are witnessing today may have all happened before, elsewhere in the universe. The transition from biological to synthetic intelligence may be a general pattern, instantiated over and over, throughout the cosmos. The universe’s greatest intelligences may be postbiological, having grown out of civilizations that were once biological. (This is a view I share with Paul Davies, Steven Dick, Martin Rees, and Seth Shostak, among others.) To judge from the human experience—the only example we have—the transition from biological to postbiological may take only a few hundred years.

How would the Stoics cope today? | Ryan Holiday

Some of us are stressed. Others are overworked, struggling with the new responsibilities of parenthood, or moving from one flawed relationship to another. Whatever it is, whatever you are going through, there is wisdom from the Stoics that can help.

Followers of this ancient and inscrutable philosophy have found themselves at the centre of some of history’s most trying ordeals, from the French Revolution to the American Civil War to the prison camps of Vietnam. Bill Clinton reportedly reads the Roman emperor and Stoic Marcus Aurelius’s Meditations once a year, and one can imagine him handing a copy to Hillary after her heart-wrenching loss in the US presidential election.

Stoicism is a school of philosophy founded in Athens in the early 3rd century BC; it later spread to Rome, where it became a pragmatic way of addressing life’s problems. The central message is: we don’t control what happens to us; we control how we respond.

The Stoics were really writing and thinking about one thing: how to live. The questions they asked were not arcane or academic but practical and real. “What do I do about my anger?” “What do I do if someone insults me?” “I’m afraid to die; why is that?” “How can I deal with the difficult situations I face?” “How can I deal with the success or power I hold?”

There also happens to be a decent amount of advice on how to live under the looming threat of a tyrant (“I may wish to be free from torture, but if the time comes for me to endure it, I’ll wish to bear it courageously with bravery and honour,” wrote the Roman philosopher Seneca). All of which makes Stoic philosophy particularly well-suited to the world we live in.

While it would be hard to find a word dealt a greater injustice at the hands of the English language than “stoicism”— with its mistaken connotations of austerity and lack of emotion — in fact, nothing could be more necessary for our times than a good dose of Stoic philosophy.

We Were Wrong About Consciousness Disappearing in Dreamless Sleep, Say Scientists

When it comes to dreamlessness, conventional wisdom states that consciousness disappears when we fall into a deep, dreamless sleep.

But researchers have come up with a new way to define the different ways that we experience dreamlessness, and say there’s no evidence to suggest that our consciousness 'switches off' when we stop dreaming. In fact, they say the state of dreamlessness is way more complicated than we’d even imagined.

"[T]he idea that dreamless sleep is an unconscious state is not well-supported by the evidence," one of the researchers, Evan Thompson from the University of British Columbia in Canada, told Live Science.

Instead, he says the evidence points to the possibility of people having conscious experiences during all states of sleep - including deep sleep - and that could have implications for those accused of committing a crime while sleepwalking.

But first off, what exactly is dreamlessness?

Traditionally, dreamlessness is defined as that part of sleep that occurs between bouts of dreams - a time of deep sleep when your conscious experience is temporarily switched off. This is different from those times when you simply cannot remember your dreams once you've woken up.

As dream researchers from the University of California, Santa Cruz explain, most people over the age of 10 dream at least four to six times per night during a stage of sleep called REM, or Rapid Eye Movement. (Studies suggest that children under age 10 only dream during roughly 20 percent of their REM periods.)

Considering REM periods vary in length, from 5 to 10 minutes for the first REM period of the night to as long as 30-34 minutes later in the night, researchers have suggested that each dream is probably no longer than 34 minutes.

Stephen Hawking: AI will be 'either best or worst thing' for humanity

Professor Stephen Hawking has warned that the creation of powerful artificial intelligence will be “either the best, or the worst thing, ever to happen to humanity”, and praised the creation of an academic institute dedicated to researching the future of intelligence as “crucial to the future of our civilisation and our species”.

Hawking was speaking at the opening of the Leverhulme Centre for the Future of Intelligence (LCFI) at Cambridge University, a multi-disciplinary institute that will attempt to tackle some of the open-ended questions raised by the rapid pace of development in AI research.

“We spend a great deal of time studying history,” Hawking said, “which, let’s face it, is mostly the history of stupidity. So it’s a welcome change that people are studying instead the future of intelligence.”

Would you ditch your therapist for a “philosophical counselor”?

Instead of going to traditional psychotherapists for advice and support, growing numbers of people are turning to philosophical counselors for particularly wise guidance. These counselors work much like traditional psychotherapists. But instead of offering solutions based solely on their understanding of mental health or psychology, philosophical counselors offer solutions and guidance drawn from the writings of great thinkers.

Millennia of philosophical study can provide practical advice for those facing everyday difficulties: There’s an entire field of philosophy that explores moral issues; Stoic philosophers show us how to weather hardship; the existentialists advise on anxiety; and Aristotle was one of the first thinkers to question what makes a “good life.” All these topics make up a good chunk of any therapy session, philosophical or otherwise.

Philosophical counseling has been available since the early 1990s, when Elliot Cohen came up with the idea and founded the National Philosophical Counseling Association (NPCA) with around 20 counselors. The NPCA’s website suggests writer’s block, job loss, procrastination, and rejection are all appropriate subjects for philosophical guidance. (However, counselors will refer clients to a psychiatrist if they think they’re suffering from a serious mental health issue.) Clients pay about $100 a session for philosophically guided advice, and each session lasts roughly an hour.

“I saw so many people who had all these problems of living that seemed to be amenable to the thinking that students do in Philosophy 101 and Introduction to Logic,” Cohen says. He often draws on the French existentialist Jean-Paul Sartre, who believed that you are nothing more than your own actions. “If you don’t act, you don’t define yourself and you don’t become anything but a disappointed dream or expectation,” he adds.

Can Transcendence Be Taught?

For us, two professors, the opening words of Goethe’s Faust have always been slightly disturbing, but only recently, as we’ve grown older, have they come to haunt us.

Faust sits in his dusty library, surrounded by tomes, and laments the utter inadequacy of human knowledge. He was no average scholar but a true savant — a master in the liberal arts of philosophy and theology and the practical arts of jurisprudence and medicine. In the medieval university, those subjects were the culminating moments of a lifetime of study in rhetoric, logic, grammar, arithmetic, geometry, music, and astronomy.

In other words, Faust knows everything worth knowing. And still, after all his careful bookwork, he arrives at the unsettling realization that none of it has really mattered. His scholarship has done pitifully little to unlock the mystery of human life.

Are we and our students in that same situation? Are we teaching them everything without teaching them anything regarding the big questions that matter most? Is there a curriculum that addresses why we are here? And why we live only to suffer and die?

Those questions are at the root of every great myth and wisdom tradition: the Katha Upanishad, the opening lines of the Bhagavad Gita, Sophocles’ Ajax, and the Book of Job among them. Job cries to the heavens, entreating God to clarify the tortuous perplexity of being human. But God does not oblige, and Job is left in a whirlwind, in the dark, just like Faust at the beginning of Goethe’s modern remake of the ancient biblical story.

Humans may speak a 'universal' language

Even if you’re not fluent in other languages, you may still be able to recognise words in them. The German for water is ‘Wasser’, in Dutch it’s ‘water’ and in Serbian it’s ‘voda’. Similar sounds and letters are used to form the word across languages.

Looking at this phenomenon, researchers at Cornell’s Cognitive Neuroscience Lab in the US have found we use similar sounds for the words of common objects and ideas, suggesting that humans may speak the same language.

By analysing 40 to 100 basic vocabulary words in around 3,700 languages, approximately 62 per cent of the world’s current languages, the researchers came to the conclusion that for basic concepts, such as body parts or aspects of the natural world, there are common sounds. The research was published in the journal Proceedings of the National Academy of Sciences.

Body parts in particular stood out. The word ‘nose’ was likely to include the sounds ‘neh’ or the ‘oo’ sound, as in ‘ooze’. The words ‘knee’, ‘bone’ and ‘breasts’ were also similar across the language spectrum. The word for tongue is likely to have an ‘l’, as in ‘langue’ in French.

The words 'red' and 'round' were more likely to include the ‘r’ sound. 'Leaf' was found to include the sounds ‘l’, ‘b’ or ‘p’. The words 'bite', 'dog', 'star' and 'water' also stood out as words with strong similar sounds.

Certain words were also found to avoid specific sounds. Words for ‘I’ were found to be unlikely to include sounds involving ‘b’, ‘l’, ‘p’, ‘r’, ‘s’, ‘t’ or ‘u’.
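
To make the method concrete, here is a toy sketch of the kind of association test behind such findings: compare how often a sound appears in a concept’s word across languages against that sound’s overall base rate in the sample. The mini word-table is invented for illustration and is vastly smaller than the study’s roughly 3,700-language sample; the function names are mine.

```python
# Toy sketch of a sound-symbolism association test: does a sound
# appear in a concept's word across languages more often than its
# overall base rate? The mini-dataset is invented for illustration;
# the actual study analysed 40-100 concepts in ~3,700 languages.
WORDS = {
    "nose":  {"english": "nose", "german": "nase", "french": "nez"},
    "water": {"english": "water", "german": "wasser", "serbian": "voda"},
}

def sound_rate(concept: str, sound: str) -> float:
    """Fraction of sampled languages whose word for `concept` contains `sound`."""
    forms = WORDS[concept].values()
    return sum(sound in form for form in forms) / len(forms)

def base_rate(sound: str) -> float:
    """Fraction of all sampled words containing `sound`."""
    all_forms = [form for forms in WORDS.values() for form in forms.values()]
    return sum(sound in form for form in all_forms) / len(all_forms)

# 'n' appears in every sampled word for "nose" (rate 1.0), well above
# its base rate across all sampled words (0.5).
print(sound_rate("nose", "n"), base_rate("n"))
```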