Philosophy everywhere everywhen
 

Was Wittgenstein Right? A reminder of philosophy’s embarrassing failure, after over 2,000 years, to settle any of its central issues.

The man who insisted that Western philosophy was based in confusion and wishful thinking is not popular among philosophers. But he should not be dismissed.


The singular achievement of the controversial early 20th century philosopher Ludwig Wittgenstein was to have discerned the true nature of Western philosophy — what is special about its problems, where they come from, how they should and should not be addressed, and what can and cannot be accomplished by grappling with them. The uniquely insightful answers provided to these meta-questions are what give his treatments of specific issues within the subject — concerning language, experience, knowledge, mathematics, art and religion among them — a power of illumination that cannot be found in the work of others.

 

Admittedly, few would agree with this rosy assessment — certainly not many professional philosophers. Apart from a small and ignored clique of hard-core supporters, the usual view these days is that his writing is self-indulgently obscure and that behind the catchy slogans there is little of intellectual value. But this dismissal disguises what is pretty clearly the real cause of Wittgenstein’s unpopularity within departments of philosophy: namely, his thoroughgoing rejection of the subject as traditionally and currently practiced; his insistence that it can’t give us the kind of knowledge generally regarded as its raison d’être.

 

Wittgenstein claims that there are no realms of phenomena whose study is the special business of a philosopher, and about which he or she should devise profound a priori theories and sophisticated supporting arguments. There are no startling discoveries to be made of facts, not open to the methods of science, yet accessible “from the armchair” through some blend of intuition, pure reason and conceptual analysis. Indeed the whole idea of a subject that could yield such results is based on confusion and wishful thinking.

Philosophy everywhere everywhen
The First Law of Philosophy: For every philosopher, there exists an equal and opposite philosopher. The Second Law of Philosophy: They're both wrong.
Curated by Wildcat2030

Do people have a moral duty to have children if they can? — Richard Chappell — Aeon Essays

Many people want to have children. But they might wonder: is it ethical to bring a child into this broken world, where she might suffer – and partake in – various harms and injustices? Others prefer not to have children. This choice also raises ethical qualms: is it ‘selfish’ to refrain from procreating? Are non-parents failing to contribute to the future of humanity – to the building of the next generation – in a way that we all should if we can?

It is tempting to dismiss such questions on the grounds that whether or not you have kids is a personal matter. It is surely nobody else’s damn business. It’s not up to the government or society to tell me. This question falls securely within the ‘private sphere’ that, in a properly liberal society, other people must respect and leave well enough alone.

True enough. But the mere fact that it is a private matter, something that others have no business deciding for us, does not mean that morality is necessarily silent on the issue. We can each, individually, ask ourselves: what should I do? Are there ethical considerations that we should take into account here – considerations that might help guide us as we attempt to navigate these intensely important, intensely personal questions? And if we do undertake such ethical enquiry, the answers we reach might surprise us.

Is it fair to your would-be child to bring her into a life that will inevitably contain significant amounts of pain, discomfort, suffering and heartache? In his essay ‘On the Suffering of the World’ (1850), Arthur Schopenhauer asked:

If children were brought into the world by an act of pure reason alone, would the human race continue to exist? Would not a man rather have so much sympathy with the coming generation as to spare it the burden of existence? Or at any rate not take it upon himself to impose that burden in cold blood?

Why do we say 'sorry' so much?

Just Not Sorry is a new app that aims to draw attention to the use of apologetic language and the excessive use of sorry. People, and especially women, it has been claimed, need help to be more forthright and assertive in their emails. This raises the question: why do we say sorry? And is it necessarily a sign of weakness?

The word sorry goes right back to the earliest stages of the English language, as spoken by the Anglo-Saxons. Tracing its history from Old English to the present day reveals an interesting development, in which there is a marked change from the expression of genuine heartfelt sorrow and remorse to regret for minor inconvenience. The key shift occurs in the 19th century and is accompanied by the change from “I am sorry” to plain “sorry”, thereby creating a distancing effect and taking us a further step away from the apology as a statement of personal distress towards a more formulaic use. In his history of English manners, Henry Hitchings links this to the 19th-century association of politeness with detachment and aloofness, and the emergence of the concept of the “stiff upper lip”.

Troubles with Three-ism: Body, Mind, and Soul - The Los Angeles...

WHEN I WAS a wee Catholic lad growing up in the New York City suburbs of the late 1950s and early 1960s, I learned that good people go to heaven after they die. This was consoling. But it made me wonder precisely which part of me would go to heaven: my body, my mind, or my soul. Thanks to dead hamsters and such, I understood that bodies die, decay, and disperse. There was talk in school and at church of the resurrection of the body on Judgment Day, but that event, I reckoned, might not happen for several million years, and surely I’d be well ensconced in heaven by then. My mother tentatively explained that the part of me that loved peanut butter and jelly sandwiches and chocolate ice cream sodas would most likely not go to heaven, or, if it did, would not need or want peanut butter and jelly sandwiches and chocolate ice cream sodas anymore — possibly, I speculated, because, in the heavenly state, I’d be able mentally to conjure those great pleasures without there being actual physical manifestations of me or them. I surmised that those perfectly good human desires would either be gone (because my body would be gone), or somehow be eternally satisfied.

So, which was it, my mind or my soul that would go to heaven? Or both? And how did they differ? I didn’t want to go to heaven without my personality and memories. I wanted to be in heaven with my brothers and sisters, parents and grandparents, if not bodily then at least mentally. But personality and memories were, in my little boy ontology, associated with mind, and there was talk that the part of me that would go to heaven was something more ethereal than my mind. It was my eternal soul. But my soul, unlike my mind, seemed a bit too vague and general to be “me.” I wanted to be in heaven with me as me myself. Such were the vicissitudes of boyhood. I was troubled by three-ism. I was not, and am not, alone.

Why it matters that you realize you’re in a computer simulation

What if our universe is something like a computer simulation, or a virtual reality, or a video game? The proposition that the universe is actually a computer simulation was furthered in a big way during the 1970s, when John Conway famously showed that if you take a binary system and subject that system to only a few rules (in the case of Conway’s experiment, four), then that system creates something rather peculiar.

What Conway’s rules produced were emergent complexities so sophisticated that they seemed to resemble the behaviors of life itself. He named his demonstration The Game of Life, and it helped lay the foundation for the Simulation Argument, its counterpart the Simulation Hypothesis, and Digital Mechanics. These fields have gone on to create a massive, multi-decade-long discourse in science, philosophy, and popular culture around the idea that it actually makes logical, mathematical sense that our universe is indeed a computer simulation. To crib a summary from Morpheus, “The Matrix is everywhere”. But amongst the murmurs on various forums and Reddit threads pertaining to the subject, it isn’t uncommon to find a word or two devoted to caution: We, the complex intelligent lifeforms who are supposedly “inside” this simulated universe, would do well to play dumb that we are at all conscious of our circumstance.

The colloquial warning says we must not betray the knowledge that we have become aware of being mere bits in the bit kingdom. To have a tipping point population of players who realize that they are actually in something like a video game would have dire and catastrophic results. Deletion, reformatting, or some kind of biblical flushing of our entire universe (or maybe just our species), would unfold. Leave the Matrix alone! In fact, please pretend it isn’t even there.
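The excerpt alludes to Conway's four rules without spelling them out. As a rough illustration, not part of the original article, here is a minimal Python sketch of Conway's Game of Life: a grid of binary cells updated by four simple local rules about live neighbours, which is all it takes to produce the emergent, lifelike behaviour described above. The glider seed is just one well-known starting pattern.

```python
def step(grid):
    """Apply Conway's four rules once to a 2D grid of 0s (dead) and 1s (alive)."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours; edges wrap around (a torus).
            neighbours = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            if grid[r][c] == 1:
                # Rules 1-3: a live cell survives only with 2 or 3 live neighbours.
                new[r][c] = 1 if neighbours in (2, 3) else 0
            else:
                # Rule 4: a dead cell with exactly 3 live neighbours comes alive.
                new[r][c] = 1 if neighbours == 3 else 0
    return new

# Seed a 10x10 grid with a "glider", a small pattern that travels diagonally.
grid = [[0] * 10 for _ in range(10)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1

for _ in range(20):
    grid = step(grid)

print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
```

Running it for a few dozen generations shows the glider drifting across the grid: a tiny example of complex behaviour arising from trivially simple rules.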

The lost art of getting lost - BBC News

Technology means maps and directions are constantly at hand, and getting lost is more unlikely than ever before. While for many this is a thing of joy, Stephen Smith asks if we may be missing out.

When was the last time you were well and truly lost? Chances are it's been a while.

Extraordinary gadgets like smartphones and satnavs let us pinpoint our location unerringly. Like the people in Downton Abbey, we all know our place.

However, the technology which delivers the world into the palms of our hands may be ushering in a kind of social immobility undreamt of even by Julian Fellowes's hidebound little Englanders.

Discovery used to mean going out and coming across stuff - now it seems to mean turning inwards and gazing at screens. We've become reliant on machines to help us get around, so much so that it's changing the way we behave, particularly among younger people who have no experience of a time before GPS.

We're raising an entire generation of men who will never know what it is to refuse to ask for directions.

Boredom is not a problem to be solved. It's the last privilege of a free mind | Gayatri Devi

Confessing to boredom is confessing to a character-flaw. Popular culture is littered with advice on how to shake it off: find like-minded people, take up a hobby, find a cause and work for it, take up an instrument, read a book, clean your house. And certainly don’t let your kids be bored: enroll them in swimming, soccer, dance, church groups – anything to keep them from assuaging their boredom by gravitating toward sex and drugs. To do otherwise is to admit that we’re not engaging with the world around us. Or that your cellphone has died.

But boredom is not tragic. Properly understood, boredom helps us understand time, and ourselves. Unlike fun or work, boredom is not about anything; it is our encounter with pure time as form and content. With ads and screens and handheld devices ubiquitous, we don’t get to have that experience that much anymore. We should teach young people to feel comfortable with time.

I live and teach in small-town Pennsylvania, and some of my students from bigger cities tell me that they always go home on Fridays because they are bored here.

You know the best antidote to boredom, I asked them? They looked at me expectantly, smartphones dangling from their hands. Think, I told them. Thinking is the best antidote to boredom. I am not kidding, kids. Thinking is the best antidote to boredom. Tell yourself, I am bored. Think about that. Isn’t that interesting? They looked at me incredulously. Thinking is not how they were brought up to handle boredom.

When you’re bored, time moves slowly. The German word for “boredom” expresses this: Langeweile, a compound made of “lange,” which means “long,” and “Weile,” meaning “a while”. And slow-moving time can feel torturous for people who can’t feel peaceful alone with their minds. Learning to do so is why learning to be bored is so crucial. It is a great privilege if you can do this without going to the psychiatrist.

Walter Benjamin’s legacy, 75 years on

Like many a refugee in southern and central Europe today, Walter Benjamin was in flight from war and persecution 75 years ago, but was blocked at an intermediate border en route to the country chosen as his haven. He was part of a Jewish group which, hoping to escape occupied France, had hiked through a Pyrenean pass in autumn 1940 with a view to entering Franco’s Spain, crossing it to Portugal and then sailing to the US. However, in the words of Hannah Arendt, they arrived in the frontier village of Portbou “only to learn that Spain had closed the border that same day” and officials were not honouring American visas such as Benjamin’s. Faced with the prospect of returning to France and being handed over to the Nazis, he “took his own life” overnight on 26 September, whereupon the officials “allowed his companions to proceed to Portugal”.

For Arendt, who successfully reached New York via his intended route a few months later, this was a tragedy of misunderstanding, a poignant but fitting end for a brilliant but misfortune-prone older relative (her cousin by marriage) whom she writes about with a kind of affectionate exasperation.

Yet Edward Stourton, in Cruel Crossing: Escaping Hitler Across the Pyrenees, notes “there are all sorts of unanswered questions surrounding Benjamin’s death. His travelling companions remembered him carrying a heavy briefcase containing a manuscript he described as ‘more important than I am’. No such manuscript was found after his death … A Spanish doctor’s report gave the cause of death as a cerebral haemorrhage, not a drugs overdose. There has been persistent speculation that he was actually murdered, perhaps by a Soviet agent who had infiltrated his escaping party.”

This free online encyclopedia has achieved what Wikipedia can only dream of

The Stanford Encyclopedia of Philosophy may be the most interesting website on the internet. Not because of the content—which includes fascinating entries on everything from ambiguity to zombies—but because of the site itself.

Its creators have solved one of the internet’s fundamental problems: How to provide authoritative, rigorously accurate knowledge, at no cost to readers. It’s something the encyclopedia, or SEP, has managed to do for two decades.
The internet is an information landfill. Somewhere in it—buried under piles of opinion, speculation, and misinformation—is virtually all of human knowledge. But sorting through the trash is difficult work. Even when you have something you think is valuable, it often turns out to be a cheap knock-off.

The story of how the SEP is run, and how it came to be, shows that it is possible to create a less trashy internet—or at least a less trashy corner of it. A place where actual knowledge is sorted into a neat, separate pile instead of being thrown into the landfill. Where the world can go to learn everything that we know to be true. Something that would make humans a lot smarter than the internet we have today.

go read..


Watch: What Is Consciousness? We Now Have the Tools to Find Out - Singularity HUB

The question of consciousness is as old as philosophy. Most animals appear to get along just fine without a sense of ‘me-ness’. But human beings are different. (At least, as far as we know we are.) We’ve evolved a sense of self-awareness.

And while the exact nature of human consciousness is exceedingly difficult to pin down—that doesn’t stop us from trying. It's a puzzle that's preoccupied the world’s greatest philosophers for millennia and, in recent centuries, scientists too.

In the information age, we've begun to wonder if consciousness is a uniquely biological phenomenon or if it might arise elsewhere. Is the brain just a mushy computer running wetware—something we can replicate in hardware and software? Or is comparing the brain to a computer a misleading analogy and a vast oversimplification?

A fascinating new video from the Economist, featuring some of the brightest minds working the problem, brings us up to date on the debate and the latest thinking.

A new process for studying proteins associated with diseases | KurzweilAI

Researchers from Northwestern University and Yale University have developed a new technology to help scientists understand how proteins work and fix them when they are broken. Such knowledge could pave the way for new drugs for a myriad of diseases, including cancer.

The human body turns its proteins on and off (to alter their function and activity in cells) using “phosphorylation” — the reversible attachment of phosphate groups to proteins. These “decorations” on proteins provide an enormous variety of functions and are essential to all forms of life. Little is known, however, about how this important dynamic process works in humans.

Phosphorylation: a hallmark of disease

Using a special strain of E. coli bacteria, the researchers built a cell-free protein synthesis platform technology that can manufacture large quantities of these human phosphoproteins for scientific study. The goal is to enable scientists to learn more about the function and structure of phosphoproteins and identify which ones are involved in disease.

The study was published Sept. 9 in an open-access paper in the journal Nature Communications.

Trouble in the phosphorylation process can be a hallmark of diseases such as cancer, inflammation and Alzheimer’s disease. The human proteome (the entire set of expressed proteins) is estimated to be phosphorylated at more than 100,000 unique sites, making the study of phosphorylated proteins and their role in disease a daunting task.

“Our technology begins to make this a tractable problem,” said Michael C. Jewett, an associate professor of chemical and biological engineering who led the Northwestern team. “We now can make these special proteins at unprecedented yields, with a freedom of design that is not possible in living organisms. The consequence of this innovative strategy is enormous.”

Ignore Your Feelings

Put down the talking stick. Stop fruitlessly seeking "closure" with your peevish co-worker. And please, don't bother telling your spouse how annoying you find their tongue-clicking habit—sometimes honesty is less like a breath of fresh air and more like a fart. That’s the argument of Michael Bennett and Sarah Bennett, the father-daughter duo behind the new self-help book F*ck Feelings.

The elder Bennett is a psychiatrist and American Psychiatric Association distinguished fellow. His daughter is a comedy writer. Together, they provide a tough-love, irreverent take on “life's impossible problems.” The crux of their approach is that life is hard and negative emotions are part of it. The key is to see your “bullshit wishes” for just what they are (bullshit), and instead to pursue real, achievable goals.

Stop trying to forgive your bad parents, they advise. Jerks are capable of having as many kids as anyone else—at least until men’s rights conventions come equipped with free vasectomy booths. If you happen to be the child of a jerk, that's just another obstacle to overcome.

In fact, stop trying to free yourself of all anger and hate. In all likelihood you're doing a really awesome job, the Bennetts argue, despite all the shitty things that happen to you.

Oh, and a word on shit: “Profanity is a source of comfort, clarity, and strength,” they write. “It helps to express anger without blame, to be tough in the face of pain.”

I recently spoke with the Bennetts by phone about what the f*cking deal is with their book. A lightly edited transcript of our conversation follows.

Philosophically Interesting Books for Young Kids

A friend is interested in soliciting philosophically minded books for young children—ones who are reading, but are not at the chapter-book stage. Here are a few I’ve enjoyed with my kids…
The Big Orange Splot by Daniel Manus Pinkwater — for the young individualist.
A Hole Is To Dig by Ruth Krauss — for the young teleologist.
Pierre: A Cautionary Tale by Maurice Sendak — for the young nihilist.
It Could Always Be Worse by Margot Zemach — for the young.
How To Behave and Why by Munro Leaf — for th

Study: There Are Instructions for Teaching Critical Thinking | Big Think

Whether or not you can teach something as subjective as critical thinking has been up for debate, but a fascinating new study shows that it’s actually quite possible. Experiments performed by Stanford's Department of Physics and Graduate School of Education demonstrate that students can be instructed to think more critically.

It’s difficult to overstate the importance of critical-thinking skills in modern society. The ability to decipher information and interpret it, offering creative solutions, is in direct relation to our intellect.
The study took two groups of students in an introductory physics laboratory course, with one group (known as the experimental group) given the instruction to use quantitative comparisons between datasets and the other group given no instruction (the control group). Comparing data in a scientific manner, that is, being able to measure one’s observations in a statistical or mathematical way, led to interesting results for the experimental group. Even after these instructions were removed, they were 12 times more likely to offer creative solutions to improve the experimental methods being used in the class, four times more likely to explain the limitations of the methods, and better at explaining their reasoning than the control group. The results remained consistent even in the next year, with students in a different class. So what does this imply about critical thinking, and how can we utilize these findings to improve ourselves and our society?
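The article does not say which statistic the instructed students used; as a hedged sketch of what "quantitative comparisons between datasets" can look like in practice, here is a small Python example that compares two sets of repeated measurements with Welch's t statistic rather than judging the difference by eye. The measurement values are invented for illustration.

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples of measurements."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(var_a / len(a) + var_b / len(b))

# Two invented sets of repeated measurements of the same quantity under
# two conditions (e.g. a pendulum's period at two different amplitudes).
condition_a = [1.92, 1.95, 1.93, 1.96, 1.94]
condition_b = [2.01, 2.04, 1.99, 2.03, 2.02]

t = welch_t(condition_a, condition_b)
# A |t| well above ~2 suggests the difference is real rather than
# measurement noise; a |t| near 0 suggests the two conditions agree.
print(f"t = {t:.2f}")
```

The habit the instruction aimed at is exactly this: put a number on how different two datasets are before deciding what the difference means.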

We live in an age with unprecedented access to information. Whether you are contributing to an entry on Wikipedia or reading a meme that has no sources cited (do they ever?), your ability to comprehend what you are reading and weigh it is a constant and consistent need. That is why it is so imperative that we have sharp critical-thinking skills. Also, if you don’t use them, you will have nothing to argue with your family about at Thanksgiving. More importantly, it keeps your brain from nomming on junk food and puts it on more of a kale-based diet. Look at any trending topic, and test yourself. Is this true/accurate? How do I know either way? Is there a way I can use data (provable, factual information) to figure this out?

Certainly, we can train ourselves to become better critical thinkers, but it’s also important that we teach these skills to kids. Studies have shown how important this ability is to our success, and yet many feel that we’re doing a terrible job of teaching it. This study, however, may lead educators and parents to realize that these skills are teachable. The implications of a better-thinking society are not quantitative, but I do believe they would be extraordinary.

Horizontal History - Wait But Why

Most of us have a pretty terrible understanding of history. Our knowledge is spotty, with large gaps all over the place, and the parts of history we do end up knowing a lot about usually depend on the particular teachers, parents, books, articles, and movies we happen to come across in our lives. Without a foundational, tree-trunk understanding of all parts of history, we often forget the things we do learn, leaving even our favorite parts of history a bit hazy in our heads. Raise your hand if you’d like to go on stage and debate a history buff on the nuances of a historical time period of your choosing. That’s what I thought.

The reason history is so hard is that it’s so soft. To truly, fully understand a time period, an event, a movement, or an important historical figure, you’d have to be there, and many times over. You’d have to be in the homes of the public living at the time to hear what they’re saying; you’d have to be a fly on the wall in dozens of secret, closed-door meetings and conversations; you’d need to be inside the minds of the key players to know their innermost thoughts and motivations. Even then, you’d be lacking context. To really have the complete truth, you’d need background—the cultural nuances and national psyches of the time, the way each of the key players was raised during childhood and the subtle social dynamics between those players, the impact of what was going on in other parts of the world, and an equally-thorough understanding of the many past centuries that all of these things grew out of.

That’s why not only can’t even the most perfect history buff fully understand history, but the key people involved at the time can’t ever know the full story. History is a giant collective tangle of thousands of interwoven stories involving millions of characters, countless chapters, and many, many narrators.

Can Integrated Information Theory Explain Consciousness?

How does matter make mind? More specifically, how does a physical object generate subjective experiences like those you are immersed in as you read this sentence? How does stuff become conscious? This is called the mind-body problem, or, by philosopher David Chalmers, the “hard problem.”

I expressed doubt that the hard problem can be solved, a position called mysterianism, in The End of Science. I argue in a new edition that my pessimism has been justified by the recent popularity of panpsychism. This ancient doctrine holds that consciousness is a property not just of brains but of all matter, like my table and coffee mug.

Panpsychism strikes me as self-evidently foolish, but non-foolish people—notably Chalmers and neuroscientist Christof Koch—are taking it seriously. How can that be? What’s compelling their interest? Have I dismissed panpsychism too hastily?

These questions lured me to a two-day workshop on integrated information theory at New York University last month. Conceived by neuroscientist Giulio Tononi (who trained under the late, great Gerald Edelman), IIT is an extremely ambitious theory of consciousness. It applies to all forms of matter, not just brains, and it implies that panpsychism might be true. Koch and others are taking panpsychism seriously because they take IIT seriously.

At the workshop, Chalmers, Tononi, Koch and ten other speakers presented their views of IIT, which were then batted around by 30 or so other scientists and philosophers. I’m still mulling over the claims and counter-claims, some of which were dauntingly abstract and mathematical. In this post, I’ll try to assess IIT, based on the workshop and my readings. If I get some things wrong, which is highly likely, I trust workshoppers will let me know.

‘Battling the Gods: Atheism in the Ancient World,’ by Tim Whitmarsh

The philosopher Sydney Morgenbesser, beloved by generations of Columbia University students (including me), was known for lines of wit that yielded nuggets of insight. He kept up his instructive shtick until the end, remarking to a colleague shortly before he died: “Why is God making me suffer so much? Just because I don’t believe in him?” For Morgenbesser, nothing worth pondering, including disbelief, could be entirely de-paradoxed.

The major thesis of Tim Whitmarsh’s excellent “Battling the Gods” is that atheism — in all its nuanced varieties, even Morgenbesserian — isn’t a product of the modern age but rather reaches back to early Western intellectual tradition in the ancient Greek world.

The period that Whitmarsh covers is roughly 1,000 years, during which the Greek-speaking population emerged from illiteracy and anomie, became organized into independent city-states that spawned a high-achieving culture, were absorbed into the Macedonian Empire and then into the Roman Empire, and finally became Christianized. These momentous political shifts are efficiently traced, with astute commentary on their reflection in religious attitudes.

But the best part of “Battling the Gods” is the Greek chorus of atheists themselves, who speak distinctively throughout each of the political transformations — until, that is, the last of them, when they go silent. If you’ve been paying attention to contemporary atheists you might be startled by the familiarity of the ancient positions.

So here is Democritus in the fifth century B.C. — he who coined the term “atom,” from the Greek for “indivisible,” speculating that reality consisted of nothing but fundamental particles swirling randomly around in the void — propounding an anthropological theory of the origins of religious beliefs. Talk of “the gods,” he argued, comes naturally to primitive people who, unable yet to grasp the laws of nature, resort to fantastical storytelling. The exact titles of his works remain in doubt, but his naturalist explanation of the origins of conventional religion might have made use of Daniel C. Dennett’s title “Breaking the Spell: Religion as a Natural Phenomenon.”
Wildcat2030's insight:

book review go read


How rivalry propels creative genius – Jacob Burak – Aeon

On 25 May 1832, John Constable was busy adding the final touches to his masterpiece, The Opening of Waterloo Bridge. One of England’s greatest 19th-century landscape artists, he had been working on the painting for more than 10 years and was finally set to reveal it to the world the next day, at the opening of the Royal Academy of Arts’ 64th annual exhibition. Next to his piece hung Helvoetsluys by J M W Turner, an artistic genius in his own right. Watching Constable’s last-minute efforts, Turner decided to add an extra brushstroke of his own: a red buoy floating on the water.

That single daub of red paint against a background of grey sky and sea was so arresting that visitors couldn’t take their eyes off it, certainly not to look at Constable’s painting. It was yet another landmark in the bitter rivalry between the two artists. A year earlier, Constable had used his position in an exhibition committee to have a Turner painting taken down and hung in a side room, replacing it with a painting of his own.
Turner and Constable are not alone in the pantheon of epic rivalries between creative giants. Isaac Newton and Gottfried Leibniz, two of the most brilliant mathematicians and thinkers of the 17th century, laid claim to the development of calculus, the mathematical study of change. Thomas Edison and Nikola Tesla both invented electrical systems in the 1880s. Steve Jobs and Bill Gates went head-to-head as pioneers of the computer age. If you Google almost any famous figure along with ‘rivalry’, you’ll find some interesting results.

Leiter Reports: A Philosophy Blog: Should there be a Nobel Prize (or equivalent prize) in philosophy?

Russell Blackford (Newcastle) thinks so. If there were, I predict it would end up like the Nobel Prize for Literature: bizarre inclusions and exclusions that tell us more about fashions and politics than about literature. Part of the difficulty will be in deciding what counts as philosophy. Look at Blackford's gloss:

Philosophy is the reason-based, intellectually rigorous, investigation of deep questions that have always caused puzzlement and anxiety: Is there a god or an afterlife? Do we possess free will? What is a good life for a human being? What is the nature of a just society? Philosophy challenges obfuscation and orthodoxies, and extends into examining the foundations of inquiry itself.

Are these "deep questions that have always caused puzzlement and anxiety"? Doubtful. And it's doubtful that all "good" philosophy "challenges obfuscation and orthodoxies": lots of important philosophy just rationalizes orthodoxy (and sometimes contributes to obfuscation).

Would the later Wittgenstein be eligible for a Nobel Prize in philosophy by Blackford's criteria? Not clear at all.

Meet The Man Who Invents Languages For A Living

If anyone has the credentials to write a book called The Art Of Language Invention, it's David J. Peterson.

He has two degrees in linguistics. He's comfortable speaking in eight languages (English, Spanish, French, German, Russian, Esperanto, Arabic and American Sign Language) — plus a long list of others he's studied but just hasn't tried speaking yet. He's also familiar with fictional languages — both famous ones like Klingon and deep cuts like Pakuni (the caveman language from Land Of The Lost).

And of course, he's crafted languages of his own — including full alphabets, vocabularies and grammars. Game of Thrones viewers, for instance, might recognize two of these languages: Dothraki, a guttural language of warrior horsemen, and High Valyrian, a language spoken in much of the fantasy world's eastern regions.

And he didn't rest there. Peterson actually created a third language for the show — just for a single giant.

"I didn't know beforehand that he was only going to have one line," he laughs. "I thought he was going to have a bunch of stuff, but whatever. I created a full language for the giant."

Peterson also invented Shiväisith for the Marvel blockbuster Thor: The Dark World, and four languages (and counting?) for the SyFy show Defiance.

In a new book, The Art Of Language Invention, Peterson details the languages he's invented, pairing them with lots of (often technical) advice about how readers can create some of their own.

Home | History of Philosophy without any gaps

Peter Adamson, Professor of Philosophy at the LMU in Munich and at King's College London, takes listeners through the history of philosophy, "without any gaps."
Berta Civera's curator insight, September 28, 2015 3:03 AM

Peter Adamson's History of Philosophy, in English.


How did French thought end up in crisis? – Sudhir Hazareesingh – Aeon

There are many things we have come to regard as quintessentially French: Coco Chanel’s little black dress, the love of fine wines and gastronomy, the paintings of Auguste Renoir, the smell of burnt rubber in the Paris Métro. Equally distinctive is the French mode and style of thinking, which the Irish political philosopher Edmund Burke described in 1790 as ‘the conquering empire of light and reason’. He meant this as a criticism of the French Revolution, but this expression would undoubtedly have been worn as a badge of honour by most French thinkers from the Enlightenment onwards.

Indeed, the notion that rationality is the defining quality of humankind was first celebrated by the 17th-century thinker René Descartes, the father of modern French philosophy. His skeptical method of reasoning led him to conclude that the only certainty was the existence of his own mind: hence his ‘cogito ergo sum’ (‘I think, therefore I am’). This French rationalism was also expressed in a fondness for abstract notions and a preference for deductive reasoning, which starts with a general claim or thesis and eventually works its way towards a specific conclusion – thus the consistent French penchant for grand theories. As the essayist Emile Montégut put it in 1858: ‘There is no people among whom abstract ideas have played such a great role, and whose history is rife with such formidable philosophical tendencies.’

Cult of the cosmic: how space travel replaced religion in USSR

For most of the 20th century, the thirst for space exploration replaced religion in the Soviet Union, with the cult of science disseminated through propaganda, not sermons.

Yuri Gagarin, the first human in outer space, was the God-like figurehead, a man of the people and a martyr who died too young in mysterious circumstances. The titanium Gagarin monument in Moscow, created by sculptor Pavel Bondarenko, features a 42m-tall column topped with a figure of the cosmonaut rocketing to the sky in a pose similar to Rio de Janeiro’s Christ the Redeemer.

From the 1950s until the 70s, space themes were woven into everyday life, into endless festivals and celebrations of interstellar exploration. Children’s playgrounds were designed like rockets, the walls of schools and kindergartens decorated with paper spacecraft and stars. Houses were built to look like spacecraft, lunar stations and flying saucers – to this day, experts refer to the 1960s-80s as the “cosmic period” in Soviet architecture.

Statues and images of revered icons were everywhere, including Valentina Tereshkova, the first female cosmonaut in space, Alexei Leonov, the first cosmonaut to do a spacewalk, and rocket engineer Sergei Korolev.

People Are More Likely to Cheat at the End

Life, for better or worse, is full of endings. We finish school, get a new job, sell a home, break off a relationship. Knowing that a phase is soon coming to an end can elicit the best in us, as we try to make amends for errors past and avoid last-minute regrets. We might try to visit that local museum, or make time for happy hour drinks with a longtime coworker, or be more generous with our praise to a partner.

But while the sense of an ending can draw out people’s finest selves, it can also, new psychological research suggests, bring out their darker side. This study concludes that, as people get closer to finishing an activity, they become more and more likely to deliberately deceive others for their own benefit. And they do this, the research shows, because they anticipate regretting a missed opportunity to cheat the system.

Heroes, monsters and people: When it comes to moral choices, outstanding physicists are very ordinary


Last week, on the plane back from Chicago, I finished Philip Ball’s book about physics in Germany in the nineteen-thirties and -forties. I’m still thinking about it, and I’m trying to work out why it has left such a strong impression. I think it is because the compromises, recriminations and judgements formed have echoes, weak but clear, in so many other arguments going on today.

It is difficult to be nuanced about Nazis. There are obvious reasons for this, but it is nevertheless sometimes important to try. That genocidal ideology came from somewhere, and looking back on the period through a lens which colours everyone as hero or monster is not necessarily helpful for gaining understanding, and therefore not necessarily a good approach to the prevention of such abominations in future.

Even that previous paragraph is fraught with difficulty, of course. When the Murdoch media ran a video of the six-year-old future Queen giving a Nazi salute, I thought it defensible to show the film - not as an attack on the Royal Family, but as a reminder that such things could be deemed acceptable at that time. The Nazis didn’t come pre-equipped with the political and moral pariah status they deserved. When I said as much on Facebook, at least one German friend of mine thought I came very close to the kind of apologia made too often in postwar Germany, that “ordinary people” just didn’t know how bad the Nazis were. Well, if they had read “Mein Kampf” they would have known. As George Orwell put it in his 1940 review of Hitler’s 1926 manifesto: “it is difficult to believe that any real change has taken place in his aims and opinions. When one compares his utterances of a year or so ago with those made fifteen years earlier, a thing which strikes one is the rigidity of his mind, the way in which his world-view doesn’t develop.”


A Life in Games | Quanta Magazine

Gnawing on his left index finger with his chipped old British teeth, temporal veins bulging and brow pensively squinched beneath the day-before-yesterday’s hair, the mathematician John Horton Conway unapologetically whiles away his hours tinkering and thinkering — which is to say he’s ruminating, although he will insist he’s doing nothing, being lazy, playing games.

Based at Princeton University, though he found fame at Cambridge (as a student and professor from 1957 to 1987), Conway, 77, claims never to have worked a day in his life. Instead, he purports to have frittered away reams and reams of time playing. Yet he is Princeton’s John von Neumann Professor in Applied and Computational Mathematics (now emeritus). He’s a fellow of the Royal Society. And he is roundly praised as a genius. “The word ‘genius’ gets misused an awful lot,” said Persi Diaconis, a mathematician at Stanford University. “John Conway is a genius. And the thing about John is he’ll think about anything.… He has a real sense of whimsy. You can’t put him in a mathematical box.”

The hoity-toity Princeton bubble seems like an incongruously grand home base for someone so gamesome. The campus buildings are Gothic and festooned with ivy. It’s a milieu where the well-groomed preppy aesthetic never seems passé. By contrast, Conway is rumpled, with an otherworldly mien, somewhere between The Hobbit’s Bilbo Baggins and Gandalf. Conway can usually be found loitering in the mathematics department’s third-floor common room. The department is housed in the 13-story Fine Hall, the tallest tower in Princeton, with Sprint and AT&T cell towers on the rooftop. Inside, the professor-to-undergrad ratio is nearly 1-to-1. With a querying student often at his side, Conway settles either on a cluster of couches in the main room or a window alcove just outside the fray in the hallway, furnished with two armchairs facing a blackboard — a very edifying nook. From there Conway, borrowing some Shakespeare, addresses a familiar visitor with his Liverpudlian lilt:

Welcome! It’s a poor place but mine own!