The Case for Using Drugs to Enhance Our Relationships (and Our Break-Ups)

A philosopher argues that taking love-altering substances might not just be a good idea, but a moral obligation


George Bernard Shaw once satirized marriage as "two people under the influence of the most violent, most insane, most delusive, and most transient of passions, who are required to swear that they will remain in that excited, abnormal, and exhausting condition continuously until death do them part." 

Yikes. And yet, nearly all human cultures value some version of marriage, not only as a nurturing emotional foundation for children but also because marriage can give life an extra dimension of meaning. But marriage is hard, for biochemical reasons that may be beyond our control. What if we could take drugs to get better at love?

Perhaps we could design "love drugs," pharmaceutical cocktails that could boost affection between partners, whisking them back to the exquisite set of pleasures that colored their first years together. The ability to do this kind of fine-tuned emotional engineering is beyond the power of current science, but there is a growing field of research devoted to it. Some have even suggested developing "anti-love drugs" that could dissolve abusive relationships, or reduce someone's attachment to a charismatic cult leader. Others just want a pill to ease the pain of a wrenching breakup.

Evolutionary biologists tell us that we owe the singular bundle of feelings we call "love" to natural selection. As human brains grew larger and larger, the story goes, children needed more and more time to develop into adults who could fend for themselves. A child with two parents around was privy to extra resources and protection, and thus stood a better chance of reaching maturity. The longer parents' chemical reward systems kept them in love, the more children they could shepherd to reproductive age. That's why the neural structures that form love bonds between couples were so strongly selected for. It's also why our relationships seem to come equipped with a set of invisible biochemical handrails: they're meant to support us through the inevitable trials that attend the creation of viable offspring.
Philosophy everywhere everywhen
The First Law of Philosophy: For every philosopher, there exists an equal and opposite philosopher. The Second Law of Philosophy: They're both wrong.
Curated by Wildcat2030

Troubles with Three-ism: Body, Mind, and Soul - The Los Angeles...

WHEN I WAS a wee Catholic lad growing up in the New York City suburbs of the late 1950s and early 1960s, I learned that good people go to heaven after they die. This was consoling. But it made me wonder precisely which part of me would go to heaven: my body, my mind, or my soul. Thanks to dead hamsters and such, I understood that bodies die, decay, and disperse. There was talk in school and at church of the resurrection of the body on Judgment Day, but that event, I reckoned, might not happen for several million years, and surely I’d be well ensconced in heaven by then. My mother tentatively explained that the part of me that loved peanut butter and jelly sandwiches and chocolate ice cream sodas would most likely not go to heaven, or, if it did, would not need or want peanut butter and jelly sandwiches and chocolate ice cream sodas anymore — possibly, I speculated, because, in the heavenly state, I’d be able mentally to conjure those great pleasures without there being actual physical manifestations of me or them. I surmised that those perfectly good human desires would either be gone (because my body would be gone), or somehow be eternally satisfied.

So, which was it, my mind or my soul that would go to heaven? Or both? And how did they differ? I didn’t want to go to heaven without my personality and memories. I wanted to be in heaven with my brothers and sisters, parents and grandparents, if not bodily then at least mentally. But personality and memories were, in my little boy ontology, associated with mind, and there was talk that the part of me that would go to heaven was something more ethereal than my mind. It was my eternal soul. But my soul, unlike my mind, seemed a bit too vague and general to be “me.” I wanted to be in heaven with me as me myself. Such were the vicissitudes of boyhood. I was troubled by three-ism. I was not, and am not, alone.

Why it matters that you realize you’re in a computer simulation

What if our universe is something like a computer simulation, or a virtual reality, or a video game? The proposition that the universe is actually a computer simulation was furthered in a big way during the 1970s, when John Conway famously showed that if you take a binary system and subject it to only a few rules (in the case of Conway’s experiment, four), then that system creates something rather peculiar.

What Conway’s rules produced were emergent complexities so sophisticated that they seemed to resemble the behaviors of life itself. He named his demonstration The Game of Life, and it helped lay the foundation for the Simulation Argument, its counterpart the Simulation Hypothesis, and Digital Mechanics. These fields have gone on to create a massive, decades-long discourse in science, philosophy, and popular culture around the idea that it actually makes logical, mathematical sense that our universe is indeed a computer simulation. To crib a summary from Morpheus, “The Matrix is everywhere”. But amongst the murmurs on various forums and reddit threads pertaining to the subject, it isn’t uncommon to find a word or two devoted to caution: We, the complex intelligent lifeforms who are supposedly “inside” this simulated universe, would do well to play dumb about being conscious of our circumstance.

The colloquial warning says we must not betray the knowledge that we have become aware of being mere bits in the bit kingdom. To have a tipping point population of players who realize that they are actually in something like a video game would have dire and catastrophic results. Deletion, reformatting, or some kind of biblical flushing of our entire universe (or maybe just our species), would unfold. Leave the Matrix alone! In fact, please pretend it isn’t even there.
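
Conway's "few rules" are simple enough to state in code. Below is a minimal Python sketch of the Game of Life for readers who want to watch the emergent behaviour for themselves; the grid size, the wrap-around edges and the glider seed are illustrative choices, not details from the article.

```python
def step(grid):
    """Apply Conway's four rules to a grid of 0s and 1s (edges wrap around)."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours, wrapping at the edges.
            neighbours = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            if grid[r][c] == 1:
                # A live cell survives only with two or three live neighbours.
                nxt[r][c] = 1 if neighbours in (2, 3) else 0
            else:
                # A dead cell with exactly three live neighbours becomes alive.
                nxt[r][c] = 1 if neighbours == 3 else 0
    return nxt

# Illustrative setup: an 8x8 grid seeded with a "glider" pattern.
grid = [[0] * 8 for _ in range(8)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1

for _ in range(4):
    grid = step(grid)
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid), end="\n\n")
```

Run it for a few steps and the glider appears to crawl across the grid, behaviour nowhere stated in the four rules; that is the sense in which the system "creates something rather peculiar".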

The lost art of getting lost - BBC News

Technology means maps and directions are constantly at hand, and getting lost is less likely than ever before. While for many this is a thing of joy, Stephen Smith asks if we may be missing out.

When was the last time you were well and truly lost? Chances are it's been a while.

Extraordinary gadgets like smartphones and satnavs let us pinpoint our location unerringly. Like the people in Downton Abbey, we all know our place.

However, the technology which delivers the world into the palms of our hands may be ushering in a kind of social immobility undreamt of even by Julian Fellowes's hidebound little Englanders.

Discovery used to mean going out and coming across stuff - now it seems to mean turning inwards and gazing at screens. We've become reliant on machines to help us get around, so much so that it's changing the way we behave, particularly among younger people who have no experience of a time before GPS.

We're raising an entire generation of men who will never know what it is to refuse to ask for directions.

Boredom is not a problem to be solved. It's the last privilege of a free mind | Gayatri Devi

Confessing to boredom is confessing to a character flaw. Popular culture is littered with advice on how to shake it off: find like-minded people, take up a hobby, find a cause and work for it, take up an instrument, read a book, clean your house. And certainly don’t let your kids be bored: enroll them in swimming, soccer, dance, church groups – anything to keep them from assuaging their boredom by gravitating toward sex and drugs. To do otherwise is to admit that we’re not engaging with the world around us. Or that our cellphones have died.

But boredom is not tragic. Properly understood, boredom helps us understand time, and ourselves. Unlike fun or work, boredom is not about anything; it is our encounter with pure time as form and content. With ads and screens and handheld devices ubiquitous, we don’t get to have that experience much anymore. We should teach young people to feel comfortable with time.

I live and teach in small-town Pennsylvania, and some of my students from bigger cities tell me that they always go home on Fridays because they are bored here.

You know the best antidote to boredom, I asked them? They looked at me expectantly, smartphones dangling from their hands. Think, I told them. Thinking is the best antidote to boredom. I am not kidding, kids. Thinking is the best antidote to boredom. Tell yourself, I am bored. Think about that. Isn’t that interesting? They looked at me incredulously. Thinking is not how they were brought up to handle boredom.

When you’re bored, time moves slowly. The German word for “boredom” expresses this: Langeweile, a compound made of “lange,” which means “long,” and “Weile,” meaning “a while”. And slow-moving time can feel torturous for people who can’t feel peaceful alone with their minds. Learning to feel peaceful alone with your mind is why learning to be bored is so crucial. It is a great privilege if you can do this without going to the psychiatrist.

Walter Benjamin’s legacy, 75 years on

Like many a refugee in southern and central Europe today, Walter Benjamin was in flight from war and persecution 75 years ago, but was blocked at an intermediate border en route to the country chosen as his haven. He was part of a Jewish group which, hoping to escape occupied France, had hiked through a Pyrenean pass in autumn 1940 with a view to entering Franco’s Spain, crossing it to Portugal and then sailing to the US. However, in the words of Hannah Arendt, they arrived in the frontier village of Portbou “only to learn that Spain had closed the border that same day” and officials were not honouring American visas such as Benjamin’s. Faced with the prospect of returning to France and being handed over to the Nazis, he “took his own life” overnight on 26 September, whereupon the officials “allowed his companions to proceed to Portugal”.

For Arendt, who successfully reached New York via his intended route a few months later, this was a tragedy of misunderstanding, a poignant but fitting end for a brilliant but misfortune-prone older relative (her cousin by marriage) whom she writes about with a kind of affectionate exasperation.

Yet Edward Stourton, in Cruel Crossing: Escaping Hitler Across the Pyrenees, notes “there are all sorts of unanswered questions surrounding Benjamin’s death. His travelling companions remembered him carrying a heavy briefcase containing a manuscript he described as ‘more important than I am’. No such manuscript was found after his death … A Spanish doctor’s report gave the cause of death as a cerebral haemorrhage, not a drugs overdose. There has been persistent speculation that he was actually murdered, perhaps by a Soviet agent who had infiltrated his escaping party.”

This free online encyclopedia has achieved what Wikipedia can only dream of

The Stanford Encyclopedia of Philosophy may be the most interesting website on the internet. Not because of the content—which includes fascinating entries on everything from ambiguity to zombies—but because of the site itself.

Its creators have solved one of the internet’s fundamental problems: How to provide authoritative, rigorously accurate knowledge, at no cost to readers. It’s something the encyclopedia, or SEP, has managed to do for two decades.
The internet is an information landfill. Somewhere in it—buried under piles of opinion, speculation, and misinformation—is virtually all of human knowledge.

But sorting through the trash is difficult work. Even when you have something you think is valuable, it often turns out to be a cheap knock-off.

The story of how the SEP is run, and how it came to be, shows that it is possible to create a less trashy internet—or at least a less trashy corner of it. A place where actual knowledge is sorted into a neat, separate pile instead of being thrown into the landfill. Where the world can go to learn everything that we know to be true. Something that would make humans a lot smarter than the internet we have today.

go read..


Watch: What Is Consciousness? We Now Have the Tools to Find Out - Singularity HUB

The question of consciousness is as old as philosophy. Most animals appear to get along just fine without a sense of ‘me-ness’. But human beings are different. (At least, as far as we know we are.) We’ve evolved a sense of self-awareness.

And while the exact nature of human consciousness is exceedingly difficult to pin down, that doesn’t stop us from trying. It's a puzzle that's preoccupied the world’s greatest philosophers for millennia and, in recent centuries, scientists too.

In the information age, we've begun to wonder if consciousness is a uniquely biological phenomenon or if it might arise elsewhere. Is the brain just a mushy computer running wetware—something we can replicate in hardware and software? Or is comparing the brain to a computer a misleading analogy and a vast oversimplification?

A fascinating new video from the Economist, featuring some of the brightest minds working the problem, brings us up to date on the debate and the latest thinking.

A new process for studying proteins associated with diseases | KurzweilAI

Researchers from Northwestern University and Yale University have developed a new technology to help scientists understand how proteins work and fix them when they are broken. Such knowledge could pave the way for new drugs for a myriad of diseases, including cancer.

The human body turns its proteins on and off (to alter their function and activity in cells) using “phosphorylation” — the reversible attachment of phosphate groups to proteins. These “decorations” on proteins provide an enormous variety of functions and are essential to all forms of life. Little is known, however, about how this important dynamic process works in humans.

Phosphorylation: a hallmark of disease

Using a special strain of E. coli bacteria, the researchers built a cell-free protein synthesis platform technology that can manufacture large quantities of these human phosphoproteins for scientific study. The goal is to enable scientists to learn more about the function and structure of phosphoproteins and identify which ones are involved in disease.

The study was published Sept. 9 as an open-access paper in the journal Nature Communications.

Trouble in the phosphorylation process can be a hallmark of disease, such as cancer, inflammation and Alzheimer’s disease. The human proteome (the entire set of expressed proteins) is estimated to be phosphorylated at more than 100,000 unique sites, making study of phosphorylated proteins and their role in disease a daunting task.

“Our technology begins to make this a tractable problem,” said Michael C. Jewett, an associate professor of chemical and biological engineering who led the Northwestern team. “We now can make these special proteins at unprecedented yields, with a freedom of design that is not possible in living organisms. The consequence of this innovative strategy is enormous.”

Ignore Your Feelings

Put down the talking stick. Stop fruitlessly seeking "closure" with your peevish co-worker. And please, don't bother telling your spouse how annoying you find their tongue-clicking habit—sometimes honesty is less like a breath of fresh air and more like a fart. That’s the argument of Michael Bennett and Sarah Bennett, the father-daughter duo behind the new self-help book F*ck Feelings.

The elder Bennett is a psychiatrist and American Psychiatric Association distinguished fellow. His daughter is a comedy writer. Together, they provide a tough-love, irreverent take on “life's impossible problems.” The crux of their approach is that life is hard and negative emotions are part of it. The key is to see your “bullshit wishes” for just what they are (bullshit), and instead to pursue real, achievable goals.

Stop trying to forgive your bad parents, they advise. Jerks are capable of having as many kids as anyone else—at least until men’s rights conventions come equipped with free vasectomy booths. If you happen to be the child of a jerk, that's just another obstacle to overcome.

In fact, stop trying to free yourself of all anger and hate. In all likelihood you're doing a really awesome job, the Bennetts argue, despite all the shitty things that happen to you.

Oh, and a word on shit: “Profanity is a source of comfort, clarity, and strength,” they write. “It helps to express anger without blame, to be tough in the face of pain.”

I recently spoke with the Bennetts by phone about what the f*cking deal is with their book. A lightly edited transcript of our conversation follows.

Philosophically Interesting Books for Young Kids

A friend is interested in soliciting philosophically minded books for young children—ones who are reading, but are not at the chapter-book stage. Here are a few I’ve enjoyed with my kids…
The Big Orange Splot by Daniel Manus Pinkwater — for the young individualist.
A Hole Is To Dig by Ruth Krauss — for the young teleologist.
Pierre: A Cautionary Tale by Maurice Sendak — for the young nihilist.
It Could Always Be Worse by Margot Zemach — for the young.
How To Behave and Why by Munro Leaf — for th

Study: There Are Instructions for Teaching Critical Thinking | Big Think

Whether or not you can teach something as subjective as critical thinking has been up for debate, but a fascinating new study shows that it’s actually quite possible. Experiments performed by Stanford's Department of Physics and Graduate School of Education demonstrate that students can be instructed to think more critically.

It’s difficult to overstate the importance of critical-thinking skills in modern society. The ability to decipher information, interpret it, and offer creative solutions is directly tied to our intellect.
The study took two groups of students in an introductory physics laboratory course, with one group (known as the experimental group) given the instruction to use quantitative comparisons between datasets and the other group given no instruction (the control group). Comparing data in a scientific manner (that is, measuring one’s observations in a statistical or mathematical way) led to interesting results for the experimental group.

Even after these instructions were removed, they were 12 times more likely to offer creative solutions to improve the experimental methods being used in the class, four times more likely to explain the limitations of the methods, and better at explaining their reasoning than the control group. The results remained consistent even in the next year, with students in a different class. So what does this imply about critical thinking, and how can we utilize these findings to improve ourselves and our society?
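
As a concrete illustration of what a "quantitative comparison between datasets" can mean, the sketch below expresses the difference between two sets of measurements in units of their combined standard error. It is only a generic example of the idea, not the study's actual instructions; the comparison_score function and the pendulum readings are invented for illustration.

```python
from math import sqrt
from statistics import mean, stdev

def comparison_score(a, b):
    """Difference of the means of a and b, divided by the combined standard
    error of the means. Values near 0 suggest the datasets agree; values well
    above ~2-3 suggest a real discrepancy worth investigating."""
    se_a = stdev(a) / sqrt(len(a))
    se_b = stdev(b) / sqrt(len(b))
    return (mean(a) - mean(b)) / sqrt(se_a**2 + se_b**2)

# Hypothetical example: two runs of pendulum-period measurements, in seconds.
run_1 = [1.41, 1.43, 1.40, 1.44, 1.42]
run_2 = [1.48, 1.47, 1.49, 1.46, 1.50]
print(round(comparison_score(run_1, run_2), 2))
```

The point of such an exercise is that students learn to say how different two results are, rather than eyeballing whether they "look different".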

We live in an age with unprecedented access to information. Whether you are contributing to an entry on Wikipedia or reading a meme that has no sources cited (do they ever?), your ability to comprehend what you are reading and weigh it is a constant and consistent need. That is why it is so imperative that we have sharp critical-thinking skills. Also, if you don’t use them, you will have nothing to argue with your family about at Thanksgiving. More importantly, they keep your brain off junk food and on more of a kale-based diet. Look at any trending topic, and test yourself. Is this true/accurate? How do I know either way? Is there a way I can use data (provable, factual information) to figure this out?

Certainly, we can train ourselves to become better critical thinkers, but it’s also important that we teach these skills to kids. Studies have shown how important this ability is to our success, and yet many feel that we’re doing a terrible job of teaching it. This study, however, may lead educators and parents to realize that these skills are teachable. The implications of a better-thinking society are not quantitative, but I do believe they would be extraordinary.

The pronoun 'I' is becoming obsolete

Don't look now, but the pronoun "I" is becoming obsolete.

Recent microbiological research has shown that thinking of plants and animals, including humans, as autonomous individuals is a serious over-simplification.

A series of groundbreaking studies have revealed that what we have always thought of as individuals are actually "biomolecular networks" that consist of visible hosts plus millions of invisible microbes that have a significant effect on how the host develops, the diseases it catches, how it behaves and possibly even its social interactions.

"It's a case of the whole being greater than the sum of its parts," said Seth Bordenstein, associate professor of biological sciences at Vanderbilt University, who has contributed to the body of scientific knowledge that is pointing to the conclusion that symbiotic microbes play a fundamental role in virtually all aspects of plant and animal biology, including the origin of new species.

In this case, the parts are the host and its genome plus the thousands of different species of bacteria living in or on the host, along with all their genomes, collectively known as the microbiome.

(The host is something like the tip of the iceberg while the bacteria are like the part of the iceberg that is underwater: Nine out of every 10 cells in plant and animal bodies are bacterial. But bacterial cells are so much smaller than host cells that they have generally gone unnoticed.)

Will Artificial Intelligence Surpass Our Own?

Famed science-fiction writer Fredric Brown (1906–1972) delighted in creating the shortest of short stories. “Answer,” published in 1954, encapsulated a prescient meditation on the future of human-machine relations within a single double-spaced, typewritten page.

The foreboding of the story echoes current apprehensions of scientists, policy makers and ethicists over the rapid evolution of machine intelligence.

“Answer” begins under the watchful eyes of a dozen television cameras that are recording the ceremonial soldering of the final connection to tie together all the “monster” computers in the universe.

The machines are about to link 96 billion planets into a single “supercircuit” that combines “all the knowledge of all the galaxies.”

Two witnesses on the scene are identified only as Dwar Ev and Dwar Reyn. After throwing the switch that connects the galactic circuit, Dwar Ev suggests to his companion that he ask the machine the first question:

“Thank you,” said Dwar Reyn. “It shall be a question which no single cybernetics machine has been able to answer.”

He turned to face the machine. “Is there a God?”

The mighty voice answered without hesitation, without the clicking of a single relay.

“Yes, now there is a God.”

Sudden fear flashed on the face of Dwar Ev. He leaped to grab the switch.

A bolt of lightning from the cloudless sky struck him down and fused the switch shut.

We are in the midst of a revolution in machine intelligence, the art and engineering practices that let computers perform tasks that, until recently, could be done only by people. There is now software that identifies faces at border crossings and matches them against passports or that labels people and objects in photographs posted to social media. Algorithms can teach themselves to play Atari video games. A camera and chip embedded in top-of-the-line sedans let the vehicles drive autonomously on the open road.

‘Battling the Gods: Atheism in the Ancient World,’ by Tim Whitmarsh

The philosopher Sydney Morgenbesser, beloved by generations of Columbia University students (including me), was known for lines of wit that yielded nuggets of insight. He kept up his instructive shtick until the end, remarking to a colleague shortly before he died: “Why is God making me suffer so much? Just because I don’t believe in him?” For Morgenbesser, nothing worth pondering, including disbelief, could be entirely de-paradoxed.

The major thesis of Tim Whitmarsh’s excellent “Battling the Gods” is that atheism — in all its nuanced varieties, even Morgenbesserian — isn’t a product of the modern age but rather reaches back to early Western intellectual tradition in the ancient Greek world.

The period that Whitmarsh covers is roughly 1,000 years, during which the Greek-speaking population emerged from illiteracy and anomie, became organized into independent city-states that spawned a high-achieving culture, were absorbed into the Macedonian Empire and then into the Roman Empire, and finally became Christianized. These momentous political shifts are efficiently traced, with astute commentary on their reflection in religious attitudes.

But the best part of “Battling the Gods” is the Greek chorus of atheists themselves, who speak distinctively throughout each of the political transformations — until, that is, the last of them, when they go silent. If you’ve been paying attention to contemporary atheists you might be startled by the familiarity of the ancient positions.

So here is Democritus in the fifth century B.C. — he who coined the term “atom,” from the Greek for “indivisible,” speculating that reality consisted of nothing but fundamental particles swirling randomly around in the void — propounding an anthropological theory of the origins of religious beliefs. Talk of “the gods,” he argued, comes naturally to primitive people who, unable yet to grasp the laws of nature, resort to fantastical storytelling. The exact titles of his works remain in doubt, but his naturalist explanation of the origins of conventional religion might have made use of Daniel C. Dennett’s title “Breaking the Spell: Religion as a Natural Phenomenon.”
Wildcat2030's insight:

book review go read


How rivalry propels creative genius – Jacob Burak – Aeon

On 25 May 1832, John Constable was busy adding the final touches to his masterpiece, The Opening of Waterloo Bridge. One of England’s greatest 19th-century landscape artists, he had been working on the painting for more than 10 years and was finally set to reveal it to the world the next day, at the opening of the Royal Academy of Arts’ 64th annual exhibition. Next to his piece hung Helvoetsluys by J M W Turner, an artistic genius in his own right. Watching Constable’s last-minute efforts, Turner decided to add an extra brushstroke of his own: a red buoy floating on the water.

That single daub of red paint against a background of grey sky and sea was so arresting that visitors couldn’t take their eyes off it, certainly not to look at Constable’s painting. It was yet another landmark in the bitter rivalry between the two artists. A year earlier, Constable had used his position in an exhibition committee to have a Turner painting taken down and hung in a side room, replacing it with a painting of his own.
Turner and Constable are not alone in the pantheon of epic rivalries between creative giants. Isaac Newton and Gottfried Leibniz, two of the most brilliant mathematicians and thinkers of the 17th century, laid claim to the development of calculus, the mathematical study of change. Thomas Edison and Nikola Tesla both invented electrical systems in the 1880s. Steve Jobs and Bill Gates went head-to-head as pioneers of the computer age. If you Google almost any famous figure along with ‘rivalry’, you’ll find some interesting results.

Leiter Reports: A Philosophy Blog: Should there be a Nobel Prize (or equivalent prize) in philosophy?

Russell Blackford (Newcastle) thinks so. If there were, I predict it would end up like the Nobel Prize for Literature: bizarre inclusions and exclusions that would tell us more about fashions and politics than about literature. Part of the difficulty would be in deciding what counts as philosophy. Look at Blackford's gloss:

Philosophy is the reason-based, intellectually rigorous, investigation of deep questions that have always caused puzzlement and anxiety: Is there a god or an afterlife? Do we possess free will? What is a good life for a human being? What is the nature of a just society? Philosophy challenges obfuscation and orthodoxies, and extends into examining the foundations of inquiry itself.

Are these "deep questions that have always caused puzzlement and anxiety"? Doubtful. And it's doubtful that all "good" philosophy "challenges obfuscation and orthodoxies": lots of important philosophy just rationalizes orthodoxy (and sometimes contributes to obfuscation).

Would the later Wittgenstein be eligible for a Nobel Prize in philosophy by Blackford's criteria? Not clear at all.

Meet The Man Who Invents Languages For A Living

If anyone has the credentials to write a book called The Art Of Language Invention, it's David J. Peterson.

He has two degrees in linguistics. He's comfortable speaking in eight languages (English, Spanish, French, German, Russian, Esperanto, Arabic and American Sign Language) — plus a long list of others he's studied but just hasn't tried speaking yet. He's also familiar with fictional languages — both famous ones like Klingon and deep cuts like Pakuni (the caveman language from Land Of The Lost).

And of course, he's crafted languages of his own — including full alphabets, vocabularies and grammars. Game of Thrones viewers, for instance, might recognize two of these languages: Dothraki, a guttural language of warrior horsemen, and High Valyrian, a language spoken in much of the fantasy world's eastern regions.

And he didn't rest there. Peterson actually created a third language for the show — just for a single giant.

"I didn't know beforehand that he was only going to have one line," he laughs. "I thought he was going to have a bunch of stuff, but whatever. I created a full language for the giant."

Peterson also invented Shiväisith for the Marvel blockbuster Thor: The Dark World, and four languages (and counting?) for the SyFy show Defiance.

In a new book, The Art Of Language Invention, Peterson details the languages he's invented, pairing them with lots of (often technical) advice about how readers can create some of their own.

Home | History of Philosophy without any gaps

Peter Adamson, Professor of Philosophy at the LMU in Munich and at King's College London, takes listeners through the history of philosophy, "without any gaps."
Berta Civera's curator insight, September 28, 3:03 AM

Peter Adamson's History of Philosophy, in English


How did French thought end up in crisis? – Sudhir Hazareesingh – Aeon

There are many things we have come to regard as quintessentially French: Coco Chanel’s little black dress, the love of fine wines and gastronomy, the paintings of Auguste Renoir, the smell of burnt rubber in the Paris Métro. Equally distinctive is the French mode and style of thinking, which the Irish political philosopher Edmund Burke described in 1790 as ‘the conquering empire of light and reason’. He meant this as a criticism of the French Revolution, but this expression would undoubtedly have been worn as a badge of honour by most French thinkers from the Enlightenment onwards.

Indeed, the notion that rationality is the defining quality of humankind was first celebrated by the 17th-century thinker René Descartes, the father of modern French philosophy. His skeptical method of reasoning led him to conclude that the only certainty was the existence of his own mind: hence his ‘cogito ergo sum’ (‘I think, therefore I am’). This French rationalism was also expressed in a fondness for abstract notions and a preference for deductive reasoning, which starts with a general claim or thesis and eventually works its way towards a specific conclusion – thus the consistent French penchant for grand theories. As the essayist Emile Montégut put it in 1858: ‘There is no people among whom abstract ideas have played such a great role, and whose history is rife with such formidable philosophical tendencies.’

Cult of the cosmic: how space travel replaced religion in USSR

For most of the 20th century, the thirst for space exploration replaced religion in the Soviet Union, with the cult of science disseminated through propaganda, not sermons.

Yuri Gagarin, the first human in outer space, was the God-like figurehead, a man of the people and a martyr who died too young in mysterious circumstances. The titanium Gagarin monument in Moscow, created by sculptor Pavel Bondarenko, features a 42m-tall column topped with a figure of the cosmonaut rocketing to the sky in a pose similar to Rio De Janeiro’s Christ the Redeemer.

From the 1950s until the 70s, space themes were woven into everyday life, into endless festivals and celebrations of interstellar exploration. Children’s playgrounds were designed like rockets, the walls of schools and kindergartens decorated with paper spacecraft and stars. Houses were built to look like spacecraft, lunar stations and flying saucers – to this day, experts refer to the 1960s-80s as the “cosmic period” in Soviet architecture.

Statues and images of revered icons were everywhere, including Valentina Tereshkova, the first woman in space, Alexei Leonov, the first cosmonaut to do a spacewalk, and rocket engineer Sergei Korolev.

People Are More Likely to Cheat at the End

Life, for better or worse, is full of endings. We finish school, get a new job, sell a home, break off a relationship. Knowing that a phase is soon coming to an end can elicit the best in us, as we try to make amends for errors past and avoid last-minute regrets. We might try to visit that local museum, or make time for happy hour drinks with a longtime coworker, or be more generous with our praise to a partner.

But while the sense of an ending can draw out people’s finest selves, it can also, new psychological research suggests, bring out their darker side. This study concludes that, as people get closer to finishing an activity, they become more and more likely to deliberately deceive others for their own benefit. And they do this, the research shows, because they anticipate regretting a missed opportunity to cheat the system.

Heroes, monsters and people: When it comes to moral choices, outstanding physicists are very ordinary


Last week, on the plane back from Chicago, I finished Philip Ball’s book about physics in Germany in the nineteen-thirties and -forties. I’m still thinking about it, and I’m trying to work out why it has left such a strong impression. I think it is because the compromises, recriminations and judgements formed have echoes, weak but clear, in so many other arguments going on today.

It is difficult to be nuanced about Nazis. There are obvious reasons for this, but it is nevertheless sometimes important to try. That genocidal ideology came from somewhere, and looking back on the period through a lens which colours everyone as hero or monster is not necessarily helpful for gaining understanding, and therefore not necessarily a good approach to the prevention of such abominations in future.

Even that previous paragraph is fraught with difficulty, of course. When the Murdoch media ran a video of the six-year-old future Queen giving a Nazi salute, I thought it defensible to show the film - not as an attack on the Royal Family, but as a reminder that such things could be deemed acceptable at that time. The Nazis didn’t come pre-equipped with the political and moral pariah status they deserved. When I said as much on Facebook, at least one German friend of mine thought I came very close to the kind of apologia made too often in postwar Germany, that “ordinary people” just didn’t know how bad the Nazis were. Well, if they had read “Mein Kampf” they would have known. As George Orwell put it in his 1940 review of Hitler’s 1926 manifesto: “it is difficult to believe that any real change has taken place in his aims and opinions. When one compares his utterances of a year or so ago with those made fifteen years earlier, a thing which strikes one is the rigidity of his mind, the way in which his world-view doesn’t develop.”


A Life in Games | Quanta Magazine

Gnawing on his left index finger with his chipped old British teeth, temporal veins bulging and brow pensively squinched beneath the day-before-yesterday’s hair, the mathematician John Horton Conway unapologetically whiles away his hours tinkering and thinkering — which is to say he’s ruminating, although he will insist he’s doing nothing, being lazy, playing games.

Based at Princeton University, though he found fame at Cambridge (as a student and professor from 1957 to 1987), Conway, 77, claims never to have worked a day in his life. Instead, he purports to have frittered away reams and reams of time playing. Yet he is Princeton’s John von Neumann Professor in Applied and Computational Mathematics (now emeritus). He’s a fellow of the Royal Society. And he is roundly praised as a genius. “The word ‘genius’ gets misused an awful lot,” said Persi Diaconis, a mathematician at Stanford University. “John Conway is a genius. And the thing about John is he’ll think about anything.… He has a real sense of whimsy. You can’t put him in a mathematical box.”

The hoity-toity Princeton bubble seems like an incongruously grand home base for someone so gamesome. The campus buildings are Gothic and festooned with ivy. It’s a milieu where the well-groomed preppy aesthetic never seems passé. By contrast, Conway is rumpled, with an otherworldly mien, somewhere between The Hobbit’s Bilbo Baggins and Gandalf. Conway can usually be found loitering in the mathematics department’s third-floor common room. The department is housed in the 13-story Fine Hall, the tallest tower in Princeton, with Sprint and AT&T cell towers on the rooftop. Inside, the professor-to-undergrad ratio is nearly 1-to-1. With a querying student often at his side, Conway settles either on a cluster of couches in the main room or a window alcove just outside the fray in the hallway, furnished with two armchairs facing a blackboard — a very edifying nook. From there Conway, borrowing some Shakespeare, addresses a familiar visitor with his Liverpudlian lilt:

Welcome! It’s a poor place but mine own!

How new brain implants can boost free will – Walter Glannon – Aeon

Some philosophers maintain that free will is incompatible with causal determinism, which by definition allows only one possibility – in essence, it assigns our life trajectories in advance. Others argue that we don’t need alternative possibilities for free will but only the desires and intentions that actually guide what we decide to do.

Yet my student made me think that the debate could be reframed. Free will might have nothing to do with the universe outside and everything to do with how the brain enables or disables our behaviour and thoughts. What if free will relies on the internal, on how successfully the brain generates and sustains the physiological, cognitive and emotional dimensions of our bodies and minds – and has nothing to do with the external at all?

The best way to study free will, I posited, might be through neurological and psychiatric disorders resulting from dysfunction in neural circuits regulating movement, cognition and mood. Patients with Parkinson’s disease experience uncontrollable tremors or equally debilitating rigidity. For those with obsessive-compulsive disorder (OCD), intrusive thoughts and repetitive behaviour seem impossible to suppress. Major depression can dull motivation and destroy the capacity for pleasure. Damage to the region of the brain regulating memory formation can limit the capacity to recall experiences and project oneself into the future. Other traumatic brain injuries undermine free will by causing extensive paralysis and the inability to communicate. If we think of free will as the ability to plan and act without mental or physical compulsion or constraint, then these brain disorders represent a spectrum in which free will is mildly to completely impaired.

Teaching how to think is just as important as teaching anything else

A new paper on teaching critical thinking skills in science has pointed out, yet again, the value of giving students experiences that go beyond simple recall or learned procedures.

It is a common lamentation that students are not taught to think, but there is usually an accompanying lack of clarity about exactly what that might mean.

There is a way of understanding this idea that is conceptually easy and delivers a sharp educational focus – a way that focuses on the explicit teaching of thinking skills through an inquiry process, and allows students to effectively evaluate their thinking.
What are thinking skills?

Let’s first understand what we might mean by thinking skills. Thinking skills, or cognitive skills, are, in large part, things you do with knowledge. Things like analysing, evaluating, synthesising, inferring, conjecturing, justifying, categorising and many other terms describe your cognitive events at a particular functional level.

Analysis, for example, involves identifying the constituent elements of something and examining their relationships with each other and to the whole. One can analyse a painting, a piece of text, a set of data or a graph.

Analysis is a widely valued cognitive skill and is not unique to any discipline context. It is a general thinking skill.

Most syllabuses from primary to tertiary level are organised by content only, with little mention of such cognitive skills. Usually, even if they are mentioned, little is said about how to teach them. The hope is they will be caught, not taught.

Rigour in course design is too often understood as equating to large amounts of recall of content and specific training in algorithms or set procedures. It is far less common, but far more valuable, to have courses in which rigour is found in the demand for high-level cognitive skill formation.

This is not to say that knowledge is not important in the curriculum. Our knowledge is hard won; we should value what we have learned for how it makes our lives more productive or meaningful.

But there is nothing mutually exclusive about developing high levels of cognitive skills with content knowledge in a discipline context. It just demands attention to these skills, using the content as an opportunity to explore them.

It is knowing how to provide students with these skill-building opportunities in context that is the mark of an outstanding teacher of effective thinking.

After all, we do not expect the scientific, cultural and political leaders of tomorrow simply to know stuff. They must also know what to do with it.