Philosophy everywhere everywhen
Curated by Wildcat2030

The First Law of Philosophy: For every philosopher, there exists an equal and opposite philosopher. The Second Law of Philosophy: They're both wrong.

Where Thomas Nagel Went Wrong: The philosopher's critique of evolution wasn't shocking. So why is he being raked over the coals?



Thomas Nagel is a leading figure in philosophy, now enjoying the title of university professor at New York University, a testament to the scope and influence of his work. His 1974 essay "What Is It Like to Be a Bat?" has been read by legions of undergraduates, with its argument that the inner experience of a brain is truly knowable only to that brain. Since then he has published 11 books, on philosophy of mind, ethics, and epistemology.

But Nagel's academic golden years are less peaceful than he might have wished. His latest book, Mind and Cosmos (Oxford University Press, 2012), has been greeted by a storm of rebuttals, ripostes, and pure snark. "The shoddy reasoning of a once-great thinker," Steven Pinker tweeted. The Weekly Standard quoted the philosopher Daniel Dennett calling Nagel a member of a "retrograde gang" whose work "isn't worth anything—it's cute and it's clever and it's not worth a damn."

The critics have focused much of their ire on what Nagel calls "natural teleology," the hypothesis that the universe has an internal logic that inevitably drives matter from nonliving to living, from simple to complex, from chemistry to consciousness, from instinctual to intellectual.

Leiter Reports: A Philosophy Blog: Should there be a Nobel Prize (or equivalent prize) in philosophy?

Russell Blackford (Newcastle) thinks so. If there were, I predict it would end up like the Nobel Prize for Literature: bizarre inclusions and exclusions that tell us more about fashions and politics than about literature. Part of the difficulty will be in deciding what counts as philosophy. Look at Blackford's gloss:

Philosophy is the reason-based, intellectually rigorous, investigation of deep questions that have always caused puzzlement and anxiety: Is there a god or an afterlife? Do we possess free will? What is a good life for a human being? What is the nature of a just society? Philosophy challenges obfuscation and orthodoxies, and extends into examining the foundations of inquiry itself.

Are these "deep questions that have always caused puzzlement and anxiety"? Doubtful. And it's doubtful that all "good" philosophy "challenges obfuscation and orthodoxies": lots of important philosophy just rationalizes orthodoxy (and sometimes contributes to obfuscation).

Would the later Wittgenstein be eligible for a Nobel Prize in philosophy by Blackford's criteria? Not clear at all.

Meet The Man Who Invents Languages For A Living

If anyone has the credentials to write a book called The Art Of Language Invention, it's David J. Peterson.

He has two degrees in linguistics. He's comfortable speaking in eight languages (English, Spanish, French, German, Russian, Esperanto, Arabic and American Sign Language) — plus a long list of others he's studied but just hasn't tried speaking yet. He's also familiar with fictional languages — both famous ones like Klingon and deep cuts like Pakuni (the caveman language from Land Of The Lost).

And of course, he's crafted languages of his own — including full alphabets, vocabularies and grammars. Game of Thrones viewers, for instance, might recognize two of these languages: Dothraki, a guttural language of warrior horsemen, and High Valyrian, a language spoken in much of the fantasy world's eastern regions.

And he didn't rest there. Peterson actually created a third language for the show — just for a single giant.

"I didn't know beforehand that he was only going to have one line," he laughs. "I thought he was going to have a bunch of stuff, but whatever. I created a full language for the giant."

Peterson also invented Shiväisith for the Marvel blockbuster Thor: The Dark World, and four languages (and counting?) for the Syfy show Defiance.

In a new book, The Art Of Language Invention, Peterson details the languages he's invented, pairing them with lots of (often technical) advice about how readers can create some of their own.

History of Philosophy without any gaps

Peter Adamson, Professor of Philosophy at the LMU in Munich and at King's College London, takes listeners through the history of philosophy, "without any gaps."
Berta Civera's curator insight, September 28, 3:03 AM: Peter Adamson's History of Philosophy, in English.


How did French thought end up in crisis? – Sudhir Hazareesingh – Aeon

There are many things we have come to regard as quintessentially French: Coco Chanel’s little black dress, the love of fine wines and gastronomy, the paintings of Auguste Renoir, the smell of burnt rubber in the Paris Métro. Equally distinctive is the French mode and style of thinking, which the Irish political philosopher Edmund Burke described in 1790 as ‘the conquering empire of light and reason’. He meant this as a criticism of the French Revolution, but this expression would undoubtedly have been worn as a badge of honour by most French thinkers from the Enlightenment onwards.

Indeed, the notion that rationality is the defining quality of humankind was first celebrated by the 17th-century thinker René Descartes, the father of modern French philosophy. His skeptical method of reasoning led him to conclude that the only certainty was the existence of his own mind: hence his ‘cogito ergo sum’ (‘I think, therefore I am’). This French rationalism was also expressed in a fondness for abstract notions and a preference for deductive reasoning, which starts with a general claim or thesis and eventually works its way towards a specific conclusion – thus the consistent French penchant for grand theories. As the essayist Emile Montégut put it in 1858: ‘There is no people among whom abstract ideas have played such a great role, and whose history is rife with such formidable philosophical tendencies.’

Cult of the cosmic: how space travel replaced religion in the USSR

For most of the 20th century, the thirst for space exploration replaced religion in the Soviet Union, with the cult of science disseminated through propaganda, not sermons.

Yuri Gagarin, the first human in outer space, was the God-like figurehead, a man of the people and a martyr who died too young in mysterious circumstances. The titanium Gagarin monument in Moscow, created by sculptor Pavel Bondarenko, features a 42m-tall column topped with a figure of the cosmonaut rocketing to the sky in a pose similar to Rio de Janeiro's Christ the Redeemer.

From the 1950s until the '70s, space themes were woven into everyday life, into endless festivals and celebrations of interstellar exploration. Children's playgrounds were designed like rockets, the walls of schools and kindergartens decorated with paper spacecraft and stars. Houses were built to look like spacecraft, lunar stations and flying saucers – to this day, experts refer to the 1960s-80s as the "cosmic period" in Soviet architecture.

Statues and images of revered icons were everywhere, including Valentina Tereshkova, the first woman in space; Alexei Leonov, the first person to perform a spacewalk; and rocket engineer Sergei Korolev.

People Are More Likely to Cheat at the End

Life, for better or worse, is full of endings. We finish school, get a new job, sell a home, break off a relationship. Knowing that a phase is soon coming to an end can elicit the best in us, as we try to make amends for errors past and avoid last-minute regrets. We might try to visit that local museum, or make time for happy hour drinks with a longtime coworker, or be more generous with our praise to a partner.

But while the sense of an ending can draw out people’s finest selves, it can also, new psychological research suggests, bring out their darker side. This study concludes that, as people get closer to finishing an activity, they become more and more likely to deliberately deceive others for their own benefit. And they do this, the research shows, because they anticipate regretting a missed opportunity to cheat the system.

Heroes, monsters and people: When it comes to moral choices, outstanding physicists are very ordinary


Last week, on the plane back from Chicago, I finished Philip Ball’s book about physics in Germany in the nineteen-thirties and -forties. I’m still thinking about it, and I’m trying to work out why it has left such a strong impression. I think it is because the compromises, recriminations and judgements formed have echoes, weak but clear, in so many other arguments going on today.

It is difficult to be nuanced about Nazis. There are obvious reasons for this, but it is nevertheless sometimes important to try. That genocidal ideology came from somewhere, and looking back on the period through a lens which colours everyone as hero or monster is not necessarily helpful for gaining understanding, and therefore not necessarily a good approach to the prevention of such abominations in future.

Even that previous paragraph is fraught with difficulty, of course. When the Murdoch media ran a video of the six-year-old future Queen giving a Nazi salute, I thought it defensible to show the film - not as an attack on the Royal Family, but as a reminder that such things could be deemed acceptable at that time. The Nazis didn’t come pre-equipped with the political and moral pariah status they deserved. When I said as much on Facebook, at least one German friend of mine thought I came very close to the kind of apologia made too often in postwar Germany, that “ordinary people” just didn’t know how bad the Nazis were. Well, if they had read “Mein Kampf” they would have known. As George Orwell put it in his 1940 review of Hitler’s 1926 manifesto: “it is difficult to believe that any real change has taken place in his aims and opinions. When one compares his utterances of a year or so ago with those made fifteen years earlier, a thing which strikes one is the rigidity of his mind, the way in which his world-view doesn’t develop.”


A Life in Games | Quanta Magazine

Gnawing on his left index finger with his chipped old British teeth, temporal veins bulging and brow pensively squinched beneath the day-before-yesterday’s hair, the mathematician John Horton Conway unapologetically whiles away his hours tinkering and thinkering — which is to say he’s ruminating, although he will insist he’s doing nothing, being lazy, playing games.

Based at Princeton University, though he found fame at Cambridge (as a student and professor from 1957 to 1987), Conway, 77, claims never to have worked a day in his life. Instead, he purports to have frittered away reams and reams of time playing. Yet he is Princeton’s John von Neumann Professor in Applied and Computational Mathematics (now emeritus). He’s a fellow of the Royal Society. And he is roundly praised as a genius. “The word ‘genius’ gets misused an awful lot,” said Persi Diaconis, a mathematician at Stanford University. “John Conway is a genius. And the thing about John is he’ll think about anything.… He has a real sense of whimsy. You can’t put him in a mathematical box.”

The hoity-toity Princeton bubble seems like an incongruously grand home base for someone so gamesome. The campus buildings are Gothic and festooned with ivy. It’s a milieu where the well-groomed preppy aesthetic never seems passé. By contrast, Conway is rumpled, with an otherworldly mien, somewhere between The Hobbit’s Bilbo Baggins and Gandalf. Conway can usually be found loitering in the mathematics department’s third-floor common room. The department is housed in the 13-story Fine Hall, the tallest tower in Princeton, with Sprint and AT&T cell towers on the rooftop. Inside, the professor-to-undergrad ratio is nearly 1-to-1. With a querying student often at his side, Conway settles either on a cluster of couches in the main room or a window alcove just outside the fray in the hallway, furnished with two armchairs facing a blackboard — a very edifying nook. From there Conway, borrowing some Shakespeare, addresses a familiar visitor with his Liverpudlian lilt:

Welcome! It’s a poor place but mine own!

How new brain implants can boost free will – Walter Glannon – Aeon

Some philosophers maintain that free will is incompatible with causal determinism, which by definition allows only one possibility – in essence, it assigns our life trajectories in advance. Others argue that we don’t need alternative possibilities for free will but only the desires and intentions that actually guide what we decide to do.

Yet my student made me think that the debate could be reframed. Free will might have nothing to do with the universe outside and everything to do with how the brain enables or disables our behaviour and thoughts. What if free will relies on the internal, on how successfully the brain generates and sustains the physiological, cognitive and emotional dimensions of our bodies and minds – and has nothing to do with the external at all?

The best way to study free will, I posited, might be through neurological and psychiatric disorders resulting from dysfunction in neural circuits regulating movement, cognition and mood. Patients with Parkinson’s disease experience uncontrollable tremors or equally debilitating rigidity. For those with obsessive-compulsive disorder (OCD), intrusive thoughts and repetitive behaviour seem impossible to suppress. Major depression can dull motivation and destroy the capacity for pleasure. Damage to the region of the brain regulating memory formation can limit the capacity to recall experiences and project oneself into the future. Other traumatic brain injuries undermine free will by causing extensive paralysis and the inability to communicate. If we think of free will as the ability to plan and act without mental or physical compulsion or constraint, then these brain disorders represent a spectrum in which free will is mildly to completely impaired.

Teaching how to think is just as important as teaching anything else

A new paper on teaching critical thinking skills in science has pointed out, yet again, the value of giving students experiences that go beyond simple recall or learned procedures.

It is a common lamentation that students are not taught to think, but there is usually an accompanying lack of clarity about exactly what that might mean.

There is a way of understanding this idea that is conceptually easy and delivers a sharp educational focus – a way that focuses on the explicit teaching of thinking skills through an inquiry process, and allows students to effectively evaluate their thinking.
What are thinking skills?

Let’s first understand what we might mean by thinking skills. Thinking skills, or cognitive skills, are, in large part, things you do with knowledge. Things like analysing, evaluating, synthesising, inferring, conjecturing, justifying, categorising and many other terms describe your cognitive events at a particular functional level.

Analysis, for example, involves identifying the constituent elements of something and examining their relationships with each other and to the whole. One can analyse a painting, a piece of text, a set of data or a graph.

Analysis is a widely valued cognitive skill and is not unique to any discipline context. It is a general thinking skill.

Most syllabuses from primary to tertiary level are organised by content only, with little mention of such cognitive skills. Usually, even if they are mentioned, little is said about how to teach them. The hope is they will be caught, not taught.

Rigour in course design is too often understood as equating to large amounts of recall of content and specific training in algorithms or set procedures. It is far less common, but far more valuable, to have courses in which rigour is found in the demand for high-level cognitive skill formation.

This is not to say that knowledge is not important in the curriculum. Our knowledge is hard won; we should value what we have learned for how it makes our lives more productive or meaningful.

But there is nothing mutually exclusive about developing high-level cognitive skills alongside content knowledge in a discipline context. It just demands attention to these skills, using the content as an opportunity to explore them.

It is knowing how to provide students with these skill-building opportunities in context that is the mark of an outstanding teacher of effective thinking.

After all, we do not expect the scientific, cultural and political leaders of tomorrow simply to know stuff. They must also know what to do with it.

Does the Atheist have a Theory of Mind?

Planet earth appears to be filled with unseen forces that control the behavior of its inhabitants. No, this isn’t the beginning of a cheesy B-movie science fiction film script. This is reality, and even the staunchest of skeptics act as if they believe in these invisible forces. That is, we live in a material world ruled by minds with no physical locality, and it is here that we think beliefs, desires, intentions, and other mental states are both responsible for, and explain, our behavior [1]. There is nothing particularly magical or surprising about this fact, at least not until we consider particular theories in the cognitive science of religion (CSR) that, for example, suggest atheists may be “socially disabled” [2], have a “malfunction” in their ability to reason about these mental states, or perhaps that there is no such thing as atheism at the level of cognition [3]. Thus, and I ask jokingly, does the atheist have a theory of mind? But, more on this in a moment.

Attributing mental states is something we do to others and ourselves on a daily basis, such that it appears to be commonsense — and it is! In fact, this ability has even been called “commonsense psychology,” among various other terms: social cognition, folk psychology, mind reading, and mentalizing [4]. Our folk psychology may not deliver adequate scientific causal descriptions, but we are nonetheless bound to it in everyday reasoning. While some of these terms have very specific technical meanings within a given discipline or theory, for the purposes of the present essay I will primarily use the term theory of mind (ToM). This essay will present a brief overview of ToM, its relationship to autism spectrum disorders, and how this relationship is utilized in CSR, and will critically evaluate the suggested links between poor ToM skills and atheism.
Wildcat2030's insight: a must read.


Yes, Other Animals Do Have Sex For Fun - The Crux

There’s an idea circulating that humans are the only animal to experience sexual pleasure; that we approach sex in a way that is distinct from others. As with many questions about sex, this exposes some interesting facts about the way we discuss the subject.

On one level, the question of whether humans and nonhumans experience sex in the same way is fairly simply dismissed: how would we know? We cannot know how a nonhuman experiences anything – they can’t be asked. Sex as an experiential phenomenon for nonhumans is, quite simply, inaccessible. Science is obliged to propose questions that are answerable, and “how does a leopard slug experience sex?” is, at time of writing, about as unanswerable as they get.

Having said that, we can make educated guesses about whether sex is pleasurable for other species. Sex would be a very strange thing to seek if it didn’t bring some form of pleasure. It increases the risk of disease, it wastes energy, and it can seriously increase the likelihood of something bigger coming along and eating you (seriously, check out leopard-slug reproduction).
There’s no reason why an animal should seek sex unless they enjoy it. It is often proposed that an inherent “drive to reproduce” explains nonhuman sexual activity, but that is not an alternative here: if animals possess an instinct to reproduce, it needs to function somehow – and pleasure is a fairly basic motivator. The hypothesis that all sexually reproducing species experience sexual pleasure is, in itself, quite reasonable – as would be the hypothesis that animals find eating pleasurable.

Think Your Conscious Brain Directs Your Actions? Think Again - Singularity HUB

Think your deliberate, guiding, conscious thoughts are in charge of your actions?

Think again.

In a provocative new paper in Behavioral and Brain Sciences, a team led by Dr. Ezequiel Morsella at San Francisco State University came to a startling conclusion: consciousness is no more than a passive machine running one simple algorithm — to serve up what’s already been decided, and take credit for the decision.

Rather than a sage conductor, it’s just a tiny part of what happens in the brain that makes us “aware.” All the real work goes on under the hood — in our unconscious minds.

The Passive Frame Theory, as Morsella calls it, is based on decades of experimental data observing how people perceive and generate motor responses to odors. It’s not about perception (“I smell a skunk”), but about response (running from a skunk). The key to cracking what consciousness does in the brain is to work backwards from an observable physical action, explains Morsella in his paper.

If this isn’t your idea of “consciousness,” you’re not alone.

Boredom is not a problem to be solved. It's the last privilege of a free mind | Gayatri Devi

Confessing to boredom is confessing to a character-flaw. Popular culture is littered with advice on how to shake it off: find like-minded people, take up a hobby, find a cause and work for it, take up an instrument, read a book, clean your house. And certainly don’t let your kids be bored: enroll them in swimming, soccer, dance, church groups – anything to keep them from assuaging their boredom by gravitating toward sex and drugs. To remain bored is to admit that we’re not engaging with the world around us – or that your cellphone has died.

But boredom is not tragic. Properly understood, boredom helps us understand time, and ourselves. Unlike fun or work, boredom is not about anything; it is our encounter with pure time as form and content. With ads and screens and handheld devices ubiquitous, we rarely get to have that experience anymore. We should teach young people to feel comfortable with time.

I live and teach in small-town Pennsylvania, and some of my students from bigger cities tell me that they always go home on Fridays because they are bored here.

You know the best antidote to boredom, I asked them? They looked at me expectantly, smartphones dangling from their hands. Think, I told them. Thinking is the best antidote to boredom. I am not kidding, kids. Thinking is the best antidote to boredom. Tell yourself, I am bored. Think about that. Isn’t that interesting? They looked at me incredulously. Thinking is not how they were brought up to handle boredom.

When you’re bored, time moves slowly. The German word for “boredom” expresses this: Langeweile, a compound of “lange,” meaning “long,” and “Weile,” meaning “a while.” And slow-moving time can feel torturous for people who can’t feel peaceful alone with their minds; that is why learning to be bored is so crucial. It is a great privilege if you can do this without going to the psychiatrist.

Walter Benjamin’s legacy, 75 years on

Like many a refugee in southern and central Europe today, Walter Benjamin was in flight from war and persecution 75 years ago, but was blocked at an intermediate border en route to the country chosen as his haven. He was part of a Jewish group which, hoping to escape occupied France, had hiked through a Pyrenean pass in autumn 1940 with a view to entering Franco’s Spain, crossing it to Portugal and then sailing to the US. However, in the words of Hannah Arendt, they arrived in the frontier village of Portbou “only to learn that Spain had closed the border that same day” and officials were not honouring American visas such as Benjamin’s. Faced with the prospect of returning to France and being handed over to the Nazis, he “took his own life” overnight on 26 September, whereupon the officials “allowed his companions to proceed to Portugal”.

For Arendt, who successfully reached New York via his intended route a few months later, this was a tragedy of misunderstanding, a poignant but fitting end for a brilliant but misfortune-prone older relative (her cousin by marriage) whom she writes about with a kind of affectionate exasperation.

Yet Edward Stourton, in Cruel Crossing: Escaping Hitler Across the Pyrenees, notes “there are all sorts of unanswered questions surrounding Benjamin’s death. His travelling companions remembered him carrying a heavy briefcase containing a manuscript he described as ‘more important than I am’. No such manuscript was found after his death … A Spanish doctor’s report gave the cause of death as a cerebral haemorrhage, not a drugs overdose. There has been persistent speculation that he was actually murdered, perhaps by a Soviet agent who had infiltrated his escaping party.”

This free online encyclopedia has achieved what Wikipedia can only dream of

The Stanford Encyclopedia of Philosophy may be the most interesting website on the internet. Not because of the content—which includes fascinating entries on everything from ambiguity to zombies—but because of the site itself.

Its creators have solved one of the internet’s fundamental problems: How to provide authoritative, rigorously accurate knowledge, at no cost to readers. It’s something the encyclopedia, or SEP, has managed to do for two decades.
The internet is an information landfill. Somewhere in it—buried under piles of opinion, speculation, and misinformation—is virtually all of human knowledge. But sorting through the trash is difficult work. Even when you have something you think is valuable, it often turns out to be a cheap knock-off.

The story of how the SEP is run, and how it came to be, shows that it is possible to create a less trashy internet—or at least a less trashy corner of it. A place where actual knowledge is sorted into a neat, separate pile instead of being thrown into the landfill. Where the world can go to learn everything that we know to be true. Something that would make humans a lot smarter than the internet we have today.

Wildcat2030's insight: go read.


Watch: What Is Consciousness? We Now Have the Tools to Find Out - Singularity HUB

The question of consciousness is as old as philosophy. Most animals appear to get along just fine without a sense of ‘me-ness’. But human beings are different. (At least, as far as we know we are.) We’ve evolved a sense of self-awareness.

And while the exact nature of human consciousness is exceedingly difficult to pin down—that doesn’t stop us from trying. It's a puzzle that's preoccupied the world’s greatest philosophers for millennia and, in recent centuries, scientists too.

In the information age, we've begun to wonder if consciousness is a uniquely biological phenomenon or if it might arise elsewhere. Is the brain just a mushy computer running wetware—something we can replicate in hardware and software? Or is comparing the brain to a computer a misleading analogy and a vast oversimplification?

A fascinating new video from the Economist, featuring some of the brightest minds working the problem, brings us up to date on the debate and the latest thinking.

A new process for studying proteins associated with diseases | KurzweilAI

Researchers from Northwestern University and Yale University have developed a new technology to help scientists understand how proteins work and fix them when they are broken. Such knowledge could pave the way for new drugs for a myriad of diseases, including cancer.

The human body turns its proteins on and off (to alter their function and activity in cells) using “phosphorylation” — the reversible attachment of phosphate groups to proteins. These “decorations” on proteins provide an enormous variety of functions and are essential to all forms of life. Little is known, however, about how this important dynamic process works in humans.

Phosphorylation: a hallmark of disease

Using a special strain of E. coli bacteria, the researchers built a cell-free protein synthesis platform technology that can manufacture large quantities of these human phosphoproteins for scientific study. The goal is to enable scientists to learn more about the function and structure of phosphoproteins and identify which ones are involved in disease.

The study was published Sept. 9 as an open-access paper in the journal Nature Communications.

Trouble in the phosphorylation process can be a hallmark of disease, such as cancer, inflammation and Alzheimer’s disease. The human proteome (the entire set of expressed proteins) is estimated to be phosphorylated at more than 100,000 unique sites, making study of phosphorylated proteins and their role in disease a daunting task.

“Our technology begins to make this a tractable problem,” said Michael C. Jewett, an associate professor of chemical and biological engineering who led the Northwestern team. “We now can make these special proteins at unprecedented yields, with a freedom of design that is not possible in living organisms. The consequence of this innovative strategy is enormous.”

Ignore Your Feelings

Put down the talking stick. Stop fruitlessly seeking "closure" with your peevish co-worker. And please, don't bother telling your spouse how annoying you find their tongue-clicking habit—sometimes honesty is less like a breath of fresh air and more like a fart. That’s the argument of Michael Bennett and Sarah Bennett, the father-daughter duo behind the new self-help book F*ck Feelings.

The elder Bennett is a psychiatrist and American Psychiatric Association distinguished fellow. His daughter is a comedy writer. Together, they provide a tough-love, irreverent take on “life's impossible problems.” The crux of their approach is that life is hard and negative emotions are part of it. The key is to see your “bullshit wishes” for just what they are (bullshit), and instead to pursue real, achievable goals.

Stop trying to forgive your bad parents, they advise. Jerks are capable of having as many kids as anyone else—at least until men’s rights conventions come equipped with free vasectomy booths. If you happen to be the child of a jerk, that's just another obstacle to overcome.

In fact, stop trying to free yourself of all anger and hate. In all likelihood you're doing a really awesome job, the Bennetts argue, despite all the shitty things that happen to you.

Oh, and a word on shit: “Profanity is a source of comfort, clarity, and strength,” they write. “It helps to express anger without blame, to be tough in the face of pain.”

I recently spoke with the Bennetts by phone about what the f*cking deal is with their book. A lightly edited transcript of our conversation follows.

Philosophically Interesting Books for Young Kids

A friend is interested in soliciting philosophically-minded books for young children—ones who are reading, but are not at the chapter-book stage. Here are a few I’ve enjoyed with my kids…

The Big Orange Splot by Daniel Manus Pinkwater — for the young individualist.
A Hole Is To Dig by Ruth Krauss — for the young teleologist.
Pierre: A Cautionary Tale by Maurice Sendak — for the young nihilist.
It Could Always Be Worse by Margot Zemach — for the young.
How To Behave and Why by Munro Leaf — for th

Study: There Are Instructions for Teaching Critical Thinking | Big Think

Whether or not you can teach something as subjective as critical thinking has been up for debate, but a fascinating new study shows that it’s actually quite possible. Experiments performed by Stanford's Department of Physics and Graduate School of Education demonstrate that students can be instructed to think more critically.

It’s difficult to overstate the importance of critical-thinking skills in modern society. The ability to decipher and interpret information, and to offer creative solutions, is directly tied to our intellect.
The study took two groups of students in an introductory physics laboratory course, with one group (the experimental group) instructed to make quantitative comparisons between datasets and the other (the control group) given no such instruction. Comparing data in a scientific manner – that is, being able to measure one’s observations in a statistical or mathematical way – led to striking results for the experimental group.

Even after these instructions were removed, they were 12 times more likely than the control group to offer creative solutions to improve the experimental methods being used in the class, four times more likely to explain the limitations of the methods, and better at explaining their reasoning. The results remained consistent even in the next year, with students in a different class. So what does this imply about critical thinking, and how can we utilize these findings to improve ourselves and our society?

We live in an age with unprecedented access to information. Whether you are contributing to an entry on Wikipedia or reading a meme that has no sources cited (do they ever?), your ability to comprehend what you are reading and weigh it is a constant need. That is why it is so imperative that we have sharp critical-thinking skills. Also, if you don’t use them, you will have nothing to argue about with your family at Thanksgiving. More importantly, critical thinking keeps your brain off junk food and on more of a kale-based diet. Look at any trending topic and test yourself: Is this true/accurate? How do I know either way? Is there a way I can use data (provable, factual information) to figure this out?

Certainly, we can train ourselves to become better critical thinkers, but it’s also important that we teach these skills to kids. Studies have shown how important this ability is to our success, and yet many feel that we’re doing a terrible job of teaching it. This study, however, may lead educators and parents to realize that these skills are teachable. The implications of a better-thinking society are not quantitative, but I do believe they would be extraordinary.

The pronoun 'I' is becoming obsolete

Don't look now, but the pronoun "I" is becoming obsolete.

Recent microbiological research has shown that thinking of plants and animals, including humans, as autonomous individuals is a serious over-simplification.

A series of groundbreaking studies has revealed that what we have always thought of as individuals are actually "biomolecular networks" that consist of visible hosts plus millions of invisible microbes that have a significant effect on how the host develops, the diseases it catches, how it behaves and possibly even its social interactions.

"It's a case of the whole being greater than the sum of its parts," said Seth Bordenstein, associate professor of biological sciences at Vanderbilt University, who has contributed to the body of scientific knowledge that is pointing to the conclusion that symbiotic microbes play a fundamental role in virtually all aspects of plant and animal biology, including the origin of new species.

In this case, the parts are the host and its genome plus the thousands of different species of bacteria living in or on the host, along with all their genomes, collectively known as the microbiome.

(The host is something like the tip of the iceberg while the bacteria are like the part of the iceberg that is underwater: Nine out of every 10 cells in plant and animal bodies are bacterial. But bacterial cells are so much smaller than host cells that they have generally gone unnoticed.)

Will Artificial Intelligence Surpass Our Own?

Famed science-fiction writer Fredric Brown (1906–1972) delighted in creating the shortest of short stories. “Answer,” published in 1954, encapsulated a prescient meditation on the future of human-machine relations within a single double-spaced, typewritten page.

The foreboding of the story echoes current apprehensions of scientists, policy makers and ethicists over the rapid evolution of machine intelligence.

“Answer” begins under the watchful eyes of a dozen television cameras that are recording the ceremonial soldering of the final connection to tie together all the “monster” computers in the universe.

The machines are about to link 96 billion planets into a single “supercircuit” that combines “all the knowledge of all the galaxies.”

Two witnesses on the scene are identified only as Dwar Ev and Dwar Reyn. After throwing the switch that connects the galactic circuit, Dwar Ev suggests to his companion that he ask the machine the first question:

“Thank you,” said Dwar Reyn. “It shall be a question which no single cybernetics machine has been able to answer.”

He turned to face the machine. “Is there a God?”

The mighty voice answered without hesitation, without the clicking of a single relay.

“Yes, now there is a God.”

Sudden fear flashed on the face of Dwar Ev. He leaped to grab the switch.

A bolt of lightning from the cloudless sky struck him down and fused the switch shut.

We are in the midst of a revolution in machine intelligence, the art and engineering practices that let computers perform tasks that, until recently, could be done only by people. There is now software that identifies faces at border crossings and matches them against passports or that labels people and objects in photographs posted to social media. Algorithms can teach themselves to play Atari video games. A camera and chip embedded in top-of-the-line sedans let the vehicles drive autonomously on the open road.

MIT claims to have found a “language universal” that ties all languages together

Language takes an astonishing variety of forms across the world—to such a huge extent that a long-standing debate rages around the question of whether all languages have even a single property in common. Well, there’s a new candidate for the elusive title of “language universal” according to a paper in this week’s issue of PNAS. All languages, the authors say, self-organise in such a way that related concepts stay as close together as possible within a sentence, making it easier to piece together the overall meaning.

Language universals are a big deal because they shed light on heavy questions about human cognition. The most famous proponent of the idea of language universals is Noam Chomsky, who suggested a “universal grammar” that underlies all languages. Finding a property that occurs in every single language would suggest that some element of language is genetically predetermined and perhaps that there is specific brain architecture dedicated to language.

However, other researchers argue that there are vanishingly few candidates for a true language universal. They say that there is enormous diversity at every possible level of linguistic structure from the sentence right down to the individual sounds we make with our mouths (that’s without including sign languages).

There are widespread tendencies across languages, they concede, but they argue that these patterns are just a signal that languages find common solutions to common problems. Without finding a true universal, it’s difficult to make the case that language is a specific cognitive package rather than a more general result of the remarkable capabilities of the human brain.

The false dichotomy of nature-nurture, with notes on feminism, transgenderism, and the construction of races

This is my third essay on what has become an informal series on socially relevant false dichotomies (the first one was on “trigger warnings,” the second one on Islamophobia). On this occasion I’m going to focus again on nature-nurture, perhaps the motherlode of false dichotomies (as well as my area of technical expertise as a practicing biologist).

The occasion is provided by recent controversies concerning the delicate concepts of gender and race, where once again — as in both the cases of trigger warnings and of Islamophobia — I see well intentioned progressives needlessly (in my mind) and harshly attacking fellow progressives, or at the least, people who ought to be their natural political allies. (As in the other two cases, I will ignore contributions from the right and from libertarians, on the ground that I find them both less constructive and less surprising than those from the sources I will be discussing here.)

Let me start with gender. I read with fascination a New York Times op-ed piece by feminist Elinor Burkett entitled “What makes a woman?,” explaining why a number of feminists have issues with certain aspects of the transgender movement, and in particular why Burkett had mixed feelings about the very public coming out of Caitlyn Jenner.

First, Jenner: Burkett says that of course she supports a member of an often vilified gender minority when that person makes the sort of courageous statement that Jenner did by appearing on the cover of Vanity Fair magazine. But, asks Burkett, did Jenner really have to embrace what from a feminist point of view (and yes, I’m perfectly aware that there are different types of feminists, with different points of view) is the stereotype of the babe with big breasts, revealing cleavage, and an unhealthy degree of concern with getting her nails done?