If there is a subtext to the principle of selection, it lies in an idealised notion of American national values, as showcased by Hollywood films for more than a century. Across generations, millions have laughed with the Marx Brothers comedy Duck Soup and sung along with The Sound of Music.
These, and others such as Citizen Kane and Casablanca, make the NASA list as much more than remarkable films. They are cultural articulations of the ethos of America as its establishment seeks to portray it. This ethos includes the championing of its national power, as well as its much-publicised ability to introspect on a national scale through films such as 12 Years A Slave or To Kill a Mockingbird, which also feature in the selection.

The American dream
The great narrative arc of Hollywood film fundamentally reinforces the belief that, its blemishes notwithstanding, there is no country quite like America. Predictable selections, therefore, include films such as The Wizard of Oz and It’s a Wonderful Life, which celebrate conservative ideas about American values, reiterating that “there’s no place like home”. The Seven Samurai, later remade as The Magnificent Seven in Hollywood, remains a rare example of a non-English language film on the list.
In the fictional world, the film hero is the most prominent saviour of the Western way of life, and a fascination with all things heroic underlies the selection. Consequently, James Bond appears many times in the list, as does Die Hard’s incorrigible movie cop John McClane and his television equivalent, the indefatigable Jack Bauer in 24.
Imagine a world in which most people worked only 15 hours a week. They would be paid as much as, or even more than, they now are, because the fruits of their labor would be distributed more evenly across society. Leisure would occupy far more of their waking hours than work. It was exactly this prospect that John Maynard Keynes conjured up in a little essay published in 1930 called "Economic Possibilities for Our Grandchildren." Its thesis was simple. As technological progress made possible an increase in the output of goods per hour worked, people would have to work less and less to satisfy their needs, until in the end they would have to work hardly at all. Then, Keynes wrote, "for the first time since his creation man will be faced with his real, his permanent problem—how to use his freedom from pressing economic cares, how to occupy the leisure, which science and compound interest will have won for him, to live wisely and agreeably and well." He thought this condition might be reached in about 100 years—that is, by 2030.
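Keynes's arithmetic is easy to check. The short calculation below is only an illustration of how "compound interest" does the work in his forecast; the 2 per cent annual growth rate is an assumed figure for the sketch, not one taken from the essay.

```python
# Compound growth in output per hour worked.
def growth_factor(annual_rate: float, years: int) -> float:
    """How many times over output multiplies after compounding."""
    return (1 + annual_rate) ** years

# A steady 2% a year over Keynes's 100-year horizon (1930-2030)
# multiplies hourly output roughly sevenfold, so the same material
# needs could in principle be met with a fraction of the hours.
print(round(growth_factor(0.02, 100), 2))  # ≈ 7.24
```

Even modest productivity growth, compounded over a century, is enough to make Keynes's fifteen-hour week arithmetically plausible, whatever one thinks of his prediction.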
Peter Singer is considered a founding father of the modern animal rights movement. He is a vegan who gives away a third of his income to charity. So why has he been described as the most dangerous man on the planet?
Frisson. What a strange word. It evokes that peculiar intermingling of excitement and fear that can attend momentous events. The spark of electricity when you lock eyes with someone who is yet unknown to you, but who might just be ‘the one.’ The queasy sensation of anxious adrenaline when a big news story breaks. The fearful joy as you plunge downhill on a vertiginous rollercoaster. The word ‘thrill’ perhaps comes close. But not quite. As such, realising that all near-equivalents in English are imperfect, we gladly alight upon the French loanword. And as we do, our existence feels just a little richer and more nuanced.
Almost the entirety of Western literature fits neatly into just six story arcs, according to a new data-mining study.
From the panoply of novels that Western society has produced, distinct narrative patterns emerge, and many attempts have been made to pin down the shape of a story and categorize a protagonist’s journey. French writer Georges Polti claims there are 36 different types of dramatic stories, while others have counted seven narrative arcs, or 20.
But new research from the University of Vermont utilizing data-mining techniques suggests that the majority of the Western canon falls into one of six basic categories.

A Story’s Path
Researchers from the Computational Story Lab looked at over 1,700 books from Project Gutenberg for their study, winnowing out books such as dictionaries and those with fewer than 150 downloads. They analyzed the content of each book by taking samples of text, what they called “windows”, from throughout the story. They used the aptly named “hedonometer”, also developed by the Computational Story Lab, to compile a list of over 10,000 words and rate them on a spectrum from positive to negative using Amazon’s Mechanical Turk service. They published their results last month on arXiv.org.
Adding up these windows over the course of a whole book produced graphs of characters’ fortunes — the highs and lows — throughout a given novel, and generated a broad visualization of the arc the story takes. According to the researchers, these are the six story arcs that appear time and time again in Western literature: “Rags to riches” (the story gets better over time); “Man in a hole” (fortunes fall, but the protagonist bounces back); “Cinderella” (there’s an initial rise in good fortunes, followed by a setback, but a happy ending); “Tragedy” or “riches to rags” (things only get worse); “Oedipus” (bad luck, followed by promise, ending in a final fall); “Icarus” (opens with good fortunes, but doomed to fail).
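The windowing method the researchers describe can be sketched in a few lines. The sketch below is a toy illustration, not the Computational Story Lab's code: the word scores are invented stand-ins for the hedonometer's crowd-sourced happiness ratings, and a real analysis would use the full published lexicon and far larger windows.

```python
# Toy stand-ins for hedonometer-style happiness scores (higher = happier).
TOY_SCORES = {
    "happy": 8.0, "love": 8.4, "win": 7.5,
    "sad": 2.4, "death": 1.5, "lose": 2.8,
}
NEUTRAL = 5.0  # words absent from the lexicon are treated as neutral

def sentiment_arc(words, n_windows=4):
    """Split a word list into n_windows chunks and score each window,
    producing the rough 'fortunes over time' curve described above."""
    size = max(1, len(words) // n_windows)
    arc = []
    for i in range(0, size * n_windows, size):
        window = words[i:i + size]
        score = sum(TOY_SCORES.get(w, NEUTRAL) for w in window) / len(window)
        arc.append(round(score, 2))
    return arc

# A 16-word toy "story" whose mood improves from start to finish.
story = ("sad lose death death sad sad lose sad "
         "love win happy sad happy love win happy").split()
print(sentiment_arc(story))  # a steadily rising, "rags to riches" shape
```

Plotting such an arc for each of 1,700 books, and clustering the resulting curves, is what lets the researchers claim that six basic shapes dominate.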
Scientists working on animal cognition often dwell on their desire to talk to the animals. Oddly enough, this particular desire must have passed me by, because I have never felt it. I am not waiting to hear what my animals have to say about themselves, taking the rather Wittgensteinian position that their message might not be all that enlightening. Even with respect to my fellow humans, I am dubious that language tells us what is going on in their heads. I am surrounded by colleagues who study members of our species by presenting them with questionnaires. They trust the answers they receive and have ways, they assure me, of checking their veracity. But who says that what people say about themselves reveals actual emotions and motivations?
This might be true for simple attitudes free from moralisations (‘What is your favourite music?’), but it seems almost pointless to ask people about their love life, eating habits, or treatment of others (‘Are you pleasant to work with?’). It is far too easy to invent post-hoc reasons for one’s behaviour, to be silent about one’s sexual habits, to downplay excessive eating or drinking, or to present oneself as more admirable than one really is.
No one is going to admit to murderous thoughts, stinginess or being a jerk. People lie all the time, so why would they stop in front of a psychologist who writes down everything they say? In one study, female college students reported more sex partners when they were hooked up to a fake lie-detector machine, demonstrating that they had been lying when interviewed without the lie-detector. I am in fact relieved to work with subjects that don’t talk. I don’t need to worry about the truth of their utterances. Instead of asking them how often they engage in sex, I just count the occasions. I am perfectly happy being an animal watcher.
David Chalmers, who coined the phrase “Hard Problem of consciousness,” is arguably the leading modern advocate for the possibility that physical reality needs to be augmented by some kind of additional ingredient in order to explain consciousness—in particular, to account for the kinds of inner mental experience pinpointed by the Hard Problem. One of his favorite tools has been yet another thought experiment: the philosophical zombie.
Unlike undead zombies, which seek out brains and generate movie franchises, philosophical zombies look and behave exactly like ordinary human beings. Indeed, they are perfectly physically identical to non‐zombie people. The difference is that they are lacking in any inner mental experience. We can ask, and be puzzled about, what it is like to be a bat, or another person. But by definition, there is no “what it is like” to be a zombie. Zombies don’t experience.
Perhaps for athletes, a genius is an Olympic medalist. In entertainment, a genius could be defined as an EGOT winner, someone who has won an Emmy, Grammy, Oscar and Tony award. For Mensa, the exclusive international society comprising members of "high intelligence," someone who scores at or above the 98th percentile on an IQ or other standardized intelligence test could be considered a genius.
The most common definition of genius falls in line with Mensa's approach: someone with exceptional intelligence.

Making a genius
In his new science series "Genius" on PBS, Stephen Hawking is testing out the idea that anyone can "think like a genius." By posing big questions — for instance, "Can we travel through time?" — to people with average intelligence, the famed theoretical physicist aims to find the answers through the sheer power of the human mind.
"It's a fun show that tries to find out if ordinary people are smart enough to think like the greatest minds who ever lived," Hawking said in a statement. "Being an optimist, I think they will." [Mad Geniuses: 10 Odd Tales About Famous Scientists]
Optimism aside, answering a genius-level question does not a genius make — at least, not according to psychologist Frank Lawlis, supervisory testing director for American Mensa.
"The geniuses ask questions. They don't know the answers, but they know a lot of questions and their curiosity takes them into their fields," Lawlis told Live Science. "[They're] somebody that has the capacity to inquire at that high level and to be curious to pursue that high level of understanding and then be able to communicate it to the rest of us."
The importance of loving yourself is a common catchphrase among feel-good gurus and the subject of countless self-help books.
But Harvard University’s Michael Puett argues that loving yourself—and all your flaws—can actually be quite harmful. Puett, who earlier this year published a book on what Chinese philosophy can teach us about the good life, suggests that ancient Chinese philosophers would strongly disapprove of today’s penchant for self-affirmation.
Quartz spoke to Puett as part of an occasional series that attempts to apply serious thinking from the world of philosophy to everyday life. What can great thinkers teach us about how to navigate our career paths? Do people ever really change? Can philosophy inform our search for true love? The Chinese philosophy Puett studies raises questions about whether we should accept and celebrate ourselves as we are, or strive to change and improve upon our fundamental nature. And, for that matter, does our “fundamental nature” even exist?
“The common assumption most of us make about the self is that our goal as individuals is to look within, find our true selves, and try to be as authentic and true to ourselves as we can be,” Puett says. “But this assumes we have a stable self.”
By contrast, much of the Chinese philosophical tradition derived from Confucius envisions “the self” as more of a messy product of habit than a clearly-defined inner essence. “From a very young age, we’ll form patterns of responding to the world. Those patterns will harden and become what we mistakenly call a personality,” adds Puett.
Philosophy is often the whipping boy of the supposedly “easy degrees”. No one takes you seriously if you study it, and people assume you can easily get a 2:1 just by turning up.
This couldn’t be further from the truth. Unlike with Geography, where if you bring your 24 pack of Crayola you’re pretty much set, I have to try and decide whether the crayons are actually real or not.
I’m often accused of spending £9,000 a year to sit around and think, and this is largely accurate. Maybe you haven’t tried it for a while, but thinking is actually really hard. With subjects like Economics and History you are spoon-fed theories and facts to learn, with your lecturer and seminar tutor holding your hand all the way. Well done! You did really well in that exam where you wrote down everything you were told to. The difference between your average student and a trained monkey is that the monkey probably dresses slightly better.
In Philosophy you are faced with dilemmas like the trolley problem; there is no economic model that you can plug the information into to get an answer. You have to figure it all out for yourself. As an inherently flawed 21-year-old male, I can barely make myself breakfast in the morning, and am shocked when I make it to the end of each day in one piece. In what sort of world can I be expected to offer anything coherent on the ethical dilemmas that are asked of me in essays and exams? How am I supposed to get a good mark when I am faced with modules such as ‘The Philosophy of Time’?
People who study other degrees are lucky: only in Philosophy are you subjected to an existential crisis every time you set foot in a seminar room.
A poet, somewhere in Siberia, or the Balkans, or West Africa, some time in the past 60,000 years, recites thousands of memorised lines in the course of an evening. The lines are packed with fixed epithets and clichés. The bard is not concerned with originality, but with intonation and delivery: he or she is perfectly attuned to the circumstances of the day, and to the mood and expectations of his or her listeners.
If this were happening 6,000-plus years ago, the poet’s words would in no way have been anchored in visible signs, in text. For the vast majority of the time that human beings have been on Earth, words have had no worldly reality other than the sound made when they are spoken.
As the theorist Walter J Ong pointed out in Orality and Literacy: The Technologizing of the Word (1982), it is difficult, perhaps even impossible, now to imagine how differently language would have been experienced in a culture of ‘primary orality’. There would be nowhere to ‘look up a word’, no authoritative source telling us the shape the word ‘actually’ takes. There would be no way to affirm the word’s existence at all except by speaking it – and this necessary condition of survival is important for understanding the relatively repetitive nature of epic poetry. Say it over and over again, or it will slip away. In the absence of fixed, textual anchors for words, there would be a sharp sense that language is charged with power, almost magic: the idea that words, when spoken, can bring about new states of affairs in the world. They do not so much describe, as invoke.
As a consequence of the development of writing, first in the ancient Near East and soon after in Greece, old habits of thought began to die out, and certain other, previously latent, mental faculties began to express themselves. Words were now anchored and, though spellings could change from one generation to another, or one region to another, there were now physical traces that endured, which could be transmitted, consulted and pointed to in settling questions about the use or authority of spoken language.
I remember my grandfather commenting—wry amusement tinged with grim resignation—that what made him finally feel old was seeing his children reach middle age. I was a child then. Now I see my own children, not quite middle aged, starting to have children of their own.
Becoming a grandparent is quite lovely, an affirmation of continuity and a front-row-seat to watch (and even, on occasion, participate) as life itself is conveyed into the future. But aging is also our most undeniable memento mori, a reminder not so much of life as one’s own eventual death. My grandfather’s death frightened me as few things have since, except for the recurring recognition (usually at night, alone, in the dark) that his life, everyone’s life, even—astoundingly—my own, is short indeed.
All things, especially living ones, are marinating in the river of time. We see and understand that our bodies will wear out and we will die. At least that’s how it looks through the lens of Western science, where all things come to an end, winding down in a final surrender to entropy. But there’s another perspective, surprisingly in harmony with science, that helps us revisit that huge and ancient terror—fear of time itself—in a new and perhaps even reassuring way. And that is the perspective offered by Buddhism.
For Buddhists, the “center cannot hold,” as the poet W.B. Yeats pointed out, because it doesn’t exist as something rigidly separate from everything else. Nothing is permanent and unchanging, ourselves included. Attempting to cling to a solid, immutable core of a self is a fool’s errand because time not only creates anarchy, it provides the unavoidable matrix within which everything—animate and inanimate, sentient and insensate—ebbs and flows.
Every day, it seems, some verifiably intelligent person tells us that we don’t know what consciousness is. The nature of consciousness, they say, is an awesome mystery. It’s the ultimate hard problem. The current Wikipedia entry is typical: Consciousness “is the most mysterious aspect of our lives”; philosophers “have struggled to comprehend the nature of consciousness.”
I find this odd because we know exactly what consciousness is — where by “consciousness” I mean what most people mean in this debate: experience of any kind whatever. It’s the most familiar thing there is, whether it’s experience of emotion, pain, understanding what someone is saying, seeing, hearing, touching, tasting or feeling. It is in fact the only thing in the universe whose ultimate intrinsic nature we can claim to know. It is utterly unmysterious.
The nature of physical stuff, by contrast, is deeply mysterious, and physics grows stranger by the hour. (Richard Feynman’s remark about quantum theory — “I think I can safely say that nobody understands quantum mechanics” — seems as true as ever.) Or rather, more carefully: The nature of physical stuff is mysterious except insofar as consciousness is itself a form of physical stuff. This point, which is at first extremely startling, was well put by Bertrand Russell in the 1950s in his essay “Mind and Matter”: “We know nothing about the intrinsic quality of physical events,” he wrote, “except when these are mental events that we directly experience.” In having conscious experience, he claims, we learn something about the intrinsic nature of physical stuff, for conscious experience is itself a form of physical stuff.
“The question of being is the darkest in all philosophy.” So concluded William James in thinking about that most basic of riddles: how did something come from nothing? The question infuriates, James realized, because it demands an explanation while denying the very possibility of explanation. “From nothing to being there is no logical bridge,” he wrote.
In science, explanations are built of cause and effect. But if nothing is truly nothing, it lacks the power to cause. It’s not simply that we can’t find the right explanation—it’s that explanation itself fails in the face of nothing.
Imagine a Martian zoologist, visiting Earth and observing Homo sapiens for the first time. He, she or it would see a species of primate that differs from the others in many ways, all of them involving our complex cultural, intellectual, linguistic, symbolic and technological lifestyles. But looking at us through a zoologist’s lens, our observer wouldn’t be especially impressed. To be sure, we have some distinctive anatomical traits (mostly hairless, bipedal, big brains, non-prognathic jaws, unimpressive teeth, and so forth) but being unique isn’t itself unique. Every species is special in its own way.
Among our catalogue of not-so-special traits would be the fact that men are on the whole larger than women: about 7 per cent taller and 15 per cent heavier, with this difference somewhat greater when it comes to muscularity. Also notable: men outnumber women when it comes to lethal violence by a factor of roughly 10:1, a differential found not only cross-culturally among adults, but even recognisable among young children (as a proclivity for violence).
Given these facts, our zoologist would strongly suspect that these humans are paradigmatic harem-holding mammals, notwithstanding the fact that, in the Western world at least, monogamy is the designated standard. In our sexual dimorphism (physical and behavioural male-female differences), we fit the normal polygynous profile for all other animal species. This profile arises as a result of sexual selection, whereby males compete with other males, and more fit males garner a payoff of enhanced reproductive success via an increased number of sexual partners.
This diagnosis of polygyny would be enhanced if the observer visited a high school: girls are physically and socially more mature than same-age boys (to the consternation of both). This pattern, known as sexual bimaturism, is also a polygyny give-away, if rather a counter-intuitive one. In order to reproduce, women undergo considerably more physiological stress than do men; they must nourish an embryo in utero, give birth and then lactate. By contrast, men need only produce a few cubic centimetres of semen. One might expect that males would mature sexually earlier than females since so much less is required of them, but this is not the case. In polygynous species, males must participate in fierce same-sex competition if they are to reproduce at all. Woe betide a male who enters the reproductive arena when too young, small, weak and inexperienced. Just as the degree of sexual dimorphism maps very closely upon the degree of polygyny (average harem size) in a species, the extent of sexual bimaturism is also strongly correlated with the extent to which males compete with each other for access to females. Humans fall into the moderate polygynous part of that spectrum.
There is a well-documented organ shortage throughout the world. For example, 3,000 kidney transplants were made last year in the United Kingdom, but that still left 5,000 people on the waiting list at the end of the period. A lucrative trade in organs has grown up, and transplant tourism has become relatively common. While politicians wring their hands about sensible solutions to the shortage, including the nudge of opt-out donation, scientists using genetic manipulations have been making significant progress in growing transplantable organs inside pigs.
Scientists in the United States are creating so-called ‘human-pig chimeras’ which will be capable of growing the much-needed organs. These chimeras are animals that combine human and pig characteristics. They are like mules that will provide organs that can be transplanted into humans. A mule is the offspring of a male donkey (jack) and a female horse (mare). Horses and donkeys are different species with different numbers of chromosomes, but they can breed together.
In this case, the scientists take a skin cell from a human and from this make stem cells capable of producing any cell or tissue in the body, known as ‘induced pluripotent stem cells’. They then inject these into a pig embryo to make a human-pig chimera. In order to create the desired organ, they use gene editing, or CRISPR, to knock out the pig genes in the embryo that produce, for example, the pancreas. The human stem cells then make an almost entirely human pancreas in the resulting human-pig chimera, with just the blood vessels remaining porcine. Using this controversial technology, a human skin cell, pre-treated and injected into a genetically edited pig embryo, could grow a new liver, heart, pancreas or lung as required.
This is a technique with wider possibilities, too: other US teams are working on a chimera-based treatment, this time for Parkinson’s disease, which will use chimeras to create human neurones. CRISPR is also credited with enhancing the safety of this technique: last year, a team from Harvard was able to use the new and revolutionary technique to remove copies of a pig retrovirus.
Safety is always a major concern when science allows new medical developments. But even if a sufficient guarantee of safety could be achieved, there are further ethical problems that should concern us.
A chimera is a genetic mix. This means that, although the aim might be to isolate only certain organs to express human genetic material, the whole chimera will in fact comprise the genetic material of both humans and pigs. It is not a pig with a human pancreas inserted into it – it is a human-animal chimera, whose pancreas resembles a human’s, and whose other organs are a blend of pig and human. This could affect the chimera’s brain. Pablo Ross, the lead researcher in the pig experiment, is quoted by the BBC as saying: ‘We think there is very low potential for a human brain to grow.’ Even if in this particular case he is correct, given that some of this kind of research is indeed focused on neurons, it is possible that some future chimeras will develop human or human-like brains.
Where the genetic material of humans and animals is mixed, this might result in characteristics that we usually think of as having moral relevance. ‘Moral status’ is the standing or position of a being within a hierarchical framework of moral obligations. The moral status of a chimera entails relevant obligations to treat it in certain ways while it is alive, in virtue of its nature, and has implications for whether it is wrong to kill it.
Where is your mind? Where does your thinking occur? Where are your beliefs? René Descartes thought that the mind was an immaterial soul, housed in the pineal gland near the centre of the brain. Nowadays, by contrast, we tend to identify the mind with the brain. We know that mental processes depend on brain processes, and that different brain regions are responsible for different functions. However, we still agree with Descartes on one thing: we still think of the mind as (in a phrase coined by the philosopher of mind Andy Clark) brainbound, locked away in the head, communicating with the body and wider world but separate from them. And this might be quite wrong. I’m not suggesting that the mind is non-physical or doubting that the brain is central to it; but it could be that (as Clark and others argue) the mind extends beyond the brain.
An astronomer, mathematician, philosopher, and active public figure, Hypatia played a leading role in Alexandrian civic affairs. Her public lectures were popular, and her technical contributions to geometry, astronomy, number theory, and philosophy made Hypatia a highly regarded teacher and scholar.
Philosophy is a remarkably un-diverse discipline. Compared with other scholars who read, interpret and assign texts, philosophers in the United States typically choose a much higher percentage of their sources (often, 100 per cent) from Europe and countries settled by Europeans. Philosophy teachers, too, look homogeneous: 86 per cent of new PhD researchers in philosophy are white, and 72 per cent are male. In the whole country, only about 30 African-American women work as philosophy professors.
In The New York Times’ philosophy blog ‘The Stone’, Jay L Garfield and Bryan W Van Norden recently wrote: ‘No other humanities discipline demonstrates this systematic neglect of most of the civilizations in its domain.’ They urge philosophy departments to diversify their curricula – and, if they can’t or won’t, to rename themselves departments of ‘Anglo-European Philosophical Studies’.
In my own view, philosophy must become more diverse in order to make progress on its fundamental questions. But cultural diversity means something different in philosophy, compared with other humanities disciplines.
The humanities primarily seek to understand what other people mean. Interpretation requires sensitivity, empathy and openness. A humanistic discipline should range across all of human experience, past and present, investigating and learning what people have thought throughout the world.
In contrast, many scientists see themselves as part of one global, culturally neutral, 21st-century community devoted to understanding nature itself. For instance, the physicist Freeman Dyson wrote recently that Fang Lizhi, the late Chinese astrophysicist and dissident, ‘believed passionately in science, not only as an intellectual pursuit of understanding of nature, but also as an international enterprise in which people of diverse cultures and traditions could work together. Scientists throughout the world speak a common language and find it easy to collaborate.’
Bryan Magee (1930 – ) has had a multifaceted career as a professor of philosophy, music and theater critic, BBC broadcaster, public intellectual and member of Parliament. He has starred in two acclaimed television series about philosophy: Men of Ideas (1978) and The Great Philosophers (1987). He is best known as a popularizer of philosophy. His easy-to-read books, which have been translated into more than twenty languages, include:
Confessions of a Philosopher: A Personal Journey Through Western Philosophy from Plato to Popper; The Great Philosophers: An Introduction to Western Philosophy; Talking Philosophy: Dialogues with Fifteen Leading Philosophers; Philosophy and the Real World: An Introduction to Karl Popper; The Story of Philosophy: 2,500 Years of Great Thinkers from Socrates to the Existentialists and Beyond; Men of Ideas.
Now, at age 86, he has written Ultimate Questions, a summary of a lifetime of thinking about “the fundamentals of the human condition.” Its basic theme is that we know little about the human condition, since reality comes to us filtered through the senses and the limitations of our intellect and language. And the most honest response to this predicament is agnosticism.
Magee begins by considering that “What we call civilization has existed for something like six thousand years.” If you remember that there have always been some individuals who have lived a hundred years, this means that “the whole of civilization has occurred within the successive lifetimes of sixty people …” Furthermore, “most people are as provincial in time as they are in space: they huddle down into their time and regard it as their total environment…” They don’t think about the little sliver of time and space that they occupy. Thus begins this meditation on agnosticism.
Furthermore, we are ignorant of our ultimate nature: “We, who do not know what we are, have to fashion lives for ourselves in a universe of which we know little and understand less.” Yet this situation doesn’t lead Magee to despair. Instead he calls for “an active agnosticism,” which is “a positive principle of procedure, an openness to the fact that we do not know, followed by intellectually honest enquiry in full receptivity of mind.” If he had to choose a tag, he says, it would be “the agnostic.”
Many people cheat on taxes—no mystery there. But many people don’t, even if they wouldn’t be caught—now, that’s weird. Or is it? Psychologists are deeply perplexed by human moral behavior, because it often doesn’t seem to make any logical sense. You might think that we should just be grateful for it. But if we could understand these seemingly irrational acts, perhaps we could encourage more of them.
It’s not as though people haven’t been trying to fathom our moral instincts; it is one of the oldest concerns of philosophy and theology. But what distinguishes the project today is the sheer variety of academic disciplines it brings together: not just moral philosophy and psychology, but also biology, economics, mathematics, and computer science. They do not merely contemplate the rationale for moral beliefs, but study how morality operates in the real world, or fails to. David Rand of Yale University epitomizes the breadth of this science, ranging from abstract equations to large-scale societal interventions. “I’m a weird person,” he says, “who has a foot in each world, of model-making and of actual experiments and psychological theory building.”
It was going to be the biggest presentation of my life — my first appearance on the TED Conference main stage — and I had already thrown out seven drafts. Searching for a new direction, I asked colleagues and friends for suggestions. “The most important thing,” the first one said, “is to be yourself.” The next six people I asked gave me the same tip.
We are in the Age of Authenticity, where “be yourself” is the defining advice in life, love and career. Authenticity means erasing the gap between what you firmly believe inside and what you reveal to the outside world. As Brené Brown, a research professor at the University of Houston, defines it, authenticity is “the choice to let our true selves be seen.”
We want to live authentic lives, marry authentic partners, work for an authentic boss, vote for an authentic president. In university commencement speeches, “Be true to yourself” is one of the most common themes (behind “Expand your horizons,” and just ahead of “Never give up”).
“I certainly had no idea that being your authentic self could get you as rich as I have become,” Oprah Winfrey said jokingly a few years ago. “If I’d known that, I’d have tried it a lot earlier.”
But for most people, “be yourself” is actually terrible advice.
If I can be authentic for a moment: Nobody wants to see your true self. We all have thoughts and feelings that we believe are fundamental to our lives, but that are better left unspoken.
A decade ago, the author A. J. Jacobs spent a few weeks trying to be totally authentic. He announced to an editor that he would try to sleep with her if he were single and informed his nanny that he would like to go on a date with her if his wife left him. He told a friend’s 5-year-old daughter that the beetle in her hands was not napping but dead. He told his in-laws that their conversation was boring. You can imagine how his experiment worked out.
“Deceit makes our world go round,” he concluded. “Without lies, marriages would crumble, workers would be fired, egos would be shattered, governments would collapse.”
How much you aim for authenticity depends on a personality trait called self-monitoring. If you’re a high self-monitor, you’re constantly scanning your environment for social cues and adjusting accordingly. You hate social awkwardness and desperately want to avoid offending anyone.
But if you’re a low self-monitor, you’re guided more by your inner states, regardless of your circumstances. In one fascinating study, when a steak landed on their plates, high self-monitors tasted it before pouring salt, whereas low self-monitors salted it first. As the psychologist Brian Little explains, “It is as though low self-monitors know their salt personalities very well.”
When I began studying how animals swim, I didn’t feel much like a physicist. I’d just finished my bachelor’s in physics during which time I’d been taught that physicists work on one of a handful of buzzwords: quantum mechanics, cosmology, gauge theory, and so on. To see if graduate school was right for me, I shadowed a friendly research group at the University of California, San Diego—but they didn’t study any of these buzzwords. They used high-powered mathematics to understand things like the locomotion of snails, worms, and microorganisms.
I was grateful for the opportunity, and I thought the problems they studied were beautiful and interesting—just not fundamental physics. As I became more involved in the group, this distinction grew into an identity crisis. Theoretical physicists are kind of like artists, or athletes: If you feel yourself drifting further from Klee or Peyton Manning, it can seem like a catastrophe. I thought I could feel Einstein and Feynman looking down at me and frowning as I took a turn down the wrong path.
It would take some impressive feats by microorganisms to convince me that they were as sexy as smashing atoms together—and they did not fail to deliver. Some are capable of shooting small needles, or even segments of their DNA, which accelerate about 1,000 times faster than a space shuttle launch; others share genetic information with their unrelated neighbors, forming an Internet many millennia older than ours; many outlive us, despite being 1 million times smaller. Even more interesting, the motion of microorganisms defies the Newtonian intuitions that govern everyday motion and form a pillar of classical physics.
These remarkable facts changed not only how I perceived bacteria, but how I defined what it meant to be a physicist.
Few people have influenced contemporary philosophy of mind as profoundly as the late Hilary Putnam. One of his best known contributions was the formulation of functionalism. As he understood it, functionalism claims that mental states are functional states—postulates of abstract descriptions, like those employed in computer science, which ignore a system’s physical details and focus instead on the ways it correlates inputs with outputs. Psychological descriptions in particular focus on the ways a system correlates sensory inputs with behavioral outputs, and mental states are the internal states that correlate the two.
By the mid-1970s functionalism had become the dominant outlook in philosophy of mind. But Putnam, showing his characteristic independence of mind, became dissatisfied with the view. He did not retreat to substance dualism or idealism. He was convinced that we are physical beings whose capacities are essentially embodied in the physical mechanisms that compose us, yet he was also a committed antireductionist. He denied that physics, chemistry, and neuroscience could yield an exhaustive account of what we are and what we can do. In articulating a pro-physical yet anti-reductive view along these lines, Putnam found inspiration in a new source: Aristotle. Aristotle’s ideas had been dismissed in many quarters of the philosophical world as expressions of a bygone pre-scientific age. But Putnam saw through the dismissive haze to the empirically and philosophically respectable core of Aristotle’s philosophy, ‘hylomorphism’.