Certain readers may turn, for general solace, to the novels of John Steinbeck. But how many, in particular need of romantic advice, open up Of Mice and Men, East of Eden, or The Grapes of Wrath?
"There are several kinds of love. One is a selfish, mean, grasping, egotistical thing which uses love for self-importance. This is the ugly and crippling kind. The other is an outpouring of everything good in you. "
Algorithms have become a hot topic of political lament in the last few years. The literature is expansive; Christopher Steiner's upcoming book Automate This: How Algorithms Came to Rule Our World attempts to lift the lid on how human agency is largely helpless in the face of precise algorithmic bots that automate the majority of daily life and business.
FQXi catalyzes, supports, and disseminates research on questions at the foundations of physics and cosmology, particularly new frontiers and innovative ideas integral to a deep understanding of reality, but unlikely to be supported by conventional...
Positioning Theory articulates a triadic relation between Normative Positions, Individual Storylines and Speech Acts (or more broadly I guess, agency):
Positions condition the thoughts and ideas of individuals. Those thoughts and ideas take the form of a 'storyline' - the way individuals see the world. From the storyline, agency emerges through speech acts which reproduce and transform the normative situation.
Praying, fighting, dancing, chanting — human rituals could illuminate the growth of community and the origins of civilization.
Rituals are a human universal — “the glue that holds social groups together”, explains Whitehouse, who leads the team of anthropologists, psychologists, historians, economists and archaeologists from 12 universities in the United Kingdom, the United States and Canada. Rituals can vary enormously, from the recitation of prayers in church, to the sometimes violent and humiliating initiations of US college fraternity pledges, to the bleeding of a young man's penis with bamboo razors and pig incisors in purity rituals among the Ilahita Arapesh of New Guinea. But beneath that diversity, Whitehouse believes, rituals are always about building community — which arguably makes them central to understanding how civilization itself began.
Prigogine is best known for his definition of dissipative structures and their role in thermodynamic systems far from equilibrium, a discovery that won him the Nobel Prize in Chemistry in 1977. In summary, Ilya Prigogine discovered that the importation of energy into chemical systems, coupled with the dissipation (export) of entropy, allows local order to emerge without violating the drive toward maximum entropy imposed by the second law of thermodynamics.
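Prigogine's insight can be summarized in one line of entropy bookkeeping (a standard textbook formulation, not taken from the excerpt above): the entropy change of an open system splits into an internal production term and an exchange term with the environment,

```latex
dS = d_i S + d_e S, \qquad d_i S \ge 0 .
```

The second law constrains only the internal production term \(d_i S\); a system that imports energy and exports entropy fast enough (\(d_e S < -d_i S\)) can have \(dS < 0\), so local order, a dissipative structure, can grow without any violation of the second law.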
This is part of a popular hypertext guide to semiotics by Daniel Chandler at the University of Wales, Aberystwyth.
We seem as a species to be driven by a desire to make meanings: above all, we are surely Homo significans - meaning-makers. Distinctively, we make meanings through our creation and interpretation of 'signs'. Indeed, according to Peirce, 'we think only in signs' (Peirce 1931-58, 2.302). Signs take the form of words, images, sounds, odours, flavours, acts or objects, but such things have no intrinsic meaning and become signs only when we invest them with meaning. 'Nothing is a sign unless it is interpreted as a sign', declares Peirce (Peirce 1931-58, 2.172). Anything can be a sign as long as someone interprets it as 'signifying' something - referring to or standing for something other than itself. We interpret things as signs largely unconsciously by relating them to familiar systems of conventions. It is this meaningful use of signs which is at the heart of the concerns of semiotics.
Thus, Deleuze and Guattari conceive the relationship between change – in the form of becoming – and history as one in which processes of becoming pull away from the determinacy of history, turning away from it not in order to dispense with it but to exceed it, reinvigorating the present with "an unhistorical element." They write, for instance, that "Philosophy cannot be reduced to its own history, because it continually wrests itself from this history in order to create new concepts that fall back into history but do not come from it. How could something come from history? Without history, becoming would remain indeterminate and unconditioned, but becoming is not historical. . . . The event itself needs becoming as an unhistorical element" (Deleuze and Guattari 1994: 96; 1991: 92).
What is the greatest human gift? It is metaphor, carrying a cargo of meaning across the oceans that divide us
Eros is coursing through the forest. The forest is mewing with its jaguar life. Life is spiralling into poetry. I am in the other world, I thought, at once in the actual forest and in the forests of the mind where the visible world is not denied but augmented.
Many of Buddhism’s core tenets significantly overlap with findings from modern neurology and neuroscience. So how did Buddhism come close to getting the brain right?
Over the last few decades many Buddhists and quite a few neuroscientists have examined Buddhism and neuroscience, with both groups reporting overlap. I’m sorry to say I have been privately dismissive. One hears this sort of thing all the time, from any religion, and I was sure in this case it would break down upon closer scrutiny. When a scientific discovery seems to support any religious teaching, you can expect members of that religion to become strict empiricists, telling themselves and the world that their belief is grounded in reality. They are always less happy to accept scientific data they feel contradicts their preconceived beliefs. No surprise here; no human likes to be wrong.
Many philosophers consider the era of “modern” philosophy to begin with René Descartes’s Discourse on Method (1637) and Meditations on First Philosophy (1641). In these works, Descartes aims to ground human knowledge of the external, material world.
I argue: we err systematically and pervasively about even the most basic facts of the stream of experience, and even when we set our minds to it carefully and conscientiously. Our knowledge of our immediate physical environment is much better than our knowledge of our stream of experience, and in fact to a large extent our knowledge of our physical environment is the ground of whatever knowledge we do manage to cobble together about our stream of experience.
Humans are aware of their own and others' thoughts in ways unlike any other animal. But why did consciousness evolve?
Consciousness was for a long time the charged third rail of biology: touch it and … well, maybe you didn’t die, but you were unlikely to get a grant, or tenure. Of course, it helped if you were a Nobel laureate, such as Francis Crick, lauded for his work on DNA, or Gerald Edelman, for his work on antibodies. Yet even their attempts to pin down the electrical-chemical-anatomical (or whatever) substrate of consciousness seemed, until recently, likely to go the way of Albert Einstein’s doomed search for a unified theory of everything. However, the situation has changed dramatically in recent years. Inquiry into the neurobiology of consciousness has become one of the hottest, best-funded, and most media-friendly of research enterprises, along with genomics, stem cells and a few other newly favoured sub-disciplines.

For centuries, it was perfectly acceptable for philosophers to ponder consciousness because, after all, no one actually expected them to come up with anything real. Thus, René Descartes’s renowned statement ‘Cogito ergo sum’ (I think therefore I am) becomes, in the words of the early 20th-century American satirist Ambrose Bierce, Cogito cogito ergo cogito sum — I think that I think, therefore I think that I am (which was, according to Bierce, about as close to truth as philosophy was likely to get). Now we have micro-electrodes recording from individual neurons, computer modelling of neural nets, functional MRIs, and an array of even newer 21st-century techniques, all hot on the trail of how consciousness emerges from ‘mere’ matter. Cartesian dualism is on the run, as well it should be.
In this 10-part series from The Open University, Professor Russell Stannard OBE delves into subjects ranging from free will and determinism to space and time.
How can we ever understand the relationship between consciousness and the physical brain? All day long we have to make choices. Are we really free to choose?
What kind of universe do we live in, and what caused the Big Bang? Is there intelligent life out there? What is beyond the observable universe? Will the universe close back on itself?
Why is so-called empty space not simply another name for nothing? We all start off thinking there is just the one time, the same for everyone. Relativity theory shows this not to be the case.
What is the nature of matter, and how complete can our understanding of matter be? What is the Wave/Particle Paradox, and how did Niels Bohr solve it?
Supernatural beliefs might not make sense, but they endure because they're so emotionally satisfying
Nevertheless, it’s not the case that there are no limits to what can be accepted as a religious supernatural belief. Scott Atran and Pascal Boyer have independently pointed out that actual religious superstitions over the whole world constitute a narrow subset of all the arbitrary random superstitions that one could theoretically invent. To quote Pascal Boyer, there is no religion proclaiming anything like the following tenet: “There is only one God! He is omnipotent. But he exists only on Wednesdays.” Instead, the religious supernatural beings in which we believe are surprisingly similar to humans, animals, or other natural objects, except for having superior powers. They are more far-sighted, longer-lived, and stronger, travel faster, can predict the future, can change shape, can pass through walls, and so on. In other respects, gods and ghosts behave like people. The god of the Old Testament got angry, while Greek gods and goddesses became jealous, ate, drank, and had sex. Their powers surpassing human powers are projections of our own personal power fantasies; they can do what we wish we could do ourselves. I do have fantasies of hurling thunderbolts that destroy evil people, and probably many other people share those fantasies of mine, but I have never fantasized about existing only on Wednesdays. Hence it doesn’t surprise me that gods in many religions are pictured as smiting evil-doers, but that no religion holds out the dream of existing just on Wednesdays. Thus, religious supernatural beliefs are irrational, but emotionally plausible and satisfying. That’s why they’re so believable, despite at the same time being rationally implausible.
Welcome to Physics for the 21st Century: an on-line course that explores the frontiers of physics. The 11 units, accompanied by videos, interactive simulations, and a comprehensive Facilitator's Guide, work together to present an overview of key areas of rapidly-advancing knowledge in the field, arranged from the sub-atomic scale to the cosmological. The goal is to make the frontiers of physics accessible to anyone with an inquisitive mind who wants to experience the excitement, probe the mystery, and understand the human aspects of modern physics.
Joshua Glenn and Elizabeth Foy Larsen, editors of the fantastic kids' activity book Unbored, have an article in the Huffington Post about the power of making in the classroom.
In fact, the idea of "learning by doing" stretches back to education legends Maria Montessori and John Dewey, both of whom felt teachers should act more as guides to students' independent discoveries than as founts of information. Decades of research confirm that making and doing things cement knowledge in ways that lectures can't. "Think about the driver and the passenger in a car," says Adele Diamond, Ph.D., a professor of psychiatry at the University of British Columbia and one of the founders of the field of developmental cognitive neuroscience. "The driver is hands-on and the passenger does what students normally do in class, which is sit passively. The driver will learn the route better because she has to actively use the information."
Friedrich Nietzsche was by all accounts an admirer of the Hellenic aesthetic tradition, and would often refer to the ancient myths and tragedies to frame his own philosophy. In the philosopher’s first—and self-admittedly flawed—book, The Birth of Tragedy (1872), Nietzsche presents his views on the development of the ancient Greek dramas, characterizing their growth as an artistic desire to thwart the emergence of pessimism in human expression. He framed this artistic development in terms of the philosophical dichotomy of the Apollonian and Dionysian elements. Much can be written (and has been written) about these two elements as literary concepts, but the simplified idea is that there exists a delicate balance between the human striving for orderliness (the Apollonian element) in light of our innate attraction to chaotic irrationalities (the Dionysian element), in which the two sides are contingent on one another to create an essential harmony of human expression. Nietzsche considered the ancient Athenian dramas of Aeschylus and Sophocles to be the epitome of this dynamic in aesthetic form; i.e. their works signal the birth of tragedy in human art. To Nietzsche this development was the zenith of artistic creation, a perfect balance between opposing drives of the human instinct, whose blending satisfied the artist in man as a whole. Since this time in antiquity, however, we have experienced a decline—a devolution—in the aesthetic development of man. A loss that Nietzsche traces to one fundamental source: Socrates.
Something central, very central, is missing in historical accounts of scientific and technological discovery. The discourse and controversies focus on the role of luck as opposed to teleological programs (from telos, "aim"), that is, ones that rely on pre-set direction from formal science. This is a faux-debate: luck cannot lead to formal research policies; one cannot systematize, formalize, and program randomness. The driver is neither luck nor direction, but must be in the asymmetry (or convexity) of payoffs, a simple mathematical property that has lain hidden from the discourse, and the understanding of which can lead to precise research principles and protocols.
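The "asymmetry (or convexity) of payoffs" is Jensen's inequality in disguise: for a convex payoff function f, the expected payoff E[f(X)] exceeds the payoff of the expected outcome f(E[X]), so many small trials with capped downside and open-ended upside do better than their average-looking prospects suggest. A minimal simulation of that property (the outcome distribution and the loss cap here are illustrative choices, not anything from the essay):

```python
import random

random.seed(0)

def payoff(x: float) -> float:
    """Convex (asymmetric) payoff: losses capped at -1, gains unbounded."""
    return max(x, -1.0)

# Outcomes of many independent tinkering trials (illustrative distribution).
trials = [random.gauss(0.0, 3.0) for _ in range(100_000)]

mean_outcome = sum(trials) / len(trials)                        # roughly 0
expected_payoff = sum(payoff(x) for x in trials) / len(trials)

# Jensen's inequality for convex f: E[f(X)] >= f(E[X]).
# The average trial is worthless, yet the overall strategy is profitable.
print(f"payoff of mean outcome: {payoff(mean_outcome):+.3f}")
print(f"mean payoff:            {expected_payoff:+.3f}")
```

Because the payoff is convex, raising the variance of the trials (more, wilder experiments) raises the expected payoff even while the mean outcome stays near zero; this is the kind of "precise research principle" the convexity argument points to.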