A one-dimensional working model of a programmable vibration-damping material. Each stub has a piezoelectric disc (which converts mechanical energy to electrical energy, and vice versa).
Researchers from Empa and ETH Zurich have developed a prototype of a selective vibration-damping material that they claim “could change the world of mechanics forever” as a step toward “programmable materials.”
Described in the journal Advanced Materials, this “material of the future” can damp mechanical vibrations completely or selectively suppress specific vibration frequencies or ranges of frequencies.
The one-dimensional working model consists of a simple aluminum sheet-metal strip, measuring one meter by one centimeter and one millimeter thick and designed to vibrate at different frequencies. To control the wave propagation through the plate, ten small aluminum cylinders (7 mm thick, 1 cm high) are attached to the metal.
Between the sheet and the cylinders sit piezoelectric discs, which can be stimulated electronically to instantly change their thickness.
That allows for controlling exactly how waves are allowed to propagate in the sheet-metal strip. The aluminum strip thus turns into an “adaptive phononic crystal” — a material with controllable vibration properties.
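The band-gap behavior behind a phononic crystal can be illustrated with a textbook toy model: a one-dimensional chain of alternating masses joined by identical springs. This is a simplified stand-in, not the Empa/ETH device (which tunes its unit cell piezoelectrically), but it shows how a periodic structure opens a "blocked band" of frequencies at which waves cannot propagate:

```python
import math

# Hypothetical parameters for a diatomic mass-spring chain.
k = 1.0            # spring constant
m1, m2 = 1.0, 2.0  # alternating masses in each unit cell
s = 1 / m1 + 1 / m2

def branches(x):
    """Dispersion relation at reduced wavevector x = q*a/2."""
    root = math.sqrt(s**2 - 4 * math.sin(x)**2 / (m1 * m2))
    return math.sqrt(k * (s - root)), math.sqrt(k * (s + root))

# Sample the Brillouin zone from x = 0 to x = pi/2.
xs = [i * (math.pi / 2) / 200 for i in range(201)]
acoustic, optical = zip(*(branches(x) for x in xs))

gap_bottom = max(acoustic)  # sqrt(2k/m2): top of the lower (acoustic) branch
gap_top = min(optical)      # sqrt(2k/m1): bottom of the upper (optical) branch
print(f"blocked band: {gap_bottom:.3f} to {gap_top:.3f}")
```

Any vibration whose frequency falls between `gap_bottom` and `gap_top` cannot travel through the chain; making the masses equal closes the gap, which is the passive analogue of switching the strip's damping on and off.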
The Golden Age of universities may be dead. And while much of the commentary around the online disruption of education ranges from cost-benefit analyses to assessments of the ideology driving MOOCs (massive open online courses), the real question becomes: what is the point of the university in this landscape?
It’s clear that universities will have to figure out the balance between commercial relevance and basic research, as well as how to prove their value beyond being vehicles for delivering content. But lost in the shuffle of commentary here is something arguably more important than and yet containing all of these factors: culture.
Online courses can be part of, and have, their own culture, but university culture cannot be replicated in an online environment (at least not easily). Once this cultural difference is acknowledged, we can revisit the cost-benefit analysis: Is cheaper tuition worth it if it pays for education that isn’t optimized for innovation? Will university culture further stratify the socioeconomic difference MOOCs may level? And so on…
While innovation is a buzzword that’s bandied about a bit too loosely, we think this is the lens we need to use in judging the relevance of universities. It’s the only thing that prevents us from programming students as robots, a workforce whose jobs can be automated away. In fact, universities that excel at preparing students for such a creative economy prioritize the same three things that drive successful startup cultures: density, shared resources, and community.
Using ordinary fishing line, researchers have crafted coiled muscles that could revolutionize prosthetics and robotic exoskeletons.
Next time you spot a muscly athlete showing off at the gym, try out this compliment: “Wow! You’ve got arms like fishing line.”
Though it may not be taken well, it’s actually a flattering comparison. Scientists at the University of Texas at Dallas have designed super strong artificial muscles by simply twisting and coiling ordinary fishing line. The coiled muscles can lift more than 100 times the weight of a human muscle of the same size, and generate as much mechanical power per kilogram as a jet engine — perhaps offering an inexpensive new material to move prosthetics and robotic exoskeletons.
On a smaller scale, the twisted yarns of polymers could also one day yield clothing with pores that open and close based on temperature, or climate-controlled window shutters.
“There are many types of artificial muscles that have been talked about in the literature for years,” said the study’s lead author Ray Baughman, director of the Alan G. MacDiarmid NanoTech Institute at the University of Texas at Dallas. “Very few are commercially used.”
Albert Einstein accepted the modern cosmological view that the universe is expanding long after many of his contemporaries. Until 1931, he believed that the universe was static. An urban legend attributes his change of perspective to the moment when American astronomer Edwin Hubble showed him his observations of redshift in the light emitted by faraway nebulae -- today known as galaxies. But the reality is more complex. Einstein's change of viewpoint, in fact, resulted from a tortuous thought process. Now researchers explain how he changed his mind following many encounters with some of the most influential astrophysicists of his generation.
Penn State University chemists and engineers have, for the first time, placed tiny synthetic motors inside live human cells in a lab, propelled them with ultrasonic waves, and steered them magnetically.
The Penn State nanomotors are the closest so far to a “Fantastic Voyage” concept (without the miniature people).
The nanomotors, which are rocket-shaped gold rods ~300 nanometers in diameter and ~3 microns long, move around inside the cells, spinning and battering against the cell membrane.
The nanomotors are activated by resonant ultrasound operating at ~4 MHz, and show axial propulsion as well as spinning.
Santa Fe, New Mexico (PRWEB) February 12, 2014 -- M. Alexander Nugent Consulting of Santa Fe, New Mexico, a private R&D company, announces the publication of Alex Nugent and Timothy Molter's PLOS ONE paper "AHaH Computing - From Metastable Switches to Attractors to Machine Learning". The paper describes a new form of computing based on the attractor dynamics of dissipative systems and details a path from memristor-based circuits to foundational machine learning functions.
A new form of computing based on the attractor dynamics of dissipative systems has been shown to lead to solutions in machine learning and universal logic. In the newly published PLOS ONE paper “AHaH Computing—From Metastable Switches to Attractors to Machine Learning”, authors Alex Nugent and Timothy Molter detail a path from memristor-based circuits to foundational machine learning functions such as pattern classification, prediction, clustering, combinatorial optimization and robotic arm actuation. The main aim of the research is to better understand how nature utilizes the laws of thermodynamics and self-organization to compute. This knowledge is being directed toward the creation of a new type of adaptive neural processing unit (NPU) called “Thermodynamic RAM”.
Nugent and Molter demonstrate that the AHaH Node is a computationally universal building block that can serve as the foundation for a new adaptive computing substrate that meshes memory and processing. This has big implications for power and space efficiency, as it addresses the von Neumann bottleneck.
Nanoparticle inks can turn your existing 2D printer into a circuit board production line – and the possibilities for 3D printers are mind-boggling
Printing foldable mobile phones on a sheet of paper from a normal 2D printer is just a decade away, according to Jürgen Steimle, head of the Embodied Interaction Group at the Max Planck Institute for Informatics in Saarbrücken, Germany. Steimle and his colleagues took a step towards this in 2013, when they used a standard printer loaded with nanoparticle ink to print a paper circuit that works even after the sheet is torn.
In the past couple of years, similar applications have popped up in laboratories around the world. "People are starting to realise the power of printing," says Vincent Rotello, a chemist at the University of Massachusetts Amherst, who is working on a printable test strip for pathogenic bacteria in water.
The convergence of nano and printing is partially due to the success of one eye-grabbing device, the 3D printer, which produces objects to a three-dimensional template by extruding soft plastic noodles that rapidly consolidate into the shape of the desired object. Scientists are now adding nanoparticles to the plastic, thereby giving these products "smart" properties, but the humble 2D printer, which is far more commonplace, is being revitalised by nanoparticle ink.
The virtual currency is about more than money – the real innovation is what people are doing with the technology it is based on
BITCOIN has been called many things, from the future of money to a drug dealer's dream and everything else in between. But beyond creating the web's first native currency, the true innovation of Bitcoin's mysterious designer, Satoshi Nakamoto, is its underlying technology, the "block chain". That fundamental concept is being used to transform Bitcoin – and could even replace it altogether.
So what is the block chain? It is a ledger of transactions that keeps Bitcoin secure and allows all users to agree on exactly who owns how many bitcoins. Each new block requires a record of recent transactions along with a string of letters and numbers, known as a hash, which is based on the previous block and produced using a cryptographic algorithm.
Miners, people who run the peer-to-peer Bitcoin software, randomly generate hashes, competing to produce one with a value below a certain target difficulty and thus complete a new block and receive a reward, currently 25 bitcoins. This difficulty means faking a transaction is impossible unless you have more computing power than everyone else on the Bitcoin network combined. Confused? Don't worry, ordinary Bitcoin users needn't know the details of how the block chain works, just as people with a credit card don't bother learning banking network jargon. But those who do understand the power of the block chain are realising how Nakamoto's technology for mass agreement can be adapted. "You can replace that agreement with all sorts of different things and now you have a really powerful building block for any kind of distributed system," says Jeremy Clark of Concordia University in Montreal, Canada.
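The hash-below-target race described above can be sketched in a few lines of Python. This is a toy illustration only: the block format, previous-block hash, and difficulty scheme here are simplified stand-ins for Bitcoin's real protocol, which hashes a structured block header twice with SHA-256 against a numeric target.

```python
import hashlib

def mine_block(prev_hash: str, transactions: str, difficulty: int = 4):
    """Toy proof-of-work: find a nonce whose SHA-256 digest has
    `difficulty` leading hex zeros. Guess-and-check is the whole trick:
    there is no shortcut other than trying nonces one after another."""
    target = "0" * difficulty
    nonce = 0
    while True:
        candidate = f"{prev_hash}{transactions}{nonce}".encode()
        digest = hashlib.sha256(candidate).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# Hypothetical previous-block hash and transaction record:
nonce, digest = mine_block("0000abc", "alice->bob:5")
print(nonce, digest)  # digest begins with "0000"
```

Because each candidate digest is unpredictable, the only way to complete a block faster is to try more nonces per second, which is why faking a transaction would require out-hashing the rest of the network combined.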
Most qualities we think of as particularly 'human' can be seen elsewhere in the animal kingdom, thanks to evolution, writes Alice Roberts
Just how special do you think you are? How different do you think you are from other animals? Do you think of yourself as an animal or do you see yourself, and your fellow humans, as somehow set apart from the rest of the animal kingdom?
Most of us – and I would unashamedly label us as the sensible majority of the population – accept that evolution is the best explanation for the pattern of life that we observe on the planet, both living and fossilised. However much creationists bang on about evolution being "just a theory", it beautifully explains all the evidence we have to hand (and there's masses of that: anatomical, genetic, palaeontological, embryological), without a single piece of evidence having turned up that threatens to bring the whole edifice tumbling down around our ears.
So, I'm hoping you're a sensible sort of person and that you consider evolution to be as true as the spherical nature of the Earth, or the fact that the Earth orbits the sun and not vice versa. But just how comfortable are you with the idea of being a product of evolution? I think it's still, even among the most enlightened of us, really hard to come to terms with the idea that we are just another animal. A naked ape. The third chimpanzee, even. You have to admit, science has done a very good job at bringing us down a peg or two, at knocking us off the pedestal of our own construction. We can no longer view ourselves as a special creation, something created in the image of a deity and close to angels (whatever they are or look like). We can no longer see ourselves as the ultimate destination, as the pinnacle of evolution, either. Our species is just a tiny twig on the massive, dense tree of life. But that's so difficult to stomach.
Stanislaw Lem's forgotten masterwork Summa Technologiae, now in English half a century after publication, is a heady mix of prescience, philosophy and irony
The book will be a fabulous shock to those who know only his science fiction, such as Solaris or The Cyberiad. Others will have caught tantalising glimpses of Summa, published in 1964, in a few essays. Diehards may even have read it in translation, notably German or Russian.
The English version has been translated by Joanna Zylinska, professor of new media and communications at Goldsmiths, University of London. Zylinska's work is at once wildly imaginative and painstakingly precise; sometimes one wishes, in the later chapters, that she would be a little more slapdash and cut to the chase, but this, of course, is Lem's fault, not hers.
Summa is not for the faint-hearted. Starting with a title that pastiches Thomas Aquinas's 13th-century Summa Theologiae, Lem sets out to replace god with reason. Zylinska's introduction lays out the map. Is the phenomenon that is humanity typical or exceptional in the universe? Does plagiarising nature count as fraud? Do we need consciousness for human agency? Should we trust our thoughts or perceptions? Are we controlling technology – or vice versa?
It is amazing how much Lem got right, or even predicted. This ranges across artificial intelligence, the theory of search engines (he called it "ariadnology"), bionics, virtual reality ("phantomatics"), technological singularity and nanotechnology.
But Lem's philosophical ambition is the real meat. Zylinska quotes an essay by biophysicist Peter Butko, who describes Summa as an "all-encompassing... discourse on evolution: not only... of science and technology... but also evolution of life, humanity, consciousness, culture, and civilization".
What is new about how teenagers communicate through services such as Facebook, Twitter, and Instagram? Do social media affect the quality of teens’ lives? In this eye-opening book, youth culture and technology expert danah boyd uncovers some of the major myths regarding teens' use of social media. She explores tropes about identity, privacy, safety, danger, and bullying.
In the 1990s, the venture capitalist John Doerr famously predicted that the Internet would lead to the “the largest legal creation of wealth in the history of the planet.” Indeed, the Internet has created a tremendous amount of personal wealth. Just look at the rash of Internet billionaires and millionaires, the investors, both small and large, who have made fortunes investing in Internet stocks, and the list of multibillion-dollar Internet companies—Google, Facebook, LinkedIn, and Amazon. Add to the list the recent Twitter stock offering, which created a reported 1,600 millionaires.
Then there’s the superstar effect. The Internet multiplies the earning power of the very best high-frequency traders, currency speculators, and entertainers, who reap billions while the merely good are left to slog it out.
But will the Internet also create the greatest economic inequality the global economy has ever known? And will poorly designed government policies aimed at ameliorating the problem of inequality end up empowering the Internet-driven redistribution process?
As the Internet goes about its work making the economy more efficient, it is reducing the need for travel agents, post office employees, and dozens of other jobs in corporate America. The increased interconnectivity created by the Internet forces many middle and lower class workers to compete for jobs with low-paid workers in developing countries. Even skilled technical workers are finding that their jobs can be outsourced to trained engineers and technicians in India and Eastern Europe.
Craig Venter, who managed to make science both lucrative and glamorous with his pioneering approach to gene sequencing and synthetic biology, is taking on a new venture: aging.
He has joined forces with the founder of the X Prize and an expert in cell therapy to launch on Tuesday a new company called Human Longevity Inc. The man who once took off on his personal yacht to sample all the microscopic life in the seas plans to leverage some of the most fashionable new scientific approaches to figure out what makes us sick and old.
The San Diego-based company will tackle aging using gene sequencing; stem cell approaches; the collection of bacteria and other life forms that live in and on us called the microbiome; and the metabolome, which includes the byproducts of life called metabolites.
They’ll start out with what they are calling the largest human sequencing operation in the world.
“We are building a lab to a scale never attempted (before),” Venter told NBC News.
Venter first shot to fame when he raced with government scientists to finish the first map of all human DNA, called the human genome. Venter, himself a former government scientist, annoyed his former colleagues with a brash new approach to gene sequencing that was much faster but far less accurate, in their opinion.
Think back to a time when you were completely engaged in an activity. Maybe it was reading a comic book, or catching up with an old friend. Whatever it was, what do you remember about the experience? Are “effort” and “persistence” words you would use to describe the activity? Even though something technically got done (a comic book was read, a fruitful discussion ensued), it most likely felt effortless and enjoyable.
After interviewing people about their “peak experiences” —from rock climbers to chess masters to artists to scientists— psychologist Mihaly Csikszentmihalyi found that people kept describing a state of intense concentration and absorption in which no mental resources were left over for distraction. In this state of flow, people felt in control of their consciousness, their inner critic disappeared, and time seemed to recede into the background. Importantly, the activity felt effortless.
The great educational philosopher John Dewey was one of the first to emphasize the important linkages among interest, curiosity, and effort. Dewey made the persuasive case that interest-based learning is more beneficial than effort-based learning. He noted that “willing attention” is more effective than “forced effort” because interest drives active learning: “If we can secure interest in a given set of facts or ideas we may be perfectly sure that the pupil will direct his energies toward mastering them.” In contrast, he noted, an education based on forcing children to expend energy unwillingly only results in a “character dull, mechanical, unalert, because the vital juice of spontaneous interest has been squeezed out.”
Researchers from North Carolina State University have developed a de facto antibiotic “smart bomb” that can identify specific strains of bacteria and sever their DNA, eliminating the infection. The technique offers a potential approach to treat infections by multi-drug resistant bacteria.
“Conventional antibiotic treatments kill both ‘good’ and ‘bad’ bacteria, leading to unintended consequences, such as opportunistic infections,” says Dr. Chase Beisel, an assistant professor of chemical and biomolecular engineering at NC State and senior author of a paper describing the work. “What we’ve shown in this new work is that it is possible to selectively remove specific strains of bacteria without affecting populations of good bacteria.”
The new approach works by taking advantage of a part of an immune system present in many bacteria called the CRISPR-Cas system. The CRISPR-Cas system protects bacteria from invaders such as viruses by creating small strands of RNA called CRISPR RNAs, which match DNA sequences specific to a given invader. When those CRISPR RNAs find a match, they unleash Cas proteins that cut the DNA.
The NC State researchers have demonstrated that designing CRISPR RNAs to target DNA sequences in the bacteria themselves causes bacterial suicide, as a bacterium’s CRISPR-Cas system attacks its own DNA.
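The targeting logic can be pictured with a toy sketch: a CRISPR RNA is in essence a short guide sequence, and the Cas machinery cuts DNA wherever that guide finds its match. The sequences below are invented for illustration and the matching is deliberately naive (real guides pair with a complementary strand and require an adjacent PAM motif, which this sketch ignores):

```python
def find_cut_sites(genome: str, crispr_rna: str):
    """Return every index in `genome` where the guide sequence matches.
    Each match is a site where Cas proteins would cut the DNA."""
    sites = []
    start = genome.find(crispr_rna)
    while start != -1:
        sites.append(start)
        start = genome.find(crispr_rna, start + 1)
    return sites

# Hypothetical sequences: the guide matches the "bad" strain only,
# so only that strain's DNA gets severed.
bad_strain = "ATGCGTACCTGAGGTCATTGACCA"
good_strain = "ATGCGTACCTGACCTCATTGACCA"
guide = "GAGGTCATT"

print(find_cut_sites(bad_strain, guide))   # match found: strain targeted
print(find_cut_sites(good_strain, guide))  # no match: strain spared
```

The selectivity the NC State team reports comes from exactly this kind of sequence specificity: a guide designed against one strain's DNA leaves neighboring strains, which lack the matching sequence, untouched.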
In the last two decades, dozens of scientific papers have been published on the biological origins of homosexuality - another announcement was made last week. It's becoming scientific orthodoxy. But how does it fit with Darwin's theory of evolution?
Macklemore and Ryan Lewis's hit song Same Love, which has become an unofficial anthem of the pro-gay marriage campaign in the US, reflects how many gay people feel about their sexuality.
It mocks those who "think it's a decision, and you can be cured with some treatment and religion - man-made rewiring of a predisposition". A minority of gay people disagree, maintaining that sexuality is a social construct, and they have made a conscious, proud choice to take same-sex partners.
But scientific opinion is with Macklemore. Since the early 1990s, researchers have shown that homosexuality is more common in brothers and relatives on the same maternal line, and a genetic factor is taken to be the cause. Also relevant - although in no way proof - is research identifying physical differences in the brains of adult straight and gay people, and a dizzying array of homosexual behaviour in animals.
But since gay and lesbian people have fewer children than straight people, a problem arises.
"This is a paradox from an evolutionary perspective," says Paul Vasey from the University of Lethbridge in Canada. "How can a trait like male homosexuality, which has a genetic component, persist over evolutionary time if the individuals that carry the genes associated with that trait are not reproducing?"
Scientists don't know the answer to this Darwinian puzzle, but there are several theories. It's possible that different mechanisms may be at work in different people. Most of the theories relate to research on male homosexuality. The evolution of lesbianism is relatively understudied - it may work in a similar way or be completely different.
Everyone brainstorms a little differently, but over on the MIT Sloan Management Review they've put together a seven step plan that should help make the brainstorming process more fruitful.
Different groups are always going to brainstorm a little differently, and every project is different, but the authors at MIT Sloan Management Review have a pretty simple game plan everyone can follow:
1. Define the problem and solution space: Basically, create boundaries and rules for your solutions so you don't waste time thinking of solutions that aren't feasible.
2. Break the problem down: Make the problem easier to tackle by breaking it into smaller parts using diagrams or mind maps.
3. Make the problem personal: Think about how the problem affects you personally.
4. Seek the perspectives of outsiders: Try and find as many people as possible who might have input and see what they have to say.
5. Diverge before you converge: Breed a little conflict into the discussion when you can. One way to do this is to have everyone write down their ideas before the meeting starts so everyone doesn't rally around the first idea just to get out of the meeting quickly.
6. Create "idea resumes": An "idea resume" is a one-page document that breaks down the basics of a solution.
7. Create a plan to learn: Start designing a way to test your ideas and write out what you hope to learn from those tests.
The above seven steps certainly aren't the only way to brainstorm, but they do provide a pretty solid foundation to work from.
The Discipline of Creativity | MIT Sloan Management Review via INC
How you represent yourself in the virtual world of video games may affect how you behave toward others in the real world, according to new research published in Psychological Science, a journal of the Association for Psychological Science.
“Our results indicate that just five minutes of role-play in virtual environments as either a hero or villain can easily cause people to reward or punish anonymous strangers,” says lead researcher Gunwoo Yoon of the University of Illinois at Urbana-Champaign.
As Yoon and co-author Patrick Vargas note, virtual environments afford people the opportunity to take on identities and experience circumstances that they otherwise can’t in real life, providing “a vehicle for observation, imitation, and modeling.”
They wondered whether these virtual experiences — specifically, the experiences of taking on heroic or villainous avatars — might carry over into everyday behavior.
The researchers recruited 194 undergraduates to participate in two supposedly unrelated studies. The participants were randomly assigned to play as Superman (a heroic avatar), Voldemort (a villainous avatar), or a circle (a neutral avatar). They played a game for 5 minutes in which they, as their avatars, were tasked with fighting enemies. Then, in a presumably unrelated study, they participated in a blind taste test. They were asked to taste and then give either chocolate or chili sauce to a future participant. They were told to pour the chosen food item into a plastic dish and that the future participant would consume all of the food provided.
The results were revealing: Participants who played as Superman poured, on average, nearly twice as much chocolate as chili sauce for the “future participant.” And they poured significantly more chocolate than those who played as either of the other avatars.
Participants who played as Voldemort, on the other hand, poured out nearly twice as much of the spicy chili sauce than they did chocolate, and they poured significantly more chili sauce compared to the other participants.
Lynn Rothschild has short brown hair and smiley eyes. She cracks jokes about biology and microscopes with ease. Diana Gentry, her decades-younger Ph.D. student, loves classic video games and vegetarian cooking. She lives near Silicon Valley. The two colleagues have a funny banter, and have spent holidays together. But they share one unique goal.
They’re trying to 3D-print wood in space.
The Stanford University researchers have been working long hours honing a three-dimensional printing process to make biomaterials like wood and enamel out of mere clumps of cells. Pundits say such 3D bioprinting has vast potential, and could one day be widely used to transform specially engineered cells into structural beams, food, and human tissue. Rothschild and Gentry don’t see these laboratory-created materials helping only doctors and Mars voyagers. They also envision their specific research – into so-called “synthetic biomaterials” – changing the way products like good-old-fashioned wooden two-by-fours are made and used by consumers.
This is why IBM is investing $1 billion into Watson.
Besides winning "Jeopardy" and helping "put a permanent end to leukemia," IBM has found another use for its Watson supercomputer: psychoanalyzing people.
Watson is a computer that understands human language. Simply by looking at the language used when posting on social media sites, it can understand your personality and even predict major events likely to happen in your life.
Amazingly, this tech doesn't even have to know a company's customers' social media accounts beforehand. It can figure them out on its own by matching what people post online against information in a company database.
The first use that comes to mind for this is marketing. The Watson tech tries to figure out what's happening in a person's life from social media in order to predict when to send them offers. As Fast Co. Design's John Brownlee writes:
"It can even predict major life-events: if you changed your Facebook status to "Married" a year ago, for example, a company might infer that it was about time to start approaching you about products and services for your first child."
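As a toy sketch of that idea (invented for illustration; this is not IBM's actual Watson interface, and the events, cue words, and offers below are all hypothetical), a crude life-event detector might scan posts for keywords and map each hit to a marketing follow-up:

```python
# Hypothetical mapping from life-event cues to follow-up offers.
LIFE_EVENTS = {
    "married": "offers for home and family products",
    "new job": "offers for professional services",
    "graduated": "offers for entry-level financial products",
}

def predict_offers(posts):
    """Scan each post for a life-event cue and collect matching offers.
    Real systems would use statistical language models, not keywords."""
    offers = []
    for post in posts:
        text = post.lower()
        for cue, offer in LIFE_EVENTS.items():
            if cue in text:
                offers.append(offer)
    return offers

posts = ["Just got MARRIED!", "Loving the weather today"]
print(predict_offers(posts))
```

The gap between this sketch and Watson is the point: instead of brittle keyword lists, Watson infers personality and likely events from the statistical patterns of a person's language as a whole.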
Snacking on a freshly-made pizza in outer space just got a whole lot closer thanks to Anjan Contractor's 3D pizza printer. Contractor, who won NASA's US$125,000 grant last year to create a 3D printer that could print food for astronauts on missions, has come out with a functional prototype.
The prototype prints the blocky pizza out in layers, as demonstrated in the video at the bottom of the page. While it's a tad messy, the end product, when cooked, looks fairly appetizing. Contractor plans to equip the 3D printer with food cartridges that last for 30 years.
Such a long shelf life is pretty much a necessity, since long-distance space missions could take several years. To make the pizza ingredients last, Contractor is investigating ways to remove all the moisture from them and reduce the proteins, carbs and nutrients into a powder form.
In a comment on his YouTube page, Contractor says it took only 70 seconds to cook the pizza once it was printed. If NASA implements the technology, astronauts will be able to get 3D-printed fast food in space instead of having to rely solely on freeze-dried and canned foods, which should be a comfort.
The revolutionary discovery that any cell can be rewound to a pre-embryonic state remarkably easily could usher in new therapies and cloning techniques
A LITTLE stress is all it took to make new life from old. Adult cells have been given the potential to turn into any type of body tissue just by tweaking their environment. This simple change alone promises to revolutionise stem cell medicine.
Yet New Scientist has also learned that this technique may have already been used to make a clone. "The implication is that you can very easily, from a drop of blood and simple techniques, create a perfect identical twin," says Charles Vacanti at Harvard Medical School, co-leader of the team involved.
Details were still emerging as New Scientist went to press, but the principles of the new technique were outlined in mice in work published this week. The implications are huge, and have far-reaching applications in regenerative medicine, cancer treatment and human cloning.
Everyone’s more interested in artificial intelligence since news broke that Google acquired a secretive startup called DeepMind. The technology has big promise, but make no mistake: It’s not sentient yet, and Google is far from alone in its quest.
Artificial intelligence might be the most misunderstood term in technology. It conjures up images of malevolent robots and self-aware computer systems capable of outwitting — or at least matching wits with — human beings. It is not that. At least not today.
Google’s acquisition of artificial intelligence startup DeepMind for $400 million sent the tech world atwitter earlier this week. Everybody wanted to know what the mysterious company was up to and why Google was willing to pay so much for it. After a day or so of mystery and even speculation that Google wanted to turn its new robots into sentient beings, the probable truth finally began to emerge.
Google just wants to build a better search platform, and talent isn’t going to come cheap with everybody in the web vying for it.