In between anatomy and biochemistry, medical students in the US are learning how to sauté, simmer and season healthy, homemade meals.
Since 2012, first and second year students at Tulane University School of Medicine in Louisiana have been learning how to cook. Since the program launched, Tulane has built the country’s first med school-affiliated teaching kitchen and become the first medical school to count a chef as a full-time instructor.
Sixteen med schools have now licensed the center’s curriculum, as have two programs outside traditional medical schools: the Children’s Hospital San Antonio-Sky Lakes Residency Program and the Nursing School at Northwest Arkansas Community College. In fact, about 10% of America’s medical schools are teaching their students how to cook with Tulane’s program, Tim Harlan, who leads Tulane’s Goldring Center for Culinary Medicine, told the James Beard Foundation conference last month. The center also offers continuing medical education programs, with a certification in culinary medicine, for doctors, physician assistants, nurse practitioners, pharmacists, and registered dietitians.
It’s enough to make your heart beat a little faster. A new study suggests that resting heart rate can be used as a ‘death test’ to predict your chance of keeling over in the next two decades.
Although doctors have known for some time that people with low resting heart rates are usually fitter and healthier, this is the first time the associated risk has been quantified.
People who have a resting heart rate of 80 beats per minute (bpm) are 45 per cent more likely to die of any cause in the next 20 years compared to those with the lowest measured heart rate of 45 bpm.
Most people’s resting heart rate is between 60 and 100 bpm but the hearts of professional athletes beat around 40 times per minute.
The researchers found that the risk of dying from any illness or health condition rises by around nine per cent for every additional 10 bpm, while the chance of suffering a fatal heart attack or stroke rises by eight per cent.
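As a rough back-of-envelope illustration (ours, not the study’s), the per-10-bpm figure can be read as a multiplicative relative-risk model. The sketch below assumes a 45 bpm baseline and a 9% risk increase per additional 10 bpm; both numbers come from the reporting above, but the compounding model itself is an assumption.

```python
# Illustrative sketch only: treat the reported ~9% increase per 10 bpm
# as a compounding multiplier above a 45 bpm baseline.
BASELINE_BPM = 45
PER_10_BPM_MULTIPLIER = 1.09  # ~9% higher all-cause mortality risk

def relative_risk(resting_bpm):
    """All-cause mortality risk relative to the 45 bpm baseline."""
    steps = (resting_bpm - BASELINE_BPM) / 10.0
    return PER_10_BPM_MULTIPLIER ** steps

print(round(relative_risk(80), 2))  # ~1.35 under this simple model
```

Note that naive compounding yields roughly a 35% increase at 80 bpm, somewhat below the 45% the study reports; the published estimate comes from adjusted survival analyses, not this simple arithmetic.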
"The association of resting heart rate with risk of all-cause and cardiovascular mortality is independent of traditional risk factors of cardiovascular disease, suggesting that resting heart rate is a predictor of mortality in the general population," said Dr Dongfeng Zhang, of the Medical College of Qingdao University, Shandong, China.
There are many ways that men try to impress women - be that with their wallets or their wit.
However, eating massive quantities of food appears to be one unusual way that men instinctively show off to women.
A study of 133 adults (74 men and 59 women) by Cornell University found that men consume considerably more food when they are with women.
The type of food doesn't seem to matter either: not only did men eat 93% more pizza (1.44 more slices) when dining with a woman than with another man, they also ate 86% more salad.
“These findings suggest that men tend to overeat to show off,” Kevin Kniffin, visiting assistant professor and lead author of the study, told the LA Times. “Instead of a feat of strength, it’s a feat of eating.”
Apparently, by stuffing their faces, men are showing "that they possess extraordinary skills, advantages, and/or surplus energy in degrees that are superior to other men.”
“Conspicuous consumption of food is a much less dramatic ‘risk’ than, say, going off to the front lines of war, but research on the effects of obesity nonetheless show overeating to constitute risky behavior,” added the study authors.
Overeating may also be a way of signalling physical fitness - which may seem an odd conclusion to draw from a pizza binge.
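As a back-of-envelope check on the pizza figures above, the two reported numbers ("93% more" and "1.44 more slices") together imply a baseline of roughly one and a half slices when men dine with other men. A minimal sketch (the derivation is ours, not the study's):

```python
# Derive the implied baseline from the two figures reported above.
extra_fraction = 0.93   # "93% more pizza"
extra_slices = 1.44     # "1.44 more slices"

baseline_slices = extra_slices / extra_fraction   # slices eaten with other men
with_women = baseline_slices + extra_slices       # slices eaten with women

print(round(baseline_slices, 2))  # ~1.55
print(round(with_women, 2))       # ~2.99
```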
Juliano Pinto, a 29-year-old paraplegic, kicked off the 2014 World Cup in São Paulo with a robotic exoskeleton suit that he wore and controlled with his mind. The event was broadcast internationally and served as a symbol of the exciting possibilities of brain-controlled machines. Over the last few decades research into brain–computer interfaces (BCIs), which allow direct communication between the brain and an external device such as a computer or prosthetic, has skyrocketed. Although these new developments are exciting, there are still major hurdles to overcome before people can easily use these devices as a part of daily life.
Until now such devices have largely been proof-of-concept demonstrations of what BCIs are capable of. Currently, almost all of them require technicians to manage and include external wires that tether individuals to large computers. New research, conducted by members of the BrainGate group, a consortium that includes neuroscientists, engineers and clinicians, has made strides toward overcoming some of these obstacles. “Our team is focused on developing what we hope will be an intuitive, always-available brain–computer interface that can be used 24 hours a day, seven days a week, that works with the same amount of subconscious thought that somebody who is able-bodied might use to pick up a coffee cup or move a mouse,” says Leigh Hochberg, a neuroengineer at Brown University who was involved in the research. The researchers also want these devices to be small, wireless and usable without the help of a caregiver.
Cardiovascular disease kills more people on Earth than anything else—over 17 million a year, and the number keeps going up. Of those deaths, more than 40 percent are due to coronary heart disease. Medicine has drugs that can treat it and practices that can help prevent it, but nobody really knows what causes it or how to cure it. Now, Google and the American Heart Association aim to change that by dropping a $50 million funding bomb on the problem. And as you might expect from a Silicon Valley giant that believes in moving fast and breaking things—an approach that hasn’t always transferred well to basic scientific research—the company isn’t spreading the money around.
In an announcement this month at the American Heart Association meeting in Orlando, Florida, Google Life Sciences and the AHA said the money would go to one team over five years. This isn’t covering the bases. This is, to mix a lot of metaphors, a Manhattan Project. Or as Google likes to call such things: a moonshot.
For the microbiologist Justin Sonnenburg, that career-defining moment—the discovery that changed the trajectory of his research, inspiring him to study how diet and native microbes shape our risk for disease—came from a village in the African hinterlands.
A group of Italian microbiologists had compared the intestinal microbes of young villagers in Burkina Faso with those of children in Florence, Italy. The villagers, who subsisted on a diet of mostly millet and sorghum, harbored far more microbial diversity than the Florentines, who ate a variant of the refined, Western diet. Where the Florentine microbial community was adapted to protein, fats, and simple sugars, the Burkina Faso microbiome was oriented toward degrading the complex plant carbohydrates we call fiber.
Scientists suspect our intestinal community of microbes, the human microbiota, calibrates our immune and metabolic function, and that its corruption or depletion can increase the risk of chronic diseases, ranging from asthma to obesity. One might think that if we coevolved with our microbes, they’d be more or less the same in healthy humans everywhere. But that’s not what the scientists observed.
Cholesterol serves an important role in the body, helping to produce vitamin D, hormones and other molecules that aid digestion. But when the protein known as PCSK9, which regulates cholesterol levels in our blood, lets too much of it build up, arteries begin to clog, increasing the risk of heart disease and stroke. Researchers have now developed a vaccine that inhibits the activity of this particular protein, reducing cholesterol levels in animals and suggesting that a cheap and effective way to prevent dangerously high levels in humans might not be far away.
A common approach to keeping cholesterol in check (outside of regular diet and exercise) has centered on statins, a class of drugs that reduce the concentration of "bad" cholesterol, known as low-density lipoprotein cholesterol (LDL-C), in the blood. Research has indicated that statins are effective in cutting the chances of cardiovascular disease, with one study even suggesting that fast food outlets provide free statins to customers to cancel out the health risks posed by burgers and fries.
Millions of Americans take statins to lower cholesterol, but one of the drugs' major drawbacks is their list of side effects, which includes muscle pain, heightened diabetes risk and even cognitive loss. Pharmaceutical companies have made progress toward drugs that target PCSK9, with some found to reduce LDL-C levels by as much as 60 percent receiving approval from the Food and Drug Administration. The trouble is, these drugs don't come cheap, carrying a price tag of more than US$10,000 a year.
Researchers are testing flexible wireless implants that could be used in different parts of the body to fight pain that doesn’t respond to other therapies.
Unlike devices that need to be anchored to bone, these are soft and stretchable, which means they can be implanted into parts of the body that move, says Robert W. Gereau, one of the study leaders and an anesthesiology professor at Washington University School of Medicine in St. Louis.
“Our eventual goal is to use this technology to treat pain in very specific locations by providing a kind of ‘switch’ to turn off the pain signals long before they reach the brain,” says Gereau.
“But when we’re studying neurons in the spinal cord or in other areas outside of the central nervous system, we need stretchable implants that don’t require anchoring,” he adds.
The new devices are held in place with sutures. Like the previous models, they contain microLED lights that can activate specific nerve cells. Gereau hopes to use the implants to blunt pain signals in patients who have pain that cannot be managed with standard therapies.
On Monday, Google released the code for its deep learning software TensorFlow into the wild. Deep learning is responsible for some of Google’s most advanced services, including recent additions like auto-reply emails and image search. But by making the code free to anyone, the company hopes to accelerate progress in deep learning software and the machine learning field more generally.
“Google is five to seven years ahead of the rest of the world,” Chris Nicholson, who runs a deep learning startup called Skymind, told Wired. “If they open source their tools, this can make everybody else better at machine learning.”
That’s a big deal because the field is already moving incredibly fast.
Howard noted that long-imagined capabilities like real-time translation and computer-generated art didn’t exist just a few years ago. Even Google’s auto-reply emails (recently announced) were an April Fool’s joke back in 2011.
Computers are now capable of all of these things and more.
“So, something amazing has happened that’s caused an April Fool’s joke from just four years ago to be something that’s actually a real technology today,” Howard said.
An increasingly warped sense of humour could be an early warning sign of impending dementia, say UK experts.
The University College London study involved patients with frontotemporal dementia, with the results appearing in the Journal of Alzheimer's Disease.
Questionnaires from the friends and family of the 48 patients revealed many had noticed a change in humour years before the dementia had been diagnosed.
This included laughing inappropriately at tragic events.
Experts say more studies are now needed to understand how and when changes in humour could act as a red flag for dementia.
There are many different types of dementia and frontotemporal dementia is one of the rarer ones.
The area of the brain it affects is involved with personality and behaviour, and people who develop this form of dementia can lose their inhibition, become more impulsive and struggle with social situations.
A baby girl with aggressive leukaemia has become the first in the world to be treated with designer immune cells that were genetically engineered to wipe out her cancer.
The one-year-old, Layla Richards, was given months to live after conventional treatments failed to eradicate the disease, but she is now cancer free and doing well, a response one doctor described as “almost a miracle”.
It’s time to enjoy some monster stories, and the scariest monsters of all are those that actually exist.
Join us as we share tales of some of the creepiest parasites around — those that control the brains of their human hosts, sometimes leaving insanity and death in their wake. These are the tales of neurological parasites.

The Feline Parasite
Toxoplasma gondii tops the list as the most famous — and most controversial — neurological parasite. This tiny protozoan doesn’t look like much more than a blob, but once it makes its way to the brain, it can radically alter the behavior of hosts like rats, cats and, yes, even humans.
T. gondii’s life begins in cat feces, where its eggs (known as “oocysts”) wait to be picked up by carriers like rats. Once they’re safe and warm in the guts of their temporary hosts, the oocysts give rise to tachyzoites, the unassuming little blobs that can really do some damage. Those tachyzoites migrate into their hosts’ muscles, eyes and brains, where they can remain hidden for decades without doing much of anything.
Scientists know that feeling alone can have a negative effect on health, but now they think they may have uncovered one reason why.
Loneliness can lead to fight-or-flight stress signaling, which can ultimately affect the production of white blood cells.
The findings are from a recent study published in the Proceedings of the National Academy of Sciences that examined loneliness in both humans and rhesus macaques, a highly social primate species. The human subjects were participants in the Chicago Health, Aging, and Social Relations Study, a longitudinal study that began in 2002 with adults aged 50-68.
Previous research identified a link between loneliness and a phenomenon researchers called “conserved transcriptional response to adversity”, or CTRA.
This response is characterized by an increased expression of genes involved in inflammation and a decreased expression of genes involved in antiviral responses. Essentially, lonely people had a less effective immune response and more inflammation than non-lonely people.
For the current study, the researchers from the University of Chicago, UCLA, and UC Davis examined gene expression in leukocytes, cells of the immune system that are involved in protecting the body against bacteria and viruses.
Vocal cords that produce realistic sounds have been grown in the lab from human cells.
The work marks a first step towards better treatments for patients who lose their voices to injury or disease.
Vocal cords are formed by two bands of smooth muscle tissue that are lined with a material called mucosa. When air passes through them, the folds vibrate hundreds of times per second to make sounds.
But diseases such as cancer can destroy the delicate folds, and for many patients medical treatment options are limited. Some patients with damaged vocal cords have viscous materials injected to make the folds more pliable. Others improve with voice coaching.
Researchers in the US took a different approach, growing layers of vocal cord cells onto scaffolds to produce tough, elastic tissue similar to that within the natural voice box. When doctors tested the lab-grown tissue in voice boxes taken from dead dogs, they found it produced the same sounds as the natural tissue.
“Voice is a pretty amazing thing, yet we don’t give it much thought until something goes wrong,” said lead researcher Nathan Welham at the University of Wisconsin-Madison. “The ability to vibrate and make sounds is pretty remarkable and unique to this part of the body.”
Vocal cord tissue has been grown in the lab for the first time, paving the way for potentially revolutionary treatments for people who have lost their vocal cords.
Stars such as Adele, Frank Ocean and John Mayer have been afflicted with vocal cord damage -- and have undergone extensive, and expensive, treatment to deal with it. But now researchers have found an alternative to outdated ways of dealing with vocal cord damage -- growing new cords in a lab.
Previous vocal cord treatment required patients who had received transplanted cords from cadavers to be injected with huge doses of immunosuppressants. But a team from the University of Wisconsin Medical School has come up with a new way of transplanting cords. They successfully grew 170 sets of vocal cords in a lab -- cords that don't require the usual round of immunosuppressants.
“You have to begin to lose your memory, if only bits and pieces, to realize that memory is what makes our lives. Life without memory is no life at all.” — Luis Buñuel Portolés, Filmmaker
Every year, hundreds of millions of people experience the pain of a failing memory.
The reasons are many: traumatic brain injury, which haunts a disturbingly high number of veterans and football players; stroke or Alzheimer’s disease, which often plagues the elderly; or even normal brain aging, which inevitably touches us all.
Memory loss seems to be inescapable. But one maverick neuroscientist is working hard on an electronic cure. Funded by DARPA, Dr. Theodore Berger, a biomedical engineer at the University of Southern California, is testing a memory-boosting implant that mimics the kind of signal processing that occurs when neurons are laying down new long-term memories.
The revolutionary implant, already shown to help memory encoding in rats and monkeys, is now being tested in human patients with epilepsy — an exciting first that may blow the field of memory prosthetics wide open.
To get here, however, the team first had to crack the memory code.
Bandages are important for stopping germs from entering a wound and making things worse, but could they play a more active role in making things better? New research has brought the idea of wound-healing dressings closer to reality by establishing a method of electrical stimulation that kills off the majority of a multidrug-resistant bacterium commonly found in difficult-to-treat infections.
Electrical stimulation has long been explored as a means of speeding up the healing process, but exactly how it works hasn't always been so clear. However, a study earlier this year suggested it does so by triggering a process called angiogenesis, which causes new blood vessels to form and boosts blood flow to the affected area.
In the view of Washington State University researchers, at least part of the answer lies in the results of the electrochemical reaction that takes place as the current is applied. The team found that during this process hydrogen peroxide forms at the electrode surface, which as it turns out, works effectively as a disinfectant.
The team applied the electric current to a film of bacteria (a multidrug-resistant Acinetobacter baumannii strain), killing almost the entire population within 24 hours and reducing it to 1/10,000th of its original size. The approach was also tested on pig tissue, where it killed the majority of the bacteria without affecting the surrounding healthy tissue.
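For context, a drop to 1/10,000th of the starting population is what microbiologists call a 4-log reduction. A minimal sketch of that arithmetic (the counts below are illustrative, not the study's data):

```python
import math

def log_reduction(initial_count, final_count):
    """Log10 reduction, the standard measure of microbial kill."""
    return math.log10(initial_count / final_count)

# A population cut to 1/10,000th of its original size:
print(log_reduction(1e8, 1e4))  # 4.0, i.e. a 99.99% kill
```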
For thousands of years, Metzl says, we’ve been a relatively stable species, subject to the natural world’s slow evolutionary rules. But this is all about to change.
“We are now at the beginning of an evolutionary phase guided by a different set of rules of self-evolution, driven by new technologies,” Metzl said.
Metzl explains it like this. If an infant from a thousand years ago were transported into today’s world and raised by a typical family, the child would be pretty similar to a kid born in 2015. But reach a thousand years into the future and bring an infant into today’s world—and comparatively, the future child would be superhuman.
Metzl splits the oncoming transformation of the human species into three phases of human genetic engineering.

Phase One: Embryo Selection
In-vitro fertilization (IVF) reinvented treatment for women with common forms of infertility. Now, pre-implantation genetic diagnosis (PGD) enables disease detection in an embryo as young as five days. Eventually, we’ll be able to identify polygenic traits and Mendelian traits, do full genomic analyses, and select embryos for implantation based on what we find.
At first, we’ll select against genetic diseases. But that may lead to selection of other traits too.
“We're going to understand not just which child is a carrier for Huntington's disease, but we're going to be able to know everything that is knowable through the genome,” says Metzl.
Stephen Hsu, vice president of research and professor of theoretical physics at Michigan State University, for example, believes that superintelligent humans are coming, and that in roughly ten years we’ll be able to predict an individual’s IQ from a cell.
Optogenetics is probably the biggest buzzword in neuroscience today. It refers to techniques that use genetic modification of cells so they can be manipulated with light. The net result is a switch that can turn brain cells off and on like a bedside lamp.
The technique has enabled neuroscientists to achieve previously unimagined feats and two of its inventors—Karl Deisseroth of Stanford University and the Howard Hughes Medical Institute and Ed Boyden of Massachusetts Institute of Technology—received a Breakthrough Prize in the life sciences on November 8 in recognition of their efforts. The technology is able to remotely control motor circuits—one example is having an animal run in circles at the flick of a switch. It can even label and alter memories that form as a mouse explores different environments. These types of studies allow researchers to firmly establish a cause-and-effect relationship between electrical activity in specific neural circuits and various aspects of behavior and cognition, making optogenetics one of the most widely used methods in neuroscience today.
A strategy designed to improve memory by delivering brain stimulation through implanted electrodes is undergoing trials in humans. The US military, which is funding the research, hopes that the approach might help many of the thousands of soldiers who have developed deficits to their long-term memory as a result of head trauma. At the Society for Neuroscience meeting in Chicago, Illinois, on October 17–21, two teams funded by the Defense Advanced Research Projects Agency presented evidence that such implanted devices can improve a person’s ability to retain memories.
By mimicking the electrical patterns that create and store memories, the researchers found that gaps caused by brain injury can be bridged. The findings raise hopes that a ‘neuroprosthetic’ that automatically enhances flagging memory could aid not only brain-injured soldiers, but also people who have had strokes—or even those who have lost some power of recall through normal ageing.
Because of the risks associated with surgically placing devices in the brain, both groups are studying people with epilepsy who already have implanted electrodes. The researchers can use these electrodes both to record brain activity and to stimulate specific groups of neurons. Although the ultimate goal is to treat traumatic brain injury, these people might benefit as well, says biological engineer Theodore Berger at the University of Southern California (USC) in Los Angeles. That is because repeated seizures can destroy the brain tissue needed for long-term-memory formation.
The next time you start to feel special, keep in mind that much of your DNA isn’t even yours. In fact, your genome is littered with the ancient corpses of viral invaders from hundreds (or even millions) of years ago. Basically, each of us is just a giant junk heap.
If you find that dispiriting, here’s another bit of unsettling news: Some of these skeletons come back to life during very early human development. The viral DNA makes viral proteins, which assemble themselves into something that looks suspiciously like infectious viral particles.
“It’s both fascinating and a little creepy,” says Joanna Wysocka, PhD, Stanford associate professor of developmental biology and of chemical and systems biology. “We can’t say yet whether these viral particles can be infectious, but regardless of whether they are, viral proteins within a cell are rarely completely inert.”
Wysocka described the phenomenon in a paper published earlier this year in Nature. Graduate student Edward Grow was the study’s first author.
The finding raises questions as to who, or what, is really pulling the strings during human embryogenesis. Grow and Wysocka have found that these viral proteins are well-placed to manipulate some of the earliest steps in our development by affecting gene expression and even possibly protecting the embryo’s cells from further viral infection.
It’s unclear, however, whether we are watching an ongoing battle between viruses and humans or the outcome of an uneasy truce hashed out over tens of thousands of years of evolution.
“Does the virus selfishly benefit by switching itself on in these early embryonic cells?” wonders Grow. “Or is the embryo instead commandeering the viral proteins to protect itself? Can they both benefit? That’s possible, but we don’t really know.”
We already know that mood affects the way we judge and perceive things -- and this influence is generally considered negative. Studies have found that bad or sad moods can affect our judgement and reasoning, with clinical depression even having an impact on our perception of time.
But a new study suggests that mood, and extreme changes in mood, may actually be "an evolutionary relic that may have been advantageous for early humans".
The study, conducted by a team at UCL and published in Trends in Cognitive Sciences, found that mood influences our perception of reward outcomes "such that outcomes are perceived as better when one is in a good mood relative to when one is in a bad mood". This change in mood leads to a subsequent change in behaviour -- and thus allows us to adapt to fast-moving environmental factors.