Researchers at Oregon State University have discovered that one gene in a common fungus acts as a master regulator, and deleting it has opened access to a wealth of new compounds that have never before been studied – with the potential to identify new antibiotics.
Scientists succeeded in flipping a genetic switch that had silenced more than 2,000 genes in this fungus, the cereal pathogen Fusarium graminearum. Until now this had kept it from producing novel compounds that may have useful properties, particularly for use in medicine but also perhaps in agriculture, industry, or biofuel production.
"About a third of the genome of many fungi has always been silent in the laboratory," said Michael Freitag, an associate professor of biochemistry and biophysics in the OSU College of Science. "Many fungi have antibacterial properties. It was no accident that penicillin was discovered from a fungus, and the genes for these compounds are usually in the silent regions of genomes.
"What we haven't been able to do is turn on more of the genome of these fungi, see the full range of compounds that could be produced by expression of their genes," he said. "Our finding should open the door to the study of dozens of new compounds, and we'll probably see some biochemistry we've never seen before."
In the past, the search for new antibiotics was usually done by changing the environment in which a fungus or other organism grew and seeing whether those changes triggered the production of a compound with antibiotic properties.
The iPhone app created for the iLimb allows users to program and train the hand themselves, sparing those who would otherwise travel hundreds of miles to have their prosthetist adjust their grip. The app gives users the ability to choose which of the 24 grip patterns they want available. The ability to adjust the grip is a further plus, since changes in the environment, such as heat and humidity, can affect how strongly the muscles used to control the prosthetic limb contract.
The neurocam is the world's first wearable camera system that automatically records what interests you. It consists of a headset with a brain-wave sensor and connects to an iPhone.
The system estimates whether you're interested in something from the brain-waves captured by the sensor, and uses the iPhone's camera to record the scenes that appear to interest you. Just by wearing the neurocam, the scenes you have shown an interest in are recorded automatically and added to an album you can look through later.
"Right now, the iPhone's camera is ready to record what's in my line of sight through a prism. The iPhone shows what the camera has captured, so it feels as if it's reading my mind. My brain-waves are analyzed by an iPhone app, which quantifies my level of interest on a scale from 0 to 100. If the level exceeds 60, the number turns red, and the camera starts to record automatically, producing a 5-second GIF animation."
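The trigger behaviour described above (a 0–100 interest score, recording when it exceeds 60, emitting a 5-second clip) can be sketched as a simple threshold loop. This is an illustrative reconstruction, not the neurocam's actual code; the frame rate, and therefore the clip length in frames, is an assumption.

```python
INTEREST_THRESHOLD = 60  # recording starts when interest exceeds 60, per the description
CLIP_FRAMES = 50         # a 5-second clip at an assumed 10 frames per second

def collect_clips(samples):
    """samples: iterable of (interest_score, frame) pairs, score on a 0-100 scale.
    When interest crosses the threshold, start recording frames until a full
    clip (standing in for the 5-second GIF) is captured, then wait for the
    next trigger."""
    clips = []
    recording = None  # the clip currently being filled, or None
    for score, frame in samples:
        if recording is None and score > INTEREST_THRESHOLD:
            recording = []  # trigger: start a new clip
        if recording is not None:
            recording.append(frame)
            if len(recording) == CLIP_FRAMES:
                clips.append(recording)
                recording = None
    return clips
```

The real device presumably also debounces the score and encodes the frames as a GIF; the sketch only shows the threshold-and-record behaviour.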
"We're using the iPhone so that analysis and capture can be done with one device. But this is still a concept model. So, we think there are lots of possibilities, such as turning this into a wearable camera."
The neurocam arose from the neurowear project, which is involved with items that use brain-waves and bio-sensors, like necomimi, which works using brain-waves. The algorithm for quantifying brain-waves was co-developed with Associate Professor Mitsukura at Keio University.
In the future, the project team aims to create an emotional interface, which could link a range of devices and services to people's individual thoughts and feelings.
Glass may be for geeks now, but that could all change once computational photography starts to give wearable cameras amazing new capabilities.
Stanford professor Marc Levoy, fresh off a two-year leave to work on Google Glass, recently spoke to a packed house at Stanford’s Center for Image Engineering (SCIEN) about the new era of photography that Glass, and other increasingly powerful wearable cameras, have begun to usher in. While many of the new applications have been talked about — like first-person videos and the ability to take pictures without losing eye contact — Levoy explained that those are only the tip of the iceberg. He sees a combination of computational imaging and new-form-factor, camera-equipped devices allowing for a set of what he described as “superhero vision” capabilities.
Rapidly increasing processor power will help fuel this new world of powerful new photographic tools. Levoy, a pioneer in both computer graphics and computational imaging, noted that GPU power is growing by roughly 80% per year, while megapixels are only growing by about 20%. That means more horsepower to process each pixel — with the available cycles increasing each year. Coupled with near-real-time multi-frame image capture, the bounds of traditional photography can even be stretched beyond the borders of a single image.
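Levoy's two growth figures imply a concrete compounding rate for per-pixel compute; the short calculation below just works through that arithmetic.

```python
import math

gpu_growth = 1.80    # GPU throughput: roughly 80% annual growth (Levoy's figure)
pixel_growth = 1.20  # megapixel counts: roughly 20% annual growth (Levoy's figure)

# Cycles available per pixel grow at the ratio of the two rates.
per_pixel = gpu_growth / pixel_growth

# Years for per-pixel compute to double at that compounding rate.
doubling_years = math.log(2) / math.log(per_pixel)

print(f"cycles per pixel grow about {per_pixel:.1f}x per year")
print(f"doubling roughly every {doubling_years:.1f} years")
```

In other words, per-pixel horsepower grows about 50% a year, doubling roughly every 1.7 years, which is what makes compute-hungry multi-frame techniques steadily more practical.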
While our eyes and brain are quite good at adapting to a variety of lighting conditions, we know from how well animals can see “in the dark” that more is possible. Digital cameras have the luxury of leaving their shutter open long enough to gather photons even at very low light levels. By combining those long exposures with shorter ones, high-dynamic-range (HDR) scenes can be captured. It doesn’t take much imagination to see how a wearable camera could provide virtual-reality or head-up assistance to the wearer, allowing them to see into the shadows or even in largely dark rooms. Newer smartphones from Apple and others have some HDR capability built in, but it will take integration with a wearable computer to allow the use of HDR to augment our own vision.
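One textbook way to combine exposures of different lengths is to convert each reading to a radiance estimate (value divided by exposure time) and average the estimates, trusting well-exposed pixels more than clipped or near-black ones. The per-pixel sketch below follows that standard approach; it is not Glass's or any phone's actual HDR pipeline.

```python
def weight(v):
    # Trust mid-range readings most; clipped (near 1.0) and near-black
    # (near 0.0) readings carry little reliable information.
    return max(0.0, 1.0 - abs(2.0 * v - 1.0))

def merge_pixel(samples):
    """samples: list of (value, exposure_seconds) pairs with linear sensor
    values in [0, 1].  Returns a weighted radiance estimate for one pixel,
    letting long exposures dominate in the shadows and short exposures
    dominate in the highlights."""
    num = sum(weight(v) * (v / t) for v, t in samples)
    den = sum(weight(v) for v, _ in samples)
    return num / den if den else 0.0
```

An image-level merge just applies this at every pixel; real pipelines add frame alignment, deghosting, and tone mapping on top.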
Now is probably a good time to define the different ways a wearable display can change what the viewer sees. Augmented reality adds information to the normal field of view. Head-up displays are a common example. By contrast, virtual reality completely synthesizes your world view — although combined with camera input, a VR system can certainly be programmed to present augmented reality. Google Glass isn’t technically either of these — although Levoy groups it with the head-up AR category. It displays information above and to the side of the wearer’s normal field of view, so it doesn’t intrude on your attention unless you deliberately look at it. That makes Glass limited in what it can present to the wearer, but also makes it more practical and easier to get used to in the short term.
During this one-hour conversation between Dr. Tipler and Socrates of Singularity 1 on 1 (published on Oct 29, 2013), they cover a variety of interesting topics, such as: why he is both a physics imperialist and fundamentalist; the cosmological singularity, the technological singularity and the omega point; his personal journey from Christian fundamentalism through agnosticism and atheism and back to theism and Christianity; why most physicists are good atheists and bad scientists; immortality; determinism and whether God plays dice with the universe; mind-uploading and [quantum] consciousness…
Ten years ago, we boldly declared that we'd be living with phones on our wrists, data-driven goggles on our eyes and gadgets that would safety-test our food for us. Turns out, we were remarkably prescient.
The biggest problem with wetware is the “ware” part. Enormous metal implants like those seen in The Matrix or Elysium look cool and all, but any real-world interface of metal and flesh is precarious; surface implants are often rejected by the body, leading to infection and even death. Technology has gotten smaller, more efficient, and able to better communicate wirelessly, but for all the nifty implants we can build, actually implanting them has proven difficult, and controlling them even more so. Now, researchers at Massachusetts General Hospital claim that a special hydrogel could change all that.
J Craig Venter has been a molecular-biology pioneer for two decades. After developing expressed sequence tags in the 90s, he led the private effort to map the human genome, publishing the results in 2001. In 2010 the J Craig Venter Institute manufactured the entire genome of a bacterium, creating the first synthetic organism. Now Venter, author of Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life, explains the coming era of discovery.
The current NEJM has published a clinical trial with a statistical commentary that is really exciting. (How is that for a sentence you never expected to read?) The lesson from the TASTE study is that we should implement registries throughout the US and Canadian health care systems and use them to run quick and efficient clinical trials. That will help us adapt our way to a health care system that works well at an affordable cost.
University of Cincinnati researchers have developed a first-of-its-kind nanostructure, unusual in that it can carry a variety of cancer-fighting materials on its double-sided (Janus) surface and within its porous interior.
Because of its unique structure, the nano carrier can do all of the following:
• Transport cancer-specific detection nanoparticles and biomarkers to a site within the body, e.g., the breast or the prostate. This promises earlier diagnosis than is possible with today’s tools.
• Attach fluorescent marker materials to illuminate specific cancer cells, making them easier to locate for treatment, whether drug delivery or surgery.
• Deliver anti-cancer drugs for pinpoint, targeted treatment of cancer cells, which should result in fewer side effects. Currently, a cancer treatment like chemotherapy affects not only cancer cells but healthy cells as well, leading to serious and often debilitating side effects.
This recently developed Janus nanostructure is unusual in that such super-small structures (much smaller than a single cell) normally have limited surface area, making it difficult to carry multiple components, e.g., both cancer-detection and drug-delivery materials. The Janus nanocomponent, by contrast, has functionally and chemically distinct surfaces that allow it to carry multiple components in a single assembly and function in an intelligent manner.
“In this effort, we’re using existing basic nano systems, such as carbon nanotubes, graphene, iron oxides, silica, quantum dots and polymeric nano materials in order to create an all-in-one, multidimensional and stable nano carrier that will provide imaging, cell targeting, drug storage and intelligent, controlled drug release,” said UC’s Shi, adding that the nano carrier’s promise is currently greatest for cancers that are close to the body’s surface, such as breast and prostate cancer.
For centuries the art of medicine has been dominated by bumps, bruises, or other symptoms, felt by the patient or discovered by the physician, with eyes ever-magnified by increasingly sophisticated scanning technology: the microscope, the x-ray,...