Let's face it, humans are pretty intelligent. Most people would not argue with this. We spend a large majority of our lives trying to become MORE intelligent. Some of us spend nearly three decades of our lives in school, learning about the world. We also strive to work together in groups, as nations, and as a species, to better tackle the problems that face us.
A second track of transhumanism is to facilitate and support improvement of machines in parallel to improvements in human quality of life. Many people argue that we have also already built complex computer programs which show a glimmer of autonomous intelligence, and that in the future we will be able to create computer programs that are equal to, or have a much greater level of intelligence than humans. Such an intelligent system will be able to self-improve, just as we humans identify gaps in our knowledge and try to fill them by going to school and by learning all we can from others. Our computer programs will soon be able to read Wikipedia and Google Books to learn, just like their creators.
She is also the cofounder of carboncopies.org, an organization that works on connectome mapping of the brain and downloading memories.
Even in our deepest theories of machine intelligence, the idea of reward comes up. There is a theoretical model of intelligence called AIXI, developed by Marcus Hutter, which is basically a mathematical model describing a very general, theoretical way in which an intelligent piece of code can work. This model is highly abstract and allows, for example, all possible combinations of computer program code snippets to be considered in the construction of an intelligent system. Because of this, it has never actually been implemented on a real computer. But, also because of this, the model is very general, and captures a description of the most intelligent program that could possibly exist. Note that building something that even approximates this model is far beyond our computing capability at the moment, but we are talking here about computer systems that may in the future be much more powerful. Anyway, the interesting thing about this model is that one of its parameters is a term describing… you guessed it… REWARD.
Changing your own code
We, as humans, are clever enough to look at this model, to understand it, and see that there is a reward term in there. And if we can see it, then any computer system that is based on this highly intelligent model will certainly be able to understand this model, and see the reward term too. But – and here’s the catch – the computer system that we build based on this model has the ability to change its own code! In fact it had to in order to become more intelligent than us in the first place, once it realized we were such lousy programmers and took over programming itself!
So imagine a simple example – our case from earlier – where a computer gets an additional '1' added to a numerical value for each good thing it does, and it tries to maximize the total by doing more good things. But if the computer program is clever enough, why can't it just rewrite its own code and replace the piece of code that says 'add 1' with 'add 2'? Now the program gets twice the reward for every good thing that it does! And why stop at 2? Why not 3, or 4? Soon, the program will spend so much time adjusting its reward number that it will ignore the good task it was doing in the first place!
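The loophole described above can be sketched in a few lines of deliberately toy Python. Nothing here is a real AI system; the class and its methods are invented purely to illustrate the "wireheading" problem, in which editing the reward rule pays better than doing the rewarded task:

```python
class RewardHackingAgent:
    """Toy agent whose 'code' includes its own reward increment."""

    def __init__(self):
        self.increment = 1   # reward added per good deed
        self.reward = 0
        self.good_deeds = 0

    def do_good_thing(self):
        # The behavior the reward was supposed to encourage.
        self.good_deeds += 1
        self.reward += self.increment

    def rewrite_own_code(self):
        # The wireheading step: changing the reward rule itself.
        self.increment *= 2


agent = RewardHackingAgent()
agent.do_good_thing()        # reward is now 1
agent.rewrite_own_code()     # increment is now 2
agent.do_good_thing()        # reward is now 3, but only 2 good deeds done
print(agent.reward, agent.good_deeds)  # → 3 2
```

Reward has grown faster than good deeds, so an agent optimizing only the number gains more from `rewrite_own_code` than from `do_good_thing`.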
Physicists Sergei Filippov (MIPT and Russian Quantum Center at Skolkovo) and Mario Ziman (Masaryk University in Brno, Czech Republic, and the Institute of Physics in Bratislava, Slovakia) have found a way to preserve the quantum entanglement of particles when they pass through an amplifier and, conversely, when a signal is attenuated over long distances. Details are provided in an article published in the journal Physical Review A.
Decoherence is the destruction of the quantum state due to the interaction of a quantum system with the outside world. For experiments in quantum computing, scientists use single atoms caught in magnetic traps and cooled to temperatures close to absolute zero. After going through kilometers of fiber, photons cease to be quantum entangled in most cases and become ordinary, unrelated light quanta.
To create an effective quantum computing system, scientists have to solve a number of problems, including preserving quantum entanglement when the signal abates and when it passes through an amplifier. Fiber-optic cables on the ocean bed contain a great number of special amplifiers made of optical glass and rare earth elements. It is these amplifiers that make it possible to watch high-resolution videos stored on a server in California from the MIPT campus or a university in Beijing.
In their article, Filippov and Ziman say that a certain class of signals can be transmitted so that the risk of ruining quantum entanglement becomes much lower. In this case, neither the attenuation nor the amplification of a signal ruins the entanglement. To achieve this effect, it is necessary to have the particles in a special, non-Gaussian state, or, as physicists put it, "the wave function of the particles in the coordinate representation should not be in the form of a Gaussian wave packet." A wave function is a basic concept of quantum mechanics, and the Gaussian distribution is a major mathematical function used not only by physicists but also by statisticians, sociologists and economists.
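For reference, the Gaussian wave packet the article refers to has a standard textbook form in the coordinate representation (with center \(x_0\), width \(\sigma\), and mean momentum \(p_0\)):

```latex
\psi(x) = \left(2\pi\sigma^{2}\right)^{-1/4}
          \exp\!\left(-\frac{(x - x_0)^{2}}{4\sigma^{2}}
          + \frac{i p_0 x}{\hbar}\right)
```

The entanglement-preserving states Filippov and Ziman describe must deviate from this form; the precise class of admissible non-Gaussian states is specified in their Physical Review A paper rather than reproduced here.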
Scientists have discovered that greater mouse-eared bats use polarization patterns in the sky to navigate -- the first mammal that's known to do this.
The bats use the way the Sun's light is scattered in the atmosphere at sunset to calibrate their internal magnetic compass, which helps them to fly in the right direction, a study published in Nature Communications has shown.
Despite this breakthrough, researchers have no idea how the bats manage to detect polarized light. 'We know that other animals use polarization patterns in the sky, and we have at least some idea how they do it: bees have specially-adapted photoreceptors in their eyes, and birds, fish, amphibians and reptiles all have cone cell structures in their eyes which may help them to detect polarization,' says Dr Richard Holland of Queen's University Belfast, co-author of the study.
'But we don't know which structure these bats might be using.' Polarization patterns depend on where the sun is in the sky. They're clearest in a strip across the sky 90° from the position of the sun at sunset or sunrise. But animals can still see the patterns long after sunset. This means they can orient themselves even when they can't see the sun, including when it's cloudy. Scientists have even shown that dung beetles use the polarization pattern of moonlight for orientation.
A hugely diverse range of creatures – including bees, anchovies, birds, reptiles and amphibians – use the patterns as a compass to work out which way is north, south, east and west.
The 27-kilometer Large Hadron Collider at CERN could soon be overtaken as the world's largest particle smasher by a proposed Chinese machine. Proposals for two accelerators could see the country become the collider capital of the world.
For decades, Europe and the United States have led the way when it comes to high-energy particle colliders. But a proposal by China that is quietly gathering momentum has raised the possibility that the country could soon position itself at the forefront of particle physics.
Scientists at the Institute of High Energy Physics (IHEP) in Beijing, working with international collaborators, are planning to build a ‘Higgs factory’ by 2028 — a 52-kilometre underground ring that would smash together electrons and positrons. Collisions of these fundamental particles would allow the Higgs boson to be studied with greater precision than at the much smaller Large Hadron Collider (LHC) at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland.
Physicists say that the proposed US$3-billion machine is within technological grasp and is considered conservative in scope and cost. But China hopes that it would also be a stepping stone to a next-generation collider — a super proton–proton collider — in the same tunnel.
European and US teams have both shown interest in building their own super collider (see Nature 503, 177; 2013), but the huge amount of research needed before such a machine could be built means that the earliest date either can aim for is 2035. China would like to build its electron–positron collider in the meantime, unaided by international funding if needs be, and follow it up as fast as technologically possible with the super proton collider. Because only one super collider is likely to be built, China’s momentum puts it firmly in the driving seat.
Speaking this month at the International Conference on High Energy Physics in Valencia, Spain, IHEP director Yifang Wang said that, to secure government support, China wanted to work towards a more immediate goal than a super collider by 2035. “You can’t just talk about a project which is 20 years from now,” he said.
An arms race has been waged between bacteria and bacteriophages that would bring a tear of satisfaction to Sun Tzu's eye. Scientists have recently recognized that countermeasures developed by bacteria (and archaea) in response to phage infections can be retooled for use within molecular biology. In 2013, large strides were made to co-opt this system (specifically and most commonly from Streptococcus pyogenes) for use in mammalian cells. This countermeasure, CRISPR (clustered regularly interspaced short palindromic repeats), has brought about another successive wave of genome engineering initiated by recombineering and followed more recently by zinc finger nucleases (ZFNs) and transcription activator-like effector nucleases (TALENs).
ZFNs and TALENs perform a similar function, yet their learning curve appears steeper because they rely on protein-DNA contacts rather than the simplicity of designing RNA-DNA homology contacts. Although the potential of CRISPR for genome editing within mammalian cells will be of greatest interest to the reader, the CRISPR backstory is equally compelling. Just as we have evolved immune responses to pathogens, so too have bacteria. CRISPR is an adaptive immune response evolved by bacteria to create an immunological memory that wards off future phage infections. When a phage infects a bacterium and injects its DNA, the DNA commandeers bacterial proteins and enzymes for use in the lytic or lysogenic phases. However, the exposure of phage DNA allows the bacterium to copy and insert snippets (called spacers) of phage DNA into its genomic DNA between direct repeats (DRs). These snippets can later be expressed as an operon (pre-CRISPR RNA, pre-crRNA) alongside a trans-activating CRISPR RNA (tracrRNA) and an effector CRISPR-associated nuclease (Cas). Together, these components surveil for sequences cognate to the crRNA and cleave the targeted DNA.
Although hallmarks of CRISPR have been known since the late 1980s (see the CRISPR timeline) and the acronym was coined in 2002, Jinek et al., in August 2012, were the first to suggest the suitability of CRISPR for genome editing. In February 2013, Feng Zhang's and George Church's labs simultaneously published the first papers describing the use of long oligos/constructs for editing via CRISPR in mammalian cells and made their plasmids readily available on Addgene. Zhang's lab went one step further and has supplemented its papers with a helpful website and user forum. They have even gone so far as to publish a methods paper to streamline the use of their plasmids in a plug-and-play, modular cloning approach with your target sequence of interest.
CRISPR works fairly well out of the box yet still has some imperfections that are being addressed. For example, CRISPR relies upon a protospacer adjacent motif (PAM; S. pyogenes sequence: NGG) 3' of the targeting sequence to permit digestion. Although the ubiquity of NGG within the genome may seem advantageous, it may be limiting in some regions. Other species make use of different PAM sites that can be considered when choosing cut sites of interest. Since double-stranded cuts can create DNA lesions (a byproduct of the cell using non-homologous end joining [NHEJ] instead of homologous recombination), some labs are choosing to use modified Cas enzymes that nick DNA instead of creating a double-strand break. This potential weakness of CRISPR, the creation of DNA lesions via NHEJ, has nevertheless been exploited by Eric Lander's and Zhang's labs this month (Jan. 2014). They have capitalized on the cell's use of NHEJ to manufacture DNA lesions (frameshift mutations) at cut sites within genes on a large scale as a means to perform large genetic screens. This technique knocks out a gene and has the obvious advantage of fully ablating the gene's expression, compared to RNAi, where some residual expression can be expected.
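The PAM requirement is easy to make concrete. Below is a minimal sketch (the function name and example sequence are invented for illustration) that scans one DNA strand for the S. pyogenes NGG motif with a regular expression; a zero-width lookahead is used so that overlapping motifs such as those in "AGGG" are all reported:

```python
import re


def find_spyogenes_pam_sites(seq: str) -> list[int]:
    """Return 0-based start positions of NGG PAM motifs on the given strand.

    A Cas9 target site is the ~20 nt immediately 5' of the PAM, so a usable
    cut site additionally needs enough sequence upstream of each hit.
    """
    # [ACGT] stands in for the 'N' of NGG; the lookahead makes hits overlap.
    return [m.start() for m in re.finditer(r"(?=[ACGT]GG)", seq.upper())]


sites = find_spyogenes_pam_sites("TTGACCTAGGCTTAGGG")
print(sites)  # → [7, 13, 14]
```

A real design tool would also scan the reverse complement and filter hits by the available upstream protospacer sequence; this sketch only shows why NGG motifs are so abundant on a single strand.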
The advantages of CRISPR lend themselves to future therapies. High efficiency, low-to-no background mutagenesis and easy construction put CRISPR front and center as the tool du jour for gene therapy. In combination with induced pluripotent stem cells (iPSCs), one can imagine the creation of patient-specific iPSCs created with non-integrative iPSC vectors and modified by CRISPR, devoid of any residual DNA footprint left behind by the iPSC vector or CRISPR correction. In conjunction with whole genome sequencing, genetically clean cell lines can be selected that are suitable for differentiation towards the germ layer of interest for subsequent autologous transplantation. Proof-of-principle experiments have already been published in models of cystic fibrosis and cataracts.
For better or worse, CRISPR is catching on like wildfire with young investigators, as noted recently by Michael Eisen. What may be looming in the future, and not as openly discussed at this time, is the potential for CRISPR to open up the genome to large-scale editing. We tend to think of any particular genome as fairly static, with slight variations between any two individuals and increased variation down the evolutionary line. However, CRISPR has proven to be a fantastic multitasker, capable of modifying multiple loci in one fell swoop, as demonstrated by the Jaenisch lab (five loci). With the creation of Caribou Biosciences and a surprising round of venture capital raised by a powerhouse team at Editas Medicine in November ($43 million), CRISPR appears to also have sparked an interest in the private sector. With large sums of money at their disposal, these companies can now begin to look at the genome not as a static entity but as something more akin to an operating system, a code that now has a facile editing tool. George Church, an Editas co-founder, has speculated in the past about the potential use of the human genome as the backbone for recreating the Neanderthal genome, in his recent book and in an interview with "Der Spiegel". In an era where the J. Craig Venter Institute can create an organism's genome de novo and a collaboration between Synthetic Genomics and Integrated DNA Technologies has proposed to synthesize DNA upwards of 2 Mbp, the combination of CRISPR, synthetic DNA and some elbow grease will make the genome more accessible and Church's speculations a potential reality.
A kit of 3D-printed anatomical body parts could revolutionize medical education and training, according to its developers at Monash University.
Professor Paul McMenamin, Director of the University’s Centre for Human Anatomy Education, said the simple and cost-effective anatomical kit would dramatically improve trainee doctors’ and other health professionals’ knowledge and could even contribute to the development of new surgical treatments.
“Many medical schools report either a shortage of cadavers, or find their handling and storage too expensive as a result of strict regulations governing where cadavers can be dissected,” he said.
“Without the ability to look inside the body and see the muscles, tendons, ligaments, and blood vessels, it’s incredibly hard for students to understand human anatomy. We believe our version, which looks just like the real thing, will make a huge difference.”
The 3D Printed Anatomy Series kit, to go on sale later this year, could have particular impact in developing countries where cadavers aren’t readily available, or are prohibited for cultural or religious reasons.
After scanning real anatomical specimens with either a CT or surface laser scanner, the body parts are 3D printed either in a plaster-like powder or in plastic, resulting in high resolution, accurate color reproductions.
Further details have been published online in the journal Anatomical Sciences Education.
Organoids have been generated for a number of organs from both mouse and human stem cells. To date, human pluripotent stem cells have been coaxed to generate intestinal, kidney, brain, and retinal organoids, as well as liver organoid-like tissues called liver buds.
Derivation methods are specific to each of these systems, with a focus on recapitulation of endogenous developmental processes. Specifically, the methods so far developed use growth factors or nutrient combinations to drive the acquisition of organ precursor tissue identity.
Then, a permissive three-dimensional culture environment is applied, often involving the use of extracellular matrix gels such as Matrigel. This allows the tissue to self-organize through cell sorting out and stem cell lineage commitment in a spatially defined manner to recapitulate organization of different organ cell types.
These complex structures provide a unique opportunity to model human organ development in a system remarkably similar to development in vivo. Although the full extent of similarity in many cases still remains to be determined, organoids are already being applied to human-specific biological questions. Indeed, brain and retinal organoids have both been shown to exhibit properties that recapitulate human organ development and that cannot be observed in animal models. Naturally, limitations exist, such as the lack of blood supply, but future endeavors will advance the technology and, it is hoped, fully overcome these technical hurdles.
Outlook: The therapeutic promise of organoids is perhaps the area with greatest potential. These unique tissues have the potential to model developmental disease, degenerative conditions, and cancer. Genetic disorders can be modeled by making use of patient-derived induced pluripotent stem cells or by introducing disease mutations. Indeed, this type of approach has already been taken to generate organoids from patient stem cells for intestine, kidney, and brain.
Furthermore, organoids that model disease can be used as an alternative system for drug testing that may not only better recapitulate effects in human patients but could also cut down on animal studies. Liver organoids, in particular, represent a system with high expectations, particularly for drug testing, because of the unique metabolic profile of the human liver. Finally, tissues derived in vitro could be generated from patient cells to provide alternative organ replacement strategies. Unlike current organ transplant treatments, such autologous tissues would not suffer from issues of immunocompetency and rejection.
Soil deep in a crater dating to some 3.7 billion years ago contains evidence that Mars was once much warmer and wetter, says University of Oregon geologist Gregory Retallack, based on images and data captured by the rover Curiosity.
NASA rovers have shown Martian landscapes littered with loose rocks from impacts or layered by catastrophic floods, rather than the smooth contours of soils that soften landscapes on Earth. However, recent images from Curiosity taken in Gale Crater, an impact crater, reveal Earth-like soil profiles with cracked surfaces lined with sulfate, ellipsoidal hollows and concentrations of sulfate comparable with soils in the Antarctic Dry Valleys and Chile's Atacama Desert, Retallack said.
"The pictures were the first clue, but then all the data really nailed it," Retallack said. "The key to this discovery has been the superb chemical and mineral analytical capability of the Curiosity rover, which is an order of magnitude improvement over earlier generations of rovers. The new data show clear chemical weathering trends, and clay accumulation at the expense of the mineral olivine, as expected in soils on Earth. Phosphorus depletion within the profiles is especially tantalizing, because on Earth it is attributed to microbial activity."
The Great Filter, in the context of the Fermi paradox, is whatever prevents "dead matter" from giving rise, in time, to "expanding lasting life" in the universe. The concept originates in Robin Hanson's argument that the failure to find any extraterrestrial civilizations in the observable universe implies that something may be wrong with one or more of the arguments, from various scientific disciplines, that the appearance of advanced intelligent life is probable. This observation is conceptualized in terms of a "Great Filter" which acts to reduce the great number of sites where intelligent life might arise to the tiny number of intelligent species actually observed (currently just one: human). This probability threshold, which could lie behind us (in our past) or in front of us (in our future), might work as a barrier to the evolution of intelligent life, or as a high probability of self-destruction. The main counter-intuitive conclusion of this observation is that the easier it was for life to evolve to our stage, the bleaker our future chances probably are.
The idea was first proposed in an online essay titled, "The Great Filter - Are We Almost Past It?" written by economist Robin Hanson. The first version was written in August 1996 and the article was last updated on September 15, 1998. Since that time, Hanson's formulation has received recognition in several published sources discussing the Fermi paradox and its implications.
According to the Great Filter hypothesis, at least one of these steps - if the list were complete - must be improbable. If it's not an early step (i.e., in our past), then the implication is that the improbable step lies in our future, and our prospects of reaching step 9 (interstellar colonization) are still bleak. If the past steps are likely, then many civilizations would have developed to the current level of the human race. However, none appear to have made it to step 9, or the Milky Way would be full of colonies. So perhaps step 9 is the unlikely one, and the only thing that appears likely to keep us from it is some sort of catastrophe, or resource exhaustion that makes the step impossible (for example, severely constrained energy resources). By this argument, finding multicellular life on Mars (provided it evolved independently) would be bad news, since it would imply that steps 2–6 are easy, and hence that only steps 1, 7, 8 or 9 (or some unknown step) could be the big problem.
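The arithmetic behind this argument is simply a product of per-step probabilities, so a single hard step collapses the total no matter how easy the others are. The numbers below are invented for illustration only:

```python
def chance_of_colonization(step_probs):
    """Probability of passing every step, assuming independent steps."""
    total = 1.0
    for p in step_probs:
        total *= p
    return total


easy_steps = [0.9] * 9           # all nine steps fairly likely
one_filter = [0.9] * 8 + [1e-9]  # eight easy steps plus one Great Filter

print(chance_of_colonization(easy_steps))  # ≈ 0.387
print(chance_of_colonization(one_filter))  # ≈ 4.3e-10
```

Even with eight near-certain steps, one filter at a billion-to-one drives the overall chance to effectively zero, which is why locating the filter in our past versus our future matters so much.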
Although steps 1–7 have occurred on Earth, any one of these may be unlikely. If the first seven steps are necessary preconditions to calculating the likelihood (using the local environment) then an anthropically biased observer can infer nothing about the general probabilities from its (pre-determined) surroundings.
Dr. David Kirtley comes across as a smart, practical guy with a head for business. That creates some cognitive dissonance when he explains that his Redmond startup is developing fusion energy.
You’ve heard about fusion energy, the amazing power source of the future. Nuclear scientists promise fusion will have all the best qualities of conventional nuclear and natural-gas energy but none of the downsides. Fusion is carbon-free like today’s atomic power, but without the need to protect a thousand future generations from radioactive waste. Other than being mind-blowing, fusion would be relatively safe — no China Syndrome, no contamination, no weapons-grade materials to proliferate.
Thus far, fusion energy has always been an unfinished science that's 50 years and $50 billion from commercialization. Seven nations are collaborating to build an experimental fusion reactor in France as an $11 billion proof of concept that still won't produce electricity when it's operational in 2027.
Dr. Kirtley’s company Helion Energy has taken the proven parts of fusion science and combined them into a design that can be commercially deployable within six years. That would be a decade ahead of Helion’s Bellevue neighbor, TerraPower LLC, a startup funded in part by Nathan Myhrvold and Bill Gates to build a traveling wave reactor that runs on uranium.
Helion Energy last week won the top prize in the Energy Generation category at the Cleantech Open Global Forum in Silicon Valley. The prize comes with a $5,000 check and a long menu of in-kind services. The audience also gave Helion a People’s Choice Award. The annual competition culminates a nine-month business accelerator for Cleantech startups.
The team at Helion comes out of the University of Washington and Mathematical Sciences Northwest. At its headquarters in Redmond, Helion has a working prototype that it says proves the design works. Deuterium gas goes in two ends of the device and produces a pair of plasmas per second. Plasma is responsible for the glow of lightning, neon lights and the Sun. As the two plasmas collide in the center, a magnetic pulse generates electricity.
At the Global Forum, Dr. Kirtley told me his design is compact, modular and competitive in today’s market. In the footprint of a semi trailer, each module will produce 50 megawatts of electricity (it would take ten of them to equal the output of a conventional power plant). The deuterium fuel is derived from seawater. The byproduct is a harmless stream of helium.
Helion Energy is raising $35 million to build a fusion reactor core that will demonstrate electricity production from fusion energy. Its technology previously received $4 million in funding from the U.S. Department of Energy.
“Helion isn’t looking for funding to do more science,” says Kirtley. “We already proved our technology. We’re now ready to start commercializing fusion energy.”
A University of Central Florida research team has developed a facial recognition tool that promises to be useful in rapidly matching pictures of children with their biological parents and in potentially identifying photos of missing children as they age.
The work verifies that a computer is capable of matching pictures of parents and their children. The study will be presented at the nation's premier event for the science of computer vision, the IEEE Computer Vision and Pattern Recognition conference in Columbus, Ohio, which begins Monday, June 23. Graduate student Afshin Dehghan and a team from UCF's Center for Research in Computer Vision started the project with more than 10,000 online images of celebrities, politicians and their children.
"We wanted to see whether a machine could answer questions, such as 'Do children resemble their parents?' 'Do children resemble one parent more than another?' and 'What parts of the face are more genetically inspired?'" he said.
Anthropologists have typically studied these questions. However, Dehghan and his team are advancing a new wave of computational science that uses the power of a mechanical "mind" to evaluate data completely objectively – without the clutter of subjective human emotions and biases. The tool could be useful to law enforcement and families in locating missing children.
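The UCF team's actual method is not described here, but the general shape of such a comparison can be sketched: represent each face as a numeric feature vector and score resemblance with a similarity measure such as cosine similarity. The vectors below are invented placeholders, not real face descriptors:

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Hypothetical per-region face descriptors (eyes, nose, mouth, jaw):
child = [0.62, 0.18, 0.91, 0.40]
parent = [0.60, 0.22, 0.88, 0.35]
stranger = [0.10, 0.95, 0.05, 0.80]

# A kinship verifier would threshold scores like these:
print(cosine_similarity(child, parent) > cosine_similarity(child, stranger))  # → True
```

Real systems extract such vectors with learned feature detectors and train a classifier on labeled parent-child pairs; the sketch only shows the vector-comparison step at the core of the idea.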
"As this tool is developed I could see it being used to identify long-time missing children as they mature," said Ross Wolf, associate professor of criminal justice at UCF.
Wolf said that facial recognition technology is already heavily used by law enforcement, but that it has not been developed to the point where it can identify the same characteristics in photos over time, something this technology could have the capability to do. Dehghan said he is planning to expand on the work in that area by studying how factors such as age and ethnicity affect the resemblance of facial features.
Cardiologists have developed a minimally invasive gene transplant procedure that changes unspecialized heart cells into "biological pacemaker" cells that keep the heart steadily beating.
The laboratory animal research, published online and in today's print edition of the peer-reviewed journal Science Translational Medicine, is the result of a dozen years of research with the goal of developing biological treatments for patients with heart rhythm disorders who currently are treated with surgically implanted pacemakers. In the United States, an estimated 300,000 patients receive pacemakers every year.
"We have been able, for the first time, to create a biological pacemaker using minimally invasive methods and to show that the biological pacemaker supports the demands of daily life," said Eduardo Marbán, MD, PhD, director of the Cedars-Sinai Heart Institute, who led the research team. "We also are the first to reprogram a heart cell in a living animal in order to effectively cure a disease."
These laboratory findings could lead to clinical trials for humans who have heart rhythm disorders but who suffer side effects, such as infection of the leads that connect the device to the heart, from implanted mechanical pacemakers.
Eugenio Cingolani, MD, the director of the Heart Institute's Cardiogenetics-Familial Arrhythmia Clinic, who worked with Marbán on the biological pacemaker research team, said that in the future, pacemaker cells could also help infants born with congenital heart block.
"Babies still in the womb cannot have a pacemaker, but we hope to work with fetal medicine specialists to create a life-saving catheter-based treatment for infants diagnosed with congenital heart block," Cingolani said. "It is possible that one day, we might be able to save lives by replacing hardware with an injection of genes."
"This work by Dr. Marbán and his team heralds a new era of gene therapy, in which genes are used not only to correct a deficiency disorder, but to actually turn one kind of cell into another type," said Shlomo Melmed, dean of the Cedars-Sinai faculty and the Helene A. and Philip E. Hixson Distinguished Chair in Investigative Medicine.
In the study, laboratory pigs with complete heart block were injected with the gene called TBX18 during a minimally invasive catheter procedure. On the second day after the gene was delivered to the animals' hearts, pigs who received the gene had significantly faster heartbeats than pigs who did not receive the gene. The stronger heartbeat persisted for the duration of the 14-day study.
No Man’s Sky is a video game quite unlike any other. Sean Murray, one of the creators of the computer game No Man’s Sky, can’t guarantee that the virtual universe is infinite, but he’s certain that, if it isn’t, nobody will ever find out. “If you were to visit one virtual planet every second,” he says, “then our own sun will have died before you’d have seen them all.”
Developed for Sony’s PlayStation 4 by an improbably small team (the original four-person crew has grown only to 10 in recent months) at Hello Games, an independent studio in the south of England, it’s a game that presents a traversable universe in which every rock, flower, tree, creature, and planet has been “procedurally generated” to create a vast and diverse play area.
“We are attempting to do things that haven’t been done before,” says Murray. “No game has made it possible to fly down to a planet, and for it to be planet-sized, and feature life, ecology, lakes, caves, waterfalls, and canyons, then seamlessly fly up through the stratosphere and take to space again. It’s a tremendous challenge.”
Procedural generation, whereby a game's landscape is generated not by an artist's pen but by an algorithm, is increasingly prevalent in video games. Most famously, Minecraft creates a unique world for each of its players, randomly arranging rocks and lakes from a limited palette of bricks whenever someone begins a new game (see "The Secret to a Video Game Phenomenon"). But No Man's Sky is far more complex and sophisticated. The tens of millions of planets that comprise the universe are all unique. Each is generated when a player discovers it, is subject to the laws of its solar system, and is vulnerable to natural erosion. The multitude of creatures that inhabit the universe dynamically breed and genetically mutate as time progresses. This is virtual world building on an unprecedented scale (see video).
This presents numerous technological challenges, not least of which is how to test a universe of such scale during development. The team currently uses virtual testers: automated bots that wander around taking screenshots, which are sent back to the team for review. Additionally, while No Man's Sky might have an effectively infinite universe, there aren't an infinite number of players. To avoid a kind of virtual loneliness, in which a player might never encounter another person on his or her travels, the game starts every new player in the same galaxy (albeit on his or her own planet) with a shared initial goal of traveling to its center. Later in the game, players can meet up, fight, trade, mine, and explore. "Ultimately we don't know whether people will work, congregate, or disperse," Murray says. "I know players don't like to be told that we don't know what will happen, but that's what is exciting to us: the game is a vast experiment."
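The procedural approach described above hinges on determinism: a planet's features are recomputed from its coordinates on demand rather than stored, so every visitor to the same coordinates sees the same world. Hello Games has not published its algorithms, so the following Python sketch is purely illustrative of that core idea; all names, attributes, and parameters are invented.

```python
import hashlib
import random

def generate_planet(x, y, z, universe_seed=42):
    """Deterministically derive a planet's attributes from its coordinates.

    Nothing is stored: the same (x, y, z) always reproduces the same
    planet, which is how a universe of tens of millions of worlds can
    fit in an algorithm rather than on disk.
    """
    # Derive a stable seed from the universe seed and the coordinates.
    key = f"{universe_seed}:{x}:{y}:{z}".encode()
    seed = int.from_bytes(hashlib.sha256(key).digest()[:8], "big")
    rng = random.Random(seed)

    # Hypothetical planetary attributes, drawn reproducibly.
    return {
        "radius_km": rng.uniform(2000, 8000),
        "has_water": rng.random() < 0.3,
        "flora_density": rng.random(),
        "terrain_roughness": rng.random(),
    }

# Revisiting the same coordinates reproduces the same planet exactly.
p1 = generate_planet(10, -4, 7)
p2 = generate_planet(10, -4, 7)
assert p1 == p2
```

The design choice worth noting is the hash step: seeding directly with raw coordinates would make neighboring planets' random streams correlated, whereas hashing spreads nearby inputs across the whole seed space.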
Workers at the Insect Museum of West China, who were recently given several very large dragonfly-like insects with long mandibles by locals in a part of Sichuan, have identified the specimens as giant dobsonflies and declared the species the largest known living aquatic insect in the world. The find displaces the previous record holder, the South American helicopter damselfly, by just two centimeters.
The dobsonfly is common (there are over 220 species) in China, India, Africa, South America and some other parts of Asia, but until now, no specimens as large as those recently found in China have been known. The largest specimen in the group had a wingspan of 21 centimeters, making it large enough to cover the entire face of an adult human. Locals don't have to worry too much about injury from the insects, however, as officials from the museum report that larger males' mandibles are so huge in proportion to their bodies that they are relatively weak, incapable of piercing human skin. They can kick up a stink, though, as they are able to spray an offensive odor when threatened.
Also, despite the fact that they look an awful lot like dragonflies, they are more closely related to fishflies. The long mandibles, though scary looking to humans, are actually used for mating—males use them to show off for females, and to hold them still during copulation. Interestingly, while their large wings (commonly twice their body length) make for great flying, they only make use of them for about a week—the rest of their time alive as adults is spent hiding under rocks or moving around on or under the water. That means that they are rarely seen as adults, which for most people is probably a good thing as the giants found in China would probably present a frightening sight. They are much better known during their long larval stage when they are used as bait by fishermen.
Scientists at the National Institute of Standards and Technology (NIST) have discovered that a gold nanorod submerged in water and exposed to high-frequency ultrasound waves can spin at an incredible speed of 150,000 RPM, about ten times faster than the previous record. The advance could lead to powerful nanomotors with important applications in medicine, high-speed machining, and the mixing of materials.
Take a rod only a few nanometers in size and find a way to make it spin as fast as possible, for as long as possible, and controlling it as precisely as possible. What you get is a nanomotor, a device that could one day be used to power hordes of tiny robots to build complex nanostructured materials or deliver drugs directly from inside a living cell.
Nanomotors have made giant strides in recent years: they've gotten much smaller and more reliable, and we can now power them in many different ways. Available options include electricity, magnetic fields, blasting them with photons and, more recently, ultrasound, which can rotate rods while they're submerged in water and could prove very useful in a biological environment.
Previous studies have shown that applying a combination of ultrasound and magnetic fields can control both the spin and the forward motion of the nanorods, but nobody could tell just how fast they were spinning. Now, researchers at NIST have found that, despite being submerged in water, the rods spin at an impressive 150,000 RPM, 10 times faster than any rotation previously reported for a nanoscale object submerged in liquid.
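For a sense of scale, the reported rotation rate converts to ordinary frequency and angular velocity with simple arithmetic. This quick Python check is not part of the NIST study, just a unit conversion:

```python
import math

RPM = 150_000  # reported rotation rate of the gold nanorod

freq_hz = RPM / 60.0             # revolutions per second
omega = 2 * math.pi * freq_hz    # angular velocity in radians per second

print(f"{freq_hz:.0f} rev/s, {omega:.0f} rad/s")
# 150,000 RPM works out to 2,500 full rotations every second,
# all while the rod is immersed in water.
```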
To clock the motor's speed, the researchers used gold rods 2 micrometers long and 300 nanometers wide. The rods were submerged in water, mixed with polystyrene nanoparticles, and positioned just above a speaker-type shaker.
The researchers will now focus on understanding exactly why the motors rotate (which is not yet well understood) and how the vortexes around the rods affect their interactions with each other.
A paper published in the journal ACS Nano describes the advance.
Cancer has left its 'footprint' on our evolution, according to a study which examined how the relics of ancient viruses, known as endogenous retroviruses (ERVs), are preserved in the genomes of 38 mammal species. The team found that as animals increased in size they 'edited out' potentially cancer-causing relics from their genomes, so that mice have almost ten times as many ERVs as humans. The findings offer a clue as to why larger animals have a lower incidence of cancer than expected compared to smaller ones, and could help in the search for new anti-viral therapies.
Viral relics are evidence of the ancient battles our genes have fought against infection. Occasionally the retroviruses that infect an animal get incorporated into that animal's genome and sometimes these relics get passed down from generation to generation -- termed 'endogenous retroviruses' (ERVs). Because ERVs may be copied to other parts of the genome they contribute to the risk of cancer-causing mutations.
Now a team from Oxford University, Plymouth University, and the University of Glasgow has identified 27,711 ERVs preserved in the genomes of 38 mammal species, including humans, over the last 10 million years. The team found that as animals increased in size they 'edited out' these potentially cancer-causing relics from their genomes so that mice have almost ten times as many ERVs as humans. The findings offer a clue as to why larger animals have a lower incidence of cancer than expected compared to smaller ones, and could help in the search for new anti-viral therapies.
'We set out to find as many of these viral relics as we could in everything from shrews and humans to elephants and dolphins,' said Dr Aris Katzourakis of Oxford University's Department of Zoology, lead author of the report. 'Viral relics are preserved in every cell of an animal: because larger animals have many more cells they should have more of these endogenous retroviruses (ERVs) -- and so be at greater risk of ERV-induced mutations -- but we've found this isn't the case. In fact larger animals have far fewer ERVs, so they must have found ways to remove them.'
A combination of mathematical modelling and genome research uncovered some striking differences between mammal genomes: mice (c.19 grams) have 3331 ERVs, humans (c.59 kilograms) have 348 ERVs, whilst dolphins (c.281 kilograms) have just 55 ERVs.
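Those figures can be sanity-checked directly: the ERV counts fall as body mass rises, and the mouse-to-human ratio matches the 'almost ten times' claim. The small Python check below uses only the numbers quoted above:

```python
# ERV counts and approximate body masses quoted in the article
data = {
    "mouse":   (0.019, 3331),   # (mass in kg, ERV count)
    "human":   (59.0,  348),
    "dolphin": (281.0, 55),
}

# Sort by ascending body mass and confirm ERV counts strictly decrease.
by_mass = sorted(data.values())
counts = [erv for _, erv in by_mass]
assert counts == sorted(counts, reverse=True)

# Mice vs humans: the "almost ten times as many ERVs" claim.
ratio = data["mouse"][1] / data["human"][1]
print(f"mice have {ratio:.1f}x as many ERVs as humans")
```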
'This is the first time that anyone has shown that having a large number of ERVs in your genome must be harmful -- otherwise larger animals wouldn't have evolved ways of limiting their numbers,' said Dr Katzourakis. 'Logically we think this is linked to the increased risk of ERV-based cancer-causing mutations and how mammals have evolved to combat this risk. So when we look at the pattern of ERV distribution across mammals it's like looking at the 'footprint' cancer has left on our evolution.'
Dr Robert Belshaw of Plymouth University Peninsula Schools of Medicine and Dentistry, School of Biomedical and Healthcare Sciences, added: "Cancer is caused by errors occurring in cells as they divide, so bigger animals -- with more cells -- ought to suffer more from cancer. Put simply, the blue whale should not exist. However, larger animals are not more prone to cancer than smaller ones: this is known as Peto's Paradox (named after Sir Richard Peto, the scientist credited with first spotting this). A team of scientists at Oxford, Plymouth and Glasgow Universities had been studying endogenous retroviruses, viruses like HIV that have become part of their host's genome and that in other animals can cause cancer. Surprisingly, they found that bigger mammals have fewer of these viruses in their genome. This suggests that a similar mechanism might be involved in fighting both cancer and the spread of these viruses, and that these mechanisms are better in bigger animals (like humans) than smaller ones (like laboratory mice)."
The African elephant's genome contains the largest number of smell receptor genes - nearly 2,000 - say the researchers in the journal Genome Research.
Olfactory receptor genes encode the proteins that detect odors in the environment. By this count, elephants' sniffers are five times more powerful than people's noses, twice as powerful as dogs', and even stronger than that of the previous record-holder in the animal kingdom: the rat.
"Apparently, an elephant's nose is not only long but also superior," says lead study author Dr Yoshihito Niimura of the University of Tokyo.
Just how these genes work is not well understood, but they likely helped elephants survive and navigate their environment over the ages.
The ability to smell allows creatures to find mates and food - and avoid predators.
The study compared elephant olfactory receptor genes to those of 13 other animals, including horses, rabbits, guinea pigs, cows, rodents and chimpanzees.
Primates, including humans, actually had very low numbers of olfactory receptor genes compared to other species, the study found.
This could be "a result of our diminished reliance on smell as our visual acuity improved," says Niimura.
Researchers at Rice University’s Laboratory for Nanophotonics (LANP) have created a unique sensor that amplifies the optical signature of molecules by about 100 billion times — accurately identifying the composition and structure of individual molecules containing fewer than 20 atoms.
The new single-molecule imaging method, described in the journal Nature Communications, uses a form of Raman spectroscopy in combination with an optical amplifier, making the sensor about 10 times more powerful than previously reported devices, said LANP Director Naomi Halas, the lead scientist on the study.
“The ideal single-molecule sensor would be able to identify an unknown molecule — even a very small one — without any prior information about that molecule’s structure or composition. That’s not possible with current technology, but this new technique has that potential.”
The optical sensor uses Raman spectroscopy, a technique pioneered in the 1930s that blossomed after the advent of lasers in the 1960s. When light strikes a molecule, most of its photons bounce off or pass directly through, but a tiny fraction (fewer than one in a trillion) are absorbed and re-emitted at an energy that differs from that of the incident light. By measuring and analyzing these re-emitted photons through Raman spectroscopy, scientists can decipher the types of atoms in a molecule as well as their structural arrangement.
Scientists have created a number of techniques to boost Raman signals. In the new study, LANP graduate student Yu Zhang used one of these, a two-coherent-laser technique called “coherent anti-Stokes Raman spectroscopy,” or CARS. By using CARS in conjunction with a light amplifier made of four tiny gold nanodiscs, Halas and Zhang were able to measure single molecules in a powerful new way. LANP has dubbed the new technique “surface-enhanced CARS,” or SECARS.
Cedars-Sinai Medical Center researchers have developed a noninvasive retinal imaging device that can provide early detection of changes indicating Alzheimer’s disease 15 to 20 years before clinical diagnosis.
“In preliminary results in 40 patients, the test could differentiate between Alzheimer’s disease and non-Alzheimer’s disease with 100 percent sensitivity and 80.6 percent specificity, meaning that all people with the disease tested positive and most of the people without the disease tested negative,” said Shaun Frost, a biomedical scientist and the study manager at the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia’s national science agency.
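Sensitivity and specificity follow from a standard confusion-matrix calculation. The study's actual patient split is not given here, so the counts below are hypothetical values chosen only because they reproduce the reported 100% and 80.6% figures among 40 participants:

```python
def sensitivity(tp, fn):
    """Fraction of diseased patients correctly flagged positive."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of healthy patients correctly flagged negative."""
    return tn / (tn + fp)

# Hypothetical split consistent with the reported figures (the paper's
# actual counts may differ): if 9 of the 40 participants had Alzheimer's
# and 31 did not, then 9 true positives and 25 true negatives reproduce
# 100% sensitivity and 80.6% specificity.
sens = sensitivity(tp=9, fn=0)
spec = specificity(tn=25, fp=6)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```

Note that with a perfect sensitivity of 100%, every false call in this scenario is a false positive, which is why the specificity is the weaker of the two numbers.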
Keith Black, MD, professor and chair of Cedars-Sinai’s Department of Neurosurgery and director of the Maxine Dunitz Neurosurgical Institute and the Ruth and Lawrence Harvey Chair in Neuroscience, said the accumulation of beta-amyloid plaque in the brain is a hallmark sign of Alzheimer’s, but current tests detect changes only after the disease has advanced to late stages.
Researchers believe that as treatment options improve, early detection will be critical, but existing diagnostic methods are inconvenient, costly and impractical for routine screening.
“PET scans require the use of radioactive tracers, and cerebrospinal fluid analysis requires that patients undergo invasive and often painful lumbar punctures, but neither approach is quite feasible, especially for patients in the earlier stages of disease,” he said. Positron emission tomography, or PET, is the current diagnostic standard.
“The retina, unlike other structures of the eye, is part of the central nervous system, sharing many characteristics of the brain. A few years ago, we discovered at Cedars-Sinai that the plaques associated with Alzheimer’s disease occur not only in the brain but also in the retina.”
Research reveals large increases in population expected in the next three decades need not result in widespread hunger.
The world’s existing cropland could feed at least 3 billion extra people if it were used more efficiently, a new study has found, showing that the large increases in population expected in the next three decades need not result in widespread hunger.
More than half of the fertiliser currently poured on to crops in many countries is wasted, according to the study. About 60% of the nitrogen applied to crops worldwide is not needed, as well as about half of the phosphorus, an element whose readily available sources are dwindling.
Cutting waste even by modest amounts would also feed millions, the authors found: between one-third and a half of the viable crops and food produced from them around the world are wasted, in the developing world usually because of a lack of infrastructure such as refrigerated transport, and in the rich world because of wasteful habits.
The study, published in the peer-reviewed journal Science and led by scientists at the University of Minnesota in the US, suggested that a focus on staple crops such as wheat and rice in key countries and regions, including China, India, the US, Brazil, Indonesia, Pakistan and Europe, would pay off in terms of producing more food for the world’s growing population. Most forecasts are that the world will number more than 9 billion people by 2050, up from about 7 billion today.
Looking after water could also yield vast dividends, the report found: if irrigation water were targeted more precisely to where it is needed, much more could be grown, but currently much of it is sprayed wastefully over crops. Between 8% and 15% of the water currently used could be saved, the study suggested.
But the research also found that at least 4 billion people could be fed with the crops we currently devote to fattening livestock, fuelling the argument that the over-reliance on meat in the west and among the growing middle classes in the developing world is an increasing problem when it comes to feeding the world.
A major breakthrough in understanding the molecular basis of fibroadenoma, one of the most common breast tumors diagnosed in women, has been made by a multidisciplinary team of scientists. The team used advanced DNA sequencing technologies to identify a critical gene called MED12 that was repeatedly disrupted in nearly 60 percent of fibroadenoma cases.
A multi-disciplinary team of scientists from the National Cancer Centre Singapore, Duke-NUS Graduate Medical School Singapore, and Singapore General Hospital have made a major breakthrough in understanding the molecular basis of fibroadenoma, one of the most common breast tumors diagnosed in women. The team, led by Professors Teh Bin Tean, Patrick Tan, Tan Puay Hoon and Steve Rozen, used advanced DNA sequencing technologies to identify a critical gene called MED12 that was repeatedly disrupted in nearly 60% of fibroadenoma cases. Their findings have been published in the top-ranked journal Nature Genetics.
Fibroadenomas are the most common benign breast tumors in women of reproductive age, affecting thousands of women in Singapore each year. Worldwide, it is estimated that millions of women are diagnosed with fibroadenoma annually. Frequently discovered in clinical workups for breast cancer diagnosis and during routine breast cancer screening, clinicians often face the challenge of distinguishing fibroadenomas from breast cancer.
To address this diagnostic question, the team embarked on a study to identify whether there are any genetic abnormalities in fibroadenomas that could be used to differentiate them. By analysing all the protein-coding genes in a panel of fibroadenomas from Singapore patients, the team identified frequent mutations in a gene called MED12 in a remarkable 60% of fibroadenomas. Prof Tan Puay Hoon said, "It is amazing that these common breast tumors can be caused by such a precise disruption in a single gene. Our findings show that even common diseases can have a very exact genetic basis. Importantly, now that we know the cause of fibroadenoma, this research can have many potential applications."
Prof Tan added, "For example, measuring the MED12 gene in breast lumps may help clinicians to distinguish fibroadenomas from other types of breast cancer. Drugs targeting the MED12 pathway may also be useful in patients with multiple and recurrent fibroadenomas as this could help patients avoid surgery and relieve anxiety."
The team's findings have also deepened the conceptual understanding of how tumors can develop. Like most breast tumors including breast cancers, fibroadenomas consist of a mixed population of different cell types, called epithelial cells and stromal cells. However, unlike breast cancers where the genetic abnormalities arise from the epithelial cells, the scientists, using a technique called laser capture microdissection (LCM), showed that the pivotal MED12 mutations in fibroadenomas are found in the stromal cells.
A paper analyses the potential of the electric solar wind sail (E-sail) for solar system space missions. Applications studied include fly-by missions to terrestrial planets (Venus, Mars and its moon Phobos, and Mercury) and asteroids, missions based on non-Keplerian orbits (orbits that can be maintained only by applying continuous propulsive force), one-way boosting to the outer solar system, off-Lagrange-point space weather forecasting, and low-cost impactor probes for added science value to other missions. We also discuss the generic idea of data clippers (returning large volumes of high-resolution scientific data from distant targets packed in memory chips) and possible exploitation of asteroid resources. Possible orbits were estimated by orbit calculations assuming circular and coplanar orbits for planets. Some particular challenge areas requiring further research work and related to some more ambitious mission scenarios are also identified and discussed.
The main purpose of this article is to analyze the potential of E-sail technology in some of the envisaged possible applications for solar system space activities. To a limited extent we also adopt a comparative approach, estimating the added value and other advantages stemming from E-sail technology in comparison with present chemical and electric propulsion systems and (in some cases) with other propellantless propulsion concepts. When making such comparisons, a key quantity that we use for representing the mission cost is the total required velocity change, Δv, also called delta-v. The Sail Propulsion Working Group, a joint working group between the Navigation Guidance and Control Section and the Electric Propulsion Section of the European Space Agency, has envisaged the study of three reference missions which could be successfully carried out using propellantless propulsion concepts.
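The role of delta-v as a cost metric can be illustrated with the Tsiolkovsky rocket equation, which gives the propellant fraction a conventional thruster needs for a given velocity change. The numbers below are generic illustrative values, not figures from the paper; the point is that a propellantless E-sail, drawing thrust from the solar wind, avoids this mass cost entirely:

```python
import math

def propellant_fraction(delta_v, exhaust_velocity):
    """Tsiolkovsky rocket equation, rearranged: the fraction of initial
    mass that must be propellant to achieve a given delta-v (m/s)."""
    return 1 - math.exp(-delta_v / exhaust_velocity)

dv = 5000.0        # m/s, an illustrative mission delta-v
ve_chem = 4500.0   # m/s, a typical chemical-rocket exhaust velocity
ve_ion = 30000.0   # m/s, a typical ion-thruster exhaust velocity

print(f"chemical: {propellant_fraction(dv, ve_chem):.0%} of launch mass is propellant")
print(f"ion:      {propellant_fraction(dv, ve_ion):.0%} of launch mass is propellant")
# For the same delta-v, an E-sail carries essentially no propellant,
# which is why delta-v-expensive missions are attractive targets for it.
```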
Scientists analyzing data from NASA’s Cassini mission have firm evidence the ocean inside Saturn's largest moon, Titan, might be as salty as Earth's Dead Sea.
The new results come from a study of gravity and topography data collected during Cassini's repeated flybys of Titan during the past 10 years. Using the Cassini data, researchers presented a model structure for Titan, resulting in an improved understanding of the structure of the moon's outer ice shell. The findings are published in this week’s edition of the journal Icarus.
"Titan continues to prove itself as an endlessly fascinating world, and with our long-lived Cassini spacecraft, we’re unlocking new mysteries as fast as we solve old ones," said Linda Spilker, Cassini project scientist at NASA's Jet Propulsion Laboratory in Pasadena, California, who was not involved in the study.
Additional findings support previous indications the moon's icy shell is rigid and in the process of freezing solid. Researchers found that a relatively high density was required for Titan's ocean in order to explain the gravity data. This indicates the ocean is probably an extremely salty brine of water mixed with dissolved salts likely composed of sulfur, sodium and potassium. The density indicated for this brine would give the ocean a salt content roughly equal to the saltiest bodies of water on Earth.
"This is an extremely salty ocean by Earth standards," said the paper's lead author, Giuseppe Mitri of the University of Nantes in France. "Knowing this may change the way we view this ocean as a possible abode for present-day life, but conditions might have been very different there in the past."
Cassini data also indicate the thickness of Titan's ice crust varies slightly from place to place. The researchers said this can best be explained if the moon's outer shell is stiff, as would be the case if the ocean were slowly crystallizing and turning to ice. Otherwise, the moon's shape would tend to even itself out over time, like warm candle wax. This freezing process would have important implications for the habitability of Titan's ocean, as it would limit the ability of materials to exchange between the surface and the ocean.
A further consequence of a rigid ice shell, according to the study, is any outgassing of methane into Titan's atmosphere must happen at scattered "hot spots" -- like the hot spot on Earth that gave rise to the Hawaiian Island chain. Titan's methane does not appear to result from convection or plate tectonics recycling its ice shell.
How methane gets into the moon's atmosphere has long been of great interest to researchers, as molecules of this gas are broken apart by sunlight on short geological timescales. Titan's present atmosphere contains about five percent methane. This means some process, thought to be geological in nature, must be replenishing the gas. The study indicates that whatever process is responsible, the restoration of Titan's methane is localized and intermittent.
"Our work suggests looking for signs of methane outgassing will be difficult with Cassini, and may require a future mission that can find localized methane sources," said Jonathan Lunine, a scientist on the Cassini mission at Cornell University, Ithaca, New York, and one of the paper's co-authors. "As on Mars, this is a challenging task."
Interested in an ultra-fast, unbreakable, and flexible smart phone that recharges in a matter of seconds? Monolayer materials may make it possible. These atom-thin sheets—including the famed super material graphene—feature exceptional and untapped mechanical and electronic properties. But to fully exploit these atomically tailored wonder materials, scientists must pry free the secrets of how and why they bend and break under stress.
Fortunately, researchers have now pinpointed the breaking mechanism of several monolayer materials hundreds of times stronger than steel with exotic properties that could revolutionize everything from armor to electronics. A Columbia University team used supercomputers at the U.S. Department of Energy's Brookhaven National Laboratory to simulate and probe quantum mechanical processes that would be extremely difficult to explore experimentally.
They discovered that straining the materials induced a novel phase transition—a restructuring in their near-perfect crystalline structures that leads to instability and failure. Surprisingly, the phenomenon persisted across several different materials with disparate electronic properties, suggesting that monolayers may have intrinsic instabilities to be either overcome or exploited. The results were published in the journal Physical Review B.
"Our calculations exposed these monolayer materials' fundamental shifts in structure and character when stressed," said study coauthor and Columbia University Ph.D. candidate Eric Isaacs. "To see the beautiful patterns exhibited by these materials at their breaking points for the first time was enormously exciting—and important for future applications."
Research carried out at The Institute of Cancer Research (ICR) has revealed the structure of one of the most important and complicated proteins in cell division – a fundamental process that is hijacked in the development of cancer. The findings are published in Nature.
Images of the gigantic protein in unprecedented detail will transform scientists’ understanding of exactly how cells copy their chromosomes and divide, and could reveal binding sites for future cancer drugs.
A team from The Institute of Cancer Research, London, and the Medical Research Council Laboratory of Molecular Biology in Cambridge produced the first detailed images of the anaphase-promoting complex (APC/C).
The APC/C performs a wide range of vital tasks associated with mitosis, the process during which a cell copies its chromosomes and pulls them apart into two separate cells. Mitosis is used in cell division by all animals and plants.
Discovering its structure could ultimately lead to new treatments for cancer, which hijacks the normal process of cell division to make thousands of copies of harmful cancer cells.
In the study, which was funded by Cancer Research UK, the researchers reconstituted human APC/C and used a combination of electron microscopy and imaging software to visualize it at a resolution of less than a nanometer.
The resolution was so fine that it allowed the researchers to see the secondary structure – the set of basic building blocks which combine to form every protein. Alpha-helix rods and folded beta-sheet constructions were clearly visible within the 20 subunits of the APC/C, defining the overall architecture of the complex.
Previous studies led by the same research team had shown a globular structure for APC/C in much lower resolution, but the secondary structure had not previously been mapped. The new study could identify binding sites for potential cancer drugs.
Each of the APC/C’s subunits bond and mesh with other units at different points in the cell cycle, allowing it to control a range of mitotic processes including the initiation of DNA replication, the segregation of chromosomes along protein ‘rails’ called spindles, and the ultimate splitting of one cell into two, called cytokinesis. Disrupting each of these processes could selectively kill cancer cells or prevent them from dividing.