Horses were domesticated 6,000 years ago on the grasslands of Ukraine, southwest Russia and west Kazakhstan, a genetic study shows. Domestic horses then spread across Europe and Asia, breeding with wild mares along the way.
Gastric cancer was the world’s third leading cause of cancer mortality in 2012, responsible for 723,000 deaths [1]. The vast majority of gastric cancers are adenocarcinomas, which can be further subdivided into intestinal and diffuse types according to the Lauren classification [2]. An alternative system, proposed by the World Health Organization, divides gastric cancer into papillary, tubular, mucinous (colloid) and poorly cohesive carcinomas [3]. These classification systems have little clinical utility, making the development of robust classifiers that can guide patient therapy an urgent priority.
Molecular analysis of gastric cancer's biological and clinical characteristics has been complicated by histological and etiological heterogeneity. A team of scientists now describes in Nature a comprehensive molecular evaluation of 295 primary gastric adenocarcinomas as part of The Cancer Genome Atlas (TCGA) project. They propose a molecular classification dividing gastric cancer into four subtypes: (i) tumors positive for Epstein–Barr virus, which display recurrent PIK3CA mutations, extreme DNA hypermethylation, and amplification of JAK2, CD274 (also known as PD-L1) and PDCD1LG2 (also known as PD-L2); (ii) microsatellite unstable tumors, which show elevated mutation rates, including mutations of genes encoding targetable oncogenic signaling proteins; (iii) genomically stable tumors, which are enriched for the diffuse histological variant and mutations of RHOA or fusions involving RHO-family GTPase-activating proteins; and (iv) tumors with chromosomal instability, which show marked aneuploidy and focal amplification of receptor tyrosine kinases. Identification of these subtypes provides a roadmap for patient stratification and trials of targeted therapies.
The majority of gastric cancers are associated with infectious agents, including the bacterium Helicobacter pylori [4] and Epstein–Barr virus (EBV). The distribution of histological subtypes of gastric cancer and the frequencies of H. pylori and EBV-associated gastric cancer vary across the globe [5]. A small minority of gastric cancer cases are associated with germline mutation in E-cadherin (CDH1) [6] or mismatch repair genes [7] (Lynch syndrome), whereas sporadic mismatch repair-deficient gastric cancers have epigenetic silencing of MLH1 in the context of a CpG island methylator phenotype (CIMP) [8]. Molecular profiling of gastric cancer has been performed using gene expression or DNA sequencing [9–12], but has not led to a clear biologic classification scheme. The goals of this study by The Cancer Genome Atlas (TCGA) were to develop a robust molecular classification of gastric cancer and to identify dysregulated pathways and candidate drivers of distinct classes of gastric cancer.
A mystery crater spotted in the frozen Yamal peninsula in Siberia earlier this month was probably caused by methane released as permafrost thawed, researchers in Russia say.
Air near the bottom of the crater contained unusually high concentrations of methane — up to 9.6% — in tests conducted at the site on 16 July, says Andrei Plekhanov, an archaeologist at the Scientific Centre of Arctic Studies in Salekhard, Russia. Plekhanov, who led an expedition to the crater, says that air normally contains just 0.000179% methane.
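For scale, those two readings imply methane at the crater floor was tens of thousands of times more concentrated than in ordinary air. A quick check of the arithmetic, using only the figures quoted above:

```python
# Comparing the methane reading at the crater floor with the normal
# atmospheric concentration, both as quoted in the article.
crater_methane_pct = 9.6        # measured at the site on 16 July
ambient_methane_pct = 0.000179  # typical air, per Plekhanov

enrichment = crater_methane_pct / ambient_methane_pct
print(f"~{enrichment:,.0f} times the normal concentration")
```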
Since the hole was spotted in mid-July by a helicopter pilot, conjecture has abounded about how the 30-metre-wide crater was formed — a gas or missile explosion, a meteorite impact and alien involvement have all been suggested.
But Plekhanov and his team believe that it is linked to the abnormally hot Yamal summers of 2012 and 2013, which were warmer than usual by an average of about 5°C. As temperatures rose, the researchers suggest, permafrost thawed and collapsed, releasing methane that had been trapped in the icy ground.
Other researchers argue that long-term global warming might be to blame — and that a slow and steady thaw in the region could have been enough to free a burst of methane and create such a big crater. Over the past 20 years, permafrost at a depth of 20 metres has warmed by about 2°C, driven by rising air temperatures [1], notes Hans-Wolfgang Hubberten, a geochemist at the Alfred Wegener Institute in Potsdam, Germany.
The torrents of data flowing out of cancer research and treatment are yielding fresh insight into the disease.
In 2013, geneticist Stephen Elledge answered a question that had puzzled cancer researchers for nearly 100 years. In 1914, German biologist Theodor Boveri suggested that the abnormal number of chromosomes — called aneuploidy — seen in cancers might drive the growth of tumors. For most of the next century, researchers made little progress on the matter. They knew that cancers often have extra or missing chromosomes or pieces of chromosomes, but they did not know whether this was important or simply a by-product of tumor growth — and they had no way of finding out.
Elledge found that where aneuploidy had resulted in missing tumor-suppressor genes, or extra copies of the oncogenes that promote cancer, tumors grow more aggressively (T. Davoli et al. Cell 155, 948–962; 2013). His insight — that aneuploidy is not merely an odd feature of tumors, but an engine of their growth — came from mining voluminous amounts of cellular data. And, says Elledge, it shows how the ability of computers to sift through ever-growing troves of information can help us to deepen our understanding of cancer and open the door to discoveries.
Modern cancer care has the potential to generate huge amounts of data. When a patient is diagnosed, the tumor's genome might be sequenced to see if it is likely to respond to a particular drug. The sequencing might be repeated as treatment progresses to detect changes. The patient might have his or her normal tissue sequenced as well, a practice that is likely to grow as costs come down. The doctor will record the patient's test results and medical history, including dietary and smoking habits, in an electronic health record. The patient may also have computed tomography (CT) and magnetic resonance imaging (MRI) scans to determine the stage of the disease. Multiply all that by the nearly 1.7 million people diagnosed with cancer in 2013 in the United States alone and it becomes clear that oncology is going to generate even more data than it does now. Computers can mine the data for patterns that may advance the understanding of cancer biology and suggest targets for therapy.
Elledge's discovery was the result of a computational method that he and his colleagues developed, called the Tumor Suppressor and Oncogene Explorer. They used it to mine large data sets, including the Cancer Genome Atlas, maintained by the US National Cancer Institute, based in Bethesda, Maryland, and the Catalogue of Somatic Mutations in Cancer, run by the Wellcome Trust Sanger Institute in Hinxton, UK. The databases contained roughly 1.2 million mutations from 8,207 tissue samples of more than 20 types of tumor.
Analyzing the genomes of 8,200 tumors is just a start. Researchers are “trying to figure out how we can bring together and analyze, over the next few years, a million genomes”, says Robert Grossman, who directs the Initiative in Data Intensive Science at the University of Chicago in Illinois. This is an immense undertaking; the combined cancer genome and normal genome from a single patient constitutes about 1 terabyte (1012 bytes) of data, so a million genomes would generate an exabyte (1018 bytes). Storing and analyzing this much data could cost US$100 million a year, Grossman says.
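The scaling from one patient to a million is straightforward; a sanity check using only the byte counts quoted above:

```python
# One patient's combined cancer + normal genome is quoted at ~1 terabyte
# (10^12 bytes); a million patients therefore lands at an exabyte (10^18).
TERABYTE = 10**12
patients = 1_000_000

total_bytes = patients * TERABYTE
print(f"{total_bytes:.0e} bytes")  # one exabyte
```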
But it is the new technologies that are creating an information boom. “We can collect data faster than we can physically do anything with them,” says Manish Parashar, a computer scientist and head of the Rutgers Discovery Informatics Institute in Piscataway, New Jersey, who collaborates with Foran to find ways of handling the information. “There are some fundamental challenges being caused by our ability to capture so much data,” he says.
A major problem with data sets at the terabyte-and-beyond level is figuring out how to manipulate all the data. A single high-resolution medical image can take up tens of gigabytes, and a researcher might want the computer to compare tens of thousands of such images. Breaking down just one image in the Rutgers project into sets of pixels that the computer can identify takes about 15 minutes, and moving that much information from where it is stored to where it can be processed is difficult. “Already we have people walking around with disk drives because you can't effectively use the network,” Parashar says.
Informatics researchers are developing algorithms to split data into smaller packets for parallel processing on separate processors, and to compress files without omitting any relevant information. And they are relying on advances in computer science to speed up processing and communications in general.
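The split-and-parallelize idea can be sketched in a few lines. The chunk size and the per-packet work below are placeholders, not details of the Rutgers pipeline or any real informatics system:

```python
# A minimal sketch of breaking a large dataset into packets that
# independent workers can process in parallel, then merging the results.
from concurrent.futures import ProcessPoolExecutor

def split_into_chunks(data, chunk_size):
    """Divide a flat dataset into fixed-size packets."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def process_chunk(chunk):
    """Stand-in for per-packet analysis (here, just a sum)."""
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(100_000))               # toy stand-in for pixel data
    chunks = split_into_chunks(data, 10_000)  # 10 packets
    with ProcessPoolExecutor() as pool:       # one packet per worker
        partials = list(pool.map(process_chunk, chunks))
    print(sum(partials))
```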
Foran emphasizes that the understanding and treatment of cancer has undergone a dramatic shift as oncology has moved from one-size-fits-all attacks on tumors towards personalized medicine. But cancers are complex diseases controlled by many genes and other factors. “It's not as if you're going to solve cancer,” he says. But big data can provide new, better-targeted ways of grappling with the disease. “You're going to come up with probably a whole new set of blueprints for how to treat patients.”
The new nanocarriers are just 15 nanometers in diameter and are based on building blocks called amphiphilic polymers: they have both hydrophilic (water-loving, polar) and lipophilic (fat-loving) properties. That allows the nanocarriers to hold the guest molecules within their water-insoluble interior and use their water-soluble exterior to travel through an aqueous environment. And that makes the nanocarriers ideal for transferring molecules that would otherwise be insoluble in water.
They also emit a fluorescent signal that can be observed with a microscope, allowing for tracking and photographing the nanoparticles in the body.
“The size of these nanoparticles, their dynamic character and the fact that the reactions take place under normal biological conditions (at ambient temperature and neutral environment) makes these nanoparticles an ideal vehicle for the controlled activation of therapeutics, directly inside the cells,” says lead investigator Francisco Raymo, professor of chemistry in the University of Miami College of Arts and Sciences and the UM Laboratory for Molecular Photonics.
The next phase of this investigation involves demonstrating that this method can be used to achieve chemical reactions inside cells, instead of energy transfers.
Several DARPA programs are exploring innovative technologies and approaches that could supplement GPS to provide reliable, highly accurate real-time positioning, navigation and timing (PNT) data for military and civilian uses and deal with possible loss of GPS accuracy from solar storms or jamming, for example.
DARPA Director Arati Prabhakar said DARPA currently has five programs that focus on PNT-related technology.
Adaptable Navigation Systems (ANS) is developing new algorithms and architectures that can create better inertial measurement devices. By using cold-atom interferometry, which measures the relative acceleration and rotation of a cloud of atoms stored within a sensor, extremely accurate inertial measurement devices could operate for long periods without needing external data to determine time and position. ANS also seeks to exploit non-navigational electromagnetic signals — including commercial satellite, radio and television signals and even lightning strikes — to provide additional points of reference for PNT.
Microtechnology for Positioning, Navigation, and Timing (Micro-PNT) leverages extreme miniaturization made possible by DARPA-developed micro-electromechanical systems (MEMS) technology. These include precise chip-scale gyroscopes, clocks, and complete integrated timing and inertial measurement devices. DARPA researchers have fabricated a prototype with three gyroscopes, three accelerometers and a highly accurate master clock on a chip that fits easily on the face of a penny.
Quantum-Assisted Sensing and Readout (QuASAR) intends to make the world’s most accurate atomic clocks — which currently reside in laboratories — both robust and portable. QuASAR researchers have developed optical atomic clocks in laboratories with a timing error of less than 1 second in 5 billion years. Making clocks this accurate and portable could improve upon existing military systems such as GPS, and potentially enable entirely new radar, LIDAR, and metrology applications.
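To put that figure in perspective, losing at most one second over 5 billion years corresponds to a fractional timing error of a few parts in 10^18. Simple arithmetic from the numbers quoted above:

```python
# Converting "1 second in 5 billion years" into a fractional error.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 s
years = 5e9

fractional_error = 1 / (years * SECONDS_PER_YEAR)
print(f"{fractional_error:.1e}")        # ~6.3e-18
```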
The Program in Ultrafast Laser Science and Engineering (PULSE) applies the latest in pulsed laser technology to significantly improve the precision and size of atomic clocks and microwave sources, enabling more accurate time and frequency synchronization over large distances. It could enable global distribution of time precise enough to take advantage of the world’s most accurate optical atomic clocks.
The Spatial, Temporal and Orientation Information in Contested Environments (STOIC) program seeks to develop PNT systems that are independent of GPS: long-range robust reference signals, ultra-stable tactical clocks, and multifunctional systems that provide PNT information among multiple users.
What if computer screens had glasses instead of the people staring at the monitors? That concept is not too far afield from technology being developed by UC Berkeley computer and vision scientists.
The researchers are developing computer algorithms to compensate for an individual’s visual impairment, and creating vision-correcting displays that enable users to see text and images clearly without wearing eyeglasses or contact lenses. The technology could potentially help hundreds of millions of people who currently need corrective lenses to use their smartphones, tablets and computers. One common problem, for example, is presbyopia, a type of farsightedness in which the ability to focus on nearby objects is gradually diminished as the aging eyes’ lenses lose elasticity.
More importantly, the displays could one day aid people with more complex visual problems, known as high order aberrations, which cannot be corrected by eyeglasses, said Brian Barsky, UC Berkeley professor of computer science and vision science, and affiliate professor of optometry.
“We now live in a world where displays are ubiquitous, and being able to interact with displays is taken for granted,” said Barsky, who is leading this project. “People with higher order aberrations often have irregularities in the shape of the cornea, and this irregular shape makes it very difficult to have a contact lens that will fit. In some cases, this can be a barrier to holding certain jobs because many workers need to look at a screen as part of their work. This research could transform their lives, and I am passionate about that potential.”
“The significance of this project is that, instead of relying on optics to correct your vision, we use computation,” said lead author Fu-Chung Huang, who worked on this project as part of his computer science Ph.D. dissertation at UC Berkeley under the supervision of Barsky and Austin Roorda, professor of vision science and optometry. “This is a very different class of correction, and it is non-intrusive.”
The algorithm, which was developed at UC Berkeley, works by adjusting the intensity of each direction of light that emanates from a single pixel in an image based upon a user’s specific visual impairment. In a process called deconvolution, the light passes through the pinhole array in such a way that the user will perceive a sharp image.
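Deconvolution in general inverts a known blur in the frequency domain. The toy 1-D example below is a standard Wiener filter, not the Berkeley group's algorithm, but it shows the basic idea of compensating a signal for a known distortion:

```python
# A generic Wiener deconvolution sketch: given a signal blurred by a known
# kernel, invert the blur in the frequency domain. The display described
# above applies the same principle to the light leaving each pixel.
import numpy as np

def wiener_deconvolve(blurred, kernel, noise_power=1e-6):
    """Recover a signal from a known circular blur via Wiener filtering."""
    n = len(blurred)
    H = np.fft.fft(kernel, n)                        # blur transfer function
    G = np.conj(H) / (np.abs(H) ** 2 + noise_power)  # regularized inverse
    return np.real(np.fft.ifft(np.fft.fft(blurred) * G))

# Blur a sharp test pattern with a 5-sample box kernel, then restore it.
sharp = np.zeros(64)
sharp[20:24] = 1.0
kernel = np.ones(5) / 5.0
blurred = np.real(np.fft.ifft(np.fft.fft(sharp) * np.fft.fft(kernel, 64)))
restored = wiener_deconvolve(blurred, kernel)
print(float(np.max(np.abs(restored - sharp))))       # small residual error
```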
A common acrylic polymer used in biomedical applications and as a substitute for glass has been given the ability to completely self-heal underwater by US researchers. The method, which takes inspiration from the self-healing abilities of adhesive proteins secreted by mussels, could allow for longer-lasting biomedical implants. A temporary hydrogen-bonding network stitches the damage as the material fuses together.
'Polymer self-healing research is about 10 years old now and many different strategies have been developed,' says Herbert Waite, who conducted the work with colleagues at the University of California, Santa Barbara. 'None, however, address the need for healing in a wet medium – a critical omission as all biomaterials function, and fail, in wet environments.'
The idea of mimicking the biological self-healing ability of mussel adhesive proteins is not new, and previous attempts have involved polymer networks functionalised with catechols – synthetic water-soluble organic molecules that mimic mussel adhesive proteins – and metal-ion mediated bonding. However, how mussel adhesive proteins self-heal remains poorly understood, which has limited attempts to synthesise catechols that accurately mimic biological self-healing underwater.
Now, Waite and colleagues have discovered a new aspect of catechols after they were simply 'goofing around' in the lab and found a new way to modify the surface of poly(methyl methacrylate), or PMMA, with catechols. This led them to explore the material's properties and discover that hydrogen bonding enables the polymer to self-heal underwater after being damaged. 'Usually, catechols in wet adhesives are associated with covalent or coordination mediated cross-linking. Our results argue that hydrogen bonding can also be critical, especially as an initiator of healing,' he says.
The healing process begins because catechols provide multidentate hydrogen-bonding faces that trigger a network of hydrogen bonds to fix any damage – the interaction is strong enough to resist interference by water but reversible. Acting a bit like dissolvable stitches, hydrogen bonding between the catechols appears to stitch the damaged area, which allows the underlying polymer to fuse back together. After about 20 minutes, the hydrogen bonded catechols mysteriously disappear leaving the original site of damage completely healed. 'We don't know where the hydrogen bonded catechols go,’ Waite says. ‘Possibly back to the surface, dispersed within the bulk polymer, or some other possibility.'
Phillip Messersmith, a biomaterials expert at the University of California, Berkeley, US, says that this is ‘really creative work’. '[This] reveals a new dimension of catechols, which in this case mediate interfacial self-healing through the formation of hydrogen bonds between surfaces, and which are ultimately augmented or replaced by other types of adhesive interactions.'
Does the Milky Way look fat in this picture? Has Andromeda been taking skinny selfies? Using a new, more accurate method for measuring the mass of galaxies, an international group of researchers has shown that the Milky Way has only half the mass of the Andromeda Galaxy.
In previous studies, researchers were only able to estimate the mass of the Milky Way and Andromeda based on observations made using their smaller satellite dwarf galaxies. In the new study, researchers culled previously published data that contained information about the distances between the Milky Way, Andromeda and other close-by galaxies -- including those that weren't satellites -- that reside in and right outside an area referred to as the Local Group.
Galaxies in the Local Group are bound together by their collective gravity. As a result, while most galaxies, including those on the outskirts of the Local Group, are moving farther apart due to expansion, the galaxies in the Local Group are moving closer together because of gravity. For the first time, researchers were able to combine the available information about gravity and expansion to complete precise calculations of the masses of both the Milky Way and Andromeda.
"Historically, estimations of the Milky Way's mass have been all over the map," said Walker, an assistant professor of physics at Carnegie Mellon. "By studying two massive galaxies that are close to each other and the galaxies that surround them, we can take what we know about gravity and pair that with what we know about expansion to get an accurate account of the mass contained in each galaxy. This is the first time we've been able to measure these two things simultaneously."
By studying both the galaxies in and immediately outside the Local Group, Walker was able to pinpoint the group's center. The researchers then calculated the mass of both the ordinary, visible matter and the invisible dark matter throughout both galaxies based on each galaxy's present location within the Local Group. Andromeda had twice as much mass as the Milky Way, and in both galaxies 90 percent of the mass was made up of dark matter.
The study was supported by the UK's Science and Technology Facilities Council and led by Jorge Peñarrubia of the University of Edinburgh's School of Physics and Astronomy. Co-authors include Yin-Zhe Ma of the University of British Columbia and Alan McConnachie of the NRC Herzberg Institute of Astrophysics.
Carnegie Mellon University. "Weighing the Milky Way: Researchers devise precise method for calculating the mass of galaxies."
Invertebrate numbers have decreased by 45% on average over a 35-year period in which the human population doubled, reports a study on the human impact on declining animal numbers.
This decline matters because of the enormous benefits invertebrates such as insects, spiders, crustaceans, slugs and worms bring to our day-to-day lives, including pollination and pest control for crops, decomposition for nutrient cycling, water filtration and human health.
The study, published in Science and led by UCL, Stanford and UCSB, focused on the demise of invertebrates in particular, as large vertebrates have already been extensively studied. The researchers found similar widespread changes in both groups, with the ongoing decline in invertebrates surprising scientists, as these animals had previously been viewed as nature’s survivors.
The decrease in invertebrate numbers is due to two main factors – habitat loss and climate disruption on a global scale. In the UK alone, scientists noted the areas inhabited by common insects such as beetles, butterflies, bees and wasps saw a 30-60% decline over the last 40 years.
Scientists believe there is a growing understanding of how ecosystems are changing, but that tackling these issues requires better predictions of the impact of changes, together with effective policies to reverse the losses currently seen. Using this approach, conservation of species can be prioritized with the benefit of protecting processes that serve human needs, and successful campaigns scaled up to effect a positive change globally.
The phenomenon is named after the curious feline in Alice in Wonderland, who vanishes leaving only its grin. Researchers took a beam of neutrons and separated them from their magnetic moment, like passengers and their baggage at airport security. They describe their feat in Nature Communications.
The same separation trick could in principle be performed with any property of any quantum object, say researchers from Vienna University of Technology. Their technique could have a useful application in metrology - helping to filter out disturbances during high-precision measurements of quantum systems.
The idea of a "quantum Cheshire Cat" was first proposed in 2010 by Dr Jeff Tollaksen from Chapman University, a co-author on this latest paper. In the world familiar to us, an object and its properties are always bound together. A rotating ball, for instance, cannot become separated from its spin.
Quantum theory, however, predicts that a particle (such as a photon or neutron) can become physically separated from one of its properties - such as its polarisation or its magnetic moment. In this experiment, the cat (the neutron) goes via the upper beam path, while its grin (the magnetic moment) goes via the lower.
"We find the cat in one place, and its grin in another," as the researchers once put it. The feline analogy is a nod to Schrodinger's Cat - the infamous thought experiment in which a cat in a box is both alive and dead simultaneously - illustrating a quantum phenomenon known as superposition.
To prove that the Cheshire Cat is not just a cute theory, the researchers used an experimental set-up known as an interferometer, at the Institute Laue-Langevin (ILL) in Grenoble, France.
A neutron beam was passed through a silicon crystal, sending it down two different paths.
By applying filters and a technique known as "post-selection", they were able to detect the physical separation of the neutrons from their magnetic moment - as measured by the direction of their spin.
"The system behaves as if the neutrons go through one beam path, while their magnetic moment travels along the other," the researchers reported.
Highly purified crystals that split light with uncanny precision are key parts of high-powered lenses, specialized optics and, potentially, computers that manipulate light instead of electricity. But producing these crystals by current techniques, such as etching them with a precise beam of electrons, is often extremely difficult and expensive.
Now, researchers at Princeton and Columbia universities have proposed a new method that could allow scientists to customize and grow these specialized materials, known as photonic crystals, with relative ease.
"Our results point to a previously unexplored path for making defect-free crystals using inexpensive ingredients," said Athanassios Panagiotopoulos, the Susan Dod Brown Professor of Chemical and Biological Engineering and one of the paper's authors. "Current methods for making such systems rely on using difficult-to-synthesize particles with narrowly tailored directional interactions."
Scientists using mission data from NASA's Cassini spacecraft have identified 101 distinct geysers erupting on Saturn's icy moon Enceladus.
This graphic shows a 3-D model of 98 geysers whose source locations and tilts were found in a Cassini imaging survey of Enceladus' south polar terrain by the method of triangulation. While some jets are strongly tilted, it is clear the jets on average lie in four distinct "planes" that are normal to the surface at their source location.
Dotted vectors indicate five jets whose sources were determined from images acquired too closely in time to determine tilts accurately. Consequently their 3-D configuration has a large uncertainty associated with it. Two geysers, indicated by crosses in PIA17188, have no tilt determinations at all and are not shown here.
A movie showing a 360-degree view of this model is also presented here. The still graphic and the movie illustrate some of the findings reported in a paper by Porco, DiNino, and Nimmo, published in the online version of the Astronomical Journal in July 2014: http://dx.doi.org/10.1088/0004-6256/148/3/46.
Post-equinox images like this, clearly showing the different projected locations of the intersection between the shadow and the curtain of jets from each fracture, were useful for scientists in checking the triangulated positions of the geysers, as described in a paper by Porco, DiNino, and Nimmo, and published in the online version of the Astronomical Journal in July 2014: http://dx.doi.org/10.1088/0004-6256/148/3/45.
A companion paper, by Nimmo et al., is available at: http://dx.doi.org/10.1088/0004-6256/148/3/46.
The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colorado.
The world's most successful data transfer protocol could underlie the next generation chat client: Bleep will provide totally secure, totally peer-to-peer chatting from BitTorrent.
BitTorrent, creator of the protocol that now handles more than a third of all internet traffic, has a new product called Bleep on the horizon. It aims to bring the distributed, anonymous technology that made BitTorrent so successful to the oldest activity in the history of the internet: chat. Using the same BitTorrent connection logic that has allowed you to pirate TV shows and movies for over a decade, Bleep will facilitate encrypted connections directly between peers, meaning that no outside observer ever gets its hands on your words. To everyone but the intended recipient, your words are effectively “bleeped.” This could be big news for whistleblowers who are trying to keep their identity secret, for businesses that want to ensure the confidentiality of their communications, or just for normal people who want to escape the ever-watchful eye of the NSA.
Encrypted chat programs like Bleep, or even long-standing encrypted email schemes, are generally pretty difficult to use. If you wanted to send me a totally secure email, you’d need to visit my Twitter account for a PGP key (generously hosted at an external MIT key-server), use that to add me as an encrypted messaging buddy, then use specialized email software to send/receive messages. Most of the difficulty in sending secure messages comes from the fact that those emails must pass (unreadable) through a number of third parties — but BitTorrent’s whole addition to the tech sphere was its circumvention of unnecessary servers to allow direct peer-to-peer (p2p) communication.
As soon as you bring third parties into the system (i.e. remote servers), you bring trust into the equation, and exponentially increase the ways your communications could be attacked. If your conversations are all flowing through some Google, Microsoft, or Apple server somewhere, then it doesn’t really matter how well you protect things on your end; if the NSA/Snowden leaks have taught us anything, it’s that third parties can readily reveal your communications. Even encrypted chat services like ChatCrypt don’t fully get around this problem — though to be fair, they do a pretty good job.
With Bleep, the creators use something called a Distributed Hash Table (DHT) to basically associate public encryption keys (you’ll still need those) with IP addresses. Using this information (encryption+online location) the BitTorrent protocol can establish a direct link between two users with no intermediaries. BitTorrent says there will be absolutely no record of the IP-lookup (this would be a piece of metadata), as each user finds the other through the network’s many distributed nodes rather than a central lookup server. This system might work via a one-time lookup per user, or require a DHT-check to establish a connection at the beginning of every conversation; the documentation is still quite vague.
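A distributed hash table can be pictured as a key-to-IP directory sharded across many peers. The sketch below is purely illustrative — the node names, sharding rule and record format are assumptions, since Bleep's protocol details remain vague:

```python
# A toy DHT: a user's public key is hashed to decide which node stores the
# (public key -> IP) record, so no single server holds the whole directory.
import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]  # stand-ins for DHT peers
shards = {node: {} for node in NODES}             # each node's slice of the table

def responsible_node(public_key):
    """Hash the key to pick the node that stores its record."""
    digest = hashlib.sha256(public_key.encode()).digest()
    return NODES[int.from_bytes(digest[:4], "big") % len(NODES)]

def publish(public_key, ip):
    shards[responsible_node(public_key)][public_key] = ip

def lookup(public_key):
    return shards[responsible_node(public_key)].get(public_key)

publish("alice-pubkey", "203.0.113.5")
print(lookup("alice-pubkey"))  # the IP needed to open a direct p2p link
```

In a real DHT, the shards live on separate machines and lookups hop between peers, but the mapping idea is the same.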
BitTorrent, as an idea, is sort of the apex predator of modern data-giants like Google. Encryption and p2p tech have not hurt Google much because, frankly, they have always been too clunky to catch on for any large proportion of users’ time. On the other hand, Bleep will offer features like importing your Google contacts list for easy setup; people will still need to generate an encryption key-pair, but as it becomes more practically feasible to pass around more and more types of data without any assistance, it will ruffle more and more feathers. Don’t be surprised if those pro-tech super-PACs everyone’s so excited about end up opposing anti-cloud efforts like this one, especially when BitTorrent takes its thinking to the logical conclusion and releases a competitor to the onion routing protocol – i.e. Tor – which would allow a fully p2p web browser.
You can sign up for an early-access list for Bleep, but there’s no telling how long you might wait, and you’ll need to convince some friends to be as up in arms about privacy as you are. Though it’s still in a closed alpha phase, Bleep is built on the framework of the old BitTorrent Chat experiment, so it has already had extensive testing in its basic functionality. For those who want it, Bleep could be a near-perfect solution — but relying on supposedly impregnable software has burned many people in the past. We’ll see how well Bleep can measure up to the unforgiving storm of cyber-attacks that come to bear on virtually every “secure” software ever made.
A team of researchers has created a new way of manufacturing microstructured surfaces that have novel three-dimensional textures. These surfaces, made by self-assembly of carbon nanotubes, could exhibit a variety of useful properties — including controllable mechanical stiffness and strength, or the ability to repel water in a certain direction.
“We have demonstrated that mechanical forces can be used to direct nanostructures to form complex three-dimensional microstructures, and that we can independently control … the mechanical properties of the microstructures,” says A. John Hart, the Mitsui Career Development Associate Professor of Mechanical Engineering at MIT and senior author of a paper describing the new technique in the journal Nature Communications.
The technique works by inducing carbon nanotubes to bend as they grow. The mechanism is analogous to the bending of a bimetallic strip, used as the control in old thermostats, as it warms: One material expands faster than another bonded to it. But in this new process, the material bends as it is produced by a chemical reaction.
The process begins by printing two patterns onto a substrate: One is a catalyst of carbon nanotubes; the second material modifies the growth rate of the nanotubes. By offsetting the two patterns, the researchers showed that the nanotubes bend into predictable shapes as they extend.
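The bimetallic-strip analogy can be made roughly quantitative with a toy model: if two adjacent nanotube regions a distance apart grow at different rates, the growth mismatch bends the pair toward the slower side, like the strip curling as it heats. The formula and numbers below are a back-of-the-envelope simplification for illustration, not the authors' actual growth model.

```python
def bend_radius(rate_fast, rate_slow, spacing):
    """Toy bimetallic-strip estimate: two coupled columns growing at
    different rates curl with radius R = spacing / mismatch strain,
    where mismatch strain = (fast - slow) / mean rate.
    A bigger rate difference or tighter spacing means a tighter curl."""
    mismatch = (rate_fast - rate_slow) / ((rate_fast + rate_slow) / 2)
    return spacing / mismatch

# Illustrative numbers only: growth rates in um/min, spacing in um.
R = bend_radius(rate_fast=2.0, rate_slow=1.0, spacing=10.0)
print(f"radius of curvature ~ {R:.0f} um")  # -> radius of curvature ~ 15 um
```

In this picture, the printed two-dimensional offset between the catalyst and the growth-modifying pattern sets the mismatch, which is how a flat pattern encodes a three-dimensional shape.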
“We can specify these simple two-dimensional instructions, and cause the nanotubes to form complex shapes in three dimensions,” says Hart. Where nanotubes growing at different rates are adjacent, “they push and pull on each other,” producing more complex forms, Hart explains. “It’s a new principle of using mechanics to control the growth of a nanostructured material,” he says.
Few high-throughput manufacturing processes can achieve such flexibility in creating three-dimensional structures, Hart says. This technique, he adds, is attractive because it can be used to create large expanses of the structures simultaneously; the shape of each structure can be specified by designing the starting pattern. Hart says the technique could also enable control of other properties, such as electrical and thermal conductivity and chemical reactivity, by attaching various coatings to the carbon nanotubes after they grow.
“If you coat the structures after the growth process, you can exquisitely modify their properties,” says Hart. For example, coating the nanotubes with ceramic, using a method called atomic layer deposition, allows the mechanical properties of the structures to be controlled. “When a thick coating is deposited, we have a surface with exceptional stiffness, strength, and toughness relative to [its] density,” Hart explains. “When a thin coating is deposited, the structures are very flexible and resilient.”
This approach may also enable “high-fidelity replication of the intricate structures found on the skins of certain plants and animals,” Hart says, and could make it possible to mass-produce surfaces with specialized characteristics, such as the water-repellent and adhesive ability of some insects. “We’re interested in controlling these fundamental properties using scalable manufacturing techniques,” Hart says.
Although the virus is exacting a heavy toll in West Africa, it does not spread easily.
Deadly Ebola probably touched down in Lagos, Nigeria, the largest city in Africa, on 20 July. A man who was thought to be infected with the virus had arrived there on a flight from Liberia, where, along with Guinea and Sierra Leone, the largest recorded Ebola outbreak is currently raging. The Lagos case is the first to be internationally exported by air travel and today the UK foreign secretary announced that he would chair a government meeting on Ebola. As long as the virus continues to infect people in Liberia, Guinea and Sierra Leone, there is a small risk of more long-distance exports of the disease. But Ebola does not pose a global threat.
The World Health Organization still considers the Lagos case a “probable” infection because it has not yet confirmed that the 40-year-old Liberian man had Ebola. He was quarantined upon arrival at the airport and taken to hospital, where he died on 25 July. Assuming he had Ebola, if proper control measures were taken at the airport and at the hospital, the risk that health-care workers or others will become infected as a result of contact with him is low. The European Centre for Disease Prevention and Control classifies people sharing public transport with someone infected as having a “very low” risk of catching the virus. Healthcare workers and doctors, several of whom have now been infected and died as a result of caring for people in the current outbreak, are at much higher risk and the WHO advises that they take strict precautions, which greatly lowers the risk.
The ECDC also says the probability of an infected person getting on a flight in the first place is low, given the small overall number of Ebola cases. Moreover, functional health systems should be able to prevent onward spread from any exported cases. Overall, the World Health Organization estimates that there is a high risk of spread to countries bordering those with existing outbreaks, a moderate risk to countries further afield in the sub-region, but that there is little chance of spread overseas. There is no reason to assume that an exported case — be it to Lagos, a city of 17 million people, or any other place — will spark new outbreaks, because Ebola is not highly contagious.
Schematic of micro- and nanopropellers in hyaluronan gels. The polymeric mesh structure blocks the larger micropropellers (top left), but the smaller nanopropellers can pass through.
Israeli and German researchers have created a nanoscale screw-shaped propeller that can move in a gel-like fluid, mimicking the environment inside a living organism, as described in a paper published in the June 2014 issue of ACS Nano.
The filament that makes up the propeller, made of silica and nickel, is only 70 nanometers in diameter; the entire propeller is 400 nanometers long, small enough that its motion can be affected by the Brownian motion of nearby molecules.
To test whether the propellers could move through living organisms, the researchers used hyaluronan, a material that occurs throughout the human body, including in the synovial fluid of joints and the vitreous humor of the eye.
The hyaluronan gel contains a mesh of long polymer chains; the mesh is dense enough to prevent micron-sized (millionths of a meter) propellers from moving much at all, but its openings are large enough for nanometer-sized objects to pass through. The scientists were able to control the motion of the propellers using a relatively weak rotating magnetic field.
“One can now think about targeted applications, for instance, in the eye, where they may be moved to a precise location at the retina,” says Peer Fischer, a member of the research team and head of the Micro, Nano, and Molecular Systems Lab at the Max Planck Institute for Intelligent Systems.
Scientists could also attach “active molecules” to the tips of the propellers, or use the propellers to deliver tiny targeted doses of radiation.
A 53-year-old Swede can take credit for 2.7 million articles on Wikipedia, but some "purists" complain about his method.
Sverker Johansson could be the most prolific author you've never heard of. Volunteering his time over the past seven years publishing to Wikipedia, the 53-year-old Swede can take credit for 2.7 million articles, or 8.5% of the entire collection, according to Wikimedia analytics, which measures the site's traffic. His stats far outpace any other user, the group says.
He has been particularly prolific cataloging obscure animal species, including butterflies and beetles, and is proud of his work highlighting towns in the Philippines. About one-third of his entries are uploaded to the Swedish language version of Wikipedia, and the rest are composed in two versions of Filipino, one of which is his wife's native tongue. An administrator holding degrees in linguistics, civil engineering, economics and particle physics, he says he has long been interested in "the origin of things, oh, everything."
It isn't uncommon, however, for Wikipedia purists to complain about his method. That is because the bulk of his entries have been created by a computer software program—known as a bot. Critics say bots crowd out the creativity only humans can generate.
Mr. Johansson's program scours databases and other digital sources for information, and then packages it into an article. On a good day, he says, his "Lsjbot" creates up to 10,000 new entries.
On Wikipedia, any registered user can create an entry. Mr. Johansson has to find a reliable database, create a template for a given subject and then launch his bot from his computer. The software program searches for information, then publishes it to Wikipedia.
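The database-plus-template pipeline described above can be sketched in a few lines. The record fields, wording, and sample data below are invented to show the mechanics; Lsjbot's actual templates and sources are more elaborate.

```python
# Toy bot: turn structured database records into templated stub articles.
# The record fields and the sentence template are invented for illustration,
# not Lsjbot's actual format.
species_db = [
    {"name": "Examplea minuta", "group": "beetle", "family": "Exampleidae",
     "described_by": "A. Hypothetical", "year": 1901},
    {"name": "Examplea magna", "group": "beetle", "family": "Exampleidae",
     "described_by": "A. Hypothetical", "year": 1903},
]

TEMPLATE = ("{name} is a species of {group} in the family {family}, "
            "first described by {described_by} in {year}.")

def make_stub(record):
    """One database record in, one bare-bones article out."""
    return TEMPLATE.format(**record)

for rec in species_db:
    print(make_stub(rec))
```

The controversy in the article follows directly from this design: the bot can emit thousands of such stubs per day, but every one of them says only what the template and the source database allow it to say.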
Bots have long been used to author and edit entries on Wikipedia, and an increasingly large share of the site's new content is now written by bots. Their use is regulated by Wikipedia users called the "Bot Approvals Group."
While Mr. Johansson works to achieve consensus approval for his project, he and his bot-loving peers expect to continue facing resistance. "There is a vocal minority who don't like it," he said during a recent speech on his work. Still, he soldiers on.
"I'm doing this to create absolute democracy online," Mr. Johansson said recently while sitting in front of a computer at his office at Sweden's Dalarna University.
Wikipedia, he reckons, should someday be able to tell people everything about everything. His bot, which took him months' worth of programming to create, is a step toward achieving that goal sooner rather than later—even if the entries it creates are bare-bones "stubs" containing basic information.
Achim Raschka is one of the people who would like Mr. Johansson to change course. The 41-year-old German Wikipedia enthusiast can spend days writing an in-depth article about a single type of plant.
"I am against production of bot-generated stubs in general," he said. He is particularly irked by Mr. Johansson's Lsjbot, which prizes quantity over quality and is "not helping the readers and users of Wikipedia."
Why is mercury a liquid at room temperature? If you ask that question in a school classroom you will probably be told that relativity affects the orbitals of heavy metals, contracting them and changing how they bond. However, the first evidence that this explanation is correct has only just been published.
In the 1960s, Pekka Pyykkö, now at University of Helsinki, Finland, discovered that gold’s colour was the result of relativistic effects. He showed that the lower energy levels of the 6s orbital of gold means that the energy required to excite an electron from the 5d band lies in the visible rather than UV range of light. This means that gold absorbs blue light, while reflecting yellow and red light, and it is this that gives the metal its characteristic hue. If the energies of the two bands were calculated without including relativistic effects, the energy required is much greater. Further calculations have subsequently shown the influence of relativity on the colour and bond lengths of heavy metal compounds, as well as its importance in catalysis. However, the low melting point of mercury could still only be described as ‘probably’ due to relativistic effects.
An international team led by Peter Schwerdtfeger of Massey University Auckland in New Zealand used quantum mechanics to make calculations of the heat capacity of the metal either including or excluding relativistic effects. They showed that if they ignored relativity when making their calculations, the predicted melting point of mercury was 82°C. But if they included relativistic effects their answer closely matched the experimental value of -39°C.
Relativity states that objects get heavier the faster they move. In atoms, the velocity of the innermost electrons is related to the nuclear charge. The greater the nuclear charge, the stronger the electrostatic attraction and the faster the electrons have to move to avoid falling into the nucleus. So, as you go down the periodic table these 1s electrons get faster and faster, and therefore heavier, causing the radius of the atom to shrink. This stabilises some orbitals, which also have a relativistic nature of their own, while destabilising others. This interplay means that for heavy elements like mercury and gold, the outer electrons are stabilised. In mercury’s case, instead of forming bonds between neighbouring mercury atoms, the electrons stay associated with their own nuclei, and weaker interatomic forces such as van der Waals bonds hold the atoms together.
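The size of the effect can be estimated with a standard back-of-the-envelope calculation: a 1s electron moves at roughly Z times the fine-structure constant (about 1/137) times the speed of light, and its relativistic mass grows by the Lorentz factor. The script below applies this textbook estimate to mercury (Z = 80); it is an order-of-magnitude illustration, not the paper's quantum mechanical calculation.

```python
import math

ALPHA = 1 / 137.036  # fine-structure constant

def lorentz_gamma(Z):
    """Textbook estimate: a 1s electron moves at v/c ~ Z * alpha, so its
    mass grows by gamma = 1/sqrt(1 - (v/c)^2); the 1s orbital radius
    contracts by roughly the same factor."""
    beta = Z * ALPHA  # v/c for the innermost electron
    return 1 / math.sqrt(1 - beta**2)

g = lorentz_gamma(80)  # mercury
print(f"1s electron of Hg: v/c ~ {80 * ALPHA:.2f}, gamma ~ {g:.2f}")
# v/c ~ 0.58, gamma ~ 1.23: a ~23% mass increase, hence a noticeably
# contracted orbital, versus a negligible effect for light elements.
```

For hydrogen (Z = 1) the same formula gives a gamma within a few parts in a hundred thousand of 1, which is why relativity can safely be ignored for light atoms but not for mercury or gold.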
The microbes living in the guts of males and females react differently to diet, even when the diets are identical, according to a study by scientists from The University of Texas at Austin and six other institutions published this week in the journal Nature Communications. These results suggest that therapies designed to improve human health and treat diseases through nutrition might need to be tailored for each sex.
The researchers studied the gut microbes in two species of fish and in mice, and also conducted an in-depth analysis of data that other researchers collected on humans. They found that in fish and humans diet affected the microbiota of males and females differently. In some cases, different species of microbes would dominate, while in others, the diversity of bacteria would be higher in one sex than the other.
These results suggest that any therapies designed to improve human health through diet should take into account whether the patient is male or female.
Only in recent years has science begun to fully appreciate the importance of the human microbiome, which consists of all the bacteria that live in and on people’s bodies. There are hundreds or even thousands of species of microbes in the human digestive system alone, each varying in abundance.
Genetics and diet can affect the variety and number of these microbes in the human gut, which can in turn have a profound influence on human health. Obesity, diabetes, and inflammatory bowel disease have all been linked to low diversity of bacteria in the human gut.
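"Diversity" in such studies is usually quantified with a measure like the Shannon index, the negative sum of p·ln(p) over the proportions of each species in a sample. The abundance counts below are invented to show the computation; this is a standard ecological measure, not this particular study's pipeline.

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p * ln p) over species proportions.
    Higher H means a more diverse microbial community."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Invented abundance counts (e.g. sequencing reads per microbial species)
# for two hypothetical gut samples.
even_gut = [25, 25, 25, 25]   # four species, evenly represented
skewed_gut = [85, 5, 5, 5]    # one species dominates the community

print(f"even:   H = {shannon_diversity(even_gut):.2f}")    # ln(4) ~ 1.39
print(f"skewed: H = {shannon_diversity(skewed_gut):.2f}")  # ~ 0.59, lower
```

A finding like "diversity is higher in one sex than the other" amounts to a systematic difference in an index like this between the male and female samples on the same diet.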
One concept for treating such diseases is to manipulate the microbes within a person’s gut through diet. The idea is gaining in popularity because dietary changes would make for a relatively cheap and simple treatment.
Much remains to be learned about which microbial species, or combinations of species, are best for human health. To accomplish this, research has to illuminate how these microbes react to various combinations of diet, genetics and environment. Unfortunately, to date most such studies examine only one factor at a time and do not take into account how these variables interact.
“Our study asks not just how diet influences the microbiome, but it splits the hosts into males and females and asks, do males show the same diet effects as females?” said Daniel Bolnick, professor in The University of Texas at Austin's College of Natural Sciences and lead author of the study.
The Silk Leaf project uses chloroplasts from real plants suspended in silk proteins to create a hardy vehicle for photosynthesis.
The leaves, created by Royal College of Art student Julian Melchiorri, absorb water and carbon dioxide just like real plants but are made from tough silk proteins that could let them survive space voyages.
Melchiorri explains: "NASA is researching different ways to produce oxygen for long-distance space journeys to let us live in space. This material could allow us to explore space much further than we can now."
The Silk Leaf project was engineered in collaboration with Tufts University silk lab, which helped Melchiorri extract chloroplasts from real leaves and suspend them in a silk matrix.
"The material is extracted directly from the fibres of silk," explains Melchiorri. "This material has an amazing property of stabilising molecules. I extracted chloroplasts from plant cells and placed them inside this silk protein. As an outcome I have the first photosynthetic material that is living and breathing as a leaf does."
Chloroplasts are the parts of plant cells that conduct photosynthesis, using the energy of the sun to turn carbon dioxide and water into glucose and oxygen.
Melchiorri’s creations are currently more conceptual than practical (for one, the efficiency of the photosynthesis process hasn’t been tested), but he hopes they could be used in all manner of futuristic architectural projects, perhaps even deploying giant leaves as air filters, hanging them on the exterior of buildings to absorb CO2 and channel fresh air inside.
Astronomers have long known that interstellar molecules containing carbon atoms exist and that by their nature they will absorb light shining on them from stars and other luminous bodies. Because of this, a number of scientists have previously proposed that some type of interstellar molecules are the source of diffuse interstellar bands -- the hundreds of dark absorption lines seen in color spectrograms taken from Earth. In showing nothing, these dark bands reveal everything. The missing colors correspond to photons of given wavelengths that were absorbed as they travelled through the vast reaches of space before reaching us. More than that, if these photons were filtered by falling on space-based molecules, the wavelengths reveal the exact energies it took to excite the electronic structures of those absorbing molecules in a defined way.
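The wavelength-to-energy correspondence the passage relies on is just E = hc/λ: each missing color in the spectrogram pins down the exact transition energy of whatever absorbed it. The snippet below does the conversion for the 443.0 nm feature, the strongest of the known diffuse interstellar bands.

```python
# E = h*c / lambda: the energy a molecule must absorb to remove a photon
# of a given wavelength from starlight.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy in eV of a photon with the given wavelength in nanometers."""
    return H * C / (wavelength_nm * 1e-9) / EV

# 443.0 nm is the strongest known diffuse interstellar band.
print(f"443.0 nm band -> {photon_energy_ev(443.0):.2f} eV transition")
```

So a laboratory candidate molecule is ruled in or out by whether one of its electronic transitions sits at exactly this energy, which is the matching game described in the following paragraphs.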
Over the vast, empty reaches of interstellar space, these countless small molecules tumble quietly through the cold vacuum. The interstellar medium is the matter that exists in the space between the star systems in a galaxy. This matter includes gas in ionic, atomic, and molecular form, dust, and cosmic rays. Forged in the fusion furnaces of ancient stars and ejected into space when those stars exploded, these lonely molecules account for a significant amount of all the carbon, hydrogen, silicon and other atoms in the universe. In fact, some 20 percent of all the carbon in the universe is thought to exist as some form of interstellar molecule.
Many astronomers hypothesize that these interstellar molecules are also responsible for a phenomenon observed from Earth known as the "diffuse interstellar bands": spectrographic proof that something out there in the universe is absorbing certain distinct colors of light from stars before it reaches the Earth. But since we don't know the exact chemical composition and atomic arrangements of these mysterious molecules, it remains unproven whether they are, in fact, responsible for the diffuse interstellar bands.
Now in a paper appearing this week in The Journal of Chemical Physics, a group of scientists led by researchers at the Harvard-Smithsonian Center for Astrophysics has offered a tantalizing new possibility: these mysterious molecules may be silicon-capped hydrocarbons like SiC3H, SiC4H and SiC5H, and the group presents data and theoretical arguments to back that hypothesis. At the same time, the researchers caution that history has shown that while many possibilities have been proposed as the source of the diffuse interstellar bands, none has been proven definitively.
Armed with the wavelengths of the bands, scientists here on Earth should be able to use spectroscopy to identify the interstellar molecules responsible, by demonstrating which molecules in the laboratory have the same absorptive "fingerprints." But despite decades of effort, the identity of the molecules that account for the diffuse interstellar bands remains a mystery. Nobody has been able to reproduce the exact same absorption spectra in the laboratory.
"Not a single one has been definitively assigned to a specific molecule," said Neil Reilly, a former postdoctoral fellow at Harvard-Smithsonian Center for Astrophysics and a co-author of the new paper. Now Reilly, McCarthy and their colleagues are pointing to an unusual set of molecules — silicon-terminated carbon chain radicals — as a possible source of these mysterious bands.
As they report in their new paper, the team first created silicon-containing carbon chains SiC3H, SiC4H and SiC5H in the laboratory using a jet-cooled silane-acetylene discharge. They then analyzed their spectra and carried out theoretical calculations to predict that longer chains in this family might account for some portion of the diffuse interstellar bands.
The ESA has tested a novel system that may allow the agency to safely land rovers on Mars using a quadcopter-like dropship. A fully automated, proof-of-concept Skycrane prototype was created over the course of eight months under the ESA's StarTiger program, with the system's hardware largely derived from commercially available quadcopter components.
The primary challenge for the Dropter project development team was creating a system that could successfully detect and navigate hazardous terrain without the aid of real-time human input. This is a vital feature for any potential rover delivery system: the distance between the operator and the vehicle imposes a time lag between command and execution, making a directly controlled sky crane impossible.
The new rover delivery method therefore had to be designed around an autonomous navigation system. Initially the dropship navigates to the predetermined deployment zone using GPS and inertial guidance. Once in the vicinity of the target zone, the lander switches to vision-based navigation, utilizing laser ranging and barometers to allow it to detect a safe, flat area upon which to set down its precious cargo.
Once such a site is identified, the lander drops to a height of 10 m (33 ft) above the surface and lowers the rover with the use of a bridle, gradually descending until the rover gently touches down on the planet's surface.
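The descent sequence described above (GPS cruise, vision-based site selection, a 10 m hover, then bridle lowering) maps naturally onto a simple state machine. The states, transitions, and sensor stubs below are invented for illustration; this is not the Dropter flight software.

```python
# Toy state machine for the autonomous descent sequence described above.
# State names and transition logic are invented for illustration only.
def descend(site_is_flat):
    phases = []
    phase = "GPS_CRUISE"
    while phase != "DONE":
        phases.append(phase)
        if phase == "GPS_CRUISE":
            phase = "VISION_SEARCH"  # near the target zone: switch to vision
        elif phase == "VISION_SEARCH":
            # laser ranging + cameras look for a safe, flat patch
            phase = "HOVER_10M" if site_is_flat else "RETRY_SEARCH"
        elif phase == "RETRY_SEARCH":
            phase = "HOVER_10M"      # assume a site is eventually found
        elif phase == "HOVER_10M":
            phase = "LOWER_BRIDLE"   # hold at 10 m, pay out the 5 m bridle
        elif phase == "LOWER_BRIDLE":
            phase = "DONE"           # rover touches down, bridle released
    return phases

print(descend(site_is_flat=True))
# -> ['GPS_CRUISE', 'VISION_SEARCH', 'HOVER_10M', 'LOWER_BRIDLE']
```

Structuring the autonomy as discrete phases is what lets such a vehicle operate without real-time human input: each phase has a clear completion condition, so no command from Earth is needed mid-descent.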
The culmination of eight months of development took place at Airbus’s Trauen site, located in northern Germany, where the concept dropship was put through its paces in a 40 m (131 ft) by 40 m (131 ft) recreation of the Martian surface. During the test, the lander managed to successfully use its navigation systems to safely transport a mock rover to the chosen target zone, whereupon the delivery vehicle assessed and selected a flat, safe landing site, and deployed the rover using the 5 m (16 ft) bridle.
Now, with the concept a proven success, the agency and its partners can focus on further developing the dropship for heavier, more realistic payloads.
The video below displays footage of the prototype dropship during the test at Airbus’s Trauen facility.
Communities are groups of nodes that are densely connected among themselves and sparsely connected with the rest of the network. Community structure can reveal abundant hidden information about complex networks that is not easy to detect by simple observation. There are many large-scale complex networks (systems) in the real world whose structure is not fully understood. A great deal of research has been carried out to uncover the structures of these real-world networks, to improve the ability to manage, maintain, renovate and control them. With the help of varied approaches, it is possible to shed light on the general structure of these networks, and further understand their function.
Network science methods have been used in various settings [1, 2], including social [3, 4], information, transportation, energy, ecological, disease, and biological networks [10, 11, 12, 13]. In most of these cases we can find clear community structures, which are usually associated with specific functions. However, to date, most detection methods have limitations, and there is still a lot of room to develop more general approaches.
Some methods [13, 14, 29, 34, 38, 39, 40] force every node to be assigned to a single community. This assumption doesn't always reflect real-world networks, where several overlapping communities can co-exist. For example, in social networks, a person may belong to family circles, job circles, friend circles, social circles, hobby circles and so on. Algorithms that can discover overlapping communities [16, 17, 18, 19, 20, 21, 22, 23] have been developed, and recently, methods to detect link communities [20, 24, 25] have been presented.
The concept of a link community is useful for discovering overlapping communities, as edges are more likely to have unique identities than nodes, which instead tend to have multiple identities. In addition, statistical, information-theoretic [35, 48, 53], and synchronization and dynamical clustering approaches [49, 50, 58, 59, 60] have also been developed to detect communities.
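As a concrete example of the simplest style of community detection, here is label propagation on a toy graph in pure Python: every node repeatedly adopts the most common label among its neighbors until labels stabilize. This basic variant (not one of the cited algorithms) assigns each node exactly one community, which illustrates the single-membership limitation discussed above.

```python
from collections import Counter

def label_propagation(adj, rounds=10):
    """Each node starts in its own community, then repeatedly adopts the
    most common label among its neighbors (smallest label on ties, so the
    run is deterministic). Every node ends in exactly one community --
    the single-membership limitation noted in the text."""
    labels = {node: node for node in adj}
    for _ in range(rounds):
        for node in sorted(adj):
            counts = Counter(labels[nbr] for nbr in adj[node])
            best = max(counts.values())
            labels[node] = min(l for l, c in counts.items() if c == best)
    return labels

# Two triangles (0-1-2 and 3-4-5) joined through a bridge node 6.
graph = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 6],
    3: [4, 5, 6], 4: [3, 5], 5: [3, 4],
    6: [2, 3],
}
print(label_propagation(graph))
# nodes 0-2 (plus bridge node 6) share one label; nodes 3-5 share another
```

Bridge node 6 shows the limitation: it plausibly belongs to both triangles, but the algorithm is forced to pick one, which is exactly the problem overlapping- and link-community methods address.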
The interactive map produced by researchers from Oxford University and UCL (University College London), details the histories of genetic mixing between each of the 95 populations across Europe, Africa, Asia and South America spanning the last four millennia.
The study, published this week in Science, simultaneously identifies, dates and characterises genetic mixing between populations. To do this, the researchers developed sophisticated statistical methods to analyse the DNA of 1490 individuals in 95 populations around the world. The work was chiefly funded by the Wellcome Trust and Royal Society.
'DNA really has the power to tell stories and uncover details of humanity's past,' said Dr Simon Myers of Oxford University's Department of Statistics and Wellcome Trust Centre for Human Genetics, co-senior author of the study.
'Because our approach uses only genetic data, it provides information independent from other sources. Many of our genetic observations match historical events, and we also see evidence of previously unrecorded genetic mixing. For example, the DNA of the Tu people in modern China suggests that in around 1200 CE, Europeans similar to modern Greeks mixed with an otherwise Chinese-like population. Plausibly, the source of this European-like DNA might be merchants travelling the nearby Silk Road.'
The powerful technique, christened 'Globetrotter', provides insight into past events such as the genetic legacy of the Mongol Empire. Historical records suggest that the Hazara people of Pakistan are partially descended from Mongol warriors, and this study found clear evidence of Mongol DNA entering the population during the period of the Mongol Empire. Six other populations, from as far west as Turkey, showed similar evidence of genetic mixing with Mongols around the same time.
'What amazes me most is simply how well our technique works,' said Dr Garrett Hellenthal of the UCL Genetics Institute, lead author of the study. 'Although individual mutations carry only weak signals about where a person is from, by adding information across the whole genome we can reconstruct these mixing events. Sometimes individuals sampled from nearby regions can have surprisingly different sources of mixing.
'For example, we identify distinct events happening at different times among groups sampled within Pakistan, with some inheriting DNA from sub-Saharan Africa, perhaps related to the Arab Slave Trade, others from East Asia, and yet another from ancient Europe. Nearly all our populations show mixing events, so they are very common throughout recent history and often involve people migrating over large distances.'
The team used genome data for all 1490 individuals to identify 'chunks' of DNA that were shared between individuals from different populations. Populations sharing more ancestry share more chunks, and individual chunks give clues about the underlying ancestry along chromosomes.
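The chunk-sharing idea can be illustrated with a toy comparison: represent each individual's chromosome as a sequence of ancestry-labelled segments and count how many segments two individuals share. The labels and sequences below are invented, and the real method (statistical chromosome painting) is far more sophisticated than literal position-by-position matching.

```python
# Toy illustration of DNA chunk sharing: each chromosome is a list of
# ancestry-labelled segments; more shared segments hint at more shared
# ancestry. Labels and data are invented for illustration only.
def shared_chunks(chrom_a, chrom_b):
    """Count positions where two chromosomes carry the same ancestry label."""
    return sum(1 for a, b in zip(chrom_a, chrom_b) if a == b)

person_1 = ["SPA", "WAF", "NAM", "NAM", "SPA", "WAF"]  # mixed palette
person_2 = ["SPA", "WAF", "EAS", "NAM", "SPA", "EAS"]
person_3 = ["EAS", "EAS", "EAS", "NAM", "EAS", "EAS"]

print(shared_chunks(person_1, person_2))  # many shared chunks
print(shared_chunks(person_1, person_3))  # few shared chunks
```

Populations whose members share many such chunks are inferred to share more recent ancestry, which is the raw signal Globetrotter aggregates across the whole genome to date and characterise mixing events.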
'Each population has a particular genetic "palette",' said Dr Daniel Falush of the Max Planck Institute for Evolutionary Anthropology in Leipzig, co-senior author of the study.
'If you were to paint the genomes of people in modern-day Maya, for example, you would use a mixed palette with colours from Spanish-like, West African and Native American DNA. This mix dates back to around 1670 CE, consistent with historical accounts describing Spanish and West African people entering the Americas around that time. Though we can't directly sample DNA from the groups that mixed in the past, we can capture much of the DNA of these original groups as persisting, within a mixed palette of modern-day groups. This is a very exciting development.'