Amazing Science

Scooped by Dr. Stefan Gruenwald

Cambrian Explosion of Technology: Stephen Wolfram Wants To Inject Computation Everywhere

At the 2014 SXSW Conference, Stephen Wolfram introduced the Wolfram Language, a symbolic language. His video presentation shows some of the profound implications of this new technology.


Imagine a future where there's no distinction between code and data. Where computers are operated by programming languages that work like human language, where knowledge and data are built in, where everything can be computed symbolically like the X and Y of school algebra problems. Where everything obvious is automated; the not-so-obvious revealed and made ready to explore. A future where billions of interconnected devices and ubiquitous networks can be readily harnessed by injecting computation.


That's the future Stephen Wolfram has pursued for over 25 years: Mathematica, the computable knowledge of Wolfram|Alpha, the dynamic interactivity of Computable Document Format, and soon, the universally accessible and computable model of the world made possible by the Wolfram Language and Wolfram Engine.


"Of the various things I've been trying to explain, this is one of the more difficult ones," Wolfram told Wired recently. What Wolfram Language essentially does, is work like a plug-in-play system for programmers, with many subsystems already in place.  Wolfram calls this knowledge-based programming.

Wolfram Language has a vast depth of built-in algorithms and knowledge, all automatically accessible through its elegant unified symbolic language. Scalable for programs from tiny to huge, with immediate deployment locally and in the cloud, the Wolfram Language builds on clear principles to create what Wolfram claims will be the world's most productive programming language.
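
As a rough illustration of what "computed symbolically like the X and Y of school algebra problems" means in practice, here is a small sketch using Python's SymPy library — an analogy only, not the Wolfram Language itself, and the example expressions are arbitrary:

    # Expressions stay symbolic until values are supplied, much like the
    # "X and Y of school algebra" mentioned above (SymPy analogy, not Wolfram Language).
    from sympy import symbols, solve, integrate, pi

    x, y = symbols('x y')

    # Solve x**2 - 4 == 0 symbolically; the result is exact, not a numeric approximation.
    roots = solve(x**2 - 4, x)            # [-2, 2]

    # Integrate symbolically, then substitute a value only at the end.
    area = integrate(2*x + y, (x, 0, 1))  # y + 1
    print(roots, area.subs(y, pi))        # [-2, 2]  1 + pi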

Scooped by Dr. Stefan Gruenwald

Motion Detector: Computer Game Reveals 'Space-Time' Neurons in the Eye

You open the overstuffed kitchen cabinet and a drinking glass tumbles out. With a ninjalike reflex, you snatch it before it shatters on the floor, as if the movement of the object were being tracked before the information even reached your brain. According to one idea of how the circuitry of the eye processes visual data, that is literally what happens. Now, a deep anatomical study of a mouse retina—carried out by 120,000 members of the public—is bringing scientists a step closer to confirming the hypothesis.


Researchers have known for decades that the eye does much more than just detect light. The dense patch of neurons in the retina also processes basic features of a scene before sending the information to the brain. For example, in 1964, scientists showed that some neurons in the retina fire up only in response to motion. What's more, these “space-time” detectors have so-called direction selectivity, each one sensitive to objects moving in different directions. But exactly how that processing happens in the retina has remained a mystery.


The stumbling block is a lack of fine-grained anatomical detail about how the neurons in the retina are wired up to each other. Although researchers have imaged the retina microscopically in ultrathin sections, no computer algorithm has been able to accurately trace out the borders of all the neurons to map the circuitry. At this point, only humans have good enough spatial reasoning to figure out what is part of a branching cell and what is just background noise in the images.


Enter the EyeWire project, an online game that recruits volunteers to map out those cellular contours within a mouse’s retina. The game was created and launched in December 2012 by a team led by H. Sebastian Seung, a neuroscientist at the Massachusetts Institute of Technology in Cambridge. Players navigate their way through the retina one 4.5-micrometer tissue block at a time, coloring the branches of neurons along the way. Most of the effort gets done in massive online competitions between players vying to map out the most volume. By last week, the 120,000 EyeWire players had completed 2.3 million blocks. That may sound like a lot, but it is less than 2% of the retina.
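
A quick back-of-the-envelope check of those figures (the implied total block count is an inference from the quoted numbers, not a value given in the article):

    completed_blocks = 2_300_000      # blocks mapped so far
    players = 120_000                 # registered EyeWire players
    fraction_done = 0.02              # "less than 2% of the retina"

    print(f"Implied total: ~{completed_blocks / fraction_done:,.0f} blocks")   # ~115,000,000
    print(f"Average blocks per player: ~{completed_blocks / players:.0f}")     # ~19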


The sample is already enough to reveal new features, however. The EyeWire map shows two types of retinal cells with unprecedented resolution. The first, called starburst amacrine cells (SACs), have branches spread out in a flat, plate-shaped array perpendicular to the incoming light. The second, called bipolar cells (BPs), are smaller and bushy. The BPs come in two varieties, one of which reacts to light more slowly than the other—a time delay of about 50 milliseconds. The SACs and BPs are known to be related to direction sensitivity, but exactly how they sense direction remains to be discovered.

Scooped by Dr. Stefan Gruenwald

Nanoelectronic circuits reach speeds of 245 THz, 10,000 times faster than normal microprocessors

Researchers at the National University of Singapore (NUS) have designed and manufactured circuits that can reach speeds of up to 245 THz, tens of thousands of times faster than contemporary microprocessors. The results open up possible new design routes for plasmonic-electronics, that combine nano-electronics with the fast operating speed of optics.
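
The "tens of thousands of times faster" claim is easy to sanity-check; the ~3 GHz reference clock below is an assumed typical microprocessor speed, not a figure from the article:

    circuit_hz = 245e12     # 245 THz plasmonic circuit
    cpu_hz = 3e9            # assumed ~3 GHz microprocessor clock
    print(f"Speed ratio: ~{circuit_hz / cpu_hz:,.0f}x")   # ~81,667x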


When light interacts with some metals, it can be captured in the form of collective, extremely fast oscillations of electrons called plasmons. If harnessed, the interaction of photons and electrons could be used to build ultra-fast computers (among other things). But these phenomena occur at a scale so small that we don't yet have the tools to investigate them, let alone harness them.


Assistant Professor Christian A. Nijhuis and his team have now found a way to harness quantum-plasmonic effects even with the current generation of electronics, using a process called "quantum plasmonic tunneling."


The team built a molecular-scale circuit consisting of two plasmonic resonators (structures that can convert photons into plasmons) separated by a single layer of molecules only 0.5 nanometers in size.


Using electron microscopy, Nijhuis and colleagues saw that the layer of molecules allowed the quantum plasmonic tunneling effects to take place, allowing the circuit to operate at frequencies of up to 245 THz. What's more, the frequency of the circuit could be adjusted by changing the material of the molecular layer.


This marks the first time that scientists have observed the quantum plasmonic tunneling effects directly, and is a convincing demonstration that molecular electronics can indeed handle speeds that are miles beyond those of contemporary electronics.


Future applications include plasmonic-electronics hybrids that combine nanoelectronics with the fast operating speed of optics, and single-molecule photon detectors. The researchers will now focus their efforts on trying to integrate these devices into actual electronic circuits.


The results were published in the latest issue of the journal Science.

Eli Levine's curator insight, May 1, 7:15 PM

And this is just the beginning.

Bottom floor on this?  I think I'd like.

Scooped by Dr. Stefan Gruenwald

Stanford bioengineers create circuit board modeled on the human brain, which operates 9,000 times faster

Stanford bioengineers have developed faster, more energy-efficient microchips based on the human brain – 9,000 times faster and using significantly less power than a typical PC. This offers greater possibilities for advances in robotics and a new way of understanding the brain. For instance, a chip as fast and efficient as the human brain could drive prosthetic limbs with the speed and complexity of our own actions.


For all their sophistication, computers pale in comparison to the brain. The modest cortex of the mouse, for instance, operates 9,000 times faster than a personal computer simulation of its functions.

Not only is the PC slower, it takes 40,000 times more power to run, writes Kwabena Boahen, associate professor of bioengineering at Stanford, in an article for the Proceedings of the IEEE.
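
Taken together, the two quoted figures imply an even larger gap in energy per unit of computation, since energy is power multiplied by time (a simple reading of the numbers above, not a calculation from Boahen's article):

    slowdown = 9_000        # PC simulation runs 9,000x slower than the mouse cortex
    power_ratio = 40_000    # and draws 40,000x more power
    print(f"Energy per equivalent computation: ~{slowdown * power_ratio:,}x")  # ~360,000,000x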


"From a pure energy perspective, the brain is hard to match," says Boahen, whose article surveys how "neuromorphic" researchers in the United States and Europe are using silicon and software to build electronic systems that mimic neurons and synapses.


Boahen and his team have developed Neurogrid, a circuit board consisting of 16 custom-designed "Neurocore" chips. Together these 16 chips can simulate 1 million neurons and billions of synaptic connections. The team designed these chips with power efficiency in mind. Their strategy was to enable certain synapses to share hardware circuits. The result was Neurogrid – a device about the size of an iPad that can simulate orders of magnitude more neurons and synapses than other brain mimics on the power it takes to run a tablet computer.


The National Institutes of Health funded development of this million-neuron prototype with a five-year Pioneer Award. Now Boahen stands ready for the next steps – lowering costs and creating compiler software that would enable engineers and computer scientists with no knowledge of neuroscience to solve problems – such as controlling a humanoid robot – using Neurogrid.


Its speed and low power characteristics make Neurogrid ideal for more than just modeling the human brain. Boahen is working with other Stanford scientists to develop prosthetic limbs for paralyzed people that would be controlled by a Neurocore-like chip.


"Right now, you have to know how the brain works to program one of these," said Boahen, gesturing at the $40,000 prototype board on the desk of his Stanford office. "We want to create a neurocompiler so that you would not need to know anything about synapses and neurons to able to use one of these."

Scooped by Dr. Stefan Gruenwald

Google's Street View address reading software also able to decipher CAPTCHAs

Google engineers working on software to automatically read home and business addresses off photographs taken by Street View vehicles have created a product so good that not only can it be used for address reading, it can solve CAPTCHAs as well.


CAPTCHAs are, of course, intentionally distorted words presented to humans who wish to enter a web site—to gain access, they must correctly type the word into a box. CAPTCHAs are believed to be difficult if not impossible for spam bots to decipher, and thus they serve to protect the site—at least for now.


It's sort of ironic, actually, that software has inadvertently been created that thwarts the efforts of other software engineers attempting to keep spam bots from accessing web sites. The finding was posted by Google Product Manager Vinay Shet on the Google blog.


To make Google Street View (part of Google Maps) ever smarter, engineers have been hard at work developing a sophisticated neural network based on both prior research and new image recognition techniques. The aim is to make Google's products more accurate. To display an image of a house or building given an address by a user takes a lot of computer smarts—Google connects new addresses to older known addresses, constantly updating its databases.


Presumably, the goal is to map every building in the known world to an address. But the work has produced an unexpected by-product, the very same software developed for Street View can also be used to decipher CAPTCHAs with 96 percent accuracy (98.8 percent when working on Google's own reCAPTCHA).
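
For readers curious what a character-reading neural network looks like in code, here is a deliberately tiny PyTorch sketch of a convolutional classifier. It only illustrates the general idea; Google's actual model is far larger, reads whole multi-character sequences, and is not reproduced here — the layer sizes and the 36-class output are illustrative assumptions:

    # Minimal convolutional classifier for fixed-size character crops (illustrative only).
    import torch
    import torch.nn as nn

    class TinyCharNet(nn.Module):
        def __init__(self, num_classes=36):          # 10 digits + 26 letters (assumption)
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 8 * 8, num_classes)  # for 32x32 inputs

        def forward(self, x):
            x = self.features(x)
            return self.classifier(x.flatten(1))

    net = TinyCharNet()
    dummy = torch.randn(4, 1, 32, 32)     # batch of 4 grayscale 32x32 crops
    print(net(dummy).shape)               # torch.Size([4, 36]) - one score per character class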

Scooped by Dr. Stefan Gruenwald

An Interactive Map Showing Global Cyberattacks In Real Time

Security firm Kaspersky Lab has launched an interactive cyberthreat map that visualizes cyber security incidents occurring worldwide in real time. A quick glance shows that the world is a pretty scary place.


The interactive map is a promotional tool created by Kaspersky Lab, but it's fascinating nonetheless. Threats displayed include malicious objects detected during on-access and on-demand scans, email and web antivirus detections, as well as objects identified by vulnerability and intrusion detection sub-systems.


Every day Kaspersky Lab handles more than 300,000 malicious objects. Three years ago the figure was just 70,000 but antivirus technologies have also changed with the times and we have no problem coping with this huge stream of traffic. Where do the attacks come from? Where do users click on malicious links most often? Which types of malware are the most prevalent? These are the sort of questions being asked by lots of users. Our new map of the cyberworld threat landscape allows everyone to see the scale of cyber activity in real time and to get a taste of what it feels like to be one of our experts.

Scooped by Dr. Stefan Gruenwald

NASA set to release online software catalog

Get ready for a stimulating software catalog. You may want to write NASA CAT. next to Thursday, April 10, on your calendar. That is the day that the National Aeronautics and Space Administration (NASA) is to make available to the public, at no cost, more than 1,000 codes with its release of a new online software catalog. The catalog, a master list organized into 15 categories, is intended for industry, academia, other government agencies, and the general public. The catalog covers technology topics including project management systems, design tools, data handling, image processing, solutions for life support functions, aeronautics, structural analysis, and robotic and autonomous systems. NASA said the codes represent NASA's best solutions to an array of complex mission requirements.

"Software is an increasingly important element of the agency's intellectual asset portfolio," said Jim Adams, deputy chief technologist with NASA. "It makes up about one-third of its reported inventions each year." With this month's release of the software catalog, he said, the software becomes widely available to the public. Each NASA code was evaluated, however, for access restrictions and designated for a specific type of release, ranging from codes open to all U.S. citizens to codes restricted to use by other federal agencies.


The catalog nonetheless fits into NASA's ongoing efforts to transfer more NASA technologies to American industry and U.S. consumers. As Wired's Robert McMillan wrote on Friday, "This NASA software catalog will list more than 1,000 projects, and it will show you how to actually obtain the code you want. The idea is to help hackers and entrepreneurs push these ideas in new directions—and help them dream up new ideas."


Adams said, "By making NASA resources more accessible and usable by the public, we are encouraging innovation and entrepreneurship. Our technology transfer program is an important part of bringing the benefit of space exploration back to Earth for the benefit of all people."


Daniel Lockney, technology transfer program executive with NASA's Office of the Chief Technologist, underscored this down-to-earth side of NASA's mission in a 2012 article in Innovation. "NASA really is the gold standard for technology transfer," he said at the time. "The money spent on research and development doesn't just go up into space; it comes down to earth in the form of some very practical and tangible results."


Lockney said they know the investment in technology creates jobs, boosts the economy and provides benefits beyond the mission focus. "Our technologies have done everything from making hospitals more efficient to making transportation safer and greener. The technology reaches into all aspects of our lives."

Scooped by Dr. Stefan Gruenwald

"Design of a Superconducting Quantum Computer" - Talk by John Martinis

Superconducting quantum computing is now at an important crossroad, where "proof of concept" experiments involving small numbers of qubits can be transitioned to more challenging and systematic approaches that could actually lead to building a quantum computer. Our optimism is based on two recent developments: a new hardware architecture for error detection based on "surface codes" [1], and recent improvements in the coherence of superconducting qubits [2]. I will explain how the surface code is a major advance for quantum computing, as it allows one to use qubits with realistic fidelities, and has a connection architecture that is compatible with integrated circuit technology. Additionally, the surface code allows quantum error detection to be understood using simple principles. I will also discuss how the hardware characteristics of superconducting qubits map into this architecture, and review recent results that suggest gate errors can be reduced to below that needed for the error detection threshold. 

References 

[1] Austin G. Fowler, Matteo Mariantoni, John M. Martinis and Andrew N. Cleland, PRA 86, 032324 (2012). 
[2] R. Barends, J. Kelly, A. Megrant, D. Sank, E. Jeffrey, Y. Chen, Y. Yin, B. Chiaro, J. Mutus, C. Neill, P. O'Malley, P. Roushan, J. Wenner, T. C. White, A. N. Cleland and John M. Martinis, arXiv:1304.2322.
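
To get a rough feel for why the surface code "allows one to use qubits with realistic fidelities": a commonly quoted approximation is that the logical error rate falls as (p/p_th)^((d+1)/2), where p is the physical error rate, p_th the threshold (around 1%), and d the code distance. The numbers below are illustrative assumptions, not values from the talk:

    # Error suppression grows exponentially with code distance once p is below threshold.
    p, p_th = 0.001, 0.01          # 0.1% physical error rate, ~1% threshold (assumed values)
    for d in (3, 5, 7):
        print(d, (p / p_th) ** ((d + 1) / 2))
    # d=3 -> 1e-2, d=5 -> 1e-3, d=7 -> 1e-4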

Scooped by Dr. Stefan Gruenwald

CODE_n: Data Visualizations in Grande Scale Shown at CeBIT 2014

I guess that CODE_n [http://kramweisshaar.com], developed by design agency Kram/Weisshaar, is best appreciated when perceived in the flesh, that is, at the Hannover Fairgrounds during CeBIT 2014 in Hannover, Germany.


CODE_n consists of more than 3,000 square meters (approx. 33,000 sq ft) of ink-jet printed textile membranes, stretching more than 260 meters of floor-to-ceiling terapixel graphics. The 12.5-terapixel, 90-meter-long wall-like canopy titled "Retrospective Trending" shows over 400 lexical frequency timelines spanning the years 1800 to 2008, each generated using Google's Ngram tool. The hundreds of search terms relate to ethnographic themes of politics, economics, engineering, science, technology, mathematics, and philosophy, resulting in historical trajectories of word usage over time.
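
The "lexical frequency timelines" are the same kind of curve anyone can compute from Google's downloadable Books Ngram datasets. The sketch below assumes the standard tab-separated layout of those files (ngram, year, match_count, volume_count); the filename and search word are placeholders:

    # Build a word-frequency timeline from a downloaded Google Books Ngram file.
    # Assumes the documented TSV layout: ngram<TAB>year<TAB>match_count<TAB>volume_count.
    import csv
    from collections import defaultdict

    counts = defaultdict(int)
    with open("googlebooks-eng-all-1gram-20120701-c.tsv") as f:   # placeholder filename
        for ngram, year, match_count, _ in csv.reader(f, delimiter="\t"):
            if ngram.lower() == "computer" and 1800 <= int(year) <= 2008:
                counts[int(year)] += int(match_count)

    for year in sorted(counts)[-5:]:      # print the last few years of the timeline
        print(year, counts[year])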


The 6.2 terapixel "Hydrosphere Hyperwall" is a visualization of the global ocean as dynamic pathways, polychrome swathes of sea climate, data-collecting swarms of mini robots and sea animals, as well as plumes of narrow current systems. NASA's ECCO2 maps were interwoven with directional arrows that specify wind direction and data vectors that represent buoys, cargo floats, research ships, wave gliders, sea creatures and research stations.


Finally, the 6.6 terapixel "Human Connectome" is a morphological map of the human brain. Consisting of several million multi-coloured fibre bundles and white matter tracts that were captured by diffusion-MRIs, the structural descriptions of the human mind were generated at 40 times the scale of the human body. The 3D map of human neural connections visualizes brain dynamics on an ultra-macro scale as well as the infinitesimal cell-scale.

Scooped by Dr. Stefan Gruenwald

UCLA engineering team increases power efficiency for future computer processors

Have you ever wondered why your laptop or smartphone feels warm when you're using it? That heat is a byproduct of the microprocessors in your device using electric current to power computer processing functions — and it is actually wasted energy.
 
Now, a team led by researchers from the UCLA Henry Samueli School of Engineering and Applied Science has made major improvements in computer processing using an emerging class of magnetic materials called "multiferroics," and these advances could make future devices far more energy-efficient than current technologies.
 
With today's device microprocessors, electric current passes through transistors, which are essentially very small electronic switches. Because current involves the movement of electrons, this process produces heat — which makes devices warm to the touch. These switches can also "leak" electrons, making it difficult to completely turn them off. And as chips continue to get smaller, with more circuits packed into smaller spaces, the amount of wasted heat grows.
 
The UCLA Engineering team used multiferroic magnetic materials to reduce the amount of power consumed by "logic devices," a type of circuit on a computer chip dedicated to performing functions such as calculations. A multiferroic can be switched on or off by applying alternating voltage — the difference in electrical potential. It then carries power through the material in a cascading wave through the spins of electrons, a process referred to as a spin wave bus.
 
A spin wave can be thought of as similar to an ocean wave, which keeps water molecules in essentially the same place while the energy is carried through the water, as opposed to an electric current, which can be envisioned as water flowing through a pipe, said principal investigator Kang L. Wang, UCLA's Raytheon Professor of Electrical Engineering and director of the Western Institute of Nanoelectronics (WIN).
 
"Spin waves open an opportunity to realize fundamentally new ways of computing while solving some of the key challenges faced by scaling of conventional semiconductor technology, potentially creating a new paradigm of spin-based electronics," Wang said.
 
The UCLA researchers were able to demonstrate that using this multiferroic material to generate spin waves could reduce wasted heat and therefore increase power efficiency for processing by up to 1,000 times. Their research is published in the journal Applied Physics Letters.

Scooped by Dr. Stefan Gruenwald

Are you ready for the Internet of Cops?

FirstNet — a state-of-the-art communications network for paramedics, firemen and law enforcement at the federal, state and local level — will give cops on the streets unprecedented technological powers, and possibly hand over even more intimate data about our lives to the higher ends of the government and its intelligence agencies, Motherboard reports.


According to a series of presentation slides from December last year, FirstNet will be the “MOST secure wireless network ever built,” as a dedicated 4G network just for first responders.


FirstNet will allow users to “tag” a disaster victim with a small device to allow patients’ vital signs to be monitored from a control center, allowing medical staff to keep an eye on who needs treatment the most at any one time. But FirstNet will also give local law enforcement the ability to take digital “fingerprints from the field,” record and share high-quality video, with facial recognition, and instantaneously marry these freshly sourced data with others over the network.


The uses of FirstNet — biometric data gathering, license plate readers and high speed information sharing — are explicit aims of the project, as laid out in presentations and other documents, along with a possible “kill switch” to disable the civilian network in emergencies.


There is also the possibility that this will create a new means for the federal government to harvest massive quantities of the biometric data being collected by local agencies.


In related news last week, under a surveillance program codenamed Optic Nerve, Britain’s surveillance agency GCHQ, with aid from the NSA, collected millions of still images of Yahoo webcam chats in bulk and saved them to agency databases, The Guardian reported. The agencies also collected gamers’ chats and deployed real-life agents into World of Warcraft and Second Life.


Scooped by Dr. Stefan Gruenwald

IBM wants to put the power of Watson in your smartphone

Watson, IBM's Jeopardy-conquering supercomputer, has set its sights on mobile apps. Not long ago, the recently created Watson Business Group announced that it would offer APIs to developers to create cloud-based apps built around cognitive computing. Now IBM is launching a competition to lure mobile app creators to its new platform. Over the next three months the company will be taking submissions that leverage Watson's unique capabilities, like deep data analysis and natural language processing, to put impossibly powerful tools in the palm of your hand. IBM is hoping for apps that "change the way consumers and businesses interact with data on their mobile devices." It's an ambitious goal, but considering the way Watson spanked Ken Jennings, it seems well within its reach. The machine has already changed the way we view computers and artificial intelligence, not only by winning Jeopardy, but by making cancer treatment decisions and attending college. Now it wants to make your smartphone smarter than you could ever hope to be.

Scooped by Dr. Stefan Gruenwald

Scientists create Chameleon virus that could move undetected between Wi-Fi access points

We all know to look out for viruses that can be spread over the internet, or by sharing files between computers. Now, however, scientists at the University of Liverpool have shown for the first time that special viruses could move between wireless access points using existing Wi-Fi networks – as efficiently as the common cold virus spreads between people through the air.


The team computer-simulated an attack by a virus known as Chameleon, which they created. Although the virus didn't affect the functions of the access points (APs) or users' computers, it was able to access and report the credentials of all the people who were using those APs at the time. Some APs were impregnable due to encryption or password protection, but in those cases Chameleon would just move on to other more vulnerable access points.


Due to the fact that existing anti-virus software is only designed to look for viruses in computers or on the internet, the virus itself remained undetected.


The simulated attack was set in London and Belfast. Just like the cold virus spreads quicker in crowded cities, Chameleon spread faster in situations where multiple APs were located in close proximity to one another.
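
A toy way to see that density effect — an illustration built on the third-party networkx package with made-up parameters, not the researchers' actual Chameleon simulation: access points within radio range form a graph, and the infection hops along its edges while skipping protected nodes.

    import random
    import networkx as nx

    def simulate(n_aps=200, radius=0.08, protected_frac=0.3, steps=30, seed=1):
        random.seed(seed)
        g = nx.random_geometric_graph(n_aps, radius, seed=seed)   # APs scattered in a unit square
        protected = {n for n in g if random.random() < protected_frac}
        infected = {next(n for n in g if n not in protected)}     # one initially infected AP
        for _ in range(steps):
            reachable = {nbr for n in infected for nbr in g.neighbors(n)}
            infected |= reachable - protected                     # encrypted APs are skipped
        return len(infected) / n_aps

    # A larger radius stands in for denser coverage: more APs in range of each other.
    for radius in (0.05, 0.10, 0.15):
        print(f"radius {radius}: {simulate(radius=radius):.0%} of APs infected")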


"Wi-Fi connections are increasingly a target for computer hackers because of well-documented security vulnerabilities, which make it difficult to detect and defend against a virus," said Prof. Alan Marshall, who took part in the study. "It was assumed, however, that it wasn’t possible to develop a virus that could attack Wi-Fi networks, but we demonstrated that this is possible and that it can spread quickly. We are now able to use the data generated from this study to develop a new technique to identify when an attack is likely."

Scooped by Dr. Stefan Gruenwald

Extreme: Sony Crams 3,700 Blu-Rays' Worth of Storage onto a Single Cassette Tape

There was a time, in computing's not-so-distant past, when magnetic tape was the best way to back up large amounts of data. In the mid-90s, tape could store tens or hundreds of gigabytes, while hard drive capacities were still mostly measured in megabytes. That would soon change, of course, with the advent of writable optical media and cheap, large hard drives, but even today tape drives still hang around as one of the best options for mass data backup. Now, Sony has developed a new technology that pushes tape drives far beyond where they once were, leading to individual tapes with 185 terabytes of storage capacity.


Back in 2010, the standing record for how much data magnetic tape could store was 29.5GB per square inch. To compare, a standard dual-layer Blu-ray disc can hold 25GB per layer — this is why big budget, current-gen video games can clock in at around 40 or 50GB. That, however, is an entire disc, whereas magnetic tape could store more than half of that capacity in one little square inch. Sony has announced that it has developed a new magnetic tape material that demolishes the previous 29.5GB record, and can hold a whopping 148GB per square inch, making it the new record holder of storage density for the medium. If spooled into a cartridge, each tape could have a mind-boggling 185TB of storage. Again, to compare, that’s 3,700 dual-layer 50GB Blu-rays (a stack that would be 14.3 feet or 4.4 meters high, incidentally). In fact, one of these tapes would hold five more terabytes than a $9,305 hard drive storage array.
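
The comparison is easy to verify; the 1.2 mm disc thickness used for the stack height is the standard Blu-ray spec, taken here as an assumption:

    tape_tb = 185                       # one cartridge, terabytes
    bluray_gb = 50                      # dual-layer disc
    disc_thickness_m = 1.2e-3           # 1.2 mm per disc (assumed)

    discs = tape_tb * 1000 / bluray_gb
    stack_m = discs * disc_thickness_m
    print(f"{discs:,.0f} discs, stack ~{stack_m:.1f} m (~{stack_m * 3.281:.1f} ft)")
    # -> 3,700 discs, stack ~4.4 m (~14.6 ft)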


In order to create the new tape, Sony employed the use of sputter deposition, which creates layers of magnetic crystals by firing argon ions at a polymer film substrate. Combined with a soft magnetic under-layer, the magnetic particles measured in at just 7.7 nanometers on average, able to be closely packed together.


Perhaps surprisingly, storage tape shipments grew 13% two years ago, and were headed for 26% growth just last year. Sony also stated that it would like to commercialize the new material — as well as continue developing its sputter deposition methods — but did not say if or when that will happen. While 185TB of storage sitting on a single cartridge is extremely appealing for people with large digital collections — music, games, or really any kind of media — it’s best to remember that tape has never been an easy-access storage medium. Read and write times feel like (and often are) an eternity, and tape is used mainly for safe-keeping backup, rather than because you have too much music on your SSD and want to free up space for a new game. Still, when it comes to massive, non-time-sensitive storage, tape libraries remain one of the most common methods used by big corporations.

Scooped by Dr. Stefan Gruenwald

Harvard scientists created quantum switch that can be turned on and off using a single photon

Harvard researchers have succeeded in creating quantum switches that can be turned on and off using a single photon, a technological achievement that could pave the way for creating highly secure quantum networks.


Built from single atoms, the first-of-their-kind switches could one day be networked via fiber-optic cables to form the backbone of a “quantum Internet” that allows for perfectly secure communications, said Professor of Physics Mikhail Lukin, who, together with Professor Vladan Vuletic of MIT, led a team consisting of graduate students Jeff Thompson and Lee Liu and postdoctoral fellows Tobias Tiecke and Nathalie de Leon to construct the new system. Their research is detailed in a recently published paper in the journal Nature.


“From a technical standpoint, it’s a remarkable accomplishment,” Lukin said of the advance. “Conceptually, the idea is very simple: Push the conventional light switch to its ultimate limit. What we’ve done here is to use a single atom as a switch that, depending on its state, can open or close the flow of photons … and it can be turned on and off using a single photon.”


Though the switches could be used to build a quantum computer, Lukin said it’s unlikely the technology will show up in the average desktop computer. Where it will be used, he said, is in creating fiber-optical networks that use quantum cryptography, a method for encrypting communications using the laws of quantum mechanics to allow for perfectly secure information exchanges. Such systems make it impossible to intercept and read messages sent over a network, because the very act of measuring a quantum object changes it, leaving behind telltale signs of the spying.


“It’s unlikely everyone would need this type of technology,” he said. “But there are some realistic applications that could someday have transformative impact on our society. At present, we are limited to using quantum cryptography over relatively short distances — tens of kilometers. Based on the new advance, we may eventually be able to extend the range of quantum cryptography to thousands of kilometers.”


Importantly, Tiecke said, their system is highly scalable, and could one day allow for the fabrication of thousands of such switches in a single device.

Scooped by Dr. Stefan Gruenwald

New quantum gate seen as an essential logic element for future quantum computers

Physicists from the Max Planck Institute of Quantum Optics in Garching have developed a novel quantum gate, an essential component of quantum computers. A future quantum computer would be able to handle certain types of tasks far faster than any classical computer. As a central element of their quantum gate, the Max Planck physicists are using an atom trapped between two mirrors of a resonator. By reflecting the photon off the resonator with the atom, they are able to switch the state of the photon. Moreover, the gate operation can entangle the atom with the photon. When quantum particles are entangled, their properties become interdependent. Entanglement opens up whole new horizons in information processing. The quantum gate recently presented by the Garching-based physicists makes it possible to design quantum networks in which information is transferred between remote quantum processors in the form of photons.


The purpose of the experiments is to explore ways to process data in the form of quantum bits, or qubits for short. Whereas classical bits only exist in the states of “0” or “1”, in qubits superpositions of these two states are possible. When several qubits are combined into a single unit – a phenomenon known as entanglement – it is possible to perform parallel calculations that would simply be inconceivable with conventional computers. “A quantum gate such as the one we have developed is an essential component in the construction of a quantum computer,” says Stephan Ritter, who heads the experiment.


A CNOT gate couples a control bit with a target bit: whether or not the gate flips the state of the target bit depends on the state of the control bit. All logic circuits required for quantum calculations can be realized with this logic element and a few other simple operations. Many such logic elements are needed to build a quantum computer. A quantum computer could, within a reasonable period of time, perform intricate searches in databases that would take even the fastest computers today months to complete. In addition, a quantum computer could break the encryption commonly used today. To prevent eavesdroppers from gaining access to transmitted data, quantum information technology has a tried-and-tested trick up its sleeve: quantum cryptography, which stops spies from tapping information from a data line undetected.
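
The CNOT behavior — the gate flips the target bit only when the control bit is set — can be written out explicitly. Below is a textbook NumPy illustration of the gate acting on the four basis states, not the Garching group's atom-photon implementation:

    import numpy as np

    CNOT = np.array([[1, 0, 0, 0],      # basis order |00>, |01>, |10>, |11>
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    ket = lambda bits: np.eye(4)[int(bits, 2)]     # "10" -> the |10> basis vector

    for b in ("00", "01", "10", "11"):
        out = CNOT @ ket(b)
        print(f"|{b}> -> |{format(int(np.argmax(out)), '02b')}>")
    # |00> -> |00>, |01> -> |01>, |10> -> |11>, |11> -> |10>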

Scooped by Dr. Stefan Gruenwald

Programming for Scientists: 139 Python VIDEO Lectures from PyCon US 2014

PyCon is the largest annual gathering for the community using and developing the open-source Python programming language. It is produced and underwritten by the Python Software Foundation, the 501(c)(3) nonprofit organization dedicated to advancing and promoting Python. Through PyCon, the PSF advances its mission of growing the international community of Python programmers.


Because PyCon is backed by the non-profit PSF, we keep registration costs much lower than comparable technology conferences so that PyCon remains accessible to the widest group possible. The PSF also pays for the ongoing development of the software that runs PyCon and makes it available under a liberal open source license.


PyCon is a diverse conference dedicated to providing an enjoyable experience to everyone. Help us do this by following our code of conduct.


https://us.pycon.org/2014/

Date: April 9, 2014

Scooped by Dr. Stefan Gruenwald

New body-hack app shortcuts jet-lag recovery

A new jet-lag mobile app called Entrain released by University of Michigan mathematicians reveals previously unknown shortcuts that can help travelers entrain (synchronize) their circadian rhythms to new time zones as efficiently as possible.


Entrain is built around the premise that light, particularly from the sun and in wavelengths that appear to our eyes as the color blue, is the strongest signal to regulate circadian rhythms. These fluctuations in behaviors and bodily functions, tied to the planet’s 24-hour day, do more than guide us to eat and sleep. They govern processes in each one of our cells.


The study, published April 10, 2014, in Public Library of Science Computational Biology (open access journal), relies on two leading mathematical models that have been shown to accurately describe human circadian rhythms. The researchers used these equations and a technique called optimal control theory to calculate ideal adjustment schedules for more than 1,000 possible trips.


The app gives users access to these schedules. Start by entering your typical hours of light and darkness in your current time zone, then choose the time zone you’re traveling to and when, as well as the brightest light you expect to spend the most time in during your trip (indoor or outdoor). The app offers a specialized plan and predicts how long it will take you to adjust.
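
As a very crude point of comparison — not the paper's optimal-control model — a common rule of thumb is that the body clock shifts by roughly one hour per day on its own, and that well-timed light can roughly double that rate; both figures below are rule-of-thumb assumptions:

    import math

    def days_to_adjust(hours_shifted, hours_per_day):
        return math.ceil(abs(hours_shifted) / hours_per_day)

    shift = 8   # e.g. crossing eight time zones (example value)
    print("Unaided:          ", days_to_adjust(shift, 1.0), "days")   # 8
    print("Well-timed light: ", days_to_adjust(shift, 2.0), "days")   # 4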


The shortcuts the app offers are custom schedules of light and darkness depending on the itinerary. The schedules boil down to one block of time each day when you should seek the brightest light possible and another when you should put yourself in the dark, or at least in dim light. You don’t even have to be asleep.


If you must go outside, you can wear pink-tinted glasses to block blue wavelength light, the researchers say. And if the app prescribes “bright outdoor light” in the middle of the night, a therapeutic lightbox can do the job — yes, its shortcuts sometimes require odd hours.


The Entrain app is available now as a free app in the Apple store.

Scooped by Dr. Stefan Gruenwald

MIT: New ‘switch’ could power quantum computing

MIT: A light lattice that traps atoms may help scientists build networks of quantum information transmitters.


Using a laser to place individual rubidium atoms near the surface of a lattice of light, scientists at MIT and Harvard University have developed a new method for connecting particles — one that could help in the development of powerful quantum computing systems.


The new technique, described in a paper published today in the journal Nature, allows researchers to couple a lone atom of rubidium, a metal, with a single photon, or light particle. This allows both the atom and photon to switch the quantum state of the other particle, providing a mechanism through which quantum-level computing operations could take place.


Moreover, the scientists believe their technique will allow them to increase the number of useful interactions occurring within a small space, thus scaling up the amount of quantum computing processing available.


“This is a major advance of this system,” says Vladan Vuletić, a professor in MIT’s Department of Physics and Research Laboratory for Electronics (RLE), and a co-author of the paper. “We have demonstrated basically an atom can switch the phase of a photon. And the photon can switch the phase of an atom.”


That is, photons can have two polarization states, and interaction with the atom can change the photon from one state to another; conversely, interaction with the photon can change the atom’s phase, which is equivalent to changing the quantum state of the atom from its “ground” state to its “excited” state. In this way the atom-photon coupling can serve as a quantum switch to transmit information — the equivalent of a transistor in a classical computing system. And by placing many atoms within the same field of light, the researchers may be able to build networks that can process quantum information more effectively.

Scooped by Dr. Stefan Gruenwald

Where Art Meets Math: The Hypnotic Animated Gifs of David Szakaly

Since 2008 Hungarian/German graphic designer David Szakaly has been churning out some of the most dizzying, hypnotic and wholly original gifs on the web under the name Davidope. His blend of twisting organic forms, flashes of black and white, and forays into pulsing technicolor shapes have inspired legions of others to experiment with the medium, many of whom have been featured here on Colossal. It’s hard to determine the scale of Szakaly’s influence online, but a simple Google image search for “animated gif” brings up dozens of his images that have been shared around Tumblr hundreds of thousands of times.


Szakaly began experimenting with the vector animation program Macromedia Flash back in 1999 where he used the software to create presentations, banners, and other creatives for clients. It was nearly a decade later when he decided to dedicate more time to experimenting with motion graphics and found that Tumblr was a great platform to share his quirky gifs. While he still works in the corporate world on other digital projects, he has also found commercial success making animations for clients around the world. Though it’s his personal work that really stands out. If or when gifs end up on gallery walls, it will be hard to deny Szakaly’s role in getting them there.

Scooped by Dr. Stefan Gruenwald

Face-To-Face: Crude Mugshots built from DNA data alone

Computer program crudely predicts a facial structure from genetic variations.


Researchers have now shown how 24 gene variants can be used to construct crude models of facial structure. Thus, leaving a hair at a crime scene could one day be as damning as leaving a photograph of your face. Researchers have developed a computer program that can create a crude three-dimensional (3D) model of a face from a DNA sample.


Using genes to predict eye and hair color is relatively easy. But the complex structure of the face makes it more valuable as a forensic tool — and more difficult to connect to genetic variation, says anthropologist Mark Shriver of Pennsylvania State University in University Park, who led the work, published today in PLOS Genetics.


Shriver and his colleagues took high-resolution images of the faces of 592 people of mixed European and West African ancestry living in the United States, Brazil and Cape Verde. They used these images to create 3D models, laying a grid of more than 7,000 data points on the surface of the digital face and determining by how much particular points on a given face varied from the average: whether the nose was flatter, for instance, or the cheekbones wider. They had volunteers rate the faces on a scale of masculinity and femininity, as well as on perceived ethnicity.


Next, the authors compared the volunteers’ genomes to identify points at which the DNA differed by a single base, called a single nucleotide polymorphism (SNP). To narrow down the search, they focused on genes thought to be involved in facial development, such as those that shape the head in early embryonic development, and those that are mutated in disorders associated with features such as cleft palate. Then, taking into account the person’s sex and ancestry, they calculated the statistical likelihood that a given SNP was involved in determining a particular facial feature.


This pinpointed 24 SNPs across 20 genes that were significantly associated with facial shape. A computer program the team developed using the data can turn a DNA sequence from an unknown individual into a predictive 3D facial model (see 'Face to face'). Shriver says that the group is now trying to integrate more people and genes, and look at additional traits, such as hair texture and sex-specific differences.
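
In spirit, the prediction step can be pictured as a regression from genotypes (plus sex and ancestry) to landmark offsets from an average face. The sketch below is purely illustrative: synthetic random data and a plain least-squares fit, not the study's actual statistical pipeline:

    import numpy as np

    rng = np.random.default_rng(0)
    n_people, n_snps, n_landmarks = 592, 24, 7000

    X = np.column_stack([
        rng.integers(0, 3, size=(n_people, n_snps)),   # 24 SNP genotypes, coded 0/1/2
        rng.integers(0, 2, size=(n_people, 1)),        # sex
        rng.random((n_people, 1)),                     # ancestry proportion
    ])
    Y = rng.normal(size=(n_people, n_landmarks * 3))   # x,y,z offsets per landmark (synthetic)

    coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)     # fit the linear map
    new_person = X[:1]                                 # pretend this is a new DNA sample
    predicted_offsets = new_person @ coeffs            # crude "DNA -> face" prediction
    print(predicted_offsets.shape)                     # (1, 21000)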

Scooped by Dr. Stefan Gruenwald

Internet surveillance predicts disease outbreak before WHO

Have you ever Googled for an online diagnosis before visiting a doctor? If so, you may have helped provide early warning of an infectious disease epidemic.


In a new study published in Lancet Infectious Diseases, Internet-based surveillance has been found to detect infectious diseases such as Dengue Fever and Influenza up to two weeks earlier than traditional surveillance methods, according to Queensland University of Technology (QUT) research fellow and senior author of the paper Wenbiao Hu.


Hu, based at the Institute for Health and Biomedical Innovation, said there was often a lag time of two weeks before traditional surveillance methods could detect an emerging infectious disease.


“This is because traditional surveillance relies on the patient recognizing the symptoms and seeking treatment before diagnosis, along with the time taken for health professionals to alert authorities through their health networks. In contrast, digital surveillance can provide real-time detection of epidemics.”


Hu said the study used search engine algorithms such as Google Trends and Google Insights. It found that detecting the 2005–06 avian influenza outbreak “Bird Flu” would have been possible between one and two weeks earlier than official surveillance reports.


“In another example, a digital data collection network was found to be able to detect the SARS outbreak more than two months before the first publications by the World Health Organization (WHO),” Hu said.
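
The kind of signal such studies monitor can be pulled with the unofficial third-party pytrends package (pip install pytrends). This is only an illustration of a search-interest time series plus a naive alert rule — not the authors' surveillance pipeline — and the keyword, region and threshold are arbitrary choices:

    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=0)
    pytrends.build_payload(["dengue fever symptoms"], timeframe="today 12-m", geo="AU")
    interest = pytrends.interest_over_time()            # weekly relative search volume

    # A naive early-warning rule: flag weeks well above the recent rolling average.
    baseline = interest["dengue fever symptoms"].rolling(8).mean()
    alerts = interest[interest["dengue fever symptoms"] > 1.5 * baseline]
    print(alerts.head())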

Rescooped by Dr. Stefan Gruenwald from Tracking the Future

How do you build a large-scale quantum computer?

How do you build a universal quantum computer? Turns out, this question was addressed by theoretical physicists about 15 years ago. The answer was laid out in a research paper and has become known as the DiVincenzo criteria. The prescription is pretty clear at a glance; yet in practice the physical implementation of a full-scale universal quantum computer remains an extraordinary challenge.


To glimpse the difficulty of this task, consider the guts of a would-be quantum computer. The computational heart is composed of multiple quantum bits, or qubits, that can each store 0 and 1 at the same time. The qubits can become “entangled,” or correlated in ways that are impossible in conventional devices. A quantum computing device must create and maintain these quantum connections in order to have a speed and storage advantage over any conventional computer. That’s the upside. The difficulty arises because harnessing entanglement for computation only works when the qubits are almost completely isolated from the outside world. Isolation and control becomes much more difficult as more and more qubits are added into the computer. Basically, as quantum systems are made bigger, they generally lose their quantum-ness.  


In pursuit of a quantum computer, scientists have gained amazing control over various quantum systems. One leading platform in this broad field of research is trapped atomic ions, where nearly 20 qubits have been juxtaposed in a single quantum register. However, scaling this or any other type of qubit to much larger numbers while still contained in a single register will become increasingly difficult, as the connections will become too numerous to be reliable.


Physicists led by ion-trapper Christopher Monroe at the JQI have now proposed a modular quantum computer architecture that promises scalability to much larger numbers of qubits. This research is described in the journal Physical Review A (reference below), a topical journal of the American Physical Society. The components of this architecture have individually been tested and are available, making it a promising approach. In the paper, the authors present expected performance and scaling calculations, demonstrating that their architecture is not only viable, but in some ways, preferable when compared to related schemes.

Individual qubit modules are at the computational center of this design, each one consisting of a small crystal of perhaps 10-100 trapped ions confined with electromagnetic fields. Qubits are stored in each atomic ion’s internal energy levels. Logical gates can be performed locally within a single module, and two or more ions can be entangled using the collective properties of the ions in a module.


One or more qubits from the ion trap modules are then networked through a second layer of optical fiber photonic interconnects. This higher-level layer hybridizes photonic and ion-trap technology, where the quantum state of the ion qubits is linked to that of the photons that the ions themselves emit. Photonics is a natural choice as an information bus as it is proven technology and already used for conventional information flow. In this design, the fibers are directed to a reconfigurable switch, so that any set of modules could be connected.


The switch system, which incorporates special micro-electromechanical mirrors (MEMs) to direct light into different fiber ports, would allow for entanglement between arbitrary modules and on-demand distribution of quantum information.


Via Szabolcs Kósa

Andreas Pappas's curator insight, March 28, 4:40 AM

This article shows how scientists can increase the scale of quantum machine while still making them behave quantum mechanically by reading the qu-bits with lasers instead of conventional wiring.

Scooped by Dr. Stefan Gruenwald

Qualcomm Is Developing Brain-Inspired Computing - The Zeroth Processor

Qualcomm’s technologies are designed from the ground-up with speed and power efficiency in mind. This way, devices that use our products can run smoothly and maximize battery life driven experiences. As mobile computing becomes increasingly pervasive, so do our expectations of the devices we use and interact with in our everyday lives. We want these devices to be smarter, anticipate our needs, and share our perception of the world so we can interact with them more naturally. The computational complexity of achieving these goals using traditional computing architectures is quite challenging, particularly in a power- and size-constrained environment vs. in the cloud and using supercomputers.


For the past few years our Research and Development teams have been working on a new computer architecture that breaks the traditional mold. We wanted to create a new computer processor that mimics the human brain and nervous system so devices can have embedded cognition driven by brain inspired computing—this is Qualcomm Zeroth processing.


Biologically Inspired Learning


We want Qualcomm Zeroth products to not only mimic human-like perception but also have the ability to learn how biological brains do.  Instead of preprogramming behaviors and outcomes with a lot of code, we’ve developed a suite of software tools that enable devices to learn as they go and get feedback from their environment.


In the video below, we outfitted a robot with a Qualcomm Zeroth processor and placed it in an environment with colored boxes. We were then able to teach it to visit white boxes only. We did this through dopaminergic-based learning, a.k.a. positive reinforcement—not by programming lines of code.


Another major pillar of Zeroth processor function is striving to replicate the efficiency with which our senses and our brain communicate information. Neuroscientists have created mathematical models that accurately characterize biological neuron behavior when they are sending, receiving or processing information. Neurons send precisely timed electrical pulses referred to as “spikes” only when a certain voltage threshold in a biological cell’s membrane is reached. These spiking neural networks (SNN) encode and transmit data very efficiently in both how our senses gather information from the environment and then how our brain processes and fuses all of it together.
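
That "spike when a voltage threshold is reached" behavior is captured by the textbook leaky integrate-and-fire model. The toy below uses generic parameter values and is not Qualcomm's Zeroth neuron model:

    import numpy as np

    dt, tau, v_thresh, v_reset = 1.0, 20.0, 1.0, 0.0   # ms, ms, arbitrary units
    v, spikes = 0.0, []

    current = np.r_[np.zeros(50), 0.06 * np.ones(200), np.zeros(50)]  # input pulse
    for t, i_in in enumerate(current):
        v += dt / tau * (-v + i_in * tau)    # membrane leaks toward 0 and integrates input
        if v >= v_thresh:                    # threshold crossed: emit a spike and reset
            spikes.append(t)
            v = v_reset
    print(f"{len(spikes)} spikes at times (ms): {spikes[:5]}...")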

Scooped by Dr. Stefan Gruenwald

Prototype holographic memory stores data in magnetic holograms with quick read-out access

A new type of memory device based on the interference of spin waves has been unveiled by scientists in the US and Russia. Data are stored in the form of magnetic bits and read out simultaneously as holographic images. Because the wavelengths of the spin waves are much shorter than those of light, the storage density of the memory has the potential to be much greater than systems based on optical holograms, and could someday be used to store very large amounts of information.


Conventional holography involves splitting a beam of laser light into an illumination beam and a reference beam. The illumination beam is fired at the object of interest and the deflected light is sent to a detector (or photographic film), where it is reunited with the reference beam. The detector records the interference between the two beams and this information is then used to create a 3D image of the object. As well as being used as a security feature on credit cards and banknotes, holograms also have the potential to store and retrieve large amounts of information in a very efficient way.


However, the storage densities that can be achieved using optical holograms are limited by the relatively long wavelengths of visible light – about 500 nm. Now, Alexander Khitun and colleagues at the University of California, Riverside and the Kotel'nikov Institute of Radioengineering and Electronics in Saratov, Russia, have created a holographic memory that uses spin waves, which have much shorter wavelengths.


The team's prototype device comprises two small magnets – each about 360 μm wide – that are connected by a magnetic wire. Data are stored in the device in terms of the orientations of magnetic moments of the magnets. For example, the "00" state corresponds to both magnets being oriented along the x-axis and "01" corresponds to the first magnet being oriented along the x-axis and the second along the y-axis.


Data are written to the device using spin waves with relatively large amplitudes that are capable of changing the orientation of the magnet bits. The read-out process involves sending spin waves with smaller amplitudes through the device, where the phases of the waves are affected by the orientations of the two bits. Antennas are then used to measure the interference between the waves. By varying the relative phases of the input spin waves, the team can build up a holographic image of the orientation of the two magnets. This is analogous to how an optical holographic image is built up by varying the angle between the object and the illumination beam.
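
To see how interference read-out can distinguish the stored states, here is a toy calculation: two waves pick up phases set by the bit orientations, and sweeping a probe phase yields a different amplitude profile for each state. The 0 and π/2 phase encoding and the added reference wave are illustrative assumptions, not the device's actual parameters:

    import numpy as np

    def readout_amplitude(bits, probe_phase):
        bit_phase = {"0": 0.0, "1": np.pi / 2}      # assumed phase shift per magnet orientation
        reference = 1.0                             # fixed reference wave (illustrative)
        wave_a = np.exp(1j * bit_phase[bits[0]])
        wave_b = np.exp(1j * (bit_phase[bits[1]] + probe_phase))
        return abs(reference + wave_a + wave_b)

    for state in ("00", "01", "10", "11"):
        profile = [round(readout_amplitude(state, ph), 2) for ph in np.linspace(0, np.pi, 5)]
        print(state, profile)   # each stored state yields a distinct amplitude-vs-phase profile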

