Telvent GIT, S.A., an IT solutions and information provider for a sustainable world, announced today that it is working with the Chinese cities of Nanning, Fushun and Erdos to transform how these major cities are managed. The company will implement its SmartMobility technology for intelligent urban and mobility management, enabling local authorities to make the most of their road infrastructure. These cities are expected to cut traffic delays by over 35%, and inner-city commute times are anticipated to drop by around 15%. Click the headline to learn more.
Researchers have revealed how white blood cells move to sites of infection or inflammation in the body, findings that could help in developing drug therapies for immune system disorders. Click on the image or title to learn more.
SINGAPORE - The Housing and Development Board will soon be able to make better and more informed decisions on sustainable urban planning initiatives - such as which public housing blocks should have green roofs or solar panels. A Memorandum of Understanding to test-bed a new complex systems modeling tool was signed between the HDB, Electricite de France and Veolia Environnement Recherche et Innovation yesterday at the World Cities Summit. The Complex Systems Model simulates the built environment of a city and its impact on resource use, environment, people and costs. Click on the headline to learn more...
Interesting article on social-ethnographic research and social cognition. Key quote: "it is important to pay attention to what people articulate as their own understanding of how social processes work and how they as individuals might negotiate the complex social terrain, rather than simply looking at their actions…." Worth a read.
Each year, crowd disasters happen in different areas of the world. How and why do such disasters happen?
Are the fatalities caused by relentless behavior of people, or by a psychological state of panic that makes the crowd 'go mad'? Or are they a tragic consequence of a breakdown of coordination? These and other questions are addressed based on a qualitative analysis of publicly available videos and materials documenting the planning and organization of the Love Parade in Duisburg, Germany, and the crowd disaster on July 24, 2010. The analysis reveals a number of misunderstandings that have spread widely.
Crowd Disasters as Systemic Failures: Analysis of the Love Parade Disaster. Helbing D, Mukerji P. EPJ Data Science 2012, 1:7 (25 June 2012).
Society is complicated. But this book argues that this does not place it beyond the reach of a science that can help to explain and perhaps even to predict social behaviour. As a system made up of many interacting agents – people, groups, institutions and governments, as well as physical and technological structures such as roads and computer networks – society can be regarded as a complex system. In recent years, scientists have made great progress in understanding how such complex systems operate, ranging from animal populations to earthquakes and weather. These systems show behaviours that cannot be predicted or intuited by focusing on the individual components, but which emerge spontaneously as a consequence of their interactions: they are said to be ‘self-organized’. Attempts to direct or manage such emergent properties generally reveal that ‘top-down’ approaches, which try to dictate a particular outcome, are ineffectual, and that what is needed instead is a ‘bottom-up’ approach that aims to guide self-organization towards desirable states. This book shows how some of these ideas from the science of complexity can be applied to the study and management of social phenomena, including traffic flow, economic markets, opinion formation and the growth and structure of cities.
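The book's "bottom-up" point can be made concrete with a toy model. Below is a minimal sketch using a deterministic variant of the Nagel-Schreckenberg traffic cellular automaton; this particular model and code are my illustration, not taken from the book, though traffic flow is one of its topics. Cars follow only a local rule (speed up, but never closer than the gap ahead), yet an initial jam dissolves into smooth free flow with no central controller:

```python
# Deterministic Nagel-Schreckenberg-style ring road (illustrative sketch only).

def step(positions, speeds, road_len=30, vmax=2):
    """One synchronous update: each car adjusts its speed locally to the gap ahead."""
    n = len(positions)
    order = sorted(range(n), key=lambda i: positions[i])  # ring order of the cars
    new_v = speeds[:]
    for k, i in enumerate(order):
        ahead = order[(k + 1) % n]
        gap = (positions[ahead] - positions[i] - 1) % road_len
        new_v[i] = min(speeds[i] + 1, vmax, gap)  # accelerate, but never hit the car ahead
    new_p = [(p + v) % road_len for p, v in zip(positions, new_v)]
    return new_p, new_v

# Start from a dense jam: five stopped cars bumper to bumper.
pos, vel = [0, 1, 2, 3, 4], [0, 0, 0, 0, 0]
for _ in range(50):
    pos, vel = step(pos, vel)
# No top-down controller told the cars to spread out; free flow
# (every car at vmax) emerges from purely local interactions.
```

The emergent outcome, all cars cruising at top speed, is a property of the interactions, not of any individual car's rule, which is exactly the sense of 'self-organized' the book describes.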
From ArsTechnica: Google has announced a new cloud service, Compute Engine, offering large-scale Linux virtualization on Google's infrastructure.
Google boasts that Compute Engine offers up to 50 percent more compute power per dollar than competing IaaS offerings. For applications with low bandwidth and I/O demands, Compute Engine can offer hundreds of thousands of cores; Google demonstrated a genomics application running on 600,000 cores.
(Phys.org) -- Ion channel proteins – teeny batteries in cells that are the basis for all thought and muscle contraction, among other things – also serve as important docking stations for other proteins that need help figuring out where to go, according to groundbreaking new research by a team of Colorado State University scientists. The research by Diego Krapf, an assistant professor in the Department of Electrical and Computer Engineering, Mike Tamkun, a professor in the Department of Biomedical Sciences, and Emily Deutsch appears this month in the peer-reviewed journal Molecular Biology of the Cell. Click image or title to learn more.
Google Fusion Tables is a data visualization web application that you can use to work with large data tables to extract meaningful information. Fusion Tables lets you store, share, query, and visualize data tables. It has a REST API that you can use to manage tables, and a query facility that can be used to manage data rows (insert/update/delete) and query a table for all rows that match spatial or data conditions. Google has announced the public availability of its new Fusion Tables API and at the same time has begun a six-month deprecation period for the SQL API, so developers still using the SQL API need to start moving over soon. Click headline or image to learn more...
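To give a feel for the query facility, here is a minimal sketch of the kind of SQL-style request the v1 REST API accepts. Treat this as illustrative only: the table id, API key, and the spatial filter are placeholders I made up, and the snippet only builds the request URL rather than calling the service:

```python
# Illustrative sketch of a Fusion Tables v1 query URL (no network call is made).
from urllib.parse import urlencode

def fusion_tables_query_url(sql, api_key):
    """Build a GET URL for the Fusion Tables v1 query endpoint."""
    base = "https://www.googleapis.com/fusiontables/v1/query"
    return base + "?" + urlencode({"sql": sql, "key": api_key})

# Hypothetical spatial query: rows within 5 km of a point.
url = fusion_tables_query_url(
    "SELECT * FROM 1abcDEF WHERE ST_INTERSECTS(geometry, "
    "CIRCLE(LATLNG(37.4, -122.1), 5000))",
    api_key="YOUR_API_KEY",  # placeholder, not a real key
)
print(url)
```

The same endpoint shape covers row management (INSERT/UPDATE/DELETE statements posted instead of SELECT), which is what makes migrating off the deprecated SQL API mostly a matter of changing endpoints.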
The results of a public voting process have confirmed that the Open Geospatial Consortium (OGC) has decided to adopt WaterML 2.0 as an official OGC standard for hydrological time series encoding. This helps pave the way for better integrated geo-hydro simulation models and open exchange of different kinds of hydro-meteorological observations and measurements. Very cool. Learn more by clicking on the title.
UAB biologist Robert Thacker, Ph.D., is part of an interdisciplinary team developing a digital Tree of Life that will enable scientists to see the relationships between organisms at a glance.
The project, Arbor: Comparative Analysis Workflows for the Tree of Life, is part of the National Science Foundation’s $13 million Assembling, Visualizing and Analyzing the Tree of Life program, AVAToL. The Arbor team will receive $4 million; $500,000 is dedicated to the work of Thacker, a professor in the UAB Department of Biology, whose focus is symbiosis research in marine invertebrates and sponges. “The idea is that we will be able to accelerate science quickly and easily by sharing data and the way it is analyzed. And that will encourage scientists to try new ideas, find new answers and enhance the quality of research in multiple fields of science,” said Thacker.
IV International Conference on Science Matters: June 25-27, 2013, Porto, Portugal.
The Science Matters (SciMat) series of conferences and publications aims to bring the natural and social sciences and humanities under a unified paradigm in which all are studied scientifically from the perspective of complex systems. For the SciMat program, see the webpage: www.sjsu.edu/people/lui.lam/scimat/
Jakob Engblom at Wind River's blog talks about their new synthetic, simulation-only Simics target machine called QSP (Quick-Start Platform), which helps hardware and system designers create a piece of virtual-only hardware that is as simple as possible while still running real operating systems much as they run on ordinary hardware. This enables faster real-world prototyping for new devices. Click on image or title to learn more.
Wireless Sensor Networks (WSNs) are composed of tiny devices (known as "motes") that sense the environment and communicate via radio. The Synchronous Blog has a great demo using a fixed ring topology with three motes placed side by side within each other's radio ranges. The motes follow the same behavior: receive a message with an integer counter, show it on the LEDs, wait for 1 second, increment the counter, and forward it to the mote on its right. The demo is less than 70 lines of code and shows how to use parallel compositions and communication via internal events to achieve the intended behavior.
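The motes' behavior is easy to mimic in plain Python. A minimal sketch, with the ring simulated in-process: no radios or LEDs (a list of "displayed" values stands in for the LEDs), the 1-second delay omitted so it runs instantly, and `run_ring` a name of my own, not from the demo code:

```python
# Simulated three-mote ring: receive counter, "display" it, increment, forward right.

def run_ring(n_motes=3, hops=9, start=0):
    """Pass an integer counter around a ring of motes.

    Each mote receives the counter, records it (stand-in for showing it
    on the LEDs), increments it, and forwards it to its right neighbour.
    Returns the list of values each mote displayed.
    """
    displays = [[] for _ in range(n_motes)]
    counter = start
    mote = 0
    for _ in range(hops):
        displays[mote].append(counter)  # "show it on the LEDs"
        counter += 1                    # increment before forwarding
        mote = (mote + 1) % n_motes     # forward to the mote on the right
    return displays

print(run_ring())  # each mote sees every third value of the counter
```

Running it shows mote 0 displaying 0, 3, 6, mote 1 displaying 1, 4, 7, and so on: the counter marches around the ring exactly as in the demo, just without the real-time pacing.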
This European Public Sector Information Platform topic report focuses on how improved access to and analysis of data can increase transparency and accountability, and improve our understanding of how aid can be made more effective. Link to PDF file on web page. Click the headline to learn more...
A famous paper by Dr. Richard Cook of the University of Chicago, "How Complex Systems Fail", offers a cross-discipline perspective on complex systems. Among other things, Dr. Cook emphasizes the following:
More robust system performance requires appreciation and experience with failure.
Complex systems run as broken systems (with many latent failures within, but functional because they contain redundancies).
Failures in complex systems require the combination of multiple factors (thus, looking for a single 'root cause' is fundamentally wrong).
Change/interventions introduce complexity and new forms of failure. When new technologies are used to eliminate well-understood system failures or to gain high-precision performance, they often introduce new pathways to large-scale, catastrophic failures, which can potentially have even greater impact than those eliminated by the new system.
Unfortunately you will need an account to access this. Excellent paper from Arizona State and the Center of Complexity Research at Beijing Normal University. The research provides a general method to detect hidden nodes in complex networks, using only time series from nodes that are accessible to external observation. This is important for many research and analysis areas that look at complex systems through time-series data and try to identify hidden actors, processes, or patterns in black-boxed systems or subsystems with network connections. The method is based on compressive sensing and encompasses continuous-time, discrete-time, and evolutionary-game types of dynamical systems. The technique is demonstrated on a social network. Worth taking a look at. To learn more, click on the title of the article.
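To give a flavor of the compressive-sensing machinery, here is a minimal sketch of generic sparse recovery via iterative soft-thresholding (ISTA). This is emphatically not the authors' hidden-node algorithm; it only illustrates the core idea their method builds on, that a sparse vector (e.g. a node's few nonzero network couplings) can be recovered from fewer measurements than unknowns by minimizing a squared error plus an L1 penalty:

```python
# Toy compressive sensing: recover a sparse vector from underdetermined
# linear measurements via ISTA (pure-Python, illustrative only).
import random

def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def ista(A, b, lam=0.01, iters=20000):
    """Minimise 0.5*||Ax - b||^2 + lam*||x||_1 by iterative soft-thresholding."""
    m, n = len(A), len(A[0])
    step = 1.0 / sum(a * a for row in A for a in row)  # 1/||A||_F^2, a safe step size
    x = [0.0] * n
    for _ in range(iters):
        r = [ri - bi for ri, bi in zip(matvec(A, x), b)]              # residual Ax - b
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]  # gradient A^T r
        z = [xi - step * gi for xi, gi in zip(x, g)]                   # gradient step
        t = step * lam
        x = [max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0) for v in z]  # shrink
    return x

random.seed(0)
m, n = 5, 8                                # fewer measurements than unknowns
A = [[random.gauss(0, 1) for _ in range(n)] for _ in range(m)]
x_true = [0, 0, 3.0, 0, 0, -2.0, 0, 0]     # sparse "hidden" couplings
b = matvec(A, x_true)                      # what an observer could measure
x_hat = ista(A, b)                         # sparse estimate from only 5 equations
```

The L1 penalty is what makes the underdetermined problem well-posed; in the paper's setting the measurement matrix comes from observed time series and the sparse unknown encodes who is coupled to whom, with hidden nodes betraying themselves through systematic anomalies in the reconstruction.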
If you are looking at hosting costs for cloud computing, it quickly gets really complicated to make comparisons. Great article on thenextweb.com compares Google's Compute Engine with Microsoft Azure. If you need the details, read the article. Takeaway: for Linux hosting, compute cycles are similarly priced; Google is slightly cheaper and offers a lot more RAM than Microsoft's machines. So if you are doing memory-intensive compute and need cloud, Google Compute Engine is the way to go for now. Click on the title for more info.
My second job after leaving college was working in a robotics research company that built autonomous robot research platforms. So when Robot-Times scooped news on Freescale Semiconductor's new modular robotics development system, and an entry-level robot for US$200, it certainly caught my eye. There are a lot of goodies in one package, and the sensor modules look like a good place to start sensor experiments. All in all, a nice little bot kit. Click on headline or image to learn more.
A good review of the Open House: Smarter cities, smarter thinking meeting held on 28 June 2012.
This year’s conference centred on the cohesive development of our cities. Delegates from BDP, Hawkins Brown, Nicholas Hare Architects, Pick Everard and Foster + Partners were among those present at the second day of the Open House Worldwide Conference 2012 hosted by CBRE in London. Worth reading. Click on image or headline to learn more.
Jenna Burrell is a sociologist and assistant professor in the School of Information at UC-Berkeley. If you are interested in the intersection of socioeconomics, ethnography, big data and good research, her blog is a must read. Also, her latest article "Technology Hype vs. Enduring Uses: A longitudinal study of Internet use among early adopters in an African city" is out this month in the wonderful open access journal First Monday. Click on the image or headline to learn more.
Second part of an interesting interview with David Saul, Chief Scientist at State Street Corp. He talks about the challenges and tradeoffs of getting a semantic database off the ground, and how semantic technology could produce really big benefits for many companies, thanks to his and others' efforts now under way to develop standards. Given the role of semantics in large scale data analysis - certainly worth a read. Learn more by clicking on the image or the headline.
Event: Quite some time ago I helped build real-time 3D simulators for nuclear submarines and power plant control rooms. More recently, our last generation of 3D software was used by a number of groups for similar activities, so the European Nuclear Power Plant Simulation Forum on the 3rd and 4th of October 2012 in Barcelona, Spain caught my eye. It is the only event of its type in Europe and will bring together those who are involved with simulators, training personnel, training managers, operation support managers, and anyone with an interest in nuclear power plant simulators or who develops or uses simulation tools for nuclear power plants. If you are connected with the industry, it might be worth attending.
Both the number of viruses in the initial flu infection and the virus type affect the patient's outcome. Mice infected with high concentrations developed immunity and generated immune cells in the lungs to fight other strains. Mice that were infected with a relatively low concentration of the virus developed weaker immunity against the strain that infected them, did not build up this crucial population of immune cells in the lungs, and showed only delayed immunity toward other flu strains. This discovery could pave the way for new prophylactic strategies to fight flu infections and provides a novel basis for vaccine design. Learn more by clicking on the image or headline.