meta 1 Dev Kit - the first device allowing visualization of and interaction with 3D virtual objects in the real world using your hands.
meta presents the world’s first developer kit and platform for augmented reality; users will have direct gestural control of 3D virtual objects attached to their real environment. A game-changing two-part wearable computer allows users to play with virtual objects in 3D space using nature’s perfect controllers - their hands. This truly unique product has to be worn to be believed, so meta put the device on the heads of an Emmy® award-winning team and a number of top-notch UI engineers, and they produced a series of promotional materials, the first of which is featured on http://www.meta-view.com.
We were inspired by the interfaces in films like Iron Man, Avatar and Minority Report and wanted to make them a reality. The meta 1 Developers Kit has the power to finally deliver a natural interface between the virtual world and reality.
We are integrating customized hardware components and building a robust SDK (software development kit). meta 1 is the most advanced and affordable interface for augmented reality; we want every developer to have the opportunity to create the apps of the future.
Wildfire management in the United States and elsewhere is challenged by substantial uncertainty regarding the location and timing of fire events, the socioeconomic and ecological consequences of these events, and the costs of suppression. Escalating U.S. Forest Service suppression expenditures are of particular concern at a time of fiscal austerity, as swelling fire management budgets lead to decreases for non-fire programs, and as the likelihood of disruptive within-season borrowing potentially increases. Thus, there is a strong interest in better understanding the factors influencing suppression decisions and, in turn, their influence on suppression costs. As a step in that direction, this paper presents a probabilistic analysis of geographic and temporal variation in incident management team response to wildfires. The specific focus is incident complexity dynamics through time for fires managed by the U.S. Forest Service. The modeling framework is based on the recognition that large wildfire management entails recurrent decisions across time in response to changing conditions, which can be represented as a stochastic dynamic system. Daily incident complexity dynamics are modeled according to a first-order Markov chain, with containment represented as an absorbing state. A statistically significant difference in complexity dynamics between Forest Service Regions is demonstrated. Incident complexity probability transition matrices and expected times until containment are presented at national and regional levels. Results of this analysis can help improve understanding of geographic variation in incident management and associated cost structures, and can be incorporated into future analyses examining the economic efficiency of wildfire management.
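The absorbing-chain machinery behind "expected times until containment" can be sketched in a few lines. The transition matrix below is entirely hypothetical (three illustrative complexity levels plus containment; the probabilities are not the paper's estimates) - the point is the standard fundamental-matrix identity N = (I - Q)^-1, whose row sums give expected days to absorption.

```python
import numpy as np

# Hypothetical daily transition matrix; probabilities are illustrative only.
# States: 0 = low complexity, 1 = medium, 2 = high, 3 = contained.
P = np.array([
    [0.70, 0.10, 0.02, 0.18],
    [0.15, 0.65, 0.10, 0.10],
    [0.02, 0.20, 0.73, 0.05],
    [0.00, 0.00, 0.00, 1.00],  # containment is absorbing
])

# Q: transitions among the transient (not-yet-contained) states.
Q = P[:3, :3]

# Fundamental matrix N = (I - Q)^{-1}; row sums give the expected number
# of days until containment from each starting complexity level.
N = np.linalg.inv(np.eye(3) - Q)
expected_days = N @ np.ones(3)
print(expected_days)
```

Regional comparisons of the kind the paper reports would amount to estimating a separate P per region and comparing the resulting expected-days vectors.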
Information overload is a serious problem in modern society, and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while overlooking the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. Given this “less can be more” feature, we design algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From a practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both its effectiveness and efficiency.
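A toy sketch of the backbone-extraction idea follows. The edge list, the retention fraction, and both removal rules are invented for illustration (the paper's actual time-aware and topology-aware criteria are not specified here); the sketch only shows the general shape of pruning a user-object bipartite edge list in two passes.

```python
from collections import Counter

# Toy user-object edge list with timestamps (user, object, time).
edges = [
    ("u1", "o1", 1), ("u1", "o2", 5), ("u2", "o1", 2),
    ("u2", "o3", 6), ("u3", "o2", 3), ("u3", "o3", 7),
    ("u3", "o4", 8), ("u4", "o4", 4),
]

def time_aware_backbone(edges, keep_fraction=0.5):
    """Keep only the most recent links (one simple time-aware rule)."""
    recent_first = sorted(edges, key=lambda e: e[2], reverse=True)
    return recent_first[: max(1, int(len(recent_first) * keep_fraction))]

def topology_aware_backbone(edges, max_user_degree=2):
    """Drop links from users whose degree exceeds a cap
    (one simple topology-aware rule)."""
    degree = Counter(user for user, _, _ in edges)
    return [e for e in edges if degree[e[0]] <= max_user_degree]

# Hybrid pruning: time-aware pass first, then topology-aware pass.
backbone = topology_aware_backbone(time_aware_backbone(edges))
print(backbone)
```

Any downstream recommender (e.g. mass diffusion on the bipartite graph) would then be run on `backbone` instead of the full edge list, which is where the speed gain comes from.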
Persistence is a prime example of phenotypic heterogeneity, where a microbial population splits into two distinct subpopulations with different growth and survival properties as a result of reversible phenotype switching. Specifically, persister cells grow more slowly than normal cells under unstressed growth conditions, but survive longer under stress conditions such as treatment with bactericidal antibiotics. We analyze the dynamics of such a population for several typical experimental scenarios, namely a constant environment, shifts between growth and stress conditions, and periodically switching environments. We use an approximation scheme that allows us to map the dynamics to a logistic equation for the subpopulation ratio and derive explicit analytical expressions for observable quantities that can be used to extract the underlying dynamic parameters from experimental data. Our results provide a theoretical underpinning for the study of phenotypic switching, in particular for organisms where detailed mechanistic knowledge is scarce.
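The logistic mapping of the subpopulation ratio can be checked on a minimal textbook two-state model (all parameters below are illustrative, not fitted values, and this is not necessarily the paper's exact scheme): normal cells n grow fast, persisters p grow slowly, and reversible switching couples them. The ratio x = p/n then obeys a logistic-type (Riccati) equation with a stable fixed point.

```python
import numpy as np

# Illustrative rates for a constant, unstressed environment.
mu_n, mu_p = 1.0, 0.1   # growth rates of normal and persister cells
a, b = 0.01, 0.1        # switching rates n -> p and p -> n
dt, steps = 0.001, 100_000

n, p = 1.0, 1.0
for _ in range(steps):
    # dn/dt = mu_n*n - a*n + b*p ;  dp/dt = mu_p*p + a*n - b*p
    n, p = (n + (mu_n * n - a * n + b * p) * dt,
            p + (mu_p * p + a * n - b * p) * dt)

x = p / n  # subpopulation ratio settles to a fixed point

# The ratio obeys dx/dt = a + (mu_p - mu_n + a - b)*x - b*x**2;
# its stable root is the analytical steady-state ratio.
c1 = mu_p - mu_n + a - b
x_star = (c1 + np.sqrt(c1**2 + 4 * a * b)) / (2 * b)
print(x, x_star)
```

With these rates the persister fraction is small (around 1%), as typically observed; matching the simulated x against the analytical x_star is the kind of consistency check the derived expressions enable.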
Recently, a combination of non-invasive neuroimaging techniques and graph theoretical approaches has provided a unique opportunity for understanding the patterns of the structural and functional connectivity of the human brain (referred to as the human brain connectome). A very large amount of brain imaging data has now been collected, and high-resolution connectome research places very high demands on computational capability. In this paper, we propose a hybrid CPU-GPU framework to accelerate the computation of the human brain connectome. We applied this framework to a publicly available resting-state functional MRI dataset from 197 participants. For each subject, we first computed Pearson’s correlation coefficient between all pairs of the time series of gray-matter voxels, and then we constructed unweighted undirected brain networks with about 58,000 nodes and a sparsity range from 0.02% to 0.17%. Next, graph properties of the functional brain networks were quantified, analyzed and compared with those of 15 corresponding random networks. With our proposed accelerating framework, the above process took 80–150 minutes per network, depending on the network sparsity. Further analyses revealed that high-resolution functional brain networks have efficient small-world properties, significant modular structure, a power law degree distribution and highly connected nodes in the medial frontal and parietal cortical regions. These results are largely compatible with previous human brain network studies. Taken together, our proposed framework can substantially enhance the applicability and efficacy of high-resolution (voxel-based) brain network analysis, and has the potential to accelerate the mapping of the human brain connectome in normal and disease states.
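The per-subject pipeline (correlate every pair of time series, then threshold to a fixed sparsity) can be sketched on toy data. The sizes below are tiny stand-ins (100 "voxels" instead of ~58,000), and the thresholding-by-sparsity rule is a common convention assumed here rather than quoted from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for voxel time series: 100 "voxels" x 200 time points.
ts = rng.standard_normal((100, 200))

# Pearson correlation between every pair of time series.
corr = np.corrcoef(ts)
np.fill_diagonal(corr, 0.0)

# Threshold to a fixed sparsity: keep the strongest fraction of possible
# edges, yielding an unweighted, undirected adjacency matrix.
sparsity = 0.05
n = corr.shape[0]
n_edges = int(sparsity * n * (n - 1) / 2)
triu = corr[np.triu_indices(n, k=1)]
threshold = np.sort(np.abs(triu))[-n_edges]
adj = (np.abs(corr) >= threshold).astype(int)

degree = adj.sum(axis=0)  # first graph property: node degrees
print(n_edges, degree.mean())
```

At full resolution the correlation step is the O(n^2) bottleneck that motivates the GPU offload; the graph metrics (small-worldness, modularity, degree distribution) are computed on `adj` afterwards.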
In experimental and theoretical neuroscience, synaptic plasticity has long dominated the study of neural plasticity. Recently, neuronal intrinsic plasticity (IP) has become a hot topic in this area. IP is sometimes thought to be an information-maximization mechanism. However, it is still unclear how IP affects the performance of artificial neural networks in supervised learning applications. From an information-theoretical perspective, the minimum error entropy (MEE) algorithm has recently been proposed as an efficient training method. In this study, we propose a synergistic learning algorithm combining the MEE algorithm as the synaptic plasticity rule and an information-maximization algorithm as the intrinsic plasticity rule. We consider both feedforward and recurrent neural networks and study the interactions between intrinsic and synaptic plasticity. Simulations indicate that the intrinsic plasticity rule can improve the performance of artificial neural networks trained by the MEE algorithm.
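To make the MEE half of the scheme concrete, here is a minimal sketch of minimum-error-entropy training for a plain linear model (the IP rule is omitted, the kernel width and learning rate are illustrative choices, and the paper's networks are of course nonlinear). MEE maximizes the Parzen-window "information potential" V(e) = mean over pairs of G(e_i - e_j), which is equivalent to minimizing an estimate of Renyi's quadratic error entropy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data from a known linear map plus noise.
X = rng.standard_normal((200, 3))
w_true = np.array([1.5, -2.0, 0.5])
t = X @ w_true + 0.05 * rng.standard_normal(200)

h, lr = 1.0, 0.5        # Gaussian kernel width and learning rate (illustrative)
w = np.zeros(3)
for _ in range(500):
    e = t - X @ w
    d = e[:, None] - e[None, :]              # pairwise error differences
    G = np.exp(-d**2 / (2 * h**2))           # Gaussian kernel on differences
    # Gradient ascent on V = mean(G): dV/dw involves (d/h^2)*G*(x_i - x_j).
    coef = (d / h**2) * G
    grad = (coef[:, :, None] * (X[:, None, :] - X[None, :, :])).sum(axis=(0, 1))
    w += lr * grad / len(t)**2
print(w)
```

Note that MEE is invariant to a constant shift of the errors, so in practice a bias term is fixed afterwards (e.g. by subtracting the mean error); the toy data here has no offset, so this step is skipped.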
Here’s a walkthrough of just a handful of the interesting stats we’ve found when benchmarking the top 100 grossing e-commerce websites’ checkout processes:
The average checkout process consists of 5.08 steps.
24% require account registration.
81% think their newsletter is a must-have (opt-out or worse).
41% use address validators.
50% ask for the same information twice.
The average top-100 checkout violates 33% of the checkout usability guidelines.
In this article I’ll go over each of these numbers and explain exactly what’s behind them, showing you some real-life examples of do’s and don’ts when it comes to checkout processes.
Simulating Social Complexity examines all aspects of using agent- or individual-based simulation. This approach represents systems as individual elements, each having their own set of differing states and internal processes. The interactions between elements in the simulation represent interactions in the target systems. What makes these elements "social" is that they are usefully interpretable as interacting elements of an observed society. The focus here is on human society, but it can be extended to include social animals or artificial agents where such work enhances our understanding of human society.
Edmonds, B. & Meyer, R. (eds.) (2013) Simulating Social Complexity - a handbook. Springer.
For a person – start to feel like part of a greater whole, a real-time sensory/motor cell of the entire organism of humanity, with your every search query, email, chat message or mouse click making it a bit more clever and strong, and your every action to some extent inspired by it.
For software developers – get ready for the emerging market of intelligent-agent software (with first lonely players like Siri, Google Now and Sherpa), either keeping in mind the business model of a «pilot fish» operating in biocenosis with one of the «Big Sharks», or having a good exit strategy for the case when your functionality gets in the way of one of the major players (as happened to Yandex Wonder).
For business – for competitive business promotion, understand how to craft «double-sided» web pages that look attractive to fellow humans on one side and are rich in true semantic markup on the other. That kind of markup, invisible to the human eye (see http://schema.org/ for more details), is to be indexed by the «semantic crawler» at Google, collecting thought-food for its Knowledge Graph - so that your site can be returned to the user as the single right answer to a query, instead of sitting in the tenth row of the second page of search results.
For government – be aware that the ability to enable national projects of intellectual globalization might become a key to national security in the very near future. That does not necessarily mean governmental funding of specific developments; as we have seen, a couple of business enterprises have managed to capture a third of the world in a few years, so the most efficient option would be the creation of an appropriate business environment for high-technology and information-technology companies within national borders.
For humanity – get ready to pass through the next pivotal point of development (since the invention of computers and the internet), with all the coming surprises, frustrations and openings of new opportunities.
For evolution – prepare to record the forthcoming meta-system transition (after the assembly of atoms into molecules, molecules into cells, cells into organisms and neurons into brains) in the Universe's diary.
Recent advancements in technology have made it routine to obtain and compare gene orders within genomes. Rearrangements of gene orders by operations such as reversal and transposition are rare events that enable researchers to reconstruct deep evolutionary histories. An important application of genome rearrangement analysis is to infer gene orders of ancestral genomes, which is valuable for identifying patterns of evolution and for modeling the evolutionary processes. Among the various available methods, parsimony-based methods (including GRAPPA and MGR) are the most widely used. Since the core algorithms of these methods are solvers for the so-called median problem, providing efficient and accurate median solvers has attracted much attention in this field. The “double-cut-and-join” (DCJ) model uses the single DCJ operation to account for all genome rearrangement events. Because it is mathematically much simpler than handling events directly, parsimony methods using DCJ median solvers have better speed and accuracy. However, the DCJ median problem is NP-hard, and although several exact algorithms are available, they all have great difficulty when the given genomes are distant. In this paper, we present a new algorithm that combines a genetic algorithm (GA) with genomic sorting to produce a new method which can solve the DCJ median problem in limited time and space, especially on large and distant datasets. Our experimental results show that this new GA-based method can find optimal or near-optimal results for problems ranging from easy to very difficult. Compared to existing parsimony methods, which may severely underestimate the true number of evolutionary events, the sorting-based approach can infer ancestral genomes which are much closer to their true ancestors. The code is available at http://phylo.cse.sc.edu.
While the prevalence of density-dependence is well established in population ecology, few field studies have investigated its underlying mechanisms and their relative population-level importance. Here, we address these issues, and more specifically, how differences in body size influence population regulation. For this purpose, two experiments were performed in a small coastal stream on the Swedish west coast, using juvenile brown trout (Salmo trutta) as a study species. We manipulated densities of large and small individuals, and observed effects on survival, migration, condition and individual growth rate in a target group of intermediate-sized individuals. The generality of the response was investigated by reducing population densities below and increasing them above natural levels (removing and adding large and small individuals). Reducing the density (relaxing the intensity of competition) had no influence on the response variables, suggesting that stream productivity was not a limiting factor at natural population density. Addition of large individuals resulted in a negative density-dependent response, while no effect was detected when adding small individuals or when maintaining the natural population structure. We found that the density-dependent response appeared as reduced growth rate rather than increased mortality or movement, an effect that may arise from exclusion to suboptimal habitats or increased stress levels among inferior individuals. Our findings confirm the notion of interference competition as the primary mode of competition in juvenile salmonids, and also show that the feedback mechanisms of density-dependence act primarily when densities are increased above natural levels.
The Earth, with its core-driven magnetic field, convective mantle, mobile lid tectonics, oceans of liquid water, dynamic climate and abundant life, is arguably the most complex system in the known universe. This system has exhibited stability in the sense that, bar a number of notable exceptions, surface temperature has remained within the bounds required for liquid water and so a significant biosphere. Explanations for this range from anthropic principles, in which the Earth was essentially lucky, to homeostatic Gaia, in which the abiotic and biotic components of the Earth system self-organise into homeostatic states that are robust to a wide range of external perturbations. Here we present results from a conceptual model that demonstrates the emergence of homeostasis as a consequence of the feedback loop operating between life and its environment. Formulating the model in terms of Gaussian processes allows the development of novel computational methods in order to provide solutions. We find that the stability of this system will typically increase and then remain constant with an increase in biological diversity, and that the number of attractors within the phase space increases exponentially with the number of environmental variables, while the probability of the system being in an attractor that lies within prescribed boundaries decreases approximately linearly. We argue that the cybernetic concept of rein control provides insights into how this model system, and potentially any system that is comprised of biological-environmental feedback loops, self-organises into homeostatic states.
The philosopher's critique of evolution wasn't shocking. So why have his colleagues raked him over the coals?
Thomas Nagel is a leading figure in philosophy, now enjoying the title of university professor at New York University, a testament to the scope and influence of his work. His 1974 essay "What Is It Like to Be a Bat?" has been read by legions of undergraduates, with its argument that the inner experience of a brain is truly knowable only to that brain. Since then he has published 11 books, on philosophy of mind, ethics, and epistemology.
But Nagel's academic golden years are less peaceful than he might have wished. His latest book, Mind and Cosmos (Oxford University Press, 2012), has been greeted by a storm of rebuttals, ripostes, and pure snark. "The shoddy reasoning of a once-great thinker," Steven Pinker tweeted. The Weekly Standard quoted the philosopher Daniel Dennett calling Nagel a member of a "retrograde gang" whose work "isn't worth anything—it's cute and it's clever and it's not worth a damn."
Pedestrian movements are the consequence of several complex and stochastic factors. Modelling pedestrian movements and being able to predict travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and level of service into models of pedestrian movement. In this paper, a queuing-based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service it is possible to provide to pedestrians. It also has the ability to provide clear suggestions on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility enables comfortable and convenient pedestrian movement. A sensitivity analysis is also performed to see the impact of some crucial parameters of the developed model on the performance of pedestrian facilities.
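The flavour of a queuing-based travel-time prediction can be shown with the simplest possible sketch: free-flow walking time plus an M/M/1 waiting time at a single bottleneck. All numbers and the single-server assumption are illustrative; the paper's calibrated model is considerably richer (body dimensions, level of service, facility layout).

```python
# Illustrative facility: a 50 m walkway ending in one bottleneck (e.g. a gate).
length_m = 50.0        # walkway length
walk_speed = 1.3       # m/s, a typical free-flow pedestrian speed
lam = 0.8              # arrival rate at the bottleneck (pedestrians per second)
mu = 1.0               # bottleneck service rate (pedestrians per second)

free_flow_time = length_m / walk_speed      # seconds of unimpeded walking
rho = lam / mu                              # utilisation; must stay below 1
wait_in_system = 1.0 / (mu - lam)           # M/M/1 mean time in system, W = 1/(mu - lam)
travel_time = free_flow_time + wait_in_system
print(travel_time)
```

Sensitivity analysis of the kind the paper performs corresponds to sweeping lam and mu here: as rho approaches 1 the delay term dominates, which is what ties serving rate to level of service.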
Listening to and understanding people in a “cocktail-party situation” is a remarkable feature of the human auditory system. Here we investigated, in healthy subjects, the neural correlates of the ability to localize a particular sound among others in an acoustically cluttered environment. In a sound localization task, five different natural sounds were presented from five virtual spatial locations during functional magnetic resonance imaging (fMRI). Activity related to auditory stream segregation was revealed in the posterior superior temporal gyrus bilaterally, anterior insula, supplementary motor area, and a frontoparietal network. Moreover, the results indicated critical roles of the left planum temporale in extracting the sound of interest among acoustical distracters, and of the precuneus in orienting spatial attention to the target sound. We hypothesized that the left-sided lateralization of the planum temporale activation is related to the higher specialization of the left hemisphere for analysis of spectrotemporal sound features. Furthermore, the precuneus - a brain area known to be involved in the computation of spatial coordinates across diverse frames of reference for reaching to objects - also seems to be a crucial area for accurately determining the locations of auditory targets in an acoustically complex scene of multiple sound sources. The precuneus thus may not only be involved in visuo-motor processes, but may also subserve related functions in the auditory modality.
The Psychology Experiment Building Language (PEBL; http://pebl.sourceforge.net/) Berg Card Sorting Test is an open-source neurobehavioral test. Participants (N = 207, ages 6 to 74) completed the Berg Card Sorting Test. Performance on the first 64 trials was isolated and compared to that on the full-length (128-trial) test. Strong correlations between the short and long forms (total errors: r = .87; perseverative responses: r = .83; perseverative errors: r = .77; categories completed: r = .86) support the Berg Card Sorting Test-64 as an abbreviated alternative to the full-length executive function test.
Shyness and social anxiety are correlated to some extent, and both are associated with hyper-responsivity to social stimuli in the frontal cortex and limbic system. However, to date no studies have investigated whether common structural and functional connectivity differences in the brain may contribute to these traits. We addressed this issue in a cohort of 61 healthy adult subjects. Subjects were first assessed for their levels of shyness (Cheek and Buss Shyness scale), social anxiety (Liebowitz Social Anxiety scale) and trait anxiety. They were then given MRI scans, and voxel-based morphometry and seed-based resting-state functional connectivity analyses were used to investigate correlations with shyness and anxiety scores. Shyness scores were positively correlated with gray matter density in the cerebellum, bilateral superior temporal gyri and parahippocampal gyri, and right insula. Functional connectivity correlations with shyness were found between the superior temporal gyrus, parahippocampal gyrus and the frontal gyri; between the insula and the precentral gyrus and inferior parietal lobule; and between the cerebellum and precuneus. Additional correlations were found for amygdala connectivity with the medial frontal gyrus, superior frontal gyrus and inferior parietal lobule, despite the absence of any structural correlation. By contrast, no structural or functional connectivity measures correlated with social or trait anxiety. Our findings show that shyness is specifically associated with structural and functional connectivity changes in cortical and limbic regions involved in processing social stimuli. These associations are not found with social or trait anxiety in healthy subjects, despite some behavioral correlations with shyness.
Many ants rely on both visual cues and self-generated chemical signals for navigation, but their relative importance varies across species and context. We evaluated the roles of both modalities during colony emigration by Temnothorax rugatulus. Colonies were induced to move from an old nest in the center of an arena to a new nest at the arena edge. In the midst of the emigration, the arena floor was rotated 60° around the old nest entrance, thus displacing any substrate-bound odor cues while leaving visual cues unchanged. This manipulation had no effect on orientation, suggesting little influence of substrate cues on navigation. When this rotation was accompanied by the blocking of most visual cues, the ants became highly disoriented, suggesting that they did not fall back on substrate cues even when deprived of visual information. Finally, when the substrate was left in place but the visual surround was rotated, the ants' subsequent headings were strongly rotated in the same direction, showing a clear role for visual navigation. Combined with earlier studies, these results suggest that chemical signals deposited by Temnothorax ants serve more for marking familiar territory than for orientation. The ants instead navigate visually, showing the importance of this modality even for species with small eyes and coarse visual acuity.
Brains, it has recently been argued, are essentially prediction machines. They are bundles of cells that support perception and action by constantly attempting to match incoming sensory inputs with top-down expectations or predictions. This is achieved using a hierarchical generative model that aims to minimize prediction error within a bidirectional cascade of cortical processing. Such accounts offer a unifying model of perception and action, illuminate the functional role of attention, and may neatly capture the special contribution of cortical processing to adaptive success. This target article critically examines this “hierarchical prediction machine” approach, concluding that it offers the best clue yet to the shape of a unified science of mind and action.
Whatever next? Predictive brains, situated agents, and the future of cognitive science - Andy Clark
Behavioral and Brain Sciences / Volume 36 / Issue 03 / June 2013, pp 181-204
If a potential customer enjoys your advert, they are more likely to buy your product. It's a simple enough concept, but it is extremely difficult to know how well your advert is being received in the real world. Now a new system could help advertisers know exactly how their latest offering is going down with viewers, just by watching their faces.
The system, developed by Daniel McDuff and colleagues at the Massachusetts Institute of Technology Media Lab, looks at how muscles in the face move in response to watching a video. Software can then classify what counts as positive facial responses and smiles during the video and from that predict which adverts the viewer most enjoys.
The team collected more than 3200 videos of people whose faces were filmed by their own computers' webcams as they watched three adverts online during the 2011 Super Bowl. After each commercial – one for Doritos, one for Google and one for Volkswagen – the viewers were asked if they liked the video and whether they would want to watch it again. They had three choices of response: "Heck ya! I loved it!", "Meh! It was ok" and "Na… not my thing".
Transcranial magnetic stimulation alters the activity of the brain without the need for an invasive physical procedure. But for such a ground-breaking and potentially alarming technique, it is not very well known.
If you were to tell people that the technology exists to manipulate the workings of people's brains, they may not believe you. That sort of thing is the stuff of cheap sci-fi B movies. If someone in the real world were to try to develop it, that's exactly the sort of scenario where they'd send James Bond in to stop them before it got too far.
But the fact is that this technology genuinely exists and is widely used in neuroscientific research. It is known as transcranial magnetic stimulation, or TMS, and as the name suggests it stimulates the brain through the cranium using magnetism.
Magnets and the brain already have a long working relationship. Neuroscience is an increasingly media-friendly area of science, and this is due in part to the increasing use of magnetic resonance imaging (MRI), an invaluable but complex technique that uses intense magnetic fields and radio waves to produce eye-catching images of a working body and brain.
For a given multi-agent system where the local interaction rule of the existing agents cannot be redesigned, one way to intervene in the collective behavior of the system is to add one or a few special agents to the group, which are still treated as normal agents by the existing ones. We study how to lead a Vicsek-like flocking model to synchronization by adding special agents. A popular method is to add some simple leaders (fixed-heading agents). Instead, we add one intelligent agent, called a ‘shill’, which uses online feedback information about the group to decide the shill's moving direction at each step. A novel strategy for the shill to coordinate the group is proposed. It is strictly proved that a shill with this strategy and a limited speed can synchronize every agent in the group. Computer simulations show the effectiveness of this strategy in different scenarios, including different group sizes and shill speeds, with or without noise. Compared to the method of adding fixed-heading leaders, our method can guarantee synchronization for any initial configuration in the deterministic scenario and significantly improve the synchronization level in low-density groups or in models with noise. This suggests the advantage and power of feedback information in the intervention of collective behavior.
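A minimal simulation conveys the idea. Everything here is illustrative: the parameters are arbitrary, and the shill uses a naive feedback strategy (hold the target heading, steer its position toward the currently most misaligned agent) rather than the provably convergent strategy of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

N, L, r = 30, 5.0, 1.0        # agents, periodic box size, interaction radius
v, v_shill = 0.03, 0.06       # agent and shill speeds (shill is faster)
target = 0.0                  # desired common heading (radians)

pos = rng.uniform(0, L, (N, 2))
theta = rng.uniform(-np.pi, np.pi, N)
shill_pos = np.array([L / 2, L / 2])

def torus_d2(a, b):
    """Squared distance on the periodic box."""
    d = np.abs(a - b)
    d = np.minimum(d, L - d)
    return (d**2).sum(axis=-1)

align0 = np.cos(theta - target).mean()
for _ in range(3000):
    new_theta = np.empty(N)
    for i in range(N):
        nbr = torus_d2(pos, pos[i]) < r**2          # neighbours (incl. self)
        vx = np.cos(theta[nbr]).sum()
        vy = np.sin(theta[nbr]).sum()
        if torus_d2(shill_pos, pos[i]) < r**2:      # shill counted as a peer
            vx += np.cos(target)
            vy += np.sin(target)
        new_theta[i] = np.arctan2(vy, vx)           # Vicsek heading update
    theta = new_theta
    pos = (pos + v * np.c_[np.cos(theta), np.sin(theta)]) % L
    # Feedback: shill chases the currently most misaligned agent.
    worst = np.argmin(np.cos(theta - target))
    step = pos[worst] - shill_pos
    step -= L * np.round(step / L)                  # shortest torus direction
    norm = np.linalg.norm(step)
    if norm > 1e-9:
        shill_pos = (shill_pos + v_shill * step / norm) % L

align1 = np.cos(theta - target).mean()
print(align0, align1)
```

The mean of cos(theta - target) serves as a synchronization order parameter; a fixed-heading leader would differ from the shill only in that the position update above would ignore the group state.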
We investigate the dynamics of a synthetic genetic repressilator with quorum sensing feedback. In a basic genetic ring oscillator network, in which three genes inhibit each other in a unidirectional manner, an additional quorum sensing feedback loop stimulates the activity of a chosen gene, providing competition between inhibitory and stimulatory activities localized in that gene. Numerical simulations show several interesting dynamics: multi-stability of a limit cycle with a stable steady state, multi-stability of different stable steady states, limit cycles with period-doubling and reverse period-doubling, and infinite-period bifurcation transitions for both increasing and decreasing strength of the quorum sensing feedback. We design an electronic analog of the repressilator with quorum sensing feedback and reproduce, in experiment, the numerically predicted dynamical features of the system. Noise amplification near the infinite-period bifurcation is also observed. An important feature of the electronic design is the accessibility and control of the important system parameters.
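The basic structure can be sketched with a textbook-style dimensionless repressilator plus one added activation term; the equations and parameters below are a generic illustration, not the paper's exact model or its electronic analog, and they exhibit simple sustained oscillations rather than the full bifurcation scenario.

```python
import numpy as np

# Three genes repress each other in a ring; an autoinducer a, produced in
# proportion to gene 1's protein, feeds back to activate gene 1 (QS-like).
beta, n = 10.0, 3.0      # repression strength and Hill coefficient
kappa = 0.5              # strength of the quorum-sensing-like feedback
k_a, g_a = 1.0, 1.0      # autoinducer production and decay rates

p = np.array([1.0, 2.0, 3.0])   # protein levels of genes 1..3
a = 0.0                          # autoinducer concentration
dt, steps = 0.01, 20000
trace = []
for _ in range(steps):
    rep = beta / (1.0 + np.roll(p, 1)**n)   # gene i repressed by gene i-1
    dp = rep - p
    dp[0] += kappa * a / (1.0 + a)          # stimulatory QS term on gene 1
    da = k_a * p[0] - g_a * a
    p = p + dp * dt                          # simple Euler integration
    a = a + da * dt
    trace.append(p[0])

trace = np.array(trace[steps // 2:])         # discard the transient
print(trace.min(), trace.max())
```

Sweeping kappa in such a sketch is the numerical analogue of turning the feedback-strength knob on the electronic circuit; the richer regimes the paper reports (multi-stability, period-doubling, infinite-period transitions) appear in the full model at other parameter values.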
We define an interface-interaction network (IIN) to capture the specificity and competition between protein-protein interactions (PPI). This new type of network represents interactions between individual interfaces used in functional protein binding and thereby contains the detail necessary to describe the competition and cooperation between any pair of binding partners. Here we establish a general framework for the construction of IINs that merges computational structure-based interface assignment with careful curation of available literature. To complement limited structural data, the inclusion of biochemical data is critical for achieving the accuracy and completeness necessary to analyze the specificity and competition between the protein interactions. Firstly, this procedure provides a means to clarify the information content of existing data on purported protein interactions and to remove indirect and spurious interactions. Secondly, the IIN we have constructed here for proteins involved in clathrin-mediated endocytosis (CME) exhibits distinctive topological properties. In contrast to PPI networks with their global and relatively dense connectivity, the fragmentation of the IIN into distinctive network modules suggests that different functional pressures act on the evolution of its topology. Large modules in the IIN are formed by interfaces sharing specificity for certain domain types, such as SH3 domains distributed across different proteins. The shared and distinct specificity of an interface is necessary for effective negative and positive design of highly selective binding targets. Lastly, the organization of detailed structural data in a network format allows one to identify pathways of specific binding interactions and thereby predict effects of mutations at specific surfaces on a protein and of specific binding inhibitors, as we explore in several examples. 
Overall, the endocytosis IIN is remarkably complex and rich in features that are masked in the coarser PPI network, and it collects relevant detail of protein association in a readily interpretable format.