The Big Apple's Big Data advantage (Fortune): Headed by mathematical physicist Jennifer Chayes, the lab will focus on Big Data analysis, parsing the vast troves of information created by the world's digital denizens.
Big Data, Big Opportunity! (LocalGov): The term 'big data' was coined by an analyst at Gartner Group in 2001 and describes data sets which are beyond the capacity of commonly used software tools to capture, manage, and process in a timely manner.
Startup delivers integrated storage, server (EE Times): SAN JOSE – Startup SimpliVity (Westborough, Mass.) comes out of stealth mode today, debuting an integrated server, storage, and networking system packed into a single 2U box.
Big Data Hadoop Death Match – Again, But the Hadoop Era is Different from Linux ... (DABCC.com): Matt Asay, VP of the big data cloud startup Nodeable, writes a post today titled "Becoming Red Hat: Cloudera & Hortonworks Big Data Death Match".
Making big-data analytics pay (GigaOM): In this analyst roundtable discussion, our experts will dig into the challenges CSPs face with big-data analytics today and how one customer in particular, T-Mobile, was able to overcome some of these hurdles.
VMware Adds To Big Data Portfolio With Log Insight Buy (CRN): VMware, which sees big data analytics as an enabling technology for cloud computing, added to its portfolio in April by acquiring Cetas Software, a Palo Alto, Calif.-based startup that handles...
Summary: 'Internet2,' the high-capacity network designed to support collaborative ventures between universities, government research agencies and businesses, is ready to step up into the big data realm.
R is an incredibly comprehensive statistics package. Even if you just look at the standard R distribution (the base and recommended packages), R can do pretty much everything you need for data manipulation, visualization, and statistical analysis.
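For readers without R, here is a rough analogue of the kind of descriptive statistics R's base distribution handles, using only Python's standard library. The heights data is invented for illustration, and this covers only a sliver of what R provides:

```python
# Basic summary statistics with Python's stdlib, as a stand-in for
# what R's base distribution does out of the box. Data is made up.
import statistics

heights = [1.62, 1.70, 1.75, 1.80, 1.68]  # sample heights in meters

mean_h = statistics.mean(heights)      # central tendency
median_h = statistics.median(heights)  # robust central tendency
stdev_h = statistics.stdev(heights)    # sample standard deviation

print(f"mean={mean_h:.3f} median={median_h:.2f} stdev={stdev_h:.3f}")
```

For anything beyond simple summaries (model fitting, plotting), R's base and recommended packages remain far more capable than a stdlib-only sketch like this.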
If you listen to analysts talk about complex data, they all agree: it's growing, and faster than anything else before. Complex data can mean a lot of things, but to our research group, ever-increasing volumes of naturally occurring human text and speech, from blogs to YouTube videos, enable novel questions for Natural Language Processing (NLP). The dominant characteristic of these new questions is making sense of lots of data in different forms and extracting useful insights.

NLP is hot and getting hotter

NLP is a highly interdisciplinary field of study comprising concepts and ideas from mathematics, computer science, and linguistics. Naturally occurring instances of human language, be it text or speech, are growing at an exponential rate given the popularity of the Web and social media. In addition, people are increasingly reliant on internet services to search, filter, process and, in some cases, even understand the subset of such instances they encounter in their daily lives. Whether you think about it or not, those services allowing you to do so much with language every day are generally trying to solve well-understood NLP problems under active research.

To put it into context, let us show you some examples. Let's say that a blogger is trying to gather the latest information on the earthquake in Chile. Her workflow might consist of the following sequence of web-based tasks. With each task, we include the name of the specific NLP problem being solved by the service performing the task:

• "Show me the 10 most relevant documents on the web about the earthquake in Chile" (Information Retrieval)
• "Show me a useful summary of these 200 news articles about the earthquake in Chile" (Automatic Document Summarization)
• "Translate this Spanish blog into English so I can get the latest information about the earthquake in Chile" (Machine Translation)
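The first of those tasks, Information Retrieval, can be sketched with a toy TF-IDF ranker. The documents and query below are invented for illustration; real search engines use vastly more sophisticated models, but the core idea of weighting rare query terms more heavily is the same:

```python
# Toy TF-IDF document ranking: score each document by how often it
# contains the query terms, discounting terms common across documents.
import math
from collections import Counter

def tf_idf_rank(query, docs):
    """Return document indices ordered from most to least relevant."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    # document frequency: in how many documents each term appears
    df = Counter(term for doc in tokenized for term in set(doc))
    scores = []
    for i, doc in enumerate(tokenized):
        tf = Counter(doc)
        score = sum(
            (tf[t] / len(doc)) * math.log(n / df[t])   # tf * idf
            for t in query.lower().split() if t in tf
        )
        scores.append((score, i))
    return [i for score, i in sorted(scores, reverse=True)]

docs = [
    "earthquake strikes chile coastal towns evacuated",
    "new smartphone released today",
    "chile earthquake relief efforts continue in chile",
]
ranking = tf_idf_rank("earthquake chile", docs)
# The smartphone document, which shares no query terms, ranks last.
```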
The common perception of how big data is used centers on giant multinational enterprises spending millions trying to fine-tune their business strategies to eke out every last penny from their customers.
NASA's Hurricane Mission A Reality Due To Cutting-Edge Technology (RedOrbit): The Global Hawk's ability to fly for a much longer period of time than manned aircraft will allow it to obtain previously difficult-to-get data.
Artificial Intelligence Used To Examine Mutant Worms (RedOrbit): This new automated system created by Georgia Tech scientists uses Artificial Intelligence (AI) and cutting-edge image processing to accurately and quickly study large...
Big Data and the Deep Blue Sea (E-Commerce Times): A fascinating global ocean studies initiative helps define some of the IT superlatives around big data, cloud computing and middleware integration capabilities.
Lilly's 'big data' ally finds new Indy digs for IT workers (FierceBiotech IT): The project taps "big data" to enable better decision-making in drug development, Phil Bridges, a spokesman for Quintiles, said via email.
Compuverde Unveils Game-Changing Big Data Virtualization and Storage Suite (Sacramento Bee): 20, 2012 -- /PRNewswire/ -- Compuverde, the Big Data virtualization and storage solution for service providers, telecommunications companies and enterprises, today...
Big data startup collects $2M fundraise (Mass High Tech): A former Washington, D.C.-based startup, founded by a trio of national security experts who are placing all of their bets on big data, has received $2 million in its first seed...
Come October, Dublin will see the IE Group host two conferences that will help organizations brainstorm on two key areas critical to their understanding of customers: Big Data Science and Predictive Analysis.
For the truly super geeks among you, this should be amazing. IBM has accomplished research that might have been considered nearly impossible, and shortly afterward MIT presented something similarly improbable. All of this has happened in the last few years.
IBM has created cognitive chips that map human brain neurons. I believe MIT has created a chip that's slightly different and can connect them.
This strongly suggests that we are now entering the era of cognitive computing, driven by exponential growth. So hold onto your seat belts, because the next 10 years could look far different from the Internet era and the development of computing as a whole.
IBM and MIT have mapped human neurons onto a computing chip at the nanoscale and are now capable of creating a large neural network using circuits that duplicate the behavior of neurons in the human brain. I thought it was all garbage until I did more research.
They did this by originally slicing a rat's brain into small slivers, preserving the neurons in a chemical that kept them alive, and recording their activity using a chip custom-designed to read them. I found this out because a childhood friend of mine, Sean Ramirez, is a professor in nanotechnology and used to work for Cornell.
The project, sponsored by DARPA, is called SyNAPSE and is now in phase 2. My understanding is that they are testing these chips to control drones today.
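To give a flavor of what "silicon neurons" compute, here is a minimal leaky integrate-and-fire model in Python. It is a textbook simplification of the spiking-neuron behavior neuromorphic chips implement in hardware, not the actual IBM or MIT design, and every constant below is illustrative:

```python
# Minimal leaky integrate-and-fire neuron: membrane potential leaks
# over time, accumulates input current, and spikes on crossing a
# threshold. All parameter values here are invented for illustration.

def simulate_lif(input_current, threshold=1.0, leak=0.9, steps=50):
    """Return the time steps at which the neuron spikes."""
    v = 0.0          # membrane potential
    spikes = []
    for t in range(steps):
        v = leak * v + input_current   # leak, then integrate input
        if v >= threshold:             # fire and reset
            spikes.append(t)
            v = 0.0
    return spikes

spike_times = simulate_lif(input_current=0.3)
# A steady sub-threshold input produces a regular spike train.
```

The key property, and the reason hardware implementations are attractive, is that each neuron is a tiny independent state machine: millions of them can run in parallel with no shared memory, which is exactly what a chip fabric provides and a conventional CPU does not.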
/PRNewswire/ -- Kognitio, driving the convergence of Big Data, in-memory analytics and cloud computing, today announced the immediate availability of its in-memory analytical platform via Amazon Web Services (AWS).
• Machine Learning and Data Streams – David Thompson (SETI Talks)
• ACTA IS BACK: Leaked docs show Canada/European Commission trying to sneak ACTA into Canada & back into Europe
• Are we safe from the sun?