healthcare technology
The ways in which technology benefits healthcare
Curated by nrip


Can Computing Keep Up With the Neuroscience Data Deluge?


When an imaging run generates 1 terabyte of data, analysis becomes the problem


Today's neuroscientists have some magnificent tools at their disposal. They can, for example, examine the entire brain of a live zebrafish larva and record the activation patterns of nearly all of its 100,000 neurons in a process that takes only 1.5 seconds.


The only problem: One such imaging run yields about 1 terabyte of data, making analysis the real bottleneck as researchers seek to understand the brain.


To address this issue, scientists at Janelia Farm Research Campus have come up with a set of analytical tools designed for neuroscience and built on a distributed computing platform called Apache Spark. In their paper in Nature Methods, they demonstrate their system's capabilities by making sense of several enormous data sets. (The image above shows the whole-brain neural activity of a zebrafish larva when it was exposed to a moving visual stimulus; the different colors indicate which neurons activated in response to a movement to the left or right.)


The researchers argue that the Apache Spark platform offers an improvement over a more popular distributed computing model known as Hadoop MapReduce, which was originally based on Google's search engine technology. 


The researchers have made their library of analytic tools, which they call Thunder, available to the neuroscience community at large. With U.S. government money pouring into neuroscience research for the new BRAIN Initiative, which emphasizes recording from the brain in unprecedented detail, this computing advance comes just in the nick of time. 
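The kind of per-neuron computation Thunder distributes over Spark can be shown in miniature. The sketch below is a plain-Python stand-in, not Thunder's actual API, and the data are synthetic: it computes each neuron's mean response under left- and right-moving stimuli and assigns a preferred direction, the per-neuron statistic behind the colored whole-brain map described above.

```python
from statistics import mean

# Stand-in for a Thunder-style per-neuron analysis (synthetic data, plain
# Python). Thunder itself would distribute this map over neurons across
# an Apache Spark cluster, since a real run holds ~100,000 traces.

def preferred_direction(trace, stimulus):
    """Compare one neuron's mean activity on 'left' vs 'right' stimulus
    frames and return the direction it responds to more strongly."""
    left = [a for a, s in zip(trace, stimulus) if s == "left"]
    right = [a for a, s in zip(trace, stimulus) if s == "right"]
    return "left" if mean(left) > mean(right) else "right"

# Stimulus shown on each imaging frame, plus two toy neuron traces.
stimulus = ["left", "left", "right", "right", "left", "right"]
traces = {
    "neuron_0": [0.9, 0.8, 0.1, 0.2, 0.7, 0.1],   # fires for leftward motion
    "neuron_1": [0.1, 0.2, 0.9, 0.8, 0.1, 0.9],   # fires for rightward motion
}

# This map over neurons is the step Spark parallelizes across a cluster.
tuning = {name: preferred_direction(t, stimulus) for name, t in traces.items()}
print(tuning)  # {'neuron_0': 'left', 'neuron_1': 'right'}
```

Each neuron's statistic is independent of the others, which is exactly why the problem maps cleanly onto a distributed platform like Spark.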


more at http://spectrum.ieee.org/tech-talk/biomedical/imaging/can-computing-keep-up-with-the-neuroscience-data-deluge/




Big Data Peeps At Your Medical Records To Find Drug Problems


It's been tough to identify the problems that only turn up after medicines are on the market. An experimental project is now combing through data to get earlier, more accurate warnings.


No one likes it when a new drug in people's medicine cabinets turns out to have problems — just remember the Vioxx debacle a decade ago, when the painkiller was removed from the market over concerns that it increased the risk of heart attack and stroke.


To do a better job of spotting unforeseen risks and side effects, the Food and Drug Administration is trying something new — and there's a decent chance that it involves your medical records.


It's called Mini-Sentinel, and it's a $116 million government project to actively go out and look for adverse events linked to marketed drugs. This pilot program is able to mine huge databases of medical records for signs that drugs may be linked to problems.


The usual system for monitoring the safety of marketed drugs has real shortcomings. It largely relies on voluntary reports from doctors, pharmacists, and just plain folks who took a drug and got a bad outcome.


"We get about a million reports a year that way," says Janet Woodcock, the director of the FDA's Center for Drug Evaluation and Research. "But those are random. They are whatever people choose to send us."
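Mini-Sentinel's own methods run on claims and medical-record databases, but the basic idea of mining reports for a drug–event signal can be illustrated with a classic disproportionality statistic, the proportional reporting ratio (PRR), long used on spontaneous-report databases like the one Woodcock describes. The counts below are made up for illustration.

```python
def proportional_reporting_ratio(a, b, c, d):
    """Classic pharmacovigilance screening statistic (PRR).

    a: reports with the drug AND the adverse event
    b: reports with the drug, without the event
    c: reports of the event with all other drugs
    d: all other reports

    PRR much greater than 1 means the event turns up disproportionately
    often with this drug, flagging it for closer review.
    """
    rate_drug = a / (a + b)     # event rate among this drug's reports
    rate_other = c / (c + d)    # event rate among all other reports
    return rate_drug / rate_other

# Made-up counts: 20 of 100 reports for drug X mention the event,
# versus 100 of 9,900 reports for everything else.
prr = proportional_reporting_ratio(20, 80, 100, 9800)
print(round(prr, 1))  # 19.8
```

A screen like this only flags a signal; it cannot by itself distinguish a real side effect from reporting bias, which is why active surveillance over full medical records is the step up that Mini-Sentinel represents.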


Bringing Big Data Analytics To Health Care


Big data offers breakthrough possibilities for new research and discoveries, better patient care, and greater efficiency in health and health care, as detailed in the July issue of Health Affairs. As with any new tool or technique, there is a learning curve.

Here are some guidelines to help take full advantage of big data's potential:

Acquire the “right” data for the project, even if it might be difficult to obtain.

Many organizations – both inside and outside of health care – tend to stick with the data that’s easily accessible and that they’re comfortable with, even if it provides only a partial picture and doesn’t successfully unlock the value big data analytics may offer. But we have found that when organizations develop a “weighted data wish list” and allocate their resources towards acquiring high-impact data sources as well as easy-to-acquire sources, they discover greater returns on their big data investment.

Ensure that initial pilots have wide applicability.

Health organizations will get the most from big data when everyone sees the value and participates. Too often, though, initial analytics projects may be so self-contained that it is hard to see how any of the results might apply elsewhere in the organization.

Before using new data, make sure you know its provenance (where it came from) and its lineage (what’s been done to it).

Often in the excitement of big data, decision-makers and project staff forget this basic advice. They are often in a hurry to immediately start data mining efforts to search for unknown patterns and anomalies. We’ve seen many cases where such new data wasn’t properly scrutinized – and where supposed patterns and anomalies later turned out to be irrelevant or grossly misleading.

Don’t start with a solution; introduce a problem and consult with a data scientist.

Unlike conventional analytics platforms, big data platforms can easily allow subject-matter experts direct access to the data, without the need for database administrators or others to serve as intermediaries in making queries. This provides health researchers with an unprecedented ability to explore the data – to pursue promising leads, search for patterns and follow hunches, all in real time. We have found, however, that many organizations don’t take advantage of this capability.

Health organizations often build a big data platform, but fail to take full advantage of it. They continue to use the small-data approaches they’re accustomed to, or they rush headlong into big data, forgetting best practices in analytics.


It’s important to aim for initial pilots with wide applicability, a clear understanding of where one’s data comes from, and an approach that starts with a problem, not a solution. Perhaps the hardest task is finding the right balance.


EHR + Geography = Population Health Management


Duke University Medicine is using geographical information to turn electronic health records (EHRs) into population health predictors. By integrating its EHR data with its geographic information system, Duke can enable clinicians to predict patients' diagnoses.


According to Health Data Management, Sohayla Pruitt was hired by Duke to run this project. "I thought, wow, if we could automate some of this, pre-select some of the data, preprocess a lot and then sort of wait for an event to happen, we could pass it through our models, let them plow through thousands of geospatial variables and [let the system] tell us the actual statistical significance," Pruitt says. "Then, once you know how geography is influencing events and what they have in common, you can project that to other places where you should be paying attention because they have similar probability."


iHealth Beat explains that the system works by using an automated geocoding system to verify addresses with a U.S. Postal Service database. These addresses are then passed through a commercial mapping database to geocode them. Finally, the system imports all U.S. Census Bureau data with a block group ID. This results in an assessment of socioeconomic indicators for each group of patients.
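The pipeline described above, verified address to geocode to Census block-group attributes, amounts to a chain of key lookups and joins. The sketch below imitates it with in-memory dictionaries; the addresses, block-group IDs, and indicator values are all invented, and a real system would call a geocoding service and load actual Census files instead.

```python
# Toy version of the pipeline: address -> block-group ID -> socioeconomic data.
# All IDs and values are invented for illustration.

# Stand-in for steps 1-2: a geocoder would turn a verified address into a
# Census block-group ID; here it is just a lookup table.
geocoded = {
    "123 Main St, Durham NC": "370630001001",
    "456 Oak Ave, Durham NC": "370630002003",
}

# Stand-in for step 3: Census block-group attributes keyed by block-group ID.
census = {
    "370630001001": {"median_income": 32000, "pct_no_vehicle": 0.18},
    "370630002003": {"median_income": 61000, "pct_no_vehicle": 0.05},
}

def enrich(address):
    """Attach block-group socioeconomic indicators to one patient address."""
    block_group = geocoded.get(address)
    if block_group is None:
        return None  # address failed verification or geocoding
    return {"address": address, "block_group": block_group, **census[block_group]}

record = enrich("123 Main St, Durham NC")
print(record["median_income"])  # 32000
```

Because the join key is the block group rather than the exact address, the output describes a patient's neighborhood context, which is what makes it usable for population-level prediction without exposing individual coordinates.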


“When we visually map a population and a health issue, we want to give an understanding about why something is happening in a neighborhood,” says Pruitt. “Are there certain socioeconomic factors that are contributing? Do they not have access to certain things? Do they have too much access to certain things like fast food restaurants?”


Duke is working to develop a proof of concept and algorithms that would map locations and patients. They are also working on a system to track food-borne illnesses.


Can Mobile Technologies and Big Data Improve Health?


After decades as a technological laggard, medicine has entered its data age. Mobile technologies, sensors, genome sequencing, and advances in analytic software now make it possible to capture vast amounts of information about our individual makeup and the environment around us. The sum of this information could transform medicine, turning a field aimed at treating the average patient into one that’s customized to each person while shifting more control and responsibility from doctors to patients.


The question is: can big data make health care better?


“There is a lot of data being gathered. That’s not enough,” says Ed Martin, interim director of the Information Services Unit at the University of California San Francisco School of Medicine. “It’s really about coming up with applications that make data actionable.”


The business opportunity in making sense of that data—potentially $300 billion to $450 billion a year, according to consultants McKinsey & Company—is driving well-established companies like Apple, Qualcomm, and IBM to invest in technologies from data-capturing smartphone apps to billion-dollar analytical systems. It’s feeding the rising enthusiasm for startups as well.


Venture capital firms like Greylock Partners and Kleiner Perkins Caufield & Byers, as well as the corporate venture funds of Google, Samsung, Merck, and others, have invested more than $3 billion in health-care information technology since the beginning of 2013—a rapid acceleration from previous years, according to data from Mercom Capital Group. 

Paul's curator insight, July 24, 9:06 AM

Yes - but bad data/analysis can harm it

Pedro Yiakoumi's curator insight, July 24, 10:48 AM

http://theinnovationenterprise.com/summits/big-data-boston-2014

Vigisys's curator insight, July 27, 1:34 AM

Collecting health data of every kind, even at big data scale, and analyzing large data sets is certainly useful for formulating the initial hypotheses that will guide research, or for optimizing certain processes for better efficiency. But in between, reasoned, human-driven research remains indispensable for making the "real" discoveries. Many studies from the past (well before big data) have demonstrated this...


Genetic researchers have a new tool in API-controlled lab robots


A life-sciences-as-a-service startup called Transcriptic has opened its APIs to the general public, allowing researchers around the world to offload tedious lab work to robots so they can spend more of their time analyzing the results.


Using a set of APIs, researchers can now command Transcriptic’s purpose-built robots to process, analyze, and store their genetic or biological samples, and receive results in days.


The high-concept idea, says founder and CEO Max Hodak, is cloud computing for life sciences — only with "robotic work cells" instead of servers on the other end. "We see the lab in terms of the devices that make it up," he said, meaning equipment like incubators, freezers, liquid handlers, and robotic arms to replace human arms.


And although Transcriptic’s technology is complex, the process for getting work done is actually pretty simple. Researchers write code to tell the robots exactly what to do with the samples (right now, the company focuses on molecular cloning, genotyping, bacteria-growing and bio-banking), and then they send their samples to the Transcriptic lab.


Alternatively, Transcriptic’s robotic infrastructure can also synthesize samples for users.



When the job is done, researchers get their results. That process can take anywhere from a day to weeks, Hodak explained, in part because the company’s operation is still pretty small and in part because “cells only grow and divide so quickly.”
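The "write code to tell the robots what to do" step can be sketched in outline. Transcriptic's real interface is its own protocol format and API, so everything below is hypothetical: invented instruction names assembled into a JSON payload of the general shape a lab-automation service could accept.

```python
import json

# Hypothetical protocol builder. The instruction names and fields are
# invented for illustration and are NOT Transcriptic's actual API.

def make_protocol(sample_id, steps):
    """Bundle a sample ID and an ordered list of robot instructions
    into a JSON payload a lab-automation service could execute."""
    return json.dumps({"sample": sample_id, "instructions": steps}, indent=2)

protocol = make_protocol(
    "plasmid-042",
    [
        {"op": "incubate", "temp_c": 37, "hours": 16},   # grow the bacteria
        {"op": "spin", "speed_g": 4000, "minutes": 10},  # pellet the cells
        {"op": "store", "temp_c": -80},                  # bio-bank the result
    ],
)
print(protocol)  # the payload you would submit alongside your samples
```

The appeal of expressing lab work this way is the same as with any API: protocols become versionable, repeatable artifacts rather than hand-executed bench procedures.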


more at http://gigaom.com/2014/07/15/genetic-researchers-have-a-new-tool-in-api-controlled-lab-robots/

