The answers we get out of data will always depend on the questions we ask.
Useful. It also reminded me of one of the tools we use in Whole-Scale Change thinking: Data, Purpose, Plan, Evaluate (DPPE). Thanks to Twitter follower @resilientchange for this week's link.
Data-driven predictions can succeed — and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.
One key role we play in the process is choosing which data to look at. That this choice is often made for us by what happens to be easiest to measure doesn't make it any less consequential, as Samuel Arbesman writes,
"Throughout history, in one field after another, science has made huge progress in precisely the areas where we can measure things — and lagged where we can't."
In his book The Signal and the Noise, political forecaster Nate Silver writes about a crucial element: how we go about revising our views as new data comes in.
Silver is a big believer in the Bayesian approach to probability: we each start with our own subjective ideas about how things will pan out, but we follow the same straightforward rules for revising those assessments as new information arrives.
It's a process that uses data to refine our thinking. But it doesn't work without some thinking first.
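The Bayesian revision process Silver describes can be made concrete with a short sketch. The scenario and all numbers below are hypothetical, chosen purely for illustration: we start with a subjective prior belief, then apply Bayes' rule each time new evidence comes in.

```python
# A minimal sketch of Bayesian updating: start with a subjective prior,
# then revise it with each new piece of evidence using Bayes' rule.
# The scenario and numbers are hypothetical, chosen for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(H | E) given P(H), P(E | H), and P(E | not H)."""
    evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / evidence

# Suppose we believe there is a 30% chance a product launch will succeed.
belief = 0.30

# New data arrives: strong pre-orders, which we judge 80% likely if the
# launch will succeed but only 20% likely if it won't.
belief = bayes_update(belief, 0.80, 0.20)
print(round(belief, 3))  # 0.632

# A second signal: favorable press coverage, judged 60% likely if the
# launch will succeed versus 30% likely if it won't.
belief = bayes_update(belief, 0.60, 0.30)
print(round(belief, 3))  # 0.774
```

Note that the data alone decides nothing here: the prior and the likelihoods are judgments we supply before the arithmetic starts, which is exactly the "thinking first" the paragraph above points to.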
Read the full article here.
Perspective on change planning, facilitating, organizing, implementing, and sustaining via REVELN.
Via Deb Nystrom, REVELN