Cultural Trendz
Insight. Entertainment. Style.
Curated by Vilma Bonilla
Rescooped by Vilma Bonilla from Leadership, Innovation, and Creativity!

Four characteristics of learning leaders

by Stewart Hase, Heutagogy Community of Practice

Writing is always a learning experience for me. It forces greater clarity. In addition, the tranquility of the unique Australian bush setting in which I am currently sitting, miles from anywhere, provides a perfect environment for learning. I’ve been working on a chapter for our new forthcoming book (from Amazon in September) called ‘A Practical Guide to Self-Determined Learning: Experiences from the Field’.

It’s an edited work where lots of people share their experiences of using heutagogy in a variety of contexts. It should be fun and, hopefully, useful to people wanting to try something a bit different in their ‘classrooms’. I got so excited while writing the chapter that I thought I’d share some of its content with you. In this day and age there is no need to be patient, which suits me, as patience is not a strong point. And I might get some comments back to help me refine the chapter before it goes to air.

A number of insightful writers have suggested the skills that people need in order to cope with the 21st century. One of my favourites, which appears to summarise all of them, is from Jackie Gerstein, who has put together a neat pictorial of these skills. See also Tony Wagner's work, which Jackie acknowledges.

The skills she has identified are: effective oral and written communication; collaboration across networks; agility and adaptability; grit; resilience; empathy and global stewardship; vision; self-regulation; hope and optimism; curiosity and imagination; initiative and entrepreneurialism; and critical thinking and problem solving.

Some of the implications of self-determined learning are:

*  involve the learner in designing their own learning content and process as a partner;
*  make the curriculum process flexible so that new questions and understanding can be explored as new neuronal pathways are explored;
*  individualize learning as much as possible;
*  use social media to network learners;
*  provide flexible or negotiated assessment;
*  enable the learner to contextualize concepts, knowledge and new understanding;
*  provide lots of resources and enable the learner to explore essential content;
*  experiment and research;
*  base practice on the latest science;
*  engage learners in collaborative learning;
*  differentiate between knowledge and skill acquisition (competencies) and deep learning;
*  recognize the importance of informal learning and that we only need to enable it rather than control it;
*  have confidence in the learner;
*  be on top of the subject area so you can be a resource;
*  and recognize that teaching can become a block to learning.

So, what of the skills needed by learning leaders, given the abilities we should foster in our learners and the rather more learner-centric approach prescribed by self-determined learning?

At the outset, I think we need to get rid of the terms teach and teacher from our lexicon and start talking about the ‘learning leader’. This immediately shifts the focus from teacher-centred to learner-centred approaches. So, I think what we used to call teaching is really leadership, and the broad abilities are similar whether you are leading students or leading people in an organisation.

4 Characteristics Of Learning Leaders

1. Ability to deal with ambiguity

    Low need for control
    Openness to Experience (one of the Big 5 personality traits)
    Moderate perfectionism
    High Stability (low anxiety)
    Project management skills
    Ability to use social media

2. The capacity to foster engagement

    An understanding of how to motivate others
    Ability to foster a shared purpose and vision
    An understanding of human needs
    Interpersonal effectiveness
    Ability to self-regulate

3. The capacity to learn

    Ability to research and learn
    Being thoroughly on top of one’s subject area
    Wide and accessible networks
    Able to share with others
    Knowledge management skills
    The ability to foster collaborative learning

4. The ability to use open systems thinking

    The capacity to scan the external environment
    Able to foster participative democracy and collaborative decision-making
    Able to actively diffuse power
    Capacity to work in a team
    Ongoing internal and external analysis of effectiveness (continuous improvement)


Via Skip Zalneraitis, Suvi Salo, Ivon Prefontaine
Vilma Bonilla's insight:

I love this analysis of a learning leader! It is spot on. ~ V.B.


Ivon Prefontaine's curator insight, July 7, 10:25 AM

Peter Vaill suggested learning and leading are intertwined. Teaching is about learning and leading being intertwined with it.

Scooped by Vilma Bonilla!

Jason Dorsey on generational differences

Excellent, poignant and humorous video analysis of the workplace.

Vilma Bonilla's insight:

Good humor and analysis of the workplace.

Scooped by Vilma Bonilla!

Visual data is great, real data is better | Big Data

There’s a big push at the minute by marketers and technology vendors around the concept and importance of Big Data. Run a Google Search for the term and the resulting titles of posts, articles or books speak for themselves:

    Big Data: The Next Frontier for Innovation, Competition and Productivity;
    Big Data: A Revolution That Will Transform How We Live, Work and Think;
    Big Data Transforms Business;
    Put a Fork In Big Data – It’s Done (just to balance the positive/negative results).

So, Big Data is clearly big business, and – with more than 1.7 billion search results – something that businesses are looking to understand, come to grips with and benefit from.

That’s understandable – after all, the potential of Big Data is huge. My colleague Hessie Jones, for example, recently wrote an insightful piece on how Big Data is transforming advertising, and in March 2012, no less an institution than the White House itself announced the Big Data Research and Development Initiative.

So, yes, Big Data = Big News.

The thing is, though, while access to such huge amounts of data helps us be better marketers and – by association – better businesses, there’s also the danger that we let this data inform our decisions, without stopping to think of that most important aspect of any data analysis – context.
Context Drives Educated and Informed Decisions

Think of any major decision you’ve made in life, either personally or professionally. While there will be examples of impulse buys or snap decisions made in the heat of the moment, the majority of your actions will be based on the context surrounding them.

    I wanted the sports car, but it wasn’t kid-friendly;
    Job A offered more money, but Job B offered me deeper satisfaction;
    The penthouse condo in the city offered amazing views, but the suburban neighbourhood was safer.

Three very simple examples of decisions that looked at the bigger picture of context, and took into account the long-term view versus the short-term buzz. Each option would satisfy our basic instincts, but the latter option of each choice is the one I’d go for based on its deeper context.

It’s simple economics of educated decisions, based on the data available – yet as the following examples show, context is still being missed where it’s needed the most.

Visual Data is Great, Real Data is Better

Professional social network LinkedIn is continuously looking to increase connections and the viability of its service with new additions, some useful, others less so. At least, currently.

One of the new features they’ve released is the visual ability to see who’s viewed your updates, and how far they’ve spread. Visually, it’s pretty cool, as can be seen below:

The problem is, functionality-wise, it’s very limited.

While the image on the left tells me my update had 536 views, it doesn’t allow me to dive into the data to see who actually viewed the update. The same with the image on the right – I can’t click into the big purple circle to identify the type of people viewing my content.

The potential for this visual data is obvious – I can see if I’m attracting my target audience to my content – either potential clients or new employers – and, by having access to this information, tailor my sharing even more, as well as connect with these folks in particular.

It’s not just LinkedIn that’s missing the importance of context, though. Check out the image below from a technology/data company in Toronto:

The results are from a search around the words “social business”, and show not only the main keywords around the topic, but also who’s discussing them, via what platform, and the time they’re most likely to be discussed.

This basic data offers a simple overview of that particular search – but where’s the bigger context?

For example, you can see that “business” is the most discussed word, and then I’ve highlighted “product”, “agencies”, “customers” and “platform”. As you can see from the two yellow circles I’ve overlaid, a couple of people are in multiple results. So what’s the context behind that?

    Is it simply because they mention the words together?
    Is it because they’re connected to these different communities?
    Is it because they’re seen as influential around these joint topics?
    Is it because they’re more active than the other profiles?

Again, these are simple questions, but ones that the software doesn’t answer, or even attempt to help with. Because of this, other software and analysis is needed to see how valuable these folks might be to my business.

That’s not to advocate lazy marketing, nor to forget about the legwork that real analysis requires. But if a software tool can’t provide further context around the solution it offers, why use that platform at all?
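
To make the gap concrete, here is a minimal sketch (with hypothetical data and names, not any vendor's actual API) of the sort of context the tool could surface itself: counting which keyword pairs each author co-mentions, a first step toward answering whether people show up in multiple results simply because they mention the words together.

```python
from collections import Counter
from itertools import combinations

# Hypothetical sample: each post is (author, set of keywords it mentions).
posts = [
    ("alice", {"business", "product", "customers"}),
    ("alice", {"business", "platform"}),
    ("bob",   {"business", "agencies"}),
    ("bob",   {"product", "customers"}),
]

# Count how often each author mentions each keyword pair in the same post.
pair_counts = Counter()
for author, words in posts:
    for pair in combinations(sorted(words), 2):
        pair_counts[(author, pair)] += 1

# High counts suggest the overlap is plain co-mention rather than influence.
print(pair_counts[("alice", ("business", "product"))])
```

Even a crude signal like this separates "they said both words once" from "they bridge two communities", which is exactly the context the screenshots above leave out.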
Dig Deeper, Think Bigger

And this is where Big Data’s main weakness can be found – it’s encouraging lazy solutions that seem to offer reams of data, but in reality offer very little. By doing so, it’s impacting the true potential of Big Data when used properly.

It’s this type of limitation that’s attracting valid critique of Big Data.

In his 2013 paper entitled Big Data for Development: From Information to Knowledge Societies, Martin Hilbert raised the concern that Big Data-led decisions are “informed by the world as it was in the past, or, at best, as it currently is.”

Last year, Harvard Business Review published an article, Good Data Won’t Guarantee Good Decisions, which highlighted the bigger issues around the data available to us today.

    For all the breathless promises about the return on investment in Big Data, however, companies face a challenge. Investments in analytics can be useless, even harmful, unless employees can incorporate that data into complex decision making. Meeting these challenges requires anthropological skills and behavioral understanding—traits that are often in short supply in IT departments.

Simply put, we can have all the data in the world available to us, but unless we understand the context in which it’s presented, and the actions we will drive based on our analysis, we’re as effective as driving at night with the lights off.

It’s up to us to think bigger when it comes to Big Data, and start providing the context and meaning behind it, as opposed to just the “But it looks cool, right?” mindset that seems popular today.

Challenge on.

Vilma Bonilla's insight:

"We can have all the data in the world available to us, but unless we understand the context in which it’s presented, and the actions we will drive based on our analysis, we’re as effective as driving at night with the lights off."

Scooped by Vilma Bonilla!

The DOD's data strategy: huge, massive and distributed

Ely Kahn has spent more than a decade working in the national security world, including stints at the Transportation Security Administration, Department of Homeland Security and the Executive Office of the President, where he was director of cybersecurity. In 2012, he joined up with a team of former National Security Agency engineers to create Sqrrl, a database company based on the open-source Accumulo technology they created within the spy agency. Kahn came on the Structure Show podcast this week to talk about what the technology is really capable of, who’s using it and what’s in store when it comes to national security and technology.

Here are some highlights of an insightful interview that covers everything from the importance of baking security into Hadoop-based technologies (like Accumulo) to impending attacks on critical infrastructure. However, anyone interested in the whole story of how Accumulo works and how advanced analytic techniques can improve cybersecurity will want to hear the whole thing. They might also want to attend our Structure Data conference on March 19 and 20, where Booz Allen Hamilton’s Peter Guerra will be discussing the state of the art in using big data to combat cyber threats.

PRISM? Yeah, it’s that database

“Accumulo is at the centerpiece of NSA’s enterprise architecture. Most of NSA’s major analytical applications run on Accumulo,” Kahn said. “I won’t go in and state specifically each one, because I think that gets me into a slippery slope, but most of the ones that people have been reading about, those have a pretty good shot of having an Accumulo backend.”

Not only is it the centerpiece, but Accumulo might be just as capable as NSA critics assume it is. While it’s easy enough technologically to identify questionable behavior and target that, or to examine the networks of known suspects, the NSA has bigger ideas around what Kahn calls “patterns of life analysis”:

    “This really boils down to anomaly detection, which is a big focus for us. How do you establish a pattern of what’s normal and then detect outliers from that baseline of normalcy? That can cross truly a huge set of use cases….

    A lot of what we’re doing is around graph analytics now, and building huge, massive distributed graphs of data sets — building out what a normal graph of data will look like around a particular use case, and then looking for deviations from that normal pattern of behavior over time.”
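
As a rough illustration of the baseline-and-deviation idea Kahn describes (a hypothetical sketch, not Sqrrl's or the NSA's actual implementation), the simplest statistical version establishes a baseline of normal activity and flags values that stray too far from it:

```python
import statistics

# Hypothetical baseline: daily login counts observed for one account.
baseline = [42, 39, 45, 41, 44, 40, 43]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_outlier(value, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the baseline mean."""
    return abs(value - mean) / stdev > threshold

print(is_outlier(42))   # within the normal pattern
print(is_outlier(400))  # a large deviation from normalcy
```

Production systems replace the scalar baseline with huge distributed graphs of behaviour over time, but the core question is the same: how far is this observation from the established pattern of normal?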

For more on Accumulo and the NSA’s graph-analysis capabilities, check out our coverage from June, when the Edward Snowden story was first developing:

    Here’s how the NSA analyzes all the call data

    Under the covers of the NSA’s big data effort

The whole Defense Department is getting in on the act

What works for the NSA will work across the entire Department of Defense, it hopes. Accumulo was part of an NSA mission to build a utility cloud computing and data infrastructure that could aggregate its resources agency-wide, and the Defense Department now wants to bring all of its data — from drone footage down to medical data — into a single analyzable system.

“There’s a major effort underway called the Joint Information Environment to really develop utility cloud and data cloud architectures across the entire Department of Defense for a truly massive set of use cases, ranging from cybersecurity to battlefield intelligence to even medical use cases,” Kahn explained.

Companies might not like the NSA, but they respect its tech

“Regardless of what people feel from a political perspective about NSA, I think people recognize that NSA is a leader in these big data technologies and a leader in security. And so in that sense, I’d say it’s a mark of approval having NSA legacy,” Kahn said in a response to a question about whether Sqrrl’s NSA roots have been a blessing or a curse. “Of course, I also go to conferences and I have conversations with folks from Pandora or Facebook or consumer web app-type things, and folks at the ground level may have some questions about our history, but I think the decision makers see it as a good thing.”

How much do decision makers like the security aspects Sqrrl is pushing? “We’re installed in three of the Fortune 20 companies, five of the Fortune 50, and then dozens of others,” Kahn said. He later added, in reference to increased Accumulo support by Hadoop vendors Cloudera and Hortonworks, “I think what some of the big Hadoop vendors are seeing is that if they want to play in government, they need to support Accumulo.”
The state of cybersecurity: Scary as hell, but getting better

First, the good news, which has just come to fruition over the past few weeks:

    “Via executive order…there has been a major effort by both the Department of Homeland Security and the National Institute of Standards and Technology to create a cybersecurity framework that can be utilized to raise that bar, at least initially, on a voluntary basis. So really for the first time now, there is a document that people can go to that says ‘here are the minimum standards that everyone should be utilizing in these critical infrastructure sectors.’ It may sound simple, but for an area as complex as cybersecurity, this is a major step forward.”

However, Kahn added, a “major step forward” is far from perfection: “[M]inimum adherence to a baseline is not sufficient.”

And, he noted, “There have been some pretty scary reports about foreign nations probing our electric grid that have been reported in the New York Times but, yes, nothing disastrous has happened yet. Personally, I think that’s probably a matter of time, but fingers crossed.”

Vilma Bonilla's insight:

Insightful look at data strategy and tech within the DOD.
