e-Xploration
antropologo.net, dataviz, collective intelligence, algorithms, social learning, social change, digital humanities
Curated by luiy
Rescooped by luiy from Knowledge Broker

The Strength of Weak Ties

Relationships between individuals connected by weak ties generate more innovation than those between individuals with more constant, closely related ties.

Via Kenneth Mikkelsen
Kenneth Mikkelsen's comment, July 2, 2013 11:40 AM
Read about Granovetter's work in his original research paper here: http://sociology.stanford.edu/people/mgranovetter/documents/granstrengthweakties.pdf
Rescooped by luiy from IT Books Free Share

book : Graph Theory with #Algorithms and its Applications

eBook Free Download: Graph Theory with Algorithms and its Applications | PDF, EPUB | ISBN: 8132207491 | 2012-11-02 | English | PutLocker

Via Fox eBook
luiy's insight:

LINK : http://uploaded.net/file/0ago8mwc

Fox eBook's curator insight, June 24, 2013 6:21 PM

Graph Theory with Algorithms and its Applications: In Applied Science and Technology
The book has many features that make it suitable for both undergraduate and postgraduate students in various branches of engineering and the general and applied sciences. Important topics connecting mathematics and computer science are also covered briefly. The book is useful to readers with a wide range of backgrounds, including mathematics, computer science/computer applications, and operational research. In dealing with theorems and algorithms, the emphasis is on constructions, formal proofs, and examples with applications. Until now there has been a scarcity of books in the open literature that cover all of this material, most importantly the various algorithms and their applications with examples.

Scooped by luiy

Algorithms Every Data Scientist Should Know: Reservoir Sampling | #datascience #algorithms

Say you have a stream of items of large and unknown length that you can only iterate over once. Create an algorithm that randomly chooses an item from this stream such that each item is equally likely to be selected.

 

luiy's insight:

Data scientists, that peculiar mix of software engineer and statistician, are notoriously difficult to interview. One approach that I’ve used over the years is to pose a problem that requires some mixture of algorithm design and probability theory in order to come up with an answer. Here’s an example of this type of question that has been popular in Silicon Valley for a number of years: 

 

Say you have a stream of items of large and unknown length that you can only iterate over once. Create an algorithm that randomly chooses an item from this stream such that each item is equally likely to be selected.


The first thing to do when you find yourself confronted with such a question is to stay calm. The data scientist who is interviewing you isn’t trying to trick you by asking you to do something that is impossible. In fact, this data scientist is desperate to hire you. She is buried under a pile of analysis requests, her ETL pipeline is broken, and her machine learning model is failing to converge. Her only hope is to hire smart people such as yourself to come in and help. She wants you to succeed.
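A minimal sketch of the standard answer, reservoir sampling with a reservoir of size one, is shown below in Python for illustration; the function and variable names are mine, not from the original post.

```python
import random

def sample_one(stream):
    """Return one item chosen uniformly at random from a stream of unknown
    length, in a single pass and O(1) memory (reservoir sampling, k = 1)."""
    chosen = None
    for i, item in enumerate(stream, start=1):
        # Keep the i-th item with probability 1/i.
        if random.randrange(i) == 0:
            chosen = item
    return chosen

# Each of the 10 items ends up selected with probability 1/10.
print(sample_one(iter(range(10))))
```

The usual correctness argument: the i-th item replaces the current choice with probability 1/i and survives every later step j with probability (j - 1)/j, so after n items each one is the final choice with probability exactly 1/n.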

Scooped by luiy

Social Network Dynamics in a Massive Online Game | #dataviz #complexity

Scooped by luiy

The Spy Files #wikileaks Internet's spy map OWNIs | #surveillance

The Spy Files (Wikileaks): the Internet's spy maps (OWNI)
Scooped by luiy

All about "Data mining tools" | Data Mining and Knowledge Discovery | #datamining #dataviz #datascience

luiy's insight:

The development and application of data mining algorithms requires the use of powerful software tools. As the number of available tools continues to grow, the choice of the most suitable tool becomes increasingly difficult. This paper attempts to support the decision-making process by discussing the historical development and presenting a range of existing state-of-the-art data mining and related tools. Furthermore, we propose criteria for the tool categorization based on different user groups, data structures, data mining tasks and methods, visualization and interaction styles, import and export options for data and models, platforms, and license policies. These criteria are then used to classify data mining tools into nine different types. The typical characteristics of these types are explained and a selection of the most important tools is categorized.


This paper is organized as follows: the first section Historical Development and State-of-the-Art highlights the historical development of data mining software until present; the criteria to compare data mining software are explained in the second section Criteria for Comparing Data Mining Software. The last section Categorization of Data Mining Software into Different Types proposes a categorization of data mining software and introduces typical software tools for the different types.

Fàtima Galan's curator insight, July 1, 2013 6:30 AM

This is a very interesting article because it also covers the relationship between commercial and open-source data mining tools and distinguishes between data mining suites, business intelligence packages, mathematical packages, and so on.

Scooped by luiy

Building data science teams - O'Reilly Radar | #datascience #dataviz #bigdata

A data science team needs people with the right skills and perspectives, and it also requires strong tools, processes, and interaction between the team and the rest of the company.
luiy's insight:

Starting in 2008, Jeff Hammerbacher (@hackingdata) and I sat down to share our experiences building the data and analytics groups at Facebook and LinkedIn. In many ways, that meeting was the start of data science as a distinct professional specialization (see the "What makes a data scientist" section of this report for the story of how we came up with the title "Data Scientist"). Since then, data science has taken on a life of its own. The hugely positive response to "What Is Data Science?," a great introduction to the meaning of data science in today's world, showed that we were at the start of a movement. There are now regular meetups, well-established startups, and even college curricula focusing on data science. As McKinsey's big data research report and LinkedIn's data indicate, data science talent is in high demand.

Rescooped by luiy from Didactics and Technology in Education

Marc A. Smith - Charting Collections of Connections with Maps and Measures | #SNA #dataviz #nodexl

MARC A. SMITH Marc Smith is a sociologist specializing in the social organization of online communities and computer mediated interaction. Smith leads the Co...

Via João Greno Brogueira, Rui Guimarães Lima
Rescooped by luiy from Natural Language processing

A Data Scientist's Real Job: Storytelling | #datascience #bigdata

Crunching numbers is only half the battle.

Via Mariana Soffer
luiy's insight:

When many people hear "Big Data," they think "Big Brother" (Type "big data is..." into Google and one of the top recommendations is, "...watching you."). Central to this anxiety is a feeling that what it means to be human can't be tracked or quantified by computers. This fear is well-founded. As the cost of collecting and storing data continues to decrease, the volume of raw data an organization has available can be overwhelming. Of all the data in existence, 90% was created in the last 2 years. Inundated organizations can lose sight of the difference between what's statistically significant and what's important for decision-making.

 

Using Big Data successfully requires human translation and context whether it's for your staff or the people your organization is trying to reach. Without a human frame, like photos or words that make emotion salient, data will only confuse, and certainly won't lead to smart organizational behavior.

 

Mariana Soffer's curator insight, May 22, 2013 2:46 PM

Using Big Data successfully requires human translation and context whether it's for your staff or the people your organization is trying to reach. Without a human frame, like photos or words that make emotion salient, data will only confuse, and certainly won't lead to smart organizational behavior.

Data gives you the what, but humans know the why.

The best business decisions come from intuitions and insights informed by data. Using data in this way allows your organization to build institutional knowledge and creativity on top of a solid foundation of data-driven insights.

 
Scooped by luiy

Commetrix - Dynamic Network Visualization Software | #dataviz

Commetrix - Dynamic Network Visualization Software - Dynamic Visualization of Networks - Dynamic Social Network Analysis Software Visualization - Dynamic Network Analysis - Virtual Communities
luiy's insight:

Commetrix is a Software Framework for Dynamic Network Visualization and Analysis that supports Community Moderators, Members, and Network Researchers. It provides easy exploratory yet comprehensive access to network data and allows for:

Extracting virtual communities in electronic communication networks

Analyzing dynamic network change, properties, lifecycles, and structures 

Creating rich expert network maps or recommendation systems from communication logs or other network data sources (including surveys)

Searching, filtering, navigating social corpora, like e-mail, discussions

Understanding and utilizing your social networks

Tracing the dissemination of topics or properties through the network

Extendable to all sources of network data (e.g. collaborative work on electronic documents or contents, electronic project collaboration, VoIP telephony/Contact Centers, Instant Messaging, E-Mail, Discussion, ...)

Scooped by luiy

@pgloor : Coolhunting and Coolfarming through Swarm Creativity | #SNA #dataviz

Intelligent Collaborative Knowledge Networks
luiy's insight:

IAP Course Coolnetworking 3.0: Coolhunting and Coolfarming through Swarm Creativity

Special course during the January 2013 MIT IAP (Independent Activities Period), Jan 9 to 11, 2013. To register, send an e-mail to pgloor@mit.edu.

This course consists of three parts; part I is the foundation for parts II and III, and the parts can be taken separately.

Course times: Wednesday Jan 9, Thursday Jan 10, Friday Jan 11, 2013: 3:00-6:00 pm

MIT Web Site

Day 1: I. How to Be an Efficient (Online) Networker

Part I is for everybody who would like to learn how they can be more efficient in their online and face-to-face networking. It consists of the twenty rules for networking, based on 10 years of research analyzing (online) social networks by the instructor.

General networking rules: 2 for personal and 3 for group networks
Online networking rules: 3 for e-mail, 2 for Facebook, 3 for LinkedIn
3 face-to-face networking rules

As part of the course, you will create a "virtual mirror" of your own communication behavior, telling you how much of a “star” or a “galaxy” you are, analyzing your own Facebook and e-mail networks. You will use the new Condor 3.0lite, which allows easy download of individual Facebook and E-Mail networks. To experiment with the coolhunting software tools you will need to bring your own laptop with a working WiFi connection. You will get the software on the first course day on a USB memory stick.

Scooped by luiy

#databrokers : Finally You'll Get To See The Secret Consumer Dossier They Have On You | #privacy #controverses

Emerging from the shadows: Acxiom will soon let people see their previously secret consumer files. For the first time ever, the big daddy of all data brokers is nearly ready to show consumers their intimate personal dossiers,...
luiy's insight:

For the first time ever, the big daddy of all data brokers is nearly ready to show consumers their intimate personal dossiers, a move aimed at staving off public fears of Big Brother and government regulation.

 

Since the company's founding in 1969, Acxiom, the data giant with profiles on 700 million individuals, has never allowed people to see their own commercial profiles, which are used extensively by major companies for marketing. By the end of the summer, perhaps around Labor Day, the Little Rock, Arkansas-based company, with more than a billion dollars in annual sales, will open up the vault, company officials say.

Rescooped by luiy from Social and digital network

The ideal "data scientist": in search of Data Scientists | #datascience #bigdata

Big Data analytics calls for new skills covering a wide range of data-processing technologies. Hence the job of Data Scientist.

Via Duarte Terencio, Pascale Mousset
luiy's insight:
How do these issues play out in the field? Today, running analytics on Big Data requires skills that not every BI tool developer or Excel-style spreadsheet user possesses. These skills must cover the handling of Big Data in unstructured environments and the application of statistics. We are therefore looking at two professions that ignore each other: the computer scientist and the statistician.

Yet a profession could emerge whose promise is to bring these two skill sets together: the Data Scientist. The role has a name; now the rare gem who can combine these skills has to be found. As the name suggests, this new profession unites the world of data, meaning Big Data storage and data processing, with the world of science: mathematics and statistics.

Don't look for it in university curricula or engineering programs; these two worlds do not coexist... or at least not yet. Some American universities already offer Data Scientist programs, and European and French schools will follow. Expect another two to three years before the first graduating Data Scientist engineers come off the line.

In the meantime, companies launching Big Data analytics solutions focus mainly on deploying storage and data-aggregation infrastructure, built around Hadoop technologies.

A few integrators and service companies have begun to compensate for the absence of Data Scientists by assembling multidisciplinary teams of IT specialists and statisticians. A delicate marriage for now, mostly focused on building the first PoCs (proofs of concept). Companies that are truly advanced in Big Data analytics are rare, but they exist, IBM being one example. Recruiting scientists and statisticians into a company whose culture is oriented toward both IT and R&D gives them a unique advantage: having Data Scientists on staff.

If the rare gem of Big Data analytics, the Data Scientist, does exist, you still have to find one... or call on the right partner.

Duarte Terencio's curator insight, June 25, 2013 12:56 AM

A series of articles on Big Data.

Pascale Mousset's comment, June 25, 2013 1:13 AM
Very technical but broadly understandable for a non-expert. Big data is the future!
Duarte Terencio's comment, June 25, 2013 5:57 AM
Part of the future...
Scooped by luiy

Using SEO Tools to extract Twitter JSON data into an Excel file | #dataviz

This is the second of a set of 2 videos - explaining how to get Twitter information dynamically into an Excel spreadsheet. This example shows the use of the ...
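The video relies on the SEO Tools plugin inside Excel; as a rough equivalent outside Excel, the sketch below flattens a saved Twitter JSON response into a CSV file that Excel can open. The field names (created_at, user.screen_name, text) follow the Twitter REST API v1.1 status format, and the file names are placeholders.

```python
import csv
import json

# Load a search-API response previously saved to disk (for example, the JSON
# returned by https://api.twitter.com/1.1/search/tweets.json; auth not shown).
with open("tweets.json", encoding="utf-8") as f:
    payload = json.load(f)

rows = [
    {
        "created_at": status.get("created_at", ""),
        "screen_name": status.get("user", {}).get("screen_name", ""),
        "text": status.get("text", "").replace("\n", " "),
    }
    for status in payload.get("statuses", [])
]

# Write a CSV that opens directly in Excel.
with open("tweets.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["created_at", "screen_name", "text"])
    writer.writeheader()
    writer.writerows(rows)
```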
Scooped by luiy

How #Algorithms Change The World As We Know It [Infographic]

The algorithms around us every day change the world constantly. This infographic goes into detail about some of the most profound algorithms in our lives.
Rescooped by luiy from Big Data, Cloud and Social everything

What happens when the world turns into one giant brain | #algorithms #bigdata

Currently much of the big data being churned out is merely exhaust. But imagine the possibilities once we figure out how to produce and process better data on the fly on a global scale. Call it Big Inference.

Via Pierre Levy
luiy's insight:
Key problems in the way

To get there though we’ll have to confront a number of hurdles:

We need to gather the data. Emerging, massively distributed and networked sensors will be the equivalent of human sensory transducers like rods and cones. The rise of the Internet of Things also means that every device will be able to contribute its own data stream to a collective understanding of the current state of the world.

 

Much of the content of big data these days is exhaust – data originally collected for transactional or other purposes, for which mining and analysis are afterthoughts, and whose characteristics are often ill-suited to further analysis. This will certainly change, as data collection matures into a process explicitly designed to improve our perceptual and decision-making capabilities.

 

We need the processing power to interpret the data. While it has become fashionable to note how cheap compute cycles have become, it's certainly not the case that we can process billions or trillions of input streams in real time – especially when we need to find patterns that are distributed across many noisy and possibly contradictory sensor inputs (i.e., we can't just process each stream in isolation). We may need to develop new processor technologies to handle these kinds of astronomically parallel and heterogeneous inputs.

 

We need the algorithms. To actually make sense of the data and decide what actions and responses to take, we have to figure out how to extract high-level patterns and concepts from the raw inputs. There is an ongoing debate over the right approach: Most researchers will say that we need something more “brain-like” than current systems, but there are many different (and opposing) theories about which aspects of our brain’s computational architecture are actually important. My own bet is on probabilistic programming methods, which are closely aligned with an emerging body of theory that views the brain as a Bayesian inference and decision engine.
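To make the "Bayesian inference and decision engine" idea concrete, here is a toy sketch (mine, not from the article) that fuses several noisy, partly contradictory binary sensor readings into a posterior belief about a single event using nothing but Bayes' rule; the hit and false-alarm rates are made-up numbers.

```python
def posterior(prior, readings, p_hit=0.9, p_false_alarm=0.2):
    """Posterior probability that an event occurred, given independent binary
    sensor readings with known hit and false-alarm rates (toy example)."""
    p_event, p_no_event = prior, 1.0 - prior
    for r in readings:
        p_event *= p_hit if r else (1.0 - p_hit)
        p_no_event *= p_false_alarm if r else (1.0 - p_false_alarm)
    return p_event / (p_event + p_no_event)

# Three sensors fire and one stays silent: belief rises from 0.05 to about 0.37,
# even though the inputs partly contradict each other.
print(round(posterior(prior=0.05, readings=[1, 1, 0, 1]), 3))
```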

Scooped by luiy

NSA slides explain the #PRISM data-collection program - The Washington Post | #privacy #surveillance

Through a Top-Secret program authorized by federal judges working under the Foreign Intelligence Surveillance Act (FISA), the U.S. intelligence community can gain access to the servers of nine internet companies for a wide range of digital data.
luiy's insight:

The top-secret PRISM program allows the U.S. intelligence community to gain access from nine Internet companies to a wide range of digital information, including e-mails and stored data, on foreign targets operating outside the United States. The program is court-approved but does not require individual warrants. Instead, it operates under a broader authorization from federal judges who oversee the use of the Foreign Intelligence Surveillance Act (FISA). Some documents describing the program were first released by The Washington Post on June 6. The newly released documents below give additional details about how the program operates, including the levels of review and supervisory control at the NSA and FBI. The documents also show how the program interacts with the Internet companies. These slides, annotated by The Post, represent a selection from the overall document, and certain portions are redacted. Read related article.

Scooped by luiy

This Is What It Feels Like to Pass Through A Singularity | #privacy #surveillance

The government has an automated system to track your movements and monitor who your friends are. Our news comes from remote-controlled "drone reporters." There's a device in your pocket that can produce a sex partner for you at the touch of a button.
luiy's insight:

The government has an automated system to track your movements and monitor who your friends are. Our news comes from remote-controlled "drone reporters." There's a device in your pocket that can produce a sex partner for you at the touch of a button. Maybe the singularity just happened, and we didn't notice.

 

Perhaps the most shocking aspect of whistleblower Ed Snowden's recent revelations about the NSA's surveillance of Americans is how little they shocked most people. A common response was that we already knew the government was spying on us, or that only a fool would think their emails and phone calls were private. Snowden's story was just confirmation of something many of us already took for granted. And yet it blew up into the story of the year because it was also a genuine revelation. Our vague, occasionally paranoid, suspicions that we live in a landscape alive with surveillance devices turned out to be true.

 

What is that feeling, the uncanny realization that you are actually living in your own fantasies? In the 1970s, Alvin and Heidi Toffler called it "future shock." Today, we might call it passing through the singularity. Either way, we've gone from dreaming about a world that might be real, to accepting that our dreams are hard facts.

Scooped by luiy

Twitter visualizes billions of tweets in artful, interactive 3D maps | #dataviz

Twitter visualizes billions of tweets in artful, interactive 3D maps (The Verge). Today, the social network is getting artsy once again, using the same dataset — which it calls Billion Strokes — to produce interactive elevation maps that render...
luiy's insight:

On June 1st, Twitter created beautiful maps visualizing billions of geotagged tweets. Today, the social network is getting artsy once again, using the same dataset — which it calls Billion Strokes — to produce interactive elevation maps that render geotagged tweets in 3D. This time around, Twitter visualized geotagged tweets from San Francisco, New York, and Istanbul in maps that viewers can manipulate.

 

For each city map, Twitter gives users the option of adding eight different layers over the topography. Users can also change the size of the elevation differences mapped out, to get a better idea of where most tweets are sent from. The maps can be seen from either an overhead view, or on a horizontal plane. The resulting maps look like harsh mountain ranges, but the peaks and valleys aren't representative of the land — rather, a peak illustrates a high amount of tweets being sent from that location, while a trough displays an area where fewer tweets are sent. The whole thing was put together by Nicolas Belmonte, Twitter's in-house data visualization scientist. You can check out the interactive maps on Twitter's GitHub page.
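The mapping described here, tweet density at a location rendered as terrain height, can be approximated with an ordinary 2D histogram. The sketch below is a rough illustration of that idea, not Twitter's actual Billion Strokes pipeline; the coordinates are randomly generated stand-ins for geotagged tweets.

```python
import numpy as np

# Hypothetical geotagged tweets as (longitude, latitude) pairs around San Francisco.
coords = np.random.uniform(low=[-122.52, 37.70], high=[-122.35, 37.83], size=(10000, 2))

# Bin the tweets into a grid; the count in each cell plays the role of "elevation".
elevation, xedges, yedges = np.histogram2d(coords[:, 0], coords[:, 1], bins=100)

# A "peak" is simply a cell containing many tweets, not actual terrain.
peak = np.unravel_index(np.argmax(elevation), elevation.shape)
print("highest cell:", peak, "tweet count:", int(elevation[peak]))
```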

Rescooped by luiy from Tech for well-being, health,Technologies et santé

#Watson : Cognitive systems IBM | #AI #deeplearning

Watson - You have surely heard about Watson and its performance on the quiz show Jeopardy!, covered across the media. Learn more.

Via SERENDIPITIC
luiy's insight:
How does Watson work?

During the Jeopardy! games, Watson analyzed the questions put to it in order to grasp their meaning and identify what was being asked. It then searched the 200 million pages of natural language held in its memory to find the exact answer to the question. It did all of this in under three seconds and also provided arguments for the correctness of its answer.

Going further than Deep Blue

Watson goes even further than Deep Blue (which analyzed a finite world of possibilities): Watson represents a genuine innovation in machine understanding of natural language (the "real language" each of us uses to communicate and exchange). Remarkably, it can even understand puns, ambiguities, and irony.

Watson is the most recent illustration of the impact of IBM's investment in research and development (IBM R&D: 6 billion dollars, 5,896 patents filed in 2010, 9 research centers worldwide employing 3,000 researchers).

Rescooped by luiy from Social Network Analysis - Practicum

Twitter Visual Exploration | #dataviz

Exploring Social Network Twitter using Rhizome Navigation http://www.rhizomenavigation.net

Via João Greno Brogueira, ThePinkSalmon, Pablo Torres
luiy's insight:

Rhizome Navigation is a framework for building graph based interactive environments and interfaces for a broad range of applications. The samples below include blogosphere visualizations, text mining, genealogy visualizations and prototypes of interfaces for graph based knowledge management.


Please contact me if you're interested in services based on Rhizome Navigation


Walter Rafelsberger — @walterra

Rescooped by luiy from Natural Language processing

Agent-Based Models of Strategies for the Emergence and Evolution of Grammatical Agreement | #ABM


Grammatical agreement means that features associated with one linguistic unit (for example number or gender) become associated with another unit and then possibly overtly expressed, typically with morphological markers. It is one of the key mechanisms used in many languages to show that certain linguistic units within an utterance grammatically depend on each other. Agreement systems are puzzling because they can be highly complex in terms of what features they use and how they are expressed. Moreover, agreement systems have undergone considerable change in the historical evolution of languages. This article presents language game models with populations of agents in order to find out for what reasons and by what cultural processes and cognitive strategies agreement systems arise.

 

Beuls K, Steels L (2013) Agent-Based Models of Strategies for the Emergence and Evolution of Grammatical Agreement. PLoS ONE 8(3): e58960. http://dx.doi.org/10.1371/journal.pone.0058960


Via Complexity Digest, Mariana Soffer
luiy's insight:

We presented here the first agent-based models to explore how and why a grammatical agreement system may originate and get culturally transmitted in a process of cultural invention and social learning, based on the hypothesis that agreement systems are useful to avoid combinatorial explosions in parsing and semantic ambiguity in interpretation. Agreement systems thus help to minimize cognitive effort and maximize communicative success. After demonstrating how formal markers could arise, we presented strategies showing how meaningful markers could originate, and how markers could become recruited from existing words. We demonstrated also how recruited words could erode to lead to greater articulatory efficiency, at a cost of giving fewer hints for new language users, and how coercion helps to apply an agreement system more broadly so that fewer agreement markers are needed.
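The paper's models are far richer, but the basic language-game loop they build on can be sketched in a few lines: pairs of agents repeatedly play speaker and hearer, and alignment after each game drives the population toward a shared convention. The sketch below is a minimal naming-game style illustration of that loop, not the agreement strategies of Beuls and Steels; all names and numbers are made up.

```python
import random

N_AGENTS, N_GAMES = 20, 5000
MEANINGS = ["plural", "feminine"]            # features that could be marked
FORMS = ["ka", "mo", "zu", "ri"]             # candidate markers agents may use
agents = [dict() for _ in range(N_AGENTS)]   # each agent: preferred form per meaning

for _ in range(N_GAMES):
    speaker, hearer = random.sample(agents, 2)
    meaning = random.choice(MEANINGS)
    # The speaker invents a form for this meaning if it has none yet.
    form = speaker.setdefault(meaning, random.choice(FORMS))
    # Communication succeeds if the hearer already uses the same form;
    # otherwise the hearer adopts the speaker's form (alignment).
    if hearer.get(meaning) != form:
        hearer[meaning] = form

shared = all(a.get(m) == agents[0].get(m) for a in agents for m in MEANINGS)
print("population converged on shared markers:", shared)
```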

Rescooped by luiy from History 2[+or less 3].0

Dynamic Network Visualization - Wars on Earth over time (1816-2001) | #dataviz

"This dynamic network visualization shows a dynamic picture of the global war conflicts between 1816 and 2001. The network relationships indicate which country was in conflict with another country. In the first part of the video the network data was overlayed over a geographic world map to show global reach. The second part shows the pure network layout in 3D. The dynamic network analysis and animations were generated with the software Commetrix (www.commetrix.de) by M.Schulz and R.Hillmann of IKMResearch at TU Berlin."


Via João Greno Brogueira, Rui Guimarães Lima
Leoncio Lopez-Ocon's curator insight, June 28, 2013 9:33 AM

Visualizing the dynamics of contemporary global conflicts using the Commetrix software

Scooped by luiy

Sorry, #NSA, Terrorists Don't Use Verizon. Or Skype. Or Gmail. | #privacy #dataawareness

It turns out the NSA is compromising our privacy in order to do a terrible job of looking for terrorists.
luiy's insight:

The NSA has to collect the metadata from all of our phone calls because terrorists, right? And the spy agency absolutely must intercept Skypes you conduct with folks out-of-state, or else terrorism. It must sift through your iCloud data and Facebook status updates too, because Al Qaeda.

Terrorists are everywhere, they are legion, they are dangerous, and, unfortunately, they don't really do any of the stuff described above. 

Even though the still-growing surveillance state that sprung up in the wake of 9/11 was enacted almost entirely to "fight terrorism," reports show that the modes of communication that agencies like the NSA are targeting are scarcely used by terrorists at all.



Read more: http://motherboard.vice.com/blog/hey-nsa-terrorists-dont-use-verizon-or-skype-or-gmail

Scooped by luiy

Big data brokers are "taking advantage of us without our permission" | #databrokers #privacy #bigdata

Commissioner Julie Brill called on Congress to craft legislation giving consumers access to their data, and suggested mobile device IDs are personally identifiable.
luiy's insight:

Big data brokers are "taking advantage of us without our permission." Those were the words of Federal Trade Commissioner Julie Brill this morning at the Computers, Freedom and Privacy Conference in Washington.

 

The commissioner, often vocal on data-privacy issues, called on Congress to legislate what she calls a "Reclaim Your Name" program, one that would establish technical controls allowing people to access the information data collectors have stored about them, control how it is shared and correct it when necessary.

 

The commissioner suggested such a program could operate in tandem with the browser-based Do Not Track standard currently in development. That slow-moving process, however, is under intensifying scrutiny as some World Wide Web Consortium participants question the chances of reaching consensus on key tech and policy elements of DNT, and as Firefox maker Mozilla plans a new approach to DNT with Stanford University.

 

"I urge the W3C stakeholders to forge ahead and reach consensus" on DNT, said Ms. Brill.

 

But there's no reason why "big data" cannot coexist with the establishment of standards for DNT and Reclaim Your Name, she added.
