Homo Numericus Bis
digital humanities
Curated by Mlik Sahib
Rescooped by Mlik Sahib from The urban.NET

The #Algorithmic City | #smartcities #opendata

Snips - We Scale Cities

Via luiy
luiy's curator insight, July 10, 2014 6:00 AM

What else can we predict? In theory, any event that is not random, provided we have enough data to model the context. Examples include passenger load in public transports, availability of parking spots, traffic jams, waste production, energy consumption and revenues of a shop in a specific street. These all share a common underlying principle: use context rather than history to predict behavior.

 

In themselves, each of these predictions could lead to amazing new products and services. The real power though comes from integrating everything together and modeling an entire city and its interactions with people. For instance, if you can predict where people will need to go tomorrow, then you can create optimal bus routes, minimizing time to destination and walking distance, taking into account predicted traffic, weather and garbage collection schedules. In this ideal system, all services would be optimal and available to citizens at anytime. We call this new way of designing cities "Algorithmic Urbanism".
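The "context rather than history" principle can be sketched as a nearest-neighbour lookup over context features. Everything below is a hypothetical illustration, not Snips's actual system; the features (hour, rain, nearby event) and the numbers are invented:

```python
# Toy illustration: predict bus passenger load from *context* features
# (hour of day, is_raining, nearby_event) rather than from the route's
# own history. All training examples are synthetic.

from statistics import mean

# (hour, raining, event) -> observed passenger load
observations = [
    ((8, 1, 0), 95), ((8, 0, 0), 70), ((13, 0, 0), 40),
    ((13, 1, 0), 55), ((18, 0, 1), 90), ((18, 0, 0), 75),
]

def predict_load(context, k=2):
    """k-nearest-neighbour on context: average the loads observed
    in the k contexts closest to the query context."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, context))
    nearest = sorted(observations, key=lambda o: dist(o[0]))[:k]
    return mean(load for _, load in nearest)

print(predict_load((8, 1, 0)))  # rush hour in the rain
```

The same lookup works for parking spots, waste or energy once the context features are swapped; that interchangeability is what makes integrating the predictions into one city model plausible.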

Rescooped by Mlik Sahib from Public Datasets - Open Data -

7 ways #BigData could revolutionize life | #health #hadoop #algorithms

7 ways Big Data could revolutionize life by 2020 #infographic

Via RuleIS, luiy
Rescooped by Mlik Sahib from e-Xploration

#Algorithms are watching | #privacy

In his prescient novel '1984,' English author George Orwell predicted a future that bears an uncanny resemblance to current reality—except for a simple twist.

Via luiy
Mlik Sahib's insight:

Eli Pariser, co-founder of the Internet news site Upworthy, coined the term "filter bubble" to describe how invisible algorithmic editing selectively guesses the information that users would like to see based on their past click behavior, search history and location. The results, however, can be quite one-sided. "There's a sense of being placed in this echo chamber - a term people use a lot," Zhao said. "Whatever you already believe, whatever you already like tends to get reflected back at you. If you're a hardcore liberal Democrat, for instance, Google shows you news from blue-leaning states. If you're a conservative Republican, then you get everything that's slanted that way."
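The selective guessing Pariser describes can be sketched in a few lines: rank stories by how often the user has already clicked their topics. The stories, topic labels, and click history below are all invented for illustration:

```python
# Minimal sketch of the "filter bubble" mechanic: stories whose topics
# match past clicks rise; unfamiliar topics sink. Hypothetical data.

past_clicks = ["economy-left", "climate", "economy-left", "climate"]

stories = {
    "tax plan debated (economy-left)": {"economy-left"},
    "new emissions report (climate)": {"climate"},
    "border policy row (immigration)": {"immigration"},
}

def score(story_topics, history):
    # More prior clicks on a story's topics -> higher rank.
    return sum(history.count(t) for t in story_topics)

ranked = sorted(stories, key=lambda s: score(stories[s], past_clicks),
                reverse=True)
print(ranked[-1])  # the topic the user never clicked ends up last
```

Nothing here is malicious; the one-sidedness is an emergent property of optimizing for past behaviour.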

luiy's curator insight, December 9, 2013 4:56 PM

As algorithms become more sophisticated, their influence over our lives increases exponentially. "Much of what we see today is customized for us because of all the data tracking done by Google and Facebook," Zhao said. "They customize everything for you because of what you've already done." He and other researchers are trying to understand just how much this impacts us and to what extent data tracking influences what we see on a daily basis.

 




Read more at: http://phys.org/news/2013-12-algorithms.html#jCp

Rescooped by Mlik Sahib from e-Xploration

'Social dispersion': the Facebook factor that predicts relationships - and when they will end | #datascience

A scientific paper authored by a computer scientist and a senior engineer at Facebook has shown how your online social networks not only reveal who you’re going out with, but also when you’ll break up (and yes, that's without checking your...

Via luiy
luiy's curator insight, October 31, 2013 1:12 PM

Backstrom and Kleinberg found that looking at just the number of mutual friends between any two individuals – a factor known as ‘embeddedness’ - was not actually a strong indicator the pair were in a relationship, and that instead a quality known as ‘dispersion’ was far more telling.

 

Dispersion measures not only mutual friends but the network structures that connect these friends together. ‘Low dispersion’ – the quality that was associated with couples – indicates not only that two people have a large number of mutual friends, but also that these mutual friends knew one another.

 

Essentially, romantic partners act as social bridges between individuals’ networks, introducing people to each other and creating friendships. For example, you might go for drinks with your boyfriend's friends from work and bring some of your friends from home to meet them.

Using this dispersion algorithm, Backstrom and Kleinberg were able to correctly identify who somebody’s spouse was 60 per cent of the time and correctly guess somebody’s partner a third of the time – a far better return than the 2 per cent success rate of pure guesswork.
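A toy version of the two measures can be sketched on an invented graph. The dispersion function below is a simplification of Backstrom and Kleinberg's definition: it counts only mutual-friend pairs with no direct tie between them, ignoring their further common neighbours:

```python
# Toy friendship graph (adjacency sets; all people hypothetical).
# Embeddedness counts mutual friends; simplified dispersion counts
# mutual-friend pairs who are NOT themselves connected -- i.e. the
# pair bridges otherwise separate circles.

from itertools import combinations

friends = {
    "alice": {"bob", "carol", "dan", "erin"},
    "bob":   {"alice", "carol", "dan", "erin"},
    "carol": {"alice", "bob"},   # alice's friend from home
    "dan":   {"alice", "bob"},   # bob's friend from work
    "erin":  {"alice", "bob"},
}

def embeddedness(u, v):
    return len(friends[u] & friends[v])

def dispersion(u, v):
    common = friends[u] & friends[v]
    return sum(1 for s, t in combinations(common, 2)
               if t not in friends[s])

print(embeddedness("alice", "bob"), dispersion("alice", "bob"))
```

Here alice and bob score high on both measures, but the key point is that their mutual friends do not know each other, which is the couple-like signature the paper identifies.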

Rescooped by Mlik Sahib from Anthropology, communication & technology

“Culture now has two audiences: people and machines" : A conversation with Ted Striphas


How are technology and culture shaping each other?

This is a difficult question, but only because we cannot presume to know in advance what “technology” and “culture” mean. For my part, I believe it’s always better to think of both as moving targets.

Technology and culture can “shape” or “influence” each other if and only if one proceeds from the assumption that they are separable, conceptually or semantically. For most of the past two centuries this has effectively been the case, but it has not always been so. Until about 1800, the word “culture” in English referred to husbandry—that is, to techniques for tending crops and domesticated animals, including selective breeding. Sometimes it was used interchangeably with the word “coulter,” which is a part of a plough. Technology and culture used to be very closely aligned, so much so that it was difficult to imagine the one apart from the other.


Via Jessica Parland, nicolasthely, luiy, Pierre Levy, Andrea Naranjo
luiy's curator insight, May 21, 2014 6:37 AM

How will you define the “Culture of Algorithms”?


My preferred phrase is “algorithmic culture,” which I use in the first instance to refer to the ways in which computers, running complex mathematical formulae, engage in what’s often considered to be the traditional work of culture: the sorting, classifying, and hierarchizing of people, places, objects, and ideas. The Google example from above illustrates the point, although it’s also the case elsewhere on the internet. Facebook engages in much the same work in determining which of your friends, and which of their posts, will appear prominently in your news feed. The same goes for shopping sites and video or music streaming services, when they offer you products based on the ones you (or someone purportedly like you) have already consumed.

 

What’s important to note, though, is the way in which algorithmic culture then feeds back to produce new habits of thought, conduct, and expression that likely wouldn’t exist in its absence—a culture of algorithms, as it were. The worry here, pointed out by Eli Pariser and others, is that this culture tends to reinforce more than it challenges one’s existing preferences or ways of doing things. This is what is often called “personalization,” though Pariser calls it a “you loop” instead. By the same token, it is possible for algorithmic systems to introduce you to cultural goods that you might not have encountered otherwise. Today, culture may only be as good as its algorithms.
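The feedback loop Striphas and Pariser describe (the "you loop") can be simulated in a few lines: each round, the system recommends whatever the user has consumed most, and the recommendation itself becomes the next consumption. The genres and counts are purely illustrative:

```python
# Sketch of the "you loop": recommendation feeds back into consumption,
# so a small initial preference snowballs and diversity collapses.

from collections import Counter

history = ["jazz", "rock", "jazz", "folk"]   # slight jazz lean

for _ in range(5):
    recommended = Counter(history).most_common(1)[0][0]
    history.append(recommended)  # feedback: we consume what we're shown

print(Counter(history))  # jazz dominates; rock and folk never recur
```

A system that occasionally recommends a low-count genre instead would model the other possibility Striphas notes: algorithms introducing you to goods you would not have encountered otherwise.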

Rescooped by Mlik Sahib from Cyborgs_Transhumanism

A Wikipedia for #robots allowing them to #share knowledge and experience worldwide | #algorithms


European scientists from six institutes and two universities have developed an online platform where robots can learn new skills from each other worldwide — a kind of “Wikipedia for robots.” The objective is to help develop robots that are better at caring for the elderly and at household tasks. “The problem right now is that robots are often developed specifically for one task,” says René van de Molengraft, TU/e researcher and RoboEarth project leader.

 

“RoboEarth simply lets robots learn new tasks and situations from each other. All their knowledge and experience are shared worldwide on a central, online database.” In addition, some computing and “thinking” tasks can be carried out by the system’s “cloud engine,” he said, “so the robot doesn’t need to have as much computing or battery power on‑board.”

 

For example, a robot can image a hospital room and upload the resulting map to RoboEarth. Another robot, which doesn’t know the room, can use that map on RoboEarth to locate a glass of water immediately, without having to search for it endlessly. In the same way a task like opening a box of pills can be shared on RoboEarth, so other robots can also do it without having to be programmed for that specific type of box.
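The map-sharing workflow can be sketched as a shared key-value store. The store, key names, and coordinates below are hypothetical illustrations, not RoboEarth's actual API:

```python
# Hypothetical sketch of the RoboEarth idea: one robot uploads task
# knowledge to a shared store; another reuses it without re-learning.

cloud = {}  # the shared "Wikipedia for robots"

def upload(robot, key, knowledge):
    cloud[key] = {"by": robot, "data": knowledge}

def lookup(key):
    return cloud.get(key, {}).get("data")

# Robot A maps the hospital room and shares the result.
upload("robot_a", "hospital_room_3/map", {"glass_of_water": (2.0, 1.5)})

# Robot B, which has never seen the room, locates the glass immediately.
print(lookup("hospital_room_3/map")["glass_of_water"])
```

The same upload/lookup pattern covers the pillbox example: once one robot has shared how to open a given type of box, others can fetch that procedure instead of being programmed for it.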

 

RoboEarth is based on four years of research by a team of scientists from six European research institutes (TU/e, Philips, ETH Zürich, TU München and the universities of Zaragoza and Stuttgart).

 

 

Robots learn from each other on 'Wiki for robots'


Via Dr. Stefan Gruenwald, luiy
Rescooped by Mlik Sahib from e-Xploration

#Facial Recognition #Analytics - When #Algorithms Grow Accustomed to Your Face

Companies are developing software to analyze our fleeting facial expressions and to get at the emotions behind them.

Via AnalyticsInnovations, luiy
luiy's curator insight, December 1, 2013 9:30 AM

Ever since Darwin, scientists have systematically analyzed facial expressions, finding that many of them are universal. Humans are remarkably consistent in the way their noses wrinkle, say, or their eyebrows move as they experience certain emotions. People can be trained to note tiny changes in facial muscles, learning to distinguish common expressions by studying photographs and video. Now computers can be programmed to make those distinctions, too.

 

Companies in this field include Affectiva, based in Waltham, Mass., and Emotient, based in San Diego. Affectiva used webcams over two and a half years to accumulate and classify about 1.5 billion emotional reactions from people who gave permission to be recorded as they watched streaming video, said Rana el-Kaliouby, the company’s co-founder and chief science officer. These recordings served as a database to create the company’s face-reading software, which it will offer to mobile software developers starting in mid-January.
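A toy version of face-reading can be sketched as a nearest-centroid classifier over two invented muscle-movement features; real systems such as Affectiva's use far richer features and models, so this is illustration only:

```python
# Illustrative sketch: classify an expression from two hand-picked
# facial-muscle features, e.g. (smile_intensity, nose_wrinkle).
# Labels, features, and numbers are all hypothetical.

labelled = {
    "joy":     [(0.9, 0.1), (0.8, 0.2)],
    "disgust": [(0.1, 0.9), (0.2, 0.8)],
}

def centroid(points):
    return tuple(sum(c) / len(points) for c in zip(*points))

centroids = {label: centroid(pts) for label, pts in labelled.items()}

def classify(features):
    # Nearest centroid: pick the emotion whose average feature
    # profile is closest to the observed features.
    return min(centroids, key=lambda lab: sum(
        (a - b) ** 2 for a, b in zip(centroids[lab], features)))

print(classify((0.85, 0.15)))  # smile-like features
```

Affectiva's 1.5 billion recorded reactions play the role of the `labelled` dictionary here: the more labelled examples per emotion, the more reliable the learned profiles become.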

Robert McKenzie's curator insight, December 1, 2013 6:08 PM

This is an emerging field that complements some of the post-GFC analytics. E.g., people who take less than three weeks' leave in one stint are more likely to have breached policies; add to that facial and voice recognition. A UK university was looking at IR cameras in immigration, based on the hypothesis that untruth causes greater brain activity, which could be picked up on an IR camera as a trigger for deeper enquiry. Sentiment++

Ali Anani's curator insight, December 3, 2013 9:33 AM

Information from faces, and how to turn that information into knowledge.