Connecting Media, Audience and Advertising with Data. How media companies can benefit from Big Data, Open Data, Linked Data and Small Data to add value to their assets, improve audience satisfaction and better monetize their inventories
Connecting Media, Audience and Advertising with Data
24 June 2014 – UAM Madrid
150 professionals / 30 speakers
keynotes / round tables / use cases / networking
Digitalization has positioned data in the center of the audiovisual media ecosystem.
Better control of content, combined with a better understanding of the audience, already makes it possible to back up editorial decisions. It opens opportunities to offer more relevant content and more personalized experiences adapted to multi-device and contextual environments.
For publishers, leveraging data efficiently has become a strategic question and a new key to powering editorial, marketing and sales activities.
Good use of data makes it possible to structure catalogues, reinforce the value of assets, better understand the audience and monetize inventories with new tools. When well controlled, it can help find new audiences and open avenues for diversification.
How can Big, Small, Open and Linked Data transform media content into structured, rich and actionable knowledge?
How can data help connect audiences with relevant content and drive the entertainment experience?
How do business intelligence and real-time analytics open doors to advanced advertising solutions and new business opportunities for content owners and media brands?
On the 24th of June in Madrid, 150 professionals and 30 experts will explore these questions through practical cases, inspiring keynotes and debates.
In parallel, workshops will help attendees better understand the practical operation of the professional solutions critical to the success of Big Media.
Few would have predicted the match results following thirty-two games into the 2014 World Cup, but many had predicted the remarkable second screen activity during broadcasts of the competition.
The opening rounds have been noteworthy for on-pitch activity, but also for the on-screen activity of millions of smartphone, tablet and connected app viewers around the world.
During some qualification matches, more than 400,000 dual-screeners were sync’ed to global telecasts, and connected with supplementary content provided by app developers and app users.
Fans are able to make live predictions, rate player actions, share top moments and personal passions, post photos and videos, react to opinions on match topics ('That yellow card was not deserved!' 'Those aren't teeth, they're special implants!') and tweet from inside the applications.
Visiware is another leading provider of second-screen solutions, and its recently launched Sync2Ad platform uses a synchronized mobile ad-unit that plays in real-time with TV spots, as developers continue to explore ways that the second screen brings together audiences, advertisers and content in a unique interactive media experience.
Dedicated apps for the event, such as Televisa’s Deportes, ESPNSync, and L’Equipe Connect, are powered by Visiware's Sync2TV platform.
Sync2TV powers interactive engagement for its clients' World Cup 2014 matches on iOS devices, Android devices, and on the Web, providing live game stats, gamification, social buzz, and video commentary.
“The World Cup 2014 has shown even greater uptake for second screen activity than we thought. The second screen has become a key component of sports and entertainment consumption,” said Laurent Weill, founder and executive chairman of Visiware. “Users are now telling us that they will no longer see a game without a second-screen application.”
When Breaking Bad scored three Emmy wins last fall, its showrunner, Vince Gilligan, credited Netflix for his show’s longevity and for heightening its popularity. Similarly, program execs in general have been thrilled with how streaming video services have made up for lost DVD revenue. But a little bloom is off the rose.
“The places we have to go to find [usage data for Netflix] are either private research companies that field panels, which you pay a lot of money to maintain, or we can do surveys. But surveys are notoriously unreliable,” said one network executive. That person notes that some Netflix rivals, like Hulu (which is owned by programmers and therefore more transparent) and iTunes (which is a pay-to-play model) aren’t as problematic.
Yahoo has released a massive dataset for researchers to experiment on. The dataset includes URLs for nearly 100 million images and 700,000 videos, as well as their metadata. Soon, a larger supercomputer-processed dataset that includes audio and visual features will be available.
While the World Cup's big moment is boosting the L'Equipe brand, Marianne Siproudhis, President of Amaury Médias for the past five years, looks back on the strategy of the ad sales house as it enters a "techno" phase, to be illustrated this autumn by its data offering. By 2016, 40% of the sales house's revenue will come from digital and TV. Duration: 4 min
It's been said for years that the page view is dead as a way to measure media on the web. Now, finally, there may be a replacement. Advertisers and publishers are increasingly asking if time or "attention" — proven time spent engaging with media — can work instead.
Microsoft has been on quite a cloud roll lately, and today it announced a new cloud-based machine learning platform called Azure ML, which lets companies use the power of the cloud to build applications and APIs based on big data and predict future events instead of looking backward at what happened.
The product is built on the machine learning capabilities already available in several Microsoft products, including Xbox and Bing. Using predefined templates and workflows, it is designed to help companies launch predictive applications much more quickly than traditional development methods allow, and it even lets customers publish APIs and web services on top of the Azure ML platform.
With each of these projects, our data team and our colleagues across the newsroom learned so much. We learned new ways to visualize data and new code libraries for user interaction. We learned about code management and how to (and how not to) open source our work. We learned how to report, design and release smarter, more engaging interactives. As we wrapped things up in our closing days, I realized that some of the biggest lessons from our time at Thunderdome were more cultural than technical, and they were ones we never anticipated in our previous jobs in print-first newsrooms.
If you do good work, other journalists will pay attention. Every project was a learning opportunity for our team, for colleagues at Thunderdome and at our local papers.
A big part of our data team’s mission was to evangelize for smart watchdog reporting, strong data analysis and good online presentation. We wanted to share our ideas and to learn from our colleagues. It was gratifying to see others pick up on this, whether it was a producer in Thunderdome embracing DocumentCloud or an editor in Michigan asking for advice on tools for searchable tables and interactive maps.
When we began, we inherited a small email list of a couple dozen local DFM journalists who were data-curious. By the time Thunderdome shut down, the listserv had grown to about 120 people, from San Jose to New Haven. Newspaper journalists want to learn how to do data analysis and develop for the web. A newsroom that supports them in practicing and mastering those skills will reap the benefits. You have to start building that culture.
It's hard to believe that just 25 years ago the idea of linking together databases so people could access information easily was limited to research universities and prescient sci-fi writers.

With the advent of the worldwide web and HTML, databases came to the masses in the form of pages. Pages could be made up of images, text, video and even links, with web pages linking to other web pages. We no longer needed a librarian to retrieve facts, nor the skillset of a researcher to tap into bodies of knowledge.

But with the advent of the broader Internet, information was liberated - and for many businesses and governments that leverage this data, the volume and complexity have spiralled out of control. Almost every business is a publisher, and good information was quickly littered with bad information. Authoritative websites might have excellent data quality but struggled to organise information in an adequate way. All the progress of unleashing useful information morphed into a calamitous expanse, the information of the world being dumped together. The reader must still work to make sense of it and find their own way through the chaos.
Spotify Launches New Version of Its Web API (Patricio Robles, Jun. 17 2014)

The online streaming music service's web API gives developers the ability to create robust streaming music applications.

Spotify, one of the most popular online streaming music services in the world with more than 20 million subscribers, today unveiled a new version of its web API that gives developers the ability to create robust streaming music applications.

The latest version of the company's REST API offers developers access to richer metadata, such as album cover art and 30-second song previews. In addition, using OAuth 2.0, developers now have the ability to build applications that incorporate profile data from users, including subscriber status and playlists. As Spotify's José Manuel Pérez explained in a blog post, "Among other things, this means web apps can now build real Spotify playlists that users can listen to later using a wide variety of methods."

One of the biggest additions to the Spotify web API is the integration of Echo Nest functionality. Echo Nest, a music discovery platform with an API used by hundreds of music and non-music apps, was acquired by Spotify in March. "The Echo Nest now has the most up-to-date view of Spotify's catalog, and you can use Spotify Artist and Track IDs in Echo Nest API calls to build playlists, streaming radio stations, and more," Pérez said.
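As a rough sketch of how a client might use these endpoints, the snippet below builds the request URL and OAuth 2.0 bearer header for a track-metadata call against the public v1 REST API; the helper names are our own, and any token or track ID passed in would be a placeholder, not a value from the article.

```python
import json
import urllib.parse
import urllib.request

API_ROOT = "https://api.spotify.com/v1"  # base URL of Spotify's Web API

def track_url(track_id):
    """Endpoint for one track's metadata (album art URLs, 30-second preview, ...)."""
    return f"{API_ROOT}/tracks/{urllib.parse.quote(track_id)}"

def auth_header(oauth_token):
    """OAuth 2.0 bearer header, needed for user-scoped calls such as playlists."""
    return {"Authorization": f"Bearer {oauth_token}"}

def fetch_track(track_id, oauth_token=None):
    """Fetch track metadata as a dict; public catalogue data needs no token."""
    headers = auth_header(oauth_token) if oauth_token else {}
    req = urllib.request.Request(track_url(track_id), headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Calling `fetch_track` with a valid Spotify track ID would return the track's JSON document, which is where the richer metadata mentioned above (cover art, preview clips) lives.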
Dropbox has been acquiring companies to help it expand the services that it can offer to consumers and enterprises beyond cloud storage. But the company — which has raised $1.1 billion and is among the larger tech startups tipped for an IPO – is also making strategic acquisitions to help keep its own house in order.
At the request of Arnaud Montebourg, the operator is preparing an offering to compete with that of the American Netflix, planned for September in France. Based on the Orange Cinema Séries catalogue, the service will come bundled in a dongle that plugs into living-room TV sets.
What happens when a tech company makes its own television shows? We get the inside details this week from a children's television veteran whose latest show was made for Amazon.
Santomero explains how Amazon’s approach meshes with her own focus on research-driven television programming, and how the emergence of apps and interactive television are changing the world of content creation. She also provides guidance to parents on managing screen time for kids, and shares her memories of Fred Rogers, one of the biggest influences on her life and career.
Across a wide range of industries from health care and financial services to manufacturing and retail, companies are realizing the value of analyzing data with Hadoop. With access to a Hadoop cluster, organizations are able to collect, analyze, and act on data at a scale and price point that earlier data-analysis solutions typically cannot match.
While some have the skill, the will, and the need to build, operate, and maintain large Hadoop clusters of their own, a growing number of Hadoop’s prospective users are choosing not to make sustained investments in developing an in-house capability. An almost bewildering range of hosted solutions is now available to them, all described in some quarters as Hadoop as a Service (HaaS). These range from relatively simple cloud-based Hadoop offerings by Infrastructure-as-a-Service (IaaS) cloud providers including Amazon, Microsoft, and Rackspace through to highly customized solutions managed on an ongoing basis by service providers like CSC and CenturyLink. Startups such as Altiscale are completely focused on running Hadoop for their customers. As they do not need to worry about the impact on other applications, they are able to optimize hardware, software, and processes in order to get the best performance from Hadoop.
In this report we explore a number of the ways in which Hadoop can be deployed, and we discuss the choices to be made in selecting the best approach for meeting different sets of requirements.
Key findings include:
Hadoop is designed to perform at scale, and large Hadoop clusters behave differently from the small groups of machines developers typically use to learn.
There is a range of models for running a Hadoop cluster, from building in-house talent and infrastructure to adopting one of several Hadoop-as-a-Service solutions.
Competing HaaS products bring different costs and benefits, making it important to understand your requirements and their strengths and weaknesses. Some offer an environment in which a customer can run — and manage — Hadoop while others take responsibility for ensuring that Hadoop is available, maintained, patched, scaled, and actively monitored.
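As a minimal illustration of the kind of job such clusters run, whether in-house or hosted, here is the classic word count expressed for Hadoop Streaming, which lets plain executables act as mapper and reducer. The function names are our own; a real deployment would wire these into the hadoop-streaming jar via its -mapper and -reducer flags.

```python
from itertools import groupby

def map_line(line):
    """Map phase: emit a (word, 1) pair for every word on the line.

    A streaming mapper would read lines from stdin and print each pair
    as a tab-separated "word\t1" line for Hadoop's shuffle to sort.
    """
    return [(word, 1) for word in line.split()]

def reduce_pairs(pairs):
    """Reduce phase: sum the counts for each word.

    Hadoop delivers mapper output to the reducer sorted by key, which is
    the property groupby() relies on; we sort here to simulate that.
    """
    return {word: sum(count for _, count in group)
            for word, group in groupby(sorted(pairs), key=lambda p: p[0])}
```

The same two functions run unchanged whether the cluster is three machines under a desk or a managed HaaS deployment, which is much of Hadoop's appeal.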
There's no easy fix for comments, which is why Knight's spending $4 million on software they hope can fit any newsroom's needs: "It should be a bunch of parts that you can assemble and reassemble."
Both excitement and skepticism surrounded Thursday’s announcement that Knight has invested $3.89 million to help The New York Times, The Washington Post, and Mozilla collaborate on an open-source community engagement platform. Lots of people were simply confused — why does anyone need millions of dollars to build a commenting system?
A small team at NPR was given six weeks of development resources to build an analytics dashboard.
NPR already uses Google Analytics and Chartbeat to monitor its analytics. (Another part of the social media team’s work is to build small tools to make sharing easier; one of those is a bookmarklet to add Google Analytics tracking codes to the end of URLs.) While those off-the-shelf products are useful, Kramer and Bryan said they can be complicated to understand and don’t always provide the best information. Instead, they wanted to create a dashboard that allows NPR to use the data to inform their content decisions.
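A bookmarklet like that boils down to a few lines of URL manipulation. The sketch below (our own function, with illustrative parameter values rather than NPR's actual codes) appends Google Analytics campaign parameters to a URL while preserving its existing query string.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def add_tracking(url, source, medium, campaign):
    """Append Google Analytics campaign (utm_*) parameters to a URL,
    keeping whatever query string it already carries."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query)           # existing parameters, in order
    query += [("utm_source", source),
              ("utm_medium", medium),
              ("utm_campaign", campaign)]
    return urlunsplit(parts._replace(query=urlencode(query)))
```

Links tagged this way show up in Google Analytics under the named source, medium and campaign, which is what lets a dashboard attribute traffic to specific sharing channels.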
NPR doesn’t introduce any new form of measuring analytics with its dashboard; rather, it takes existing information and presents it in a way that’s more easily digestible. It clearly shows where a story’s traffic is coming from, how it’s being shared on social media and elsewhere, and whether readers are interacting with embedded audio or slideshows. The dashboard was designed to answer simple questions: How much attention is this story getting? How are people getting to this story? Who posted this story on social media?
The dashboard will be housed online and information from it will be sent out in a daily internal email, Kramer said.
'Content' and 'storytelling' were the buzzwords at Cannes Lions, but Havas' content-creation engine is turning talk into action
Havas' Social Newsroom is a content-creation engine both monitoring and responding to what's been happening at Cannes Lions 2014 this week. With a team of data analysts, content strategists, community managers, creative technologists, production experts, designers, photographers and editors based there from early until late, it's proved a busy week for the team.
The idea behind the Social Newsroom reflects Havas Group's new "together" mantra and "Havas villages" framework, while the initiative itself started at agency Cake in London. "We called it Social Newsroom to show a spirit of collaboration. It's a neutral name and we want to engage the rest of the network," explains Mike Mathieson, Cake's CEO. "There should be a Social Newsroom in every major market. Seeing it live here this week makes a big difference in terms of people understanding what it's all about. But it's not just about reporting, it's about using your own tone of voice too."
A story on the rise of women serving time in prison in the U.S. is the sort of thing you might expect to see from the national desk at The New York Times. If you followed a link to it off Twitter or Facebook, seeing the story's multimedia features would only support that idea.
But the careful reader (or journalist who pays attention to these things) will notice the “Paid Post” small print sitting atop the story. It’s an impressive piece of native advertising, and the credit belongs to the Times’ recently created T Brand Studio, which plans to use many of the techniques and tools of the paper in crafting sponsored content. The story on women in prison, sponsored by Netflix in support of Orange Is The New Black, has all the multimedia touches we’ve come to expect on deeply-reported features from the Times. The 1,500-word piece by Melanie Deziel includes illustrations, graphics, and high-quality video and audio interviews with current and former inmates.
You know the importance of technology to the future of journalism has become a widely accepted fact when a prominent editor decides to join a new company because of its content management system. That’s what Ezra Klein told The New York Times about his decision to leave The Washington Post for Vox Media, a digital publisher with a fancy, custom-built CMS. Klein couldn’t quite describe what made the Vox system so special, but the fact that a journalist said he loved, let alone even tolerated, his CMS was all you needed to know that the world has changed.
Suddenly, the CMS, an often derided but necessary tool of modern journalism, is cool. Vox uses its CMS as a recruiting tool. Google is not-so-secretly building a CMS for the news industry. Times media columnist David Carr recently devoted an entire column to the up-and-coming blogging platform/CMS called Medium, and proclaimed that “the content management system is destiny.”
We couldn’t agree more. Here at The Times, our own CMS, Scoop, is central to our ambitions to innovate on all platforms. It’s also the repository for all the aspirations for what the merging of print and digital journalism may one day become — and many of the frustrations for what it is today.
With that in mind, we thought it was a good time to take a closer look at Scoop’s past, present and future — what it does well, what can be improved and how it will help The Times remain the finest journalistic organization in the world.
Time Inc., the mother of newsmagazines, was born in 1922. She survived wars and recessions, grew up to be fabulously rich by mid-century, married the media giant Warner Communications, entered the golden years as one of the largest media companies in the world, suffered a mid-life crisis at the hands of AOL, and watched Warner Music Group and Time Warner Cable (both adopted offspring) graduate into independence. But today the circle of life closes its long arc, as Time Inc. is starting over again as a pure publishing company, anxiously asking the same question it successfully answered 90 years ago: Are magazines the future?
Finally, the shift to digital isn't benefiting large media corporations so much as it's enriching search and social media companies that can scale audiences and their data to create targeted advertising that a media company could only dream of replicating.
My bet is that small, niche, and premium digital journalism survives with high CPMs and light costs, while big, broad, and everything-for-everyone journalism struggles with low CPMs and heavy ambitions.
The British Broadcasting Corporation (BBC) has launched a new page detailing its internal data models. The page provides access to the ontologies the BBC uses to support its audience-facing applications such as BBC Sport, BBC Education, BBC Music, News projects and more. These ontologies form the basis of its Linked Data Platform.
Twitter and Starcom MediaVest Group (SMG) have just published the results of a study of 15 US advertising campaigns. Unsurprisingly, the results confirm that social TV is an opportunity to improve and amplify the experiences brands offer.

The authors summarize the study in four key points:
The Twitter + TV pairing: an average 6.9% increase in brand awareness and favorability compared with classic TV-only advertising campaigns run on their own.
Twitter amplification: an average 4% increase in sales among households exposed to ads on Twitter + TV compared with TV-only campaigns.
Multitasking Twitter users pay more attention to ads on every screen.
Live events and shows are more engaging.