7wData
 

Putting Big Data in Context


While futurist Ray Kurzweil and Moore's Law get all the headlines, over the years a lot of interesting research and creative thought has been given to technological innovation and its implications for the need for human involvement in complex decision-making. In the era of Big Data, when social networks capture our conversations, likes and ideas like never before and sensor networks, or the Internet of Things, index more of the world around us, the fastest systems have access to more of the raw fuel, in zettabytes of new data, needed to make increasingly complex decisions. But does that mean smart systems will soon replace human decision-making?


Via digitalassetman
Yves Mulkers's insight:

A fine illustration that big data doesn't solve every problem and comes with pitfalls of its own. Take the best of it and put it to work.

7wData
Compete on analytics – a visual take on data as a competitive edge in all its forms
Curated by Yves Mulkers

The Depreciating Value of Data

CFOs are increasingly being called upon to inform, support and contribute to upper-level strategic decision-making, and in many businesses, financial analysis has replaced financial reporting as the main priority of the CFO. This change must be reflected in the methods and tools CFOs use. While financial reporting looks back, stating facts about what has happened, financial analysis for strategic decision-making must always look forward, using current data to consider what might happen in the future. This shift brings with it a change in attitude. Financial reporting was about meeting key deadlines, whether annual, quarterly or monthly. But using financial analysis to aid real-time decision-making requires data that is up to date all the time, not just at certain key moments. It is not enough for data to be just-in-time; it now needs to be real-time.

A piece of data that is truly "real-time" should accurately reflect the world that it models. To make an effective decision in real time, you need accurate, real-time data. A piece of data that is correct today may no longer be relevant, useful or correct the next day. In some industries, the effect may be even more extreme: a piece of data that is correct at 9 a.m. might be considered out of date by 10 a.m. It does not matter how good your decision-making is if that decision relies on out-of-date data. Businesses that rely on such data are assuming that what was correct when the data was measured is still correct today – and that assumption can be expensive.

Millions of people rely on the real-time navigational information that sat navs provide to get from A to B. But what if, instead of providing real-time information, the directions the sat nav gave were five minutes behind? Instead of showing a driver where he is now, the sat nav would show where he was five minutes ago. Every time that driver reached a junction, he'd have to make an educated guess about the direction to take based on out-of-date information, and then five minutes later he'd discover whether he was correct. Sometimes he'd get the decision right, but often he'd get it wrong, and he would suffer the consequences of that poor decision – in this case, a delay to his journey. Decision-makers aren't truck drivers, but they can be misled by poor information in the same way. The difference is that instead of a wrong turn costing 10 minutes of time, it costs a business millions of dollars.

Collecting and using data requires three steps; in the past, each of these steps involved manual work, significantly slowing the process.
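
To make the "real-time" requirement concrete, here is a minimal sketch of a freshness guard in Python. It is illustrative only: the 60-minute threshold and the function names are assumptions, not figures from the article.

```python
from datetime import datetime, timedelta, timezone

# Illustrative only: the 60-minute threshold is an assumption, not a figure
# from the article; each industry would set its own staleness cutoff.
MAX_AGE = timedelta(minutes=60)

def is_fresh(measured_at: datetime) -> bool:
    """True if a data point is still recent enough to drive a real-time decision."""
    return datetime.now(timezone.utc) - measured_at <= MAX_AGE

reading_taken = datetime.now(timezone.utc) - timedelta(minutes=5)
if is_fresh(reading_taken):
    print("Safe to act on this data point.")
else:
    print("Stale data: re-measure before deciding.")
```

In practice the cutoff would vary by industry – minutes for navigation, hours or days for periodic financial reporting.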

Three Reasons Why You Should Invest In Smart Cities Now

Imagine you’re heading to work on a cold winter day. You step outside your perfectly climate-controlled home and your self-driving car pulls up to the curb as scheduled. You enter the car from the extra-wide, pedestrian-friendly sidewalk. The car warns you there is ice on your standard route that has slowed traffic considerably. Using data culled by sensors embedded in roadways, shared by other commuters and analyzed by your local transportation department, it offers several other route recommendations.

Smart cities are the urban landscapes of the future. Powered by the ubiquitous connectivity of the Internet of Things (IoT), smart cities collect data on a variety of factors – traffic among them – and employ that data to make cities safer and more sustainable. By 2050, the majority of the world will be living in cities – now is the time to lay the groundwork for smart building and infrastructure. According to the Consumer Technology Association's The Evolution of Smart Cities and Connected Communities study, the smart city market, valued at $14.85 billion in 2015, is expected to grow to $34.35 billion. Many cities have chief technology officers and innovation offices that are implementing smart city technologies. Business leaders should engage with these leaders, who have a vision and understanding of their local needs.

Here are three reasons why business leaders should take advantage of this key moment in smart city development:

Having a say in rules and implementation. City rules shape how energy is used and how buildings are designed. As digital infrastructure evolves, the rules that govern it will only become more complex. Ultimately, the needs of a city's workers, employers and residents ought to shape how a city develops. But too frequently, poor communication between the public and private sectors prevents this. Companies that will eventually be affected must start weighing in on policy and planning now.

Getting ahead of the curve on innovative business strategies. Disruptive companies such as Uber, Lyft, Expedia and Airbnb have already started to do this by employing technology to meet local transportation and lodging needs. With the rise of smart cities, more companies will have the opportunity to develop these kinds of game-changing strategies.

Artificial Intelligence: Will It Kill Your Job or Let You Live The Dream?

Artificial Intelligence, or AI, is a hot topic these days. Along with robotics and automation, depending on who you listen to, AI is either the most wonderful or the most disastrous development in human history. Will AI take your job away? Will it free you from boring or dangerous tasks so you can enjoy life? Will it lead to World War III or a Star Trek-like utopia? Will you be struggling to put food on the table or living and playing on the beach all day? The stories in the media paint a picture of one extreme or the other.

We are in the early stages of a technology-driven industrial revolution, the Fourth Industrial Revolution, that is both eliminating and creating jobs. What makes this time the start of an industrial revolution is more than a few technological advances. What we are seeing are not only new technologies but advances across a wide swath of industries, from materials science to medicine, finance, and business. With time, all of these advances will begin to benefit from and build on each other to create new advances. Just as blacksmiths and weavers were replaced in the First Industrial Revolution, many of the jobs we see today will be replaced. Just like back then, we don't know which jobs will be replaced and which new ones will be created. People often half-seriously joke that at least the robots will need to be maintained and repaired by people. Don't count on it. And it's not an answer for everyone anyhow.

Don't get caught up in the high-level hype. You need to understand personally the changes that are happening now and that are coming in the future, which affect not only your job directly but also your industry and the company you work for. These last points are particularly important to pay attention to as industries, business models and companies are disrupted. Being an outstanding performer on the RMS Titanic does you no good. Before you can prepare yourself for the changes, start by getting a high-level understanding of the changes coming from the Fourth Industrial Revolution, beyond AI, robots and automation. This will help you not only understand where risks to your career may come from but also identify opportunities for your future. What makes this the Fourth Industrial Revolution is a combination of broad technological advances, the convergence of these technologies, a transformation of energy sources and power, new business models and, of course, disruption.

The Impact of Big Data on Banking and Financial Systems

Today, most banking, financial services, and insurance (BFSI) organizations are working hard to adopt a fully data-driven approach to grow their businesses and enhance the services they provide to customers. As in most other industries, analytics will be a critical game changer for those in the financial sector. Though many BFSI organizations are beginning to disrupt their analytics landscapes by gathering immense volumes of data assets, these companies are at varying levels of Big Data maturity. In many cases, these initial data projects lead business stakeholders to a very simple question: "How can this data help us solve our business problems?"

As customer volume increases, it dramatically affects the level of services offered by the organization. Existing data analytics practices have simplified the monitoring and evaluation of banks and other financial services organizations, including vast amounts of client data such as personal and security information. But with the help of Big Data, banks can now use this information to continually track client behavior in real time, providing the exact type of resources needed at any given moment. This real-time evaluation will in turn boost overall performance and profitability, thus thrusting the organization further into the growth cycle.

Identifying more areas where Big Data resources can be utilized most efficiently involves the alignment of business cases and technological capabilities, which reveals opportunities for improved business processes. There are three primary areas where banks and other financial organizations can attain benefits from advanced analytics: the customer experience, operational optimization, and employee engagement. The pace of almost any data initiative in the BFSI industry is directly related to the size of the company, as it often requires additional infrastructure investments for enterprise organizations. But regardless of the size of the organization, customer-centric objectives play the primary role in most data-related activities.

It is very important to focus on customers' needs, as today's customers have high expectations for how they interact with their banks or credit unions. Their buying journey is complex and non-linear, so financial players must be able to carefully understand customer preferences and motivations. To achieve a 360-degree view of the customer, a series of customer snapshots is no longer enough. Companies need a central data hub that combines all of the customer's interactions with the brand, including basic personal data, transaction history, browsing history, service history, and so on. According to McKinsey, using data to make better marketing decisions can increase marketing productivity by 15-20% – as much as $200 billion against average annual global marketing spend of $1 trillion. Data-fueled analytics can empower those in the BFSI sector with customer insights and help create customer segmentation.
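
As a rough illustration of the "central data hub" idea, the sketch below folds per-channel customer snapshots into a single 360-degree profile. All field names and records are hypothetical, invented for this example.

```python
from collections import defaultdict

# Illustrative sketch of a central data hub: all field names and records
# below are hypothetical.
snapshots = [
    {"customer_id": 42, "source": "crm",    "data": {"name": "A. Customer"}},
    {"customer_id": 42, "source": "web",    "data": {"last_visit": "2017-08-01"}},
    {"customer_id": 42, "source": "ledger", "data": {"last_txn_amount": 120.50}},
]

def build_profiles(snapshots):
    """Merge per-channel snapshots into one 360-degree record per customer."""
    profiles = defaultdict(dict)
    for snap in snapshots:
        profiles[snap["customer_id"]].update(snap["data"])  # later sources win
    return dict(profiles)

print(build_profiles(snapshots))
# {42: {'name': 'A. Customer', 'last_visit': '2017-08-01', 'last_txn_amount': 120.5}}
```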

Oracle Wants Its Cloud to Grow Inside Your Data Centers

Taking a different approach to hybrid cloud than some of its biggest competitors, Oracle has substantially beefed up the capabilities of its on-premises cloud product, Oracle Cloud at Customer. It gives companies the ability to use Oracle's cloud services but have them run inside their own data centers.

The company launched the Oracle Cloud at Customer service last year. Oracle installs and manages all the converged hardware, software, and networking equipment on premises, essentially providing organizations a private version of its public cloud through a subscription-based model. Last week, Oracle expanded the services it offers this way to include Software-as-a-Service applications, including ERP and CRM software, and its full suite of Platform-as-a-Service offerings. The company previously offered only Infrastructure-as-a-Service and some PaaS services on Oracle Cloud at Customer.

Analysts say the announcement allows Oracle to better compete against cloud rivals such as Microsoft, IBM, Amazon, and Alphabet's Google. All the leading enterprise cloud players' approaches to hybrid cloud are now starting to take shape, each quite different from the others. Microsoft's strategy, Azure Stack, is the closest to Oracle's. Microsoft's partners Dell EMC, Hewlett Packard Enterprise, and Lenovo started taking orders for the on-premises version of the public Azure cloud earlier this month.

"The update to [Oracle's] Cloud at Customer is competitive with Azure Stack," Dave Bartoletti, VP and principal analyst at Forrester, told Data Center Knowledge. "The appeal is for customers who want a bit of the Oracle Cloud running on-premises for whatever reason: security concerns, data residency, or simply a desire to control data and apps more directly." IBM's Bluemix Private Cloud Local is another on-premises play that's similar to Oracle's and Microsoft's, while Amazon Web Services and Google Cloud Platform at this point offer only migration and integration services.

"Azure, IBM, and Oracle are all trying to place their clouds in customer data centers, while AWS and Google are trying to make it as easy as possible to connect customer data centers to their clouds. It's a slightly different approach," Bartoletti said. For example, Google this month struck a deal with Nutanix, a hyperconverged infrastructure vendor, to help organizations build hybrid clouds that integrate their Nutanix environments with GCP. Meanwhile, AWS partnered with VMware to help enterprises easily integrate their existing VMware environments with its public cloud. Last week, however, an anonymously sourced report appeared saying the two may have a joint on-premises data center software product in the works. IDC Research VP Carl Olofson said Oracle's latest upgrades of Cloud at Customer are important for the company because it wants to move as many existing customers to the Oracle Cloud as possible.

How firms are using artificial intelligence to up their game

After decades of false starts, artificial intelligence (AI) is already pervasive in our lives. Although invisible to most people, features such as custom search engine results, social media alerts and notifications, and e-commerce recommendations and listings are powered by AI-based algorithms and models. AI is fast turning out to be the key utility of the technology world, much as electricity evolved a century ago. Everything that we formerly electrified, we will now cognitize. AI's latest breakthrough is being propelled by machine learning – a subset of AI comprising techniques that enable machines to improve at tasks through learning and experience. Although in its infancy, the rapid development and impending AI-led technology revolution is expected to affect all industries and companies, both big and small, across their respective ecosystems and value chains. We are already witnessing examples of how AI-powered new entrants are able to take on incumbents and win – as Uber and Lyft have done in the cab-hailing industry.

Predictive analytics, diagnostics and recommendations: Predictive analytics has been in the mainstream for a while, but deep learning changes and improves the whole game. Predictive analytics can be described as the 'everywhere electricity' – it is not so much a product as a new capability that can be added to all the processes in a company. Be it a national bank, a key supplier of raw material and equipment for leading footwear brands, or a real estate company, companies across every industry vertical are highly motivated to adopt AI-based predictive analytics because of proven returns on investment. Japanese insurance firm Fukoku Mutual Life Insurance is replacing a 34-strong workforce with IBM's Watson Explorer AI. The AI system calculates insurance policy payouts, which according to the firm's estimates is expected to increase productivity by 30% and save close to £1 million a year. From the user-based collaborative filtering used by Spotify and Amazon to the content-based collaborative filtering used by Pandora and the frequent itemset mining used by Netflix, digital media firms have long used machine learning algorithms and predictive analytics models in their recommendation engines. In e-commerce, with thousands of products and multiple factors that affect their sales, estimating the price-to-sales ratio or price elasticity is difficult. Dynamic price optimization using machine learning – correlating pricing trends with sales trends using an algorithm, then aligning with other factors such as category management and inventory levels – is used by almost every leading e-commerce player, from Amazon.com to Blibli.com.

Chatbots and voice assistants: Chatbots have evolved mainly on the back of internet messenger platforms, and hit an inflection point in 2016. As of mid-2016, more than 11,000 Facebook Messenger bots and 20,000 Kik bots had been launched. As of April 2017, 100,000 bots had been created for Facebook Messenger alone in the first year of the platform. Chatbots are now rapidly proliferating across both the consumer and enterprise domains, with capabilities to handle multiple tasks including shopping, travel search and booking, payments, office management, customer support, and task management. Royal Bank of Scotland (RBS) launched Luvo, a natural language processing AI bot that answers RBS, NatWest and Ulster Bank customer queries and performs simple banking tasks like money transfers. If Luvo is unable to find the answer, it passes the customer over to a member of staff. While RBS is the first retail bank in the UK to launch such a service, others such as Sweden's Swedbank and Spain's BBVA have created similar virtual assistants. Technology companies and digital natives are investing in and deploying the technology at scale, but widespread adoption among less digitally mature sectors and companies is lagging.
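
To ground the recommendation-engine discussion, here is a minimal user-based collaborative filtering sketch of the kind named above. The ratings matrix is invented for illustration; production systems at Spotify or Netflix scale are vastly more sophisticated.

```python
import numpy as np

# Minimal user-based collaborative filtering; the ratings matrix is invented.
# Rows are users, columns are items, 0 means "not yet rated".
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
], dtype=float)

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

def recommend(user):
    """Score unrated items by the similarity-weighted ratings of other users."""
    sims = np.array([cosine(ratings[user], ratings[other])
                     for other in range(len(ratings))])
    sims[user] = 0.0                      # exclude the user themselves
    scores = sims @ ratings               # weighted sum of everyone's ratings
    scores[ratings[user] > 0] = -np.inf   # only consider unrated items
    return int(np.argmax(scores))

print(recommend(0))  # -> 2 (the unrated item, scored by similar users' ratings)
```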

How AI will make corporations more humane and super-linearly innovative (Part II)

Fast-innovating startups are too small, and slow-innovating corporates too big. When startups grow they become like big corporates; their innovation slows down and their dynamic culture dilutes. Processes and compliance take over. Breaking things is replaced by ticking boxes. That's why I think the problem with innovation is a problem of scale. And scale can only emerge by rethinking business organization. How can we scale organizations so they retain their startup agility and avoid the mass corporate extinction event?

If you are a business leader, transforming your organization should be your top agenda item. But transform how? Let me suggest that in a digital economy, where people and things become increasingly interconnected, the innovative organization is made up of six fundamental pillars: the right strategy, culture, work organization, technology stacks, data and cybersecurity. Each of these pillars needs special, in-depth treatment. But for the purpose of this article I want to focus on two of the pillars – the right technology stacks and the right work organization – for they are the two that distinguish the innovative organization from its predecessors, and they are also instrumental to scaling innovation.

The first industrial revolution made humans into robots. Yes, you have read this sentence correctly. For the past two hundred years we humans have adapted our work behaviour around machines. The very idea of a process comes from traditional assembly-line and process manufacturing, where things need to happen in a sequence. Allowing things to happen spontaneously and chaotically goes against the concept of "management", which is at the heart of current business organization. Processes, managers, loops, repetition, rules and compliance are the building blocks of traditional organizations. Workers go to work every day, stay at work for a specified period of time, vacation only a few days per year, their output measured as if they were all the same: identical copies, robots. All these behaviours and methods have served us well for nearly two centuries, but now they are in dire need of reinvention.

In their forthcoming book "Surge: Navigating the Digital Tsunami", digital transformation experts Brad Murphy and Carol Mase propose a phased approach to making businesses fit for the digital era. Each phase liberates people from productivity-depressing processes and allows them to collectively engage in value-producing networks that innovate, not only around products and services but in everything that is required to make the business successful. The nodes of these networks have access to technology stacks that reinvent the idea of corporate information technology. And here's an important point: IT must cease to be a cost centre and instead become the means of innovation. To become so requires the courageous embrace of the cloud. "Cloud-native organizations" accessing software microservices in a cloud-based ecosystem mean that teams can quickly build composable systems, experiment and fail fast, and thus significantly improve the delivery of the innovation cycle.

Toyota Wants Cars to Predict Heart Attacks

A heart attack or diabetic blackout can have especially deadly consequences for drivers when it causes a car crash. Toyota researchers hope to change that grim equation by studying how wearable devices could help smart cars save lives by predicting medical emergencies ahead of time.

The day when smart cars – either manually driven or self-driven – will watch out for the health of their drivers remains some way off. But Toyota's Collaborative Safety Research Center thinks it's worth investing today in the technology and scientific research needed to make that future happen, through a $35-million, five-year effort that will last until 2021. Toyota researchers have already begun working with universities to see whether wearables such as smartwatches could someday prove as accurate as clinical-grade medical equipment in monitoring signs of impending heart attacks or blackouts due to low blood sugar. "We looked at what conditions might have contributed to crashes from an emergency medical standpoint, and also looked at signals that may be measured through wearables," says Chuck Gulash, director of Toyota's Collaborative Safety Research Center.

This may not sound like a huge deal, given that just 1.3 percent of all passenger car crashes in the National Motor Vehicle Crash Causation Survey were reportedly caused by medical emergencies, according to a 2009 study by the National Highway Traffic Safety Administration. But that still translates into medical conditions being at least partially responsible for more than 26,000 crashes, because the overall survey covered more than 2 million car crashes. Heart attacks accounted for about 11 percent of the crashes caused by medical conditions, or approximately 2,680 crashes. Diabetic blackouts accounted for a significantly larger proportion of such crashes at 20 percent, or approximately 5,200 crashes. Toyota researchers chose to focus on those two conditions, as opposed to medical emergencies that caused even more crashes, such as seizures, because they believe that wearable technologies can accurately detect and predict them in the near future.

Medical emergencies involving heart attacks or diabetes are also likely to become more common among a growing population of older drivers. Heart attacks, for example, occur most often in adults 65 years of age or older. The population of U.S. adults older than 65 is set to nearly double from the 2012 level of 43 million to almost 84 million by 2050, according to a 2013 report by the U.S. Census Bureau.
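
As a toy illustration of what wearable-based monitoring involves at its simplest, the sketch below flags heart-rate samples that deviate sharply from a rolling baseline. The window and threshold are invented for the example; clinical-grade prediction of the kind Toyota is researching is far more sophisticated than a rolling z-score.

```python
import statistics

# Toy illustration only: the window size and threshold are invented, and real
# clinical prediction is far more sophisticated than a rolling z-score.
WINDOW, THRESHOLD = 10, 3.0

def anomalies(samples):
    """Indices of samples that deviate sharply from the rolling baseline."""
    flagged = []
    for i in range(WINDOW, len(samples)):
        window = samples[i - WINDOW:i]
        mean = statistics.mean(window)
        spread = statistics.stdev(window) or 1.0  # avoid dividing by zero
        if abs(samples[i] - mean) / spread > THRESHOLD:
            flagged.append(i)
    return flagged

heart_rate = [72, 74, 71, 73, 75, 72, 70, 74, 73, 72, 71, 135]  # sudden spike
print(anomalies(heart_rate))  # [11]
```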

Data Governance: The Unsung Hero of Network Operations

Data and data systems are the lifeblood of modern business. Innovations like cloud computing, business intelligence, analytics, and automation hold the promise of operational efficiency and strategic insight; but without proper visibility and control over data, all that digitization can get in the way of productivity by creating a management nightmare. Time spent wrestling with difficult systems is time not spent extracting useful insights from the information they contain, and it can discourage innovation.

Not so long ago, even large organizations handled data governance as a mostly discrete function. Some data and business applications were available company-wide, but specific business functions – and the data associated with them – were largely isolated and inaccessible to teams that might benefit from the knowledge they contained. Such isolation, often referred to as data silos, rewards routine and stifles innovation, preventing an organization from improving processes and efficiencies.

To change that dynamic, it's vital that the silos be torn down and the data be interconnected. But without robust tools that are easy to use, it can be difficult to know what's going on with the data flowing into, out of, and within your enterprise. And without that level of control and visibility you run the risk of failing to meet service-level agreements (SLAs), losing track of vital files, and even falling afoul of the many security and privacy regulations that dictate the care of sensitive personal information and high-value intellectual property today.

Solutions designed for today's demanding data management landscape operate as highly evolved gateways, bridging the gap between an enterprise's internal constituencies and the many external entities that constitute a network of partners, contractors and customers. As these interconnected relationships grow, so does the need for more efficient and easy-to-use means of analyzing the underlying data flows, giving employees and managers the insights necessary to make good decisions and letting IT managers govern their systems. This need is a driving factor behind the development of management and analytics platforms that are tightly integrated with the systems that manage data flow in an enterprise. By tying into business and operational intelligence, IT managers gain the near real-time ability to ensure data is moving in and out of an organization correctly and efficiently. That means alerting network administrators to conditions that could put security, compliance or operations in jeopardy. IT teams are being tasked with greater responsibility over mission-critical systems and business functions.
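
As a minimal illustration of the kind of SLA alerting described here, the sketch below checks file transfers against a deadline. Every field name, record, and the 30-minute SLA are hypothetical, invented for the example.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of an SLA check: every field name, record, and the
# 30-minute SLA below are assumptions for illustration.
SLA = timedelta(minutes=30)

transfers = [
    {"file": "payroll.csv", "expected": datetime(2017, 8, 1, 9, 0),
     "arrived": datetime(2017, 8, 1, 9, 10)},
    {"file": "claims.xml", "expected": datetime(2017, 8, 1, 9, 0),
     "arrived": None},  # never arrived
]

def sla_breaches(transfers, now):
    """Return the files that arrived late, or not at all, past their SLA."""
    breaches = []
    for t in transfers:
        deadline = t["expected"] + SLA
        late = (t["arrived"] is None and now > deadline) or \
               (t["arrived"] is not None and t["arrived"] > deadline)
        if late:
            breaches.append(t["file"])
    return breaches

# payroll.csv arrived within its SLA; claims.xml is overdue and gets flagged.
print(sla_breaches(transfers, now=datetime(2017, 8, 1, 10, 0)))  # ['claims.xml']
```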

Data is your biggest digital hurdle. DataOps will help you clear it

I spend a lot of time talking to senior executives at large companies, and if there's one phrase I hear them all talk about consistently – perhaps a little too often – it's "digital transformation." Ask 10 people what it means and you'll get 10 different answers, but in broad brushstrokes it means using the power of software to transform and improve business processes large and small, and studying the data that software generates for new ways to boost efficiency, reduce costs or make money. And we can all probably recite the list of things that have made that easier: cloud computing, agile development, and collaboration tools. You can lump them into the shorthand phrase "DevOps," and in general they have made building software easier and faster.

Yet companies can get everything right about DevOps and still fail at digital transformation. And most do fail – about 84 percent by one reckoning. One big reason they fail is that they ignore the complexities around handling data. I got to talking about this recently with Eric Schrock, CTO at Delphix, a company founded in 2008 and based in Redwood City, Calif., that is focused on the fundamental problem of making data easier to work with. Companies have in the last decade or more developed an "unquenchable thirst for data," he says. And that fact bumps up against a fundamental problem: while every aspect of the world of software has become light, agile, and streamlined, data is nothing like that.

"Data is the new competitive advantage. It doesn't matter what business you're in," Schrock told me in a recent interview. "It doesn't matter if you're an analyst or a data scientist or the head of marketing. Everyone needs data." Sure, everyone needs it, and there's more of it than ever before. But data is complex, heavy, expensive to maintain, and difficult to move. At a large corporation the scale of managing it can be staggering, and this can cause tremendous friction throughout a company's digital operations.

One reason is size. A typical corporate database may be one to 100 terabytes, and the number of those databases can range from dozens to tens of thousands, amounting to hundreds of petabytes of raw data. And while developers, marketers, analysts, and even data scientists are all clamoring to get their hands on live data, they often end up having to settle for one of many bad workarounds: they work with a limited subset of the data they need, or they substitute old, stale data in place of the live production data that serves as the lifeblood of modern applications.

What companies need, Schrock argues, is a new approach to working with data that, like DevOps, breaks down the barriers to progress. He calls it DataOps, which strictly speaking isn't a new phrase, but one whose meaning he'd like to extend: "The alignment of people, process, and technology to enable the rapid, automated, and secure management of data."

The Next Generation of SaaS Won't Optimize for User Engagement

A few weeks ago Hiten Shah explained, in an interesting new post, why the most successful SaaS companies of the future will focus on usage, just like Facebook. In the write-up he goes deep into his explanation, bringing examples of world-class SaaS companies like Trello, Slack, and Dropbox that are all building their strategies around this consumer-oriented product approach. He predicts that this is what the next generation of SaaS will look like. While I was reading Hiten's post, I immediately recalled a brief email conversation I had last month with Patrick Campbell, CEO at Price Intelligently. Patrick briefly introduced me to his definition of what he calls "anti-active usage" products. While at first this might sound very counterintuitive, it's actually the natural evolution of most of the SaaS products that we know today.

Harnessing the world of software in a single statement is very hard. Tom Tunguz explained his vision of software in a simple way in a post on his blog: the software world divides into systems of record and workflow applications. Systems of record unify data from different sources under a single view; common examples are CRM and ERP. Workflow applications enable workers to do work, and they represent a huge portion of the products that we use in our daily work life. Systems of record and workflow applications have one thing in common: at their core, they need human interaction. The paradigm under which you have to actively use something to do a given task or reach a certain goal is the bedrock of most of the SaaS products out there.

Anti-active usage products flip this model – you don't necessarily need to use the product to get something done, because the product (1) understands the problem, (2) works out a solution and (3) outputs a result. Anti-active usage products don't need human interaction at any level of their value chain. We can expect, 10 years from now, a good part of today's SaaS products flocking to this new category. There are many reasons why I see anti-active SaaS products coming in the next 10 years:

1. The scarcity of time – We can build a solid business strategy around things that are stable in time. This is why we create businesses on things that don't change. Time is the scarcest resource, and your employees' time is one of the most important assets of your business. Products that make no demands on your team yet are able to generate relevant outcomes can change the rules of the game.

2. The tragedy of the commons – The tragedy of the commons is an economic theory of a situation within a shared-resource system where individual users, acting independently according to their own self-interest, behave contrary to the common good of all users by depleting or spoiling that resource through their collective action. The "individual users" in this instance are the SaaS vendors, who are all trying to optimize for engagement and product usage. The reliance on this model is not only unsustainable but is demonstrably damaging the environment.

3. SaaS switching costs – The tool explosion I've been talking about for a while has made companies more flexible, but it has also made people waste time jumping back and forth between apps just to accomplish a given task, not to mention losing the context they need to do good work. This SaaS tool explosion broke your work into tiny pieces and scattered it across a dozen apps, making it almost impossible to feel on top of things. Anti-active usage products slash the switching costs between products and tools in our workday and centralize the metrics of their output in a single dashboard. These products allow you and your team to concentrate more on the strategic and tactical levels.

Why the explosion of AI can no longer be ignored by businesses

Discussions around artificial intelligence (AI) have proliferated and will continue to over the next year. Businesses are beginning to see its value in a wide range of sectors, while a different report is released every day instilling the fear that AI will replace workers' jobs. Simply put, there is no hiding the explosion of AI and its applications; in the UK, Durham police have begun trialling a system that classifies suspects as low, medium or high risk to help officers make custody decisions. As AI, automation and machine learning push the boundaries in almost every single industry, businesses will have to embrace the technology and understand that it is a force for good. This article will explore the perception around AI, the practical uses of the technology and the overall objective that businesses should be preparing for.

While it's true that AI will change how businesses operate, business leaders should consider it with optimism and positivity. PwC conducted a study and found that AI could help global GDP gain a huge $15.7 trillion, a 14% rise, between 2016 and 2030. The detailed analysis found that the business impact of AI was so vast that productivity would improve dramatically. 2030 might seem like a long time away, but in the business world preparation is key to everything. As artificial intelligence becomes more and more integral to daily work processes, the benefits will become much clearer. Automating simple tasks frees up the workforce to focus on more intricate and exciting aspects of their role.

Take, for example, the innovative law firm Berwin Leighton Paisner. The UK-based company developed an entirely different business model that relies on an AI system to work on certain property disputes. The system extracts the raw data and checks the basic details needed to serve legal notices to the corresponding property. The cost savings are clear, but the sheer difference in time is quite substantial – a task that would usually take workers weeks can now be done in minutes! And this with a simple process-automation approach to AI; future systems will do much more comprehensive tasks.

As businesses of all kinds look to invest in AI, the field itself will rapidly evolve, making it almost impossible to keep up with ongoing developments. Leaders should instead focus on their business and analyse how artificial intelligence could benefit their organisation. The application of artificial intelligence is quite broad and the technology is developing apace, possibly quicker than businesses would have predicted. From the sorting of information to automated machines in the manufacturing process, the capabilities are clearly broad enough that businesses are wasting no time investing. Our own recent AI engine, Zinrai, has been used by Japanese company Kawasaki Geological Engineering to accurately detect and prevent underground sinkholes before they manifest. The technology uses deep learning to analyse and process the volume of radar images collected with underground radar probe equipment. Businesses and governments could benefit from increased operating efficiency, spot cavities in the road before they manifest and ultimately be able to prevent sinkholes. Another fantastic example is a conversational tool that can analyse the sentiment of customer calls, enabling the host to tailor each conversation to the customer. What's clear is that AI is developing at a significant rate.

“Hide the Tech” to Take Big Data Mainstream

Big data is a big deal – and a big opportunity. The challenge is that most big data progress has been limited to big companies with big engineering and data science teams. The systems can be complex, immature and hard to manage. That might be OK if you're a well-trained developer in Silicon Valley, but it doesn't play well if you're a jack-of-all-trades IT leader at a bank in Atlanta or an on-the-move business leader in Amsterdam. How can you tap into big data if you don't have an army of engineers who stay steeped in the latest tech? Fortunately, help is on the way thanks to several trends that move tech from the foreground to the background.

First and foremost, big data is moving into the cloud. Amazon, Microsoft and Google now deliver Hadoop and Spark "as a service," eliminating the need to spin up a cluster, research new software products or worry about version management. Companies are increasingly moving their big data workloads into the cloud, hiding complexity from their users while relying on the world's best data center professionals to manage the tech. They want access to infrastructure when and how they want it, with just as much as they need but no more.

Next up is the emergence of "serverless" computing. This builds on the cloud trend while removing even more tech dependencies. Just load your data and start processing. Tell your cloud provider what you want to do, how much data you want to crunch and where you want to run it – and they will spin up the infrastructure when you need it, and spin it down when you don't. That's incredible for retailers and CPG companies, for example, who may have seasonal businesses and therefore fluctuating data needs throughout the year. They can do critical data analytics as needed without having to pay for robust infrastructure during off-peak periods.

The third big trend is the move to self-service, fueled by new tools and platforms that democratize both data integration and data consumption. "Self-service integration" makes it fast and easy to connect systems, create data pipelines and automate processes without the need for intensive coding.
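
To make the serverless idea concrete, here is a minimal sketch in the shape of an AWS Lambda handler; the event fields are assumptions for illustration. The provider supplies compute only while the function runs and bills nothing while it is idle.

```python
import json

def handler(event, context):
    """Crunch a batch of records on demand; no cluster to provision or manage."""
    records = event.get("records", [])       # event shape assumed for the demo
    total = sum(r.get("amount", 0) for r in records)
    return {
        "statusCode": 200,
        "body": json.dumps({"records_processed": len(records), "total": total}),
    }

# Local usage example; in production the cloud provider invokes handler()
# and charges only for the time it runs.
print(handler({"records": [{"amount": 10}, {"amount": 32}]}, context=None))
```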

Calls grow for Canada to modernize privacy laws amid EU changes

New privacy regulations coming into force in Europe next year are calling into question whether Canada's approach to privacy is keeping up with its global peers. Industry observers suggest that if Canada does not continue to modernize its approach to privacy, it could face roadblocks in maintaining its status as an adequately protected jurisdiction – a status that allows for more fluid trade with the European market.

In May 2018, Europe's new General Data Protection Regulation (GDPR) will come into force and impose sweeping changes on how privacy is protected in the European Union. Businesses with operations there are – or should be – working to prepare for that deadline, but the regulation could affect privacy controls beyond EU borders as well. Right now, Canada has "adequacy" status from the European Commission, which determined in 2001 that Canada's law under PIPEDA (the Personal Information Protection and Electronic Documents Act) was strong enough to ensure that any data transferred from the EU to Canada would be adequately protected. But things are changing. "We cannot take for granted that Canada would be recognized as adequate under the GDPR, because it is very different from our current legislation, and very different from the previous European legislation under which we were deemed adequate," said Chantal Bernier, former interim privacy commissioner of Canada and an adviser in the privacy and cybersecurity practice at law firm Dentons Canada LLP.

The new regulations are far stricter than their predecessors in Europe and the rules in many countries. They will have an impact on marketers, since gathering and storing customers' data has become a valuable part of targeted advertising. Any ad agencies doing business with clients in the EU, or companies targeting ads to potential customers there, will have new rules to contend with – including the law's broadened definition of personal information, which now covers computers' IP addresses. The law also allows individuals in many cases to withdraw their consent for companies to keep their data, particularly if the use of that information is not related to the reason it was collected in the first place. And they have the right to ask to see the data companies hold about them. But the law goes well beyond marketing: it also changes the way companies must handle their own employee data, how they protect against the kind of data breaches that have made headlines in recent years – and how such breaches are reported. Penalties for non-compliance can reach €20-million (almost $30-million Canadian) or 4 per cent of a company's total global revenue, whichever is greater.

Adequacy status is important because it allows for the fluid exchange of personal information between the EU and Canada for commercial purposes. It paves the way for Canadian companies to do business with firms and consumers in Europe. "They know that they are transferring information to a company that is in compliance with the obligations that they are under," Ms. Bernier said. For trade purposes, losing that status would make doing business much more difficult. In any circumstance where data moves digitally across those borders, more onerous measures would be needed to ensure European firms could trust that the Canadian firms are compliant under the new, stricter laws.

What sort of silicon brain do you need for artificial intelligence?

The Raspberry Pi is one of the most exciting developments in hobbyist computing today. Across the world, people are using it to automate beer making, open up the world of robotics and revolutionise STEM education in a world overrun by film students. These are all laudable pursuits. Meanwhile, what is Microsoft doing with it? Creating squirrel-hunting water robots.

Over at the firm's Machine Learning and Optimization group, a researcher saw squirrels stealing flower bulbs and seeds from his bird feeder. The research team trained a computer vision model to detect squirrels, and then put it onto a Raspberry Pi 3 board. Whenever an adventurous rodent happened by, it would turn on the sprinkler system.

Microsoft's sciurine aversions aren't the point of that story – its shoehorning of a convolutional neural network onto an ARM CPU is. It shows how organizations are pushing hardware further to support AI algorithms. As AI continues to make the headlines, researchers are pushing its capabilities to make it increasingly competent at basic tasks such as recognizing vision and speech. As people expect more of the technology, cramming it into self-flying drones and self-driving cars, the hardware challenges are increasing, and companies are producing custom silicon and computing nodes capable of handling them.

Jeff Orr, research director at analyst firm ABI Research, divides advances in AI hardware into three broad areas: cloud services, on‑device, and hybrid. The first focuses on AI processing done online in hyperscale data centre environments like Microsoft's, Amazon's and Google's. At the other end of the spectrum, he sees more processing happening on devices in the field, where connectivity or latency prohibit sending data back to the cloud. "It's using maybe a voice input to allow for hands-free operation of a smartphone or a wearable product like smart glasses," he says. "That will continue to grow. There's just not a large number of real-world examples on‑device today." He views augmented reality as a key driver here. Or there's always this app, we suppose. Finally, hybrid efforts marry both platforms to complete AI computations. This is where your phone recognizes what you're asking it but asks cloud-based AI to answer it, for example.

The cloud's importance stems from the way that AI learns. AI models are increasingly moving to deep learning, which uses complex neural networks with many layers to create more accurate AI routines. There are two aspects to using neural networks. The first is training, where the network analyses lots of data to produce a statistical model. This is effectively the "learning" phase. The second is inference, where the neural network then interprets new data to generate accurate results. Training these networks chews up vast amounts of computing power, but the training load can be split into many tasks that run concurrently. This is why GPUs, with their double floating point precision and huge core counts, are so good at it.

Nevertheless, neural networks are getting bigger and the challenges are getting greater. Ian Buck, vice president of the Accelerated Computing Group at dominant GPU vendor Nvidia, says that they're doubling in size each year. The company is creating more computationally intense GPU architectures to cope, but it is also changing the way it handles its maths. "It can be done with some reduced precision," he says. Originally, neural network training all happened in 32‑bit floating point, but Nvidia has optimized its newer Volta architecture, announced in May, for 16‑bit inputs with 32‑bit internal mathematics. Reducing the precision of the calculation to 16 bits has two benefits, according to Buck. "One is that you can take advantage of faster compute, because processors tend to have more throughput at lower resolution," he says. Cutting the precision also increases the amount of available bandwidth, because you're fetching smaller amounts of data for each computation.
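
The bandwidth-versus-accuracy trade-off Buck describes can be sketched in a few lines of NumPy. This is illustrative only: it mimics 16-bit storage with 32-bit accumulation on the CPU, not what Volta's tensor hardware actually does.

```python
import numpy as np

# Illustrative only: this mimics 16-bit storage with 32-bit accumulation on
# the CPU; Volta does the equivalent in dedicated hardware.
rng = np.random.default_rng(0)
inputs = rng.standard_normal(1_000_000).astype(np.float16)   # 16-bit storage:
weights = rng.standard_normal(1_000_000).astype(np.float16)  # half the bandwidth

half = np.sum(inputs * weights, dtype=np.float16)  # 16-bit accumulation
mixed = np.sum(inputs.astype(np.float32) * weights.astype(np.float32))

print(f"fp16 accumulate: {half}")   # drifts from the reference result
print(f"fp32 accumulate: {mixed}")  # 16-bit inputs, 32-bit internal maths
```

Fetching half as many bytes per value is where the bandwidth gain comes from; keeping the running sum in 32 bits is what protects the result from rounding error.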

How AI will become omnipresent

The resurgence of artificial intelligence in recent years has been fueled by both the advent of cheap, available mass processing capacity and breakthroughs in AI algorithms that allow them to scale and tackle more complex problems. Interestingly, this recent trend is reminiscent of the personal computing revolution of the '80s, when cheaper and more available computing became a catalyst for the mass "computerization" of numerous industries.

Much like AI today, computers and computerization felt cutting edge and new, so companies were setting up computing departments and computerization task forces. By the standards of those days, we are all computer specialists today. Adoption of computers didn't come about overnight. Decades ago, there was high demand for computerization, but its implications for each industry were not clear. People sensed computers were important but weren't 100 percent sure in what way. We had to go through a whole process of development and discovery, and, as a result of computer experts working hand in hand with domain experts over the course of 15 to 20 years, computers and specialized software were developed to suit different needs.

We're following a similar path with AI. We're now at the point where AI is often siloed in specialized departments and where C-suite players intuit how important AI will be but might not be sure how to approach it. Common questions today include "What is AI?" and "How can it help my business?"

Let's look at online content first, specifically website optimization. Most people now are familiar with conversion rate optimization (CRO), where site operators try to maximize conversions by testing new ideas for design, messaging, user experience, and more. AI can make this process more effective by orders of magnitude. We need to figure out how we're judging the AI's solutions and define the world in which it operates. For this example, we judge success by increased conversions (and we can choose whether that means leads or sales or something else) and define the world as a particular website and the changes the AI can make to it (fonts, designs, colors, etc.). We can give the AI information like changes to try (dozens of messages and design ideas), as well as the ability to determine browser type or logged-in status, so the AI can also start segmenting users.

What happens with this approach can be staggering. The AI can find compelling combinations of designs and the audiences those designs resonate with. It can do this by leveraging genetic algorithms, effectively breeding fitter and fitter generations of designs whose children convert more effectively, and repeating the process as the AI bends toward a more optimal configuration. It's worth noting an important aspect of this approach that fits a general definition of AI: it's autonomous. The operator sets parameters and goals, but the AI decides the combination of ideas, always trying to find a better answer and better results against that goal.
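
The genetic-algorithm loop described above can be sketched in a few lines. The design options and the simulated conversion "fitness" below are invented for illustration; a real system would measure conversions from live traffic instead.

```python
import random

# Toy genetic algorithm over design variants. The options and the hidden
# "ideal" design that stands in for measured conversions are both invented.
random.seed(1)
OPTIONS = {"font": ["serif", "sans"], "color": ["red", "blue", "green"],
           "cta": ["Buy now", "Learn more"]}
IDEAL = {"font": "sans", "color": "green", "cta": "Buy now"}

def fitness(design):
    # Stand-in for measured conversions: reward matches with the hidden ideal.
    return sum(design[k] == IDEAL[k] for k in OPTIONS)

def breed(a, b):
    child = {k: random.choice([a[k], b[k]]) for k in OPTIONS}  # crossover
    if random.random() < 0.2:                                  # mutation
        k = random.choice(list(OPTIONS))
        child[k] = random.choice(OPTIONS[k])
    return child

population = [{k: random.choice(v) for k, v in OPTIONS.items()} for _ in range(8)]
for generation in range(20):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                                   # selection
    population = parents + [breed(random.choice(parents), random.choice(parents))
                            for _ in range(4)]
print(max(population, key=fitness))  # converges toward the best-converting design
```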

Old Database Problems In a New Guise: As a Service

Putting all your eggs in one cloud basket is risky, because clouds are not immune to denials of service. According to ScaleArc, if cloud services are the "white knight" sent to solve IT problems, then database as a service is going to fall short. Interest in database as a service – such as Amazon's Aurora, MySQL re-architected for the cloud – is increasing, and Microsoft has jumped into the act with its own relational service, known as Azure SQL. But cloud operations in themselves will impose new problems on database response times and scalability.

When it comes to being "the white knight of enterprise IT needs… cloud services don't free IT from all concerns about availability and performance, despite marketing to the contrary," wrote the database experts at ScaleArc, a supplier of database load balancing software. The P in PaaS "does not stand for panacea… In fact, the cloud introduces shortcomings and inefficiencies that can undermine performance and jeopardize uptime" when it comes to database applications, the authors said. "Millions of users have experienced application lag, data loss, and outages arising from database service limitations baked into platform infrastructures," warns the white paper, A Hazy Horizon: Why the Cloud Doesn't Solve All Your Uptime and Performance Challenges. In effect, ScaleArc argues that you need load balancing middleware between you and the cloud database service for it to perform as expected. In the process, it highlights the chief obstacles to achieving performance and availability with cloud database services: network latency, I/O limitations, scalability, and hypervisor challenges, as well as availability issues. Want to learn more about Amazon's Aurora and other database services? See AWS Expands Database Migration Services, Expands Replication.

In that order, here's what ScaleArc's experts had to say about each:

Network latency: A cloud database server to some extent is only as good as its proximity. How far away is the cloud data center with the database server? The server and its storage could be across town on a high-speed fiber-optic loop, or they could be hundreds of miles away. "Enterprises have no control over the number of or distance between their network hops," the authors warned. Cloud services with a data center in a region near you are more likely to offer the lowest latencies, since distance imposes network delays. By 2025 there will be 485 cloud data centers in the world, according to Cisco's Global Cloud Index, so chances are one is coming to a location near you if there isn't one available already. In the meantime, network-hop latencies impose delays that can cause a database system in the process of updating synchronized data to malfunction. If network latency passes a tolerable threshold, "multiple reconnect attempts may ensue. For some applications, this step might require re-authenticating to the server with each connection attempt." When that happens, kiss effective user response time goodbye. If the database system is synchronizing with a copy in a different geographical region, that synchronization may be speedy or slow, depending on the subsystem's operation and network connection. If the lag exceeds the customer's "threshold for freshness of data," the primary system needs some way of deciding whether another replication point is available in a better timeframe, the authors said.

I/O limitations: Not everyone realizes when they sign up for a cloud database service that there will be limitations on their number of I/Os. "The more highly shared the resources, the worse the issues become," said the ScaleArc experts, especially if the service provider makes no effort to police noisy neighbors that create a lot of I/O traffic. If the cloud provider is trying to spread existing resources across more customers, yielding more profit, it will compound the problem.
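
The reconnect behavior ScaleArc warns about is easy to sketch. In the illustrative Python below, connect() stands in for a real database driver call, and every retry adds user-visible latency; the retry limits and wait times are assumptions.

```python
import random
import time

random.seed(0)  # deterministic demo

def connect():
    """Placeholder for a real database driver call over a flaky, distant link."""
    if random.random() < 0.7:  # simulate a 70% chance the network hop times out
        raise ConnectionError("network hop timed out")
    return "connection"

def connect_with_backoff(max_attempts=5):
    for attempt in range(max_attempts):
        try:
            return connect()
        except ConnectionError:
            wait = (2 ** attempt) * 0.1  # 0.1s, 0.2s, 0.4s ... between tries
            time.sleep(wait)             # every retry adds user-visible latency
    raise ConnectionError("gave up: database service unreachable")

print(connect_with_backoff())
```

If each attempt also had to re-authenticate, as the white paper notes some applications must, the per-retry cost would be higher still.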

Is your culture interfering with data security?

Business is global. This isn't new, nor is it surprising that cultural differences, international laws, and workplace practices differ around the world. Businesses have long sought to harness the strengths of particular cultures and, in other situations, to transplant the culture and values of the company's mother country onto a global labor force.

For example, a company with sites in Japan or Italy may have trouble being notified of security issues due to Italy's "bella figura" or Japan's "mentsu" concepts of keeping face. Employees in those countries may not share the information out of concern for potentially shaming their global counterparts. In such cases, the parent organization may try to impress the value of open communication upon employees in those countries. On the other hand, a company might open research and development offices in Switzerland, Finland, or Singapore because of their high degree of intellectual property protection.

Enterprise-wide security programs should consider how security will be effective in different cultures, the differences in legal and regulatory requirements, how company property is viewed, encryption limitations, and language barriers in order to manage security effectively around the world. Security programs can be more or less effective in different cultures, so it is important to gather support and feedback not only from top management but also from leaders in regional centers with differing cultures. For example, separating the office into different security zones, each requiring authentication, may be well received in Western countries such as the United States, but in Eastern countries like Japan it may be seen as rude and distrustful. Similarly, perceptions and priorities of security may differ between countries, as shown in this global security survey.

Another important global difference is legal and regulatory requirements. The European Union differs greatly from the United States in its privacy laws, so a security program will need to ensure that the requirements of each country's laws are met while still maintaining at least the organization's defined minimum standard of security. Employees from multiple regions working on a single project or the same data will need to follow appropriate procedures to ensure they comply. An organization's response and transparency in handling incidents is related to legal and regulatory requirements, but it also affects a company's brand image.

Why chatbots need a big push from deep learning

Most tech giants are investing heavily in both applications and research, hoping to stay ahead of the curve of what many believe to be an inevitable AI-led paradigm shift. At the forefront of this resurgence are the fields of conversational interaction (personal assistants or chatbots), computer vision, and autonomous navigation, which, thanks to advances in hardware, data availability, and revolutionary machine learning techniques, have enjoyed tremendous progress within the span of just a few years. AI advances are turning problems previously thought to lie beyond the realm of what machines could tackle into commodities percolating through our everyday lives.

Riding the remarkable growth in AI's popularity, a new generation of chatbots has recently flooded the market, and with them the promise of a world where many of our online interactions won't happen on a website or in an app, but in a conversation. Helping turn this promise into reality is a combination of better user interfaces, the omnipresence of smartphones, and new, state-of-the-art machine learning techniques. Perhaps the main driver behind this wave of novel AI applications is deep learning, an area of machine learning that, despite roots stretching back roughly 50 years, has recently revolutionized fields such as computer vision and natural language processing (NLP).

Nonetheless, despite its impressive performance, deep learning alone is not sufficient to solve the challenges chatbots face. The ability to understand context, to disambiguate subtle differences in language that can lead to wildly different meanings, to reason logically, and, most crucially, to understand the preferences and intent of the consumer: these are just a few of the many challenging tasks a system must perform in order to sustain a conversation with a human. The ability to answer complex questions using not only context but also information beyond the confines of the dialog is indispensable for building truly powerful chatbots. To answer questions effectively, the bot needs to rely on information that was shared earlier in the conversation, or even in previous conversations between the bot and the consumer. Moreover, business goals and the intent of the consumer can influence the kind of response the bot gives. If a modern conversation engine hopes to go beyond answering simple, one-level questions, it must blend the most prominent techniques emerging from deep learning with solid statistics, linguistics, other machine learning techniques, and more structured classical techniques such as semantic parsing and program induction.

The first stop in building an intelligent conversational system is data. While we live in an era where endless streams of data are constantly being generated, most of it is too raw to be of immediate use to machine learning algorithms. Deep learning in particular is notorious for needing vast amounts of high-quality data before it can unleash its true potential. Unsupervised learning, the subfield of machine learning devoted to extracting information from raw data without human assistance, offers a promising alternative. Among its many uses, it can be employed to build an embedding model. In plain English, such techniques represent data in a less complex form, making patterns easier to discover.
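To illustrate the embedding idea the excerpt closes with, here is a minimal, self-contained Python sketch (assuming nothing beyond NumPy) that builds word vectors by factoring a co-occurrence matrix with SVD. Production systems learn embeddings from far larger corpora with models such as word2vec; the toy sentences and window choice below are purely illustrative.

```python
import numpy as np

# A minimal embedding sketch: compress a word co-occurrence matrix with
# SVD so that each word gets a small dense vector. The principle -- a
# less complex representation of raw text -- is what the excerpt describes.
sentences = [
    "book a table for two",
    "reserve a table tonight",
    "cancel my reservation",
    "book a flight to boston",
]
tokens = [s.split() for s in sentences]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within each sentence (a crude context window).
cooc = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for w in sent:
        for c in sent:
            if w != c:
                cooc[index[w], index[c]] += 1

# Truncated SVD: keep the top-k singular directions as embeddings.
k = 3
U, S, _ = np.linalg.svd(cooc)
embeddings = U[:, :k] * S[:k]  # each row is a k-dimensional word vector

print("Embedding for 'table':", np.round(embeddings[index["table"]], 2))
```

Words that appear in similar contexts ("book", "reserve") end up with similar vectors, which is the property downstream chatbot components exploit.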

Internet of Things can protect the environment? Get ready to be amazed

The Internet of Things (IoT) has a large role to play in the future of smart cities. IoT can be used in practically all public-service scenarios by governments to make cities environmentally friendly. Sensor-enabled devices can help monitor the environmental impact on cities, collecting details about sewers, air quality, and garbage. Such devices can also help monitor woods, rivers, lakes, and oceans. Many environmental trends are so complex that they are difficult to conceptualize.

IoT is a recent communication paradigm that envisions a near future in which the objects of everyday life are equipped with microcontrollers, transceivers for digital communication, and suitable protocol stacks that enable them to communicate not only with one another but also with users, becoming an integral part of the internet and the environment. IoT environmental monitoring applications usually use sensors to aid environmental protection by monitoring air or water quality and atmospheric or soil conditions, and can even extend to monitoring the movements of wildlife and their habitats. An urban IoT platform can provide the means to monitor air quality in crowded areas, parks, or along fitness trails. Everything from real-time monitoring of ocean water quality through sensors connected to a buoy that sends information via the GPRS network, to the tracking of goods being shipped around the world, to smart power grids that create conditions for more rational production, planning, and consumption can be achieved via microchips implanted in objects that communicate with each other.

Some applications related to the IoT aren't new: toll collection tags, security access key cards, devices to track stolen cars, and various types of identity tags for retail goods and livestock. Other monitoring and tracking systems have more business-oriented uses, such as solving or averting problems (sending a cellphone alert to drivers that traffic is backed up at a particular exit ramp) and increasing efficiency (enabling a utility to remotely switch off the electric meter in a just-vacated apartment). By one estimate, ICT-enabled climate mitigation measures could reduce global greenhouse-gas emissions by 16.5% by 2020 compared with current efforts.
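As a sketch of what the on-device logic of one such air-quality node might look like, the following Python snippet smooths simulated PM2.5 readings with a rolling average and flags sustained breaches of a threshold. The sensor function, threshold, and window size are all assumptions for illustration; a real node would publish alerts upstream (for example, over GPRS or MQTT) rather than print them.

```python
from collections import deque
import random

# Illustrative on-device logic for an air-quality sensor node (all names
# and thresholds are assumptions, not a real platform's API).
PM25_LIMIT = 35.0  # assumed alert threshold, micrograms per cubic meter
WINDOW = 5         # number of readings per rolling average


def read_pm25_sensor():
    # Stand-in for a real sensor driver call; returns a noisy reading.
    return random.uniform(10, 60)


window = deque(maxlen=WINDOW)
for _ in range(20):
    window.append(read_pm25_sensor())
    if len(window) == WINDOW:
        avg = sum(window) / WINDOW
        if avg > PM25_LIMIT:
            # A deployed node would transmit this alert to the platform
            # instead of printing it locally.
            print(f"ALERT: rolling PM2.5 average {avg:.1f} exceeds limit")
```

Smoothing before alerting matters on cheap sensors: a single spiky reading should not trigger a city-wide air-quality alarm.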

Artificial intelligence suggests recipes based on food photos

There are few things social media users love more than flooding their feeds with photos of food. Yet we seldom use these images for much more than a quick scroll on our cellphones. Researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) believe that analyzing photos like these could help us learn recipes and better understand people's eating habits. In a new paper with the Qatar Computing Research Institute (QCRI), the team trained an artificial intelligence system called Pic2Recipe to look at a photo of food, predict the ingredients, and suggest similar recipes.

"In computer vision, food is mostly neglected because we don't have the large-scale datasets needed to make predictions," says Yusuf Aytar, an MIT postdoc who co-wrote a paper about the system with MIT Professor Antonio Torralba. "But seemingly useless photos on social media can actually provide valuable insight into health habits and dietary preferences." The paper will be presented later this month at the Computer Vision and Pattern Recognition conference in Honolulu. CSAIL graduate student Nick Hynes was lead author alongside Amaia Salvador of the Polytechnic University of Catalonia in Spain. Co-authors include CSAIL postdoc Javier Marin, as well as scientist Ferda Ofli and research director Ingmar Weber of QCRI.

The web has spurred a huge growth of research in classifying food data, but the majority of it has used much smaller datasets, which often leads to major gaps in labeling foods. In 2014 Swiss researchers created the "Food-101" dataset and used it to develop an algorithm that could recognize images of food with 50 percent accuracy. Future iterations only improved accuracy to about 80 percent, suggesting that the size of the dataset may be a limiting factor. Even the larger datasets have often been somewhat limited in how well they generalize across populations. A database from the City University in Hong Kong has over 110,000 images and 65,000 recipes, each with ingredient lists and instructions, but only contains Chinese cuisine.

The CSAIL team's project aims to build on this work while dramatically expanding its scope. Researchers combed websites like All Recipes and Food.com to develop "Recipe1M," a database of over 1 million recipes annotated with information about the ingredients in a wide range of dishes.
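The retrieval idea (suggesting recipes similar to a photo) can be sketched generically: embed images and recipes in a shared vector space and rank recipes by their similarity to the query image. The vectors, names, and dimensions below are invented stand-ins for illustration, not the paper's actual model.

```python
import numpy as np

# Generic sketch of embedding-based retrieval: the random vectors stand in
# for learned image and recipe embeddings living in a shared space.
rng = np.random.default_rng(0)
recipe_names = ["margherita pizza", "beef ramen", "caesar salad", "pad thai"]
recipe_embeddings = rng.normal(size=(4, 64))  # stand-ins for learned vectors


def suggest_recipes(image_embedding, k=2):
    """Rank recipes by cosine similarity to the query image embedding."""
    sims = recipe_embeddings @ image_embedding
    sims /= np.linalg.norm(recipe_embeddings, axis=1) * np.linalg.norm(image_embedding)
    top = np.argsort(sims)[::-1][:k]
    return [(recipe_names[i], float(sims[i])) for i in top]


# Query with a fake "photo" embedding; a real system would produce this
# with a trained convolutional network applied to the image.
photo = rng.normal(size=64)
print(suggest_recipes(photo))
```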

That Data Science Colleague is a Business Asset. 5 Tips.

Think about that data scientist colleague you work with. Does she drive you crazy, or is she your greatest business asset? In today's digitally connected business and manufacturing ecosystem, chances are you work with a data scientist or two. However, legacy communication gaps persist between technical and non-technical professionals. Here are five tips for fine-tuning how well you work with data scientists. In my Playbook, these professionals are valuable sales and business assets. Let's explore.

I asked Carla Gentry, data scientist extraordinaire, about her role as a thought-leading data scientist. (Her article on the subject is recommended reading.) Gentry is an immensely experienced data scientist. She works with an organization's leadership, communicating conclusions and recommendations based on existing data and analyses. "The great thing about this field is that it crosses over to so many industries, from banking to finance to marketing, science, social/psychological to consumer packaged goods, sales, etc.," Gentry says.

First, "data science is logic- and math-based." Also, data scientists are holistic thinkers, remaining continuously inquisitive about trends suggested by data. As a result, data scientists are scientists as well as artists, scrutinizing data, asking questions, and applying "what if" analysis. Not only that, they question the status quo of existing assumptions and processes. Data science project outcomes are contingent on specificity in project scope and definition.

If you are a business or sales professional working with a data scientist, determine their level of training and experience. Also, communicate your own level of sales and business experience. That conversation alone develops a more collaborative relationship. On the subject of training, says Gentry, "you end up with [a lot of] 'academics,' but not experience. Those [data science professionals] who have math and engineering backgrounds, or skills that are natural, can hone these skills by practicing using R [an open source statistical programming language] to access free or 'open source' data. Then [they] hope for an internship or a Jr. Data Science position to actually work / gain experience in an analytical field."

Robust Algorithms for Machine Learning

Machine learning is often held out as a magical solution to hard problems that will absolve us mere humans from ever having to actually learn anything. But in reality, for data scientists and machine learning engineers, there are a lot of problems that are much more difficult to deal with than simple object recognition in images or board games with finite rule sets. For the majority of problems, it pays to have a variety of approaches to help you reduce the noise and anomalies so you can focus on something more tractable. One approach is to design more robust algorithms, where the testing error is consistent with the training error, or the performance is stable after adding noise to the dataset.

The idea of any traditional (non-Bayesian) statistical test is the same: We compute a number (called a "statistic") from the data and use the known distribution of that number to answer the question, "What are the odds of this happening by chance?" That number is the p-value. The problem with this approach is that the "known distribution" of that number depends on the distribution of the data. This dependency can be mild, as in the case of Student's t-test or the F-test, or it can be so severe that it renders the value essentially meaningless for statistical purposes. Pearson's "r" (which appears as r-squared in linear regression problems) falls into the latter category: it is so sensitive to the underlying distributions of data that it cannot, in most practical cases, be turned into a meaningful p-value, and is therefore almost useless even by the fairly relaxed standards of traditional statistical analysis. For example, using "r" as a measure of similarity in the registration of low-contrast images can produce cases where "close to unity" means 0.998 and "far from unity" means 0.98, with no way to compute a p-value because the distributions of pixel values involved are extremely non-Gaussian.

Statistics of this kind are sometimes called "parametric" statistics because of their dependency on the parameters of the underlying distributions. Student's t-test, for example, depends on the distributions being compared having the same variance. Robust statistics are called nonparametric precisely because the underlying data can have almost any distribution and they will still produce a number that can be associated with a p-value. The trick is to find a property of the data that does not depend on the details of the underlying distribution. In particular, converting cardinal data values to ordinals (ranks) allows us to ask some very robust questions. Take, for example, the Mann-Whitney U test.
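To see the rank-based robustness in action, here is a short sketch using SciPy's implementation of the Mann-Whitney U test on two deliberately non-Gaussian (exponential) samples. The sample sizes, scales, and shift are arbitrary choices for illustration.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# The Mann-Whitney U test works on ranks, so it yields a meaningful
# p-value even when the underlying distributions are far from Gaussian.
rng = np.random.default_rng(42)
a = rng.exponential(scale=1.0, size=200)        # heavily skewed sample
b = rng.exponential(scale=1.3, size=200) + 0.2  # shifted, still skewed

stat, p_value = mannwhitneyu(a, b, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
# A small p-value suggests the two samples come from different
# distributions, with no Gaussian assumptions required.
```

Compare this with a t-test on the same data, whose equal-variance and normality assumptions both fail here: the U test's p-value remains interpretable because it depends only on the ranks, not on the shape of the distributions.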

Innovation Is as Much About Finding Partners as Building Products

This is an incredible moment for innovation. Previously unthinkable opportunities to reinvent complex, established industries are now being made possible by the convergence of cloud computing, new analytical tools, and the data flowing from a host of new sensors in the physical world. To revolutionize old industries, small and big companies alike must get past competitive angst and embrace their strengths and weaknesses.

Such collaboration can take many shapes. These are new, unique partnership models in which corporations bring assets, the ability to rapidly test and scale, and a deep understanding of the regulatory landscape. Startups inject new technical expertise, and venture capitalists offer funding and access to new talent. To seize the opportunity before us, collaboration across the economy must become universal. Yesterday's model of innovation is no longer adequate; instead, the entire ecosystem must work together. Smart companies, founders, and investors that recognize this have a far better shot at making history than at becoming a footnote to it.

Have you ever stopped to ponder the true complexities involved in trying to create a viable, safe, autonomous vehicle? The innovation alone is a herculean task, but imagine being that upstart pioneer trying to develop the technology while going up against entrenched, powerful competitors with deep industry knowledge, assets, and channels who've been around for a hundred years or more. This is the challenge for all kinds of disruptors, whether in the auto industry, pharmaceuticals, service industries, or healthcare. The fact is, going it alone, we believe, is simply not the way to go at all. Collaboration is the essential new secret sauce for startups and industry leaders alike. For true disruption to take hold, old and new must work together, playing to each other's strengths. Improbable advances are now real possibilities.

Yet the requirements for innovation today are entirely different from those of the last 30 years. The technology-driven disruption model that brought us computing, the internet, and mobile apps is no longer sufficient. Transforming our oldest industries calls for more than new technology; sophisticated knowledge of regulations, testing protocols, and traditional physical assets is now essential. The inadequacies of the old approach are evident in recent startup stumbles, even when the technology is sound.

DNA genetic testing company 23andMe fell behind on its communications with the FDA, resulting in a temporary ban on marketing the company's personal genetic screening services to the public. It took more than two years for 23andMe to secure a green light from the FDA to screen for 10 diseases, the first such approval for a direct-to-consumer test. A123 Systems promised to be a clean-tech success with a soaring IPO in 2009. The company developed lithium-ion batteries that helped convince automakers of their value for hybrid vehicles. But the startup could not keep pace with the development and scale of established battery makers, and production defects led to a $55 million recall in 2012 and contributed to the company's bankruptcy later that year.
Acquired out of bankruptcy by a Chinese company to focus on the burgeoning domestic market, A123 recently announced it will close its Michigan plant as it winds down production of lithium-ion batteries in the U.S. and shifts its local focus to engineering and testing. Even an innovation pioneer like Tesla took a harder path — the company’s reinvention of the automobile meant a massive investment in non-core parts.

Machine intelligence: Build your own vs. as-a-service

Fans of HBO's "Silicon Valley" may recall the plotline earlier this season in which Erlich Bachman secures $200,000 in VC funding for See Food, a camera app that recognizes various kinds of food and instantly surfaces useful information, such as nutritional data. Bachman is 5% technologist and 95% charlatan, give or take, so naturally there's a hitch: See Food doesn't exist. The funding is the result of a misunderstanding that Bachman quickly compounded into a lie. Antics ensue as Bachman, determined to keep the money, attempts to transmute his vaporware into a working prototype.

Here's what struck me: Many of these antics, such as Bachman's attempt to con a class of Stanford undergrads into training a machine learning model, are predictably hilarious, but from a technical standpoint, virtually none of them is implausible. Indeed, whereas just a few years ago an app like See Food would have been literally impossible to build, today it's not a stretch that a few Bachman-like misfits could actually cobble together the pieces. "Silicon Valley" is TV fantasy, but Bachman and company rely on the same resources developers would use in the real world: cloud infrastructure, neural nets, and so on. That these resources have become so accessible, accessible enough to be casually mentioned in a mainstream TV show, is a testament to how much and how quickly things have changed.

Crucially, this shift is about more than camera apps; it's about machine intelligence (MI) moving from niche applications to ubiquity. As we've written previously, thanks to APIs and cloud services, we're entering the first years in which virtually any enterprise that wants to harness machine intelligence will be able to, which means that enterprises that don't will risk being left behind. The prospect of digital ecosystems that widely incorporate MI raises a critical question for CIOs: When it comes to machine intelligence, how do you assess when to build a system from the ground up versus when to invest in third-party solutions?

Machine intelligence requires three ingredients: computing muscle, algorithms, and data. The degree to which a company is strong in any one area informs when that company should go proprietary and when it should fill gaps with third-party as-a-service offerings. Strength should be assessed not only in terms of technologies and budgets but also in terms of human talent and the ability to execute.

Prior to the cloud, harnessing the computing power necessary for machine learning (ML) typically required building one's own supercomputer, a spectacularly forbidding prospect in terms of cost, time, and requisite expertise. Cloud infrastructure has changed that, rapidly diminishing the marginal cost of additional computing power and enabling companies to rent where they previously would have been forced to build or to enter prohibitively expensive partnerships. Consequently, fewer scenarios exist in which building one's own ML compute infrastructure confers a competitive advantage. The expense and effort might be justified if you're building a unique service or require specialized hardware, but even then the benefit is debatable. If a custom system performs 5% better than as-a-service offerings but involves 10 times the cost and takes 10 times longer to develop and deploy, the system can still easily lose money in the end, despite its superior performance. That's not to say all cloud infrastructure is equal, of course.
Top providers such as Google (our employer) and Microsoft don’t just scale up spare computing cores, for example; they build custom chips specifically for MI.
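The 5%-better, 10x-cost tradeoff described above is easy to sanity-check with a back-of-envelope model. The Python sketch below compares the net value of building versus renting over a fixed horizon; every figure in it is an invented assumption for illustration, not a benchmark.

```python
# Toy back-of-envelope model of the build-vs-rent tradeoff described above.
# All numbers are illustrative assumptions, not benchmarks.
def net_value(perf_gain, annual_value, build_cost, yearly_ops, years):
    """Extra value from better performance minus what the system costs.

    perf_gain is the fractional improvement over the as-a-service baseline,
    so the baseline itself uses perf_gain = 0.
    """
    extra_revenue = annual_value * perf_gain * years
    total_cost = build_cost + yearly_ops * years
    return extra_revenue - total_cost


# A custom system that is 5% better but costs 10x more to build and run:
as_a_service = net_value(perf_gain=0.00, annual_value=2_000_000,
                         build_cost=100_000, yearly_ops=50_000, years=3)
custom_build = net_value(perf_gain=0.05, annual_value=2_000_000,
                         build_cost=1_000_000, yearly_ops=500_000, years=3)

print(f"as-a-service net: ${as_a_service:,.0f}")
print(f"custom build net: ${custom_build:,.0f}")
# Even with the 5% performance edge, the custom build comes out far behind
# once the 10x build and operating costs are priced in, and that is before
# accounting for the longer time-to-deploy.
```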