Internet of Things - Technology focus
Scooped by Richard Platt
onto Internet of Things - Technology focus

Nvidia Refocuses Its Plan to Target Servers, Instead Promoting Its GPUs In ARM ... - Trefis

... to develop its own 64-bit ARM-based CPU processor for servers.
Richard Platt's insight:
The server market is currently dominated by x86 processors, which account for well over 80% of total server shipments. When combined with a GPU, an x86 processor can do more work per watt than a standalone x86 processor. ARM processors, known for their low power consumption, can help cut energy use further, an important criterion given the size and energy demands of most HPC (High Performance Computing) systems. AMD believes that ARM CPUs have the potential to account for 20% of the server market by 2016 or 2017.
Internet of Things - Technology focus
Scooped by Richard Platt

Harvard's Michael Porter: Service Leaders Will Be Hard Hit by IoT Revolution

The Internet of Things transformation will have a big impact on the service and maintenance industries. Here are 4 ways service leaders must adapt.
Richard Platt's insight:

Porter, along with Jim Heppelmann, president and CEO of PTC, summarized their findings from a November 2014 HBR article about how the Internet of Things (IoT) is disrupting various industries, field service included. The pair also spoke about ongoing research on the implications for company strategy and organization, which will be published in HBR later this year.

“Not only is the product changing, the product change is feeding back and changing how companies operate today,” Porter said. “How you run a company is going to change much more dramatically than in previous generations of IT.” 

1. Service businesses will shift from reactive to proactive:  There will be a transformation in the way service businesses are run and organized, as connected products allow technicians to diagnose the problem, or even perform service, remotely. Companies will be able to push updates to products in the field, and analyze product usage data to improve service efficiency and warranty management. As new IoT-enabled technologies take hold, service companies will move beyond the repair model to data-enabled advanced services that add value to customers. Ultimately, Porter said, this model will evolve to “product-as-a-service” as companies design new functionality and extend product life cycles. 

2. Big data will create an entirely new section of the value chain:    Companies will find ways to create value from the constant data stream from both internal and external factors. (Internal data could be product usage and equipment performance information, while external data could include weather conditions.)  Porter and Heppelmann said that, rather than having each division deal with its own data separately, companies need to create a “unified data group,” led by a chief data officer, that can store, aggregate and analyze the data — and work closely with other divisions to uncover insights that create customer value.

3. Product design will require a long-term, integrative approach:  Product design will become “evergreen,” said Porter, meaning products will be continuously re-designed and serviced via remote connections and services once they’re in the field. As a result, companies must find a new approach to product design that accounts for everything that happens after the sale is closed.  In addition, increased connectivity will require manufacturers to look at products within a larger, networked system. A “smart” tractor, for example, will have its own data analytics connections, but it must also interface with other smart machines on the farm.

4. Expect more consolidation and a war for talent: Porter frames the changes led by the IoT as an opportunity for companies to broaden their offerings and lead with innovative product functionality. There are two choices: cling to business as usual, or adapt. Companies that don’t react will have their products subsumed by companies that do. Porter predicts this will lead to further consolidation across industries, allowing companies to expand their market and products through data and IoT functionality. - But the biggest hurdle, Porter said, is likely to be the war for talent. There are currently too few people with the necessary mix of skills to tackle the new challenges presented by the IoT era.



Scooped by Richard Platt

Intel unveils Maker Lab in India to focus on mobile devices, IoT - Tech2

Intel India on Thursday unveiled Intel India Maker Lab and Intel India Maker Showcase to power product innovation in India in Internet of Things (IoT), mobile devices and other computer focused domains.
Rescooped by Richard Platt from Advanced Threats,Intelligence Technology,CyberSecurity

Google's Data Architecture and What it Takes to Work at Scale.

Malte Schwarzkopf — currently finishing his PhD on “operating system support for warehouse-scale computing” at the University of Cambridge — has released a series of slides describing some of his research into large-scale, distributed data architectures. Schwarzkopf and his team at Cambridge Systems at Scale are aiming to build the next generation of software systems …

Via Constantin Ionel Milos / Milos Constantin
Richard Platt's insight:

Schwarzkopf and his team at Cambridge Systems at Scale are aiming to build the next generation of software systems for large-scale data centers. So it has been essential for him to understand how some of the current data giants configure their full stack at present, in order to build software for the next wave of businesses that grow with a need to work at a similar scale. Along the way, he has contributed to a number of open source projects, including DIOS (a distributed operating system for warehouse-scale data centers that uses an API based on distributed objects); Firmament (a configurable cluster scheduler that looks to apply optimization analysis over a flow network); Musketeer (a workflow manager for big data analytics); and QJump (a network architecture that reduces network interference and provides low-latency messaging). Schwarzkopf's slide deck builds on his extensive bibliography on the Google stack. His research finds that warehouse-scale computing (defined as 10,000-plus machines) requires a different software stack, all aiming to help increase the utilization of many-core machines and to allow fast, incremental stream processing and approximate analytics (like that offered by BlinkDB) on large datasets. (Many-core indicates an order of magnitude more cores than multi-core.) Schwarzkopf's research spells out the main characteristics that many of the largest data-driven companies, like Microsoft, Twitter and Yahoo, have in common with Google and Facebook:

*  “Frontend serving systems and fast backends.
*  Batch data processing systems.
*  Multi-tier structured/unstructured storage hierarchy.
*  Coordination system and cluster scheduler.”
In his presentation, “What does it take to make Google work at scale?” Schwarzkopf discusses the architecture behind those 139 microseconds between submitting a search request in the Google input bar, and the pages of ads-and-search results that are returned.
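
A rough illustration of the "frontend serving systems and fast backends" pattern described above: the hypothetical Python sketch below (not code from Schwarzkopf's slides or Google's stack; the shard count and 10 ms budget are made-up numbers) fans a query out to backend shards in parallel and enforces a latency budget, returning whatever results arrive in time and dropping stragglers.

    # Hypothetical scatter-gather frontend with a latency budget.
    # Not Google's implementation; SHARD_COUNT and the 10 ms budget are invented.
    import asyncio
    import random

    SHARD_COUNT = 8
    LATENCY_BUDGET_S = 0.010  # total time allowed for backend answers

    async def query_shard(shard_id: int, query: str) -> str:
        # Stand-in for an RPC to an index shard.
        await asyncio.sleep(random.uniform(0.001, 0.015))
        return f"shard{shard_id}: results for {query!r}"

    async def frontend(query: str) -> list[str]:
        tasks = [asyncio.create_task(query_shard(i, query)) for i in range(SHARD_COUNT)]
        done, pending = await asyncio.wait(tasks, timeout=LATENCY_BUDGET_S)
        for task in pending:
            task.cancel()  # drop stragglers so the page ships on time
        return [t.result() for t in done]

    if __name__ == "__main__":
        print(asyncio.run(frontend("warehouse-scale computing")))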

Scooped by Richard Platt

TRIZ: A Powerful Methodology for Creative Problem-Solving

TRIZ is a problem solving methodology based on logic, data and research, not intuition.
Richard Platt's insight:

Projects of all kinds frequently reach a point where as much analysis as possible has been carried out, but the way forward is still unclear. Progress seems blocked, and if the project team is to move forward, it must develop creative solutions to the problems it faces. You'll already know about techniques such as brainstorming, which can help with this sort of situation. However, this type of approach, which depends on intuition and the knowledge of the members of the team, tends to have unpredictable and unrepeatable results. What's more, a huge range of possible solutions can be missed, simply because they're outside the experience of the project team. TRIZ is a problem-solving methodology based on logic, data and research, not intuition. It draws on the past knowledge and ingenuity of many thousands of engineers to accelerate the project team's ability to solve problems creatively. As such, TRIZ brings repeatability, predictability, and reliability to the problem-solving process with its structured and algorithmic approach. "TRIZ" is the (Russian) acronym for the "Theory of Inventive Problem Solving." G.S. Altshuller and his colleagues in the former USSR developed the method between 1946 and 1985. TRIZ is an international science of creativity that relies on the study of the patterns of problems and solutions, not on the spontaneous and intuitive creativity of individuals or groups. More than three million patents have been analyzed to discover the patterns that predict breakthrough solutions to problems, and these have been codified within the TRIZ body of knowledge. And it is spreading into corporate use across several parallel paths – it is increasingly common in Six Sigma processes, in project management and risk management systems, and in organizational innovation initiatives. TRIZ research began with the hypothesis that there are universal principles of creativity that are the basis for creative innovations and that advance technology. The idea was that if these principles could be identified and codified, they could be taught to people to make the process of creativity more predictable. The three primary findings of the last 65 years of research are as follows:

  1. Problems and solutions are repeated across industries and sciences. By classifying the "contradictions" (see later) in each problem, you can predict good creative solutions to that problem.
  2. Patterns of technical evolution tend to be repeated across industries and sciences.
  3. Creative innovations often use scientific effects outside the field where they were developed.

Much of the practice of TRIZ consists of learning these repeating patterns of problems-solutions, patterns of technical evolution and methods of using scientific effects, and then applying the general TRIZ patterns to the specific situation that confronts the developer / problem solver.
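
To make the idea of repeating problem-solution patterns concrete, here is a minimal sketch of how a TRIZ contradiction-matrix lookup could be expressed in code. The matrix entries below are illustrative placeholders, not an excerpt of the published 39x39 matrix; only the principle names are real.

    # Illustrative TRIZ contradiction-matrix lookup; matrix entries are placeholders.
    INVENTIVE_PRINCIPLES = {
        1: "Segmentation",
        15: "Dynamization",
        35: "Parameter changes",
        40: "Composite materials",
    }

    # (improving parameter, worsening parameter) -> suggested principle numbers
    CONTRADICTION_MATRIX = {
        ("weight of moving object", "strength"): [1, 40],
        ("speed", "energy use"): [15, 35],
    }

    def suggest_principles(improving: str, worsening: str) -> list[str]:
        """Return the inventive principles suggested for a technical contradiction."""
        numbers = CONTRADICTION_MATRIX.get((improving, worsening), [])
        return [f"{n}: {INVENTIVE_PRINCIPLES[n]}" for n in numbers]

    if __name__ == "__main__":
        # e.g. "make the object lighter without losing strength"
        print(suggest_principles("weight of moving object", "strength"))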

Scooped by Richard Platt

Build Data Quality Into the Internet of Things

With traditional data, we’ve become used to the idea that people are the main sources of issues relating to data, CIO Journal Columnists Thomas H. Davenport and Thomas C. Redman write. But the emergence of connected devices driving ever more complex industrial processes demands a new response from companies on how best to guarantee data quality.
Richard Platt's insight:

Standards are the obvious answer, but they take a devilishly long time and much effort. For example, the development of the EPCGlobal (electronic product code for radio frequency identification) standard took about 15 years. The development of the ANSI X12 standard for electronic data interchange took about 14 years. We don’t want to wait that long for anything these days, and standards development could really slow down the process of the IoT movement.

Beyond understanding the issues and trying to help establish standards, what must a technologist actually do?  We take as a general rule of thumb that bad data is like a virus. There is no telling where it will end up or the damage it will cause. With viruses the basic idea is to try to prevent the virus in the first place and do all you can to contain it.  -  For data quality and the Internet of Things, preventing the virus means excellent design, manufacturing, and installation of the IoT device. Since such devices are typically made by someone (a semiconductor manufacturer, for example) other than the user, buyers must insist the device actually measures what it purports to measure. This implies both specification of the intended measurement and rigorous testing, under both laboratory and real-world conditions, to ensure that is what actually occurs. What is a “step,” for example, and does the device actually count them properly?

The specification should spell out operating conditions. Recall the example we noted earlier of a health-tracking device not working so well on the treadmill. The specification should spell out everything you'll need to use the device successfully in practice: what you need to do to install and test it, its expected lifetime, how you'll know when it is time to maintain or replace the device, and so forth. Insist on two levels of calibration from your device supplier. First, there should be rigorous calibration before the device leaves the factory, and an "on-installation" calibration routine to ensure that the device works as expected. Second, ongoing calibration is required to make sure the device continues to work properly. Ideally, the on-installation and ongoing calibration routines should be built in and automated. To contain bad data, devices should come equipped with what we call "I'm not working right now" and "I'm broken and must be replaced" features, which do exactly what their names suggest. Finally, you should not expect perfection, particularly with new devices. But you must insist on rapid improvement. So it is critical that the manufacturer aggregate and analyze the results of all these steps, looking for patterns that suggest improvements. Seek answers to basic questions such as: Can the devices really be trusted? Are they lasting as long as expected? What is causing them to fail?
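
A minimal sketch of the containment idea above, assuming a simple reading format of my own invention (not from Davenport and Redman): each reading carries the device's self-reported status, and downstream code quarantines anything the device flags rather than letting it flow silently into analytics.

    # Hypothetical sketch: readings carry a self-reported device status so bad
    # data can be contained; the status names and sample values are invented.
    from dataclasses import dataclass
    from enum import Enum

    class DeviceStatus(Enum):
        OK = "ok"
        NOT_WORKING_RIGHT_NOW = "not working right now"    # temporary fault
        BROKEN_REPLACE_ME = "broken and must be replaced"  # permanent fault

    @dataclass
    class Reading:
        device_id: str
        value: float
        status: DeviceStatus

    def partition_readings(readings):
        """Split readings into trusted data and a quarantine for later analysis."""
        trusted, quarantined = [], []
        for r in readings:
            (trusted if r.status is DeviceStatus.OK else quarantined).append(r)
        return trusted, quarantined

    if __name__ == "__main__":
        batch = [
            Reading("hvac-01", 21.5, DeviceStatus.OK),
            Reading("hvac-02", -40.0, DeviceStatus.NOT_WORKING_RIGHT_NOW),
        ]
        good, bad = partition_readings(batch)
        print(len(good), "trusted,", len(bad), "quarantined")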

Scooped by Richard Platt

Intel reveals Big Data's Dirty Little Secret

Companies are spending billions on tools and engineering for big data-styled analysis, though many are hampered by one little problem: They still don't know what to do with all the data they collect.
Richard Platt's insight:

"This is the dirty little secret about big data: No one actually knows what to do with it," Jason Waxman, an Intel vice president and general manager of the company's cloud platforms group, said Thursday in a webcast for investors.  "They think they know what to do with it, and they know they have to collect it, because you have to have a big data strategy. But deriving the insights from big data is a little harder to do,".   Big data is all about collecting large amounts of sensor or process data, the analysis of which can lead to insights into customer behavior and point the way to improvements in operational efficiency.    -   Intel is interested in the big data market because big data systems will require lots of processor-driven hardware, preferably Intel's.   Today, big data sparks about $13 billion a year in IT spending, a figure Intel estimates will balloon to $41 billion by 2018, with at least $2 billion or so of that money earmarked for hardware.

To get value from big data, enterprises must get past a number of hurdles, Waxman said.  The company talked to a number of organizations to find out more about their use, and anticipated use, of big data. It found that the number one challenge is figuring out how to extract value from the data.  It's a demanding task. Organizations need the right talent to assemble and run big data systems, which requires skills in statistics and analytical reasoning in addition to the more usual programming and system administration.

Scooped by Richard Platt

8 IoT Products That Kind Of Made Us Laugh

The hype surrounding the Internet of Things has led to some unique products for consumers. CRN looks at some of the sillier ones.
Richard Platt's insight:

My favorite, Lixil's Satis Bluetooth toilet, allows customers to flush and clean the toilet through an app on their smartphones. The smart toilet also includes a built-in night light and automated cleansing features. The Japanese product made headlines when Forbes highlighted the fact that it was vulnerable to hacks, which could cause the toilet to scream and spray customers with the bidet while opening and closing its lid, according to a Trustwave security advisor.

Rescooped by Richard Platt from Future of Cloud Computing and IoT

Gartner on Medical 3D Printing - 3D Printing Industry

Market research firm Gartner has laid out its thoughts on the future of 3D printed medical devices and bioprinting with the 2015 Hype Cycle on 3D Printing.

Via massimo facchinetti
Richard Platt's insight:

Gartner is widely regarded for its ability to analyze technological trends, relying heavily on its now famous Hype Cycle. And the market research firm doesn't just get into the technology, but the society surrounding it, even predicting in one report that there would be widespread social upheaval. More recently, the firm created a specialized model for outlining specific trends in 3D printing, with the 2015 Hype Cycle for 3D Printing outlining various applications within the industry and their point of maturity versus popular expectations. And, with this tool, it has determined that medical 3D printing has just hit the Peak of Inflated Expectations, meaning that, soon, we'll no longer be wowed by 3D printing in the medical space, as the tech is legitimately incorporated regularly in specialist medical applications. Bioprinting for transplantable organs is listed as a separate category, also 5-10 years from maturity, but further from pure hype inflation. Michael Shanler, research director at Gartner, says of this specific application, “Some of these R&D systems are already capable of printing cells, proteins, DNA and drugs, however there are significant barriers to mainstream adoption.” He adds, “The sheer complexity of the items to be printed and the high maintenance requirements of these systems mean that initial deployments will be mostly limited to specialist service providers. We see mainstream adoption increasing as the systems become more diverse in their functions.” 3D Bioprinting Solutions may believe it can 3D print an organ in 1-2 years, but, even if it does, there won't be a top 10 list of organ printers for another 5-10 years. Outside of medicine, however, Gartner believes that there are a number of spaces in which the technology is only two to five years from becoming mature, which will, in turn, see 3D printing move from specialty uses to broader usage. These include 3D scanning, service bureaus, and 3D printing software, says Gartner research vice president Pete Basiliere. With CAD software being made simpler for consumer use, coupled with repositories of 3D printables, consumers have greater access to 3D models. 3D scanners, too, give consumers a wider range of printable options as they drop dramatically in price. And, because they can use service bureaus to have these objects printed, rather than printing them on a home machine, they can experiment with the technology more and more. Basiliere explains, “Advancements outside of the actual printers themselves may prove to be the catalyst that brings about widespread adoption. Technologies such as 3D scanning, 3D print creation software and 3D printing service bureaus are all maturing quickly, and all — in their own way — have the potential to make high quality 3DP more accessible and affordable.”

Jake D'Imperio gis's curator insight, August 30, 2:28 AM

This article explains the predicted prevalence of 3-D printing in the medical field. This will make surgery with bones easier and less expensive and will save and improve the lives of many.  This is exciting news as the normal cost of hip replacement is thousands of dollars and the cost will be significantly reduced.

Scooped by Richard Platt

Power Monitoring and Management Solutions for Facilities and IT Managers

The data center landscape is constantly evolving. Over the past ten years, with the emergence of the Internet of Things (IoT), facility managers are now equipped with the tools necessary to keep critical machines up and running 24/7.
Richard Platt's insight:

Enterprise-grade power management solutions are designed for use in larger data centers with more complex requirements, and they come with more extensive and advanced capabilities than entry-level systems. Specifically, the best enterprise-grade power management solutions give IT and facilities managers the ability to: generate detailed energy-efficiency reports and calculate key energy-efficiency metrics such as Power Usage Effectiveness (PUE) and Data Center Infrastructure Efficiency (DCiE); balance electrical loads to minimize peak demand and maximize energy cost savings; identify inefficient data center hot spots that can put wasteful strains on a facility's cooling systems; measure power consumption on a per-workload basis to support energy chargeback initiatives; and provide capacity planning data. Power monitoring and management are complex tasks, but they are critical to maintaining nonstop uptime. To perform those tasks effectively, most facilities need one or more infrastructure management solutions, a virtualization management solution and a specialized power monitoring and management solution. Data center operators who wish to stop utility outages and power system malfunctions from disabling mission-critical IT systems should ensure they have the right set of onsite solutions and remote services for their specific environment.
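
For reference, PUE and DCiE are simple ratios of facility power to IT power; the short sketch below shows the standard calculations (the 500 kW and 350 kW figures are purely illustrative, not from the article).

    # Standard PUE / DCiE calculations; the kW figures are illustrative only.
    def pue(total_facility_power_kw: float, it_equipment_power_kw: float) -> float:
        """Power Usage Effectiveness: total facility power / IT equipment power."""
        return total_facility_power_kw / it_equipment_power_kw

    def dcie(total_facility_power_kw: float, it_equipment_power_kw: float) -> float:
        """Data Center Infrastructure Efficiency: IT power as a % of total power."""
        return 100.0 * it_equipment_power_kw / total_facility_power_kw

    if __name__ == "__main__":
        total, it = 500.0, 350.0  # example: 500 kW into the facility, 350 kW to IT gear
        print(f"PUE  = {pue(total, it):.2f}")    # ~1.43
        print(f"DCiE = {dcie(total, it):.1f}%")  # ~70.0%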

Scooped by Richard Platt

IoT: Redefining Industries

IoT is the evolution of mobile, home and embedded applications that are being connected to the internet and integrated with greater computing capability, using data and analytics to extract meaningful information. This convergence of things, data and processes will transform our lives, businesses and everything in between. The internet is the heart of this technology and hardware is the spinal cord. Analytics performed on data acquired from sensors will result in an intelligent system. Of course, the primary value in an IoT system is in the ability to perform analytics on the acquired data…
Richard Platt's insight:

For example, a smart traffic camera identifies congestion caused by an accident. That insight can be sent to a city-wide transportation system, which can analyse the impact on other city systems. Recognising that the accident is near the airport and nearby offices, it can inform those systems so they can adjust flight and office schedules. The system can also derive an optimum route around the accident and inform people. This is just one example of the potential benefit that can arise when intelligent systems share insights with other systems, creating an ever-expanding system.

Scooped by Richard Platt

Chinese Government’s Driving Cloud Computing, Looking to Surpass Developed Countries

China has the potential to surpass developed countries in boosting its cloud computing industry. The government's efforts, particularly the "Internet Plus" plan, are significant to the industry's domestic growth.
Richard Platt's insight:

Please understand that when you read this article, you are reading the promotional materials of the Chinese government, trying to make a play for the larger IoT space that Western capitalist MNCs already own. China wants to own a large piece of that IoT pie; it is up to you whether or not you believe them, but in any event they are not letting the IoT, cloud computing or anything related to this third wave of IT pass them by. The "Internet Plus" scheme, as noted by Premier Li Keqiang, entails the integration of modern manufacturing with the Internet of Things, big data, mobile Internet and cloud computing. According to Zhao Yi, an associate researcher at the Institute of Computing Technology, Chinese Academy of Sciences, China's fast growth and breakthroughs in the cloud computing industry are attributed to breaking the monopolistic positions of the U.S. and Europe. Zhao added that the industry will still go far, as many other industries--banking, gaming and telecom, among others--need cloud platforms for further growth. It was in 2008 when China began exploring cloud computing. And for Huawei's cloud computing marketing director Zhang Jianhua, the country's basic education on this industry, coupled with the quick spread of the Internet, has laid a solid foundation for cloud computing development. Zhang also said that China is now on the same level as the rest of the world in terms of fundamental cloud computing technologies. Government statistics show that for the first half of this year, the country posted the fastest growth in the information technology, big data and cloud computing industries: 22.1%. For several analysts, cloud computing will become one of the main driving forces for China to speed up its economic development.

Scooped by Richard Platt

Internet of Things: the ticking bomb

The spectrum of security risks and conceivable attack vectors affecting IoT devices is diverse, with varying degrees of probability and concern.
Richard Platt's insight:

Reality Check: Despite acute awareness of potential risks, corporations are unable to completely mitigate the situation for a number of reasons. Many home users are ill-equipped to ensure security across all layers of the Internet Protocol Suite for devices on their home networks. Employing security best practices and configurations across the Physical, Link, Internet, Transport and Application layers is simply beyond most users. And similar to what occurs in many Bring Your Own Device (BYOD) situations, users are often highly resistant to allowing corporate governance and policies to fully apply to their personal devices. The problem is compounded by the sporadic industry focus on ensuring strong security measures are built into IoT solutions. Firstly, many IoT devices are produced by hardware manufacturers who may have limited experience and knowledge in securing the upper layers of the Internet Protocol Suite. Secondly, incorporating defense-in-depth solutions adds additional costs to the manufacture of any device – i.e. through increased costs of development to better design and test secure devices; or building more powerful devices to better support security functions such as encryption; as well as providing ongoing patching and maintenance services to respond to new security threats. The Shared Challenge: Yes, the IoT future is exciting, with the potential to greatly enrich personal lives and daily interactions. The question then becomes: what strategies can we implement to balance fast-paced innovation and security in the IoT ecosystem? The promising news is that there is significant attention given to this topic. For instance, the US Federal Trade Commission released a detailed research report earlier this year, highlighting key requirements needed across the industry and regulatory landscape to ensure the risks do not outweigh the social benefits. Another improvement would be greater collaboration within the IoT community. In a capitalistic economy, competition is both anticipated and healthy. While efforts are underway around device interoperability between industry vendors, efforts are not as pronounced around security. Opportunities exist to better address the problem as a collective industry rather than as competitors.

Rescooped by Richard Platt from Future of Cloud Computing and IoT

The History of Cloud-based File Sharing

Here's a look at the past, present and future of cloud computing to help MSPs understand where we’ve been and prepare for the changes the industry will undergo in the future.

Via massimo facchinetti
Shalyn Jooste's curator insight, August 27, 7:01 AM

There is no such thing as an original idea... Who would have guessed you could trace this back so far?

This most certainly made for an interesting read.

Scooped by Richard Platt

Europe doesn't properly recycle most of its electronic waste

You probably know that you should recycle your old tech when you're done with it, but getting other people to do the same? That's quite hard, apparently. The Un...
Richard Platt's insight:

The United Nations and INTERPOL have found that only 35 percent of the European Union's electronic waste in 2012, about 3.2 million imperial tons, was recycled properly. The rest (6.1 million tons) was either exported, recycled improperly or trashed. And that's a problem beyond just the expected environmental issues, such as toxins making their way into landfills. Many crooks take advantage of this lapse by scavenging and smuggling e-waste -- that old laptop you chucked out might be a gold mine for a bootlegger hoping to sell its parts or raw metals.   So what is the EU to do? Researchers suggest tighter coordination between police and other relevant outfits, for one thing. They're likewise proposing a much tougher legal stance, including mandatory disposal standards and a ban on cash transactions in scrap metal sales. Whether or not those efforts would be enough is another matter.

Scooped by Richard Platt

​Formula 1 racing: Sensors, data, speed, and the IoT

The CIO of Williams Martini Racing explains how the famous team instruments both cars and drivers in the quest to win.
Richard Platt's insight:

It's a highly technical sport and the technology constantly evolves.

The world of Formula 1 racing is fascinating, and so I invited the Chief Information Officer of Williams Martini Racing to be a guest on CXOTalk. Williams is one of the foremost names in Grand Prix racing and its CIO, Graeme Hackland, has worked in the sport for many years. During the CXOTalk conversation, Graeme discusses the technology and infrastructure Williams uses to conduct its racing activities. Imagine a complete technology infrastructure that travels from track to track around the world, supporting the race cars and team. Here is the entire video of that discussion (click through to view):

Scooped by Richard Platt

Israeli IoT Company Augury raises $7M

The predictive machine diagnostics company is attempting to redefine the predictive maintenance market.
Richard Platt's insight:

Predictive machine diagnostics company Augury today announced a $7 million Series A funding round led by Formation 8 Hardware Fund and joined by Pritzker Group Venture Capital. Existing investors First Round Capital and Lerer Hippeau Ventures also participated in the funding. The capital will be used to accelerate product development, expand Augury's sales and marketing, and support the company's ongoing growth. Augury has offices in New York and in Haifa, Israel. Augury is attempting to redefine the predictive maintenance market, one that has remained inaccessible and expensive for decades. Augury is bringing its proprietary algorithms, smart sensing device and mobile diagnostics tool to new markets, starting with diagnosing HVAC (heating, ventilation and air conditioning) systems within commercial buildings. The company's technology has the potential to save billions of dollars in maintenance and energy costs. In doing so, Augury is helping to create a much larger market for predictive maintenance and will eventually expand its reach to diagnosing the broader IoT (Internet of Things).

Augury CEO Saar Yoskovitz said, “Augury is ‘Shazam’ for machines. Using smartphone technology, it has made IoT accessible and affordable for legacy systems and large industries. Formation 8 recognizes Augury as a future market leader in the Industrial IoT space, with the potential to capture the multi-billion dollar market opportunity in front of it. Additionally, Formation 8 Hardware Fund is a recognized leader when it comes to investing at the intersection of hardware and software and we are thrilled to work with the team and our other new partners to take advantage of the real strategic value they bring to the table."

Scooped by Richard Platt

Research: Sensors enabling a $70B wearable technology market by 2025

CAMBRIDGE, England--(BUSINESSWIRE)-- The IDTechEx Research report “Wearable Sensors 2015-2025: Market Forecasts, Technologies, Players” forecasts that wearable sensors will drive the wearable technology ...
Richard Platt's insight:

There will be over 3 billion sensors in wearable technology devices by 2025, with more than 30% being emerging sensor types. Wearable Sensors 2015-2025: Market Forecasts, Technologies, Players is a brand new IDTechEx Research report providing the only up-to-date and expert analysis of every prominent sensor type for wearable technology.   -  Sensors are the most diverse component type in wearable devices, and they also enable the key functions that will go into wearable devices and make them be worn. Advances with wearable sensors are a vital driver for the future of wearable technology. Their incorporation alongside new energy harvesting and energy storage techniques, efficient power management systems and low power computing, in form factors that will be increasingly flexible, fashionable and invisible will enable the wearable technology market to reach $70bn by 2025.  The IDTechEx Research report gives detailed coverage of the 15 most prominent sensor types in wearable technology today, including inertial measurement units (accelerometers, gyroscopes, magnetometer and barometers), optical sensors (including optical heart rate monitoring, PPG and cameras), wearable electrodes, chemical sensors, flexible stretch, pressure and impact sensors, temperature sensors, microphones and other emerging wearable sensors.  For each sensor, the technologies and major players are described, backed up by detailed interviews and company profiles of key bodies in each sector. The report also views the big picture, discussing the implications of sensor fusion and the relative merits of each sensor type for various applications. Extensive primary research is used to produce detailed market forecasts for each sensor type over the next decade, illustrating the trends that will be prominent in the industry. Case studies of the key trends include regulatory implications for healthcare systems, ease of commoditization in infotainment devices and the possibilities presented by sensor fusion.
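
As a small illustration of the sensor-fusion point above, here is a generic complementary-filter sketch (not from the IDTechEx report, and not any vendor's algorithm): it combines a gyroscope, which is smooth but drifts, with an accelerometer, which is noisy but drift-free, to estimate a wearable's tilt angle. The weighting, sample rate and readings below are made up.

    # Generic complementary-filter sensor fusion for an IMU tilt estimate.
    # The ALPHA weight, sample rate and readings below are invented.
    import math

    ALPHA = 0.98  # weight given to the integrated gyroscope angle

    def fuse(angle_deg: float, gyro_rate_dps: float, accel_x_g: float,
             accel_z_g: float, dt_s: float) -> float:
        gyro_angle = angle_deg + gyro_rate_dps * dt_s                 # integrate gyro
        accel_angle = math.degrees(math.atan2(accel_x_g, accel_z_g))  # tilt from gravity
        return ALPHA * gyro_angle + (1.0 - ALPHA) * accel_angle

    if __name__ == "__main__":
        angle, dt = 0.0, 0.01  # 100 Hz sampling
        samples = [(5.0, 0.05, 0.99), (5.0, 0.06, 0.99), (5.0, 0.07, 0.99)]
        for gyro_dps, ax, az in samples:
            angle = fuse(angle, gyro_dps, ax, az, dt)
        print(f"estimated tilt: {angle:.2f} degrees")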

Scooped by Richard Platt

NSA Mass Surveillance: Biggest Big Data Story

When people talk about the US National Security Agency/Central Security Service (NSA), the talk usually centers on privacy, with good reason. Still, it’s not the only subject worth discussing. The volume of data collected by the NSA and the associated costs make it the ultimate in Big Data case studies. [...]
Richard Platt's insight:

Very interesting read.  "When people talk about the US National Security Agency/Central Security Service (NSA), the talk usually centers on privacy, with good reason. Still, it’s not the only subject worth discussing. The volume of data collected by the NSA and the associated costs make it the ultimate in Big Data case studies. What can it tell us about data and business? What can it tell us about business risk and the potential benefits and consequences of Big Data investments?   -  The agency’s exact budget is a government secret, but estimates put it around $10 billion per year. Although not all of that is devoted to surveillance, it’s reasonable to conclude that something in the ballpark of $5 billion goes to fund NSA data gathering each year. This may not be the clear-cut biggest Big Data application (Google’s revenue was $66 billion last year, for example), but it’s substantial, focused and paid for by the public. We ought to discuss what we’re getting for the money.   The budget is not the only cost of any Big Data program. Data gathering and analysis has an impact on public perception and everyday business practices. Do it wrong, and you could run into a lot of costs you never expected. NSA programs have led to costs that the government and public may not have anticipated: correcting functional problems, lost business to US companies, additional security costs to US individuals and businesses seeking to protect private data and the lost influence due to damaged credibility of US government and businesses.   Spies have always depended on communication surveillance to obtain information. Stealing documents, listening in on conversations and cracking the codes of secret messages are basics of the profession. Electronics have been part of the mix for decades: the British used an elaborate electronic surveillance system to listen in on captured German officers during the 1940s. What’s new is the volume and breadth of information gathered.    Communication surveillance is a major part of the NSA’s mission (paired with protecting sensitive US communications). Years before Edward Snowden leaked details of the NSA’s mass surveillance of US citizens, Evan Coldewey of TechCrunch reported “NSA to store yottabytes of surveillance data in Utah mega-repository”, though that figure was quickly challenged and a later update revised the figure to “not so much”. While Coldewey, writing in 2009, may have been a little off-base on the quantity, he was right on target when he said the purpose was to store data from extensive surveillance programs. In 2012, James Bamford of Wired placed the cost of building that data repository at $2 billion, and quoted an unnamed NSA official stating, “Everybody’s a target; everybody with communication is a target.”

Scooped by Richard Platt

Consumer Reports says new Tesla is best car ever

The new Tesla P85D scored a 103 out of 100 on a new Consumer Reports test, so the company had to totally reconfigure its rating system. The official score is now a mere perfect 100.
Richard Platt's insight:

The new Tesla P85D scored 103 out of 100 on a new Consumer Reports test, so the organization had to totally reconfigure its rating system; the official score is now a mere perfect 100. The new model goes from 0 to 60 miles per hour in 3.5 seconds, more than two seconds faster than the original Tesla Model S (which scored 99 out of 100) – the fastest acceleration Consumer Reports had ever measured.


Scooped by Richard Platt

The Deeper Reason Why Amazon Is Laying Off Dozens of Its Engineers

It’s the first time Amazon has cut employees at its Lab126, report says.
Richard Platt's insight:

Make sure to listen to the video, which runs under two minutes. Basically, Lab126 is a failing effort on the whole for Amazon: low morale for its engineers, loss of senior engineering talent to Yahoo, and less than a good engineering focus for those who work there. This news, and the recent NYT article about working at Amazon, makes me so much happier that I took myself out of the running for an innovation management position earlier this year with a recruiter who came calling via LinkedIn. Amazon needs to stick to its knitting; it doesn't do engineering work very well without the full participation of engineers in a way that they want to be engineers, and treating them like cogs in your machine isn't an adequate work environment. Nuff said about that - Peace Out.

Rescooped by Richard Platt from Future of Cloud Computing and IoT

How to Jump From Cloud to Cloud

Today there’s no easy way to escape your cloud provider. Some big Web names are trying to fix that

Via massimo facchinetti
Richard Platt's insight:

In 2010, when Netflix was still early into its shift from DVD rentals to online movies and shows, it started using Amazon.com’s data centers. Video streaming’s popularity was growing fast, and Amazon Web Services, the retailer’s cloud computing division, had the capacity to handle the load. Now that Netflix streams 100 million-plus hours of video every day, it’s sticking with Amazon partly because of Amazon’s scale and features, and partly because switching vendors “would be a significant multiyear effort,” says Yury Izrailevsky, Netflix’s vice president for cloud and platform engineering. All the major cloud providers—including Amazon, Salesforce.com, Microsoft, and Google—use technology different enough so that switching from one to another would require customers to rewrite much of their software. (Jeremy King, chief technology officer of Wal-Mart Stores’ e-commerce division, compares picking a cloud provider to staying at the Hotel California—“You can check out anytime you like, but you can never leave.”) Still, in the next five years about one-third of companies using the cloud may either switch providers to get a lower price or more features, or add another provider to get servers closer to customers or have a backup should one company suffer a meltdown, says David Linthicum, a consultant who creates cloud applications for companies. Containers break up apps into smaller packages of code, each bundled with all the basic system software the apps need to operate independently of whichever server plays host. This means programmers won’t have to rewrite the code for each new operating system and platform as an app evolves from a project on a laptop to a global hit with millions of users reached via enormous servers, says Jim Zemlin, executive director of the Linux Foundation, a nonprofit that oversees the open source Linux OS. “A developer will be able to write that software and deploy it without having to spend six months” rewriting it for broader and bigger systems, he says. Moving containers from one cloud provider to another is as simple as downloading them onto the new servers.

While market share data are tough to pin down, Docker set the early standard in container software, and the leading options among its dozen or so rivals include Warden, LXD, and CoreOS, according to researcher IDC. Many of the container makers, plus Google, are also refining competing versions of container orchestration software, the layer of programming that helps containers knit themselves together in the proper order to make an app run. Kubernetes, an open source program led by Google, is the early front-runner, says Larry Carvalho, an analyst at IDC.
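
As a rough illustration of why containers simplify cloud-to-cloud moves, the sketch below wraps the standard docker save and docker load commands to export an image as a self-contained archive that can be copied to, and loaded on, a host at another provider. It assumes Docker is installed locally; the image name and file path are placeholders, and this is not code from Docker, Kubernetes or any vendor.

    # Hypothetical helper around the standard `docker save` / `docker load` CLI
    # commands; the image name and archive path are placeholders.
    import subprocess

    def export_image(image: str, archive_path: str) -> None:
        """Write the image (layers plus metadata) to a portable tar archive."""
        subprocess.run(["docker", "save", "-o", archive_path, image], check=True)

    def import_image(archive_path: str) -> None:
        """Load the archive on the destination host (run after copying it over)."""
        subprocess.run(["docker", "load", "-i", archive_path], check=True)

    if __name__ == "__main__":
        export_image("myapp:1.0", "myapp-1.0.tar")
        # ...copy myapp-1.0.tar to a host at the other cloud provider, then there:
        # import_image("myapp-1.0.tar")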

Robert McKenzie's curator insight, August 29, 2:18 PM

This is a topic that is often misunderstood. At present, moving workloads between environments is difficult and has a cost measured in hundreds and thousands of dollars per instance. Containers hold the promise that this will be easier. The challenge is that the containers are themselves proprietary.

Marc ENGEL's curator insight, Today, 5:37 AM

I like this comparison between leaving a cloud platform and the lyrics of Hotel California: “You can check out anytime you like, but you can never leave”! Standards are again the key!

Scooped by Richard Platt

Industrial IoT Mkt growth at 26.56% CAGR

increased usage of smart devices, intelligent systems, and robots will likely completely transform the skills and jobs required in the future...
Richard Platt's insight:

The Global Industrial IoT Market Research report 2015-2019 segments the IIoT industry into four end-user segments (manufacturing, energy and utilities, automotive and transportation, and healthcare) and forecasts a 26.56% CAGR through 2019.
The growing adoption of industrial IoT (IIoT) may lead to a shift in employment structures. A blended workforce is expected, in which humans and machines work together to attain outcomes that neither could produce alone. Technology will likely be designed and applied to empower people rather than replace them, but the increased usage of smart devices, intelligent systems, and robots will likely completely transform the skills and jobs required in the future. Companies will use intelligent machines and networked systems to automate tasks at lower cost and achieve high-quality outcomes. This automation will help people focus on more human job elements, such as creative problem-solving and collaboration, so the combination of humans and machines should deliver higher productivity along with a more dynamic and engaging work experience. According to this latest IIoT market research, the rising number of connected devices used in industry is sharply increasing the volume of data generated. Businesses have realized that they can use this data to optimize costs, deliver better services, and boost revenues, and they are also seeing opportunities to change business models. For example, aviation engine manufacturers are offering inclusive rental programs on their equipment and servicing contracts; the expectation is that feedback from engine users will help manufacturers enhance engine design and reduce manufacturing and maintenance costs, which may confer significant competitive advantages. Countries have also realized that sensor data could be used to avoid catastrophic failures in key infrastructure networks such as water, power, and transport. All of these factors drive revenues in the market.

more...
No comment yet.
Scooped by Richard Platt
Scoop.it!

Uber Hires Two Engineers Who Showed Cars Could Be Hacked

Uber Hires Two Engineers Who Showed Cars Could Be Hacked | Internet of Things - Technology focus | Scoop.it
The company’s latest talent grab snared Charlie Miller and Chris Valasek, who recently startled the auto industry by hacking remotely into Fiat Chrysler vehicles.
Richard Platt's insight:

Uber is continuing its hiring spree of top technical talent by recruiting two respected computer security engineers, Charlie Miller and Chris Valasek. Mr. Miller and Mr. Valasek will work in Uber’s offices in Pittsburgh, where the company has based its self-driving car and robotics research. In a statement, Uber said the two men would work closely with Joe Sullivan, Uber’s chief security officer, and John Flynn, the chief information security officer, to “continue building out a world-class safety and security program at Uber.”  The potential for breaches is escalating as cars transform into Internet-connected computers. A report from Verizon last November found that 14 car manufacturers accounted for 80 percent of the worldwide auto market, and every one of them has a connected-car strategy. Security experts say a single remote hacking of an Uber vehicle could spell disaster for the ride-hailing company.  Mr. Miller and Mr. Valasek have made car hacking a focus. At the Black Hat and Def Con hacking conferences in August, the two demonstrated a way to control hundreds of thousands of vehicles remotely. Over the Internet, they were able to track down cars by their location, see how fast they were traveling, and manipulate their blinkers, lights, windshield wipers, radios and navigation and, in some cases, control their brakes and steering.  Mr. Miller, a former “global network exploitation specialist” for the National Security Agency, most recently worked at Twitter. He was hired there after making a name for himself by exploiting Apple- and Android-powered devices. Two years ago, he and Mr. Valasek turned their attention to cars, because cars were a more tangible target, they said, and because of the increasing momentum behind Internet-connected vehicles. “I’ve been in security for more than 10 years, and I’ve worked on computers and phones. This time, I wanted to do something that my grandmother would understand. If I tell her, ‘I can hack into your car,’ she understands what that means,” he said. “Also, I drive cars,” Mr. Miller added. “I would like them to be safe.”
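The article does not describe the researchers' actual exploit chain, but the end point of attacks like this is the vehicle's internal CAN bus, where short frames command functions such as lights and wipers. Below is a heavily simplified, bench-only sketch of reading and writing CAN frames from Python, assuming the python-can library and a Linux socketcan interface; the arbitration IDs and payloads are hypothetical, not real vehicle commands.

```python
import can

# Connect to a CAN interface (e.g., a USB-to-CAN adapter on a test bench).
bus = can.interface.Bus(channel="can0", bustype="socketcan")

# Listen briefly: every module on the bus broadcasts frames like these,
# which is how telemetry such as speed can be observed.
for _ in range(5):
    frame = bus.recv(timeout=1.0)
    if frame is not None:
        print(f"id=0x{frame.arbitration_id:03X} data={frame.data.hex()}")

# Send a frame. On a real vehicle, frames with the right ID and payload
# are what toggle wipers, blinkers, and so on, which is why remote access
# to the bus is so dangerous. (Hypothetical ID and payload.)
msg = can.Message(arbitration_id=0x2E0, data=[0x01, 0x00], is_extended_id=False)
bus.send(msg)
```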

more...
No comment yet.
Scooped by Richard Platt
Scoop.it!

Logic Supply Announces Fanless, Intel Atom-Based IoT Gateway

Logic Supply Announces Fanless, Intel Atom-Based IoT Gateway | Internet of Things - Technology focus | Scoop.it
The CC150 computer is powered by Candi Controls' IoT Server
Richard Platt's insight:

Logic Supply, a hardware company known for selling industrial and embedded computers running GNU/Linux operating systems such as Ubuntu, announced the general availability of a fanless Internet of Things (IoT) gateway. Designed from the ground up to ease the control and management of energy and operational data in industrial sites and commercial buildings, the new CC150 fanless computer is dubbed the "Internet of Things Gateway" and is powered by Candi’s embedded IoT Server software stack. The launch comes as terrific news for businesses that have previously used Candi Controls' IoT Server to control, manage, and monitor data from communicating devices, as the platform is now pre-configured on the CC150 and they won't have to pay extra for it. "Commercial class gateways are at the heart of the industrial Internet of Things," says JP Ishaq, product manager at Logic Supply. "By creating an off-the-shelf solution for our customers who use Candi’s IoT platform, we are reducing the time-to-revenue every business desires." "We searched for a gateway partner with a track record of product availability, expertise, and customer service to supply enterprise customers with industrial PCs," says Steve Raschke, Candi CEO. "Logic Supply was the nimble, highly capable, and quality-focused company Candi needed to support the requirements of our customers who deploy and resell IoT services."
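Candi's IoT Server API is not described in the article, so the sketch below shows only the generic gateway pattern a box like the CC150 implements: an edge device polls local building equipment and forwards readings to a management backend. It assumes the paho-mqtt Python library (1.x-style constructor), and the broker address, topic, and meter-reading function are hypothetical.

```python
import json
import random
import time

import paho.mqtt.client as mqtt

BROKER = "gateway-backend.example.com"   # hypothetical management backend
TOPIC = "buildings/plant-7/energy"       # hypothetical topic

def read_power_meter():
    """Stand-in for polling a real meter over Modbus, BACnet, etc."""
    return {"meter": "main-feed", "kw": round(random.uniform(40.0, 60.0), 1)}

client = mqtt.Client()                   # paho-mqtt 1.x-style constructor
client.connect(BROKER, 1883)
client.loop_start()                      # handle network I/O in the background

# The gateway's core loop: sample local devices and publish to the backend,
# where the data can be aggregated, analyzed, and acted on.
for _ in range(10):
    reading = read_power_meter()
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(5)

client.loop_stop()
client.disconnect()
```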

more...
No comment yet.
Scooped by Richard Platt
Scoop.it!

How the IoT Is Shaping the Sensor Market

How the IoT Is Shaping the Sensor Market | Internet of Things - Technology focus | Scoop.it
And vice versa. Particularly in the industrial space, increasing connectivity is driving sensor growth, but sensor innovation is also creating big...
Richard Platt's insight:

The $85 billion sensor market has grown at a compound annual growth rate (CAGR) of more than 7.5 percent over the past three years, points out Denver-based investment banking firm Headwaters. It is expected to grow to more than $115 billion by 2019, at a 7.3 percent CAGR.  More sensors are showing up in the automotive industry (with connected and self-driving cars creating huge demand) and in healthcare applications (health monitoring and medical diagnostics being key), but nowhere is demand more significant than in the industrial space, which accounts for more than a third of the sensor market.  IIoT is expected to stimulate huge demand for sensors, the Headwaters report says, pointing to industrial data volumes that are expected to double within the next 10 years. And given the domain expertise of industrial players, IIoT technology should see higher-margin growth opportunities. As a more mature end market, industrial applications focus more on high-end rather than high-volume production, Headwaters notes, with demand for sensors that can operate in harsh environments and offer high reliability, precision, and miniaturization. And as machine-to-machine (M2M) communications becomes more sophisticated, sensors are helping to enable predictive maintenance, asset monitoring, and data analytics for production efficiency gains.  In process automation, systems for process control, process safety, operations management, and asset optimization call for the increased use of sensors for measurement, analytical instrumentation, and control in industrial settings, Headwaters reports. “The recurring theme of integrating multiple sensor technologies with software analytics will enhance the speed and precision of the information flow driving production performance, reliability and safety,” the report says. “The result is an improved cost structure and work environment leading to superior products or processes.”
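As a toy illustration of the predictive-maintenance pattern the report describes, the sketch below watches a stream of vibration readings and flags a machine for inspection when recent readings drift well above their historical baseline. The sensor values and thresholds are made up, and real systems use far richer models; this only shows the shape of the sensor-plus-analytics loop.

```python
from collections import deque
from statistics import mean, stdev

# Hypothetical vibration readings (mm/s) from a sensor on a pump bearing.
readings = [2.1, 2.0, 2.2, 2.1, 2.3, 2.2, 2.4, 2.3, 3.1, 3.6, 4.2, 4.9]

baseline = deque(maxlen=8)   # rolling window of "normal" behavior

for i, value in enumerate(readings):
    if len(baseline) == baseline.maxlen:
        mu, sigma = mean(baseline), stdev(baseline)
        # Flag readings more than 3 standard deviations above the baseline:
        # a crude stand-in for the analytics layer that schedules maintenance
        # before the bearing actually fails.
        if value > mu + 3 * sigma:
            print(f"reading {i}: {value} mm/s exceeds baseline "
                  f"{mu:.2f} +/- {sigma:.2f}; schedule inspection")
    baseline.append(value)
```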

more...
No comment yet.
Rescooped by Richard Platt from Future of Cloud Computing and IoT
Scoop.it!

ClearSky Data Exits Stealth With Novel Approach To Cloud Storage

ClearSky Data Exits Stealth With Novel Approach To Cloud Storage | Internet of Things - Technology focus | Scoop.it
ClearSky Data emerged from stealth today, offering an unusual approach to storage and data lifecycle management. It wants to combine the speed of on-premises..

Via massimo facchinetti
Richard Platt's insight:

ClearSky Data emerged from stealth today, offering an unusual approach to storage and data lifecycle management: it wants to combine the speed of on-premises storage with the cost and elasticity of a cloud service.  While it’s a fully managed service, it involves several layers of storage, with an on-premises piece, a local cloud data facility, and longer-term public cloud storage. Where data sits in this hierarchy depends on how quickly you are likely to need it.  Businesses have been looking for ways to take advantage of the economies of scale the cloud offers, but have often been stymied by latency issues in getting data where it’s needed most when using cloud services, Ellen Rubin, CEO at ClearSky Data, told TechCrunch.  “When we talk to customers, they are telling us that they don’t want to be in the infrastructure business, and are trying to consolidate data centers. We are trying to match storage to the way it is consumed today,” she said.  To achieve this, the company offers a three-tiered approach. First, it installs a fully managed appliance on-premises, where it stores a cache of the organization’s most important data; this is the information people need to access most often. Next, it has a co-location facility with Digital Realty within a 120-mile radius of any customer’s location to handle warm data, that is, data that’s important but not used quite as often as the on-premises cache. Finally, it uses inexpensive Amazon S3 storage for archived data that a company doesn’t need very often.

The solution uses software algorithms to determine where data should reside, and it also handles standard storage services such as de-duplication to reduce data size, along with backup and disaster recovery. Rubin claims her company’s approach is roughly one-third the cost of a traditional storage solution.
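ClearSky does not publish its placement algorithms, but the idea of the hierarchy can be sketched with a simple age-based policy: keep recently accessed data on the local cache, stage less-used data to the metro tier, and archive the rest to S3. The sketch below assumes the boto3 library; the local directory, bucket name, and age thresholds are all made up for illustration.

```python
import time
from pathlib import Path

import boto3

DAY = 86400
HOT_MAX_AGE = 7 * DAY      # accessed in the last week: keep on-premises
WARM_MAX_AGE = 60 * DAY    # otherwise, under two months: metro co-location tier

s3 = boto3.client("s3")
ARCHIVE_BUCKET = "clearsky-demo-archive"   # hypothetical bucket name
DATA_ROOT = Path("/data/volumes")          # hypothetical local data directory

def tier_for(path: Path, now: float) -> str:
    """Pick a storage tier from the file's last-access age (a toy policy)."""
    age = now - path.stat().st_atime
    if age <= HOT_MAX_AGE:
        return "hot"        # stays in the on-premises cache appliance
    if age <= WARM_MAX_AGE:
        return "warm"       # would be staged to the nearby co-location facility
    return "cold"           # archived to inexpensive object storage

now = time.time()
for path in DATA_ROOT.rglob("*"):
    if not path.is_file():
        continue
    tier = tier_for(path, now)
    print(f"{path}: {tier}")
    if tier == "cold":
        # Ship the coldest data to S3; the real service also de-duplicates
        # data before it leaves the customer site.
        s3.upload_file(str(path), ARCHIVE_BUCKET, str(path.relative_to(DATA_ROOT)))
```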

more...
No comment yet.