Internet of Things - Technology focus
Scooped by Richard Platt
onto Internet of Things - Technology focus

This Cheap Sensor Will Tell Your Phone When Food Goes Bad

Some MIT eggheads invented a very impressive and very inexpensive sensor that stands to protect you against anything from a bomb to a bad pack of beef. And it's so simple.
Richard Platt's insight:

"The beauty of these sensors is that they are really cheap. You put them up, they sit there, and then you come around and read them. There's no wiring involved. There's no power," MIT chemistry professor Timothy Swager said in a release. "You can get quite imaginative as to what you might want to do with a technology like this."  The MIT team that invented the sensors has applied for a patent on the technology. But they're also trying to improve it so that the sensors work with Bluetooth, which would extend their range. They even think that they could be integrated into badges so that employees in hazardous areas could get an alert if a dangerous gas is present. We'd probably be happy getting a push alert when our chicken is starting to get nasty, though. 
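The no-wiring, read-on-demand model described above boils down to a threshold check on a gas reading. Here is a minimal sketch, assuming a hypothetical `spoilage_alert` helper and an illustrative ammonia threshold (neither the function name nor the threshold comes from the article):

```python
AMMONIA_ALERT_PPM = 25.0  # illustrative spoilage threshold, not from the article

def spoilage_alert(ppm: float, threshold: float = AMMONIA_ALERT_PPM) -> str:
    """Map a gas concentration reading to a simple phone notification string."""
    if ppm >= threshold:
        return f"ALERT: gas level {ppm:.1f} ppm exceeds {threshold:.1f} ppm"
    return f"OK: gas level {ppm:.1f} ppm"

if __name__ == "__main__":
    # Two hypothetical readings: fresh food vs. spoiling food
    for reading in (3.2, 40.5):
        print(spoilage_alert(reading))
```

The real sensors are passive chemiresistors read over near-field wireless; the point of the sketch is only that the phone-side logic can be this simple.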

Scooped by Richard Platt

Next Gen NPD + / - Agile NPD for Hardware Development Teams

This presentation is an early draft of one of the chapters of the book that I am working on for Minimum Winning Game Plus. It is titled Next Generation New Pro…
Richard Platt's insight:
This presentation is an early draft of one of the chapters of the book that I am working on for Minimum Winning Game Plus. It is titled Next Generation New Product Development + => Agile NPD for Hardware Development Teams.
 
I did this research and project over a two-year period, drilling down into the details of how profitability is actually achieved for NPD teams and for MRO (Maintenance, Repair and Overhaul) operations at Aviation, Aerospace and Defense firms. Profitability rests on maximizing the processing of all of the WIP of the MRO shop, or of the work packages of the NPD hardware team. The presentation also lays out the key metrics for managing NPD and MRO teams: Time to $, Time to Task Completion and Time to Solution Implementation.
 
Please keep in mind that I am a practitioner: the Senior Managing Partner of a small professional services firm whose emphasis and skill set is Systematic Innovation for the Fortune 500.
 
Have a look and let me know what you think.
Rescooped by Richard Platt from GAFAMS, STARTUPS & INNOVATION IN HEALTHCARE by PHARMAGEEK

AI Model Learns How to make Cancer Treatment Less Toxic


The MIT researchers say that their AI model may improve cancer patients’ quality of life. The researchers are investigating how it may reduce toxic radiotherapy and chemotherapy dosing for glioblastoma.
Prognosis for adults with glioblastoma is up to five years. In other words, patients rarely live longer than five years after diagnosis. They have to endure a combination of multiple medications and radiation therapy.

Doctors generally administer maximum safe drug doses to shrink tumors as much as possible. However, they are powerful drugs which cause debilitating side effects in patients.
The AI model ‘learns’ from patient data, and subsequently makes cancer treatment considerably less toxic.
 MIT Media Lab researchers are presenting their research at the 2018 Machine Learning for Healthcare Conference at Stanford University.

The AI model is powered by a ‘self-learning‘ machine-learning technique. It looks at treatment regimes that are currently in use and iteratively adjusts their doses. 
It eventually finds an optimal treatment plan. The plan has the lowest possible potency and dose frequency without losing efficacy. In this context, efficacy refers to the treatment’s ability to shrink tumors.
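The dose-search idea can be sketched in a few lines. This is not MIT's actual model, which learns from real patient data; the efficacy and toxicity curves below are invented purely to show how a "lowest dose that still meets an efficacy floor" can be selected:

```python
def efficacy(dose: float) -> float:
    # Toy saturating response: more dose shrinks the tumor more,
    # with diminishing returns. Invented for illustration.
    return dose / (dose + 1.0)

def toxicity(dose: float) -> float:
    # Toy linear side-effect burden. Invented for illustration.
    return 0.3 * dose

def best_dose(candidates, efficacy_floor=0.7):
    """Return the lowest-toxicity dose whose efficacy meets the floor."""
    viable = [d for d in candidates if efficacy(d) >= efficacy_floor]
    return min(viable, key=toxicity) if viable else None

doses = [0.5, 1.0, 2.5, 4.0, 6.0]
print(best_dose(doses))  # picks the smallest dose that still clears the floor
```

The actual system searches over regimens (dose *and* frequency) iteratively rather than over a fixed candidate list, but the objective has the same shape: minimize potency without losing efficacy.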


Via Annick Valentin Smith, Lionel Reichardt / le Pharmageek
Richard Platt's insight:

An AI model learns from patient data to make cancer treatment less toxic while keeping it effective: the machine-learning system determines the smallest and least frequent doses that can still shrink brain tumors. AI (artificial intelligence) refers to software technologies that make robots or computers think and behave like human beings, in contrast to the natural intelligence of animals, including humans.

AI Model – glioblastoma treatment

Glioblastoma, or glioblastoma multiforme (GBM), is the most aggressive cancer that starts within the brain. Glioblastoma, which can also occur in the spinal cord, can affect people of any age; however, it is more common among older adults.

Rescooped by Richard Platt from Technology in Business Today

How New Entrants Are Shaping The Future Of Retail

Retailers Brandless, Warby Parker and Boxed show that retail innovation and disruption may come from the middle – or rather, by eliminating it.

Via oconnorandkelly, massimo facchinetti, TechinBiz
Richard Platt's insight:

New business models focused on an evolved customer experience are shaping the future of consumer markets – sending new entrants and legacy players the same message: innovate to survive.  The heavyweight competitors that thrive in today’s market are the leaders who introduced us to products at our fingertips, free next-day delivery and subscription-based shopping. But the market is changing and what works today will need to adapt for tomorrow.  I hear a lot of speculation about the next big innovator – who is going to be that next brand that disrupts the market? I look at disruption a little differently. Innovators will not be able to just appear shinier, use new tech and simply deliver on the things that consumers have come to expect – even if those things, like vast inventories at low prices and next-day delivery, are incredibly challenging to get right.  Innovation that disrupts the market will come from a business that is able to harness the power of advanced analytics, new supply chain models and artificial intelligence. Businesses will break through by creating a customer experience that does not exist today. I see these emerging powers in both big and small upstarts and startups.

One such startup is Boxed, an e-commerce business that offers bulk items at wholesale prices without a membership fee, and delivers goods in less than three days. Boxed started in a garage, and today it’s valued at more than half a billion dollars. The business is successfully taking market share away from brick-and-mortar retailers partly because of its easy-to-navigate customer experience and also its use of advanced technology and machine learning to improve internal operations.  Warby Parker, the eyeglass company that evolved a dated shopping experience, has also disrupted a category that has seen little innovation over the past few decades. The company offers quick delivery, lower costs and on-trend products. Warby Parker is now a household name and worth $1.75 billion, proving the quickly won value of disrupting the retail experience.  Then there’s Brandless, a smaller retailer that secured $50 million in funding last year. Brandless sells more than 300 food, clean beauty, non-toxic cleaning, household and office goods. Together, these three innovators show us a lot about what matters in the future of retail.

  • Creating a retail democracy – Amazon did something powerful for retailers and distributors – in a brand and label-obsessed market, it democratized the selling of goods for distributors. No one knows who they last purchased from through Amazon. The retailer has given distributors a user-friendly platform to sell their products, leaving distributors the opportunity to do what they do best: compete on price and availability.
  • Brandless is democratizing retail in a different way – stripping packaged goods down to their core, and focusing on attributes of the products over branding (see terms known as Functionality, Jobs-To-Be-Done, and Main Parameters of Value). By eliminating traditional branding, the packaging illuminates only what the core customer finds important – organic, non-GMO, vegan, etc. (and you better believe those attributes are rooted in customer data). Brandless is then able to sell most things at a lower price, eliminating what they call the “brand tax” and giving every product a $3 or less price tag.
  • Building relationships directly with consumers – Warby Parker gives every customer a boutique shopping experience. By cutting out the middleman, the company is able to lower its prices without royalties and offer customers an intimate shopping experience from their living room. It’s likely a unique enough experience that people share it with their friends, or at least ask their opinion on frames. Similarly, Brandless’ model also allows shoppers to skip the big box retailer or e-commerce grocer. With a condensed value chain, the company can offer lower prices and better market directly to consumers—which they both do via social media. Using analytics and digital marketing, Brandless, for example, is able to target consumers directly using social channels. In fact, my team’s research found that 23 percent of Brandless shoppers reach its site via social media. By comparison, social media only accounts for 2-4 percent of other sites’ traffic in the category.
  • Shaping every aspect of your business around your customer – From supply chain to delivery and loyalty, retail businesses have to gain deeper insights on their customers and integrate that thinking into every corner of its operations. Boxed delivered an experience that younger, digitally savvy consumers enjoy. Warby Parker put the customer at the center of every operational and market decision it made.
Scooped by Richard Platt

How 5G will change Home Internet and TV

Your home internet and TV experience is about to change drastically with the rollout of 5G.
Richard Platt's insight:
  • Verizon and other internet providers are going to start rolling out 5G internet this year.
  • It will change how we get TV and internet in our homes.
  • Instead of drilling holes for cable everywhere, you'll get a modem and a subscription to a streaming TV service.

The way you get and use both TV and internet in your home is about to change drastically.  Verizon detailed a bit about how the next stage of home TV and internet will work when it discussed its 5G rollout plans on Tuesday evening. Lots of buzzwords get tossed around with 5G, so I'll try to explain how it's going to change how you get TV and internet at home as easily as I can.

How it Works Now:  Right now, you probably have a cable wire running from the telephone poles on your street to your house. It might come in the attic and then, thanks to some drilling done by the cable guy, snakes its way from room to room connecting to cable boxes. Those cables also need to connect to a modem and/or router to provide wireless internet to your house. That means even if you "cut the cord" and ditch cable, you still need the same coaxial cable line for internet at home. The current wireless standard offered by Verizon, AT&T, T-Mobile and Sprint — 4G LTE — is fast but not quite fast enough for an entire house of people to play games and stream 4K movies at the same time. It makes a poor replacement for wired broadband. The technology for 5G is fast enough for that, and you can forget the cords. It's just as reliable as the wired broadband internet you're used to, and it could save you a lot of headaches.
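A back-of-the-envelope bandwidth budget shows why a full household strains LTE but fits comfortably in fixed 5G. The per-stream bitrates and link speeds below are common rules of thumb, not figures from the article:

```python
MBPS_4K_STREAM = 25      # typical 4K streaming recommendation
MBPS_GAMING = 5          # rough per-player online gaming budget
MBPS_TYPICAL_LTE = 50    # real-world LTE throughput varies widely
MBPS_5G_FIXED = 300      # conservative fixed-5G figure; peaks are far higher

# Hypothetical household: three 4K movie streams plus two gamers
household = 3 * MBPS_4K_STREAM + 2 * MBPS_GAMING
print(f"household demand: {household} Mbps")          # 85 Mbps
print(f"fits in typical LTE? {household <= MBPS_TYPICAL_LTE}")
print(f"fits in fixed 5G?   {household <= MBPS_5G_FIXED}")
```

With these assumed numbers the household's 85 Mbps of concurrent demand exceeds a typical LTE cell share but uses only a fraction of a fixed-5G link, which is the gap the carriers are betting on.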

How it's Supposed to Work:  Verizon said Tuesday that instead of giving you a bunch of cable boxes and other gadgets, it's going to simply give you an Apple TV 4K and a wireless modem. Since Verizon isn't going to run a standard cable line to your house, it's also going to include a subscription to YouTube TV, YouTube's streaming service that will provide access to TV channels. YouTube TV normally costs $40 per month, but Verizon's deal is likely only a limited-time offer.  You'll still have a modem at home, but it'll connect to Verizon's wireless 5G signal and then serve as a home Wi-Fi router, complete with standard Ethernet ports. This is how devices like the Apple TV 4K, your smartphone, computer and other internet gadgets will connect to Verizon's 5G wireless network. PCMag had a look at one of the routers Verizon said it was initially considering in 2017. This means you won't need to run a cable throughout your home to each TV, since you won't be using cable boxes to get your TV content. No more drilling through walls. No more waiting for the cable guy. Just plug in your Verizon modem and get online.

CONSTRAINTS: 

This won't be for everyone, at least not at first. To start, Verizon (and other 5G providers) will roll out 5G only slowly, in select cities around the United States. Verizon says it will be in four cities — Houston, Indianapolis, Los Angeles and Sacramento, California — by the end of 2018. More cities will get Verizon 5G next year. AT&T is also planning to expand its consumer 5G network. AT&T has its own streaming TV service, DirecTV Now, which it could theoretically offer through a 5G connection.  Meanwhile, T-Mobile and Sprint are attempting to merge in order to build a 5G network that can compete with AT&T and Verizon.

 

Scooped by Richard Platt

Best Business Book of the Month – Skin In The Game

Darrell Mann: When Nassim Nicholas Taleb releases a new book, it's unlikely there will be anything else arriving on the market that will be able to compete.
Richard Platt's insight:

What we get in Skin In The Game is Taleb’s anger at various sectors of society nicely turned up to eleven. If you don’t like economists, politicians, fat-cat CEOs, journalists and academics, and want to read someone elegantly demolishing each of them, this is the book for you. Their problem, Taleb convincingly demonstrates, is that these are the professions in which no one has any real ‘skin in the game’. That is, they are people who make pronouncements about the future safe in the knowledge that if those pronouncements don’t pan out there is little or no negative consequence to them personally, but rather a lot to the people who had the bad fortune of listening to them and acting upon their pronouncements. About 40% of the book is basically a rant against these people.  Another 40% of the book describes a series of strategies and heuristics for dealing with risk and making sure you deal with people who do possess some actual skin in the games you’re thinking of participating in. (These are the gold nuggets that make the book worth reading, if just for those.) The final 20% then seems to be about Taleb answering various of his critics. Some of this content is quite funny, but mostly it comes across as Taleb showing off his ‘let me show you how much smarter than you I am’ intellect. Granted, his ability to step back and see big pictures does put him at a major advantage over 99% of other commentators. When people attack him without understanding complexity they are always going to find themselves on shaky ground, usually embarrassed at the wrong end of some serious mathematical argument. This too can be quite funny, even if I have to resort to humming the mathematical formulae half the time.

 

First time through at least, the ‘best bits’ are the always-pithy italicized pull-out sentences. If you’re an aphorism fan, this is your book:

“You do not want to win an argument. You want to win.”

 

“Things designed by people without skin in the game tend to grow in complication (before their final collapse).”

 

“Under the right market structure, a collection of idiots produces a well-functioning market.”

 

“Anything that smacks of competition destroys knowledge.” (Which I do not believe is actually true.)

 

“If your private life conflicts with your intellectual opinion, it cancels your intellectual ideas, not your private life.”

Scooped by Richard Platt

Designing 7-nm IP, Bring It On Moore!

Richard Platt's insight:

In keeping with Moore’s Law, discover how Synopsys is developing 10nm/7nm IP for SoC designs. Learn how tradeoffs are made in electrostatics, leakage, pattern, manufacturability and transistor performance to meet PPA requirements. (If you are a TRIZ-trained engineer, these trade-offs are your contradictions to go solve.)  In this video see how quantum effects impact FinFET designs in terms of fin width, fin height and anything that impacts bandgap. Technology can be scaled to 7nm, bringing performance and power improvements.

Scooped by Richard Platt

How a CPU is made

How a CPU is made: from sand to a working central processing unit, inside the clean rooms of a chip factory.
Richard Platt's insight:

CPU - Central Processing Unit - the brains of all computers.  Video courtesy of GlobalFoundries.

Rescooped by Richard Platt from Advanced Threats,Intelligence Technology,CyberSecurity

How Hackers Are Leveraging Machine Learning

Unfortunately, like many advanced and innovative technological processes, machine learning can be leveraged for both beneficial enterprise purposes as well as malicious activity.

Via Constantin Ionel Milos / Milos Constantin
Richard Platt's insight:

For business executives and internal information security specialists, it seems that every day brings a new potential risk to the company – and in the current threat environment, it isn't hard to understand this viewpoint.  Sophisticated cybercriminals are continually on the lookout for the next big hacking strategy, and aren't shy about trying out new approaches to breach targets and infiltrate enterprises' IT assets and sensitive data. One of the best ways to stem the rising tide of threats in this type of landscape is to boost awareness and increase knowledge about the latest risks and how to guard against them.  Currently, an emerging strategy among hackers is the use of machine learning. Unfortunately, like many advanced and innovative technological processes, machine learning can be leveraged for both beneficial enterprise purposes as well as malicious activity.

Machine learning: A primer

Many internal IT and development teams as well as technological agencies are experimenting with machine learning – but white hats aren't alone in their use of this method.  As SAS explained, machine learning is an offshoot of artificial intelligence, and is based on the ability to build automated analytical models. In other words, machine learning enables systems to increase their own knowledge and adapt their processes and activities according to their ongoing use and experience.  "The iterative aspect of machine learning is important because as models are exposed to new data, they are able to independently adapt," SAS stated. "They learn from previous computations to produce reliable, repeatable decisions and results. It's a science that's not new – but one that has gained fresh momentum."  Individuals have likely encountered some form of machine learning algorithm in their daily life already – things like online recommendations from streaming services and retailers, as well as automated fraud detection represent machine learning use cases already in place in the real world. Artificial intelligence and machine learning can be used to bolster malicious attacks.
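The "iterative aspect" SAS describes, models independently adapting as new data arrives, can be sketched with a tiny online fraud detector that updates its own statistics on every transaction. This is illustrative only; production systems use far richer models than a running z-score:

```python
class OnlineAnomalyDetector:
    """Flags values far outside the running distribution of past values."""

    def __init__(self, z_threshold: float = 3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # sum of squared deviations (Welford's algorithm)
        self.z = z_threshold

    def update(self, x: float) -> None:
        # Incrementally adapt the model to each new observation
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomaly(self, x: float) -> bool:
        if self.n < 2:
            return False
        std = (self.m2 / (self.n - 1)) ** 0.5
        return std > 0 and abs(x - self.mean) / std > self.z

det = OnlineAnomalyDetector()
for amount in [20, 25, 22, 19, 24, 21, 23]:   # ordinary card transactions
    det.update(amount)
print(det.is_anomaly(5000))  # wildly out-of-pattern charge
print(det.is_anomaly(22))    # in-pattern charge
```

The point is the shape of the loop: the model is never "retrained from scratch"; each observation refines it, which is exactly the property that both defenders and attackers exploit.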

Machine learning on both sides of the coin

However, as legitimate agencies and white hat security professionals continue to dig deeper into advantageous machine learning capabilities, hackers are increasingly looking toward AI-based processes to boost the effects of cyberattacks.  "We must recognize that although technologies such as machine learning, deep learning, and AI will be cornerstones of tomorrow's cyber defenses, our adversaries are working just as furiously to implement and innovate around them," Steve Grobman, security expert and McAfee chief technology officer told CSO. "As is so often the case in cybersecurity, human intelligence amplified by technology will be the winning factor in the arms race between attackers and defenders."  But how, exactly, are hackers putting machine learning algorithms to work, and how will these impact today's enterprises? Let's take a look:

ML vs. ML: Evasive malware

When hackers create malware, they don't just look to breach a business – they also often want to remain within victims' systems for as long as possible. One of the first, and likely most dangerous, ways machine learning will be leveraged by hackers is to fly under the radar of security systems aimed at identifying and blocking cybercriminal activity.  A research paper from Cornell University authors described how this type of instance could be brought to life by hackers. Researchers were able to create a generative adversarial network (GAN) algorithm which, in and of itself, was able to generate malware samples. Thanks to machine learning capabilities, the resulting infection samples were able to effectively sidestep machine learning-based security solutions designed specifically to detect dangerous samples.  Security experts also predicted that machine learning could be utilized by cybercriminals to modify the code of new malware samples based on the ways in which security systems detect older infections. In this way, hackers will leverage machine learning to create smarter malware that could potentially fly under the radar within infected systems for longer periods of time.  This will require enterprises to be increasingly proactive with their security posture – monitoring of critical IT systems and assets must take place continually, and security officers must ensure that users are observing best protection practices in their daily access and network activities.
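The evasion idea in that research can be illustrated on a harmless toy: against a linear "detector", nudge an input's features opposite the score gradient until the score drops below the alarm threshold. The weights and feature values are invented; no real malware or security product is modeled here:

```python
def score(weights, bias, x):
    """Toy linear 'detector': higher score = more suspicious."""
    return sum(w * xi for w, xi in zip(weights, x)) + bias

def evade(weights, bias, x, step=0.1, max_iters=200):
    """Greedily perturb x against the weight direction until score < 0."""
    x = list(x)
    for _ in range(max_iters):
        if score(weights, bias, x) < 0:
            return x
        for i, w in enumerate(weights):
            x[i] -= step * w   # move opposite the gradient of the score
    return x

w, b = [0.8, 0.5, 0.3], -0.2
sample = [1.0, 1.0, 1.0]        # initially flagged: score = 1.4
adv = evade(w, b, sample)
print(score(w, b, adv) < 0)     # perturbed sample now slips past the toy detector
```

A GAN-based attack automates the same search with a generator network instead of a hand-written loop, and targets far more complex detectors, but the adversarial objective is the same.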

Rescooped by Richard Platt from Advanced Threats,Intelligence Technology,CyberSecurity

Tropic Trooper’s New Hacking Strategy - Targeting Asia

Tropic Trooper is believed to be very organized, developing its own cyberespionage tools and fine-tuning them in its recent campaigns. Many of the tools now feature new behaviors, including a change in the way they maintain a foothold in the targeted network.

Via Constantin Ionel Milos / Milos Constantin
Richard Platt's insight:

We also observed malicious documents that don’t need to download anything from the internet as the backdoor’s dropper is already embedded in the document. This, however, doesn’t influence the overall result for the victim.

The backdoor will load the encrypted configuration file and decrypt it, then use Secure Sockets Layer (SSL) protocol to connect to command-and-control (C&C) servers.

Tropic Trooper uses exploit-laden Microsoft Office documents to deliver malware to targets. As lures, these documents use job vacancies in organizations that may be deemed socio-politically sensitive to recipients.

Scooped by Richard Platt

ABI Research Ranks Smart Manufacturing Platforms

The “Smart Manufacturing Platform Ranking” competitive assessment ranked 11 major vendors in the sector: ABB Ability, Bosch IoT Suite, Emerson Plantweb, Fujitsu Colmina, GE Predix, Hitachi Lumada, PTC ThingWorx, SAP Leonardo, Schneider Electric EcoStruxure, Siemens Mindsphere and Telit deviceWISE, based on ABI Research's proven innovation/implementation criteria framework.
Richard Platt's insight:

“PTC emerged as the leader, excelling with its innovative initiatives across transformative technologies, and GE Predix came in second,” says Pierce Owen, principal analyst of Smart Manufacturing at ABI Research. “GE has ridden a roller-coaster of expectations and disappointment over the last few years, especially with GE Digital, its software subsidiary. GE Digital led the way out of the gate in this fourth industrial revolution, launching Predix back in 2013 and making grandiose promises about improving asset utilization and operations optimization. It largely failed to deliver on those promises with a quite slow rollout, but since then, it has built up its technological capabilities and partnerships and now offers a legitimate platform with many solutions.”

ABB’s third-place ranking came about largely due to its business model. ABB differentiates Ability by not charging for the platform as a separate item. Despite its relatively high score in innovation, ABB provides the platform for free to its customers, instead charging for apps and solutions, which still often come for free during the warranty period for ABB equipment. The high score for this business model helped contribute to a top ranking in implementation, tied with GE.

Siemens Mindsphere ranked fourth. “Siemens had developed quite advanced capabilities around the Mindsphere platform, but up until now, it has struggled to connect devices and equipment from other manufacturers without OPC UA. To their credit though, they have taken steps that could lead to a jump to the top spot within a year,” Owen points out.

Siemens has partnered with Telit deviceWISE, which ranked sixth in ABI Research’s assessment, for data extraction and edge intelligence and has announced general availability on Microsoft Azure coming in Q4 of 2018. “At this point, the platform providers appear to have found a balance between competing for market share and implementing these open best practices to drive innovation. This does not mean that all these platforms will survive, but it will ultimately benefit the customers by making it easier to deploy the best solutions,” concluded Owen.

Scooped by Richard Platt

How to Write a Patent Application

How to write a patent application. Writing a patent application is not as easy as many think. Indeed, the concept of usefully describing the invention is not as straightforward as it might seem, which is why you cannot simply file an abbreviated description of an invention and think that suffices to protect anything.
Richard Platt's insight:

One of the most basic things any new patent attorney or patent agent needs to learn is how to draft a patent application. This skill is also one that can be extremely useful for inventors, particularly serious inventors who are likely to have more than one invention, as well as professional, corporate inventors who work for companies that pay them to invent. While the latter category (i.e., inventors) may approach the task of writing a patent application for the purpose of creating a solid first draft to file as a provisional patent application, or to pass forward to a patent practitioner, thereby cutting costs, the goals are the same: Write a quality patent application that usefully describes the invention.

Writing a patent application is not as easy as many think. Indeed, the concept of usefully describing the invention, which on its face seems easy enough to understand, is not as straightforward as it might seem, which is why you cannot simply file an abbreviated description of an invention and think that suffices to protect anything.  When a patent application is filed, a filing date is awarded, and priority is established with respect to whatever is appropriately described in the patent application at the time of filing. “Appropriately described” is defined in the United States by 35 U.S.C. 112(a), which requires the patent application to include a description that thoroughly explains the metes and bounds of the invention, such that it can be practiced by those of skill in the relevant technological art or scientific field without undue experimentation, and which also discloses any preferences the inventor has with respect to the invention. For more see: Variations and Tricks & Tips to Describe an Invention in a Patent Application.

Scooped by Richard Platt

Global GDP Impact on Worldwide IC Market Growth Forecast to Rise

Richard Platt's insight:

As shown, over the 2010-2017 time-frame, the correlation coefficient between worldwide GDP growth and IC market growth was 0.88, a strong figure given that a perfect correlation is 1.0.  In the three decades previous to this time period, the correlation coefficient ranged from a relatively weak 0.63 in the early 2000s to a negative correlation (i.e., essentially no correlation) of -0.10 in the 1990s.  IC Insights believes that the increasing number of mergers and acquisitions, leading to fewer major IC manufacturers and suppliers, is one of the major changes in the supply base that illustrate the maturing of the industry and that is helping foster a closer correlation between worldwide GDP growth and IC market growth.  Other factors include the strong movement to the fab-lite business model and a declining capex as a percent of sales ratio, all trends that are indicative of dramatic changes to the semiconductor industry that are likely to lead to less volatile market cycles over the long term.  In 2017, IC industry growth was greatly influenced by the “Capacity/Capital Spending Cycle Model” as the DRAM and NAND flash markets surged and served to drive total IC industry growth of 25%.  It would initially appear that the strong correlation coefficient between worldwide GDP growth and total IC market growth that had been evident from 2010 through 2016 had disappeared in 2017.  However, IC Insights does not believe that is the case.  When excluding the DRAM and NAND flash segments from the IC market in 2017, the remainder of the IC market displayed an 11% increase, which closely correlates to what would be expected given a worldwide GDP increase from 2.4% in 2016 to 3.1% in 2017.  Moreover, the three-point decline in the total IC market growth rate forecast for 2018, when excluding DRAM and NAND flash (from 11% in 2017 to 8% in 2018), is expected to mirror the slight decline expected for worldwide GDP growth this year as compared to last year.
Thus, excluding the amazing surge for the DRAM and NAND flash markets in 2017 and 2018, IC Insights believes that the trend toward an increasingly close correlation between total IC market growth and worldwide GDP growth is still largely intact.
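A correlation coefficient like the reported 0.88 is computed from paired annual growth rates. A minimal sketch of the Pearson formula follows; the yearly figures below are hypothetical placeholders for illustration, not IC Insights data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical annual growth rates (%), 2010-2017 -- placeholders only.
gdp_growth = [4.3, 3.1, 2.5, 2.6, 2.8, 2.8, 2.4, 3.1]
ic_growth  = [32.0, 1.0, -3.0, 5.0, 8.0, -1.0, 4.0, 25.0]

r = pearson(gdp_growth, ic_growth)
print(f"Pearson r = {r:.2f}")  # a value near 1.0 indicates strong co-movement
```

A value near 1.0 signals the two series move together; values near zero (like the 1990s' -0.10) signal essentially no relationship.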

Scooped by Richard Platt

Silicon Wars - NAND Flash Memory Prices to Drop through 2018 to 2019

Silicon Wars - NAND Flash Memory Prices to Drop through 2018 to 2019 | Internet of Things - Technology focus | Scoop.it
Richard Platt's insight:

The Mid-Year Update revises IC Insights’ worldwide economic and IC industry forecasts through 2022 that were originally published in The 2018 McClean Report issued in January of this year.  The figure here compares the estimated capex required to increase NAND flash bit-volume shipments 40% per year (sourced from a chart presented at Micron’s 2018 Analyst and Investor Event in May of this year) against IC Insights’ data on annual capex targeting the NAND flash market segment.  As shown, Micron believes that the industry capex needed to increase NAND flash bit-volume production by 40% more than doubled from $9 billion in 2015 to $22 billion only two years later in 2017!  This tremendous surge in required capital is driven by the move from planar NAND to 3D NAND, since 3D NAND requires much more fab equipment and additional cleanroom space to process the additional layers of the device.

Most of the five major NAND flash suppliers have stated that they believe that NAND bit volume demand growth will average about 40% per year over the next few years.  Figure 1 shows that the capex needed to support a 40% increase in NAND bit volume shipments was exceeded by 27% last year and is forecast to exceed the amount needed by another 41% this year (NAND bit volume shipments increased 41% in 2017 but 1H18/1H17 bit volume shipments were up only 30%).  As a result, it is no surprise that NAND flash prices have already softened in early 2018. Moreover, the pace of the softening is expected to pick up in the second half of this year and continue into 2019.  Historical precedent in the memory market shows that too much spending usually leads to overcapacity and subsequent pricing weakness.  With Samsung, SK Hynix, Micron, Intel, Toshiba/Western Digital/SanDisk, and XMC/Yangtze River Storage Technology all planning to significantly ramp up 3D NAND flash capacity over the next couple of years (with additional new Chinese producers possibly entering the market), IC Insights believes that the risk for significantly overshooting 3D NAND flash market demand is very high and growing. 
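The implied actual spending follows directly from the stated overshoot percentages. A quick back-of-the-envelope check, using only the dollar figures and percentages quoted in the text:

```python
# Capex (in $B) required to support 40% NAND bit-volume growth, per the
# Micron chart cited above, and the overshoot percentages from the text.
required = {2015: 9.0, 2017: 22.0}    # $B needed for 40% bit growth
overshoot = {2017: 0.27, 2018: 0.41}  # actual spending above the required level

# Implied actual 2017 NAND capex: required level plus the 27% overshoot.
actual_2017 = required[2017] * (1 + overshoot[2017])
print(f"Implied 2017 NAND capex: ${actual_2017:.2f}B")  # $27.94B
```

Spending roughly $28B against a $22B requirement is the overcapacity mechanism the article says historically precedes memory price weakness.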

Scooped by Richard Platt

NASA announces the first winners of the crowdsourced contest to design its next robot

NASA announces the first winners of the crowdsourced contest to design its next robot | Internet of Things - Technology focus | Scoop.it
NASA’s crowdsourced robotics program is coming to a head. The agency has announced the first few winners who’ll see their contributions launch into orbit in 2019.
Richard Platt's insight:

NASA, if you haven’t heard, is partially crowdsourcing the design of its next robot. And today, it announced which teams’ submissions are destined for orbit.  Mechanical engineering student Nino Wunderlin, conceptual engineer Myrdal Manzano, and software developer Amit Biswas will see their attachment and deployment mechanisms adapted to fit Astrobee, a free-flying robot intended to replace the three MIT-designed robots that have operated on the International Space Station (ISS) since 2006 as part of NASA’s SPHERES (Synchronized Position Hold, Engage, Reorient Experimental Satellites) research.

When the cube-shaped Astrobee, roughly one foot on each side, is formally deployed sometime in 2019, it’ll have cameras, sensors, a touchscreen, and a robotic arm, plus lasers and a fan-based propulsion system that will help it navigate the ISS’s cramped quarters and corridors. Astrobee will perform tasks like equipment monitoring, sensor testing, sound-level monitoring, air-quality analysis, and other chores, and NASA’s Mission Control Center at the Johnson Space Center in Houston will use it to perform tasks autonomously.  Astrobee is designed to remain active for the life of the ISS, which is scheduled to be decommissioned by 2024.

NASA kicked off the Freelancer.com program, dubbed the NASA System Architecture Task, in 2016. It’s open to the public, but applicants have to complete an academic knowledge test and commit a certain amount of time to project development. So far, it’s been highly competitive: in the first phase, of the thousands of people who entered, only 30 went on to receive $10 and a finalized breakdown of the elements they needed to win.

“NASA has grown in the multiple ways we engage the crowd to provide solutions to challenges we face when advancing complex space systems,” Jason Crusan, NASA’s director of Advanced Exploration Systems and head of NASA’s Center of Excellence for Collaborative Innovation, said in a statement. “This challenge continues that expansion and will help to create novel designs but also allow us to learn about sophisticated system design through the use of open innovation.”  Cash prizes ranging from $250 to $5,000 will be drawn from a prize pool of about $25,000. The deadline for entries is September 30, 2018.

Rescooped by Richard Platt from GAFAMS, STARTUPS & INNOVATION IN HEALTHCARE by PHARMAGEEK

Artificial Intelligence 'Did not miss a single Urgent Case'

Artificial Intelligence 'Did not miss a single Urgent Case' | Internet of Things - Technology focus | Scoop.it
A machine has learned how to read complex eye scans and detect many types of disease, research has found.
Via Philippe Marchal, Lionel Reichardt / le Pharmageek
Richard Platt's insight:

Artificial intelligence can diagnose eye disease as accurately as some leading experts, research suggests.  A study by Moorfields Eye Hospital in London and the Google company DeepMind found that a machine could learn to read complex eye scans and detect more than 50 eye conditions.

Doctors hope artificial intelligence could soon play a major role in helping to identify patients who need urgent treatment.

They hope it will also reduce delays.

A team at DeepMind, based in London, created an algorithm, or mathematical set of rules, to enable a computer to analyse optical coherence tomography (OCT) scans, high-resolution 3D images of the back of the eye.

Thousands of scans were used to teach the machine how to read them.

Then, artificial intelligence was pitted against humans.

The computer was asked to give a diagnosis in the cases of 1,000 patients whose clinical outcomes were already known.

The same scans were shown to eight clinicians - four leading ophthalmologists and four optometrists.

Each was asked to make one of four referrals: urgent, semi-urgent, routine and observation only.

'Jaw-dropping'

Artificial intelligence performed as well as two of the world's leading retina specialists, with an error rate of only 5.5%.

Crucially, the algorithm did not miss a single urgent case.

The results, published in the journal Nature Medicine, were described as "jaw-dropping" by Dr Pearse Keane, consultant ophthalmologist, who is leading the research at Moorfields Eye Hospital.
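The study's headline numbers amount to a simple tally over referral decisions: compare each predicted referral class against the known outcome, report the overall error rate, and verify that no urgent case was missed. A minimal sketch follows; the five-case sample is hypothetical, not the study's data:

```python
# The four referral classes used in the study.
CLASSES = ("urgent", "semi-urgent", "routine", "observation")

def evaluate(predicted, actual):
    """Return (error_rate, number_of_urgent_cases_missed)."""
    errors = sum(p != a for p, a in zip(predicted, actual))
    missed_urgent = sum(a == "urgent" and p != "urgent"
                        for p, a in zip(predicted, actual))
    return errors / len(actual), missed_urgent

# Hypothetical sample: one routine case mis-referred, no urgent case missed.
pred  = ["urgent", "routine", "semi-urgent", "urgent", "observation"]
truth = ["urgent", "routine", "routine",     "urgent", "observation"]

rate, missed = evaluate(pred, truth)
print(rate, missed)  # 0.2 0
```

In the study itself the equivalent tally over 1,000 patients gave a 5.5% error rate with zero missed urgent cases.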

Rescooped by Richard Platt from Technology in Business Today

World's first commercial 3D-printed concrete homes planned

World's first commercial 3D-printed concrete homes planned | Internet of Things - Technology focus | Scoop.it
Via TechinBiz
Richard Platt's insight:

A consortium between the municipality of Eindhoven, Eindhoven University of Technology (TU/e), and three private firms is building five unique 3D-printed concrete homes in the city. Designer Rob Wolfs told Jim Drury they will be the world’s first commercially available 3D-printed concrete homes.

Scooped by Richard Platt

Rethinking The ‘Robust Design’ Trend

Rethinking The ‘Robust Design’ Trend | Internet of Things - Technology focus | Scoop.it
Darrell Mann It’s not often these days that we get to add a new stage to one of the Trends Of Evolution. It’s even rarer that we find ourselves in the
Richard Platt's insight:

In the world of business, it’s probably fairest to say that the large majority of enterprises on the planet are not AntiFragile. Most aren’t even very Robust, thanks to years of ‘continuous improvement’. It’s difficult to find examples of human-built entities that have been designed to be AntiFragile. Finding them tends to require zooming out and looking at bigger pictures. Like Switzerland.

Switzerland is often described as ‘the most boring country on Earth’. Boring is usually a signal of unappealing, but one suspects that many on the planet would very definitely prefer Switzerland’s boringness to the turmoil and crisis we see everywhere else. Switzerland is a model of stability. This stability comes in no small part because, unlike most other countries, it doesn’t have a big central bank or national government. What it has instead are dozens of sovereign mini-states that squabble and fight with one another constantly. The country’s AntiFragility, in other words, comes because all of this micro-scale turmoil makes the country as a whole stronger: it enables all of the small problems to be revealed and resolved before they’re able to metastasize into something bigger. Like, say, the sort of kicking-the-can-down-the-road, quantitative-easing fiscal cliff that the US, EU and UK have all inadvertently climbed.

Finally, switching to the world of technology, earlier this year saw the publication of a manifesto for ‘AntiFragile’ software:

The disruptive nature of the antifragile approach for open and complex systems is of greatest importance and needs to be systematized, especially for software systems. In fact, antifragile software design is becoming a research issue in the software engineering community… We propose a similar approach to Antifragility, namely we would like to define the principles ruling the building up of software systems which exploit faults and errors to become better and stronger. This Manifesto does not want to be a fixed and complete set of principles. It is an open contribution to the discussion which needs to be improved and re-elaborated. All rights related to the Manifesto are free, open and belong to the community. This work represents our suggestions urging the community to start elaborating antifragile principles to lead their implementation in real organizations.

Some software companies, it has to be said, are already ahead of the game. They’ve seen the coming contradictions and have decided to do it anyway. When software writes, updates and evolves ‘itself’, why do we need humans to code it? The problem with software is that it only needs to be programmed once. After that, it is able to self-replicate an unlimited number of times. Maybe to the point where the control systems used to control all the things in and around our lives, learn how to make the physical stuff AntiFragile too.  Okay, so that last part is still a way off, but that shouldn’t stop us from saying that the ‘AntiFragile’ Trend stage well and truly exists today in many forms. Here’s what we think our Robust Design Trend now looks like as a result:

Scooped by Richard Platt

2018 and Beyond Strategic Predictions From Gartner

2018 and Beyond Strategic Predictions From Gartner | Internet of Things - Technology focus | Scoop.it
A Gartner report provides predictions, research findings and recommendations for what to focus on and how to navigate the accelerating rate of digital change.
Richard Platt's insight:

The rapid rate of technology change has created steep challenges for organizations of all shapes and sizes. It's no secret that innovation now arrives faster than most CIOs and IT departments can cope with. All too often, before an enterprise completes one project, two more initiatives appear on the horizon. A new research report from Gartner, "Top Strategic Predictions for 2018 and Beyond: Pace Yourself, for Sanity's Sake," assembles some key predictions, research findings and recommendations for what to focus on and how to navigate today's accelerating rate of digital change. Gartner reports that meeting goals for revenue generation and value creation will increasingly rely on engaging and reengaging people and businesses on ways to adopt and use technology for maximum gain. Gartner also examines how specific technologies—ranging from artificial intelligence (AI) and the internet of things (IoT) to blockchain and bots—are altering the business and IT landscape, along with the key areas that business and IT leaders are focusing on. Here are some of the key findings and recommendations.

Scooped by Richard Platt

Inside a Google Data Center

Joe Kava, VP of Google's Data Center Operations, gives a tour inside a Google data center, and shares details about the security, sustainability and the core architecture of Google's infrastructure.
Richard Platt's insight:

An older video, but it explains things well enough.  Joe Kava, VP of Google's Data Center Operations, gives a tour inside a Google data center, and shares details about the security, sustainability and the core architecture of Google's infrastructure.

Scooped by Richard Platt

‘Game Change’ - Recent book looks at ‘Gamification’ as a Future Business Model

‘Game Change’ - Recent book looks at ‘Gamification’ as a Future Business Model | Internet of Things - Technology focus | Scoop.it
An estimated one billion people spend at least one hour every day playing computer and videogames. Why do people put more effort into these games than their day jobs and personal relationships?
Richard Platt's insight:

An estimated one billion people spend at least one hour every day playing computer and videogames. Why do people put more effort into these games than their day jobs and personal relationships?  ‘‘Game Change’’, the latest book from global media and communications agency PHD, discusses how “gamification” can be harnessed and applied as a business model for the future.  Although gaming has an unfortunate capacity for attracting negative headlines, Game Change explores the mounting evidence and opinions from a number of commentators who argue that gaming can actually be a force for good, especially when applied to real life problems.

“Gamification” uses game thinking and mechanics to engage users into solving problems. In a business and marketing sense, it creates new ways to engage both employees and consumers. Businesses need to consider that 7 billion hours by 1 billion gamers is what’s up for grabs in the “Engagement Economy”.

Co-author and worldwide strategy and planning director at PHD, Mark Holden, says that if businesses and marketers can apply even a fraction of the engagement seen in gaming, the payback will be incredibly significant. Gaming has the ability to create meaning and motivation relevant to the target market, whether internal or external, which results in engaged audiences and, ultimately, increased revenue.

“It is not enough to expect people within any organisation to give the best part of their waking lives for money alone. People, especially the younger generation, are seeking more meaning from their working lives. And it is important that organisations start to take this on board. There’s a lot that can be learnt from the immersive and empowering experience that games provide. It is now becoming increasingly clear that, if employed correctly and not superficially, game mechanics can help to generate the levels of engagement required for this new working world,” says Holden.

Game Change looks at “gamification” in practice through successful examples, including Nike+, Guinness and Heineken, and from sectors including finance, such as Mint.com.  PHD itself has implemented a “gamified” global operating system, Source, the largest enterprise gaming system, with more than 2,500 staff in over 76 countries collaborating and playing on a leaderboard at work every day. It has been adopted more quickly than ever anticipated, thanks to the game element.

Jane McGonigal, Game Change foreword author, game designer and author of the New York Times bestseller Reality is Broken, points out that innovative systems like Source empower employees to gamefully tackle the daily challenges that excite and interest them most, helping them to realise their strengths, which creates significant improvements in employee engagement and output.

Rescooped by Richard Platt from Advanced Threats,Intelligence Technology,CyberSecurity

Kaspersky's 'Slingshot' report burned an ISIS-focused intelligence operation

Kaspersky's 'Slingshot' report burned an ISIS-focused intelligence operation | Internet of Things - Technology focus | Scoop.it
CyberScoop has learned that Kaspersky's 'Slingshot' is an active, U.S.-led counterterrorism cyber-espionage operation used to target ISIS and Al-Qaeda.

Via Constantin Ionel Milos / Milos Constantin
Richard Platt's insight:

The U.S. government and Russian cybersecurity giant Kaspersky Lab are currently in the throes of a nasty legal fight that comes on top of a long-running feud over how the company has conducted itself with regard to U.S. intelligence-gathering operations.

A recent Kaspersky discovery may keep the feud alive for years to come.

CyberScoop has learned that Kaspersky research recently exposed an active, U.S.-led counterterrorism cyber-espionage operation. According to current and former U.S. intelligence officials, the operation was used to target ISIS and al-Qaeda members.

On March 9, Kaspersky publicly announced a malware campaign dubbed “Slingshot.” According to the company’s researchers, the campaign compromised thousands of devices through breached routers in various African and Middle Eastern countries, including Afghanistan, Iraq, Kenya, Sudan, Somalia, Turkey and Yemen.

Kaspersky did not attribute Slingshot to any single country or government in its public report, describing it only as an advanced persistent threat (APT). But current and former U.S. intelligence officials tell CyberScoop that Slingshot represents a U.S. military program run out of Joint Special Operations Command (JSOC), a component of Special Operations Command (SOCOM).

The complex campaign, which researchers say was active for at least six years, allowed for the spread of highly intrusive malware that could siphon large amounts of data from infected devices.

Slingshot helped the military and intelligence community collect information about terrorists by infecting computers they commonly used, sources told CyberScoop. Often, these targeted computers were located in internet cafés in developing countries, which ISIS and al-Qaeda targets would use to send and receive messages, the sources said.

These officials, all of whom spoke on condition of anonymity to discuss a classified program, fear the exposure may cause the U.S. to lose access to a valuable, long-running surveillance program and put soldiers’ lives at risk.

The disclosure comes at a difficult time for Kaspersky. The company is currently fighting the U.S. government in court after the government claimed that the Moscow-based company’s software poses a national security risk due to the company’s Russian government ties. Kaspersky has consistently denied any wrongdoing.

CyberScoop’s reporting of JSOC’s role in Slingshot provides the first known case of a SOCOM-led cyber-espionage operation. The command is better known for leading physical missions that place elite soldiers on the ground in hostile territories. Over the last decade, SOCOM has been instrumental in the Global War on Terror, having conducted many sensitive missions, including the one that killed former al-Qaeda leader Osama bin Laden.

Rescooped by Richard Platt from WEARABLES - INSIDABLES - IOT - CONNECTED DEVICES - QUANTIFIEDSELF

Sensors to Smartphones Bring Patent Wars to Diabetes Monitoring

Sensors to Smartphones Bring Patent Wars to Diabetes Monitoring | Internet of Things - Technology focus | Scoop.it


Via Dominique Godefroy, Lionel Reichardt / le Pharmageek
Richard Platt's insight:

Diabetes treatment has evolved since Mary Fortune was diagnosed in 1967 and hospitalized because there was no reliable way to monitor her blood sugar. These days, a glucose skin patch transmits her levels day and night to her iPhone and shares the data with others.

Fortune and other diabetics are benefiting from an explosion in technology and innovation, from under-the-skin sensors that eliminate the need for painful finger pricks, to smartphone alerts when glucose levels rise too high. But the technology, and its integration with mobile devices, has brought the types of lawsuits typically seen by Silicon Valley companies.

For glucose monitors alone, the number of published patent applications has grown steadily for a decade and has accelerated significantly since 2015, according to an analysis by the research firm Patinformatics. More than 880 patent applications related to glucose monitoring have been published so far this year, said Tony Trippe, managing director of the Dublin, Ohio-based company.

“Everybody in the market is realizing there’s an enormous opportunity there,” said Paul Desormeaux, a senior analyst with Toronto-based Decision Resources Group. “Other players are starting to come in, and there’s a lot of competition to make advanced products.”

The boom is driven by a variety of factors, Desormeaux said. The number of people with diabetes in the U.S. is rising -- the Centers for Disease Control estimates more than 100 million Americans are now living with diabetes or prediabetes. Insurance coverage for new devices has increased, and there’s a growing number of partnerships between health companies and traditional technology firms such as Alphabet Inc.’s Google, International Business Machines Corp., and Fitbit Inc.

Scooped by Richard Platt

Tricks & Tips to Describe an Invention in a Patent Application

Tricks & Tips to Describe an Invention in a Patent Application | Internet of Things - Technology focus | Scoop.it
Failure to disclose alternatives when you describe an invention in a patent application will foreclose your ability to own those variations in any patent.
Richard Platt's insight:

One of the biggest problems that inventors face when setting out to describe an invention is with defining what the law refers to as “alternative embodiments of the invention,” or simply “alternative embodiments.” Whenever you read the word “embodiment” in a patent application or issued patent, the drafter is merely talking about a particular version of the invention.

The trouble many inventors have is that they don’t understand why they would ever have more than a single version of their invention. They will sometimes say: “Everyone would do it this way and include all the features; you’d be crazy not to!” The problem created by this mentality can be enormous. If you do not describe it, then it is not a part of your invention. So, for example, if you describe an invention as always having elements A + B + C + D and then someone makes virtually the same thing but leaves out D (or any of the other elements), they couldn’t possibly be infringing. Why? Because the invention was too narrowly described.

Most inventors are quite good at describing exactly what they have invented. The invention is your work and you know it best, so it is not surprising that most inventors can (with enough effort) explain what they view as the best version of the invention, what the law refers to as the “preferred embodiment.” Nevertheless, it is absolutely essential to think outside the box when you describe an invention in any patent application. You don’t want to describe only the best version of your invention; you want to describe every version of the invention that can work at all, no matter how crudely. If it works at all, it should be described in the patent application.
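The A + B + C + D point above is often described as the all-elements principle: a product falls within a claim as literally drafted only if it includes every claimed element. A toy subset test illustrates the logic (an illustrative sketch, not legal analysis):

```python
def literally_infringes(claimed_elements, product_elements):
    """A product falls within a claim, as literally drafted, only if it
    includes every one of the claimed elements (subset test)."""
    return claimed_elements <= product_elements

claim = {"A", "B", "C", "D"}

print(literally_infringes(claim, {"A", "B", "C", "D", "E"}))  # True: extras don't help the copyist
print(literally_infringes(claim, {"A", "B", "C"}))            # False: D was omitted
```

This is exactly why describing narrower alternative embodiments (A + B + C alone, say) matters: the claim set can then cover the variant that omits D.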

Scooped by Richard Platt

DRAM Sales Forecast to Top $100 Billion This Year with 39% Market Growth

DRAM Sales Forecast to Top $100 Billion This Year with 39% Market Growth | Internet of Things - Technology focus | Scoop.it
Richard Platt's insight:

Sales and unit growth rates are shown for each of the 33 IC product categories defined by the World Semiconductor Trade Statistics (WSTS) organization in the Mid-Year Update.  The five largest IC product categories in terms of sales revenue and unit shipments are shown in Figure 1.

With forecast sales of $101.6 billion (39% growth), the DRAM market is expected to be the largest of all IC product categories in 2018, repeating the ranking it held last year.  If that sales level is achieved, it would mark the first time an individual IC product category has surpassed $100.0 billion in annual sales.  The DRAM market is forecast to account for 24% of IC sales in 2018.  The NAND flash market is expected to achieve the second-largest revenue level, with total sales of $62.6 billion this year.  Taken together, the two memory categories are forecast to account for 38% of the total $428.0 billion IC market in 2018.

For many years, the standard PC/server MPU category topped the list of largest IC product segments, but with ongoing increases in memory average selling prices, the MPU category is expected to slip to third position in 2018.  In the Mid-Year Update, IC Insights slightly raises its forecast for 2018 MPU sales, showing revenues increasing 5% to an all-time high of $50.8 billion, after a 6% increase in 2017 to the current record high of $48.5 billion.  Helping drive sales this year are AI-controlled systems and data-sharing applications over the Internet of Things.  Cloud computing, machine learning, and the expected tidal wave of data traffic coming from connected systems and sensors are also fueling MPU sales growth this year.  Two special-purpose logic categories, computer and peripherals and wireless communications, are forecast to round out the five largest product categories for 2018.

Four of the five largest categories in terms of unit shipments are forecast to be some type of analog device.  Total analog units are expected to account for 54% of the 318.1 billion total IC shipments forecast for this year.  Power management analog devices are projected to account for 22% of total IC units and are forecast to exceed the combined unit-shipment total of the next three categories on the list.  As the name implies, power management analog ICs help regulate power usage, keep ICs and systems running cooler, and ultimately help extend battery life, essential qualities for an increasingly mobile and battery-powered world of devices.
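The market-share percentages quoted above follow directly from the dollar figures. A quick arithmetic check, using only numbers from the text:

```python
# Forecast 2018 figures quoted in the text, in $B.
total_ic = 428.0
dram     = 101.6
nand     = 62.6

dram_share   = round(100 * dram / total_ic)           # DRAM alone
memory_share = round(100 * (dram + nand) / total_ic)  # DRAM + NAND flash

print(f"DRAM: {dram_share}% of IC sales")        # DRAM: 24% of IC sales
print(f"Memory combined: {memory_share}%")        # Memory combined: 38%
```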

Scooped by Richard Platt

Semi Content in Electronic Systems Forecast to Reach 31.4% in 2018

Semi Content in Electronic Systems Forecast to Reach 31.4% in 2018 | Internet of Things - Technology focus | Scoop.it
Richard Platt's insight:

Historically, the driving force behind the higher average annual growth rate of the semiconductor industry as compared to the electronic systems market is the increasing value or content of semiconductors used in electronic systems.  With global unit shipments of cellphones (-1%), automobiles (3%), and PCs (-1%) forecast to be weak in 2018, the disparity between the moderate growth in the electronic systems market and high growth of the semiconductor market is directly due to the increasing content of semiconductors in electronic systems.

While the trend of increasing semiconductor content has been evident for the past 30 years, the big jump in the average semiconductor content in electronic systems in 2018 is expected to be primarily due to the huge surge in DRAM and NAND flash ASPs and average electronic system sales growth this year. After slipping to 30.2% in 2020, the semiconductor content percentage is expected to climb to a new high of 31.5% in 2022.  IC Insights does not anticipate the percentage will fall below 30% any year through the forecast period.

The trend of increasingly higher semiconductor value in electronic systems has a limit.  Extrapolating an annual increase in the percent semiconductor figure indefinitely would, at some point in the future, result in the semiconductor content of an electronic system reaching 100%.  Whatever the ultimate ceiling is, once it is reached, the average annual growth for the semiconductor industry will closely track that of the electronic systems market (i.e., about 4%-5% per year).
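The semiconductor-content metric itself is simply semiconductor sales divided by electronic-systems sales. A minimal sketch, with hypothetical sales figures chosen only to reproduce the reported 31.4%:

```python
# Hypothetical 2018 sales figures ($B) -- illustrative placeholders, not
# IC Insights data; the ratio is what defines "semiconductor content."
semi_sales    = 478.0
systems_sales = 1522.0

content_pct = 100 * semi_sales / systems_sales
print(f"Semiconductor content: {content_pct:.1f}%")  # Semiconductor content: 31.4%
```

When semiconductor sales grow faster than systems sales, this ratio rises; the article's point is that it cannot rise forever, so the two growth rates must eventually converge.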

Scooped by Richard Platt

Generational Cycles – Distribution Curves & Contradictions

Generational Cycles – Distribution Curves & Contradictions | Internet of Things - Technology focus | Scoop.it
Darrell Mann There are two kinds of people in the world Part #335. There are the people that see everything as black or white. And there are people who
Richard Platt's insight:

There are two kinds of people in the world, Part #335. There are the people that see everything as black or white. And there are people who understand distribution curves. For some reason the generation-cycles work of Strauss and Howe seems to attract a lot of the former. They reveal themselves usually by taking great offence at a model that describes them (or their offspring) according to a caricature word or phrase. When a caricature word describes a generation cohort as ‘narcissistic’ or ‘suffocated’, their immediate assumption is that it is trying to insult everyone in the cohort. It is not. And, as far as I can tell following multiple re-readings of the Strauss & Howe work, it was not their view either.

Strauss & Howe understood distribution curves. Such that, when we characterize Boomers and Nomads as ‘live to work’ generations, we know that there is a range of live-to-work-like behaviours to be spotted when looking at different members of the cohort. Likewise, not every Hero-generation Millennial has a 180-degree-opposite ‘work to live’ perspective on life. The reality when comparing the different cohorts might look something like this: what this pair of distribution curves is trying to tell us is that ‘on average’ a GenX Nomad is more likely to have ‘live to work’ attitudes than the ‘average’ GenY Hero.

This doesn’t necessarily mean the distribution-curve-appreciating people like the Strauss & Howe model any more than the black-or-white crowd does. Their argument usually tracks along the lines of, ‘Well, if the difference between one cohort and the next is only a few percentage points, does the model actually tell us anything useful at all?’

And that’s why the world needs a third kind of person: people who understand that the true value of distribution curves has nothing at all to do with the mean or deviation, but rather with the two extremes. The extremes tell us what the contradictions are. And the contradictions define the innovation opportunities. It’s not about what percentage of Nomads live-to-work and what percentage of Heroes don’t; it’s about recognizing there is a contradiction between live-to-work and work-to-live, and if we can somehow solve that contradiction we make everyone in between the two extremes happy.

I wish there were more of this third kind of person. If there were, the world wouldn’t find itself having so many fatuous arguments about whether there’s a two-point difference between two distributions, or whether normal distribution curves are more relevant than power-law curves, but would rather get on with the more important job of innovating to make everyone happier.
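The overlapping-cohorts argument can be sketched numerically: two attitude distributions whose means differ only slightly share most of their probability mass, yet their tails, the extremes where the contradictions live, diverge sharply. The parameters below are illustrative placeholders, not survey data:

```python
from statistics import NormalDist

# Hypothetical "live to work" score distributions for two cohorts.
gen_x_nomad = NormalDist(mu=0.60, sigma=0.15)
gen_y_hero  = NormalDist(mu=0.45, sigma=0.15)

# The bulk of the two populations is shared...
overlap = gen_x_nomad.overlap(gen_y_hero)
print(f"shared probability mass: {overlap:.2f}")

# ...but the extreme "live to work" tail differs markedly between cohorts.
tail_x = 1 - gen_x_nomad.cdf(0.90)
tail_y = 1 - gen_y_hero.cdf(0.90)
print(f"fraction above 0.90: GenX {tail_x:.4f} vs GenY {tail_y:.4f}")
```

The averages are nearly indistinguishable while the extremes differ by an order of magnitude, which is exactly the "look at the tails, not the mean" point the passage makes.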
