5G, IoT, Big Data, Analytics, AI & Cloud

Ask The Thought Leaders: What’s the Future of the IoT? | Future of Everything

The concept of the “internet of things” was initially proposed in 1999 by Kevin Ashton. While it initially sounded like it was impossible, today, the concept is becoming very real. Predictions estimate that the growth of the IoT will reach 50 billion objects by 2020. So, what will a world with that many internet-connected devices …
Al Sedghi's insight:
Without a centralized IoT platform, businesses lack full visibility into the data that sensor-enabled assets generate, so enterprises need a central platform to gain that visibility. There are three development stages that IoT initiatives typically go through:

Stage 1: Process efficiency 

In this stage, the focus is on gathering information from connected devices to improve operations. IoT use is dedicated to a single business function rather than a formal business-wide program. For instance, fleet operators can sensor-enable trucks to identify and repair mechanical problems before they cause a breakdown.

Stage 2: Create new revenue streams

During this stage, the strategy is to leverage IoT data to create new revenue streams, and this is when you will need a central IoT platform. For example, a printing company can use a platform to remotely monitor customers' connected printers for faults and cartridge replacement. The IoT platform monitors and collects data from the connected assets and delivers the information to revenue-tracking and revenue-generating systems (e.g. product lifecycle management).

Stage 3: Business transformation 

During this stage, the enterprise is able to shift its business model from selling products to selling services. In the case of a connected car, automobile manufacturers can offer valuable services built on real-time data from the vehicle. Complex analytical models running in the cloud, or even on board the vehicle, can forecast service events and notify the driver of an impending issue in a safe, non-distracting way – and guide the driver to the nearest dealership with a vacant service bay and parts in stock – offering convenience to the customer.
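As a rough illustration of that last stage, here is a minimal Python sketch of the forecasting step. The telemetry fields, thresholds and sample values are all hypothetical, not taken from any specific manufacturer's platform; a production system would use far richer models than simple wear limits.

# Minimal predictive-maintenance sketch: flag vehicles whose telemetry
# suggests an upcoming service event. All fields and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class VehicleTelemetry:
    vin: str
    engine_hours: float
    brake_pad_mm: float     # remaining brake pad thickness
    oil_life_pct: float     # oil life remaining, 0-100

def needs_service(t: VehicleTelemetry) -> list:
    """Return human-readable reasons why service is recommended."""
    reasons = []
    if t.brake_pad_mm < 3.0:
        reasons.append("brake pads near wear limit")
    if t.oil_life_pct < 15.0:
        reasons.append("oil life below 15%")
    if t.engine_hours > 5000:
        reasons.append("scheduled engine inspection due")
    return reasons

if __name__ == "__main__":
    sample = VehicleTelemetry(vin="SAMPLEVIN0000001", engine_hours=5120,
                              brake_pad_mm=2.6, oil_life_pct=40.0)
    alerts = needs_service(sample)
    if alerts:
        print("Notify driver of " + sample.vin + ": " + "; ".join(alerts))

The same check could run in the cloud against a fleet-wide stream, or on board the vehicle against its own sensors.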

Because Digital Smart City Healthcare and Medical Systems

IoT-led innovations will drive a shift towards mature, value-based indicators for Smart City healthcare systems, improving efficiency and quality of life.
Al Sedghi's insight:
Smart Cities, by their very nature, produce significant amounts of data in their daily operations. IoT and Open Data initiatives are driving cities to collect and publish ever more data – some of it static, but increasingly large parts of it real-time. This data exhibits the classic characteristics of Big Data: high volume, often real-time (velocity), and extremely heterogeneous in its sources, formats and characteristics (variety).

This big data, if managed and analyzed well, can offer insights and economic value that cities and city stakeholders can use to improve efficiency and to innovate new services that improve the lives of citizens.

The technology that captures, manages and analyzes this Big Data builds on trends such as cloud computing: cities can now access massive compute resources that were too expensive to own and manage only a few years ago. Coupled with technologies like Hadoop/HDFS, Spark and Hive, plus a plethora of proprietary tools, it is now possible for cities to use big data and analytics to improve how they operate.

For example, Boston uses big data to track city performance against a range of indicators, to identify potholes in city streets, and to improve the efficiency of garbage collection by switching to a demand-driven approach. New York has developed a system (FireCast) that analyzes data from six city departments to identify buildings with a high fire risk. In Europe, London uses a wide variety of city data and advanced analytics to map individual neighborhoods, published through its Whereabouts service, to better understand resource allocation and planning. In Asia, Singapore tracks transportation in real time and runs a demand-driven road-pricing scheme to optimize road usage across the island.
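As a small sketch of how a city analytics team might use Spark for this kind of work, the PySpark job below ranks neighborhoods by open pothole reports. The file path, schema and column names are assumptions for illustration, not any particular city's open-data feed.

# Sketch: rank neighborhoods by open pothole reports with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("city-pothole-reports").getOrCreate()

# Assumed CSV columns: report_id, neighborhood, status, reported_date
reports = spark.read.csv("s3://city-open-data/pothole_reports.csv",
                         header=True, inferSchema=True)

open_by_neighborhood = (
    reports.filter(F.col("status") == "open")
           .groupBy("neighborhood")
           .agg(F.count("*").alias("open_reports"))
           .orderBy(F.desc("open_reports"))
)

open_by_neighborhood.show(10)  # top ten neighborhoods to prioritize for repair crews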

The Modern Data Platform


“The Modern Data Platform” starts off with a look back that lays the groundwork for the challenges of today.

Al Sedghi's insight:
Here are some of the main factors for getting the Big Data Analytics platform right: 

1. Raging speed 

Opportunities around data are greater than ever. Business users and customers expect results almost instantly, but meeting those expectations can be challenging, especially with legacy systems. Speed is not the only factor in executing a Big Data analytics strategy, but it is at the top of the list. I was working with a customer running queries on a 10-terabyte data set; with their legacy solution, a query would take 48 hours to come back with an answer, and after 48 hours the question is almost moot. There is no benefit to an answer that arrives after the time to act has passed.

2. Massive—and growing—capacity

Your Big Data analytics solution must be able to handle huge quantities of data, but it must also grow organically as that data increases. You must be able to grow your database in line with your data growth, and do it in a way that is transparent to the data consumer or analyst. A modern analytics solution involves very little downtime, if any at all; capacity and compute expansion happens in the background.

3. Easy integration with legacy tools 

A significant part of an analytics strategy is to ensure that it works with what you have, but also to identify which tools must be replaced, and when. Many organizations have invested in older tools, such as extract, transform, load (ETL) tools. It is vital to support those legacy tools, but at scale, as the need for data and analysis grows, you may find that scaling those ETL solutions becomes a costly problem. It might make more sense to re-tool your ETL with a more modern, more parallel solution.

4. Working well with Hadoop

For many organizations, this open-source Big Data framework has become synonymous with Big Data analytics. But Hadoop alone is not enough. At the end of the day, Hadoop is a batch processing system: when I start a job to analyze data, it goes into a queue, and it finishes when it finishes. When you are dealing with high-concurrency analytics, Hadoop shows its weaknesses. What's needed is a way to harness the advantages of Hadoop without incurring its performance penalties and potential disruptions.

5. Support for data scientists

Enterprises should help their most expert, and most in-demand, data workers by investing in tools that allow them to conduct more robust analysis on larger sets of data. What matters is moving toward a solution where data scientists can work on the data in place, in the database. For instance, with SQL Server they often pull a subset or sample of data out of the database, transfer it to their local machine, and run their analysis there. If they can run statistical models in-database, they are no longer sampling and get their answer much more quickly. It's a significantly more efficient process.
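A minimal sketch of that difference, assuming a hypothetical sales table reachable through SQLAlchemy (the connection string and column names are invented): pushing the aggregation into the database returns a handful of rows instead of copying millions to the analyst's laptop.

# Sketch: push computation into the database instead of sampling locally.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mssql+pyodbc://user:password@analytics_dsn")  # hypothetical DSN

# Anti-pattern: pull a large sample to the laptop, then aggregate in pandas.
sample = pd.read_sql("SELECT TOP 1000000 region, amount FROM sales", engine)
local_result = sample.groupby("region")["amount"].sum()

# Preferred: let the database scan every row and return only the aggregates.
in_db_result = pd.read_sql(
    "SELECT region, SUM(amount) AS total_amount FROM sales GROUP BY region",
    engine,
)
print(in_db_result)

The same idea extends to in-database machine learning, where the statistical model itself executes next to the data.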

6. Advanced analytics features

As organizations move toward predictive analytics, they demand more from their data technology. It's beyond just reporting, and beyond the aggregates in the data warehouse. You may be running complex queries against the data in your database – predictive, geospatial, and sentiment-focused.

Worldwide Sports Analytics Market to Reach $3.97 Billion by 2022

Al Sedghi's insight:
Today you can gather information about distance run, speed, burst speed, impact, strength, recovery times, and more. There is also the ability to study not just current stats but a massive amount of historical stats, which opens up a whole new series of benchmarking opportunities for performance and coaching.

To leverage available analytics, teams and leagues typically turn to data companies like Genius Sports and Sportradar – which can offer customized programs and services to meet individual needs. 

There are opportunities for a team to do everything from evaluating player biometrics, to improving scouting reports, to helping players practice and plan for opponents with data-informed virtual reality systems, to tailoring marketing to a fan's interests so that supporters get the most relevant and exciting experiences achievable.

Big Data and IoT - How The Future Of Analytics is Evolving | Analytics Training Blog

With each passing day, more objects and machines are getting connected to the internet, transmitting information for analysis. The objective is to harness this data to discover trends and results that can benefit any business. And why not, after all the future of technology lies in the hands of data …
Al Sedghi's insight:
The biggest barrier facing enterprises considering IoT deployments will be knowing what to do with the massive amounts of information that will be gathered.

You will face many sources of data, from social media to sensors to embedded devices. So the challenge is to design for analytics: develop a strategy in which you see data more as a supply chain than a warehouse. There will be numerous unstructured data sources, so it is key to focus on collecting and organizing the data that you really need.

For data analysis, the challenges to overcome revolve around the five Vs: Variety, Volume, Velocity, Veracity, and Value.

Variety: the different types of data, which are growing every day. Structured data that has traditionally been stored in databases is now being linked with unstructured data, including social media data, wearable data, and video streams.

Volume: the scale of data and how it is obtained and warehoused. According to IBM, approximately 2.5 quintillion bytes of data are created every single day, and by 2020 there will be 300x more information in the world than there was in 2005, roughly 43 trillion gigabytes.

Velocity: the pace at which data arrives and must be processed.

Veracity: the uncertainty versus the reliability of data. According to IBM, poor data quality costs the U.S. economy over $3 trillion a year.

Value: the ability to make data profitable, using analyzed data to grow revenue and reduce cost.

It is evident that the variety and volume of data being delivered through networks is overwhelming; consequently, this affects velocity, i.e. how quickly technology and enterprises can analyze the full mass of information that is collected. We should employ well-organized network connections, monitors and sensors to work out behavioral patterns, and applications to structure those patterns. The key V to focus on is Value: monetizing Big Data.
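To make the Variety point concrete, here is a minimal Python sketch (the source field names are invented) that normalizes records from two very different feeds, a sensor reading and a social-media post, into one common event schema before analysis.

# Sketch: map heterogeneous source records onto a common event schema.
from datetime import datetime, timezone

def from_sensor(reading):
    # e.g. {"device_id": "t-17", "temp_c": 21.4, "ts": 1497871200}
    return {
        "source": "sensor",
        "entity": reading["device_id"],
        "value": reading["temp_c"],
        "text": None,
        "timestamp": datetime.fromtimestamp(reading["ts"], tz=timezone.utc),
    }

def from_social_post(post):
    # e.g. {"user": "@rider42", "body": "train delayed again", "created_at": "..."}
    return {
        "source": "social",
        "entity": post["user"],
        "value": None,
        "text": post["body"],
        "timestamp": datetime.fromisoformat(post["created_at"]),
    }

events = [
    from_sensor({"device_id": "t-17", "temp_c": 21.4, "ts": 1497871200}),
    from_social_post({"user": "@rider42", "body": "train delayed again",
                      "created_at": "2017-06-19T12:00:00+00:00"}),
]
print(events)

Once everything shares one schema, the same analytics, and the same Velocity and Value questions, apply to every source.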

Business Challenges 

We need to concentrate on analytics, accurate forecasting, evaluating and implementing new tools and technologies, and getting real-time insights. We must determine how to profit from something that takes time, money and resources to keep up with.

Business analytics: We must use analytics to grow our operational acumen and measure whether it’s working and who it’s working for. 

Accurate forecasting: Leveraging analytics, we can make reliable inferences about the future of the market. We understand what we did, so the next questions are: how can we do it better? What's next for our demographic? What will our demographic respond to?

Real-time insights: are significant to ensuring that the analytics and forecasting are worth it. Time is crucial. Enterprises want to know what’s going on immediately so they can determine how to respond accordingly. Big Data can do this. 

Technology Challenges 

Workforce: We must train the workforce that is already in place and attract and hire top talent to fill the gaps. It's a flourishing market for IoT / Big Data specialists. Specialists with expertise in areas such as VMware, application development, open-source technology, data warehousing and solid programming will be the ones to employ.

Infrastructure: Agile IT is key to the business. From an architectural point of view, the fewer systems you have, the more agile you will be. Here is how to focus the ecosystem buildout:

• To reduce cost, add modern systems alongside legacy systems initially. Use the existing architecture and take an evolutionary approach to building the smart ecosystem of the future; there is no need to start from scratch, which would cost valuable time and resources.

• Focus on building out the infrastructure for business-critical applications:

>>> Customer experience depends on high availability.
>>> Plan for hardware, network and data management applications.
>>> Design data storage with the capacity to hold and update data at low cost.
>>> The network must be able to cost-effectively move data to and from frameworks and architectures while allowing for future growth.
>>> Data management applications must be adaptable enough to process, classify and consume vast amounts of data in real time.

• Analyze data for predictive analytics and targeted marketing campaigns.

Tools

Invest in established BI tools and applications. This will enable you to take advantage of the vast amounts of data that’s collected as well as gain valuable insight. Focus on applications that are easy-to-use and able to provide interactive interfaces to help you gain control and make informed decisions. 

Regulatory Challenges 

Be aware of current and new regulations that limit the collection, storage and use of certain types of data. Obtaining and analyzing data to improve business practices will help to increase revenue and decrease costs.

How do you succeed?

Notwithstanding all the challenges, there are many opportunities to monetize IoT / Big Data and get ahead of the competition. These include opportunities to:

1) Grow revenue by tailoring advertising, products and services, and by leveraging predictive analytics to get real-time insights into customer behavior. 

2) Reduce costs through security, fraud prevention and network optimization.

3) Improve the Customer Experience by getting customer feedback, targeted and predictive marketing, and tailored products and services. 

To overcome these challenges there must be collaboration across the business, technology and regulatory silos. For instance, the business and technology silos need to work together to discover synergies and to enhance and automate IT and business processes for the most effective operation across the enterprise. Work together as a team and be smart.

Can artificial intelligence cataloging be the Google for enterprise big data?


Al Sedghi's insight:
Unlike with a Relational Database Management System (RDBMS), when data lands in Hadoop (on HDFS) there is no assurance that the data, at the record level, is "good" or consumable. What's more alarming is that when retrieving this data (whether with Hive, Spark, Impala, etc.), the user will not be able to know whether the data is "bad" or not. This is one of many challenges in what's referred to as "on-boarding" data; other key areas to consider include organization and history management.
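A minimal sketch of one such on-boarding check in PySpark, assuming a hypothetical customer feed landed on HDFS (the path and column names are illustrative): records that fail basic validity rules are routed to a quarantine location instead of silently flowing into downstream queries.

# Sketch: split an on-boarded HDFS dataset into "good" and "bad" records.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("onboarding-quality-check").getOrCreate()

raw = spark.read.json("hdfs:///landing/customers/2017-06-19/")  # assumed landing path

is_valid = (
    F.col("customer_id").isNotNull()
    & F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    & F.col("age").between(0, 120)
)

good = raw.filter(is_valid)
bad = raw.filter(~is_valid)

good.write.mode("overwrite").parquet("hdfs:///curated/customers/")
bad.write.mode("overwrite").parquet("hdfs:///quarantine/customers/")
print("accepted=%d quarantined=%d" % (good.count(), bad.count()))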

Having an intelligent data management platform offering self-service data analytics will help marketers understand their customers and find similar customers across various marketing channels, devices, ad campaigns, etc. 

This platform must have the ability to create models that progress from descriptive to predictive to prescriptive analytics. Users evaluate possible alternatives and predict outcomes through simulation analysis.

This leads me to the next point: why is AI a revived trend? It is all about the three Vs: velocity, variety, and volume. Platforms that can process the three Vs with contemporary, horizontally scaling processing models offer a ten- to twenty-fold cost advantage over traditional platforms.

We will realize the highest value from applying AI to high-volume, repetitive tasks, where uniformity and stability are more effective than individual intuitive oversight with its attendant human error and cost.

Taming Big Data with Spark Streaming for Real-time Data Processing

Learn how Spark Streaming capabilities help handle big and fast data challenges through stream processing by letting developers write streaming jobs.
Al Sedghi's insight:
Spark is a technology well worth considering and learning about. It has a growing open-source community and is the most active Apache project at the moment. 

Spark offers a faster and more general data processing platform that lets you run programs up to 100x faster in memory, or 10x faster on disk, than Hadoop. Other reasons to consider Spark are that it is highly scalable, simpler and modular, which could be why it is being adopted by key players like Amazon, eBay, Netflix, Uber, Pinterest and Yahoo.

Last year, Spark overtook Hadoop by completing the 100 TB Daytona GraySort contest 3x faster on one tenth the number of machines, and it also became the fastest open-source engine for sorting a petabyte. Spark also lets you write code faster, with over 80 high-level operators at your disposal.

Additional major features of Spark include: 

- Provides APIs in Scala, Java, and Python, with support for other languages (such as R) on the way 

- Integrates well with the Hadoop ecosystem and data sources (HDFS, Amazon S3, Hive, HBase, Cassandra, etc.)

- Can run on clusters managed by Hadoop YARN or Apache Mesos, and can also run standalone

Overall, Spark simplifies the challenging and compute-intensive task of handling high volumes of real-time or archived data, both structured and unstructured, effortlessly integrating relevant complex capabilities such as machine learning and graph algorithms.
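As a small, hedged example of the streaming side (the socket source, schema and window size are just for illustration), here is a Spark Structured Streaming job that keeps a running event count per device over one-minute windows.

# Sketch: windowed counts over a stream of JSON events with Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("device-event-counts").getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_time", TimestampType()),
])

# Assumed source: one JSON event per line arriving on a local socket.
lines = (spark.readStream.format("socket")
         .option("host", "localhost").option("port", 9999).load())
events = lines.select(F.from_json(F.col("value"), schema).alias("e")).select("e.*")

counts = (events
          .withWatermark("event_time", "2 minutes")
          .groupBy(F.window("event_time", "1 minute"), "device_id")
          .count())

query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()

In production the socket source would typically be replaced by Kafka or another durable source, and the console sink by a database or dashboard.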

The Future Is Intelligent Apps – InFocus Blog | Dell EMC Services


Enterprises must pay attention to key application technology and architecture capabilities

Al Sedghi's insight:
Data gathering has advanced dramatically. From automobiles, to smart grids, to patients' bodies, there are now dashboards and platforms for collecting, organizing and displaying detailed information, from fuel consumption to vital signs, with sophisticated alert configuration.

APIs and open source are essentially democratizing data in all these scenarios, and amid all this lie machine learning and predictive analytics. Two things really threaten putting machine learning to work: a) poor data quality and b) lack of data integration. The improvement of APIs and the trend toward open sourcing can address both threats. There are plenty of open-source projects on GitHub that software developers can leverage to integrate machine learning into their applications. Last year Google started releasing lower-level libraries like TensorFlow, which can be used in conjunction with others to match whatever level of refinement a developer or data scientist is after. For novices, there are services like Amazon Machine Learning, which provides a simple UI for non-developers.
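As an illustration of how low the barrier has become, here is a minimal TensorFlow/Keras sketch that trains a tiny classifier on synthetic data; the feature count, labels and architecture are made up purely for demonstration.

# Sketch: tiny binary classifier with TensorFlow's Keras API on synthetic data.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8)).astype("float32")       # 8 made-up features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype("float32")    # synthetic label rule

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print(model.predict(X[:3]))  # predicted probabilities for the first three samples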

Enterprises must pay attention to key application technology and architecture capabilities, such as data management services, user self-service, shareable analytics, and agile development leveraging the latest PaaS and DevOps techniques.

Actility Announces Breakthrough IoT Geolocation and Tracking Solution Platform for Logistics

Actility, the industry leader in Low Power Wide Area Networks (LPWAN), today announces the availability of a comprehensive geolocation and tracking solution platform offering breakthrough network-based location capability enabled by LoRaWAN network gateways and infrastructure.

Fog Computing Improves Healthcare Cloud for IoT Adoption

The OpenFog Consortium aims to solve the bandwidth, latency, and communication challenges associated with the Internet of Things, artificial intelligence, and other advanced digital IT infrastructure concepts.
Al Sedghi's insight:
The purpose of the healthcare IoT is to make it simpler for patients to stay connected to their providers, and for providers to deliver reliable, value-optimized care to their populations. Fog computing may be the primary infrastructure for turning the healthcare IoT from innovation into reality.

To achieve this, the healthcare industry must master three of its main big data hurdles: a) the challenge of turning big data into smart data, b) the geographical distribution of healthcare providers and their lack of interoperability, and c) the strict patient privacy and security rules that govern the flow of sensitive health data.

Fog computing can surmount these hurdles through small data processing centers that exchange data without needing the cloud. By means of predefined authorization and user protocols, a patient's health data could be available to each device through a shared interface, but any computation would happen only where the data originates: at the hospital or physician's office that holds the patient record.

Fortunately, the OpenFog Consortium was founded in November 2015 by ARM, Cisco, Dell, Intel, Microsoft and Princeton University, based on the shared vision that an open fog computing architecture is necessary in today's increasingly connected world. The consortium is addressing the bandwidth, latency and communication challenges related to the IoT, AI and other advanced digital IT infrastructure concepts.

How geospatial analytics can give your business a competitive edge

What happens when you combine geographic information system (GIS) and internet of things/sensor data with data from operations, customers, finance and marketing? You get an unbeatable competitive edge for business.
Al Sedghi's insight:
GIS’s robust capability to integrate different kinds of data about a physical location can lead to better-informed decisions on public and private investments in infrastructure and services, together with a more efficient and timely reaction in emergency situations. 

Here are some use cases where geospatial analytics can help a business remain competitive:

 - Improve Sales & Marketing (Market Segmentation): 

Divide customers into groups with common characteristics, e.g. demographic data such as gender and income, or lifestyle and behavioral data such as buying patterns, and then use GPS location data to push offers to those specific customers.

 - Upgrade Asset Management: 

Monitor your physical assets and act when they fail, become unavailable or have operational issues (e.g. network assets during power outages). This use case lets you determine where the failure is and which service crew is closest to dispatch (see the sketch after this list).

 - Strengthen situational awareness & intelligence

Law enforcement / public safety agencies can predict and prevent crime using: a) Mapping, b) Alerts c) predictions 

- Improve risk analysis: 

Determine how often certain events might happen and what their outcome could be, to minimize exposure. Example: insurance companies assess risk to set premiums based on analytics and location data gathered from floods and hurricanes.


- Boost transportation & logistics planning: 

Transportation charges are frequently a large part of logistics costs, with fuel, maintenance and driver time all adding to the total. A transportation company might try to maximize fleet utilization for optimum efficiency by developing complex models using linear programming techniques. Inputs might include historical data on routes taken, trucks used, orders loaded on each truck, shift times, and delivery data. These companies frequently run alternate-scenario modeling to decide which routes make the most sense.

- Improve strategic location determination:

A market optimization exercise is an example where planners must understand the relationships among the various factors that affect the choice of a location, and geospatial analysis can enrich this application. One related use case is restaurant chains, which can use geospatial analysis to determine where a new restaurant should be located, considering i) the target market, ii) competitors in a specific area, iii) operational considerations, and iv) change, e.g. using GIS to look at new plans for housing and roads that might affect the number of potential customers coming to a location.

 - Improve fraud detection and prevention: 

Use case 1 (Insurance): An auto insurance company can use location information about where fraud might occur; for example, Allstate may perform an analysis to identify repair shops that inflate repair estimates for automobile body work.

Use case 2 (Healthcare): Examining claims forms to discover physicians who might be inflating claims or fabricating reports.

Use case 3 (Government agencies): Studying where recipients are defrauding social programs such as the Supplemental Nutrition Assistance Program (food stamps).

Use case 4 (Banking): Credit card companies can examine transaction data, including a combination of the transaction's geographic location, amount, date/time, and merchandise category, to detect fraud.
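Referring back to the asset-management use case above, here is a minimal Python sketch (coordinates and crew names are made up) that picks the service crew closest to a failed asset using the haversine great-circle distance.

# Sketch: dispatch the service crew closest to a failed network asset.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

failed_asset = (40.7580, -73.9855)   # hypothetical transformer location
crews = {                            # hypothetical crew positions
    "crew-A": (40.7306, -73.9866),
    "crew-B": (40.6782, -73.9442),
    "crew-C": (40.7831, -73.9712),
}

closest = min(crews, key=lambda c: haversine_km(*failed_asset, *crews[c]))
distance = haversine_km(*failed_asset, *crews[closest])
print("Dispatch %s, %.1f km from the failure" % (closest, distance))

Real GIS platforms add routing, live traffic and crew availability on top of this basic distance calculation.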

Expansion of cloud services fuels the IoT sensors market in emerging economies


Daily Fintech

Global Fintech Trends Analysis for Banks Insurance ScaleUps Investors
Al Sedghi's insight:
Fintech startups are unbundling banking, and there is no doubt these startups disrupt services that the big banks provide or facilitate. If startups in all these areas succeed, they could eat into banks' revenue opportunities. Here are the key imperatives banks should focus on:

 • Use analytics holistically across the bank 

 • Generate an elegant, segmented, and integrated customer experience, versus using one-size-fits-all distribution

 • Create digital-marketing competencies on par with those of the e-commerce giants

 • Aggressively reduce the potential cost advantage of Fintech attackers through fundamental simplification, process digitization, and streamlining

 • Quickly use and deploy the next generation of technologies, from mobile to agile to cloud 

 • Reorganize legacy organizational structures and decision rights to support a digital environment

Advanced analytics, Big Data to Blockchain driving disruption in banking sector; here’s how

Advanced analytics, Big Data and open APIs are driving disruption in the banking sector, followed by artificial intelligence and Blockchain
Al Sedghi's insight:

Analytics and Big Data are redefining banking. BBVA recently rolled out an open API framework, and a group of technology giants (Amazon, Apple, Google, Intuit and PayPal) has launched "Financial Innovation Now" to promote innovation in financial services. Another bank that has been very active in this space is Capital One, which has been transforming its technology to build API-based services. It launched its developer portal, "Capital One DevExchange", offering developer tools and APIs that empower developers to use the financial institution's software building resources to create better customer experiences.

 

So, what are the benefits? New startups and FinTechs can create value-added services leveraging these APIs; one example is offering product comparison or credit scoring. Additionally, these APIs make it possible to locate, access and extract large amounts of a bank's data efficiently and securely.

 

Here is a scenario: a customer with an account at ABC bank wants to apply for a loan or credit card at DEF bank. If DEF bank has built the capability to leverage the open APIs offered by ABC bank, it can connect to ABC bank and access important financial details for that customer. The open API enables DEF bank not only to obtain personal details from the applicant but also to assess creditworthiness without going through more expensive credit-check companies.
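A minimal sketch of DEF bank's side of that call in Python; the host name, endpoint path, token and response fields are all hypothetical, since real open-banking APIs (and their consent flows) differ by bank and by regulation.

# Sketch: DEF bank pulls an applicant's account summary from ABC bank's
# hypothetical open API after the customer has granted consent.
import requests

ABC_API = "https://api.abcbank.example/open-banking/v1"   # hypothetical base URL
ACCESS_TOKEN = "customer-consented-oauth-token"           # obtained via a consent flow

def fetch_account_summary(customer_id):
    resp = requests.get(
        ABC_API + "/customers/" + customer_id + "/account-summary",
        headers={"Authorization": "Bearer " + ACCESS_TOKEN},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response fields: avg_balance, salary_credits, overdrafts
    return resp.json()

summary = fetch_account_summary("cust-12345")
creditworthy = summary["avg_balance"] > 1000 and summary["overdrafts"] == 0
print("pre-approve" if creditworthy else "refer to manual review")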

 


Bitcoin Startups Challenging Big Banks Profits - Bitcoin News


How much of big banks’ profits do you think will be lost to Bitcoin startups? 

Al Sedghi's insight:
Bank disrupters such as Bitcoin startup Abra or money transfer service TransferWise have certainly been scaring the banks. Bitcoin has promised a new way to hold and move money, totally outside the existing banking system.

In my view, Bitcoin is a threat down the road but I don’t believe it has yet taken off to hold and spend money on a day-to-day basis, as many of its fans had predicted. Some larger banks have been making big investments targeted at using the technology underlying Bitcoin — known as the Blockchain — for their own purposes. 

Americans are now able to send each other money immediately from their phones, thanks to Venmo, and can get loan approval in minutes, also from their phone.

According to the consulting firm McKinsey, digital disruption could put $90 billion, or 25 percent of bank profits, at risk over the next three years as services become more automated and chatbots replace tellers.

Banks can learn new ideas from the technology world and shrink their own operations without inevitably losing substantial numbers of customers to start-ups.

Venmo, for example, has captured the wallets of many young Americans, but the major banks have all started their own versions of the service, and some of them claim that, as single banks, they already process more instant personal payments than Venmo.

Several large banks are also collaborating to start a cross-industry mobile application, Zelle, that will take Venmo head-on in 2017. 

Financial start-ups are not confronting the same problems everywhere in the world. Ant Financial, a technology firm in China (spin-off from Alibaba) has over 450 million users and handles most online payments in China. 

Ant Financial has used its position as an electronic, phone-based wallet to attract other financial business from Chinese consumers, offering online loans, insurance and investment advice to its hundreds of millions of customers. 

According to CB Insights, China has four of the five most valuable financial technology start-ups in the world, with Ant Financial leading the way at $60 billion. There was a 64 percent rise in financial technology investment in China last year, against a 29 percent drop in the US. In Africa, mobile phone-based payment systems like M-Pesa, started by a local mobile operator in Kenya, have grown to become the leading form of payment in the country.

So why have the financial start-ups not achieved the same level of growth in the United States? Because most Americans already have access to a relatively functional set of financial products, unlike in Africa and China. 

Initially we saw some start-ups looking to become banks, so they could offer deposit insurance among other things, but they generally found that getting a bank charter takes more time and capital than is available to even the most successful start-ups. This left most start-ups dependent on banks to hold and move any money they collected from customers.

Online lending organizations were also particularly determined to take on the banks from outside the system. For example, firms such as Lending Club, which makes personal loans, and OnDeck, which offers small business loans, initially expanded quickly and went public, but both companies have faced challenges and run into limits on how fast a lending business can grow without being a bank.

The online lenders have wrestled with the high cost of gaining new customers through marketing, which is not as crucial in other industries that have been disrupted by technology.

At a more basic level, online lenders learned how hard it was to fund the new loans they wanted to give out without having access to inexpensive deposits, as the banks do. Non-bank lenders will take share from banks, though I think they must pick their spots carefully. 

Start-ups enhancing payments have seen some success; one example is Stripe, which helps new companies accept online payments.

But you must understand that Stripe and other payment start-ups like Square are all developed on top of the current credit card and banking infrastructure and have not posed any type of fundamental risk or challenge to the prevailing giants. 

In my view, Bitcoin has not lived up to the marketing promises that were made for it back in 2012, and I think it will take some time to see success in this space. But I do believe that Bitcoin startups with access to plenty of capital and a strong developer base may succeed and challenge banks, establishing their brands, penetrating markets, and inking deals with customers soon.

Can the Internet of things solve environmental crises?

If so, it must consider local business and cultural needs, build business processes and market structures around the world.
Al Sedghi's insight:
We have to think about the likely limitations on IoT due to power consumption and the use of rare earth elements, right from the beginning of any relevant IoT project. How energy-hungry the IoT becomes will mainly depend on the types of devices chosen for deployment and what they will be doing.

Low-power, low-data transmitting devices – such as sensors that are used to monitor when it is time to refill vending machines – are not likely to increase energy bills. Many of these devices don’t use main building power, but instead leverage long-lasting batteries or solar energy.

But some devices that are used for video surveillance are going to be energy-hungry. In fact, these devices will require main power to operate and will drive data consumption tremendously. According to Cisco, internet video surveillance traffic almost doubled between 2014 and 2015, and is forecasted to increase ten times by 2020.

IoT can drive energy harvesting wireless technology. IoT networks gain from self-powered technology because it eliminates the need for battery replacement, making devices maintenance-free, which is particularly positive for remote areas and the deployment of billions of connected devices. Moreover, energy harvesting sensors provide IoT with a green angle, bridging between the automation world and the mobile world with the added benefit of being eco-friendly.  

The total volume of data being transmitted and stored is also forecast to explode. Data storage has become more energy efficient over recent years: instead of being stored on company on-premises servers and relying on their own data centers, data is increasingly stored and processed in the cloud.

To drive energy efficiency and climate change initiatives, not only are policies needed, but also interest and action from the user community and leading industry organizations.


Connected to converged: Can fog computing cure industrial IoT's pain points? - SiliconANGLE

Al Sedghi's insight:
Fog computing enables computing, decision-making and action-taking to occur at IoT devices, pushing only relevant data to the cloud.

The fog extends the cloud to be closer to the things that generate and act on IoT data. In this context, we refer to any IoT device with computing, storage and network connectivity as a fog node. Fog nodes can be deployed anywhere with a network connection: on a factory floor, on top of a power pole, alongside a railway track, in a car, or on an oil rig. Examples include industrial controllers, switches, routers, embedded servers, and video surveillance cameras.

Here is a summary of what occurs within fog computing:

• Examines the most time-sensitive data at the network edge, close to where it is created instead of sending vast amounts of IoT data to the cloud. 

• Acts on IoT data in milliseconds, based on policy.

• Transmits selected data to the cloud for historical analysis and longer-term storage. 

Benefits of using Fog Computing 

• Reduce latency 

• Preserve network bandwidth

• Handle security concerns at various levels of the network 

• Operate consistently with rapid decisions

• Gather and secure a wide range of data

• Move data to the most suitable place for processing 

• Reduce expenses by using high computing power only when needed and by using less bandwidth

• Better analysis and insights of local data 

Note that fog computing is not a substitute for cloud computing; it works in combination with cloud computing, improving the use of available resources. It addresses two challenges: real-time processing of and action on incoming data, and optimizing the use of resources like bandwidth and computing power. Another positive element of fog computing is that it takes advantage of the distributed nature of today's virtualized IT resources. This enhancement to the data-path hierarchy is enabled by the increased compute functionality that manufacturers are building into their edge routers and switches.

Here is a real-life example: A traffic light system in a major city is equipped with smart sensors. An application developed by the city to regulate light patterns and timing is running on each edge device. The app automatically adjusts light patterns in real time, at the edge, dealing with traffic issues as they occur and improving traffic flow. Once the traffic slowdown is over, all the data collected from the traffic light system would be sent to the cloud and examined, supporting predictive analysis and letting the city regulate and improve its traffic application’s response to future anomalies. There is little value in sending a live stream of traffic sensor data to the cloud for storage and analysis. Instead, the information is processed and acted upon in the edge nodes, and only a summary is sent to the cloud for further analysis.
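A minimal Python sketch of that pattern on a single fog node (the sensor read, threshold policy and cloud upload are placeholders): act locally in milliseconds, then ship only a periodic summary upstream.

# Sketch: edge loop that acts on readings locally and sends only summaries upstream.
import random
import statistics
import time

THRESHOLD = 80.0   # placeholder policy: act locally above this value
readings = []

def read_sensor():
    return random.uniform(50, 100)     # stand-in for a real sensor driver

def actuate_locally(value):
    print("local action: value %.1f exceeded threshold" % value)

def send_summary_to_cloud(summary):
    print("uploading summary to cloud:", summary)   # placeholder for an HTTPS call

for tick in range(1, 301):             # ~5 minutes at one reading per tick
    value = read_sensor()
    readings.append(value)
    if value > THRESHOLD:              # time-sensitive decision stays at the edge
        actuate_locally(value)
    if tick % 60 == 0:                 # once a minute, push a compact summary only
        send_summary_to_cloud({
            "count": len(readings),
            "mean": round(statistics.mean(readings), 2),
            "max": max(readings),
        })
        readings.clear()
    time.sleep(0.01)                   # shortened for the sketch

In the traffic-light example above, the "local action" would be the light-pattern adjustment, and the summary would feed the city's historical analysis in the cloud.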

Navigating a Successful Journey toward IoT and Big Data in the Cloud | IT News Africa – Africa's Technology News Leader


"It is predicted that by 2020, 25 billion devices will be connected to IoT and 600 zettabytes of information will be sitting in

Al Sedghi's insight:
IoT is about more than devices, data and connectivity. The real significance of the Internet of Things lies in creating smarter products, delivering intelligent insights and producing new business outcomes.

Going forward, as millions of devices get connected, the Internet of Things will produce an enormous inflow of Big Data. The key challenge is surfacing insights from various categories of data (structured, unstructured, images, contextual, dark data, real-time) in the context of your applications.

I believe gaining intelligence from Big Data using artificial Intelligence technologies is the key enabler for smarter devices and a connected world. 

The final objective is to connect the data coming from sensors with other contextual information to discover patterns and associations in real time that positively impact businesses. Current Big Data technologies must be extended to effectively store, manage and extract value from continuous streams of sensor data.

In the case of connected cars, if 25 gigabytes of data is being sent to the cloud every hour, the main goal must be to make sense of this data, detecting the portions that can be consumed and rapidly acted upon to produce actionable events. The evolution of AI technologies will be key to rapidly extracting insights from massive streams of data.

With IoT, Big Data analytics will also need to move to the edge for real-time decision making. Here are some examples: i) detecting crop patterns in agricultural fields using drones, ii) detecting suspicious activities at ATMs, or iii) predicting driver behavior for a connected car.

7 Tips to Prepare Your Network for the Internet of Things

To prepare the network to support a plethora of connected devices, consider these seven tips.
Al Sedghi's insight:
To prepare your network for IoT, one key recommendation is to move data storage and processing to the network edge. This potentially decreases the distance that data must travel over a network. Nevertheless, it does mean an increased number of edge devices, and a corresponding increase in costs. 

One alternative strategy is to redesign current network routing and capitalize on the capacity that is already there. Latency can be reduced by 60% by redesigning network routing protocols to prevent congestion and traffic bottlenecks. Furthermore, smart internet routing can optimize bandwidth, freeing up more space for IP traffic.

Analytics and the cloud: The Internet of Things

…of low-power sensors, and their ease of use coupled with lifetimes of several years before replacement…
Al Sedghi's insight:
For critical and time-sensitive remote sensing applications, it is key to develop sensors that can run on near-zero power and produce a wake-up signal when a specific signature or alert signal is detected, such as a car or truck driving by, or a generator being switched on. For disaster situations such as earthquakes, you can imagine an always-on geophone sensing vibrations in the earth.

Strategies to Improve Health Results and Reduce Costs

Digital Health Solutions Cost Savings Driven By Medication Adherence, ER Diversion and Behavior Change
Al Sedghi's insight:
It’s not surprising that lack of medication adherence increases hospital and nursing home admissions. Technology that organizes multiple medications by time of day has proven to help patients stick with their prescribed therapy. 

Another major cost-saving strategy is ER diversion, enabled by an interoperable health information exchange (HIE) system or other information-sharing portal that updates the health center when one of its patients presents at the ER or is admitted. The health center and hospital also frequently maintain interoperable electronic medical record systems that allow patient health information to be transmitted in a timely fashion. This technology can also be leveraged during a disaster: the web portal would be activated, and health care professionals, and possibly first responders assisting in the response, could access it to view critical, lifesaving patient health information.

An HIE can be built on cloud computing technology. The idea is that applications can live anywhere within a group of servers instead of on one specific server, and each healthcare provider needs only one connection to the cloud instead of one to every other provider in the exchange. The cloud-based solution offers data storage and multimedia communication, including alerts. As an add-on service, an overlay unified communication system can be developed and integrated to offer real-time video or audio consults during an emergency.

3 areas banks need to embrace to improve customer experience

The banking sector is struggling to rapidly adapt to an omnichannel customer experience strategy. But, focusing on three areas can help while also accelerating products to market and deescalating costs.
Al Sedghi's insight:
Platforms in the core banking system, which you will find at the deepest level of the IT organization, are usually batch-based, largely monolithic, and old by technology standards.

The challenge that typically surfaces with these monolithic platforms is that customer-facing applications must evolve and advance faster, yet they have data dependencies on core banking, i.e. old legacy platforms, which makes the technology upgrade challenging.

Even if your mobile banking application uses an agile methodology to go to market, changes that need to be made in core banking will delay timely commercial delivery of the whole process. This is a perfect example of where a microservices architecture layered on top of core banking could separate the release cycles and increase agility for the organization overall.
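A minimal sketch of that layering in Python with Flask; the core-banking client, endpoint names and fields are invented for illustration. The microservice owns its own release cycle and shields the mobile app from the legacy batch platform.

# Sketch: a small microservice that fronts the legacy core-banking platform.
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_balance_from_core_banking(account_id):
    # Placeholder for the call into the legacy platform (MQ, file transfer, SOAP, ...).
    return 1234.56

@app.route("/api/v1/accounts/<account_id>/balance")
def get_balance(account_id):
    balance = fetch_balance_from_core_banking(account_id)
    # The mobile app talks only to this service, which can be released independently
    # of the core banking platform's own change cycle.
    return jsonify({"account_id": account_id, "balance": balance, "currency": "USD"})

if __name__ == "__main__":
    app.run(port=8080)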

Physician willingness to use wearables data in treatment grows

The lack of integration of devices with electronic health record systems
Al Sedghi's insight:
Big data and new high-tech innovations have the potential to address health disparities and improve health outcomes for patients. These new tools and methods can provide a stronger evidence base for more effective, robust, broad, and equitable healthcare delivery. Their potential lies in adding relevant and timely data to individual patient and hospital records. Here are some examples:

• Mobile data analysis: To address the Ebola virus disease epidemic, call data records (CDRs) from mobile network operators have been collected to map people's mobility and project the path of the disease. CDRs can provide a powerful tool to identify risks, design information campaigns, and show the results of actions.

 • Patient monitoring through self-tracking via sensors, gadgets, and apps: In 2015, there were more than 100,000 health apps available for smart phones. In the U.S., 34% of all Americans who tracked their health habits stated that self-tracking has affected a health decision they have taken.

At any rate, the capability to combine numerous sources of data is crucial to successfully leveraging big data in health care. Additionally, the constantly growing volume of data with complex patterns may extend beyond the physician's ability to interpret it with traditional data processing techniques. A data revolution for better health outcomes will require establishing the right incentives to support coordination among different stakeholders within health care systems. Its success will also require new partnerships to link data producers with data users and data analysts. Finally, acknowledging the value of big data and the will to act on its insights requires a fundamental shift in approach.

Huawei pushing into NB-IoT through partnerships and research | ZDNet

While Huawei is not aiming to deliver IoT products itself, it is pushing into the market by providing equipment and technologies through its partnerships with businesses and research organisations.
Al Sedghi's insight:
Huawei has been very active in partnering with various players around the world to offer its narrowband Internet of Things (NB-IoT) solution. 

Last year Huawei and Pessl Instruments, an Austrian manufacturer of IoT products for agriculture, partnered to develop end-to-end solutions leveraging NB-IoT technology to reduce risks associated with natural disasters and extreme weather, enabling smarter and more profitable farming.

Pessl makes standalone environmental monitoring instruments that can be used to track parameters such as temperature, humidity, rainfall, leaf wetness, insect pressure and soil moisture. NB-IoT will connect remote weather stations, soil moisture sensors and many other devices to the internet.

NB-IoT uses an ultra-low-power, cost-effective chip with long-range reach that can connect almost any spot in the world to the internet by reusing the commercial mobile network. Farmers get wide connectivity across their own fields for timely application of plant protection products, fertilizers, and most other activities of the crop cycle.

How artificial intelligence will transform Wall Street

High-earning traders will quickly become an endangered species as AI takes over the financial sector.
Al Sedghi's insight:
There is certainly motivation to replace million-dollar-a-year ($500 an hour!) traders with AI bots, but how likely is that in the near term? Imagine having a digital trading assistant stopping you with a popup: "This trade is unusual or rather lopsided for you. Typically, you invest in biotechs in the late stages of FDA approval. This firm just began the process. Are you certain this is a good trade?"
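A minimal sketch of the kind of check behind such a popup, using scikit-learn's IsolationForest on a synthetic history of a trader's position size and sector exposure; the features, data and threshold are illustrative only.

# Sketch: flag trades that look unusual against a trader's own history.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Synthetic history: columns = [trade_size_usd, fraction_in_biotech]
history = np.column_stack([
    rng.normal(50_000, 10_000, 500),   # typical trade sizes
    rng.normal(0.7, 0.1, 500),         # typically ~70% biotech exposure
])

model = IsolationForest(contamination=0.02, random_state=7).fit(history)

new_trade = np.array([[250_000, 0.05]])   # far larger, far outside the usual sector
if model.predict(new_trade)[0] == -1:     # -1 means "anomalous" for IsolationForest
    print("This trade is unusual for you. Are you certain you want to proceed?")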

In sectors such as health care, where human interaction is vital, automation threatens fewer jobs than it does elsewhere in the labor market, and several factors can influence labor supply and demand. Taxi and truck drivers, for example, face a bleak future given recent advances in self-driving cars.

In other markets the impact on human replacement will be different. Finance is particularly notable: because of the degree to which the industry is built on processing information, the stuff of digitization, research by Oxford academics suggests it has more jobs at high risk of automation than any other skilled industry, about 54 percent. In the case of Wall Street, traders need to aggregate and filter through tremendous amounts of data quickly, and will rely on technology to help if it is available.

So-called robo-advisers generate personalized investment portfolios, removing the need for stockbrokers and financial advisers. Almost every Wall Street firm has published research reports on the tens of billions of dollars of revenue that might be lost to these upstarts in the coming years. Banks are trying to resist the newcomers by making their own investments in market data analytics software start-ups like Kensho, which has raised more than $25 million so far. Kensho’s Big Data / Analytics data-crunching software technology allows financial professionals to make better, faster and more informed decisions with statistics.  

In conclusion, the financial industry is paying close attention to automation and taking it very seriously, both as an opportunity and as a threat. It is one thing for a few analysts to lose their jobs, but automation could put whole business models at risk. AI bots will certainly play a major role in banking and on Wall Street, augmenting traders rather than replacing them, at least in the near term. And when things get complicated and you need assistance from a human being, you will go to a customer service representative anyway.