Looking ahead to what may become the golden age of data for the Internet of Things (IoT) and connected vehicles, there is reason for optimism: new business units and innovations are already steering towards clear strategic goals.
Matt Hatton, Vice President of Research at Gartner, gave a presentation at the recent TechXLRate conference, part of London Tech Week, where he spoke about the emerging ecosystem for the IoT.
Gartner’s research* showed that over the next few years, data and analytics programmes will become even more mission-critical throughout the automotive business and across industries.
According to Gartner analysts Douglas Laney and Ankush Jain: “Modern information infrastructure will include data virtualization, the separation of storage and computing, and cloud-based data persistence. Data and analytics leaders must evolve their technology capabilities for digital transformation… An increasing pressure to manage data in multiple deployment models, while also optimising its access and retrieval, is mounting.”*
Companies recruiting workers to monitor and guide neural networks
In Gartner’s forecasts*, by 2020 20% of companies will dedicate workers to monitor and guide neural networks. By 2021, 30% of net new revenue growth from industry-specific solutions will include artificial intelligence (AI) technology. By 2020, 25% of large organizations will be either sellers or buyers of data via formal online data marketplaces.
IoT data is not just big, but broad, non-standardised and inter-connected
The transformational aspects of the IoT, the so-called IoT 2.0, have taken longer to materialise than anyone expected. For now, much of the hype is around smart home services such as smart energy meters, in-vehicle connected services, and simple efficiency steps. But we are now moving close to major transformation with IoT 2.0.
An estimated 80% of global corporations are investing in some aspect of the IoT, although this doesn’t amount to much yet: around 80% of those projects are still at the proof-of-concept stage, and 80% of those, in turn, are unlikely to reach any kind of commercial service. But it is starting to come.
For our part at LexisNexis Risk Solutions, we launched the Global Telematics Exchange to connect vehicles to insurance and other services across the whole gamut of hardware types and software sources.
We have embarked on a strategy to help vehicle OEMs with their data challenges, which will crystallise into new ways of delivering personalised services across the lifetime ownership experience: vehicle sales, recalls, service alerts, and, in the context of insurance and risk, FNOL (first notification of loss) and incident services.
In the automotive sector today there are many different data formats. Big data is not just big: it is being created and distributed broadly, worldwide, and on such a scale that in many cases it is too large to move around in its raw state.
Test running of vehicles in hot and cold countries, data coming from the powertrain (durability, exhaust gas, diagnostics, oil data, metadata), and the differing infrastructure for cellular networks and insurance telematics around the world are just a few examples of a situation that is creating a whole landscape of different databases and tools. Integration is key, and there is an important conversation to be had around holistic analysis.
Car data testing can reach 20GB to 40GB per shift
Consider that a new (non-autonomous) vehicle today carries some 4,000 to 5,000 data labels, with higher data output than ever before: the equivalent of 20GB to 40GB per shift of testing. The accumulated data becomes too large to move away from any single test site once it reaches 2PB to 4PB per year, requiring a distributed data system.
But the real problem ahead comes with autonomous driving. With data output rising to 4GB per second of vehicle testing, it becomes a very challenging data scenario, requiring scalable, cost-effective access and resilience, such as we at LexisNexis currently deliver to insurance and other business services through our HPCC Systems super-computing arm and data lake.
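To put these rates in perspective, a few lines of back-of-the-envelope arithmetic show why the jump to autonomous testing changes the storage problem entirely. The per-shift and per-second figures come from this article; the eight-hour shift length is an assumption for illustration only.

```python
# Illustrative arithmetic only. Figures from the article: a conventional
# test vehicle logs 20-40 GB per shift; an autonomous test vehicle
# produces roughly 4 GB per second. The 8-hour shift is an assumption.

GB = 1
TB = 1_000 * GB

per_shift_today = 40 * GB            # upper end of today's 20-40 GB range

rate_autonomous = 4 * GB             # per second of autonomous testing
per_hour_autonomous = rate_autonomous * 3_600       # 14,400 GB
per_shift_autonomous = per_hour_autonomous * 8      # assumed 8-hour shift

print(f"Conventional shift: {per_shift_today} GB")
print(f"Autonomous hour:    {per_hour_autonomous / TB:.1f} TB")
print(f"Autonomous shift:   {per_shift_autonomous / TB:.1f} TB")
print(f"Ratio vs today:     {per_shift_autonomous / per_shift_today:,.0f}x")
```

On these assumptions a single autonomous test shift produces roughly 115 TB, nearly three thousand times today's per-shift volume, which is why moving the raw data off a test site stops being practical and distributed storage and compute become the default.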
Speaking at the Smart Transportation and Mobility Summit at TechXLRate, Dr Tobias Abthoff, CTO of NorCom, said the OEMs are getting better and faster with data management.
“But today we have a problem,” said Dr Abthoff. “For advanced vehicle technology we can have distributed data storage with some search and analysis, with redundancy built into the vehicle systems with LIDAR, optical systems and radar. But what happens if there is damage to the parameters of the sensors? To what extent do the different sensors and radar know about each other? Which is the right data to use in the road environment, for example with buses, other vehicles and traffic lights in the mix of data? How do we deal with limitations such as noise from the data, for example if there are stickers obscuring traffic signs? For us, as humans designing the systems, it is counter-intuitive to detect these anomalies.”
In this context for autonomous driving and intelligent decision-making on the part of the vehicle, we can see there are still challenges.
Data economy discussions turning to inter-connectivity and shared platforms
Vehicles will move closer to a state of sensor fusion, similar to human thought where the ears, eyes and other senses can relate to each other. The sensors will then be able to maintain safety in case of an incident or damage, such as when a particular sensor is moved to the right or to the left. At this level, vehicles will be more aware of their surroundings and the physical state of any particular sensor input, moving from machine learning (algorithms designed by a human) into the realms of deep learning (algorithms designed by the machine) and neural networks. But at the same time we can see this will eat up tons of data on a scale never seen before.
The data pipeline for autonomous vehicle testing, with its challenges of data labelling and training the systems, is now so large that if it doesn’t work when fully automated with AI, it doesn’t work at all.
We can think of the challenges rather like that icon of motoring history, Henry Ford and his approach to finding solutions. His greatest achievements were not only in the design of the motor car itself and the Ford Model T, but in the invention of the moving assembly line.
The conversation has shifted to broad uses of the data, not just the use a particular app was designed for. It is now around data hubs and data exchanges: taking data from the IoT and making it available to other uses across a platform, and across platforms.
*Gartner’s report ‘100 Data and Analytics Predictions Through 2021’, 20 June 2017