HUMANS ARE FABRICS OF ABSTRACTION
As human beings, we are equipped with an outstanding processing device called the brain, which delivers tangible representations of our surroundings in near real-time (around 300 ms). We experience reality through a world that has already been filtered before it reaches the conscious mind. Our raw sensory data for vision, hearing and taste are all post-processed to focus on the essential tasks of a particular context. In that sense, our skull hosts a high-speed virtualiser chip.
In a way, we raise this skill to another dimension by combining it with our “consciousness”: drawing, writing, building and even describing our universe with an invented universal language such as mathematics.
The brain only processes its own virtual creations of a subjective reality, meaning that our perception is limited by our consciousness, which itself runs at something like a dream level, accounting for an estimated 3% to 8% of the brain’s total activity.
That can be linked to the Freudian reality principle—the mind’s ability to assess the reality of the external world and act upon it. When acting to make the right decisions for our survival, we need to process a comprehensible representation of reality, an abstraction closely bound to our world. It sounds like an obvious tautology, but it is a great reminder for transposing that train of thought to the digital world.
The act of coding could be considered an action on reality, reaching from a virtual environment into our physical world: an attempt to control reality using algorithms and mathematics.
FROM BARE-METAL SERVER TO INFRASTRUCTURE BY CODE
If we look back at the last forty years of computer technology, we observe the gradual creation of abstraction layers, each enabling programs of a complexity the previous layer could not support.
The circle of abstraction and paradigm has now closed: we design systems able to imitate the brain’s neuronal models. The industry even plans to rely on machine learning in automated cars, trusting those systems to make safety decisions affecting human lives on top of multiple layers of virtualisation.
The first (very) personal computers (PCs) that appeared in the early 80s had no connection to any network. Programming them in low-level languages, close to the bare metal, was tedious: programmers needed a deep understanding of the electronic architecture to develop anything. The alternative was early modular compiled languages, limited in both performance and options.
Then structured, object-oriented programming languages using libraries appeared, drastically accelerating development and introducing a layer of abstraction through classes. This made it possible to define reusable tools: you could design your own Lego bricks, then assemble them into modules and spaceships.
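The Lego analogy can be made concrete with a minimal sketch of class-based reuse. All names here (`Brick`, `Module`, `Spaceship`) are hypothetical, chosen purely to illustrate how one abstraction is defined once and reused at higher levels:

```python
# A minimal sketch of class-based reuse: define a generic
# "brick" once, then assemble ever larger structures from it.
# All class names are hypothetical, for illustration only.

class Brick:
    """A reusable building block, measured in studs."""
    def __init__(self, studs: int):
        self.studs = studs

class Module:
    """Anything assembled from bricks, reusable as-is by larger builds."""
    def __init__(self, name: str, bricks: list):
        self.name = name
        self.bricks = bricks

    def total_studs(self) -> int:
        return sum(b.studs for b in self.bricks)

class Spaceship(Module):
    """A spaceship is just a module composed of other modules."""
    def __init__(self, modules: list):
        super().__init__("spaceship", [b for m in modules for b in m.bricks])

cockpit = Module("cockpit", [Brick(4), Brick(2)])
engine = Module("engine", [Brick(8)])
ship = Spaceship([cockpit, engine])
print(ship.total_studs())  # the same bricks, reused at a higher level
```

The point is not the toy domain but the mechanism: `Module` is written once, and `Spaceship` inherits its behaviour instead of reimplementing it.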
Even though IBM introduced hardware-assisted virtualisation as early as 1972, it was only from 2005 onwards that computer virtualisation took over. It revolutionised the way developers could reproduce production infrastructure configurations, facilitated the lean ways of working inspired by the automotive industry and later initiated the first steps towards the DevOps approach.
Integrating operations people (system engineers) into the development loop, together with containerisation tools like Docker and orchestration systems like Kubernetes, tremendously accelerated the development process. Developers, system administrators, testers and all the teams involved could now dynamically adjust the production and testing infrastructures, reducing deployment time and making systems much more reliable.
Companies like Google and Amazon have initiated most of those technological and cultural innovations by solving growing complexities. The DevOps movement is now integrating more and more aspects related to the software development process.
Finally, all these IT approaches, from Agile, Scrum and Kanban to DevOps, exist to define a common grammar and vocabulary, building a shared team reality that increases efficiency and reduces errors. The ultimate goal is to share the same level of reality.
The history of IT abstraction shows that we are better at solving complex tasks by handling symbols and abstractions, working with our brain’s design, than by trying to deal with raw functional information.
CARS WILL HAVE TO SHARE OUR PRINCIPLE(S) OF REALITY
Connecting sophisticated machines responsible for our health and safety, such as cars or medical devices, raises many questions for the industry on building services around machines executing a hundred million lines of code.
That includes responsibilities beyond business concerns. With human lives being on the line, compromises in terms of security, safety and ethics are not allowed.
Consequently, the technological choices around the new era of connected vehicles, including communication security, the integrity of software updates, data privacy and all new services attached to the car, will have to comply with vital requirements.
Integrating those crucial constraints adds complexity to a currently unprepared electronic car architecture in a fragmented supply chain, which traditionally wasn’t cooperative. Until now.
Machines will share our reality principles, becoming active business actors and autonomous moving objects. The signs of a paradigm shift are unmistakable.
Making this possible on a global scale will require technologies providing indisputable trust, transparency, cooperation and, above all, a common understanding of the challenges ahead for the automotive industry. That understanding is already being tested by companies able to embrace the change, such as Google with its Waymo self-driving car project, Tesla and Baidu.
The connected vehicle’s complex and sensitive electronic topology will need a safe and secure rationalisation, especially when it comes to sensors and AI model integrations. Otherwise, the implementation investments and risks involved will discourage every subsequent development initiative.
The automotive industry has been using virtual representations for a decade, leveraging the concept of the Digital Twin, coined in 2002 by Michael Grieves at the University of Michigan. From 3D modelling to the production of a functional prototype, the entire team works on an evolving abstraction, speeding up processes, enabling remote cooperation and reducing costs. Until now, though, it has mostly been limited to the design phase, before manufacturing.
Tesla already offers Digital Twin access for its vehicles: a high-level abstracted representation reflecting the car’s state and features, with growing interaction possibilities.
But that is only scratching the surface of the potential. Applying this virtualisation down to the level of the individual electronic control units (ECUs) will revolutionise the way the industry handles the car’s life cycle. Here again, new tools built on new technologies are needed. Some of those technologies are already mature enough to combine trust, security and collaboration into a reliable network representation of physically complex machines.
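To make the idea tangible, an ECU-level digital twin can be pictured as a mirrored state record kept in sync with the physical component. The sketch below is purely illustrative; every class name, field and value is a hypothetical assumption, not any vendor’s actual API:

```python
# A minimal, hypothetical sketch of an ECU-level digital twin:
# a mirrored record of a control unit's reported state, updated
# from telemetry and queryable without touching the vehicle.
from dataclasses import dataclass, field

@dataclass
class EcuTwin:
    """Virtual mirror of one electronic control unit (ECU)."""
    ecu_id: str
    firmware_version: str
    state: dict = field(default_factory=dict)

    def apply_telemetry(self, reading: dict) -> None:
        """Merge the latest reported values into the mirrored state."""
        self.state.update(reading)

@dataclass
class VehicleTwin:
    """High-level twin aggregating the twins of individual ECUs."""
    vin: str
    ecus: dict = field(default_factory=dict)

    def register(self, twin: EcuTwin) -> None:
        self.ecus[twin.ecu_id] = twin

    def snapshot(self) -> dict:
        """A queryable abstraction of the whole car's current state."""
        return {ecu_id: twin.state for ecu_id, twin in self.ecus.items()}

# Example: mirror a (fictional) brake controller and query the car.
car = VehicleTwin(vin="TEST-VIN-000")
brakes = EcuTwin(ecu_id="brake-ctrl", firmware_version="1.4.2")
car.register(brakes)
brakes.apply_telemetry({"pad_wear_pct": 12, "abs_active": False})
print(car.snapshot())
```

The design choice worth noting is the aggregation: the vehicle twin owns nothing but references to ECU twins, so each supplier’s component can be mirrored and updated independently while the car-level snapshot stays coherent.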
In the next blog post, we will cover how trustworthy abstractions of connected vehicles will enable a new field of interactions and business models: the new mobility shift, autonomous vehicles, the upcoming vehicle-data gold rush, security linked to safety and many other aspects.