In the not-so-distant future, customers will want to simply utter from the warm, frothy comfort of their hot tubs, “Hey Alexa, get the tires rotated for my Mach-E before noon tomorrow,” and expect an incredible, integrated intelligence to support that experience. First, the connected car’s agent must shop the available service-bay reservations against the bathing customer’s preferences, such as price and customer reviews; marry that timing with weather predictions so the autonomous vehicle can circumvent any impending storms; ensure the vehicle’s charge or fuel will support the trip at the proposed time; and initiate the late-night journey for servicing. Then the vehicular technology must sense the nearly desolate ecosystem around the customer’s remote cottage, avoid any unanticipated obstacles and find its way to the exact service bay while coordinating arrival with the repair station’s agent. This requires systems of systems within the vehicle, including Advanced Driver Assistance Systems (ADAS), programmable telematics and a myriad of sensing technologies to prevent mishaps such as accidental stowaways and prancing deer. But to accomplish such complexity, automotive companies must employ a strategy like the one previously described to Forbes by Valeo’s Vice President of Research and Development, Guillaume Devauchelle: “On both the hardware and software side there’s a significant investment, so any ‘platform’ – both on and off the vehicle — is built on stable, long-term bricks.”

However, the test environment will arguably be just as complex. In that tire rotation example, the manufacturer must not only perform component, integration and system testing for the hundreds of critical, on-vehicle parts, but must subsequently test the tens of thousands of build combinations against several cellular networks (e.g. 3G, 4G, 5G), various grid generations, numerous international variables (e.g. Right-Hand Drive, roundabouts) and a multitude of other environmental conditions. “With more systems exchanging even more information under a much larger set of configurations and use cases, you could legitimately say there would be an infinite set of [test] scenarios,” states Chad Chesney, NI’s Vice President and General Manager of Transportation. “It is cost prohibitive to physically test all updates in all locations in the world. The trend is moving towards simulation testing as much as possible – although not exclusively — which minimizes the cost of the test suite while ensuring the vehicle’s performance and functional safety.” But to realize these efficiencies, the simulation must be nearly flawless. “If you want to win in the new Mobility Era, you will need to bring products to market quicker at a lower cost, but it must perform correctly,” continues Chesney. “The cost of failure is high, especially with the heightened attention.” It all comes down to two words: cost and trust.
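The combinatorial pressure Chesney describes is easy to see on the back of an envelope. In the sketch below, the configuration axes and the build-variant count are illustrative assumptions rather than any manufacturer’s actual test matrix; the point is how quickly a handful of environmental variables multiplied against build combinations explodes into millions of scenarios:

```python
from itertools import product

# Hypothetical configuration axes for illustration only; a real program
# spans hundreds of on-vehicle parts and many more environmental variables.
cellular_networks = ["3G", "4G", "5G"]
drive_sides = ["LHD", "RHD"]            # Left- and Right-Hand Drive
grid_generations = ["Gen1", "Gen2", "Gen3"]
weather = ["clear", "rain", "snow", "fog"]
build_variants = 20_000                 # order-of-magnitude placeholder

# Full cross-product of just these few environmental axes:
env_combinations = len(list(product(
    cellular_networks, drive_sides, grid_generations, weather)))
total_scenarios = env_combinations * build_variants

print(env_combinations)   # 3 * 2 * 3 * 4 = 72 environment combinations
print(total_scenarios)    # 1,440,000 scenarios, before any road or traffic variation
```

Four small axes already yield 72 environment combinations; crossed with tens of thousands of builds, the count passes a million, which is why physical testing of every combination is cost prohibitive.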

So the automotive developer that wants to succeed must have the foresight to employ three key strategies for the test environment: an open platform, a learning intelligence and an understanding of the switchover costs.

Open Platform

The ideal would be a purchasable, quasi-Google World with live, dynamic hazards such as distracted pedestrians and black ice, as well as constantly updating infrastructure elements like the Next Generation Electrical Grid, 6G cellular and The-Next-Great-Thing 2.0. However, nothing like that exists.

Currently, the best available solutions are core, open platforms that address the heart of vehicular performance and allow for tailored, plug-in “segments” for specific operating domains or additional in-vehicle systems. This requires, though, that the manufacturer start down the yellow brick road with insight, since the majority of tools are closed: designed for a particular, proprietary system. Not very flexible. Not particularly upgradable.

“Cars will become far more defined by the software than anything else, which makes them increasingly upgradable,” states Chesney. “The industry is going to rely upon evolving software platforms, which will manifest as a material quickening of the process. We have case studies where a new variant is realized in 6 to 9 months instead of the traditional 12 to 18 months.”

Learning Intelligence

Anyone who has used first-generation simulations has thought, “This thing is garbage and will never replace the ‘real thing.’” Early Alexas could not even answer the simplest of questions.

But learning intelligence, fed by lots of data, is what transforms those worthless simulations into powerful tools. “Although simulation is the way to reduce costs and time, the auto industry cannot accomplish its Safety Goals without a portfolio of field testing and simulations that are highly correlated to the physical world. There will have to be a thread of information throughout, with learning and evolution over time that will iteratively improve,” predicts Chesney.

The good news: per engineering.com, even 2019 connected vehicles generate an estimated 12 to 4,399 petabytes of data per day. The bad news: storing, cleaning and assimilating that data isn’t as easy as typing this sentence; the upfront fixed costs and ongoing operational costs must be understood to reap the benefits.
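To make those operational costs concrete, here is a rough sketch of the storage bill alone for the low end of that fleet-data estimate. The per-gigabyte rate is a hypothetical placeholder; real cloud or on-premise pricing varies widely:

```python
# Back-of-envelope storage math; the cost rate is an assumed figure,
# not any vendor's actual price.
petabytes_per_day = 12            # low end of the cited 12-4,399 PB/day estimate
gb_per_petabyte = 1_000_000
cost_per_gb_month = 0.02          # hypothetical storage rate, USD per GB-month

daily_gb = petabytes_per_day * gb_per_petabyte      # 12,000,000 GB per day
monthly_ingest_gb = daily_gb * 30                   # one month's intake
monthly_storage_cost = monthly_ingest_gb * cost_per_gb_month

print(f"${monthly_storage_cost:,.0f} per month")    # $7,200,000 per month
```

Even at the conservative end, simply retaining a month of raw intake runs into the millions of dollars per month, before any cleaning or assimilation; that is the “upfront and ongoing cost” story in miniature.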

Understanding the Switchover Costs

Once this intelligent, learning test environment has been built, the antibodies will attack. “This test thing has become expensive. Can’t we do that same stuff with a cheaper option?” will be the financial controller’s question while trying to hit his annual-reduction target. And just like the off-shoring of all test personnel in the late 90’s, the holistic cost-value equation will not be calculated in the moment; only the instantaneous, myopic cost reduction for the sake of this year’s budget crunch.

Few companies will avoid this slippery slope. Those that do will constantly evaluate both the tangible and intangible value (e.g. speed to market), understand the switchover costs of either the entire testing platform or its various plug-ins, and favor long-term strategies over near-term purchases. That sounds like motherhood and apple pie but, like many things, the devil is [thwarted] in the details. For instance, one hidden cost will be retrofitting any new test suite for all possibly affected, historical vehicle architectures prior to the next over-the-air reflash, since 8 to 10 years’ worth of vehicles might receive the latest cybersecurity update. Another hidden opportunity cost might be the historical data that is no longer applicable for improving the correlations of the newest modeling algorithms and, therefore, resets the accuracy of surrogate testing. And so it goes. Such switchover costs are true of any sticky platform (e.g. transferring from iOS to Android): the innovation’s value must be understood within the context of the variable costs AND the losses incurred by moving to the Next Best Alternative. It’s a massive exercise, but worth the trip.
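That cost-value framing can be sketched in a few lines. All of the dollar figures below are hypothetical placeholders purely for illustration; the point is the shape of the calculation, not the numbers:

```python
# Hypothetical figures for illustrating the switchover calculus described
# above; real numbers depend entirely on the program and the platform.
annual_savings_of_cheaper_option = 2_000_000   # the controller's headline number

switchover_costs = {
    "retrofit_legacy_architectures": 1_500_000,  # 8-10 model years to re-qualify
    "lost_model_correlation_data": 900_000,      # surrogate-test accuracy reset
    "re_tooling_and_training": 600_000,
}
one_time_cost = sum(switchover_costs.values())

# Years before the "cheaper" option actually breaks even,
# before counting intangibles such as speed to market:
breakeven_years = one_time_cost / annual_savings_of_cheaper_option
print(f"{breakeven_years:.1f} years to break even")  # 1.5 years
```

In this made-up scenario the “cheaper” option takes a year and a half just to recoup the hidden one-time costs, and that is before pricing in intangible losses like slower time to market, which is exactly why the in-the-moment budget view misleads.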

And while you’re calculating that, I’ll hopefully be lounging in my hot tub and ordering my tire rotation.