This week’s Consumer Electronics Show (CES) in Las Vegas has been even more dominated by automotive news than last year’s, with scores of announcements of new in-vehicle development platforms, automotive 5G services, self-driving concept cars, automotive cockpit UIs, assisted-driving systems, and a host of electric vehicles. We’ve also seen numerous systems that provide Google Assistant- or Alexa-driven in-vehicle interfaces, such as Anker’s Google Assistant-based Roav Bolt.
Here we take a brief look at some of the major development-focused CES automotive announcements to date. The mostly Linux-focused developments range from Hyundai joining the Automotive Grade Linux project to major self-driving and ADAS platforms from Baidu, Intel, and Nvidia.
Hyundai jumps on AGL bandwagon
Just prior to the launch of CES, the Linux Foundation’s Automotive Grade Linux (AGL) project announced that South Korean automotive giant Hyundai has joined the group as a Bronze member. The news follows last month’s addition of BearingPoint, BedRock Systems, Big Lake Software, Cognomotiv, and Dellfer to the AGL project. In October, AGL announced seven other new members, including its first Chinese car manufacturer — Sitech Electric Automotive.
Hyundai’s membership does not commit it to using the group’s Unified Code Base (UCB) reference distribution for automotive in-vehicle infotainment, but it’s another example of the growing support for the open source, Linux-based IVI stack. Several major carmakers are members, including Honda, Mazda, Mitsubishi Electric, and Suzuki, but so far Toyota is the only AGL carmaker to ship UCB-based IVI systems, which appear in most of its major models, from the Camry to its Lexus luxury cars. In June, AGL announced that Mercedes-Benz Vans was using UCB for upcoming vans, and we can expect more AGL commitments in 2019.
At the Westgate Hotel Pavilion (booth 1614) in Las Vegas this week, AGL is showing off a 2019 Toyota RAV4 equipped with AGL systems, and AGL members are offering demonstrations of AGL-based connected car services, audio innovations, instrument clusters, and security solutions.
Baidu releases open source Apollo 3.5 self-driving software
AGL is not the only automotive project offering an open source solution. For the past year, Chinese search and cloud giant Baidu has been developing its Linux-driven Apollo stack for self-driving cars. At CES, it announced Apollo 3.5, with new support for “complex urban and suburban driving scenarios.” A hardware platform is available with an Intel Core-based Neousys industrial computer equipped with an Nvidia graphics card, along with other components including Baidu’s own sensor fusion unit.
Baidu also announced an Apollo Enterprise platform, built on top of Apollo and designed for autonomous fleet operations. In addition, it revealed an open source OpenEdge cloud-enabled edge computing platform with development boards based on NXP and Intel technologies. The latter board is designed for in-car video analytics and incorporates Intel’s Mobileye technology. Details were sketchy, however.
Intel advances Mobileye-based AV platform
The Intel AV system provides 60 percent greater performance at the same 30W consumption as Nvidia’s automotive-focused Jetson Xavier processor, Intel claims. The Mobileye EyeQ5 processors are each claimed to deliver 24 trillion deep learning operations per second (TOPS) at 10W. Volkswagen and Nissan have announced plans to use the earlier EyeQ4 processors in vehicles launching later this year. An EyeQ5 Linux SDK with support for OpenCL, deep learning deployment tools, and adaptive AUTOSAR will be available later this year, and production will begin in 2020.
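Taking these vendor claims at face value, the numbers are internally consistent, as a quick back-of-the-envelope check shows. This is a sketch using only the figures quoted above; the two-chip EyeQ5 configuration is an assumption inferred from the plural “processors … each” phrasing, not something Intel has confirmed here.

```python
# Sanity-check of the vendor-claimed figures quoted above.
# All numbers are Intel's and Nvidia's own claims, not independent benchmarks.

eyeq5_tops = 24.0    # claimed deep-learning TOPS per EyeQ5 chip
eyeq5_watts = 10.0   # claimed power draw per EyeQ5 chip
num_eyeq5 = 2        # ASSUMED chip count (see note above)

xavier_tops = 30.0   # claimed TOPS for Nvidia's Jetson AGX Xavier
xavier_watts = 30.0  # power envelope cited in the comparison

intel_tops = eyeq5_tops * num_eyeq5       # 48 TOPS total for two chips
advantage = intel_tops / xavier_tops - 1  # 0.6, i.e. "60 percent greater"

print(f"EyeQ5 efficiency:  {eyeq5_tops / eyeq5_watts:.1f} TOPS/W")    # 2.4
print(f"Xavier efficiency: {xavier_tops / xavier_watts:.1f} TOPS/W")  # 1.0
print(f"Claimed advantage: {advantage:.0%}")                          # 60%
```

In other words, two EyeQ5s would total 48 TOPS at 20W, which squares with the claimed 60 percent advantage over Xavier’s 30 TOPS and a better-than-2x lead in TOPS per watt, at least on paper.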
The Atom 3xx4 chip, meanwhile, borrows high-end multi-threading and virtualization technologies from Intel’s Xeon processors for running different tasks simultaneously on different systems around the car.
Nvidia Drive Autopilot
Intel is playing catchup with Nvidia in the autonomous vehicle computer contest. In recent years, Nvidia has increasingly focused on the automotive business, launching one of the first independent self-driving car computers with its Drive PX Pegasus based on its newly shipping, octa-core Arm-based Jetson AGX Xavier module. At CES, it followed up with a Xavier-based Nvidia Drive Autopilot system.
Unlike the fully autonomous, “Level 5” Drive PX Pegasus, the Drive Autopilot is designed for Level 2 assisted ADAS systems. Due to ship in vehicles in 2020, the system delivers a claimed 30 TOPS of AI performance and provides “complete surround camera sensor data from outside the vehicle and inside the cabin.”
Drive Autopilot integrates a new Drive IX software stack that can map and memorize typical routes to improve performance in the future. It also provides driverless highway merge, lane change, and lane split capabilities, as well as driver monitoring and an AI copilot. We saw no OS details, but presumably Drive Autopilot runs the Linux4Tegra (L4T) stack used on other Xavier-based systems.