VSI In the News
Elektrobit and NXP Join Forces on Automated Driving
"One of the most challenging tasks for automated vehicle development is interfacing software components with your target platform. The availability of EB robinos for NXP’s BlueBox means that applications can be run on the hardware via the EB robinos open interface specification. This saves time and enables a proper buildup of functionality,” said Phil Magney, founder and principal advisor at Vision Systems Intelligence, LLC.
TI's Shrewd Robo-Car Strategy
Phil Magney, founder and principal advisor for Vision Systems Intelligence (VSI), noted, “TI does not subscribe to massive architectural overhauls.” He pointed out, “For TI, it is all about incremental ADAS features which become the enablers to automation. TI is not concerned with L4 and L5 at the moment. In time their architectures will support advanced levels of automation but for now they are targeting automotive safety and convenience features because that is where the money is.”
Toyota Going Open-Source in '18 Camry
Danny Kim, a director and partner at Vision Systems Intelligence (VSI), partly agreed. He noted, “AGL’s regional dominance would likely be limited to Asia (or Japan) at the beginning, just like Genivi was to Europe.” However, he described Toyota’s announcement as “a significant endorsement for the standard to be successful as any standardization needs to be led by a major OEM.”
Matthew Linder, VSI AV software engineer, noted, “We have not seen any references to AGL being used on a production ECU for ADAS or autonomous driving. Furthermore, we have not heard of any form of Linux being used for an ADAS or autonomous system.”
Robo-car Redraws Auto Landscape
Magney likes to analyze the autonomous vehicle platform at a more granular level, though, by breaking it into chunks, which he describes as “AV (autonomous vehicle) stacks.” By his definition, AV stacks include pieces like perception, localization/planning, decision/behavior, control, and connectivity & I/O. Major chip companies are often the ones pushing each of these AV stacks, said Magney. “And they drive partnerships.”
Toyota Selects Nvidia, Intel Feels Heat
Phil Magney, founder and principal at VSI (Vision Systems Intelligence), told EE Times that among Nvidia's growing list of automotive partners, some are pilot programs while others are in production. “In the case of Toyota this is a production deal to use the Drive PX (or elements of it) to improve automation and safety in future vehicles.”
Baidu Battles Google in Robo-Car Derby
"Baidu could have millions of autonomous vehicles using their cloud-based Automated Vehicle platform,” Magney said. “Their platform essentially becomes the operating system for managing seas of vehicles, sort of like the ‘Internet of Vehicles,’ allowing autonomous vehicles to move seamlessly together.”
Taiwan Eyes Automotive Market
Magney doesn’t think there will be a major disruption in the supply chain. He said, “The big tech companies have been turning to traditional auto suppliers lately for their integration of safety-critical systems or other challenging elements of building a vehicle. Taiwan needs to build expertise in integrating safety-critical components and in strict functional safety practices. Sure, Taiwanese companies may move up the value chain and have the capacity to handle safety elements out of context (SEooC), but a complete system is another matter.”
Unresolved Issues Facing Robo-Cars
VSI believes AI has the ability to train driverless cars to behave more like people, who sometimes, counterintuitively, use “a little bit of aggression” to create an opening to merge into, for example. Furthermore, data will remain “a big gap.” For AI to be acceptable, you need enormous amounts of data to train your behavior models. You can go about collecting as much as you can, but you are still not going to be able to test for all edge cases, nor would it be feasible to do so safely. So you have to turn to simulation data in this case.
Renesas ‘Opens’ Autonomy for Cars
Renesas Electronics emerged from a prolonged silence with a bang Wednesday morning (Japan time), heralding the launch of Renesas Autonomy, a newly designed advanced driver assistance systems (ADAS) and automated driving platform. VSI explained, “It is a collection of all the processing nodes that make up the ‘AV (automated vehicle) Stack.’ In the case of Renesas, they already had some good assets going into this with their vision IP, their safety controllers, and their diverse collection of eco-system partners.”
Daimler, Bosch have chosen Nvidia as their partner
The Daimler–Bosch deal announced earlier this week, however, has brought some clarity. Daimler and Bosch said Tuesday (April 4) that they’re partnering to accelerate the production of "robo-taxis.” VSI states, "The building blocks for automation are out there, as the push from the tech community has shown. The real challenge is the integration into a total automotive-ready platform. There are so many domains within the AV Stack (perception, behavior, control and safety) and nobody owns all those pieces. This is why we see a flurry of activity related to centralized domain control for highly automated driving."
Mentor in Robo-Car Race with Mobileye, Nvidia
Mentor is rolling out an automated driving platform called DRS360, designed to “directly transmit unfiltered information from all system sensors to a central processing unit, where raw sensor data is fused in real time at all levels,” the company said. Phil Magney, founder & principal advisor at Vision Systems Intelligence, told us, “Sensor fusion is a complex task and doing it with RAW data makes it even harder.” Raw data fusion, Magney explained, enables consolidation of computing resources into a centralized system more efficiently, despite greater challenges in algorithm integration.
Uber Rollover in Arizona Points to More Testing
Uber Technologies grounded all of its driverless cars deployed in pilot programs in Tempe, Arizona, Pittsburgh and San Francisco, after a crash Friday in Arizona. “As a general rule, autonomous vehicle systems have to get exposed to as many scenarios as possible. Some of this is only practical in simulation as this situation points out.”
Intel Rocks World with $15B Mobileye Buy
As Mobileye’s Dagan explained, autonomous driving needs two complementary technology solutions. One is something like the black box Mobileye has developed for computer vision. As Mobileye added more logic layers, the solution itself became a less configurable and less open computing platform. It’s dense and more power-efficient.
9 Startups with Self-Driven Future (Maybe)
Until 10 days ago, when Ford announced a $1 billion investment in Argo AI (Pittsburgh), the company was just another anonymous tech startup. And it was really new: Argo AI was founded only last November by two AI robotics experts (Bryan Salesky, originally from Google, and Peter Rander, from Uber). The company cleverly avoided notice entirely, until Ford came calling.
Are we getting HMI issues right?
The U.S. National Highway Traffic Safety Administration (NHTSA) has found no defects in Tesla’s Automatic Emergency Braking (AEB) and Autopilot systems. But was that ODI finding, that “braking for crossing path collisions… are outside the expected performance capabilities of the system,” well understood by Tesla drivers?
What Driving Policy means for autonomous cars
At CES 2017, Mobileye’s co-founder, CTO and chairman Amnon Shashua discussed the “three pillars of autonomous driving”—namely, sensing, mapping and driving policy—and how the company is addressing all three. Shashua defined “Driving Policy” as based on “deep network-enabled reinforcement learning algorithms.”
Predictive vs reactive: Robo-car trends at CES 2017
How best to apply AI for safety in automated driving is a major conundrum for researchers and design engineers the world over. Gil Pratt, CEO at Toyota Research Institute, said his team is working on two tracks of research called Guardian—which basically assists the driver in situations that require quick response—and Chauffeur, which is closer to autonomous driving.
'Open Source' Robo-Car in '17?
The year 2016 opened the door to a new phase of highly automated driving, moving the discussion away from “wouldn’t it be nice-to-have-a-robo-car” to a more immediate “to-do list” with which regulators, car OEMs and technology companies must grapple if they hope to make self-driving cars commercially viable and safe.
New Software Aims to Speed Development of Autonomous Cars
A global automotive supplier thinks carmakers are wasting too much time developing software that runs autonomous technology. In working with clients, engineers at Elektrobit, a subsidiary of Continental, say automakers have become adept at creating stand-alone features like adaptive cruise control and active lane-keeping assist.