
Driverless Car Creeping Closer to Your Driveway, with New Work Shown at Tokyo Motor Show



Pedestrian and object detection. Automated braking. Self-parking. Today’s vehicles feature sophisticated driver assistance systems straight out of science fiction. But we’re still a few years away from cars that drive themselves. Aren’t we?

Well, the future may be closer than you think.

Consider what’s being shown this week at the Tokyo Motor Show by Japan’s ZMP, which powers its work with NVIDIA graphics processors.

ZMP is a start-up that develops and sells R&D platforms for autonomous driving. By fusing insights from robotics with automotive technology, the company aims to enhance safety, preserve the environment and provide power-saving solutions for the next generation of mobility.

1/10 scale RoboCar 3

Starting out with robotic technology and moving on to 1/10-scale remote-controlled cars, ZMP developed and launched a driverless car it calls “RoboCar,” a plug-in hybrid that can be driven autonomously by computers.

ZMP’s research engineer, Dr. Daniel Watman, started building the brains for RoboCar with field-programmable gate arrays – chips that can be configured by customers after they’re manufactured – but soon realized GPUs are faster and easier to develop on, so he switched to NVIDIA’s GPUs running CUDA, our parallel programming architecture.
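
To give a sense of what that kind of GPU programming looks like, here is a minimal, hypothetical sketch of per-pixel, data-parallel work of the sort that maps naturally onto CUDA. It uses Numba’s Python bindings to CUDA purely as an illustration; it is not ZMP’s code, and the image size and threshold value are made up.

```python
# Illustrative sketch (not ZMP's code): a per-pixel threshold written as a
# CUDA kernel via Numba, showing how each GPU thread handles one pixel.
import numpy as np
from numba import cuda

@cuda.jit
def threshold_kernel(gray, out, level):
    # Each thread computes its own (row, col) position in the image.
    x, y = cuda.grid(2)
    if x < gray.shape[0] and y < gray.shape[1]:
        out[x, y] = 255 if gray[x, y] > level else 0

# Made-up 640x480 grayscale frame standing in for camera data.
gray = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
d_gray = cuda.to_device(gray)
d_out = cuda.device_array_like(d_gray)

threads = (16, 16)
blocks = ((gray.shape[0] + 15) // 16, (gray.shape[1] + 15) // 16)
threshold_kernel[blocks, threads](d_gray, d_out, 128)
binary = d_out.copy_to_host()
```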

The RoboCar uses cameras, lasers and radar to sense what is happening around the vehicle. The camera data is processed using the complex algorithm known as HOG – Histogram of Oriented Gradients – to detect pedestrians crossing the street. Without the powerful Kepler GPU, the video couldn’t be processed in real time, and the RoboCar would not be able to stop in time.
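
As a rough illustration of the technique, the sketch below runs OpenCV’s stock HOG people detector over camera frames. It is a CPU-side example of the same algorithm, not ZMP’s GPU pipeline, and the camera index and detector parameters are assumptions.

```python
# Hypothetical sketch: HOG-based pedestrian detection with OpenCV's
# built-in people detector (HOG features + linear SVM).
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # assumed camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # detectMultiScale scans the frame at several scales and returns
    # bounding boxes where the classifier fires.
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("pedestrians", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```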

Collaborating with Virginia Tech and the University of Technology Sydney, ZMP also uses SLAM (Simultaneous Localization and Mapping) technology – first developed for robots – to guide the car without GPS or road signs.
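
SLAM is a deep topic in its own right, but its core loop is easy to outline: predict the vehicle’s pose from odometry, then use re-observed landmarks to correct both the pose estimate and the map. The sketch below is a deliberately crude, hypothetical illustration of that loop, nothing like ZMP’s implementation or a production-grade filter.

```python
# Heavily simplified, hypothetical SLAM-style loop: dead-reckon the pose from
# odometry, build a landmark map, and nudge both toward agreement when a
# landmark is seen again. Real systems use EKF/graph/particle formulations.
import math

pose = [0.0, 0.0, 0.0]   # x, y, heading of the vehicle
landmarks = {}           # landmark id -> estimated (x, y) in the map frame

def predict(dist, dtheta):
    """Dead-reckon the pose from wheel odometry (distance, heading change)."""
    pose[2] += dtheta
    pose[0] += dist * math.cos(pose[2])
    pose[1] += dist * math.sin(pose[2])

def observe(lm_id, rng, bearing, gain=0.3):
    """Fold a range/bearing landmark measurement into the pose and the map."""
    # Where this measurement says the landmark sits, given the current pose.
    gx = pose[0] + rng * math.cos(pose[2] + bearing)
    gy = pose[1] + rng * math.sin(pose[2] + bearing)
    if lm_id not in landmarks:
        landmarks[lm_id] = (gx, gy)   # first sighting: add to the map
        return
    mx, my = landmarks[lm_id]
    ex, ey = mx - gx, my - gy         # disagreement between map and measurement
    # Crude stand-in for a filter update: move the pose toward the map and
    # the map toward the measurement, splitting the correction.
    pose[0] += gain * ex
    pose[1] += gain * ey
    landmarks[lm_id] = (mx - gain * ex, my - gain * ey)

predict(1.0, 0.05)                 # drive forward, turn slightly
observe(7, rng=4.2, bearing=-0.1)  # made-up sighting of landmark 7
```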

Vehicle data is also collected via the car’s CAN (controller area network) bus and monitored through the cloud on a laptop PC or tablet – aiding analysis of the car’s behavior.
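
A minimal sketch of that kind of telemetry collection might look like the following, assuming the python-can library over Linux SocketCAN; the channel name and upload endpoint are placeholders, not ZMP’s actual setup.

```python
# Hypothetical sketch: read frames from the CAN bus and forward a batch to a
# remote endpoint for monitoring. Channel name and URL are placeholders.
import can
import requests

bus = can.interface.Bus(channel="can0", bustype="socketcan")

batch = []
for _ in range(100):             # collect a small batch of frames
    msg = bus.recv(timeout=1.0)  # returns None if nothing arrives in time
    if msg is None:
        continue
    batch.append({
        "timestamp": msg.timestamp,
        "id": msg.arbitration_id,
        "data": msg.data.hex(),
    })

# Ship the batch to a (hypothetical) cloud endpoint, where it could be viewed
# later from a laptop or tablet dashboard.
requests.post("https://example.com/robocar/telemetry", json=batch, timeout=5)
```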

No driver, only passengers.

There’s more coming. Watman is using the latest GPU feature, “dynamic parallelism,” to enhance his system. Look for his demonstration at the Tokyo Motor Show. Just don’t expect to spend any time behind the wheel.

The Tokyo Motor show runs from Nov. 22 to Dec. 1 at Tokyo Big Sight in Koto-ku, Tokyo.