
Showdown

Text
Ann Harder and Sabrina Kolb

Illustration
Carola Plappert

Man vs. Machine, Eye vs. Sensor

The second comparison in the Man vs. Machine series sees the Audi A8 step up to the plate. Are the driver assistance systems it uses to identify its surroundings just as good as the human eye? Or even better?

“Assistance systems serve to increase comfort in a car,” explains Dr. Stefan Wender, who is responsible for the architecture of driver assistance systems at AUDI AG. “The systems must work meticulously together if they are to identify dangers reliably. Depending on the application, cameras, radar and ultrasound systems can be used to classify the surroundings in their own way and to issue alarm signals.” Audi’s driver assistance systems are pure hi-tech. Are they already equal to human beings or perhaps even superior to them? An attempt, in nine parts, to come up with an answer.

1


360-degree view – four dedicated cameras make parking easier.

Over-View 
Audi A8 – Human [1:0]

Up to five cameras are mounted in the Audi A8 when the customer orders perimeter cameras. The driver assistance camera is mounted on the rear-view mirror, while the parking cameras are beneath the exterior mirrors, close to the Audi rings at the front and on the trunk. This affords the Audi A8 the perfect view in all situations.

Based on the parking cameras, a control unit generates a virtual plan view of the car and its surroundings, which is shown in the MMI display. When maneuvering into a parking space, the driver has an overview of the entire situation. In addition, four ultrasound sensors in the front and rear bumpers measure the distance to obstacles in front of and behind the vehicle; their signals warn the driver with increasing urgency as the sedan moves closer to one of them.
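
As a rough illustration of how such graduated warnings can work, the short Python sketch below maps a measured ultrasound distance to a warning level. The thresholds and the function name are assumptions chosen for illustration, not Audi’s actual calibration.

    # Illustrative only: map an ultrasound distance reading to a warning level.
    # The threshold values are assumed, not Audi's real calibration.
    def parking_warning(distance_m: float) -> str:
        if distance_m > 1.5:
            return "silent"           # obstacle still far away
        elif distance_m > 0.8:
            return "slow beeping"     # first, gentle warning
        elif distance_m > 0.3:
            return "fast beeping"     # urgency increases with proximity
        else:
            return "continuous tone"  # stop now

    print(parking_warning(1.2))       # -> "slow beeping"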

When it comes to overview, the Audi A8 is therefore ahead of the human eye in a few areas. The bird’s eye perspective, the simultaneous overview in all directions and the precise estimation of distances are things that a human being cannot achieve to the same degree on his own.

2

Hind-Sight
Audi A8 – Human [2:0]


View to the front – the Audi A8 with its driver assistance camera.

The Audi A8 is also well ahead when it comes to hindsight. In order to look rearward, Audi side assist uses two rear radar sensors in the bumper. At speeds upward of 30 km/h, they keep an eye on an area up to 70 meters behind the sedan. As well as the blind spot, this also covers the so-called approach zone, which is considerably larger.

By scanning the approach zone, Audi side assist can make the driver aware of potential dangers. The system measures the distance and speed of other vehicles off the Audi's rear corner. In the first stage – the information stage – the LEDs in the exterior mirror illuminate. Their brightness, however, is muted and only noticeable when viewed directly. If the system identifies an intention on the part of the driver to change lane when the distance is too short, the LEDs flash brightly as the second warning stage.
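
Reduced to its essentials, this two-stage behavior can be pictured as a simple decision rule. The sketch below is only an illustration; the 70-meter zone comes from the description above, while the closing-speed check and the parameter names are assumptions.

    def side_assist_stage(gap_m: float, closing_speed_ms: float, indicator_on: bool) -> int:
        """Illustrative two-stage logic: 0 = nothing, 1 = dim LED, 2 = bright flashing."""
        relevant = gap_m < 70.0 and closing_speed_ms > 0.0   # vehicle in the approach zone and closing in
        if not relevant:
            return 0
        if indicator_on:
            return 2      # warning stage: the driver signals an intention to change lane
        return 1          # information stage: subtly lit LED in the exterior mirror

    print(side_assist_stage(25.0, 5.0, indicator_on=True))   # -> 2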

The rear-view benefits of the Audi A8 are also evident when maneuvering into a parking spot. The reversing camera captures the area behind the vehicle and shows it in the MMI display. The associated control unit automatically calculates the path that the Audi A8 will take with the given steering angle and presents this likewise in the display.
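
How a path can be derived from the steering angle is easiest to see with a generic kinematic bicycle model. The sketch below is not Audi’s algorithm; the wheelbase of roughly three meters is an assumed value used only to show the principle.

    import math

    def predicted_turn_radius(steer_deg: float, wheelbase_m: float = 3.0) -> float:
        """Kinematic bicycle model: radius of the arc the car follows at a given front-wheel angle."""
        if steer_deg == 0:
            return math.inf                                   # driving straight ahead
        return wheelbase_m / math.tan(math.radians(abs(steer_deg)))

    print(round(predicted_turn_radius(20.0), 1))              # ~8.2 m turning radius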

Human beings would have a tough time beating the Audi A8 rear-view systems. If a person is looking forward, he/she can see a little more than 90 degrees to either side, giving a total field of vision of around 180 to 200 degrees. Although he/she can and should also be checking the area behind the car with regular glances over the shoulder, potential danger situations in this area are far more difficult for a person to spot than for the Audi A8 with its direct rearward view.

3

Far-sighted – human beings can see up to five kilometers in front of them, taking in an angle of more than 180 degrees. The upward and downward angles are 60 and 70 degrees respectively.

Far Sight
Audi A8 – Human [3:1]

With long-distance vision of up to five kilometers, human beings are far better than cars. The special parking cameras in the car can see for just a few meters, albeit across a wide horizontal field of vision of 180 degrees. The driver-assistance camera pointing forward has a horizontal field of vision of just 46 degrees.

However, the comparison between car and human in this chapter ends in a draw, as the front radar of the adaptive cruise control gives the Audi A8 a major advantage. Thanks to its precise measurements, the sedan can adapt its speed to maintain a constant distance to the vehicle in front.
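
In principle, keeping a constant distance boils down to a control loop: measure the gap, compare it with the desired gap and adjust the speed accordingly. The following sketch is a generic proportional controller with assumed gain and units, not Audi’s actual control logic.

    def acc_speed_command(own_speed_ms: float, gap_m: float, desired_gap_m: float, gain: float = 0.5) -> float:
        """Generic proportional gap controller (all values assumed for illustration)."""
        error = gap_m - desired_gap_m            # positive: gap too large, car may speed up
        return max(0.0, own_speed_ms + gain * error)

    # Example: driving at 25 m/s with a 40 m gap when 50 m is desired -> slow to 20 m/s
    print(acc_speed_command(25.0, 40.0, 50.0))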

The human being is inferior when it comes to precision. In order to estimate the speed of an oncoming vehicle or one driving in front, the brain compares the size of the image of the car on the retina at intervals of around 30 milliseconds. The changes in such a short time are very small, making it difficult to estimate speed. Human perception is therefore based on estimated values and not on the precise measurement conducted by the Audi A8.
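
Just how small these changes are can be checked with a quick calculation. The car width, distance and closing speed below are assumed example values.

    import math

    def angular_size_deg(width_m: float, distance_m: float) -> float:
        """Apparent angular size of an object of the given width at the given distance."""
        return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

    # Assumed example: a 1.8 m wide car, 100 m away, closing at 20 m/s.
    before = angular_size_deg(1.8, 100.0)
    after = angular_size_deg(1.8, 100.0 - 20.0 * 0.03)   # 30 milliseconds later
    print(round(after - before, 4))                      # ~0.006 degrees - barely perceptible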

4–9

Audi A8 – Human [3:2]

When it comes to clarity of vision, the human being cuts the better figure. In the fovea area of the retina, a person can see up to six times more clearly than the driver assistance camera in the Audi A8. In an ideal scenario – i.e. with 20/20 vision and rested eyes – human vision in this central area is equal to a resolution of around 8 megapixels.

However, resolution is not the same all over the eye. It falls off toward the edge of the retina, which is why people can only see things out of focus at the outer edges of their field of vision. This is where the brain comes into play – it completes the fuzzy images from its memory and experiences. These are based, for instance, on what we noticed earlier when looking around.

The outer edges are out of focus in camera systems, too. But here, the pixels are distributed more evenly, so the camera can see clearly in a larger section of its field of vision than a human being can.

Audi A8 – Human [3:3]

The human being scores in this test, too. One eye alone can see only in 2D. Two eyes working together, however, make that 3D. Thanks to the distance between them, each of our two eyes sees things from a slightly different angle. The slight difference in the perspective of the two images enables the human brain to determine distances and the spatial positioning of objects. The closer an object is, the easier this becomes.

The greater the distance, the more this capability diminishes. From distances upward of one kilometer, the brain can no longer determine differences in distance from the images conveyed by the two eyes. Instead, it locates such objects using an individual’s experience of scale and proportion.
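
The one-kilometer limit can be made plausible with a small calculation. Assuming an eye separation of around 6.5 centimeters, the angular difference between the two images shrinks rapidly with distance:

    import math

    EYE_BASELINE_M = 0.065    # assumed interpupillary distance of about 6.5 cm

    def disparity_arcsec(distance_m: float) -> float:
        """Approximate binocular disparity angle for a point at the given distance."""
        return math.degrees(EYE_BASELINE_M / distance_m) * 3600

    for d in (10, 100, 1000):
        print(d, "m ->", round(disparity_arcsec(d), 1), "arcseconds")
    # At 1,000 m the angle is down to roughly 13 arcseconds, which matches the
    # roughly one-kilometer cutoff for stereo depth perception described above.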

The driver assistance camera in the Audi A8 cannot provide 3D vision. However, Audi is already working to develop a new generation of cameras that will also enable a vehicle to observe its surroundings in three dimensions. So, in future, the car will be able to match this point.

Audi A8 – Human [4:3]

When it comes to alertness, the luxury car is once again out in front. With a reaction capacity of considerably less than 100 milliseconds, the Audi A8 is well ahead of the human being, whose reaction capacity is between 300 and 500 milliseconds.

In the car, the driver assistance camera prepares information as quickly as the computing power permits. If we assume that the camera can process 36 images per second, it needs 1/36 of a second, i.e. around 27.8 milliseconds, to recognize something. Although, in the interests of reliability, the car usually evaluates several images, its reaction time is still considerably less than that of a human being.
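
The arithmetic can be verified in a few lines. The three-frame confirmation below is an assumed figure, used only to show that even a cautious, multi-frame evaluation stays well under human reaction times.

    FRAME_RATE_HZ = 36                     # frame rate assumed in the text
    frame_time_ms = 1000 / FRAME_RATE_HZ   # ~27.8 ms per image

    frames_for_confirmation = 3            # assumed: evaluate several images for reliability
    camera_ms = frames_for_confirmation * frame_time_ms

    print(round(frame_time_ms, 1))         # 27.8
    print(round(camera_ms, 1))             # 83.3 - still far below 300-500 ms for a human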

Recognition time varies among individuals and is largely dependent on attentiveness. Inactivity and distraction reduce reaction capacity, while the Audi A8 is constantly at the top of its game.

Nevertheless, with maximum concentration, the human being can beat the camera. However, this means that, when driving, he/she must concentrate on nothing but the road ahead and the surrounding traffic.

Audi A8 – Human [5:3]

The Audi is in the lead by night. In the dark, a human being can see only in black-and-white and with a considerably lower resolution than in bright daylight. With the help of low beam, he can identify obstacles on the near side of the road from a distance of up to 60 meters, and from up to around 40 meters on the off side. However, at speeds of more than 70 km/h, these distances are usually insufficient to come to a complete halt in time in a dangerous situation.
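
A rough stopping-distance estimate shows why. The reaction time and deceleration below are assumed, generic values; real figures vary with road surface, tires and driver.

    def stopping_distance_m(speed_kmh: float, reaction_s: float = 1.0, decel_ms2: float = 7.0) -> float:
        """Reaction distance plus braking distance, using assumed reaction time and deceleration."""
        v = speed_kmh / 3.6                        # convert to m/s
        return v * reaction_s + v ** 2 / (2 * decel_ms2)

    print(round(stopping_distance_m(70), 1))       # ~46.5 m - already more than the ~40 m
                                                   # the low beam reveals on the off side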

The Audi has the advantage in the dark due to its infrared camera, the core element of the night view assist system. The system generates a thermal image of the situation in front of the car. The thermal image highlights warm objects, while cold objects appear blue. People and animals, which the eye would normally perceive as dark outlines, can be seen brightly lit in the dashboard display thanks to their body heat. Furthermore, the infrared camera is also able to show the course of the road and building outlines. It can see up to 300 meters ahead, beyond the range of the full beam.

Drivers have two handicaps by night. One is so-called afterimages that form in the eye as soon as another road user approaches from the other direction with full beam activated. The human being has difficulty seeing directly afterward. The second disadvantage is so-called dysphotopsia, which makes it difficult to see around a bright light source.

Audi A8 – Human [6:3]

Both people and machines drive in an anticipatory manner. The camera, however, is more precise. In contrast to people, who see distances only as rough approximations, the two radar sensors in the adaptive cruise control calculate distance down to the last meter. The system can see up to 250 meters ahead, meaning it can estimate far more precisely than a human being when braking is necessary.

The Audi A8 also helps the human being to stay in the correct lane. If the driver is distracted and deviates from the lane, Audi active lane assist kicks in and pulls the sedan back into the correct lane with a computer-controlled steering maneuver. To do this, the camera monitors the road for 50 meters ahead and through an angle of around 40 degrees. If the indicator is activated or the steering maneuver is so definite that the lane change is clearly intentional, the system takes no action. In unclear situations such as a construction zone on a multi-lane highway, the assistant switches automatically to passive mode.

In order to activate the system, the speed of the car must be at least 65 km/h. On this point, the human being comes out ahead, as he/she is ready to act at any speed.
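
Taken together, the conditions above amount to a simple decision rule. The sketch below merely restates them in code form; the parameter names are chosen for illustration.

    def lane_assist_intervenes(speed_kmh: float, drifting: bool, indicator_on: bool,
                               deliberate_steering: bool, construction_zone: bool) -> bool:
        """Illustrative decision logic assembled from the conditions described above."""
        if speed_kmh < 65:                        # system only active from 65 km/h upward
            return False
        if construction_zone:                     # unclear markings: assistant goes passive
            return False
        if indicator_on or deliberate_steering:   # lane change is clearly intentional
            return False
        return drifting                           # otherwise correct an unintended drift

    print(lane_assist_intervenes(100, drifting=True, indicator_on=False,
                                 deliberate_steering=False, construction_zone=False))   # -> True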

Audi A8 – Human [6:4]

A human being’s perception of his surroundings is far more colorful than that of the in-car camera. In human eyes, three types of so-called cones enable the recognition of at least 270,000 colors. The eye identifies around 300 spectral colors and can differentiate between around 30 gradients of light and 30 of shade, i.e. paler and darker tones of a particular color.

The in-car camera, on the other hand, can only differentiate variations of intensity in the colors red and white, concentrating on the colors most critical in traffic situations. For instance, it can differentiate red rear lights and brake lights from white front headlamps and recognize the red outlines of traffic signs. This technology guarantees good night vision.

In the R18 e-tron quattro Le Mans race car, which has no rear window, the rear view is provided by a camera/monitor system. The digital rear-view mirror, made up of organic light-emitting diodes (AMOLEDs), shows what is going on behind the car in a brilliant and detailed image. The data are prepared to ensure that the image remains colorful and bright even in low light conditions, and that the headlamps of other cars don’t dazzle the driver in the dark.

Final Score
Audi A8 – Human [6:4]

In this comparison, both man and machine demonstrate a compelling array of individual qualities. The driver assistance systems in the Audi A8 take first place primarily when it comes to reliability. Monitoring traffic consistently and without distraction, estimating distances with precision and maintaining a close eye on the area behind the car – these systems are perfect for such tasks.

Human beings, however, have a more colorful view of the world. They see it in three dimensions and have no problem classifying even long distances. Above all, human beings are still well ahead of the car when it comes to understanding a situation. They understand interrelationships, can draw definite conclusions from the information presented to them and adapt themselves flexibly to the respective situation.

In-car assistance systems can send signals triggered by a specific stimulus. However, this process is based only on algorithms – a car is not intelligent in the true sense of the word. Assistance systems may be able to identify and react to predefined dangers, but they are currently unable to interpret unknown, more complex situations. In case of doubt, they must adopt a conservative approach.

Audi is working hard to make its assistance systems more powerful. The models of the future will be able to park themselves in garages and car parks without the driver having to be in the car. They will also be able to take the pressure off the driver in slow-moving traffic by steering, braking and accelerating at speeds of up to 60 km/h. They will depend fully on their sensors – thinking, on the other hand, remains the preserve of the human being.