FREEDOM AND SAFETY

 

Autonomous vehicles can follow the general rules of American roads, recognizing traffic signals and lane markings, noticing crosswalks and other regular features of the streets. But they work only on well-marked roads that are carefully scanned and mapped in advance.

 

Many paved roads, though, have faded paint, signs obscured behind trees, and unusual intersections. In addition, 1.4 million miles of U.S. roads - one-third of the country’s public roadways - are unpaved, with no on-road signals like lane markings or stop-here lines. That doesn’t include the many miles of private roads, unpaved driveways and off-road trails.

 

What’s a rule-following autonomous car to do when the rules are unclear or nonexistent? And what are its passengers to do when they discover their vehicle can’t get them where they’re going?

 

Accounting for the Obscure

 

Most challenges in developing advanced technologies involve handling infrequent situations and events that require performance beyond a system’s normal capabilities. That’s definitely true for autonomous vehicles. Some on-road examples might be navigating construction zones, encountering a horse and buggy, or seeing graffiti that looks like a stop sign. Off-road, the possibilities include the full variety of the natural world, such as trees down over the road, flooding and large puddles - or even animals blocking the way.

 

At Mississippi State University’s Center for Advanced Vehicular Systems, we have taken up the challenge of training algorithms to respond to circumstances that almost never happen, are difficult to predict and are complex to create. We seek to put autonomous cars in the hardest possible scenario: driving in an area the car has no prior knowledge of, with no reliable infrastructure like road paint and traffic signs, and in an unknown environment where it’s just as likely to see a cactus as a polar bear.

 

Our work combines virtual technology and the real world. We create advanced simulations of lifelike outdoor scenes, which we use to train artificial intelligence algorithms to take a camera feed and classify what it sees, labeling trees, sky, open paths and potential obstacles. Then we transfer those algorithms to a purpose-built all-wheel-drive test vehicle and send it out on our dedicated off-road test track, where we can see how our algorithms work and collect more data to feed into our simulations.
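To make the classification step concrete, here is a minimal sketch of how a segmentation network can be trained on simulated images in PyTorch. The class list, the SimulatedOffRoadDataset wrapper, the train function and the choice of model are illustrative placeholders rather than the exact components of our pipeline.

```python
# A minimal sketch of training a semantic segmentation network on simulated
# off-road frames. The label set, dataset wrapper, and model choice here are
# illustrative assumptions, not the project's actual pipeline.
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader
from torchvision.models.segmentation import deeplabv3_resnet50

CLASSES = ["sky", "tree", "trail", "obstacle"]  # assumed label set

class SimulatedOffRoadDataset(Dataset):
    """Hypothetical wrapper around rendered RGB frames and per-pixel labels."""
    def __init__(self, frames, masks):
        self.frames, self.masks = frames, masks
    def __len__(self):
        return len(self.frames)
    def __getitem__(self, i):
        return self.frames[i], self.masks[i]

def train(frames, masks, epochs=10, lr=1e-4):
    model = deeplabv3_resnet50(num_classes=len(CLASSES))
    loader = DataLoader(SimulatedOffRoadDataset(frames, masks),
                        batch_size=4, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:       # images: (B, 3, H, W) floats
            logits = model(images)["out"]   # labels: (B, H, W) class ids
            loss = criterion(logits, labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```

Once trained on rendered scenes, the same model can be handed real camera frames from the test vehicle to see how well the simulated training transfers.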

 

Starting Virtual

 

We have developed a simulator that can create a wide range of realistic outdoor scenes for vehicles to navigate through. The system generates a range of landscapes of different climates, like forests and deserts, and can show how plants, shrubs and trees grow over time. It can also simulate weather changes, sunlight and moonlight, and the accurate locations of 9,000 stars.

 

The system also simulates the readings of sensors commonly used in autonomous vehicles, such as lidar and cameras. Those virtual sensors collect data that feeds into neural networks as valuable training data.
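As a rough illustration of how virtual sensor readings can be produced, the sketch below casts lidar-style beams against a toy terrain function and adds range noise to generate a point cloud. The terrain_height function, beam layout and noise model are simplified placeholders, not our simulator’s actual physics.

```python
# A minimal sketch of generating lidar-style training data from a known
# virtual scene. The terrain, beam pattern, and noise model are illustrative.
import numpy as np

def terrain_height(x, y):
    """Hypothetical ground-truth terrain: gentle rolling hills."""
    return 0.5 * np.sin(0.1 * x) * np.cos(0.1 * y)

def simulate_lidar_scan(sensor_xyz, n_beams=64, max_range=50.0,
                        range_noise_std=0.02, rng=None):
    """Cast evenly spaced, slightly down-tilted beams and return hit points.

    Each beam is stepped forward until it drops below the terrain surface,
    then a small Gaussian range error is added to mimic sensor noise.
    """
    if rng is None:
        rng = np.random.default_rng()
    sx, sy, sz = sensor_xyz
    hits = []
    for azimuth in np.linspace(0, 2 * np.pi, n_beams, endpoint=False):
        dx, dy, dz = np.cos(azimuth), np.sin(azimuth), -0.05
        for r in np.arange(0.5, max_range, 0.1):
            x, y, z = sx + r * dx, sy + r * dy, sz + r * dz
            if z <= terrain_height(x, y):            # beam met the ground
                r_noisy = r + rng.normal(0.0, range_noise_std)
                hits.append((sx + r_noisy * dx,
                             sy + r_noisy * dy,
                             sz + r_noisy * dz))
                break
    return np.array(hits)

# One scan from a sensor mounted 1.5 meters above the origin.
points = simulate_lidar_scan((0.0, 0.0, 1.5))
```

Because the scene geometry is known exactly, every simulated point comes with a free ground-truth label, which is what makes virtual data so useful for training.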

 

Simulated desert, meadow and forest environments generated by the Mississippi State University Autonomous Vehicle Simulator. Chris Goodin, Mississippi State University, Author provided.


Building a Test Track

 

Simulations are only as good as their portrayals of the real world. Mississippi State University has purchased 50 acres of land on which we are developing a test track for off-road autonomous vehicles. The property is excellent for off-road testing, with unusually steep grades for our area of Mississippi - up to 60 percent inclines - and a very diverse population of plants.

 

We have selected certain natural features of this land that we expect will be particularly challenging for self-driving vehicles, and replicated them exactly in our simulator. That allows us to directly compare results from the simulation and real-life attempts to navigate the actual land. Eventually, we’ll create similar real and virtual pairings of other types of landscapes to improve our vehicle’s capabilities.

 

A road washout, as seen in real life, left, and in simulation. Chris Goodin, Mississippi State University, Author provided.


Collecting More Data

 

We have also built a test vehicle, called the Halo Project, which has an electric motor along with the sensors and computers it needs to navigate various off-road environments. The Halo Project car has additional sensors to collect detailed data about its actual surroundings, which can help us build virtual environments to run new tests in.

 

The Halo Project car can collect data about driving and navigating in rugged terrain. Beth Newman Wynn, Mississippi State University, Author provided.


Two of its lidar sensors, for example, are mounted at intersecting angles on the front of the car so their beams sweep across the approaching ground. Together, they can provide information on how rough or smooth the surface is, as well as capture readings from grass and other plants and items on the ground.
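The sketch below shows one simple way such ground returns can be turned into a roughness estimate - fitting a plane to the points and measuring the residual spread. The surface_roughness function and the synthetic example are illustrative assumptions, not necessarily the exact processing used on the Halo Project car.

```python
# A minimal sketch of estimating ground roughness from a strip of lidar hits:
# fit a plane by least squares, then use the residual RMS as a roughness score.
import numpy as np

def surface_roughness(points):
    """points: (N, 3) array of lidar returns from the ground ahead.

    Fits z = a*x + b*y + c and returns the RMS of the out-of-plane
    residuals - small for smooth ground, larger for rutted or vegetated ground.
    """
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residuals = points[:, 2] - A @ coeffs
    return float(np.sqrt(np.mean(residuals ** 2)))

# Synthetic example: mildly sloped ground with bumps from grass and rocks.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 5, size=(500, 2))
z = 0.02 * xy[:, 0] + rng.normal(0.0, 0.05, size=500)
print(surface_roughness(np.column_stack([xy, z])))  # roughly 0.05
```

A score like this can be computed continuously as the beams sweep the approaching terrain, giving the vehicle an early warning of rough ground.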

 

Lidar beams intersect, scanning the ground in front of the vehicle. Chris Goodin, Mississippi State University, Author provided.


We’ve seen some exciting early results from our research. For example, our preliminary results show that machine learning algorithms trained in simulated environments can be useful in the real world. As with most autonomous vehicle research, there is still a long way to go, but our hope is that the technologies we’re developing for extreme cases will also help make autonomous vehicles more functional on today’s roads.

 

Matthew is an automotive R&D manager with expertise in autonomous and electrified vehicle systems at the Center for Advanced Vehicular Systems within the Bagley College of Engineering. As CAVS Associate Director, he participates in strategic and investment planning, corporate outreach, and marketing. 
 
Chris Goodin received his Ph.D. in physics from Vanderbilt University in 2008. From 2008 to 2017, he worked with the U.S. Army Engineer Research and Development Center (ERDC) in Vicksburg, MS, focusing on physics-based simulations of ground vehicles, sensors, and robotics. Dr. Goodin is currently an Assistant Research Professor with the Center for Advanced Vehicular Systems, Mississippi State University.
 
Daniel Carruth is Associate Director for the Human Factors and the Advanced Vehicle Systems groups at the Center for Advanced Vehicular Systems at Mississippi State University. He received his Ph.D. from the Department of Psychology at Mississippi State University. His research interests include modeling and simulation of human interaction with autonomous vehicles as well as the study of human task performance in law enforcement, military, and industrial work.