New and improved sensors are overcoming the challenges long posed by inclement weather and limited lighting.
HANK HOGAN, CONTRIBUTING EDITOR
On the roads at any given moment, there are traffic accidents, stopped vehicles, and drivers going the wrong way on one-way streets. All of these incidents adversely affect traffic flow and lead to congestion, delays, and sometimes even secondary accidents. These realities have given rise to the intelligent transportation system (ITS), which involves a host of communication, control, sensing, and imaging technologies working in tandem to improve efficiency and safety.
Vision systems have been central to ITS, and recent advancements have ushered in a new set of capabilities. Traffic management, however, presents several key challenges. Lighting varies, making reliable vehicle identification tough. Weather can reduce visibility. And upgrading to the latest vision systems often isn’t possible for the cash-strapped transportation departments that are tasked with making highways safer and more efficient; sometimes they must make do with upgrading their existing infrastructure.
Industry is responding to such needs with advanced vision technology, combining it with radar, persistent surveillance, sophisticated software, and other techniques. On the horizon is deep learning, an artificial intelligence technology that should make any vision solution more powerful.
A testbed for vision traffic systems. The cameras under test are mounted on a pole above State Highway 6 in Texas. Courtesy of Dan Middleton/Texas A&M Transportation Institute.
Still, the basic tasks confronting vision systems are daunting. During the day, systems must identify a car in a lane without being confused by the shadow of an adjacent vehicle. At night, they operate by detecting headlights, necessitating a different algorithm.
In all situations, though, one parameter is essential.
“Contrast is important. Most of these systems work simply on the change in pixel intensity,” said Dan Middleton, a research engineer with the Texas A&M Transportation Institute of College Station, Texas. Middleton’s focus is on vehicle detection systems, which can be useful for counting cars, timing traffic signals, identifying incidents, improving highway safety and efficiency, and other aspects of traffic management.
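The pixel-intensity approach Middleton describes can be sketched as simple frame differencing. This is a minimal illustration of the principle, not any vendor's actual algorithm; the threshold and zone-size values are arbitrary assumptions.

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, threshold=30, min_pixels=50):
    """Flag a detection zone as occupied when enough pixels change
    intensity between consecutive grayscale frames."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    changed = np.count_nonzero(diff > threshold)
    return changed >= min_pixels

# A dark, empty roadway frame, then the same frame with a bright
# vehicle-sized patch added.
empty = np.full((100, 100), 40, dtype=np.uint8)
occupied = empty.copy()
occupied[40:60, 30:70] = 200  # 20 x 40 = 800 changed pixels

print(detect_motion(empty, empty))     # False -- no intensity change
print(detect_motion(empty, occupied))  # True  -- vehicle-sized change
```

This also illustrates why night operation needs a different algorithm: at night the contrast comes from small, bright headlight blobs rather than a large car-sized region, so the thresholds and region logic must change.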
Thermal imaging performed by a FLIR TrafiData camera, enabling traffic managers to create zones according to traffic lanes and to differentiate vehicles from motorcycles and pedestrians. Courtesy of FLIR.
In tackling these tasks during full daylight, vision systems — which are often mounted on poles 20 or so feet above the pavement — must account for glare and shadows. These factors vary significantly by time of day and season. Vision systems also must provide information at night, when lighting is minimal.
At intersections, the key nugget of information is often about vehicle movement occurring 700 to 1000 ft away from a traffic light. This information determines what color the light should be and when it should change.
This, in and of itself, is a challenge; looking ahead hundreds of feet alters the appearance of cars, which makes detecting them with certainty problematic. What’s more, at dawn or dusk, the bright light of day is gone, and cars may or may not have headlights turned on. One way to overcome such issues is to pair vision with radar, with the latter handling distance detection, and the former taking over for the last 100 or 150 ft. Such sensor fusion also mitigates the problem of weather, since radar is largely immune to weather-induced visibility changes.
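The radar-to-vision handoff described above can be sketched as a simple range-based rule. The 125-ft handoff distance below is an illustrative value between the 100- and 150-ft figures cited, not a number from any deployed system.

```python
def fuse_detection(range_ft, radar_detect, vision_detect, handoff_ft=125):
    """Trust radar at long range, where vision struggles with distance,
    dawn/dusk lighting, and weather; hand off to vision for the final
    approach to the intersection."""
    if range_ft > handoff_ft:
        return radar_detect
    return vision_detect

# Distant vehicle in fog: radar sees it, vision does not.
print(fuse_detection(800, radar_detect=True, vision_detect=False))  # True
# Vehicle in the final approach: vision confirms.
print(fuse_detection(90, radar_detect=False, vision_detect=True))   # True
```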
Another approach to overcoming weather and also darkness is to use sensors outside the visible spectrum. FLIR Systems Inc. of Wilsonville, Ore., has leveraged its thermal imaging technology to make IR cameras for traffic management applications.
A screenshot from a demo application built around SwRI’s ActiveVision showing live data from Houston at around 2 p.m. on Sept. 18, when the city was dealing with Tropical Storm Imelda. The red triangles are cameras where inclement weather was detected. The radar overlay shows that detections match live weather reports, demonstrating that existing video/still camera infrastructure can aid traffic management by automatically spotting weather conditions and potentially other problems. Courtesy of SwRI.
“Thermal is largely indifferent to lighting conditions, and I do not believe there is a better sensor for detecting pedestrians and cyclists,” said Daniel Benhammou, senior director of ITS software and solutions at FLIR.
Measuring thermal transmission
Benhammou added that the company’s sensors measure various thermal transmission factors, including surface emissivity and temperature. This enables the technology to distinguish among vehicles, people, and the ground, leading to what he characterized as excellent definition, even on hot desert-like days.
TrafiData, the latest FLIR traffic management product, offers a higher-resolution sensor than previous versions, which improves capabilities and performance, according to Benhammou. The product also runs AI algorithms to further increase accuracy and the ability to differentiate among objects. Cloud- and web-based tools make it possible to ingest this information along with that from inductive loop, magnetometer, and visual sensors. When combined, these inputs can be used for enhanced traffic management.
Other organizations are using technology advancements to make the most of the existing infrastructure. For example, such advancements motivated the design of ActiveVision from the San Antonio-based Southwest Research Institute (SwRI), according to Dan Rossiter, a research analyst there. The company works with state transportation agencies across the U.S.
Intelligent transportation systems include sensing and imaging technologies that monitor vehicle congestion.
Transportation departments want to warn drivers to slow down in the event of a dust storm, fog, rain, or snow, and at other times when visibility is reduced. If weather conditions and visibility are bad enough, agencies will close highways. Some use dedicated sensors to detect various forms of inclement weather. However, many transportation agencies use only still or video cameras.
“If they’ve already got these cameras they’re maintaining and deploying, and we can add extra value by applying software on top of those video feeds to extract actionable data, then maybe they don’t need to buy those dedicated sensors,” Rossiter said. “Or maybe they need to buy [fewer] of them. And they can still get that coverage they need.”
SwRI’s software works with static or video cameras, applying algorithms developed by deep learning on training sets. In the first phase of the rollout of the software, it targets visibility, reporting back on changes along with a classification as to the cause. This information typically goes to a person in the loop, who when alerted looks at the camera feed and either agrees or disagrees with the software. The person may then activate electronic signage to warn drivers, or decide to take other appropriate measures.
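The human-in-the-loop flow described above — software proposes an alert, an operator confirms or rejects it before any signage is activated — can be sketched as follows. The class, field names, and message text are hypothetical; they are not SwRI's API.

```python
from dataclasses import dataclass

@dataclass
class VisibilityAlert:
    """A software-generated report: which camera, and the classified cause."""
    camera_id: str
    cause: str          # e.g., "fog", "dust", "heavy rain"
    confirmed: bool = False

def review(alert, operator_agrees):
    """Human-in-the-loop step: the operator checks the camera feed and
    either confirms the alert (triggering signage) or dismisses it."""
    alert.confirmed = operator_agrees
    if alert.confirmed:
        return f"Activate signage near {alert.camera_id}: reduced visibility ({alert.cause})"
    return "No action"

alert = VisibilityAlert("CAM-042", "fog")
print(review(alert, operator_agrees=True))
# Activate signage near CAM-042: reduced visibility (fog)
```

The design point is that the classifier never acts autonomously; a false positive costs an operator a glance at the feed rather than an unwarranted highway warning.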
Weather changes occur relatively slowly, so a still camera may work well enough. A measured pace in deciding what the issue is and then acting upon it may also be acceptable.
A much quicker response is necessary in wrong-way driver situations. These incidents tend to happen at night, and often there is little time to dispatch police, alert other drivers, or shut down a highway. A balancing act occurs with any response, Rossiter said. Getting the information wrong or not responding correctly can result in accidents or fatalities. So, the next phase of SwRI’s product will include wrong-way driver detection.
Looking to the future, Rossiter said the capabilities of SwRI’s software will continue to expand. Plans are in place to add the ability to determine how many cars sit at an intersection. This information can then be used to adjust traffic light signals or advise drivers to take alternate routes.
Some high-end traffic management cameras include features such as built-in vehicle counters and high-resolution sensors. Rossiter said the latter can be an issue, though, in rural areas where bandwidth is limited. In this case, the image may be down-sampled so that only a fraction of the pixels is transmitted. Software can then analyze the data.
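The down-sampling Rossiter describes can be as simple as keeping every Nth pixel in each dimension — a sketch of the bandwidth-saving idea, not any agency's actual pipeline:

```python
import numpy as np

def downsample(frame, factor):
    """Keep every Nth pixel in each dimension, so only 1/factor^2
    of the data crosses a limited rural link."""
    return frame[::factor, ::factor]

frame = np.zeros((1080, 1920), dtype=np.uint8)  # full-HD grayscale frame
small = downsample(frame, 4)
print(small.shape)  # (270, 480) -- 1/16 of the original pixels
```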
The future of traffic management includes vision as well as other sensors and advanced processing, according to Wayne Sarasua, an associate professor of civil engineering at Clemson University in Clemson, S.C. Sarasua’s research focus is traffic management.
On the sensor side, high-definition radar is promising because of its ability to more accurately classify vehicles, Sarasua said. However, this technology is much more expensive than standard vision systems.
Deep learning and other AI techniques can overcome various sensor drawbacks. Traffic management, though, is a complex problem with many varying conditions. Consequently, training a system on one data set may fail to yield good results on a different data set, according to Sarasua.
He said he would ideally like to get real-time data and use this to tell drivers to take alternate routes — which may be necessary if traffic density spikes because of a special event, an accident, or rush hour. For optimum traffic management, it is critical to spot the moment when traffic demand becomes greater than capacity.
Sensors monitor roadways in real time, allowing analysts to extract data such as average traffic speed and congestion.
“If we exceed it by just a little bit, that demand [limit], then all of a sudden the bottom falls out,” Sarasua said.
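The moment Sarasua describes — demand edging past capacity before flow collapses — can be sketched as a simple threshold check over per-interval arrival counts. The counts and capacity below are invented for illustration.

```python
def first_oversaturation(demand_per_min, capacity_per_min):
    """Return the index of the first interval where arriving demand
    exceeds capacity -- the point at which 'the bottom falls out'."""
    for i, demand in enumerate(demand_per_min):
        if demand > capacity_per_min:
            return i
    return None

arrivals = [30, 32, 35, 41, 44, 38]  # vehicles per minute (made up)
print(first_oversaturation(arrivals, capacity_per_min=40))  # 3
```

In practice this is the trigger for the real-time interventions Sarasua mentions, such as advising drivers onto alternate routes before the oversaturated interval.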
Rising above the problem
According to Sarasua, one vision-based approach to getting such information is by putting high-resolution cameras in planes or drones to capture details across a large area. Ross McNutt, president of Dayton, Ohio-based Persistent Surveillance Systems (PSS), said that if the company’s current 192-MP 12-camera system images a region 5.8 miles on a side, then vehicles cover about 25 pixels. The resulting full-color imagery provides enough detail to study traffic flow, such as how many cars originate from an area and where they go.
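McNutt's figures can be sanity-checked with back-of-envelope arithmetic. The vehicle footprint below is an assumption of this sketch, not a number from the article, which is why the result lands near, rather than exactly at, the quoted 25 pixels.

```python
import math

TOTAL_PIXELS = 192e6          # 192-MP system
SIDE_MILES = 5.8              # imaged region, per McNutt
FEET_PER_MILE = 5280

pixels_per_side = math.sqrt(TOTAL_PIXELS)                        # ~13,856 px
feet_per_pixel = SIDE_MILES * FEET_PER_MILE / pixels_per_side    # ~2.2 ft

# Assume a roughly 15 ft x 6 ft car (an assumption, not from the article).
car_pixels = (15 / feet_per_pixel) * (6 / feet_per_pixel)
print(round(feet_per_pixel, 2), round(car_pixels))  # 2.21 18
```

At roughly 2.2 ft per pixel, a car spans a few pixels in each dimension — enough to track flow, as the article notes, but far too little to identify a vehicle, which bears on the privacy point that follows.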
Privacy safeguards have been put in place for these situations, according to McNutt. “We can provide appropriate anonymized flow data to help cities understand their traffic situation better,” he said.
This information from the airborne system is sent over a high-speed link to users on the ground. The imagery allows analysts to extract data such as average speed or how long cars sit at a traffic light. The company has performed studies of traffic over bridges in New York City. It has also helped adjust traffic flow for major racing events, cutting hours off the time spectators spend finding parking.
McNutt said future versions of the system may include up to twelve 70-MP cameras, for a total system resolution of 840 MP. And PSS will likely choose to fly the plane carrying the system with a lighter load and at a higher altitude, as this preserves privacy and lowers the cost of operation, he said.
According to Clemson’s Sarasua, adding some form of persistent surveillance may be the next step in visual technology for traffic management, and he said doing so could bring benefits. But he doesn’t think this approach — or any other single method — will be the only one deployed.
“Is there one best technology? And the answer is, there are really multiple technologies,” Sarasua said. “No system is perfect.”