It is a great achievement for researchers to be recognized at a conference for their work. For one Illinois researcher, that recognition came not once but twice at the same conference, for two different papers.
Tarek Abdelzaher, professor of computer science at the University of Illinois at Urbana-Champaign and the Coordinated Science Laboratory, along with collaborators from UIUC, the Massachusetts Institute of Technology, and George Mason University, received an Outstanding Paper Award at RTAS’22 for their work, “Self-Cueing Real-Time Attention Scheduling in Criticality-Aware Visual Machine Perception.” At the same conference, a paper Abdelzaher co-authored 20 years ago, “RAP: A Real-Time Communication Architecture for Large-Scale Wireless Sensor Networks,” received the Test of Time (Influential Paper) Award.
“These two papers were a great team effort, and they would not have been possible without all of the authors,” said Abdelzaher. “We were lucky to have a great team, and I think that was a big factor in winning both awards.”
Both projects focus on improving algorithms that automate sensor data processing, but in very different ways. In the more recent paper, the researchers demonstrated how to shrink the computational footprint of perception-based AI so that it can run on smaller hardware. Perception algorithms handle seeing and interpreting the environment, as in drones and self-driving cars, but all that computing typically demands a bigger processor. By cutting the computational cost of the algorithm, the researchers could use processors that are physically and computationally lighter. For inspiration, they turned to the way humans perceive the world.
“If I’m sitting in an IMAX theater, I’m not looking at every pixel on that screen or paying the same attention to all of it. I’m watching where the action is,” Abdelzaher said. The research was funded by the Army Research Laboratory as part of the Internet of Battlefield Things (IoBT) REIGN project. “We don’t need a machine powerful enough to track every pixel, because humans aren’t powerful enough to track every pixel. We just need to know where to focus.”
Rather than attending to everything at once, the AI used an algorithm to self-cue its attention, determining the most significant parts of the scene at any given moment. This reduced the computing power required without sacrificing a significant amount of perception quality. The team hopes the approach will find its way into everyday life.
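The idea of spending limited compute only where it matters can be illustrated with a toy sketch. This is not the authors' implementation; the region names, criticality scores, and costs below are invented for illustration, and the sketch simply allocates a fixed per-frame processing budget to the most critical image regions first.

```python
# Hypothetical sketch of criticality-aware attention scheduling:
# instead of processing every pixel, spend a fixed compute budget
# on the highest-criticality regions of the frame.

def schedule_attention(regions, budget):
    """Pick regions to process, highest criticality first, within budget.

    regions: list of (name, criticality, cost) tuples
    budget:  total processing cost available for this frame
    """
    chosen = []
    # Visit regions in order of decreasing criticality.
    for name, criticality, cost in sorted(regions, key=lambda r: -r[1]):
        if cost <= budget:
            chosen.append(name)
            budget -= cost  # spend part of the budget on this region
    return chosen

# Example frame: a pedestrian and a vehicle matter; the sky does not.
frame = [("pedestrian", 0.9, 4), ("sky", 0.1, 6), ("vehicle", 0.7, 5)]
print(schedule_attention(frame, budget=10))  # ['pedestrian', 'vehicle']
```

With a budget of 10, the scheduler processes the pedestrian and vehicle regions and skips the low-criticality sky, mirroring the "watch where the action is" intuition from the quote above.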
“I think it promises much more diverse and interesting applications in smart homes, smart hospitals, and just everyday life where more and more computing devices are equipped with smart capabilities,” Abdelzaher said. “I think some of this work will enable more applications in this space.”
Abdelzaher’s other paper, “RAP: A Real-Time Communication Architecture for Large-Scale Wireless Sensor Networks,” has gained considerable recognition among researchers since its publication 20 years ago. The paper has been cited several hundred times over the years and was honored at the conference for its significant contribution to the field.
In the 2002 paper, the authors wrote that RAP provides a new prioritization protocol for distributed micro-sensing applications. This has benefited communication scheduling in sensor networks, in which many wireless devices are seamlessly integrated into a physical space to perform real-time monitoring or control, such as in a surveillance system.
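RAP's prioritization is built around a notion of packet "velocity": roughly, the distance a packet still has to travel toward its destination divided by the time remaining before its deadline, with higher-velocity packets forwarded first (the paper calls this Velocity Monotonic Scheduling). A minimal sketch of that idea, with illustrative function and parameter names that are not taken from the paper's code:

```python
# Illustrative sketch of velocity-based packet prioritization:
# a packet that must cover more distance in less time has a higher
# requested velocity, and therefore a higher forwarding priority.

def requested_velocity(distance_m, time_left_s):
    """Required speed (meters/second) for the packet to meet its deadline."""
    return distance_m / time_left_s

# Two packets contending for the same channel at a relay node:
far_and_urgent = requested_velocity(distance_m=200, time_left_s=2)   # 100.0
near_and_relaxed = requested_velocity(distance_m=50, time_left_s=5)  # 10.0

# The higher-velocity packet is transmitted first.
assert far_and_urgent > near_and_relaxed
```

The appeal of this rule is that it is purely local: each node can rank queued packets using only their remaining distance and deadline, with no global coordination across the large-scale network.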
“Our paper was the first to propose a scheduling algorithm that addresses real-time constraints on end-to-end tasks that run in very large distributed sensing systems,” Abdelzaher said. “It’s really flattering to hear that a paper we wrote 20 years ago won an influential paper award. That’s a very good compliment.”
Read the original story from the Coordinated Science Laboratory.