Rice Center Researches Adaptive Software

Krishna Palem, RUCCAM Director

Rice-led Project Aims for Computing that Can Live at the Margins

Rice University recently announced the opening of a new center that explores solutions to the physical and energy limitations restricting the continued growth in computing capacity that emerging workloads demand. Krishna Palem, the Kenneth and Audrey Kennedy Professor of Computing, has been named director of the Rice University Center for Computing at the Margins (RUCCAM). RUCCAM unites a team of dedicated Rice professors and specialists from universities around the world who intend to change how computing resources are utilized, even at the margins of stability and accuracy, in order to increase the efficiency with which answers can be calculated.

Work done at the center, in collaboration with partner institutions, has potential applications for, and is supported by, a Defense Advanced Research Projects Agency (DARPA) initiative: the Building Resource Adaptive Software Systems (BRASS) program. Projects under the BRASS program are tasked with the ambitious goal of spurring advances that will allow software to remain robust and functional for more than 100 years.

As one example, the Proteus project, a joint venture among researchers at Rice, MIT, UT Austin, and the University of Chicago, may seem far-fetched in an age when computer hardware and software typically become obsolete or incompatible within a few years. But Palem said improving software adaptability by dynamically trading cost for quality has the potential to enable new generations of quasi-autonomous computing systems.

“The Proteus platform will be developed and tested on a drone system,” Palem said. “One scenario that is directly relevant to Proteus’s mission is ensuring that such vehicles continue flying and meet mission objectives even if critical components like batteries or sensors fail or get destroyed in mid-flight.”

“Importantly, the same techniques that allow a system to adapt to this type of short-term challenge could also help software adapt when components are upgraded or changed over the long term,” said Robert “Corky” Cartwright, a computer science professor at Rice and one of the researchers in the new center. “Upgrading system components today typically requires rewritten code, either in the form of a software patch or an entirely new version of an application. That level of change is both expensive and time-consuming. If we can develop new techniques to avoid that, the implications will be far-reaching.”

Palem said RUCCAM research began with a close focus on embedded computing. Embedded computers are special-purpose systems found inside thousands of consumer and industrial products, including everything from modems, toys and toasters to automobiles, satellites and jet fighters.

“The Proteus project will require coordinated efforts by researchers from each member institution,” said Palem. “For example, the centerpiece of the project will be a piece of software that is enriched by machine learning and control theory to yield information derived from embedded sensors.”

He said the software discovery and analytics components in Proteus will be supported by FAST, a type-safe extension of a mainstream programming language that captures user intent and defines the adaptive behaviors the drone might use.

“Our primary strategy is to use feedback control and estimation techniques that exploit control through inexactness, a new concept we’ll develop as part of this effort,” Palem said. “These techniques will leverage sophisticated methodologies like ‘constraint satisfaction,’ which are rooted in the foundations of computer science.”
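Constraint satisfaction, in this setting, means searching for a resource assignment that respects hard limits. The Proteus software itself is not public, so the sketch below is purely illustrative: the component names, power figures, and quality scores are all invented. A backtracking search assigns each drone subsystem a power level so that total power stays within a (possibly shrinking) budget while aggregate mission quality stays above a floor.

```python
# Illustrative constraint-satisfaction sketch; all names and numbers are
# hypothetical and are not drawn from the actual Proteus software.

# Each subsystem can run at one of several (power_watts, quality_score) levels.
DOMAINS = {
    "camera":    [(5.0, 10), (2.0, 6), (0.5, 2)],
    "gps":       [(1.5, 8),  (0.7, 5)],
    "telemetry": [(2.0, 7),  (1.0, 4), (0.2, 1)],
}

def solve(budget, min_quality, components=None, assignment=None):
    """Backtracking search for an assignment meeting both constraints."""
    components = list(DOMAINS) if components is None else components
    assignment = {} if assignment is None else assignment
    if not components:
        # All subsystems assigned: accept only if quality clears the floor.
        total_quality = sum(q for _, q in assignment.values())
        return dict(assignment) if total_quality >= min_quality else None
    name, rest = components[0], components[1:]
    for power, quality in DOMAINS[name]:
        assignment[name] = (power, quality)
        # Prune any branch that already exceeds the power budget.
        if sum(p for p, _ in assignment.values()) <= budget:
            result = solve(budget, min_quality, rest, assignment)
            if result is not None:
                return result
        del assignment[name]
    return None

healthy = solve(budget=10.0, min_quality=20)  # full battery: high quality fits
degraded = solve(budget=3.0, min_quality=5)   # failing battery: settle for less
```

As the budget shrinks, the same search automatically settles on lower-power, lower-quality operating points, which is the flavor of graceful degradation described above.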

Krishna Palem Explores the Art of Inexactness

Krishna Palem, the Kenneth and Audrey Kennedy Professor of Computer Science and director of RUCCAM, said solving big problems remains high on his radar, but he has realized the need for a new approach to high-performance computing.

To explain, he uses the example of collecting virtual creatures in one of the hottest games of the summer, Pokémon GO. “A player might spend a lot of time calculating an exact, best possible route to amass the most Pokémon, but most people prefer playing the game to planning it. They choose a ‘good enough’ route that might be less effective, but provides immediate satisfaction.” This concept, also known as satisficing, was first introduced in 1956 by Nobel laureate Herbert A. Simon.
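Satisficing can be made concrete with a toy route-planning sketch (the stop coordinates below are invented for illustration): an exhaustive search finds the truly shortest tour at factorial cost, while a greedy nearest-neighbor pass returns a “good enough” route almost immediately.

```python
import itertools
import math

# Hypothetical stop coordinates (illustrative data, not from the game).
stops = [(0, 0), (3, 4), (6, 0), (3, 1)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(route):
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))

def exact_route(points):
    """Exhaustively try every ordering: optimal, but factorial cost."""
    start, rest = points[0], points[1:]
    best = min(itertools.permutations(rest),
               key=lambda p: route_length((start,) + p))
    return (start,) + best

def satisficing_route(points):
    """Greedy nearest-neighbor: 'good enough', found in quadratic time."""
    route, remaining = [points[0]], list(points[1:])
    while remaining:
        nxt = min(remaining, key=lambda p: dist(route[-1], p))
        remaining.remove(nxt)
        route.append(nxt)
    return tuple(route)
```

The greedy route is never shorter than the optimum, but for a handful of stops it is usually close, and it avoids the combinatorial explosion that makes exact planning impractical as the number of stops grows.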

Read more at csprofiles.rice.edu.

Corky Cartwright’s Gift for Language

Rice University computer science professor and RUCCAM member Robert “Corky” Cartwright was recently featured in a Computer Science Profiles article on his gift for programming languages.

“The gap that programming languages fill,” he said, “is between a human’s understanding of what they want to compute – usually in some kind of mathematical model – and how to translate that description into code the device can execute.”

Cartwright’s current focus is energy efficiency. As in spoken languages, both syntax (structure) and semantics (the meaning of symbols, characters, and words) are critical in programming languages. Cartwright is exploring ways to economize on the energy required to execute a command by changing the language used to write it.

Read more at csprofiles.rice.edu.

Workshop on Computing: May 7-8

Workshop on Computing at the Margins

Sustaining and Accelerating Scaling Through Adaptation, Resilience and Tolerance

Rice University, May 7-8, Arlington, VA


In this invitation-only workshop, we intend to explore and develop a vision for how future systems can function at the margin of available resources, constrained either by static limits (e.g., the power budget for a supercomputer or data center) or by dynamic deviations (e.g., battery drain or failing sensors in a UAV). The focus is on system designs that generate acceptable or “good enough” solutions in the presence of severe resource limits. Examples range from supercomputing applications performing numerical and computational algebra to embedded and mobile applications such as UAVs.

The goal is twofold. On the one hand, we aim to explore computing technologies that enable applications impossible within the confines of stable computing environments, where large safety margins guarantee functionality with high probability; interesting novel substrates include 2-D phosphorus and graphene. On the other hand, we seek computational solutions that vary output quality as system resources fluctuate.

Several research challenges must be overcome to accomplish this two-pronged goal, ranging from novel programming and compilation mechanisms to tools for performing system adaptation and validating system correctness. Application domains ranging from atmospheric and ocean modeling to signal processing (including video, radar, and sonar) as well as vision and hearing are all likely candidates for new resource-efficient system designs; we invite contributions from specialists in these and other domains.

What is Computing at the Margins?

Until recently, Moore’s law and Dennard circuit scaling continually reduced the cost and energy usage of computations, enabling the construction of progressively larger and more sophisticated computing platforms. However, after sixty years of rapid advancement, the computing revolution is confronting hurdles to continued scaling: fundamental physical limits imposed by the nanoscales at which computing substrates are fabricated, and the increasing complexity of ambitious application platforms such as atmospheric modeling, media-oriented operating systems, deep-space satellite probes, UAVs, and cyber-physical systems. Continued scaling of the size, speed, and sophistication of computing systems hinges on achieving significant and novel breakthroughs in system design and implementation.

A promising strategy for overcoming looming resource limits is to rethink current technology design methodologies to emphasize techniques for efficient resource utilization. This may require renegotiating the relationship between the application and the computing platform. For example, we can gain orders of magnitude in power, speed, and potentially area efficiency using systems that operate at the margins of circuit stability. The principles underlying advances in scaling the size, speed, and complexity through more efficient resource utilization are also applicable to the development of more resilient applications that cope with unforeseen dynamic alterations in available resources including energy and input data quality.


NY Times: A Climate-Modeling Strategy That Won’t Hurt the Climate

Krishna Palem of Rice University and Tim Palmer of Oxford University were quoted on their strategy for improving the energy efficiency of computations used to solve big problems like climate science in the New York Times article A Climate-Modeling Strategy That Won’t Hurt the Climate by John Markoff.

Excerpt from the article:

Dr. Palem believes his inexact approach is more appropriate for weather and climate modeling because the vast grids of cells that separately calculate local effects like cloud formations, wind, pressure and other variables can be calculated without great accuracy.

“I see it as a necessary tool we need now to move the science forward,” said Tim Palmer, a University of Oxford climate physicist. “We can’t do a lab experiment with the climate. We have to rely on these models which try to encode the complexity of the climate, and today we are constrained by the size of computers.”

Read the full May 2015 article on the New York Times website: http://www.nytimes.com/2015/05/12/science/inexact-computing-global-warming-supercomputers.html
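The trade-off described in the excerpt, giving up arithmetic precision the application does not need, can be illustrated in pure Python by emulating single-precision (32-bit) arithmetic with the standard struct module. The grid values below are random stand-ins, not climate data, and real energy savings come from cheaper hardware, which this software emulation only models.

```python
import random
import struct

def to_f32(x):
    """Round a 64-bit Python float to IEEE 754 single precision."""
    return struct.unpack("f", struct.pack("f", x))[0]

random.seed(42)
# Stand-in grid of local perturbations (random values, not climate data).
grid = [random.uniform(-5.0, 5.0) for _ in range(10_000)]

# Full double-precision reduction over the grid.
exact_mean = sum(grid) / len(grid)

# Emulated single-precision reduction: every operand and every partial
# sum is rounded to 32 bits, as cheaper, lower-energy hardware would do.
acc = 0.0
for v in grid:
    acc = to_f32(acc + to_f32(v))
inexact_mean = acc / len(grid)

# The discrepancy is tiny relative to the size of the grid values (±5),
# suggesting the extra precision was not buying much for this reduction.
error = abs(inexact_mean - exact_mean)
```

This is the intuition behind inexact computing for grid-based models: when the quantity of interest tolerates small perturbations, halving the precision of each operation can leave the answer effectively unchanged while reducing the energy and memory each operation costs.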