There’s an old saying: When the only tool you have is a hammer, every problem looks like a nail.
Sometimes referred to as “the law of the instrument,” that hammer-and-nail idea is a common pitfall in research; when you’re not open to questioning your own methods, you might miss an opportunity for learning and impact.
For a multidisciplinary group of researchers at the Microsoft Research Lab in Cambridge, U.K., the mission was to build a new kind of computer that would transcend the limits binary systems face in rapidly solving complex problems. But the willingness to entertain big questions like “What is the nature of this tool we are designing?” and “What is the ‘nail’ we can hammer with it?” was key to their success in building a computer that can solve practical problems at the speed of light.
To start, they built the first 8-variable optical computer of its kind. The computer uses different intensities of light to compute at the same location where the information is stored. The researchers called their device AIM, for Analog Iterative Machine.
“It is always the case that if you make some technological advancement, typically in the beginning it will not be clear how to use it in practice,” says Christos Gkantsidis, one of the three principal researchers on the project. He was recalling how they originally hoped to use AIM as a tool to accelerate machine learning. “There is a bit of research figuring out which practical problems are more of a natural fit for them.”
About three years ago, they tried using AIM to solve a particularly vexing but important type of math problem – optimization. They quickly realized this new device had the potential to greatly surpass the speed and capacity of the binary systems used in typical computers in solving these optimization problems.
“Basically, optimization runs the world as we know it,” says Gkantsidis. Optimization problems underlie many of society’s most important structures – among them: banks and finance, healthcare, logistics and manufacturing.
The promise of this new computer has led to a one-year research agreement with Barclays Bank PLC to investigate the potential of using it to solve a real-world problem – how batches of transactions are settled at the clearing houses used by most banks. The number of transactions runs into the hundreds of thousands daily. As with most optimization problems, it is the sheer scale that overwhelms the capacity of binary computers to solve it.
“Effectively, it would take the lifetime of the universe to evaluate all the possible options,” says Lee Braine, managing director and distinguished engineer in the chief technology office at Barclays. Currently, he says, a variety of computing and mathematical shortcuts are used to make a sophisticated estimation of the most effective way to settle batches of tens of thousands of transactions.
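To make that scale problem concrete, here is a minimal sketch of how batch settlement can be framed as a binary optimization problem: each transaction is either settled or deferred, and the goal is to settle as much value as possible without overdrawing any account. The accounts, amounts and net-settlement feasibility rule below are invented for illustration – they are not Barclays’ actual formulation or the way AIM encodes the problem – and the brute-force search shown is exactly the approach that becomes impossible at real-world scale.

```python
# A minimal sketch of batch settlement as a binary optimization problem.
# The transactions, balances, and net-settlement rule are illustrative
# assumptions, not Barclays' actual formulation or AIM's encoding of it.
from itertools import product

# Hypothetical batch: (payer, payee, amount).
transactions = [
    ("A", "B", 50),
    ("B", "C", 80),
    ("C", "A", 30),
    ("A", "C", 40),
]
opening_balances = {"A": 60, "B": 20, "C": 10}

def feasible(selection):
    """True if settling the chosen transactions leaves no account overdrawn."""
    balances = dict(opening_balances)
    for chosen, (payer, payee, amount) in zip(selection, transactions):
        if chosen:
            balances[payer] -= amount
            balances[payee] += amount
    return all(balance >= 0 for balance in balances.values())

# Exhaustive search: every transaction is either settled (1) or deferred (0),
# so there are 2**n candidate subsets -- 16 here, but astronomically many for
# the hundreds of thousands of transactions a clearing house sees daily.
best_value, best_selection = -1, None
for selection in product((0, 1), repeat=len(transactions)):
    if feasible(selection):
        value = sum(amt for s, (_, _, amt) in zip(selection, transactions) if s)
        if value > best_value:
            best_value, best_selection = value, selection

print("settle:", best_selection, "total value:", best_value)
```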
The AIM team had already run what they call a “toy version” of the transaction settlement problem posed by Braine, and the optical computer solved it with 100% accuracy every time. A previous research effort at solving the same problem with a different technology had hit the mark only about 50% of the time.
Braine is himself a computer scientist who has done extensive research on optimization. Now he and the Microsoft team have begun designing a larger-scale version of the problem using more data and variables. They hope to test it on an upgraded version of AIM later this summer. Braine says working with Microsoft’s AIM team is a unique opportunity. “It’s very exciting to be involved in something that has the potential to create innovative change,” he says. “To be on the leading edge of what’s possible.”
The end of Moore’s Law
In 1965, the engineer Gordon Moore, who would go on to co-found Intel, predicted that the number of transistors in an integrated circuit would double every year. He later changed his prediction to every two years, and for decades, the capacity of computers has increased at roughly that rate, getting progressively faster and smaller without getting more expensive. But in the past decade, the trend has plateaued. At the same time, demand for computing capacity and speed has only grown.