Many of the technologically advanced products we all use were developed with the help of a computer simulation. Prototyping in silico saves the manufacturers of marvels such as airplanes and computer chips huge amounts of time and money and can improve quality and safety. That is, so long as the engineers understand a simulation’s inherent margins of error. The mathematics that explain and manage those error boundaries are the specialty of Mark Ainsworth, professor of applied mathematics.
“Computers are very good at doing long, tedious calculations, but they are only going to give results that are as smart as the technology you put into them, so they never give you quite the right answer,” said Ainsworth, who came to Brown from the University of Strathclyde in Glasgow, Scotland. “The discrepancy between what’s coming out of your computer realization of a model and what you should get out if you’ve done the computation exactly can be quantified and bounded rigorously using mathematics.”
Ainsworth is not merely rehashing the old refrain “Garbage in, garbage out.” Sure, human errors result in computer errors, but he’s talking about something more ingrained. As binary machines, computers necessarily require calculations to be broken into distinct parts, or “discretized,” in the language of math. Here’s a simple example: In the real world, a macroscopic surface can be smooth, at least down to the level of individual atoms or molecules. In a computer it must be approximated much more coarsely; in graphics, surfaces are typically composed of little triangles. Using smaller triangles (and more of them) to approximate the surface will reduce the model’s inherent error, but will also slow the model down.
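The triangle analogy can be made concrete in a few lines of Python (an illustrative sketch, not drawn from Ainsworth’s work): approximate a perfectly smooth curve, the unit circle, by straight segments and watch the discretization error shrink as the segments multiply.

```python
import math

def inscribed_perimeter(n):
    """Perimeter of a regular n-gon inscribed in the unit circle.

    The n straight segments play the role of the little triangles
    in a graphics mesh: a coarse stand-in for a smooth shape.
    """
    return n * 2 * math.sin(math.pi / n)

true_circumference = 2 * math.pi

# Finer discretizations (more, smaller segments) leave less inherent error,
# but each refinement costs more computation.
for n in (8, 64, 512):
    err = true_circumference - inscribed_perimeter(n)
    print(f"{n:4d} segments: error = {err:.2e}")
```

Doubling the number of segments here cuts the error by roughly a factor of four, while the work grows in proportion, which is exactly the error-versus-economy trade-off the article goes on to describe.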
Ainsworth not only studies the amount of inherent error that individual choices yield, but also that trade-off between error and economy. He seeks ways to make the trade-off as favorable as possible. That is scientific computing’s quest for “higher-order methods.”
“One of the questions mathematicians ask is, what would happen if I did a discretization that was twice as fine? How much would the error go down?” he said. “With a lower-order method, if you use twice the computer resources, the error may go down by half. With a higher-order method you are trying to beat that.”
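That question can be put to a quick numerical test (a minimal sketch, not Ainsworth’s actual work): compare a first-order method, the left Riemann sum, against a higher-order one, Simpson’s rule, on a simple integral and see how much each gains from a discretization twice as fine.

```python
import math

def riemann_left(f, a, b, n):
    """First-order method: left Riemann sum with n subintervals."""
    h = (b - a) / n
    return h * sum(f(a + i * h) for i in range(n))

def simpson(f, a, b, n):
    """Higher-order method: Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    total = f(a) + f(b)
    total += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    total += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return total * h / 3

exact = math.e - 1  # integral of e^x from 0 to 1, known in closed form

# Doubling n roughly halves the first-order error,
# but cuts the higher-order error by a factor of about sixteen.
for n in (16, 32):
    err_lo = abs(exact - riemann_left(math.exp, 0.0, 1.0, n))
    err_hi = abs(exact - simpson(math.exp, 0.0, 1.0, n))
    print(f"n={n:2d}  first-order error={err_lo:.2e}  higher-order error={err_hi:.2e}")
```

The first-order method repays a doubling of effort with only a halved error; the higher-order method beats that handily, which is the payoff Ainsworth describes, when the method works at all.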
Sometimes one can employ higher-order methods and sometimes one had better not.
“When they work they’re fantastic, but often when they don’t work they can fail spectacularly,” he said. “So it’s quite interesting mathematically to figure out when they are going to work and when they don’t work, why they don’t work.”
As a professor of applied mathematics, Ainsworth has indeed made an impact on real systems. His insights into computer modeling of optoelectronic devices have enabled him to help Hewlett-Packard make better LCD displays. His modeling of thin-walled structures has informed the code that governs pipelines in the North Sea oil industry. He’s even applied his methods to masonry.
At Strathclyde, where he held the 1825 Chair in Mathematics, he recently won a major grant for a project called “Numerical Algorithms and Intelligent Software,” a collaboration with several universities and the United Kingdom’s most important supercomputer center.
Especially with a major project like that underway, it might seem surprising that Ainsworth, who has two children, would leave his native Great Britain to come to Providence. But his wife Lynn, whom he met while he was a visiting professor at the University of Texas–Austin in the early 1990s, is a native of Buffalo. And Brown has been on his mind for years. He’d visited Brown for a month before coming back in 2004 for a major higher-order methods conference.
“It’s really too good a place not to come to,” he said. “There’s just so many good people here and the research environment is fantastic.”
For Ainsworth the key attraction to Brown is the potential for collaborations with people in mathematics and scientific computing who have complementary expertise.
“The most exciting discoveries come when you bring two apparently different areas together,” he said. “Life is really too short to educate yourself in both areas and do it all yourself. Find a good collaborator. That’s the way to do things and get a new perspective.”
After all, just because computer models demand that calculations be broken into artificially discrete parts doesn’t mean the mathematicians, scientists, and engineers who study them have to be.