Pulling Back the Curtain on the 20th Century’s Most Important Algorithms

Around 1600 B.C., Babylonians developed the earliest known algorithms for finding square roots. But most of the algorithms that impact us today were developed over the last century. Those algorithms power the world’s biggest companies, regulate traffic, help computers run more efficiently and even save lives.

John MacCormick, who wrote Nine Algorithms That Changed the Future, defined an algorithm as “a sequence of steps that you would follow to achieve a goal.” For example, he said, “you follow a recipe to make a loaf of bread.”

An algorithm can be as simple as following a step-by-step culinary recipe, but many are made up of thousands of lines of code. And that code has become part of nearly every move we make.

“We are in a very interesting moment in human history as we increasingly rely on algorithms,” said Tom Griffiths, co-author of Algorithms to Live By. “There’s a lot more potential for these things, inserting computation into our everyday lives.”

So what are these pieces of code, and who are the coders behind them, that have had such an impact on how we interact with one another, get from one place to the next and navigate the vast digital world?

Sort It Out

In 1945, mathematician John von Neumann developed the mergesort algorithm, as legend has it, while playing cards. It was the first algorithm to use a divide-and-conquer approach: break a problem into subproblems, solve each subproblem recursively, then combine the answers into a final solution. With mergesort, a list is split into two halves; the algorithm keeps splitting until each piece holds a single element, then merges the sorted pieces back together.
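Here is a minimal Python sketch of that idea (the function and its test list are purely illustrative, not von Neumann’s original formulation):

```python
def merge_sort(items):
    """Divide-and-conquer sort: split, sort each half, merge."""
    if len(items) <= 1:                  # 0 or 1 elements: already sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])       # recursively sort each half
    right = merge_sort(items[mid:])

    # merge the two sorted halves back into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])              # append whatever remains
    merged.extend(right[j:])
    return merged


print(merge_sort([8, 3, 5, 1, 9, 2]))    # [1, 2, 3, 5, 8, 9]
```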

Other sorting algorithms followed. Quicksort (1962) applies a similar divide-and-conquer strategy, while heapsort (1964) takes a different route, repeatedly pulling the largest remaining element off a heap. Beyond sorting, mergesort and its algorithmic offspring show up throughout computing, from internet link analysis to data mining software and artificial intelligence.

Love and War

In 1947, George Dantzig created the simplex method for linear programming, considered one of the most successful algorithms of all time due to its widespread use.

Dantzig served as a program planner for the U.S. Air Force during World War II and saw the need for a tool that could solve large-scale planning problems at minimum cost. The simplex algorithm, which handles problems with many variables far more efficiently than the methods that preceded it, grew out of that wartime planning work, and it went on to transform existing industries and spawn new ones.

The transportation sector, for instance, uses it to minimize the shipping costs between countless warehouses and final destinations. Dating apps also use the algorithm to optimize the number of matches between users.
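To make that concrete, here is a toy shipping problem expressed as a linear program and handed to SciPy’s linprog routine, whose default solver includes a simplex-style method. Every cost, supply and demand figure below is invented for illustration:

```python
# Toy transportation problem: ship goods from 2 warehouses to 2 stores
# at minimum total cost. All numbers are made up for illustration.
from scipy.optimize import linprog

# Decision variables: x = [w1->s1, w1->s2, w2->s1, w2->s2] units shipped
cost = [4, 6, 5, 3]                 # shipping cost per unit on each route

# Supply constraints: each warehouse ships at most what it holds
A_ub = [[1, 1, 0, 0],               # warehouse 1 holds 80 units
        [0, 0, 1, 1]]               # warehouse 2 holds 70 units
b_ub = [80, 70]

# Demand constraints: each store receives exactly what it ordered
A_eq = [[1, 0, 1, 0],               # store 1 needs 60 units
        [0, 1, 0, 1]]               # store 2 needs 50 units
b_eq = [60, 50]

result = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                 bounds=[(0, None)] * 4)
print(result.x)     # optimal shipment on each route
print(result.fun)   # minimum total shipping cost
```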

Message Delivered

GPS route finders have to solve a difficult problem millions of times a day around the world. With thousands of highways and back roads to choose from, how do you find the shortest route between two points?

“Suppose you’re trying to drive from Syracuse to Miami,” said Thomas Cormen, author of Algorithms Unlocked. “If you were trying to enumerate every possible route, there would be trillions.”

In 1959, computer scientist Edsger Dijkstra published an algorithm for finding the shortest path between two nodes in a network. A professor for most of his life, Dijkstra lived through the early days of computer programming, working as a programmer on a computer called the ARMAC.

Dijkstra wanted to demonstrate the machine on a problem a non-computing audience could understand: mapping 64 cities in the Netherlands and finding the shortest path between them. He designed the algorithm in about 20 minutes, he said later, while having a cup of coffee with his fiancée and thinking about the shortest route from Rotterdam to Groningen.
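The core of the idea fits in a short Python function: keep a priority queue of cities ordered by their best-known distance from the start, and always expand the closest one next. The road network and distances below are invented for illustration:

```python
import heapq

def shortest_path_length(graph, start, goal):
    """Dijkstra's algorithm: always expand the closest unvisited node."""
    best = {start: 0}
    queue = [(0, start)]                 # priority queue of (distance, node)
    visited = set()
    while queue:
        dist, node = heapq.heappop(queue)
        if node == goal:
            return dist                  # first time we pop the goal is optimal
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph[node]:
            new_dist = dist + weight
            if new_dist < best.get(neighbor, float("inf")):
                best[neighbor] = new_dist
                heapq.heappush(queue, (new_dist, neighbor))
    return float("inf")                  # goal unreachable


# A made-up road network: city -> [(neighbor, distance in km)]
roads = {
    "Rotterdam": [("Utrecht", 60), ("Amsterdam", 75)],
    "Utrecht":   [("Zwolle", 90), ("Amsterdam", 45)],
    "Amsterdam": [("Zwolle", 110)],
    "Zwolle":    [("Groningen", 105)],
    "Groningen": [],
}
print(shortest_path_length(roads, "Rotterdam", "Groningen"))  # 255
```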

Today, logistics teams at companies like Amazon and UPS rely on variants of Dijkstra’s shortest path algorithm to ensure packages are shipped and delivered on time to thousands of locations. It is also the basis of much of the software that routes calls and texts through phone networks, and emails through the internet.

Filter Out the Noise

Noise, the random errors and inaccuracies that creep into measurements, is an unwanted side effect of many computerized processes. Devices like accelerometers and gyroscopes churn out raw data full of it. In 1960, engineer Rudolf Kálmán published an article with a solution: an algorithm that takes in noisy measurements and filters the noise out.

The Kalman filter is an algorithm that accurately estimates variables such as speed, direction and location by combining noisy, imperfect measurements taken over time with a prediction of how the system is expected to behave, producing an estimate that is better than any single measurement on its own. Our smartphones’ GPS sensors track our location, for example, but the Kalman filter helps estimate where we are when the signal drops out, such as in a tunnel or underground.
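A minimal sketch of the idea, assuming a single tracked value and made-up noise levels, looks like this in Python: each new reading is blended with the running estimate, weighted by how much each is trusted.

```python
def kalman_1d(measurements, process_var=1e-3, measurement_var=0.5):
    """A bare-bones 1-D Kalman filter for a roughly constant value.
    Noise variances are assumed for illustration."""
    estimate, error = measurements[0], 1.0   # initial guess and its uncertainty
    estimates = [estimate]
    for z in measurements[1:]:
        error += process_var                     # predict: uncertainty grows a little
        gain = error / (error + measurement_var) # Kalman gain: trust in the new reading
        estimate += gain * (z - estimate)        # update: nudge estimate toward reading
        error *= (1 - gain)                      # blended estimate is more certain
        estimates.append(estimate)
    return estimates


# Noisy readings of a value that is really about 10.0 (numbers invented)
readings = [9.4, 10.6, 9.8, 10.3, 9.9, 10.1]
print(kalman_1d(readings))   # estimates settle near 10 as readings accumulate
```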

Kalman filters were used during the Apollo space program, in NASA’s space shuttles, and in submarines and cruise missiles. GPS navigation is, at its core, one giant Kalman filter, and wind farms use the algorithm to detect wind anomalies and prolong the lives of individual turbines. A more recent application is in the budding field of virtual reality, where Kalman filters provide predictive tracking, forecasting where an object will be next.

Nuclear Test Detection

In 1963, James W. Cooley, a mathematician who had been a programmer under John von Neumann, was working at the IBM Watson Research Center in New York when a physicist named Richard Garwin visited the lab. Garwin had designed the first hydrogen bomb in 1952, and he’d come to Cooley’s lab from a meeting of President John F. Kennedy’s Scientific Advisory Committee.

The two men spoke about a math problem Garwin had encountered, “determining the periodicities of the spin orientations in a 3-D crystal” of helium, Cooley wrote later. But Cooley soon figured out that Garwin “was far more interested in improving the ability to do remote seismic monitoring of nuclear explosions.”

By 1965, Cooley and another mathematician on Kennedy’s science committee, John Tukey, had developed an algorithm, called the fast Fourier transform (FFT), that would help Garwin detect Soviet nuclear tests. The FFT is a key ingredient of digital signal processing, which takes signals such as audio, video or temperature readings and decomposes them into the frequencies they contain so that computers can analyze and manipulate them. Today, the internet, smartphones, computers and satellites all depend on FFT algorithms to function.
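Here is a small NumPy sketch of the kind of analysis the FFT makes cheap: recovering the dominant frequencies hidden in a noisy signal. The signal below is synthetic, with frequencies and noise levels chosen only for illustration.

```python
import numpy as np

# A synthetic one-second signal sampled 1,000 times: a 50 Hz tone and a
# weaker 120 Hz tone buried in random noise (values invented).
rate = 1000
t = np.arange(rate) / rate
signal = (np.sin(2 * np.pi * 50 * t)
          + 0.5 * np.sin(2 * np.pi * 120 * t)
          + 0.3 * np.random.randn(rate))

spectrum = np.abs(np.fft.rfft(signal))      # FFT: time domain -> frequency domain
freqs = np.fft.rfftfreq(rate, d=1 / rate)   # frequency of each FFT bin

# The two largest peaks should sit near 50 Hz and 120 Hz.
peaks = freqs[np.argsort(spectrum)[-2:]]
print(sorted(peaks))
```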

Feeling Lucky

In 1997, Stanford PhD student Larry Page decided to explore the mathematical properties of the World Wide Web by mapping its link structure as a graph, with each web page a node and each link between pages an edge. After Page and fellow PhD candidate Sergey Brin discovered a shared fascination with retrieving relevant information from large data sets, the pair wrote “The Anatomy of a Large-Scale Hypertextual Web Search Engine.”

That research became the foundation for Page and Brin’s PageRank algorithm, a quality metric that rates a web page’s importance, reliability and credibility on a scale of 0 to 10, based on the pages linking to it. With the launch of the Google search engine in 1998, Page and Brin used a page’s PageRank to help determine its position in the search results. Today, PageRank is just one of roughly 200 ranking factors Google uses to decide where a page appears. The Google search engine, built around the PageRank algorithm, changed the way the world navigates the web and made way for new industries such as search engine optimization and digital media analytics.
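A bare-bones sketch of the idea behind PageRank looks like this in Python, using a made-up four-page web and simple power iteration; the real algorithm layers many refinements on top of this.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank: each page spreads its score evenly among
    the pages it links to, and scores are recomputed until they settle."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}          # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing) if outgoing else 0
            for target in outgoing:
                new_rank[target] += share              # pass score along each link
        rank = new_rank
    return rank


# A made-up four-page web: page -> pages it links to
web = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
print(pagerank(web))   # C collects the most links, so it scores highest
```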

The Algorithm Road Ahead

Like great literature, algorithms can be beautiful.

But while the world has been fundamentally transformed by the speed, efficiency and—yes—beauty of algorithms, there could be consequences for continuing to rely on them.

“As the saying goes, history is written by the victors,” said Susan Etlinger, an industry analyst at Altimeter Group. “But we are now at a point when a great deal of human behavior will be expressed in algorithms.”

Google Assistant and Amazon’s Alexa have driven huge improvements in speech recognition in recent years, for instance. Tesla and Uber have promised self-driving cars, guided by algorithms, in the not-too-distant future. And deep learning, built on neural networks, may spark an automation revolution.

Algorithms, Etlinger said, “will influence how we search, learn, make decisions and even the extent to which we can understand the consequence of our actions.”