Ask a physicist for the value of pi, and you’ll likely get an answer like 3.14 – maybe 3.142 if they’re feeling particularly scholarly that day. Ask an engineer, and it’s even worse: the standard answer is “about three”. Ask Google Cloud, though, and you’re in for a long and incredibly accurate ride.
This week, the company reclaimed the world record for the highest number of digits of pi calculated, bringing the known value of pi, the ratio between a circle’s circumference and diameter, up to a staggering 100 trillion digits.
And we say “reclaimed”, because this is actually the second time Google Cloud has held this title. “Records are made to be broken,” wrote Google developer advocate Emma Haruka Iwao in a press release announcing the achievement.
“This is the second time we’ve used Google Cloud to calculate a record number of digits for the mathematical constant,” she said, “tripling the number of digits in just three years.”
If anything, that’s underselling it: Google previously made the record books for calculating 31.4 trillion digits of pi, an achievement they appropriately announced on Pi Day 2019. But last year, their thunder was well and truly stolen when a team of Swiss scientists doubled the number of known digits, bringing the length of the most accurate known value of pi to 62.8 trillion digits.
Now, after nearly six months of solid computation, Google has upped the ante once again. And just as you might expect for an achievement of this magnitude, it didn’t come easy: “we estimated the size of the temporary storage required for the calculation to be around 554 TB,” explained Haruka Iwao.
“The maximum persistent disk capacity that you can attach to a single virtual machine is 257 TB, which is often enough for traditional single node applications, but not in this case.”
So instead, the developers designed a cluster of nodes: one computational node and 32 storage nodes, giving access to a total of 663 TB of disk space. As a rough guide for how big that is, you can think of it as enough to install the massively multiplayer online role-playing game World of Warcraft about 7,000 times at once.
But what is all that power being used for?
The Cloud computed the digits using a method designed by a pair of mathematicians named David and Gregory Chudnovsky. By no means an obscure name among pi-chasers, the Chudnovsky brothers developed their algorithm in the late 80s, and within just a few years, they had personally broken the then-record for most digits of pi calculated – two billion digits – using a supercomputer they had built from mail-order parts in their shared New York apartment.
Since its publication, the Chudnovsky algorithm has been the algorithm of choice for most record attempts in pi-calculation – in fact, for the last 12 years it’s been the only method to have found more digits than previous records. That’s because it’s a very efficient algorithm: every time you perform a fresh iteration, you get more than 14 extra digits on average.
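To see that efficiency in miniature, here’s a Python sketch of the Chudnovsky series using the standard-library `decimal` module (the function name and the fixed working precision are our own choices – real record attempts use heavily optimized arbitrary-precision arithmetic, not a few lines of Python):

```python
from decimal import Decimal, getcontext
from math import factorial

def chudnovsky_pi(terms: int = 3) -> Decimal:
    """Approximate pi with the Chudnovsky series (~14 new digits per term)."""
    getcontext().prec = 14 * terms + 10   # working precision, with headroom
    s = Decimal(0)
    for k in range(terms):
        num = Decimal(factorial(6 * k)) * (13591409 + 545140134 * k)
        den = (Decimal(factorial(3 * k)) * Decimal(factorial(k)) ** 3
               * Decimal(-262537412640768000) ** k)
        s += num / den
    # pi = 426880 * sqrt(10005) / sum
    return Decimal(426880) * Decimal(10005).sqrt() / s

print(chudnovsky_pi(3))   # correct to roughly 42 decimal places
```

Even a single term already matches pi to 13 decimal places, which is exactly what makes the series so attractive for record hunting.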
It’s a massive difference from the earliest ways people tried to calculate pi. The first recorded method was devised by Archimedes – he of bathtime “eureka!” fame – and relied on polygons: a computer, a human one in those days rather than a machine, would construct two regular polygons, one inscribed inside a circle and one circumscribed around it, and could then trap the circumference between the two perimeters. The more sides the polygons had, the tighter the bounds on pi became – Archimedes himself went all the way up to 96-sided shapes, thereby proving that pi is bigger than 3.1408 but less than 3.1429.
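Archimedes’ procedure translates into remarkably little code. A Python sketch (function name ours) using the classical perimeter-doubling recurrence, starting from a hexagon – four doublings reaches the 96-gon where Archimedes stopped:

```python
from math import sqrt

def archimedes_bounds(doublings: int = 4):
    """Bound pi between the semi-perimeters of inscribed and
    circumscribed regular polygons around a unit circle."""
    inner, outer = 3.0, 2 * sqrt(3)   # hexagon: inscribed and circumscribed
    sides = 6
    for _ in range(doublings):
        outer = 2 * inner * outer / (inner + outer)   # circumscribed 2n-gon
        inner = sqrt(inner * outer)                   # inscribed 2n-gon
        sides *= 2
    return sides, inner, outer

sides, lo, hi = archimedes_bounds(4)
print(sides, lo, hi)   # 96-gon: pi lies between about 3.1410 and 3.1427
```

Archimedes’ published bounds are slightly looser than these because he had to round his square roots to fractions by hand.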
Between 250 BCE and about 1500 CE, if you wanted to calculate pi, polygons were all you had. That changed when Indian mathematicians, most notably Nilakantha Somayaji, discovered that infinite series could be used instead – basically, that pi could be found as a sum of an infinite number of terms, and the more terms you added up, the closer your approximation would become.
The infinite series found by Nilakantha to calculate pi: π = 3 + 4/(2×3×4) − 4/(4×5×6) + 4/(6×7×8) − …
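Nilakantha’s series is easy to sum directly. A quick Python sketch (function name ours):

```python
def nilakantha_pi(terms: int = 1000) -> float:
    """Sum Nilakantha's series: pi = 3 + 4/(2*3*4) - 4/(4*5*6) + ..."""
    total, sign = 3.0, 1.0
    for k in range(1, terms + 1):
        n = 2 * k                     # 2, 4, 6, ... start of each triple
        total += sign * 4.0 / (n * (n + 1) * (n + 2))
        sign = -sign
    return total

print(nilakantha_pi(1000))   # agrees with pi to about nine decimal places
```

A thousand terms buys you only about nine correct decimals – compare that with the Chudnovsky series’ 14-plus digits per term, and you can see why the choice of algorithm matters as much as the hardware.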
Western mathematicians caught up with their Indian peers within a century or so, although the first few methods they found were actually infinite products rather than infinite sums. By 1706, though, John Machin had found what would remain the best-known algorithm for calculating pi for more than two centuries – his method would be used by Royal Navy mathematician Daniel Ferguson to calculate what was then a record 620 digits of pi in 1946.
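Machin’s insight was to express pi through arctangents of small fractions, whose series converge quickly: pi/4 = 4·arctan(1/5) − arctan(1/239). A Python sketch using the standard-library `decimal` module (the function names and precision padding are our own choices):

```python
from decimal import Decimal, getcontext

def arctan_inv(x: int, prec: int) -> Decimal:
    """arctan(1/x) by its Taylor series; converges fast for integer x > 1."""
    getcontext().prec = prec + 10
    eps = Decimal(10) ** -(prec + 5)
    inv2 = Decimal(1) / (x * x)
    term = Decimal(1) / x             # magnitudes 1/x, 1/x^3, 1/x^5, ...
    total, n, sign = term, 1, 1
    while term > eps:
        term *= inv2
        n += 2
        sign = -sign
        total += sign * term / n
    return total

def machin_pi(prec: int = 50) -> Decimal:
    """John Machin's 1706 identity: pi/4 = 4*arctan(1/5) - arctan(1/239)."""
    getcontext().prec = prec + 10
    return 4 * (4 * arctan_inv(5, prec) - arctan_inv(239, prec))

print(machin_pi(50))
```

Ferguson worked with pencil, paper, and a desk calculator; on a modern laptop the same formula delivers his 620 digits in a blink.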
At 100 trillion digits, that record has now been increased by a factor of around, oh, 161,290,322,581. And while obviously most of that improvement comes down to the development of supercomputers, the idea behind the algorithm used by Google is actually the same as those 18th century methods – it’s just been refined thanks to mathematicians like Ramanujan and, of course, the Chudnovsky brothers.
The infinite series used for the Chudnovsky algorithm. Image credit: Lorenz Milla, 2021
Now, human brains can’t really deal with numbers like 100 trillion, so it might be difficult to appreciate just how huge Google’s record-breaking figure is. So here are some comparisons:
1. If you had a dollar bill for each digit, not only would you have more money than everybody in the world added together, but you would also be able to stack them so high that they would reach the moon … 25 times.
2. If you read the digits out at a rate of one per second, it would take approximately 3.17 million years. That’s longer than the entire history of the genus Homo.
3. Look at your pinky fingertip. That’s probably about one by 2.5 centimeters (about 0.4 by 1 inch) in size. Pretty small. But if you wrote one digit of the new approximation on individual pieces of paper the size of that pinky tip, you would have enough paper at the end to entirely cover the state of Vermont.
4. If each digit corresponded to one grain of rice, those grains would weigh about 2.5 million tonnes and could fill roughly 90,000 shipping containers.
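For the sceptical, the Vermont comparison survives a back-of-the-envelope check (paper-slip size as in the pinky-tip estimate):

```python
DIGITS = 10 ** 14                    # 100 trillion digits
slip_area_m2 = 0.01 * 0.025          # one 1 cm x 2.5 cm slip per digit

covered_km2 = DIGITS * slip_area_m2 / 1e6
print(covered_km2)                   # about 25,000 km^2; Vermont is ~24,900 km^2
```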
But if you’re thinking this new approximation will be used for cutting-edge physics research or never-before-seen mathematical exactitude, we have bad news: as exciting as every new pi record is, it’s never really been about pi itself.
“For [NASA’s Jet Propulsion Lab’s] highest accuracy calculations, which are for interplanetary navigation, we use 3.141592653589793,” explained director and chief engineer for NASA's Dawn mission, Marc Rayman. “[W]e don't use more decimal places … there are no physically realistic calculations scientists ever perform for which it is necessary to include nearly as many decimal points as [Google provides].”
Even for the most extreme levels of accuracy possible – measuring the circumference of the visible universe to an accuracy equal to the diameter of a hydrogen atom – you only need about 40 decimal places of pi, Rayman pointed out.
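Rayman’s figure checks out on the back of an envelope (the sizes below are rough round numbers we’ve assumed, not his exact inputs):

```python
from math import log10

universe_diameter_m = 8.8e26    # observable universe, ~93 billion light-years (assumed)
hydrogen_diameter_m = 1.1e-10   # roughly two Bohr radii (assumed)

# Truncating pi after d decimal places shifts the computed circumference
# by about diameter * 10**-d, so we need 10**-d < hydrogen / diameter:
digits_needed = log10(universe_diameter_m / hydrogen_diameter_m)
print(round(digits_needed))     # ~37, comfortably within 40 decimal places
```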
So why are so many people obsessed with finding ever longer approximations of pi? It’s basically a way to test – and boast about – how good your computer is.
“It’s a computational challenge – it is a really seriously difficult thing to do, and it involves lots of mathematics and these days computer science,” David Harvey, an associate professor at the University of New South Wales, told The Guardian.
“There’s plenty of other interesting constants in mathematics: if you’re into chaos theory there’s Feigenbaum constants, if you’re into analytic number theory there’s Euler’s gamma constant,” he explained.
“Why do you do pi? You do pi because everyone else has been doing pi … That’s the particular mountain everyone’s decided to climb.”