Google Cloud calculated 100 trillion digits of pi

In 2019, Google Cloud calculated pi, the irrational number known since antiquity, out to 31.4 trillion digits, a world record at the time. Then, in 2021, scientists at the University of Applied Sciences of the Grisons added another 31.4 trillion digits, bringing the total to 62.8 trillion decimal places.

But now, Google Cloud has shattered the latest record, calculating pi out to an unprecedented 100 trillion digits, according to a press release.

Hold on tight.

How Google Cloud calculated pi to 100 trillion digits

This marks the second time Google Cloud has set a record for the number of digits of the mathematical constant. And the number of calculated digits has more than tripled in only three years.

“This achievement is a testament to how much faster Google Cloud infrastructure gets, year in, year out,” read the press release from Google Cloud. “The underlying technology that made this possible is Compute Engine, Google Cloud’s secure and customizable compute service, and its several recent additions and improvements: the Compute Engine N2 machine family, 100 Gbps egress bandwidth, Google Virtual NIC, and balanced Persistent Disks.”

The program that carried out the 100-trillion-digit calculation is y-cruncher v0.7.8 by Alexander J. Yee, and the algorithm it used is the Chudnovsky algorithm. The compute node was an n2-highmem-128 instance with 128 vCPUs and 864 GB of RAM.
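To give a flavor of what the Chudnovsky algorithm computes, here is a minimal Python sketch of the standard binary-splitting formulation. It is an illustration only: the function name and term-count heuristic are our own, and y-cruncher’s production implementation is vastly more sophisticated, with parallel, out-of-core arithmetic that this toy does not attempt.

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Toy Chudnovsky computation of pi via binary splitting (illustration only)."""
    getcontext().prec = digits + 10          # working precision plus guard digits

    def bs(a, b):
        # Binary splitting over series terms [a, b). Combining P, Q, T for the
        # full range [0, N) gives pi ~= 426880 * sqrt(10005) * Q / T.
        if b - a == 1:
            if a == 0:
                Pab = Qab = 1
            else:
                Pab = (6 * a - 5) * (2 * a - 1) * (6 * a - 1)
                Qab = a * a * a * (640320 ** 3 // 24)
            Tab = Pab * (13591409 + 545140134 * a)
            return Pab, Qab, -Tab if a & 1 else Tab
        m = (a + b) // 2
        Pam, Qam, Tam = bs(a, m)
        Pmb, Qmb, Tmb = bs(m, b)
        return Pam * Pmb, Qam * Qmb, Tam * Qmb + Pam * Tmb

    terms = digits // 14 + 2                 # each term adds roughly 14.18 digits
    _, Q, T = bs(0, terms)
    return Decimal(426880) * Decimal(10005).sqrt() * Q / T

print(chudnovsky_pi(50))  # 3.1415926535897932384626433832795...
```

The series converges at roughly 14 new decimal digits per term, which is why it is the formula of choice for record-scale computations like this one.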

The calculation began on Thursday, October 14, 2021, at 12:45 AM EDT and ended on Monday, March 21, 2022, at 12:16 AM EDT. That’s 157 days, 23 hours, 31 minutes, and 7.651 seconds. It almost feels like something a sci-fi android would say and do.
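Both timestamps are in EDT, so the elapsed time falls out of a plain subtraction; here is a quick Python check (the article rounds the times to the minute, so the 7.651 seconds cannot be reproduced):

```python
from datetime import datetime

start = datetime(2021, 10, 14, 0, 45)   # Oct 14, 2021, 12:45 AM EDT
end = datetime(2022, 3, 21, 0, 16)      # Mar 21, 2022, 12:16 AM EDT

elapsed = end - start                   # same UTC offset on both dates, so naive math is fine
hours, rem = divmod(elapsed.seconds, 3600)
print(elapsed.days, "days,", hours, "hours,", rem // 60, "minutes")  # 157 days, 23 hours, 31 minutes
```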

Storing this unconscionably large number took 515 TB of the 663 TB available, and the run’s total I/O came to 82 PB: 43.5 PB read and 38.5 PB written.

Showcasing Google Cloud’s superior technology

Naturally, it takes a lot of computing power, storage, and networking finesse to make a calculation of this magnitude feasible. Google Cloud estimated that roughly 554 TB of temporary storage would be needed to complete the calculation. “The maximum persistent disk capacity that you can attach to a single virtual machine is 257 TB, which is often enough for traditional single-node applications, but not in this case,” read the press release.
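As a back-of-the-envelope check of why a single machine could not hold the scratch space, using only the figures quoted above (the variable names are purely illustrative):

```python
import math

required_scratch_tb = 554   # Google Cloud's estimate of temporary storage needed
max_pd_per_vm_tb = 257      # persistent disk capacity attachable to a single VM

# One VM cannot attach enough persistent disk, so the scratch space has to be
# spread across several machines and served back over the network.
print(math.ceil(required_scratch_tb / max_pd_per_vm_tb))  # 3 VMs at an absolute minimum
```

Capacity alone forces at least three machines; the cluster described next uses far more storage nodes than that, presumably to multiply aggregate disk and network throughput rather than just raw space.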

The firm created a cluster of one computational node and 32 storage nodes, together providing 64 iSCSI block storage targets.

There’s much to admire about Google Cloud’s effort in reaching the first 100 trillion digits of pi. But this goes beyond the ancient Greek letter and the number behind it. In computing pi to this length, the firm has demonstrated its infrastructure’s flexibility, enabling teams to push the envelope of scientific experimentation while showcasing how reliable Google’s products are. What other hardware can run a calculation continuously for more than five months without a single node failure? It’s a wondrous world, and we’re here for it.

