And we have a new record!
StorageReview Lab has computed pi to 202,112,290,000,000 decimal digits. The final digit in that sequence was a 2.
It took 100 elapsed days, including 85 days of continuous computation, and used 1.5 petabytes of storage. CPU speed used to be the main determinant of computation time, but this time storage read/write speed was the limiting factor, so a faster CPU wouldn't have made any difference.
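For scale, the same basic idea works on a laptop with an arbitrary-precision library, just with vastly fewer digits. Here's a minimal sketch assuming Python's mpmath package (my choice for illustration; record runs at this scale typically rely on dedicated software such as y-cruncher and enormous amounts of disk): set a working precision in decimal digits, evaluate pi, and read off the last digit.

```python
# Minimal sketch, assuming the mpmath library: set a working precision
# in decimal digits and evaluate pi. Purely illustrative at toy scale --
# the record run needed dedicated software and ~1.5 PB of storage.
from mpmath import mp

digits = 100                 # significant decimal digits to compute
mp.dps = digits              # mpmath working precision

pi_str = str(mp.pi)          # "3.141592653589793238462643383279..."
print(pi_str)
print("last computed digit:", pi_str[-1])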
I usually celebrate achievements like this without reservation, but I just read an article about how much energy cryptocurrency mining consumes. Now I'm wondering whether the energy spent computing 202 trillion digits of pi could have gone to some other worthy purpose, like charging all of the EVs on the road.