I would dispute one claim in that article.
They say:
If pi were truly random, that would mean that the number sequence in pi would never repeat itself, and -- because pi is infinite -- it would contain all patterns in existence. Any word that you can think of, when encoded in numbers, would show up in pi, says Krzywinski. So would the entire works of Shakespeare, all possible misprints and permutations of Shakespeare, and even, if you were patient enough, **pi itself**. As Cornell mathematician Steven Strogatz writes for The New Yorker, pi is so special in part because it "puts infinity within reach."
I added the bold.
If the digits of pi were contained somewhere within pi, starting at the (N+1)st digit for some positive integer N, then the digit at position P+N would equal the digit at position P, for every position P. In particular, the sequence consisting of the first N digits would repeat itself starting at digit N+1, i.e., in the range {N+1, 2N}; and applying the same shift over and over, that block would also occupy {2N+1, 3N}, {3N+1, 4N}, and in fact {(m-1)N+1, mN} for every positive integer m.
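To make the "repeats in every block" step explicit, here is the same claim written as a one-line induction (the notation d_P for the P-th digit of pi's digit string, with d_1 = 3, is mine):

\[
d_{P+N} = d_P \ \text{for all } P \ge 1
\quad\Longrightarrow\quad
d_{P+mN} = d_P \ \text{for all } P \ge 1,\ m \ge 0,
\]

since increasing the index by N, m times in a row, never changes the digit.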
Therefore, the first N digits of pi would repeat forever, making pi a repeating decimal: its value would be ten times the integer formed by its first N digits (written without the decimal point), divided by 10^N - 1. That would make pi rational. Since pi is irrational, we've reached a contradiction, so the original statement must be false.
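For completeness, here is a sketch of where that closed form comes from, assuming the digit string is periodic with period N from the first digit on (A denotes the integer formed by the first N digits d_1 d_2 ... d_N; again, the notation is mine):

\[
\pi \;=\; \sum_{k \ge 1} d_k\,10^{1-k}
\;=\; \sum_{m=0}^{\infty}\sum_{j=1}^{N} d_j\,10^{1-j-mN}
\;=\; \frac{A}{10^{\,N-1}} \cdot \sum_{m=0}^{\infty} 10^{-mN}
\;=\; \frac{A}{10^{\,N-1}} \cdot \frac{10^{N}}{10^{N}-1}
\;=\; \frac{10A}{10^{N}-1},
\]

a ratio of two integers, hence rational.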