Originally defined as the ratio of a circle's circumference to its diameter, pi — written as the Greek letter π — appears throughout mathematics, including in areas completely unconnected to circles, such as chemistry, the physical sciences and medicine.
Pi belongs to a huge mathematical group called irrational numbers, whose decimal expansions go on forever without repeating and which cannot be written as fractions. Scientists have calculated pi to 105 trillion digits, although most of us are more familiar with the approximation 3.14. But how do we know that pi is an irrational number?
Rational numbers, which make up the majority of numbers we use in day-to-day life (although only a vanishingly small share of all real numbers), can be written as one whole number divided by another. Pi, with its complicated string of decimals, certainly doesn't appear to belong to this group at first glance.
“Rationality is the practical property of having access to the number explicitly, i.e. without any approximation … so being able to write the number in a finite amount of symbols,” Wadim Zudilin, a mathematician at Radboud University in the Netherlands, told Live Science.
However, actually proving that you can’t write pi as a fraction is a surprisingly knotty issue. Mathematicians don’t have a universal method to show that a particular number is irrational, so they must develop a different proof for each case, explained Keith Conrad, a mathematician at the University of Connecticut. “How do you know a number is not a fraction?” he said. “You’re trying to verify a negative property.”
Despite this difficulty, over the past 300 years, mathematicians have established different proofs of pi’s irrationality, using techniques from across mathematics. Each of these arguments begins with the assumption that pi is rational, written in the form of an equation. Through a series of manipulations and deductions about the properties of the unknown values in this equation, it subsequently becomes clear that the math contradicts this original assertion, leading to the conclusion that pi must be irrational.
The specific math involved is often incredibly complex, typically requiring a university-level understanding of calculus, trigonometry and infinite series. However, each approach relies on this central idea of proof by contradiction.
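To make the template concrete, here is a compressed sketch of Niven's well-known 1947 proof by contradiction. It is one of the shorter arguments of this kind; the article's sources do not single it out, so it is offered purely as an illustration:

```latex
\text{Suppose, for contradiction, that } \pi = \tfrac{a}{b}
\text{ for positive integers } a, b. \text{ For each } n, \text{ define}
\[
  f(x) = \frac{x^n (a - bx)^n}{n!}, \qquad
  I_n = \int_0^{\pi} f(x) \sin x \, dx .
\]
\text{Repeated integration by parts shows that each } I_n
\text{ is a positive integer, yet}
\[
  0 < I_n \le \frac{\pi \, (\pi a)^n}{n!} \longrightarrow 0
  \quad \text{as } n \to \infty ,
\]
\text{so for large enough } n \text{ there would be an integer strictly
between } 0 \text{ and } 1, \text{ which is impossible.}
```

The assumption that pi is rational is used to build the function f, whose integral is forced into an impossible position, exactly the contradiction structure described above.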
“There are proofs using calculus and trigonometric functions,” Conrad said. “In some of them, π is singled out as the first positive solution to sin(x) = 0. The first proof by Lambert in the 1760s used a piece of mathematics called infinite continued fractions — it’s a kind of infinitely nested fraction.”
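Lambert's full argument relies on the continued fraction for tan(x), which is beyond a short example, but the object itself is easy to play with. As an illustrative sketch (not Lambert's proof), the code below computes the opening terms of a number's simple continued fraction: for any rational number the expansion terminates, while pi's continues forever, beginning [3; 7, 15, 1, 292, ...].

```python
from fractions import Fraction
import math

def continued_fraction(x, max_terms):
    """First terms of the simple continued fraction [a0; a1, a2, ...] of x.

    Uses exact Fraction arithmetic so no rounding creeps in.
    """
    terms = []
    x = Fraction(x)
    for _ in range(max_terms):
        a = math.floor(x)
        terms.append(a)
        if x == a:           # rational input: the expansion terminates
            break
        x = 1 / (x - a)      # invert the fractional part and repeat
    return terms

# A rational number has a finite expansion:
print(continued_fraction(Fraction(355, 113), 10))  # [3, 7, 16]

# Pi's expansion never ends; the double-precision value of pi
# reproduces the true opening terms:
print(continued_fraction(math.pi, 6))  # [3, 7, 15, 1, 292, 1]
```

The rational number 355/113 is a famous approximation of pi precisely because its short expansion matches the start of pi's infinite one.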
However, rather than proving pi is irrational directly, it's also possible to confirm irrationality using a different property of the number. Pi belongs to another numerical group called transcendental numbers: numbers that are not algebraic, meaning they are not the root of any polynomial equation with integer coefficients. Because every transcendental number is irrational, any proof showing that pi is transcendental also proves that pi is irrational.
“Using calculus with complex numbers, you can prove π is transcendental,” Conrad said. “The proof uses the very famous equation called Euler’s identity: e^(iπ) + 1 = 0.”
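The transcendence proof itself is far beyond a code snippet, but the identity it rests on is easy to check numerically. This is a sanity check, not a proof:

```python
import cmath

# Euler's identity: e^(i*pi) + 1 = 0.
# Floating-point arithmetic leaves a tiny residue of rounding
# error rather than an exact zero.
z = cmath.exp(1j * cmath.pi) + 1
print(abs(z) < 1e-12)  # True
```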
Although pi’s universal importance may arise from this intangible irrationality, seven or eight decimal places is usually more than sufficient for any real-world applications. Even NASA uses only 16 digits of pi for its calculations.
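To see why a handful of digits suffices, consider the error from truncating pi to seven decimal places in a planet-scale calculation. The Earth radius below is an illustrative figure, not from the article:

```python
import math

EARTH_RADIUS_M = 6.378e6  # equatorial radius, roughly (assumed value)

full = 2 * math.pi * EARTH_RADIUS_M      # circumference with full precision
approx = 2 * 3.1415926 * EARTH_RADIUS_M  # circumference with 7-decimal pi

# The discrepancy over a circumference of roughly 40,000 km
# comes out to well under a metre.
print(full - approx)
```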
“We approximate the value for practical purposes, 3.1415926 — that’s already a lot of information!” Zudilin said. “But of course in mathematics, it’s not satisfactory. We care about the nature of the numbers.”