A quick search through 1/1 - 100,000/100,000 shows no other interesting values. My algorithm is extremely naive, though (brute force); a faster search could potentially find other high-quality values much more quickly.
Update - A slightly optimized algorithm, and nothing special found with denominators up to the hundreds of millions. I know this means nothing wrt actual mathematics though...
Judging by your output, you're performing some redundant checks: 355/113 == 710/226. I'm not sure how you're performing the search, but removing those may speed things up.
I think the cost of removing them (checking every value to see if it's a multiple of another) is greater than the benefit of removing them. My search terminated though (with no interesting finds) - my fractional values were getting too small and I got a ZeroDivisionError.
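For what it's worth, if the search iterates over denominators directly, skipping non-reduced fractions is a cheap gcd test rather than a scan over earlier values. A minimal sketch in Python (the cutoff of 1,000 is just for illustration):

```python
from math import gcd, pi

# Sketch: for each denominator q keep only the nearest numerator,
# and skip non-reduced fractions such as 710/226 (a duplicate of 355/113).
records = []                 # best-so-far fractions, in order of discovery
best = float("inf")
for q in range(1, 1000):
    p = round(pi * q)        # nearest numerator for this denominator
    if gcd(p, q) != 1:       # duplicate of a fraction with a smaller q
        continue
    err = abs(pi - p / q)
    if err < best:
        best = err
        records.append((p, q))
print(records)
```

Both 22/7 and 355/113 show up as record-setters, and no denominator is ever tested twice for the same value.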
Reading about Farey sequences led me to http://en.wikipedia.org/wiki/Stern%E2%80%93Brocot_tree. This could be used to perform a search of the space of rational numbers with denominator < threshold with each number examined only once.
Neat, that was in a tab I hadn't reached yet. For the purposes of this sort of search, though, the Stern-Brocot tree has the nice property of being a binary search tree, allowing you to constrain the breadth as well as the depth of the search.
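A sketch of what that descent might look like in Python: walk the tree toward π, yielding each mediant visited, and stop once the denominator passes a cutoff (the target and the cutoff of 200 here are illustrative). The binary-search property means each rational on the path is examined only once:

```python
from fractions import Fraction
from math import pi

def stern_brocot_path(target, max_den):
    """Walk the Stern-Brocot tree toward `target`, yielding each
    mediant visited, until the denominator exceeds `max_den`."""
    lo_n, lo_d = 0, 1            # left fence 0/1
    hi_n, hi_d = 1, 0            # right fence 1/0 ("infinity")
    while True:
        med = Fraction(lo_n + hi_n, lo_d + hi_d)   # mediant of the fences
        if med.denominator > max_den:
            return
        yield med
        if med < target:         # target lies in the right subtree
            lo_n, lo_d = med.numerator, med.denominator
        else:                    # target lies in the left subtree
            hi_n, hi_d = med.numerator, med.denominator

path = list(stern_brocot_path(pi, 200))
print(path[-1])   # deepest node on the path toward pi
```

With a denominator cutoff of 200 the path ends at 355/113, having passed through 22/7 along the way.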
local max = 100000000
local c = {}

-- Pass 1: collect numerators i that are unusually close to a
-- multiple of pi (|cos(i)| near 1 means i is near k*pi).
local min, j = math.huge
for i = 1, max do
  local a = math.pi - math.abs(math.cos(i))
  if a < min then
    min = a
    j = i
    table.insert(c, j)
  end
end

-- Pass 2: for each candidate numerator, find the denominator that
-- brings the fraction closest to pi, printing each improvement.
local min, j = math.huge
for k = 1, #c do
  for i = 1, max do
    local a = math.abs(math.pi - c[k] / i)
    if a < min then
      min = a
      j = i
      print(c[k], j, min)
    end
  end
end
The numbers you're producing can be generated more directly by the following Python code:
from fractions import Fraction as frac
from decimal import Decimal

def continued_fraction(x, n=10):
    """Represent x as a continued fraction out to n places."""
    last = int(x)
    out = []
    for i in range(n):
        x = 1 / (x - last)
        last = int(x)
        out.append(last)
    return out

def bras(x, n=10):
    """Compute the best rational approximations to x."""
    base = int(x)
    nums = continued_fraction(x, n)
    S = lambda depth, i = 0: \
        frac(1, nums[i]) if depth == 0 else 1 / (nums[i] + S(depth - 1, i + 1))
    return ', '.join(str(x) for x in (base,) + tuple(base + S(k) for k in range(n)))

print(bras(Decimal('3.1415926535897932384626433832795028841971693993')))
# 3, 22/7, 333/106, 355/113, 103993/33102, 104348/33215, 208341/66317,
# 312689/99532, 833719/265381, 1146408/364913, 4272943/1360120
These are the best rational approximations to pi, which are determined by the continued fraction of pi: [3; 7, 15, 1, 292, 1, 1, 1, 2, 1, 3, 1, 14, ...]. Large terms are opportunities to "grab a lot of extra digits", and thus the high-quality approximations come when we truncate just before the large terms 7, 15, 292, and 14.
You may wonder what the "most irrational" number is, in the sense that all of its best rational approximations are of low quality. That distinction belongs to the golden ratio, φ = (1 + √5)/2.
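The continued fraction of φ is [1; 1, 1, 1, ...], so every partial quotient is as small as possible, and its convergents (which are ratios of consecutive Fibonacci numbers) converge as slowly as any number's can. A quick check using the standard convergent recurrence:

```python
from fractions import Fraction

def convergents(quotients):
    """Build the convergents of a continued fraction [a0; a1, a2, ...]
    via the recurrence p_k = a_k*p_{k-1} + p_{k-2} (same for q_k)."""
    p0, q0, p1, q1 = 1, 0, quotients[0], 1
    out = [Fraction(p1, q1)]
    for a in quotients[1:]:
        p0, q0, p1, q1 = p1, q1, a * p1 + p0, a * q1 + q0
        out.append(Fraction(p1, q1))
    return out

# phi = [1; 1, 1, 1, ...]: convergents are 1/1, 2/1, 3/2, 5/3, 8/5, ...
print(convergents([1] * 8))
```

Every convergent of φ is a ratio of consecutive Fibonacci numbers, which is why none of them is "unreasonably good" the way 355/113 is for pi.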
It's true, my code is longer because it computes an explicit continued fraction representation (and also because it's not written as two generator scripts).
It's not actually the difference to pi that matters, it's the "quality" of the approximation, as described in the article. You can always get better approximations by using larger denominators, it's just that some approximations are, given the size of the denominator, unreasonably good.
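One common way to make "quality" precise (the article may define it differently) is the exponent d for which |x - p/q| = 1/q^d, i.e. d = -log|x - p/q| / log q. Typical approximations sit near d = 2; 22/7 and 355/113 score well above that for pi:

```python
from math import log, pi

def quality(p, q, x=pi):
    """Quality exponent d such that |x - p/q| == 1/q**d.
    Values well above 2 mark unusually good approximations.
    (One common definition; the article's may differ.)"""
    return -log(abs(x - p / q)) / log(q)

for p, q in [(22, 7), (333, 106), (355, 113), (103993, 33102)]:
    print(f"{p}/{q}: quality {quality(p, q):.3f}")
```

This is why a huge denominator alone isn't interesting: 103993/33102 is far closer to pi in absolute terms than 22/7, but its quality is ordinary.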