Is there somebody who can find an error in this: (at the end of the book)
"The universe has a finite age, T, about 13.7 billion years. Because information cannot travel faster than the speed of light, c, our observable universe is limited to an apparent 13.7 billion light years, although the furthest objects we can see have since moved further away. Its mass is limited by the gravitational constant, G, to a value that prevents the universe from collapsing on itself.
A complete description of the universe could therefore consist of a description of the exact positions and velocities of a finite number (about 10^80) of particles. But quantum mechanics limits any combination of these two quantities to discrete multiples of Planck's constant, h. Therefore the universe, and everything in it, must have a finite description length. The entropy in nats (1 nat = 1/ln(2) bits = 1.4427 bits) is given by the Bekenstein bound as 1/4 of the area of the event horizon in Planck units of area hG/2πc^3, a square of 1.616 x 10^-35 meters on a side. For a sphere of radius Tc = 13.7 billion light years, the bound is 2.91 x 10^122 bits.
We now make two observations. First, if the universe were divided into regions the size of bits, then each volume would be about the size of a proton or neutron. This is rather remarkable because the number is derived only from T and the physical constants c, h, and G, which are unrelated to the properties of any particles. Second, if the universe were squashed flat, it would form a sheet about one neutron thick. Occam's Razor, which the computability of physics makes true, suggests that these two observations are not coincidences."
I'm not qualified enough to find the possible error but I consider that speculation intriguing!
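For what it's worth, the quoted figures are easy to sanity-check. Here is a rough back-of-the-envelope sketch in Python, assuming a static sphere of radius cT and rounded constants (my own calculation, not the book's derivation):

    # Rough check of the quoted numbers (my own sketch, not from the book).
    # Bekenstein-style bound: S = A / (4 * planck_area) nats for a sphere of
    # radius R = c*T, converted to bits with log2(e) = 1.4427.
    import math

    c = 2.998e8                    # speed of light, m/s
    G = 6.674e-11                  # gravitational constant, m^3 kg^-1 s^-2
    h = 6.626e-34                  # Planck's constant, J*s
    hbar = h / (2 * math.pi)

    T = 13.7e9 * 365.25 * 24 * 3600    # age of the universe, s
    R = c * T                          # light-travel radius, ~1.3e26 m

    planck_area = hbar * G / c**3      # (1.616e-35 m)^2
    A = 4 * math.pi * R**2             # surface area of the sphere, m^2

    S_nats = A / (4 * planck_area)       # entropy bound in nats
    S_bits = S_nats * math.log2(math.e)  # ~2.9e122 bits, as quoted

    # First "observation": the volume per bit is roughly nucleon-sized.
    V = (4 / 3) * math.pi * R**3
    cell = (V / S_bits) ** (1 / 3)     # ~3e-15 m, a few femtometers

    print(f"{S_bits:.2e} bits, bit-cell side ~{cell:.1e} m")

The ~2.9 x 10^122 bits and the few-femtometer bit cell both come out, so the arithmetic is at least internally consistent; whether the two "observations" mean anything is another matter.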
"According to calculations, the comoving distance (current proper distance) to particles from the CMBR, which represent the radius of the visible universe, is about 14.0 billion parsecs (about 45.7 billion light years), while the comoving distance to the edge of the observable universe is about 14.3 billion parsecs (about 46.6 billion light years),[1] about 2% larger."
I suppose he means that if, simplifying, you consider the universe to be one big 3-dimensional ball, then squashing it "vertically" would give a flat disc with the same diameter as the ball, about one neutron thick.
>But quantum mechanics limits any combination of these two quantities to discrete multiples of Planck's constant, h.
This sounds wrong to me. My understanding of QM is that the underlying physics is actually continuous (as opposed to quantized), and it is only when we attempt to make measurements that we run into Planck's constant.
For those who've never heard of the author, Matt Mahoney is the author of a number of compression algorithms (and implementations), including the famous PAQ family of arithmetic coding / context modeling compressors. PAQ and its (many) derivatives are still among the strongest available compressors to date (meaning that they compress very, very well — in general much more than popular compressors like 7-Zip or UHArc), while not being practical due to their incredible slowness.
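For anyone who hasn't looked inside PAQ: "context modeling" here means that several simple models each predict the probability of the next bit from their own context, and those predictions are combined before being handed to an arithmetic coder. A toy sketch of the mixing step (my own illustration of the general idea, not code from PAQ):

    # Toy illustration of logistic mixing of bit predictions (my own sketch,
    # not PAQ's actual code). Each model outputs P(next bit = 1); the mixer
    # averages the predictions in the logit domain.
    import math

    def stretch(p):            # probability -> logit
        return math.log(p / (1 - p))

    def squash(x):             # logit -> probability
        return 1 / (1 + math.exp(-x))

    def mix(predictions, weights):
        return squash(sum(w * stretch(p) for p, w in zip(predictions, weights)))

    # e.g. an order-0 model says 0.7, a longer-context model says 0.9
    print(mix([0.7, 0.9], [0.5, 0.5]))   # mixed estimate, ~0.82

Real PAQ versions run dozens of such models per bit and adapt the mixing weights after every bit, which is where both the compression strength and the slowness come from.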
Matt Mahoney has a lot of great information about compression!
The compression curiosities page has some nice gimmicks: a file that compresses well with one program but hardly at all with another; a tiny compressed file (115 bytes) that expands to 5 MB; compressed files that decompress to themselves.
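The tiny-file-to-megabytes trick is easy to reproduce with a stock compressor, by the way (just an illustration of the effect; the sizes won't match the 115-byte curiosity from the page):

    # Compress 5 MB of identical bytes; the output is a tiny fraction of the
    # input, and decompressing it blows back up to 5 MB.
    import lzma

    data = b"\x00" * 5_000_000
    packed = lzma.compress(data)
    print(len(packed), "bytes compressed ->", len(lzma.decompress(packed)), "bytes")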
Compression, on an intuitive level, can be applied to many concepts. Being a geneticist (more of a bioinformatician now), I sometimes view DNA as a compressed dataset.
Let's say we could somehow capture all the information that makes us who we are as individuals (map all the atoms in our bodies or something). All that information is algorithmically compressed into DNA. If the universe is a giant computer, DNA is the code and we are the output. Geneticists, in a sense, are just studying how this compression works.
"The universe has a finite age, T, about 13.7 billion years. Because information cannot travel faster than the speed of light, c, our observable universe is limited to an apparent 13.7 billion light years, although the furthest objects we can see have since moved further away. Its mass is limited by the gravitational constant, G, to a value that prevents the universe from collapsing on itself.
A complete description of the universe could therefore consist of a description of the exact positions and velocities of a finite number (about 10^80) of particles. But quantum mechanics limits any combination of these two quantities to discrete multiples of Planck's constant, h. Therefore the universe, and everything in it, must have a finite description length. The entropy in nats (1 nat = 1/ln(2) bits = 1.4427 bits) is given by the Bekenstein bound as 1/4 of the area of the event horizon in Planck units of area hG/2πc3, a square of 1.616 x 10^-35 meters on a side. For a sphere of radius Tc = 13.7 billion light years, the bound is 2.91 x 10^122 bits.
We now make two observations. First, if the universe were divided into regions the size of bits, then each volume would be about the size of a proton or neutron. This is rather remarkable because the number is derived only from T and the physical constants c, h, and G, which are unrelated to the properties of any particles. Second, if the universe were squashed flat, it would form a sheet about one neutron thick. Occam's Razor, which the computability of physics makes true, suggests that these two observations are not coincidences."
I'm not qualified enough to find the possible error but I consider that speculation intriguing!