Ninja Nichols

The discipline of programming

Why 256-bit Symmetric Keys are Enough

Recently Google announced that they are upgrading their SSL certificates from 1024-bit keys to 2048-bit keys. SSL uses asymmetric cryptography, which requires much larger keys than symmetric cryptography for equivalent security, and so increasingly sophisticated cracking hardware is forcing the move from 1024-bit to 2048-bit keys.

But what about our symmetric ciphers? Before AES we had 56-bit DES, which today can be brute-forced by students as an undergraduate cryptography class assignment. Do we have to be concerned that AES only supports keys up to 256 bits? Is Moore's Law going to necessitate a jump from 256-bit to 512-bit ciphers?

The answer is simply no.

In Applied Cryptography (pp. 157–8), Bruce Schneier argues there’s no reason to use anything larger than a 256-bit key for symmetric encryption. It’s also one of those rare arguments from the second law of thermodynamics that’s actually decent:

One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)

Given that k = 1.38×10⁻¹⁶ erg/Kelvin, and that the ambient temperature of the universe is 3.2 Kelvin, an ideal computer running at 3.2K would consume 4.4×10⁻¹⁶ ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.

Now, the annual energy output of our sun is about 1.21×10⁴¹ ergs. This is enough to power about 2.7×10⁵⁶ single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2¹⁹². Of course, it wouldn’t have the energy left over to perform any useful calculations with this counter.

But that’s just one star, and a measly one at that. A typical supernova releases something like 10⁵¹ ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.

These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.
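Schneier's arithmetic is easy to verify yourself. The short sketch below redoes the counting argument in Python, using only the constants quoted above (CGS units; the Boltzmann constant and the 3.2 K temperature are taken straight from the passage, not looked up independently):

```python
import math

# Constants as quoted in the passage above (CGS units).
k = 1.38e-16        # Boltzmann constant, erg/Kelvin
T = 3.2             # assumed ambient temperature of the universe, Kelvin
energy_per_bit = k * T          # minimum energy per bit flip: ~4.4e-16 erg

# One year of the sun's output, in ergs, as quoted.
sun_year = 1.21e41
flips = sun_year / energy_per_bit
print(f"bit flips per solar year: {flips:.2e}")                 # ~2.7e56
print(f"counter exhausted: {math.floor(math.log2(flips))} bits")  # 187

# The Dyson-sphere scenario: 32 years of lossless capture.
print(f"32 years: {math.floor(math.log2(32 * flips))} bits")      # 192

# A 1e51-erg supernova; the floor comes out near Schneier's ~219 figure.
flips_sn = 1e51 / energy_per_bit
print(f"supernova: ~{math.floor(math.log2(flips_sn))} bits")
```

The point of the exercise: even under these absurdly generous assumptions, the budget tops out far short of the 2²⁵⁶ states a 256-bit key space contains.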

Hardware AES: Windows vs. Linux

Intel’s newest chips include hardware AES acceleration (the AES-NI instructions). My favorite cross-platform encryption utility, TrueCrypt, recently added support for the new instructions (turn it on under Settings -> Performance in Windows or Settings -> Preferences -> Performance in Linux).

However, I noticed something interesting when comparing the benchmark performance in Windows 7 to that in Linux. It seems that Linux is significantly faster than Windows, even though the acceleration is hardware-based.
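Before blaming the operating system for the gap, it's worth confirming that the CPU actually exposes AES-NI to the benchmark. Here's a small hypothetical helper for the Linux side that looks for the `aes` flag in /proc/cpuinfo (on Windows, a tool like CPU-Z reports the same flag):

```python
# Hypothetical helper: check whether the CPU advertises the AES-NI
# instructions. Linux-only, since it parses /proc/cpuinfo; the path
# parameter exists so the parser can be exercised against a saved copy.
def has_aes_ni(cpuinfo_path="/proc/cpuinfo"):
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    # The flags line looks like "flags : fpu ... aes ..."
                    return "aes" in line.split(":", 1)[1].split()
    except OSError:
        pass
    return False

if __name__ == "__main__":
    print("AES-NI available:", has_aes_ni())
```

If this prints False, TrueCrypt is falling back to its software AES implementation and the benchmark comparison is meaningless.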

Each data point represents the average of three (3) runs. The operating systems used were Windows 7 Professional 64-bit, Arch Linux 64-bit (latest) and Ubuntu 10.10 64-bit. Tests were performed on a ThinkPad T510 with an Intel Core i5 M 560 processor and 4 GB of memory.

Block Size   Windows 7   Arch Linux   Ubuntu
1 MB         377 MB/s    1.5 GB/s     1.5 GB/s
5 MB         798 MB/s    1.6 GB/s     1.6 GB/s
50 MB        1.1 GB/s    1.7 GB/s     1.6 GB/s
200 MB       1.3 GB/s    1.6 GB/s     1.7 GB/s

Bottom line: If you plan to do a lot of encryption, Linux will give you noticeably better performance, assuming all your data is already in memory.