Information About Algorithms
In this tutorial, we frequently refer to specific cryptographic algorithms. This section gives you some background on the algorithms most frequently used in firewalls and network protocols, so that you can make some comparisons between them. It is by no means an exhaustive listing of the cryptographic algorithms you may encounter, or of everything that is interesting about the algorithms listed here.
Encryption Algorithms
These algorithms are designed to be used for encryption (reversibly obscuring information). As we've mentioned, it is often possible to use encryption algorithms for other purposes, and many of these algorithms are also used for digital signatures and/or for cryptographic hashing.
- Rivest, Shamir, and Adleman (RSA)
- RSA is a public key cipher that can use varying key sizes (which are theoretically unlimited). Typical key sizes are 512, 768, 1024, and 2048 bits. Because the algorithm is expensive to compute, the smaller key sizes tend to be seen in smart cards and smaller devices. A 1024-bit key is considered suitable both for generating digital signatures and for key exchange when used with bulk encryption. A 2048-bit key is sometimes used when a digital signature must be kept secure for an extended period of time. One such use is for a certificate authority key.
RSA was developed in 1978 by Ronald Rivest, Adi Shamir, and Leonard Adleman. Mathematically, it is arguably one of the simplest public key algorithms to understand and implement. The RSA algorithm obtains its strength from the belief that it is difficult to factor large numbers. The RSA algorithm is patented, although the last patent that is believed to cover the algorithm runs out on 20 September 2000. When implemented in software, RSA with 1024-bit keys is about 100 times slower than DES and gets much slower as the keys get larger.
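To make the mathematics concrete, here is a minimal sketch of textbook RSA with deliberately tiny primes. The numbers are purely illustrative; real implementations use primes hundreds of digits long, padding schemes, and a vetted cryptographic library.

```python
# Textbook RSA with toy numbers, for illustration only.
p, q = 61, 53                  # two (far too small) primes
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, chosen coprime to phi
d = pow(e, -1, phi)            # private exponent: 2753 (requires Python 3.8+)

m = 65                         # the "message", encoded as a number below n
c = pow(m, e, n)               # encryption: m^e mod n
assert pow(c, d, n) == m       # decryption: c^d mod n recovers m
```

The public key is the pair (n, e) and the private key is d; recovering d from the public key is believed to be as hard as factoring n, which is why the strength of an RSA key depends on the size of n.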
- The Data Encryption Standard (DES) and Triple DES
- DES is a symmetric 64-bit block cipher that uses a 56-bit key. It has been demonstrated[191] that it is possible, with a modest investment, to build a specialized piece of hardware that can break a 56-bit DES key in about 24 hours. A group of individuals using a large number of general-purpose computers was able to break a 56-bit DES key in about three months. This size key is probably too short for protecting anything of significant value.
[191]See the book Cracking DES: Secrets of Encryption Research, Wiretap Politics and Chip Design, by the Electronic Frontier Foundation (Anonymous, 1998).
Triple DES (3DES) is a way of combining three uses of single DES with two keys, making the key size 112 bits. Because of the increase in key size, 3DES is believed to be much more secure than ordinary DES. 3DES runs about three times slower than single DES.
The DES algorithm was developed by IBM in the 1970s and was standardized by the U.S. National Bureau of Standards (the organization is now called the National Institute of Standards and Technology, or NIST). It was intended for encrypting unclassified data. Some of the development process for this algorithm is shrouded in mystery, and the NSA had input into the design process that resulted in a number of modifications. The fact that nobody has been able to show that DES has significant weaknesses suggests that the influence of the NSA actually increased the strength of the cipher. It is not clear that the designers intended for the algorithm to be released to the public.
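The two-key construction is encrypt-decrypt-encrypt (EDE): the data is encrypted with the first key, decrypted with the second, and encrypted with the first key again. The sketch below illustrates this; it assumes the third-party pycryptodome package, and the key values and the use of ECB mode are purely illustrative.

```python
# Two-key triple DES built from three single-DES operations (EDE), assuming
# the pycryptodome package is installed (pip install pycryptodome).
from Crypto.Cipher import DES, DES3

k1 = bytes.fromhex("0123456789abcdef")   # first 56-bit key (with parity bits)
k2 = bytes.fromhex("fedcba9876543210")   # second 56-bit key (with parity bits)
block = b"8 bytes!"                      # DES works on 64-bit blocks

step1 = DES.new(k1, DES.MODE_ECB).encrypt(block)    # encrypt with key 1
step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)    # decrypt with key 2
ede = DES.new(k1, DES.MODE_ECB).encrypt(step2)      # encrypt with key 1 again

# A 16-byte 3DES key is interpreted as the same two-key EDE construction.
assert DES3.new(k1 + k2, DES3.MODE_ECB).encrypt(block) == ede
```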
- RC2 and RC4
- RC2 is a variable key length symmetric 64-bit block cipher; RC4 is a variable key length stream cipher. The key size for either algorithm can be anywhere from 1 to 2048 bits long. The algorithms are typically used with 40-bit and 128-bit keys. A 40-bit key is too small to protect anything of any value but is widely used due to previous U.S. export rules, which prevented the export of products using longer keys.
These algorithms were developed by Ronald Rivest and are trade secrets of RSA Data Security. The RC4 algorithm was disclosed by a 1994 anonymous Usenet posting; RC2 met the same fate in 1996. Both algorithms seem to be reasonably strong. When implemented in software they are about ten times faster than DES.
- Skipjack
- Skipjack is a symmetric 64-bit block cipher. The key size is 80 bits long. Skipjack was originally developed as part of a U.S. government encryption standard that was designed to make it easy for law enforcement agencies to decrypt data (which requires using Skipjack in conjunction with a protocol called the Key Exchange Algorithm, or KEA).
Skipjack was initially available only in a hardware implementation called the Clipper chip. The algorithm was declassified in 1998 and can now be implemented in software (in this form, it does not necessarily include the provisions for law enforcement access). Research into the strength of Skipjack itself is inconclusive, but it has been shown that a slightly modified version of Skipjack can be broken without using an exhaustive key search. Skipjack does not seem to be a very popular algorithm to implement, possibly because of its history, and as such there is little comparative timing data.
- International Data Encryption Algorithm (IDEA)
- IDEA is a symmetric 64-bit block cipher that uses a 128-bit key.
IDEA was invented by Xuejia Lai and James Massey and released in 1992. It has undergone extensive cryptanalysis and seems to be a strong cipher. The IDEA algorithm is patented in Europe and the United States and must be licensed for commercial use. IDEA is the symmetric block cipher used in Phil Zimmermann's original PGP, one of the most widespread programs for exchanging arbitrary encrypted data across the Internet. Its 128-bit key size is not a cause for concern. When implemented in software, IDEA runs at about half the speed of DES, but one and a half times the speed of 3DES.
- Blowfish
- Blowfish is a symmetric 64-bit block cipher with a variable-length key. The key can be from 32 to 448 bits in size.
Blowfish was invented by Bruce Schneier and released in 1994. The algorithm appears to be strong. It is designed to be used on 32-bit microprocessors using simple mathematical operations. It does have larger memory requirements than other algorithms, which makes it less attractive for use in smart cards and other small devices. Blowfish is not patented, and implementations in C are in the public domain. When implemented in software, Blowfish runs at about five times the speed of 3DES.
- Advanced Encryption Standard (AES)
- The Advanced Encryption Standard is being developed by the U.S. NIST, with the stated aim of choosing "a Crypto Algorithm for the Twenty-First Century". AES is intended to replace DES as a U.S. government standard. Having learned from the problems with DES, NIST has chosen to use a public process to develop the standard and to include algorithms designed outside the United States. At the time of writing, a standard has not yet been set, but the following five algorithms are being considered: MARS, RC6, Rijndael, Serpent, and Twofish. Comparison data for all of these algorithms is available from NIST; they all appear to be strong algorithms. In order to meet the requirements of the standard, they are all 128-bit block ciphers, and they all support 128-, 192-, and 256-bit keys.
Digital Signature Algorithms
Digital signature algorithms were discussed earlier; they provide a way to combine public key encryption and cryptographic checksums so that a piece of information is attached to a specific identity:
- Rivest, Shamir, and Adleman (RSA)
- See the discussion earlier, in the "Encryption Algorithms" section.
- Digital Signature Algorithm (DSA) and the Digital Signature Standard (DSS)
- The DSA is a public key algorithm that can be used to generate digital signatures. The key size is between 512 and 1024 bits (in 64-bit increments). A key size of 512 bits is too small for long-term security, but 1024 is acceptable.
The DSS is a U.S. NIST standard, issued in 1994, that standardizes the use of DSA; in an official sense, the DSA and the DSS are separate objects (one of them is an algorithm, and the other is an official U.S. government document), but in practice, the terms are often used interchangeably. The DSA is between 10 and 40 times slower to verify signatures than RSA. Some elements of the design of the DSA make it difficult to use for encryption, but it is possible with a full implementation of the algorithm.
The patent situation in regard to implementations using the DSA is unclear; there is some possibility that parts of it are patented and thus cannot be used without paying license fees. The U.S. government has indicated that they would indemnify companies that were sued by possible patent holders, but only if they were implementing the DSS as part of a government contract.
- Elliptic Curve
- Elliptic curve algorithms are discussed later, in the section on key exchange. They can also be used for digital signatures.
Cryptographic Hashes and Message Digests
Cryptographic hashes and message digests were discussed earlier; they are designed to take a long piece of data and generate a shorter value, in a way that makes it easy to detect changes to the long piece of data:
- MD4
- MD4 is a cryptographic hash algorithm that calculates a 128-bit number from an input of any length.
This algorithm was designed by Ronald Rivest and released in 1990 as RFC 1186. In 1996 some significant flaws in MD4 were discovered. As a result, MD4 should not be used.
- MD5
- MD5 is a cryptographic hash algorithm that calculates a 128-bit number from an input of any length.
This algorithm was designed by Ronald Rivest as an improvement to MD4 and was released in 1992 as RFC 1321. Research on MD5 has indicated that one of the design goals of a cryptographic hash (collision resistance) has been violated. A general way to generate collisions has not been found, but the research is troubling. Current cryptographic wisdom also suggests that the output of a cryptographic hash function should be at least 160 bits long in order to be resistant to birthday[192] attacks. Given these issues, MD5 should probably be avoided in situations where long-term digital signatures are required.
[192]A birthday attack is based on probability and is related to the question: How many people have to be in a room for there to be two people with the same birthday? The surprising answer to this is that there is more than a 50 percent chance of having two people with the same birthday if there are at least 23 people in the room. This footnote is too small to contain an explanation, but we encourage you to look this up in any good tutorial on probability.
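As a quick check of the footnote's claim, here is a minimal computation of the collision probability; it is only a sketch, assuming 365 equally likely birthdays and ignoring leap years:

```python
# Probability that at least two of n people share a birthday.
def collision_probability(n: int, days: int = 365) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

print(round(collision_probability(22), 3))   # about 0.476
print(round(collision_probability(23), 3))   # about 0.507 -- just past 50 percent
```

The same reasoning is why a hash with an n-bit output offers only about n/2 bits of resistance to collision attacks, which is the motivation for preferring 160-bit hashes over 128-bit ones.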
- SHA and SHA-1
- SHA and SHA-1 are cryptographic hash algorithms that calculate a 160-bit number from an input of any length.
The SHA algorithm is a U.S. NIST standard and was first issued in 1992 for use with the DSA. In 1995 NIST released a technical update to SHA called SHA-1. This update supersedes the previous version and is thought to address a collision problem similar to the one discussed previously in MD5. SHA-1 should now be used instead of SHA. As this algorithm calculates a 160-bit value, it is more resistant to birthday attacks.
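For comparison, here is a short sketch using Python's standard hashlib module that shows the difference in output size between MD5 and SHA-1 (the input string is arbitrary):

```python
import hashlib

data = b"any input, of any length"
md5_digest = hashlib.md5(data).digest()
sha1_digest = hashlib.sha1(data).digest()

print(len(md5_digest) * 8)    # 128 bits
print(len(sha1_digest) * 8)   # 160 bits
```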
- HMAC
- HMAC is a method for combining a cryptographic hash algorithm with a key. It will work with any cryptographic hash that produces at least 128 bits of output. HMAC is described in RFC 2104.
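The construction in RFC 2104 is H((K XOR opad) || H((K XOR ipad) || message)). Below is a minimal sketch of that construction using SHA-1, checked against Python's standard hmac module; the key and message values are arbitrary.

```python
import hashlib
import hmac

def hmac_sha1(key: bytes, message: bytes) -> bytes:
    """HMAC per RFC 2104: H((K ^ opad) || H((K ^ ipad) || message))."""
    block_size = 64                              # SHA-1 block size in bytes
    if len(key) > block_size:
        key = hashlib.sha1(key).digest()         # long keys are hashed first
    key = key.ljust(block_size, b"\x00")         # then padded to the block size
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5C for b in key)
    inner = hashlib.sha1(ipad + message).digest()
    return hashlib.sha1(opad + inner).digest()

key, msg = b"shared secret", b"message to authenticate"
assert hmac_sha1(key, msg) == hmac.new(key, msg, hashlib.sha1).digest()
```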
Key Exchange
Key exchange algorithms are used to allow two parties to agree on a shared secret across an unsecured network. They are occasionally more correctly called key agreement algorithms:
- Diffie-Hellman
- Diffie-Hellman is a key exchange algorithm that can use varying key sizes (theoretically unlimited).
This algorithm was invented by Whitfield Diffie and Martin Hellman in 1976. It uses exponentiation and modular arithmetic as the basis of its calculations; this is known to be secure, but it involves very large numbers and relatively slow calculations. One of the most important features of Diffie-Hellman is that it can be used to generate a secret that has perfect forward secrecy. Diffie-Hellman was patented, but the patent expired in 1997 so the algorithm can now be freely used.
Diffie-Hellman is a pure key exchange algorithm, which cannot be used to encrypt data. People who claim to be using Diffie-Hellman to encrypt data are extremely confused (unfortunately, because Diffie-Hellman is commonly used for key exchange with bulk encryption schemes, such people are not as rare as one would hope).
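Here is a minimal sketch of the exchange itself, with deliberately tiny, purely illustrative numbers; real use requires a prime of at least 1024 bits and randomly chosen secret exponents.

```python
# Toy Diffie-Hellman key agreement. p and g are public; a and b are secrets.
p = 23                         # public prime modulus (absurdly small here)
g = 5                          # public generator
a = 6                          # Alice's secret exponent
b = 15                         # Bob's secret exponent

A = pow(g, a, p)               # Alice sends A = g^a mod p
B = pow(g, b, p)               # Bob sends B = g^b mod p

alice_secret = pow(B, a, p)    # Alice computes (g^b)^a mod p
bob_secret = pow(A, b, p)      # Bob computes (g^a)^b mod p
assert alice_secret == bob_secret   # both arrive at the same shared secret: 2
```

Note that only A and B cross the network; the shared secret itself is never transmitted, and it can then be used as (or to derive) a key for a separate bulk encryption algorithm, which is exactly the division of labor described above.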
- Elliptic Curve
- A new class of algorithms, called elliptic curve algorithms, is based on the mathematics of polynomial equations. (Ellipses are related to elliptic curve algorithms extremely indirectly; elliptic curves use the kinds of polynomials that are also used to calculate facts about ellipses.) Elliptic curve cryptography uses very simple calculations and very complex mathematics (unlike Diffie-Hellman, which uses complicated calculations and elegant mathematics), and the resulting keys are faster to generate and smaller than Diffie-Hellman keys at the same apparent level of security.
Elliptic curve algorithms are much newer than Diffie-Hellman, which gives them two significant drawbacks. First, the mathematical foundations, including the cryptographic strengths and weaknesses, are not as well understood. Second, elliptic curve algorithms are still subject to patent protection (and there are significant arguments about who owns what patents). As of this writing, elliptic curve algorithms are still considered a relatively risky choice; this may change as more mathematical results are found (it is beginning to look as if they may become a widely accepted choice, but it is also possible that they will become unusable, if a general way is found to solve the underlying problem that gives them their strength).
- RSA
- RSA, the all-purpose cryptographic algorithm, can also be used for key exchange. Like Diffie-Hellman, it is secure but slow and memory-intensive for this purpose.
Key Sizes and Strength
Table C-1 gives our recommendations for acceptable algorithm types and key lengths. This sort of information is volatile: weaknesses are continually being discovered in algorithms, new algorithms are being developed, and both the speed and memory capacity of computers are increasing all the time. However, these are what we were willing to use at the time this tutorial was published. We don't think it will ever be a good idea to use these algorithms with shorter keys than those shown.
Table C-1. Acceptable Cryptographic Algorithm and Key Lengths
| Purpose | Size (in bits) | Acceptable Algorithms |
| --- | --- | --- |
| Symmetric encryption | 128 | IDEA |
| Symmetric encryption | 112 | 3DES |
| Cryptographic hashes | 160 | SHA-1 |
| Cryptographic hashes | 128 | MD5 |
| Key exchange | 1400 | Diffie-Hellman |
| Key exchange | 1024 | RSA |
| Digital signatures | 1024 | RSA |
Evaluating Other Algorithms
Evaluating the strength of a cryptographic algorithm can be extremely difficult. It's not unusual for people to find problems with algorithms that have been examined before by multiple professional cryptographers. However, this sort of analysis is needed only for new cryptographic algorithms. In general, a reasonably educated and suspicious person can do an adequate job of figuring out whether a cryptographic product is appropriately secure without delving into any of the details of the algorithms involved. A good resource is the "Snake Oil FAQ", published regularly on the sci.crypt newsgroup.
In fact, in most cases, all you need is the suspicion. Cryptography is a difficult business: it's hard to come up with good cryptographic algorithms; there are trade-offs between the speed of an algorithm, the memory requirements of an algorithm, and the strength of an algorithm; and no algorithm is perfectly unbreakable. Therefore, any product that advertises a magic new algorithm that runs really fast on small devices and can never be broken is at best over-optimistic and at worst fraudulent.
If you need to evaluate an algorithm, here are some questions you should ask:
- Has the algorithm been published? If not, has it been independently evaluated by multiple professional cryptographers? Independent evaluation is absolutely required to be sure that an algorithm is strong (algorithms are like arguments; everybody finds their own unassailable). Furthermore, a strong algorithm is not weakened by being made public. You should be suspicious of unpublished algorithms, and you should not accept algorithms unless there are independent evaluations of them.
- Is the algorithm being used for its intended purpose? It is possible to use hashing algorithms for encryption, and vice versa, but most algorithms work best when used for the purpose they were designed for.
- Is the algorithm being used exactly as designed? Apparently small changes in algorithms (optimizations in how values are calculated, changes in constants, differences in the values used to pad out odd-sized values to the block size needed for block-mode algorithms) can make big changes in the security of the algorithm. All such changes need independent evaluation.
- Are the key sizes being used acceptable? As we've discussed, key sizes mean different things to different algorithms. This makes it hard to decide what key length is long enough. On the other hand, you can often identify cases where a key is too short. It is extremely improbable that a 40-bit key, or even a 56-bit key, will ever again be long enough to protect data with a useful lifetime of more than a day. Present-day symmetric algorithms make maximum use of the available key length, and there is no such algorithm that will hold out against a determined attacker for more than a day with a 56-bit or shorter key (the rough arithmetic sketched after this list shows why short keys fall so quickly).
- How new is the technology? It takes several years to develop enough experience with new techniques to give cryptographers confidence in them.
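To put the key-size question above in perspective, here is a rough sketch of the brute-force arithmetic. The search rate of 10**12 keys per second is a hypothetical figure standing in for a determined attacker with specialized hardware, not a measured benchmark; the point is how the work grows with the key length.

```python
# Exhaustive-search arithmetic for various key lengths at an assumed rate.
ASSUMED_RATE = 10**12          # keys per second -- a hypothetical attacker

for bits in (40, 56, 112, 128):
    keyspace = 2 ** bits                       # number of possible keys
    seconds = keyspace / ASSUMED_RATE          # time to try every key
    print(f"{bits:3d}-bit key: {keyspace:.2e} keys, "
          f"{seconds / 86400:.2e} days to exhaust")
```

Each additional bit doubles the work, which is why the difference between a 56-bit key and a 112-bit or 128-bit key is not incremental but astronomical, regardless of the exact rate you assume.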