The expense of strong cryptography originally restricted its use to government and military agencies.[11] Until the middle of the 20th century, encryption required substantial human labor, and errors (which prevented decryption) were common, so only a small share of written information could be encrypted.[12] The US government, in particular, was able to keep a monopoly on the development and use of cryptography in the US into the 1960s.[13] In the 1970s, the increased availability of powerful computers and unclassified research breakthroughs (the Data Encryption Standard, and the Diffie-Hellman and RSA algorithms) made strong cryptography available for civilian use.[14] The mid-1990s saw the worldwide proliferation of knowledge and tools for strong cryptography.[15] By the 21st century the technical limitations were gone, although the majority of communication was still unencrypted.[16] At the same time, the cost of building and running systems with strong cryptography became roughly the same as that of weak cryptography.[17]
The use of computers changed the process of cryptanalysis, famously with Bletchley Park's Colossus. But just as the development of digital computers and electronics helped in cryptanalysis, it also made possible much more complex ciphers. It is typically the case that use of a quality cipher is very efficient, while breaking it requires an effort many orders of magnitude larger, making cryptanalysis so inefficient and impractical as to be effectively impossible.
The term "cryptographically strong" is often used to describe an encryption algorithm, and implies, in comparison to some other algorithm (which is thus cryptographically weak), greater resistance to attack. But it can also be used to describe hashing and unique-identifier and filename-creation algorithms. See, for example, the description of the Microsoft .NET runtime library function Path.GetRandomFileName.[18] In this usage, the term means "difficult to guess".
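As an illustration of "difficult to guess" in this sense, a random filename can be drawn from a cryptographically strong source. The sketch below uses Python's `secrets` module; the 11-character length and 36-character alphabet are arbitrary choices for illustration, not a reimplementation of the .NET method.

```python
import secrets
import string

def random_filename(length: int = 11,
                    alphabet: str = string.ascii_lowercase + string.digits) -> str:
    """Return a filename drawn from a cryptographically strong RNG.

    With a 36-character alphabet and length 11 there are 36**11
    (roughly 1.3e17) possible names, making any one hard to guess.
    """
    return "".join(secrets.choice(alphabet) for _ in range(length))

name = random_filename()
print(name)
```

Using `secrets.choice` rather than `random.choice` is the point of the example: the former draws from the operating system's entropy source, so an observer cannot predict the next name from previous ones.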
An encryption algorithm is intended to be unbreakable (in which case it is as strong as it can ever be), but might be breakable (in which case it is as weak as it can ever be) so there is not, in principle, a continuum of strength as the idiom would seem to imply: Algorithm A is stronger than Algorithm B which is stronger than Algorithm C, and so on. The situation is made more complex, and less subsumable into a single strength metric, by the fact that there are many types of cryptanalytic attack and that any given algorithm is likely to force the attacker to do more work to break it when using one attack than another.
There is only one known unbreakable cryptographic system, the one-time pad, which is not generally possible to use because of the difficulties involved in exchanging one-time pads without them being compromised. So any encryption algorithm can be compared to the perfect algorithm, the one-time pad.
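The one-time pad's operation can be sketched in a few lines: each message byte is XORed with a byte of truly random pad material that is as long as the message and never reused. The example below is a toy illustration, not a usable cryptosystem — as noted above, the hard part is exchanging the pad securely.

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # XOR each byte with the corresponding pad byte. The pad must be at
    # least as long as the data, truly random, and used only once.
    assert len(pad) >= len(data)
    return bytes(d ^ k for d, k in zip(data, pad))

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))   # one-time, random key material
ciphertext = otp_xor(message, pad)
recovered = otp_xor(ciphertext, pad)      # XOR is its own inverse
print(recovered == message)               # True
```

Because every possible plaintext of the same length corresponds to some pad, the ciphertext alone carries no information about the message — which is precisely why the scheme is unbreakable and why the pad itself becomes the distribution problem.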
The usual sense in which this term is (loosely) used is in reference to a particular attack, brute-force key search, especially in explanations for newcomers to the field. Indeed, with this attack (always assuming keys to have been randomly chosen), there is a continuum of resistance depending on the length of the key used. But even so, there are two major problems: many algorithms allow the use of different-length keys at different times, and any algorithm can forgo use of the full key length possible. Thus, Blowfish and RC5 are block cipher algorithms whose design specifically allowed for several key lengths, and which cannot therefore be said to have any particular strength with respect to brute-force key search. Furthermore, US export regulations restrict key length for exportable cryptographic products, and in several cases in the 1980s and 1990s (e.g., famously in the case of Lotus Notes' export approval) only partial keys were used, decreasing 'strength' against brute-force attack for those (export) versions. More or less the same thing happened outside the US as well, for example in the case of more than one of the cryptographic algorithms in the GSM cellular telephone standard.
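The continuum of resistance against brute-force search grows exponentially with key length, which is why export-limited key sizes mattered so much. A minimal sketch of the work factor; the rate of 10^9 keys per second is an arbitrary assumption for illustration, not a claim about any real attacker:

```python
def years_to_search(key_bits: int, keys_per_second: float = 1e9) -> float:
    """Expected years to find a randomly chosen key by exhaustive search.

    On average, half of the 2**key_bits possible keys must be tried.
    """
    seconds = (2 ** key_bits) / 2 / keys_per_second
    return seconds / (365 * 24 * 3600)

for bits in (40, 56, 128):
    print(f"{bits}-bit key: about {years_to_search(bits):.2g} years")
```

Each added bit doubles the expected effort, so a 40-bit export key falls in minutes at this assumed rate while a 128-bit key remains far beyond reach — the asymmetry the surrounding text describes.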
The term is commonly used to convey that some algorithm is suitable for some task in cryptography or information security, but also resists cryptanalysis and has no, or fewer, security weaknesses. Such tasks are varied, and might include generating randomness, encrypting data, or providing a method to ensure data integrity.
"Cryptographically strong" would seem to mean that the described method has some kind of maturity, perhaps even approval for use against different kinds of systematic attacks in theory and/or practice; indeed, that the method may resist those attacks long enough to protect the information carried (and what stands behind the information) for a useful length of time. But due to the complexity and subtlety of the field, this is almost never the case. Since such assurances are not actually available in real practice, sleight of hand in language which implies that they are will generally be misleading.
There will always be uncertainty as advances (e.g., in cryptanalytic theory or merely affordable computer capacity) may reduce the effort needed to successfully use some attack method against an algorithm.
In addition, actual use of cryptographic algorithms requires their encapsulation in a cryptosystem, and doing so often introduces vulnerabilities which are not due to faults in an algorithm. For example, essentially all algorithms require random choice of keys, and any cryptosystem which does not provide such keys will be subject to attack regardless of any attack resistant qualities of the encryption algorithm(s) used.
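The point about random key choice can be illustrated in Python: the `secrets` module draws from the operating system's CSPRNG, while the general-purpose `random` module (a Mersenne Twister) is seedable and predictable from observed outputs, and would undermine an otherwise strong cipher. A sketch, with the seed value chosen arbitrarily:

```python
import secrets
import random

# Suitable for keys: bytes from the OS cryptographically secure RNG.
good_key = secrets.token_bytes(32)   # 256-bit key

# NOT suitable for keys: a deterministic PRNG. Anyone who learns or
# guesses the seed (1234 here) can reproduce every "key" it emits.
rng = random.Random(1234)
bad_key = rng.getrandbits(256).to_bytes(32, "big")

print(len(good_key), len(bad_key))
```

This is exactly the class of cryptosystem flaw described above: the cipher may be unbroken, yet the system falls because its keys were predictable.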
See also: Cryptography § Forced disclosure of encryption keys
Widespread use of encryption increases the costs of surveillance, so government policies aim to regulate the use of strong cryptography.[19] In the 2000s, the effect of encryption on surveillance capabilities was limited by the ever-increasing share of communications going through global social media platforms, which did not use strong encryption and provided governments with the requested data.[20] Murphy describes a legislative balance that needs to be struck: government powers broad enough to keep up with quickly evolving technology, yet sufficiently narrow for the public and overseeing agencies to understand the future use of the legislation.[21]
The initial response of the US government to the expanded availability of cryptography was to treat cryptographic research the same way atomic energy research is treated, i.e., as "born classified", with the government exercising legal control over the dissemination of research results. This was quickly found to be impossible, and the efforts shifted to control over deployment (export, as a prohibition on the deployment of cryptography within the US was not seriously considered).[22]
Main article: Export of cryptography from the United States
Export control in the US has historically used two tracks:[23] the munitions list, for military technology, and a separate regime for dual-use items.
Since the original applications of cryptography were almost exclusively military, it was placed on the munitions list. With the growth of civilian uses, dual-use cryptography came to be defined by cryptographic strength, with strong encryption remaining a munition in a way similar to guns (small arms are dual-use, while artillery is of purely military value).[24] This classification had obvious drawbacks: a major bank is arguably just as systemically important as a military installation,[25] and restrictions on publishing strong cryptography code ran against the First Amendment. So, after experimenting in 1993 with the Clipper chip (where the US government kept special decryption keys in escrow), in 1996 almost all cryptographic items were transferred to the Department of Commerce.[26]
The position of the EU, in comparison to the US, has always tilted more towards privacy. In particular, the EU rejected the key escrow idea as early as 1997. The European Union Agency for Cybersecurity (ENISA) holds the opinion that backdoors are not effective for legitimate surveillance, yet pose great danger to general digital security.[27]
The Five Eyes (post-Brexit) represent a group of states with similar views on the issues of security and privacy. The group might have enough heft to drive the global agenda on lawful interception. Its efforts are not entirely coordinated: for example, the 2019 demand that Facebook not implement end-to-end encryption was not supported by either Canada or New Zealand, and did not result in a regulation.[28]
In the 1990s, the President and government of Russia issued several decrees formally banning uncertified cryptosystems from use by government agencies. A 1995 presidential decree also attempted to ban individuals from producing and selling cryptography systems without an appropriate license, but it was not enforced in any way, as it was suspected of contradicting the Russian Constitution of 1993 and was not a law per se.[29][30][31][32] Decree No. 313, issued in 2012, further amended the previous ones, allowing the production and distribution of products with embedded cryptosystems without requiring a license as such, even though it declares some restrictions.[33][34] France had quite strict regulations in this field, but has relaxed them in recent years.
Examples that are not considered cryptographically strong include the Triple DES cipher, due to the Sweet32 vulnerability arising from its 64-bit block size.
Vagle 2015, p. 121. - Vagle, Jeffrey L. (2015). "Furtive Encryption: Power, Trusts, and the Constitutional Cost of Collective Surveillance". Indiana Law Journal. 90 (1). https://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=11134&context=ilj ↩
Vagle 2015, p. 113. - Vagle, Jeffrey L. (2015). "Furtive Encryption: Power, Trusts, and the Constitutional Cost of Collective Surveillance". Indiana Law Journal. 90 (1). https://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=11134&context=ilj ↩
Levy, Steven (12 July 1994). "Battle of the Clipper Chip". New York Times Magazine. pp. 44–51. /wiki/New_York_Times_Magazine ↩
"Encryption and Export Administration Regulations (EAR)". bis.doc.gov. Bureau of Industry and Security. Retrieved 24 June 2023. https://www.bis.doc.gov/index.php/policy-guidance/encryption ↩
Reinhold 1999, p. 3. - Reinhold, Arnold G. (September 17, 1999). Strong Cryptography The Global Tide of Change. Cato Institute Briefing Papers No. 51. Cato Institute. https://www.cato.org/briefing-paper/strong-cryptography-global-tide-change ↩
Schneier 1998, p. 2. - Schneier, Bruce (1998). "Security pitfalls in cryptography" (PDF). Retrieved 27 March 2024. http://www.madchat.fr/crypto/papers/pitfalls.pdf ↩
Schneier 1998, p. 3. - Schneier, Bruce (1998). "Security pitfalls in cryptography" (PDF). Retrieved 27 March 2024. http://www.madchat.fr/crypto/papers/pitfalls.pdf ↩
Schneier 1998, p. 4. - Schneier, Bruce (1998). "Security pitfalls in cryptography" (PDF). Retrieved 27 March 2024. http://www.madchat.fr/crypto/papers/pitfalls.pdf ↩
Vagle 2015, p. 110. - Vagle, Jeffrey L. (2015). "Furtive Encryption: Power, Trusts, and the Constitutional Cost of Collective Surveillance". Indiana Law Journal. 90 (1). https://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=11134&context=ilj ↩
Diffie & Landau 2007, p. 725. - Diffie, Whitfield; Landau, Susan (2007). "The export of cryptography in the 20th and the 21st centuries". The History of Information Security. Elsevier. pp. 725–736. doi:10.1016/b978-044451608-4/50027-4. ISBN 978-0-444-51608-4. https://doi.org/10.1016%2Fb978-044451608-4%2F50027-4 ↩
Vagle 2015, p. 109. - Vagle, Jeffrey L. (2015). "Furtive Encryption: Power, Trusts, and the Constitutional Cost of Collective Surveillance". Indiana Law Journal. 90 (1). https://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=11134&context=ilj ↩
Vagle 2015, p. 119. - Vagle, Jeffrey L. (2015). "Furtive Encryption: Power, Trusts, and the Constitutional Cost of Collective Surveillance". Indiana Law Journal. 90 (1). https://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=11134&context=ilj ↩
Diffie & Landau 2007, p. 731. - Diffie, Whitfield; Landau, Susan (2007). "The export of cryptography in the 20th and the 21st centuries". The History of Information Security. Elsevier. pp. 725–736. doi:10.1016/b978-044451608-4/50027-4. ISBN 978-0-444-51608-4. https://doi.org/10.1016%2Fb978-044451608-4%2F50027-4 ↩
"Path.GetRandomFileName Method (System.IO)". Microsoft. http://msdn.microsoft.com/en-us/library/system.io.path.getrandomfilename.aspx ↩
Riebe et al. 2022, p. 42. - Riebe, Thea; Kühn, Philipp; Imperatori, Philipp; Reuter, Christian (2022-02-26). "U.S. Security Policy: The Dual-Use Regulation of Cryptography and its Effects on Surveillance" (PDF). European Journal for Security Research. 7 (1). Springer Science and Business Media LLC: 39–65. doi:10.1007/s41125-022-00080-0. ISSN 2365-0931. https://link.springer.com/content/pdf/10.1007/s41125-022-00080-0.pdf?pdf=button ↩
Riebe et al. 2022, p. 58. - Riebe, Thea; Kühn, Philipp; Imperatori, Philipp; Reuter, Christian (2022-02-26). "U.S. Security Policy: The Dual-Use Regulation of Cryptography and its Effects on Surveillance" (PDF). European Journal for Security Research. 7 (1). Springer Science and Business Media LLC: 39–65. doi:10.1007/s41125-022-00080-0. ISSN 2365-0931. https://link.springer.com/content/pdf/10.1007/s41125-022-00080-0.pdf?pdf=button ↩
Murphy 2020. - Murphy, Cian C (2020). "The Crypto-Wars myth: The reality of state access to encrypted communications". Common Law World Review. 49 (3–4). SAGE Publications: 245–261. doi:10.1177/1473779520980556. hdl:1983/3c40a9b4-4a96-4073-b204-2030170b2e63. ISSN 1473-7795. https://journals.sagepub.com/doi/10.1177/1473779520980556 ↩
Diffie & Landau 2007, p. 726. - Diffie, Whitfield; Landau, Susan (2007). "The export of cryptography in the 20th and the 21st centuries". The History of Information Security. Elsevier. pp. 725–736. doi:10.1016/b978-044451608-4/50027-4. ISBN 978-0-444-51608-4. https://doi.org/10.1016%2Fb978-044451608-4%2F50027-4 ↩
Diffie & Landau 2007, p. 727. - Diffie, Whitfield; Landau, Susan (2007). "The export of cryptography in the 20th and the 21st centuries". The History of Information Security. Elsevier. pp. 725–736. doi:10.1016/b978-044451608-4/50027-4. ISBN 978-0-444-51608-4. https://doi.org/10.1016%2Fb978-044451608-4%2F50027-4 ↩
Diffie & Landau 2007, p. 728. - Diffie, Whitfield; Landau, Susan (2007). "The export of cryptography in the 20th and the 21st centuries". The History of Information Security. Elsevier. pp. 725–736. doi:10.1016/b978-044451608-4/50027-4. ISBN 978-0-444-51608-4. https://doi.org/10.1016%2Fb978-044451608-4%2F50027-4 ↩
Diffie & Landau 2007, p. 730. - Diffie, Whitfield; Landau, Susan (2007). "The export of cryptography in the 20th and the 21st centuries". The History of Information Security. Elsevier. pp. 725–736. doi:10.1016/b978-044451608-4/50027-4. ISBN 978-0-444-51608-4. https://doi.org/10.1016%2Fb978-044451608-4%2F50027-4 ↩
Farber, Dave (1995-04-06). "A ban on cryptography in Russia (fwd) [Next .. djf]". Retrieved 2011-02-14. http://www.interesting-people.org/archives/interesting-people/199504/msg00018.html ↩
Antipov, Alexander. "Пресловутый указ №334 о запрете криптографии". www.securitylab.ru (in Russian). Retrieved 2020-09-21. https://www.securitylab.ru/informer/240707.php ↩
"Указ Президента Российской Федерации от 03.04.1995 г. № 334". Президент России (in Russian). Retrieved 2020-09-21. http://kremlin.ru/acts/bank/7701 ↩
The sources provided here are in Russian; to mitigate the lack of English-language sources, official government documents are cited. ↩
"Положение о лицензировании деятельности по разработке, производству, распространению шифровальных средств и систем". Российская газета (in Russian). Retrieved 2020-09-21. https://rg.ru/2012/04/24/shifry-site-dok.html ↩
"Миф №49 "В России запрещено использовать несертифицированные средства шифрования"". bankir.ru (in Russian). Retrieved 2020-09-21. http://bankir.ru/publikacii/20090714/mif-49-v-rossii-zaprescheno-ispolzovat-nesertificirovannie-sredstva-shifrovaniya-2228626/ ↩
Security Bulletin: Sweet32 vulnerability that impacts Triple DES cipher. IBM Security Bulletin, 2016. https://www.ibm.com/support/pages/security-bulletin-sweet32-vulnerability-impacts-triple-des-cipher-affects-communications-server-data-center-deployment-communications-server-aix-linux-linux-system-z-and-windows-cve-2016-2183 ↩