Bad Crypto 101: Spotting Red Flags

Estimated difficulty: 💜💜🤍🤍🤍

This post is heavily inspired by Simson Garfinkel’s A Field Guide to Spotting Bad Cryptography (2005) – link below – and aims to give you a quick, lightweight overview of things to look out for when you’re trying to assess the security of cryptography in the wild, without dissecting an algorithm.

If you’re new to cryptography, for specific definitions of some security services (and other cryptography terms), check out this post – which I’ll keep updated with terms as I publish more in this series.


Cryptography might seem intimidating, or like something you only see in movies, when the protagonist’s sidekick hacks the Pentagon in four minutes from a moving car during a high-speed chase, but cryptography is all around us. It’s used to secure communication channels, protect electronic payments or your home wireless network, and even to unlock your car.

So how can you tell good crypto from bad crypto? Being confident that crypto is good takes a team of experts, but there are a few standout issues that can tell you quickly that crypto is probably bad. This list isn’t exhaustive, but it covers some things to look out for. Leave a comment below if you can think of any other red flags! 😊

Assuming encryption provides total security
There are several security services cryptography can provide, and different cryptographic primitives and functions are used for different purposes. A common mistake is to assume that because something uses cryptography, it provides security services that it actually doesn’t. Examples include the incorrect ideas that cryptocurrency is anonymous, or that simply encrypting information stops it from being altered (i.e. that encryption on its own provides data integrity).
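
To make the “encryption doesn’t give you integrity” point concrete, here’s a minimal sketch in Python (assuming the third-party cryptography package; the message and amounts are made up) showing that AES in CTR mode will happily decrypt a ciphertext an attacker has tampered with:

```python
# Sketch only: AES-CTR hides the plaintext but does nothing to detect tampering.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(16), os.urandom(16)
plaintext = b"PAY MALLORY $0001"

encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ciphertext = bytearray(encryptor.update(plaintext) + encryptor.finalize())

# The attacker has no key, but knows/guesses the message format, so they XOR in
# the difference between the amount they expect and the amount they want.
target = b"PAY MALLORY $9999"
for i, (old, new) in enumerate(zip(plaintext, target)):
    ciphertext[i] ^= old ^ new

decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
print(decryptor.update(bytes(ciphertext)) + decryptor.finalize())
# b'PAY MALLORY $9999' -- decrypts "successfully", silently altered
```

If you need integrity as well as confidentiality, that’s what authenticated encryption modes (like AES-GCM) or a separate MAC are for.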

Key length problems
Encryption keys need to be long enough that working them out (or otherwise breaking the algorithm) would take an attacker longer than the information actually needs to stay protected – that required protection period is referred to as the cover time. If keys are too short, the cryptosystem is unlikely to withstand an exhaustive key search (a brute force attack where the attacker simply tries every possible key until they find the right one). Cryptographic algorithms have standard key lengths – AES (the Advanced Encryption Standard), for example, is defined for 128-bit, 192-bit, and 256-bit keys – to make sure keys are long enough.
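
To get a feel for why key length matters, here’s a back-of-the-envelope sketch – the guessing rate is an assumption picked purely for illustration, not a benchmark of real hardware:

```python
# How long would an exhaustive key search take at an assumed 10^12 guesses/second?
GUESSES_PER_SECOND = 10 ** 12            # illustrative assumption, not a real benchmark
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for bits in (40, 56, 128, 256):
    keyspace = 2 ** bits                 # total number of possible keys
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits:>3}-bit key: ~{years:.2e} years to try every key")
```

At that rate a 40-bit keyspace is exhausted in about a second, while 2^128 keys works out to something like 10^19 years.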

Conversely, keys mustn’t be too long – longer keys are associated with providing “more security”, but there’s a point where the extra security is no longer worth the computational expense. With current technology, pushing key lengths beyond these standard sizes adds processing cost without adding practical value – which is why you won’t sensibly see anything like a 1024-bit AES key being used any time soon (AES isn’t even defined for keys longer than 256 bits). For more in-depth information on key lengths according to specific standards, check out BlueKrypt.
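
In fact, a “1024-bit AES key” isn’t even a thing – the algorithm is only defined for the three key sizes above, which you can check for yourself (again assuming the Python cryptography package):

```python
import os
from cryptography.hazmat.primitives.ciphers import algorithms

algorithms.AES(os.urandom(32))         # 256-bit key: accepted
try:
    algorithms.AES(os.urandom(128))    # "1024-bit AES": rejected outright
except ValueError as err:
    print(err)                         # complains about an invalid key size
```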

Poor Baby Yoda.

Key management problems
One of Kerckhoffs’ principles is that you need to be able to change cryptographic keys at will; another key principle of cryptography is key separation – the idea that each key should only be used for a single purpose. To illustrate both, we’ll use WEP (Wired Equivalent Privacy) as an example. There are numerous problems with WEP, but for now we’ll just focus on the key management issues:

Firstly, WEP uses a single master key for the whole WLAN, and that same key is used to generate every per-packet encryption key: the RC4 key for each frame is just a 24-bit initialisation vector (IV), sent in the clear, concatenated with the master key. Every encryption key therefore depends directly on the master key, and part of each encryption key (the IV) is exposed to anyone listening – which defies the principle of key separation. Next, WEP is symmetric, so the master key is shared by every node on the network, and keys are fixed rather than changeable at will.
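
Here’s a minimal sketch of that construction in Python (the key value is made up; the field sizes match WEP-40):

```python
import os

MASTER_KEY = bytes.fromhex("0badc0ffee")        # 40-bit shared WEP key (example value)

def wep_per_packet_key(iv: bytes, master_key: bytes = MASTER_KEY) -> bytes:
    """WEP's RC4 seed is simply IV || master key - and the IV is sent in the clear."""
    assert len(iv) == 3                         # WEP IVs are 24 bits
    return iv + master_key

iv = os.urandom(3)                              # transmitted unencrypted with every frame
print(wep_per_packet_key(iv).hex())             # every per-packet key embeds the master key
```

And because there are only 2^24 possible IVs, per-packet keys start repeating surprisingly quickly on a busy network – which is part of what makes WEP so easy to break.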

WEP also uses RC4 (a stream cipher that’s now considered broken), and accepts keys as short as 40 bits – far too short to withstand a modern exhaustive key search. When generating/deriving keys, a lot of WEP implementations also allow keys to be derived from user input – and if that input isn’t long or complex enough, it shrinks the effective keyspace (not the same thing as key length – think of the keyspace as the number of possible keys an attacker would have to try).
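
To see how much user-derived keys can shrink the keyspace, compare a truly random 40-bit key with a key derived from, say, a five-letter lowercase passphrase (the numbers here are illustrative, not taken from any particular implementation):

```python
import math

random_40_bit_keys = 2 ** 40          # full 40-bit keyspace: ~1.1 trillion keys
passphrase_keys = 26 ** 5             # 5 lowercase letters: ~11.9 million keys
print(f"random 40-bit keyspace: {random_40_bit_keys:,}")
print(f"5-letter passphrase   : {passphrase_keys:,} "
      f"(~{math.log2(passphrase_keys):.1f} bits of effective key)")
```

The derived key might still be 40 bits long on paper, but an attacker only needs to search the much smaller set of keys the passphrase could actually produce.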

Proprietary or secret algorithms
Another of Kerckhoffs’ principles is that when designing a cryptosystem, the security of the system shouldn’t rely on the secrecy of the algorithm – in fact, even if an attacker knows everything about the cryptosystem except the key, the system should still be secure. If an algorithm is kept secret, it can’t be evaluated or tested as thoroughly as a public one – so there could be hidden flaws in the system. (Note: keeping the algorithm secret is not the same thing as asymmetric/public-key cryptography, where the algorithm is public and only the private key is kept secret.)

Similar reasoning applies to proprietary cryptography – two heads are better than one, and so on – even professional cryptographers struggle to design secure new algorithms when working alone. When a new algorithm is developed, a body such as NIST will usually evaluate its security publicly for years before it’s adopted. Although some government bodies use proprietary algorithms, individuals should follow Boromir’s advice.

Even Boromir knows.

Algorithms based on new problems
As with proprietary or secret algorithms, algorithms based on new problems should be widely examined, scrutinised and tested before they’re adopted. Asymmetric cryptographic algorithms currently rely most often on the integer factorisation problem (recovering two large primes from their product, as in RSA) or the discrete logarithm problem. The mathematical problem at the heart of an algorithm needs to be hard enough to solve to provide the cover time needed – and problems that are easy to compute in one direction but genuinely hard to reverse are rare, and their hardness is difficult to prove. There are exceptions to this – NIST’s Post-Quantum Cryptography standardisation call for proposals is currently evaluating algorithms based on newer problems (such as lattice problems), and those submissions will be rigorously tested for years before they’re standardised and adopted publicly. In most cases though, if someone claims to have invented a sparkly new cryptographic algorithm based on a totally new problem – treat them with skepticism.
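
To show what “based on the factorisation problem” means in practice, here’s the classic toy RSA example (hopelessly insecure numbers, purely for illustration): anyone who can factor the public modulus can rebuild the private key.

```python
# Toy RSA with tiny primes -- real keys use primes hundreds of digits long.
p, q, e = 61, 53, 17                  # secret primes and a public exponent
n = p * q                             # public modulus; factoring n recovers p and q
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent (Python 3.8+); needs p and q

message = 42
ciphertext = pow(message, e, n)       # encrypt with the public key (e, n)
print(pow(ciphertext, d, n))          # 42 -- decrypt with the private key (d, n)

# An attacker who factors n (trivial at this size, infeasible at real sizes) can
# recompute d the same way -- the scheme's security *is* the hardness of factoring.
```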

Here’s the link to Garfinkel’s article, which I’d recommend reading if you have the time: https://www.csoonline.com/article/2119351/a-field-guide-to-spotting-bad-cryptography.html

As always – if you found this helpful or have any constructive feedback, leave a comment and let me know!

Morgan x
