As Ed Felten dissects and analyzes this mess, he concludes:
Unmasking of the algorithm should have been no problem, had the system been engineered well. Kerckhoffs's Principle, one of the bedrock maxims of cryptography, says that security should never rely on keeping an algorithm secret. It's okay to have a secret key, if the key is randomly chosen and can be changed when needed, but you should never bank on an algorithm remaining secret.

Perhaps the organization that designed and implemented this system was warned internally by people who were aware of this principle, which can be found in any cryptography text, but chose to ignore the warning. That is not an unusual reaction in many software organizations.
Unfortunately the designers of Mifare Classic did not follow this principle. Instead, they chose to combine a secret algorithm with a relatively short 48-bit key.
[...]
This kind of disaster would have been less likely had the design process been more open. Secrecy was not only an engineering mistake (violating Kerckhoffs's Principle) but also a policy mistake, as it allowed the project to get so far along before independent analysts had a chance to critique it. A more open process, like the one the U.S. government used in choosing the Advanced Encryption Standard (AES), would have been safer. Governments seem to have a hard time understanding that openness can make you more secure.
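To make the principle concrete, here is a minimal Python sketch of my own (not Felten's, and certainly not the Mifare design): the algorithm is completely public, only the key is secret and replaceable, and a back-of-the-envelope calculation shows why a 48-bit keyspace cannot survive brute force. The 10^9 trials-per-second search rate is an assumed figure, purely for illustration.

```python
import hashlib
import hmac
import secrets

# Kerckhoffs's Principle in practice: the algorithm (HMAC-SHA256) is fully
# public; security rests only on a secret key that is randomly chosen and
# can be replaced if it ever leaks.
def authenticate(key: bytes, message: bytes) -> bytes:
    """Tag a message using a public algorithm and a secret key."""
    return hmac.new(key, message, hashlib.sha256).digest()

key = secrets.token_bytes(16)            # 128-bit random key, rotatable
tag = authenticate(key, b"card transaction record")

# Why 48 bits is "relatively short": the entire keyspace can be searched.
keyspace = 2 ** 48                       # ~2.8e14 candidate keys
rate = 10 ** 9                           # assumed: one billion trials/second
print(f"48-bit keyspace exhausted in ~{keyspace / rate / 86400:.1f} days")
print(f"128-bit keyspace would take ~{2 ** 128 / rate / 86400 / 365:.1e} years")
```

At that assumed rate the 48-bit keyspace falls in roughly three days, while a 128-bit key would outlast the universe; that asymmetry is exactly what relying on a key instead of a secret algorithm buys you.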
Many managers also hold the view that if you can program in one area of expertise, you can program in any area.
I have heard this muttered so many times: "We can't crack this key or reverse engineer it, so it must be secure!"
Ed Felten correctly identifies the other failure: the lack of checks and inspections in a system of this magnitude and importance. I wonder how anyone can now argue that inspection would have cost the project more. This is a classic example of how inspection (by subject-matter experts, of course) would have saved $2 billion!