Why Cryptographic Agility Is Now a Software Quality Requirement
- Giuliana Bruni
- Jun 5
- 3 min read

Security is often discussed in terms of compliance, risk management, or response planning. These, however, are outcomes of secure design, not its origins. The presence of robust security mechanisms is more accurately a reflection of engineering quality. In software development, quality is not just about how well something performs when it works, but how well it holds up under scrutiny, pressure, and evolving threats.
As the software industry moves towards higher standards of assurance, particularly around supply chain transparency and cryptographic resilience, it becomes harder to separate quality from security. The two are inherently linked.
As quantum computing moves out of research and into policy discussions, a new layer of this security–quality relationship is being exposed: cryptographic resilience. The concern is straightforward. Most cryptographic algorithms in use today were not designed to withstand quantum decryption techniques, and they pose an imminent and increasingly recognized risk to digital infrastructure: data encrypted today can be harvested and stored now, then decrypted once sufficiently capable quantum machines arrive.
Governments have already started to respond. NIST has published a shortlist of post-quantum cryptography (PQC) algorithms, and standards bodies are pushing organisations to identify and eventually replace quantum-vulnerable encryption. The transition to PQC is likely to be gradual, but that doesn’t make it optional. Software written today may still be in production long after quantum attacks become a reality.
This underscores how deeply security and quality are intertwined. Cryptographic agility, which we covered before, is part of building software that can adapt to a changing threat landscape. But to achieve it, developers and security teams first need to know what cryptography is being used across their codebase, where it comes from, and whether it is fit for purpose.
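To make that concrete, here is a minimal sketch of cryptographic agility in Python. The names involved (Signer, ALGORITHMS, "hmac-sha256") are illustrative choices for this sketch, not a prescribed API: the point is that application code signs through an abstract interface, so adopting a post-quantum scheme later becomes a configuration change rather than a rewrite of every call site.

```python
# Minimal cryptographic-agility sketch: the concrete algorithm is chosen
# by name from a registry, so call sites never hard-code a primitive.
import hashlib
import hmac
import os
from typing import Callable, Dict, NamedTuple

class Signer(NamedTuple):
    sign: Callable[[bytes, bytes], bytes]          # (key, message) -> tag
    verify: Callable[[bytes, bytes, bytes], bool]  # (key, message, tag) -> ok

ALGORITHMS: Dict[str, Signer] = {
    # A classical choice available today.
    "hmac-sha256": Signer(
        sign=lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
        verify=lambda key, msg, tag: hmac.compare_digest(
            hmac.new(key, msg, hashlib.sha256).digest(), tag
        ),
    ),
    # A post-quantum scheme (e.g. ML-DSA) would be registered here once
    # adopted, leaving application code untouched.
}

def get_signer(name: str) -> Signer:
    """Look up the configured algorithm; configuration, not code, decides."""
    return ALGORITHMS[name]

if __name__ == "__main__":
    key = os.urandom(32)
    signer = get_signer("hmac-sha256")  # e.g. read from deployment config
    tag = signer.sign(key, b"release-artifact")
    assert signer.verify(key, b"release-artifact", tag)
```

The design choice here is deliberately boring: a lookup table and a narrow interface. That indirection is what makes the eventual PQC migration a bounded task instead of a codebase-wide search and replace.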
Many tools built for software composition analysis don’t answer those questions. They are focused on license compliance or known vulnerabilities. That’s useful, but incomplete. Without visibility into how encryption is implemented, organisations may be unaware of silent dependencies that pose future risks. This is particularly relevant given that the vast majority of modern software is built on open source components, where transitive dependencies are often deeply nested and difficult to track manually.
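As a rough illustration of what that visibility means in practice, the sketch below walks a Python codebase and flags imports of a few well-known cryptographic modules. The CRYPTO_MODULES table is illustrative and far from complete, and a real inventory of the kind composition-analysis tooling aims to automate would also need to cover transitive dependencies, other languages, and vendored or snippet-level code.

```python
# Rough cryptographic-inventory sketch for a Python codebase: walk the
# tree, parse imports, and flag modules known to provide quantum-vulnerable
# primitives. The module table below is illustrative, not authoritative.
import ast
import pathlib

CRYPTO_MODULES = {
    "Crypto": "PyCryptodome (check for RSA/ECC usage)",
    "cryptography": "pyca/cryptography (check asymmetric primitives)",
    "rsa": "pure-Python RSA (quantum-vulnerable)",
    "ecdsa": "ECDSA signatures (quantum-vulnerable)",
}

def find_crypto_imports(root: str):
    """Yield (file, module, note) for each flagged import under root."""
    for path in pathlib.Path(root).rglob("*.py"):
        try:
            tree = ast.parse(path.read_text(encoding="utf-8"))
        except (SyntaxError, UnicodeDecodeError):
            continue  # skip files that don't parse cleanly
        for node in ast.walk(tree):
            names = []
            if isinstance(node, ast.Import):
                names = [alias.name for alias in node.names]
            elif isinstance(node, ast.ImportFrom) and node.module:
                names = [node.module]
            for name in names:
                top = name.split(".")[0]
                if top in CRYPTO_MODULES:
                    yield path, name, CRYPTO_MODULES[top]

if __name__ == "__main__":
    for path, module, note in find_crypto_imports("."):
        print(f"{path}: imports {module} -- {note}")
```

Even this toy scan makes the gap obvious: it only sees direct, first-party imports, while the riskiest usage usually hides several layers down the dependency tree.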
Bringing security under the quality umbrella requires better transparency, not just at the surface level but deep into the structure of the software: where cryptographic libraries are used, how they are configured, and whether they align with current and future standards.
At SCANOSS, we’re building tools to support this kind of analysis. Not to enforce rules, but to help software teams make informed, long-term decisions. We’ve already introduced datasets that map the use of cryptographic algorithms in open source code, enabling assessment of exposure to quantum-vulnerable encryption and preparedness for post-quantum standards. It’s a step towards what we see as a necessary shift: treating cryptographic visibility as a core part of software quality.
Security is not an external layer. It is a reflection of how software is built. And if software is to remain trustworthy in the years ahead, that quality standard must evolve to include quantum resilience: not later, but now, while there is still time to act on it.
Consider how your organisation defines quality. Does it account for emerging risks such as quantum vulnerabilities, and for capabilities such as supply chain transparency and cryptographic agility?
Security is not separate from quality. It is quality. And in the coming years, the ability to build and maintain quantum-safe software will define the leaders in software assurance and trust.
If you're interested in learning how SCANOSS supports quantum safety, open source risk management, and cryptographic transparency, contact us.