Quantum computing may still be emerging, but the risk it poses to today’s encryption has already entered the planning horizon for banks and market infrastructures. At the Singapore FinTech Festival 2025, Hart Montgomery — chief technology officer of Linux Foundation Decentralised Trust (LFDT) and executive director of the Post-Quantum Cryptography Alliance (PQCA) — highlighted three priorities for the financial sector: inventory, agility and open collaboration.

“Whenever we send something across the Internet today it’s probably encrypted,” he said. “But someone malicious could be storing that data. If my data needs to be secure for 20 years, maybe it’s already problematic that I’m sending it with classical encryption today.”

Montgomery’s perspective is shaped by his role at the heart of two major open-source communities. LFDT provides a neutral home for digital-trust components relied on by central banks and financial institutions — members include the Bank of Korea, Banque de France and Deutsche Bundesbank — while the PQCA brings together firms such as AWS, Cisco, IBM and Google to translate post-quantum standards into practical tools.

Quantum risk on a board-level timeline

Montgomery stressed that quantum-risk planning is less about predicting an exact breakthrough date and more about recognising migration lead times. The Bank for International Settlements has warned that upgrading cryptographic infrastructure across market participants is a complex, multi-year process that requires phased transition and sector-wide coordination. In practice, that means a credible quantum threat in the 2030s already sits within today’s engineering and governance horizon.

Large financial institutions are starting to respond. Banks and payment networks including JPMorgan, HSBC, Citi, Visa and Mastercard are piloting quantum-safe algorithms, upgrading internal cryptographic libraries and running early transition exercises. Supervisors and standards bodies such as the Monetary Authority of Singapore, the National Institute of Standards and Technology (NIST) and the European Union Agency for Cybersecurity (ENISA) are increasingly explicit that firms should understand where cryptography is used and how it will be migrated.

Yet Montgomery sees a widening gap between awareness and readiness. Many institutions, he said, still cannot answer basic questions about which algorithms, key lengths and protocols protect their critical systems — particularly where smart cards, embedded devices and legacy infrastructure are involved.

The defining capability: cryptographic agility

Montgomery drew a clear line between institutions that will cope with quantum migration and those that will struggle: cryptographic agility. By agility, he means the ability to change cryptographic algorithms, protocols or keys without rewriting business applications or disrupting core services.

In agile architectures, cryptography is isolated behind dedicated services or libraries. Application teams call those services through stable interfaces, while security teams can upgrade the underlying algorithms as needed.

“When you can swap out cryptography with relatively small effort, quantum migration looks like a large but manageable programme,” he said. “When it’s hard-coded into business logic across decades of systems, it becomes a much riskier exercise.”

Large technology firms have treated this as a core engineering discipline for years, designing their internal cryptographic toolkits to be modular and easily updated.
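As a rough illustration of that pattern, the Python sketch below (using the widely deployed `cryptography` package) has business code depend only on a stable signing interface while the concrete algorithm is chosen in one governed place. The class, registry and configuration names are illustrative assumptions, not LFDT or PQCA code; the idea is that a post-quantum signer would later register behind the same interface.

```python
"""Minimal sketch of cryptographic agility; names here are illustrative assumptions."""
from abc import ABC, abstractmethod

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec


class SignerService(ABC):
    """Stable interface that application teams code against."""

    @abstractmethod
    def sign(self, payload: bytes) -> bytes: ...


class EcdsaP256Signer(SignerService):
    """Today's classical implementation (ECDSA over P-256)."""

    def __init__(self) -> None:
        self._key = ec.generate_private_key(ec.SECP256R1())

    def sign(self, payload: bytes) -> bytes:
        return self._key.sign(payload, ec.ECDSA(hashes.SHA256()))


# Central registry owned by the security team: switching the configured
# algorithm (for example to a future post-quantum signer) is a change here,
# not a rewrite of business applications.
_REGISTRY: dict[str, type[SignerService]] = {"ecdsa-p256": EcdsaP256Signer}
CONFIGURED_ALGORITHM = "ecdsa-p256"  # illustrative configuration value


def get_signer() -> SignerService:
    return _REGISTRY[CONFIGURED_ALGORITHM]()


if __name__ == "__main__":
    signature = get_signer().sign(b"payment instruction")
    print(len(signature), "byte signature from", CONFIGURED_ALGORITHM)
```

The point of the pattern is structural: changing the algorithm is a configuration and library change rather than an application rewrite.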
Montgomery’s concern is that many financial institutions have not yet made this shift.

Inventory as the starting discipline

For institutions that have progressed furthest, Montgomery sees one common starting point: a cryptographic inventory. “The first thing we see financial institutions do is an inventory,” he said, emphasising the need for a cryptographic bill of materials — a catalogue of algorithms, key types, protocols and libraries deployed across the organisation. This extends the idea of a software bill of materials into the cryptographic layer.

Such inventories rarely surface only quantum-vulnerable components. They also uncover weaknesses that matter today: outdated hash functions, short keys, legacy certificates and inconsistent implementations across systems. In that sense, quantum preparation becomes an opportunity to clean up broader cryptographic debt.

“An inventory gives you a factual baseline,” Montgomery said. “You can’t prioritise or sequence a migration plan if you don’t know what you’re running.”

Convergence on standards, uncertainty on hybrids

On the choice of post-quantum algorithms themselves, Montgomery sees encouraging alignment. Regulators and industry are broadly converging on the standards selected by NIST — the next-generation replacements for today’s RSA and elliptic-curve schemes — as the baseline for migrating current public-key cryptography. He does not expect major divergence between jurisdictions on those core algorithms.

Instead, he views the main risk as lying in hybrid deployments — intermediate models that combine classical and post-quantum techniques. Hybrids are likely to be required for a period, particularly where institutions need backward compatibility. However, without guidance, firms could adopt incompatible combinations that complicate supervision and interoperability. Montgomery argued that regulators who expect hybrids should indicate preferred patterns so that markets do not fragment around ad hoc designs.

Open collaboration as a security requirement

Montgomery underlined that cryptographic software is inherently a public good: even when products are commercial, the code underpinning critical algorithms must be open to inspection, verification and peer review. This is where the Linux Foundation’s governance model becomes important.

“Linux Foundation solves the problem of companies that may not trust each other but still need to collaborate on secure code,” he said. LFDT and the PQCA provide neutral venues where banks, technology firms and vendors can coordinate on shared libraries, reference implementations and testing.

With post-quantum cryptography, this collaboration becomes systemic. No single institution can validate every implementation detail alone, and fragmented efforts risk inconsistent security properties. Open, jointly governed projects reduce that risk.

Beyond quantum: making cryptography safer to use

Looking further ahead, Montgomery expects a shift in how developers interact with cryptography in general, not just post-quantum schemes. As systems rely more on advanced techniques for privacy and authentication, he argued, it is unrealistic to expect every developer to understand the underlying mathematics or protocol subtleties.

“Cryptography is hard for developers to use,” he said. “We just can’t ask that of everyone.”

The alternative is secure-by-design abstraction: building strong cryptographic services and patterns into platforms so that developers consume security as a simple, reliable capability.
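A minimal Python sketch of what such an abstraction can look like, assuming a hypothetical platform service named PlatformCrypto: one encrypt/decrypt call with sound defaults (AES-256-GCM, a fresh random nonce per message), so application developers never choose algorithms, modes or nonce handling themselves. The interface is illustrative, not any specific product, and in a real platform the key would come from managed key infrastructure rather than being generated locally.

```python
"""Minimal sketch of a safe-by-default encryption helper; names are illustrative assumptions."""
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM


class PlatformCrypto:
    """Centralised service: key management and algorithm choice live here and can be
    upgraded by the security team without changes to application code."""

    _NONCE_LEN = 12  # 96-bit nonce, the recommended size for GCM

    def __init__(self, key: bytes | None = None) -> None:
        # Illustrative only: a real platform would fetch the key from an HSM or KMS.
        self._key = key or AESGCM.generate_key(bit_length=256)

    def encrypt(self, plaintext: bytes, context: bytes | None = None) -> bytes:
        nonce = os.urandom(self._NONCE_LEN)  # fresh per message, never reused
        ciphertext = AESGCM(self._key).encrypt(nonce, plaintext, context)
        return nonce + ciphertext  # nonce travels with the ciphertext

    def decrypt(self, blob: bytes, context: bytes | None = None) -> bytes:
        nonce, ciphertext = blob[: self._NONCE_LEN], blob[self._NONCE_LEN :]
        return AESGCM(self._key).decrypt(nonce, ciphertext, context)


if __name__ == "__main__":
    crypto = PlatformCrypto()
    sealed = crypto.encrypt(b"account statement", context=b"payments-service")
    assert crypto.decrypt(sealed, context=b"payments-service") == b"account statement"
```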
For financial institutions, that vision dovetails with quantum preparation. Architectures that isolate cryptography, centralise key management and provide safe defaults will be better positioned to rotate keys, patch vulnerabilities and adopt new standards when needed.

A pragmatic call to action

Montgomery’s message to banks and market infrastructures is ultimately practical rather than alarmist. Quantum risk is real and timelines are uncertain, but the path to resilience is clear. Institutions that start with a cryptographic inventory, invest in agility and participate in open standards efforts will be positioned to move early and calmly. Those that delay, or continue to embed cryptography deep inside legacy business logic, risk facing a rushed and complex migration under regulatory and market pressure.

“We’re always going to rely on encryption,” he said. “The question is whether we build systems that can evolve when the world changes.”