29 January 2026 · privacy · encryption

Why we opt for Post-Quantum Resilience

This article briefly explains the reality of the quantum threat, why we decided to integrate post-quantum cryptography (PQC) into our product today, and how we achieved this.

When we began developing Databeamer, safety and security were already core
pillars. We believe that software should be more than just functional; it must
be logical, rigorously tested, and validated. This philosophy led us to choose
Rust as our core programming language and to prioritize sovereign software
components. By minimizing external dependencies, we have built a product that
respects digital sovereignty.

However, security is a moving target. Beyond our zero-knowledge architecture, we
have made the strategic decision to implement post-quantum resilience. While
some view this as a distant concern, we believe that the current geopolitical
climate and the rapid pace of technological change demand a "better safe than
sorry" approach. This strategy aligns with the European Union’s mandate that
all Member States initiate the transition to post-quantum cryptography by the
end of 2026.

Our development is guided by the belief that security must be both proactive and
transparent. By embedding post-quantum protocols into our zero-knowledge
framework, we address tomorrow's threats without compromising today's privacy.

What is quantum computing and post-quantum cryptography?

To understand the solution, we must first understand the problem.

From Bits to Qubits

Classical computers, from the smartphone in your pocket to the world’s most
powerful supercomputers, operate on bits: binary switches that are either 1 or
0. They solve problems by trying options one after another.

Quantum computers operate on entirely different principles. They use qubits,
which, thanks to the laws of quantum mechanics, can exist in a state of
superposition. This allows them to represent multiple states simultaneously.
Furthermore, through entanglement, qubits can be linked so that the state of one
instantly influences another.

A helpful analogy is a mouse in a maze: while a classical computer tries every
path one by one, a quantum computer acts like a flood of water, filling every
path of the maze at the same time to find the exit instantly. This unique
capability allows quantum machines to solve specific mathematical problems,
such as factoring the large numbers that underpin RSA encryption, at speeds
that would take classical computers billions of years.

Current status

We are currently in what is called the NISQ era (Noisy Intermediate-Scale
Quantum). While hardware leaders like IBM and Google have made significant
improvements with chips like Heron and Willow,
quantum processors remain "noisy" and error-prone. The industry is currently
racing to develop "logical" qubits—error-corrected units stable enough for
sustained calculations.

Most experts, including those at CISA and various European
cybersecurity agencies, predict that a "cryptographically relevant" quantum
computer (one capable of breaking modern encryption) could emerge by the early
2030s. This milestone is often called Q-Day. While 2032 may seem distant, the
transition to new security standards is a massive infrastructural undertaking
that takes years to complete. If we wait for the hardware to arrive before
updating our software, we will already be too late.

Europe should also be careful not to rely too heavily on foreign investors and
suppliers for the development of quantum technology itself, and should invest
more in quantum patents, standardization, and certification. Notably, Europe
presented its second quantum computer in September last year.

Defining Post-Quantum Cryptography (PQC)

It is a common misconception that protecting against quantum computers requires
owning one. In reality, Post-Quantum Cryptography (PQC) is a software solution.
These are new mathematical algorithms designed to run on the classical hardware
we use today, but which rely on problems (such as Lattice-based cryptography)
that are far too complex for even a quantum computer to solve. By updating the
"math" behind our encryption, we can protect our current data from future
threats.

The EU: 2026 is the start of the "Transition Phase"

The European Union has officially signaled that the "quantum wait" is over.
Following the Commission Recommendation of April 2024, the EU has set 2026 as
the mandatory start year for all Member States to initiate their transition to
Post-Quantum Cryptography (PQC).

This initiative is a strategic pillar of European Digital Sovereignty and forms
part of a cohesive legislative front including the Chips Act,
the EU AI Act, the EU Quantum Act and the
Cybersecurity Act — the comprehensive 2026 package designed to
streamline NIS2 compliance and secure ICT supply chains. Together with the
Cyber Resilience Act (CRA), which mandates
'secure-by-design' standards for all digital products, these laws ensure that by
the end of 2026, governments and critical sectors must have national PQC
roadmaps in place and begin active pilots.

This move is a direct response to the 'Harvest Now, Decrypt Later' (HNDL)
threat, ensuring that the data we share today remains protected against the
quantum capabilities of tomorrow.

Is the need for PQC really immediate or is it exaggerated?

There is a valid debate within the security community regarding the urgency of
PQC. Skeptics, such as Peter Gutmann, argue that the threat is
often overstated by marketing departments and that current quantum progress
doesn't justify a massive immediate investment. Gutmann’s strongest point is
that quantum risk is frequently used to distract from the fundamental security
failures of today. He is absolutely correct: issues like broken access controls,
unauthorized provider access, poor key management, and the sprawl of "Shadow IT"
are far more immediate threats to most organizations than a hypothetical quantum
computer.

However, we believe this perspective actually strengthens the argument for a
more robust architectural approach. The focus on future threats should never
come at the expense of current security hygiene. This is precisely why we chose
a memory-safe language like Rust and implemented a zero-knowledge architecture.
These choices solve the "here and now" problems: mitigating memory
vulnerabilities and ensuring we, as the provider, cannot access your data.

Risk for specific data classes

Once those fundamentals are secure, the argument for PQC becomes a matter of
long-term risk management. While it is true that much of our daily data, like
routine emails or temporary files, will be commercially irrelevant by the time a
functional quantum computer exists, the threat of "Harvest Now, Decrypt Later"
remains real for specific data classes.

High-value information, such as genomic data, trade secrets, and diplomatic
intelligence, has a "shelf life" that extends far beyond the 2030s. For these
records, encryption failure in ten years is a retrospective disaster. We believe
that for a security-focused product, treating all data as short-lived is a risk
that simply isn't worth taking.

We’ve added some more examples of potentially vulnerable future data
categories as an addendum to this article.

The challenges of transition

Implementing PQC is not without its hurdles. These new algorithms generally
require larger keys and signatures, which can increase the "handshake" time
during a secure connection. This can lead to a slight performance overhead,
particularly on mobile networks or low-power IoT devices. Furthermore, because
RSA and ECC have been battle-tested for decades, there is an inherent "newness
risk" with PQC. There is always a slim theoretical possibility that a flaw could
be found in these new mathematical structures as they face wider scrutiny.

Databeamer’s approach

Our development of Databeamer is guided by the belief that security must be both
proactive and transparent. By embedding post-quantum protocols into our
zero-knowledge framework, we address tomorrow’s threats without compromising
today’s privacy. In the spirit of 'don't trust, verify,' we have outlined our
specific implementation of these safeguards below.

Hybrid solution

Within Databeamer, we have adopted the hybrid encryption model (as also
recommended by the European Commission) within our
zero-knowledge architecture.

We combine ML-KEM-768 (NIST-standardized post-quantum cryptography) with X25519
(classical elliptic curve) for key encapsulation. An attacker must break both to
compromise your files, protecting against both today's threats and future
quantum computers.

We selected ML-KEM-768 (NIST security level 3) as it provides strong protection
against quantum attacks while maintaining fast performance for everyday use.

Your files are encrypted with ChaCha20-Poly1305, a proven encryption standard.
The encryption keys are protected using our hybrid post-quantum approach, where
we combine both key exchanges using HKDF-SHA256 (the same method used in TLS
1.3).
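
To make the key-combination step concrete, here is a minimal stdlib-Python sketch of HKDF-SHA256 (RFC 5869) merging two shared secrets into one symmetric key. The function names and the `info` label are illustrative only; in Databeamer the two inputs would come from the X25519 and ML-KEM-768 exchanges.

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract (RFC 5869): condense input keying material into a
    # pseudorandom key; an empty salt defaults to HashLen zero bytes.
    return hmac.new(salt or b"\x00" * 32, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    # HKDF-Expand (RFC 5869): stretch the PRK into `length` output bytes,
    # T(i) = HMAC(PRK, T(i-1) | info | i).
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes) -> bytes:
    # Both secrets feed one derivation: recovering the output key
    # requires knowing BOTH inputs. The "hybrid-kem-v1" label is a
    # hypothetical domain-separation string.
    prk = hkdf_extract(salt=b"", ikm=classical_ss + pq_ss)
    return hkdf_expand(prk, info=b"hybrid-kem-v1", length=32)
```

In production, the resulting 32-byte output would key ChaCha20-Poly1305; deriving it through HKDF rather than plain concatenation mirrors the TLS 1.3 approach mentioned above.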

Zero-Knowledge Architecture

Your password generates your encryption keys using the OPAQUE protocol: our
servers never see your keys or file contents. Each file gets a unique encryption
key, and we rotate keys when team members change, ensuring forward secrecy. Our
implementation uses audited cryptographic libraries, with independent security
audits planned.
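
As an illustration of the per-file key and rotation model described above (a toy model, not Databeamer's implementation, and leaving the OPAQUE password exchange aside), a sketch might look like:

```python
import os
from dataclasses import dataclass, field

@dataclass
class FileKeyStore:
    # Illustrative model only: each file gets its own fresh random key,
    # and a membership change bumps the epoch and re-keys everything so
    # that departed members' old keys no longer cover new content.
    epoch: int = 0
    keys: dict = field(default_factory=dict)  # file_id -> (epoch, key)

    def key_for(self, file_id: str) -> bytes:
        if file_id not in self.keys:
            # unique 256-bit key per file
            self.keys[file_id] = (self.epoch, os.urandom(32))
        return self.keys[file_id][1]

    def rotate(self) -> None:
        # called on team membership changes
        self.epoch += 1
        for file_id in list(self.keys):
            self.keys[file_id] = (self.epoch, os.urandom(32))
```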

Hybrid Digital Signatures

Operations like file uploads, key rotations, and group membership changes are
signed using both ML-DSA-65 (post-quantum) and Ed25519 (classical). This ensures
authenticity and prevents tampering.
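
The acceptance rule for these hybrid signatures can be sketched as follows. The verifier callables stand in for real ML-DSA-65 and Ed25519 checks; the stubs below are keyed hashes for demonstration, not real signature schemes.

```python
import hashlib
from dataclasses import dataclass
from typing import Callable

Verifier = Callable[[bytes, bytes], bool]  # (message, signature) -> valid?

@dataclass
class HybridSignature:
    pq_sig: bytes         # would be ML-DSA-65 in production
    classical_sig: bytes  # would be Ed25519 in production

def verify_operation(message: bytes, sig: HybridSignature,
                     verify_pq: Verifier,
                     verify_classical: Verifier) -> bool:
    # Accept only if BOTH layers verify: forging an operation would
    # require breaking the post-quantum AND the classical scheme.
    return (verify_pq(message, sig.pq_sig)
            and verify_classical(message, sig.classical_sig))

def stub_verifier(tag: bytes) -> Verifier:
    # Toy stand-in (a tagged hash comparison), NOT real cryptography.
    return lambda m, s: s == hashlib.sha256(tag + m).digest()
```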

By using this approach, we ensure that if a flaw is discovered in the new PQC
math (although we use a standardized and recommended PQC scheme, it is fairly
new), your data remains protected by the classical layer, and the post-quantum
layer can be swapped for an improved standardized version.

Future-safe security

European authorities, including ENISA and the Joint Research Centre
(JRC), are clear: the transition to quantum safety should not be a leap of
faith. That is why we follow their advice to use the hybrid "Safety Net"
approach. This strategy serves two critical goals:

  1. Risk Mitigation: Since PQC math is relatively new, a hybrid setup ensures that even if a future flaw is found in a quantum algorithm, your data remains shielded by the classical layer. An attacker must break both to gain access.

  2. Crypto-Agility: By building a modular, multi-layered system today, we can swap or update algorithms as standards evolve without rebuilding the entire infrastructure, guaranteeing operational continuity well into the 2030s while protecting against "Harvest Now, Decrypt Later" threats today.
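
As a design sketch of what crypto-agility means in code (the registry shape and scheme names here are hypothetical, not Databeamer's API): algorithms are addressed by identifier, so adopting a successor standard becomes a new registry entry rather than a rewrite of every call site.

```python
import os
from typing import Callable, Dict

# Toy stand-in: a "KEM" here is just a function returning a 32-byte
# shared secret; real entries would wrap actual X25519 / ML-KEM code.
KemFn = Callable[[], bytes]

KEM_REGISTRY: Dict[str, KemFn] = {
    "x25519": lambda: os.urandom(32),
    "ml-kem-768": lambda: os.urandom(32),
}

def establish_secret(scheme: str) -> bytes:
    # Callers name the scheme; swapping algorithms never touches
    # this code path.
    if scheme not in KEM_REGISTRY:
        raise ValueError(f"unknown KEM scheme: {scheme!r}")
    return KEM_REGISTRY[scheme]()

# Upgrading to a future standard is one registry entry, no caller changes:
KEM_REGISTRY["ml-kem-1024"] = lambda: os.urandom(32)
```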

Moving Forward

Digital sovereignty and long-term security are inseparable. Just as we have
learned that trust in infrastructure can be upended by geopolitical shifts, we
must recognize that the mathematical foundations of our security are also
subject to change.

Some data must remain confidential for longer than current cryptography can
guarantee. By adopting post-quantum cryptography today, we protect against
"harvest now, decrypt later" attacks where adversaries collect encrypted data
hoping to break it with future quantum computers. We aren't just following a
trend; we are ensuring that the trust our users place in Databeamer remains
valid for the future.

Addendum: vulnerable data classes

This addendum lists some extra examples of data that could still be valuable
30+ years in the future. Some data ages quickly; some data gains value over
time. The mistake is treating all data as ephemeral.

1. Many data classes are valuable because they persist

Examples where 30+ year confidentiality matters:

Government & geopolitics

  • Diplomatic cables (confidential diplomatic messages)
  • Intelligence-sharing agreements
  • Military planning assumptions
  • Strategic vulnerabilities

History shows:

  • Cold War documents decrypted decades later still reshaped geopolitics
  • Old intelligence remains useful for leverage, attribution, or pressure

The value is often contextual, not temporal.

Health & genetic data

  • Genomic data is essentially timeless
  • Medical records can affect:
      • Insurance
      • Employment
      • Family members (not just the individual)

You can’t “rotate” your genome.

IP, R&D, and trade secrets

  • Core algorithms
  • Chemical formulas
  • Engineering designs
  • Manufacturing processes

These often:

  • Remain relevant for decades
  • Become more valuable when competitors think they’re safe

Example:

  • Semiconductor processes
  • Industrial control logic
  • Pharma research paths (even failed ones)

Legal & compliance data

  • Contracts
  • NDAs
  • Legal strategies
  • Arbitration records

Old legal documents can:

  • Re-open disputes
  • Reveal liability
  • Enable retroactive claims

Identity & credential material

  • Identity documents
  • Authentication flows
  • Historical access patterns

Even if credentials rotate:

  • Identity does not
  • Behavioral patterns do not fully expire

2. “Outdated” does not mean “harmless”

This is the most underestimated flaw.

Even outdated data can be used for:

  • Correlation
  • Blackmail
  • Influence operations
  • Training AI models
  • Social engineering

Example: an old org chart + old email styles + old credentials → very
effective spear-phishing today.

3. Data value is often revealed later

Many data sets only become valuable after:

  • Political change
  • Legal regime change
  • Corporate mergers
  • Technological shifts

History is full of:

  • Archived communications becoming explosive later
  • “Harmless” data turning sensitive in hindsight