# Sei Giga and the problem most chains aren’t designing for yet
I don’t think of quantum risk in crypto as a simple future key swap, where you just drop in new signatures and libraries.
That framing completely breaks once you look at high-performance chains.
At Giga scale, post-quantum security is a systems problem:
– Current ECDSA signatures are ~64 bytes
– NIST post-quantum signatures (Falcon, ML-DSA, SLH-DSA) jump to ~1.3KB–8KB+
– At 200k TPS, that’s roughly 0.25–1.6 GB per second in signature data alone (quick arithmetic below)
– The chain turns into a signature DA layer with an EVM attached
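
A quick back-of-the-envelope check. The 200k TPS figure is the assumption from above, and the signature sizes are rough published values for each scheme, not benchmarks:

```python
# Raw signature bandwidth at a fixed throughput.
# Sizes are approximate public figures per scheme, not measurements.
SIG_BYTES = {
    "ECDSA (secp256k1)":          64,
    "Falcon-1024":             1_280,   # ~1.3KB
    "ML-DSA-44 (Dilithium)":   2_420,
    "SLH-DSA-128s (SPHINCS+)": 7_856,   # ~8KB
}

TPS = 200_000  # the Giga-scale throughput assumed above

for scheme, size in SIG_BYTES.items():
    gb_per_s = TPS * size / 1e9
    print(f"{scheme:>26}: {gb_per_s:5.2f} GB/s of signature data")
```

That prints ~0.26 GB/s for Falcon up to ~1.57 GB/s for SPHINCS+, before you count any actual transaction payload.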
But @SeiNetwork's design naturally leans toward:
– zk proof batching
– recursive aggregation
– hash-based, post-quantum-friendly verification paths
– constant-size verification per block instead of per tx
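
To make the constant-size idea concrete, here is a minimal sketch assuming a hash-based commitment scheme; it is illustrative, not Sei’s actual design. Fold every transaction’s signature data into one Merkle root, so the block header exposes a fixed 32-byte verification surface no matter how many txs the block holds. Hash functions like SHA-256 are generally considered post-quantum-friendly, since Grover’s algorithm only roughly halves their effective security:

```python
import hashlib

def h(data: bytes) -> bytes:
    # SHA-256: hash-based primitives retain most of their security
    # against quantum attackers (Grover gives only a ~sqrt speedup).
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold a list of leaves into a single 32-byte Merkle root."""
    assert leaves, "empty block"
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# 200k txs, each carrying kilobytes of post-quantum signature data...
sigs = [f"pq-signature-{i}".encode() for i in range(200_000)]
root = merkle_root(sigs)
# ...but the block header commits to all of them in 32 bytes.
print(len(root), "bytes of per-block commitment:", root.hex()[:16], "…")
```

In a real system, a recursive or zk proof would attest that everything under the root actually verifies; the point here is only the constant-size commitment per block.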
That verification profile opens space for:
– commit now, verify later models
– economic bonding instead of immediate cryptographic certainty
– treating post-quantum security as an incentive design problem
In practice, a tx commits to a hash of its post-quantum witness, and the expensive verification only happens when it’s challenged or actually needed.
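
Here is a toy sketch of that commit-now, verify-later flow with an economic bond attached. Every name and number below is hypothetical, not a Sei protocol spec:

```python
import hashlib
from dataclasses import dataclass

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

@dataclass
class PendingTx:
    witness_hash: bytes   # 32-byte commitment, the only thing stored on-chain
    bond: int             # economic stake backing the commitment

def verify_pq_signature(witness: bytes) -> bool:
    # Placeholder for a real (expensive) post-quantum verify,
    # e.g. ML-DSA or SLH-DSA; hypothetical stand-in here.
    return witness.startswith(b"valid")

def commit(witness: bytes, bond: int) -> PendingTx:
    # Cheap path, taken for every tx: store only the hash plus a bond.
    return PendingTx(witness_hash=h(witness), bond=bond)

def challenge(tx: PendingTx, revealed_witness: bytes) -> str:
    # Expensive path, taken only when someone disputes the tx.
    if h(revealed_witness) != tx.witness_hash:
        return "slash: reveal does not match commitment"
    if not verify_pq_signature(revealed_witness):
        return "slash: witness fails post-quantum verification"
    return "ok: bond returned, challenger pays"

tx = commit(b"valid-ml-dsa-witness-bytes", bond=100)
print(challenge(tx, b"valid-ml-dsa-witness-bytes"))  # ok
print(challenge(tx, b"forged-witness"))              # slash
```

Honest senders never pay the expensive verification cost; the bond makes committing to a bad witness economically irrational, which is exactly the incentive-design framing above.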
This is closer to how high-performance distributed systems survive real adversarial load.
In my mind, Giga fits a post-quantum future because:
– it treats verification cost as a first-class scaling constraint
– it designs around proof systems, not just signatures
– it frames security as a protocol + incentive + migration problem
– it assumes quantum safety must coexist with internet-scale throughput
Might be one of the few architectures that can become quantum-secure without sacrificing performance. ($/acc)