New research led by a top MIT researcher suggests that decentralization is not just a design choice, but a principle of efficiency, with control falling away as systems grow larger.
In the crypto industry, decentralization is loosely defined as the distribution of power and control among multiple independent participants, rather than a single central authority.
It means that no entity – such as a company or a government – can on its own decide how the system works, change the rules or stop transactions.
“The first rule of control is observability. You can’t control something you don’t observe – and observability doesn’t scale,” Muriel Médard, co-founder and CEO of decentralized memory infrastructure company Optimum, told Decrypt in an interview at TOKEN2049 in Singapore.
The question is no longer whether you agree with or support decentralization, but how “centralization stops working once a system gets big enough,” Médard explains.
These ideas were first explored in Médard’s recent MIT study about wireless transmitters, which showed how distributing functions rather than centralizing them can make communications systems much more energy efficient.
But Médard’s claim that decentralization is more efficient by default is less about its nature and more about how it behaves.
Every statement about nature “is already subtracted from a fundamental lack of natural frameworks, as in the equation n-1,” Virgilio Rivas, professor of philosophy at the Polytechnic University of the Philippines, told Decrypt.
Rivas explains that systems organize themselves in the absence of a single central or fixed frame. In simple terms, nature works through connections and differences, and “behaves so as to be observed” in its “real standard.”
Médard’s team at Optimum is now applying the same principle to blockchain networks, turning its research on distributed efficiency into code.
Their new network layer, tested on Ethereum’s Hoodi testnet, distributed blocks in about 150 milliseconds – about 6.5 times faster than Gossipsub, the system Ethereum uses to share data between validators.
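The 6.5x figure is an empirical testnet result, but the underlying intuition is simple: in a fanout-limited gossip mesh like Gossipsub, a block reaches the network over multiple forwarding hops, so total propagation time grows with network size. A back-of-envelope sketch (all parameters here are illustrative, not measured Ethereum values):

```python
def gossip_rounds(n_nodes, fanout):
    """Best-case number of flood rounds for fanout-limited gossip:
    each round, every informed node forwards to `fanout` fresh peers,
    so coverage grows at most (fanout + 1)x per round."""
    rounds, informed = 0, 1
    while informed < n_nodes:
        informed *= fanout + 1
        rounds += 1
    return rounds

def propagation_ms(n_nodes, fanout, per_hop_ms):
    """Total dissemination latency: rounds x per-hop network delay."""
    return gossip_rounds(n_nodes, fanout) * per_hop_ms

# Illustrative numbers: 10,000 validators, fanout 8, 50 ms per hop
print(propagation_ms(10_000, 8, 50))  # 5 rounds -> 250 ms
```

Even in this best case, latency scales with the logarithm of network size times per-hop delay, which is the bottleneck a coded dissemination layer tries to shave.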
Dubbed momP2P, Optimum’s system builds on mathematical principles that Médard first developed more than twenty years ago in US military-funded research into reliable communications networks.
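The article doesn’t name the technique, but Médard is best known for random linear network coding (RLNC), in which nodes forward random linear combinations of data chunks rather than the chunks themselves, so a receiver can reconstruct a block from any sufficiently independent subset of what arrives. A minimal sketch over GF(2) (real systems use larger fields such as GF(2^8); function names are illustrative, not Optimum’s API):

```python
import random

def rlnc_encode(packets, n_coded, seed=0):
    """Emit n_coded random linear combinations (over GF(2), i.e. XOR)
    of equal-length source packets, each tagged with its coefficient
    vector so any receiver can decode."""
    rng = random.Random(seed)
    k, size = len(packets), len(packets[0])
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if not any(coeffs):                 # avoid the useless all-zero row
            coeffs[rng.randrange(k)] = 1
        payload = bytearray(size)
        for c, p in zip(coeffs, packets):
            if c:
                for i in range(size):
                    payload[i] ^= p[i]
        coded.append((coeffs, bytes(payload)))
    return coded

def rlnc_decode(coded, k):
    """Gauss-Jordan elimination over GF(2): recover the k source
    packets from any k linearly independent coded packets."""
    rows = [(list(c), bytearray(p)) for c, p in coded]
    for col in range(k):
        pivot = next((r for r in range(col, len(rows)) if rows[r][0][col]), None)
        if pivot is None:
            raise ValueError("not enough independent combinations")
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][0][col]:
                for j in range(k):
                    rows[r][0][j] ^= rows[col][0][j]
                for i in range(len(rows[r][1])):
                    rows[r][1][i] ^= rows[col][1][i]
    return [bytes(rows[i][1]) for i in range(k)]
```

The payoff for dissemination is that no specific packet is irreplaceable: any k independent combinations suffice, which removes the round trips spent requesting particular missing chunks.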
“What we’re building is the memory layer,” which works “like the operating system of a computer,” she explained. Memory helps systems move data, which is where most of the inefficiencies in blockchains occur, Médard claims.
Optimum says its testnet shows that faster data sharing can make blockchains like Ethereum or Solana run more efficiently, with faster transactions and lower fees that could impact the way users trade and communicate on-chain.
But while lower latency could help narrow price gaps, gains like Optimum’s won’t immediately “erase” existing bottlenecks, said Kanny Lee, founder and CEO of decentralized exchange protocol SecondSwap.
“Even with a sixfold improvement over Ethereum, blockchain still runs much slower than traditional finance,” Lee told Decrypt. What it does indicate, however, is a “more efficient on-chain environment,” in which arbitrage “becomes more difficult and markets respond more quickly to information,” he added.
Given these benefits, blockchain systems could behave “less like a relay and more like an integrated trading network,” Lee said.
He noted that as infrastructure improves, the edge shifts from speed to access, as seen in markets for locked tokens, allocations and structured distributions that “trade on timing and access, not latency.”
