Tokenization might not be as big in 2030 as you think. Here’s why
It took almost nine years for Bitcoin to hit $10,000 for the first time, and a little over 12 years to crack $50,000. McKinsey argues patience will be needed with tokenization, too.
There’s been no shortage of excitement about how the tokenization of real-world assets can change the world as we know it.
One report by 21.co, published last October, enthusiastically predicted that this market will be worth $10 trillion by 2030 — and a still-impressive $3.5 trillion in the worst-case scenario.
This has been coupled with dizzying amounts of experimentation and fundraising, with tokenization platforms raking in millions of dollars so they can expand.
But according to McKinsey, a sobering reality check is needed.
While the consulting firm’s analysts agree that this technology will revolutionize financial institutions, transform the investing experience and make trading a lot cheaper, it believes there’s a real danger of the tokenization sector running before it can walk.
“There have been many false starts and challenges thus far.”
McKinsey
Bursting the bubble of keen analysts who believe that every single stock and fund on the planet will be tokenized in the blink of an eye, McKinsey questioned how much traction this fledgling sector can achieve between now and the end of the decade, which is just six years away.
“Based on our analysis, we expect that total tokenized market capitalization could reach around $2 trillion by 2030 … in a bullish scenario, this value could double to around $4 trillion, but we are less optimistic than previously published estimates.”
McKinsey
In the most pessimistic scenario set out by the authors, it’s argued that the tokenization sector could be worth as little as $1 trillion — less than Bitcoin’s current market cap.
This isn’t McKinsey trying to dismiss the technology as a fad; it’s a frank acknowledgement that adoption takes time. It took almost nine years for Bitcoin to hit $10,000 for the first time, and a little over 12 years to crack $50,000.
And in setting out what stands between tokenization and wider momentum — and the all-important “network effect” — McKinsey says early adopters are likely to face:
- Limited liquidity, with disappointing transaction volumes failing to deliver a robust market
- The added expense of issuing assets in parallel on legacy platforms
- Long-established processes being disrupted
All of this means tokenization must clear one more hurdle to reach its full potential: proving that it’s better than the current way of doing things. Using bonds as an example, the report says:
“Overcoming the cold start problem would require constructing a use case in which the digital representation of collateral delivers material benefits — including much greater mobility, faster settlement, and more liquidity.”
McKinsey