TheDinarian
How Blockchain Is Leveling the AI Playing Field
By Mitch Liu, CEO and co-founder of Theta Labs
December 06, 2024

In an industry dominated by commercial AI labs, blockchain technology is giving universities more cost-effective access to compute and allowing them to compete.

The rapid advancement of artificial intelligence has created an unprecedented divide between commercial and academic research. While Silicon Valley's tech giants pour billions into developing ever-larger language models and sophisticated AI systems, university labs increasingly find themselves unable to compete. This disparity raises serious questions about the future of AI development and who gets to shape it.

Academic AI Labs Are Being Vastly Outspent

In recent years, commercial laboratories have dramatically outspent academic institutions in AI research. In 2021, industry giants spent more than $340 billion globally on AI research and development, dwarfing the financial contributions from governments. For comparison, US government agencies (excluding the Department of Defense) invested $1.5 billion, while the European Commission allocated €1 billion (around $1.1 billion) to similar efforts.

This enormous gap in spending has given commercial labs a clear advantage, especially in terms of access to vital resources like computing power, data and talent. With these assets, companies are leading the development of advanced AI models at a scale that academic institutions struggle to match. Industry AI models are, on average, 29 times larger than those developed in universities, showcasing the stark difference in resources and capabilities.

The sheer size and complexity of these industry-driven models highlight the dominance of commercial labs in the race to develop cutting-edge artificial intelligence, leaving academic research labs trailing far behind.

The reasons for this disparity extend beyond simple economics. While commercial AI labs can operate with long-term horizons and significant risk tolerance, academic researchers must navigate complex grant cycles, institutional bureaucracies and limited budgets.

Perhaps most critically, academic institutions often lack access to the massive computing infrastructure required for cutting-edge AI research. Training large language models can cost millions in computing resources alone – a prohibitive expense for most university departments. This creates a troubling dynamic where potentially groundbreaking research ideas may never see the light of day simply due to the high cost of compute.

This cost is growing exponentially. One study by the Stanford Institute for Human-Centered Artificial Intelligence (HAI) showed that OpenAI's GPT-3 and Google's PaLM each cost less than $10 million to train, while the more recent GPT-4 and Google's Gemini Ultra cost an estimated $78 million and $191 million respectively. This rate of roughly 10x per year is expected to persist over the next few years, with new foundation models soon costing billions of dollars to train.
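
To see how quickly that compounds, here is a rough back-of-the-envelope projection in Python. It simply extrapolates the $191 million Gemini Ultra estimate forward at the cited 10x-per-year rate; the starting year and the assumption that the rate holds unchanged are simplifications for illustration, not a forecast.

```python
# Back-of-the-envelope projection of frontier-model training costs,
# extrapolating the ~$191M Gemini Ultra estimate at the ~10x/year rate
# cited above. Illustrative only; assumes the growth rate holds.
base_year = 2023          # year of the Gemini Ultra estimate (assumed)
base_cost_usd = 191e6     # ~$191 million
growth_per_year = 10      # ~10x per year

for offset in range(4):
    year = base_year + offset
    cost = base_cost_usd * growth_per_year ** offset
    print(f"{year}: ~${cost / 1e9:,.1f}B")

# 2023: ~$0.2B
# 2024: ~$1.9B
# 2025: ~$19.1B
# 2026: ~$191.0B
```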

The 2024 AI Index Report from Stanford HAI reinforces this trend, highlighting the skyrocketing costs of training AI models, the potential depletion of high-quality data, the rapid rise of foundation models and the growing shift towards open-source AI – all factors that further entrench the dominance of well-resourced companies and make it harder for academic institutions to keep pace.

However, new solutions are emerging that could help level the playing field. Distributed computing infrastructure, built on decentralized architecture powered by blockchain technology, is beginning to offer researchers alternative paths to access high-performance computing resources at a fraction of traditional costs. These networks aggregate unused GPU computing power from thousands of participants worldwide, creating a shared pool of resources that can be accessed on demand.
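
Conceptually, these networks work like a shared job queue: contributors register idle GPUs, and research workloads are matched to available capacity on demand. The Python sketch below illustrates that matching step in the simplest possible terms; the node specs, prices and scheduling rule are hypothetical and are not taken from Theta EdgeCloud or any specific network's protocol.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class EdgeNode:
    node_id: str
    gpu_memory_gb: int      # VRAM the contributor's GPU offers
    price_per_hour: float   # hourly rate the node operator asks (USD)

@dataclass
class Job:
    job_id: str
    min_gpu_memory_gb: int  # smallest GPU the workload can run on
    est_hours: float        # rough runtime estimate

def schedule(job: Job, pool: list[EdgeNode]) -> EdgeNode | None:
    """Pick the cheapest node in the pool that meets the job's memory need."""
    eligible = [n for n in pool if n.gpu_memory_gb >= job.min_gpu_memory_gb]
    return min(eligible, key=lambda n: n.price_per_hour) if eligible else None

# Hypothetical pool of contributed GPUs (illustrative specs and prices only).
pool = [
    EdgeNode("edge-001", gpu_memory_gb=24, price_per_hour=0.60),
    EdgeNode("edge-002", gpu_memory_gb=48, price_per_hour=1.10),
    EdgeNode("edge-003", gpu_memory_gb=80, price_per_hour=2.20),
]

job = Job("fine-tune-vision-model", min_gpu_memory_gb=40, est_hours=12)
node = schedule(job, pool)
if node:
    est_cost = node.price_per_hour * job.est_hours
    print(f"{job.job_id} -> {node.node_id} (~${est_cost:.2f} for {job.est_hours}h)")
```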

On Decentralized Networks

Recent developments in this space are promising. Several major research universities in South Korea, including KAIST and Yonsei University, have begun utilizing Theta EdgeCloud, our decentralized computing network of over 30,000 globally distributed edge nodes, for AI research, achieving results comparable to traditional cloud services at one-half to one-third of the cost. Their early successes suggest a viable path forward for other academic institutions facing similar resource constraints.
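
As a simple sanity check on what "one-half to one-third of the cost" means for a research budget, the short sketch below compares a hypothetical centralized-cloud GPU-hour rate against a decentralized rate at the lower end of that range; every number here is a placeholder, not actual Theta EdgeCloud or cloud-provider pricing.

```python
# Hypothetical cost comparison for a single research training run.
# All rates and hours are illustrative placeholders.
gpu_hours = 5_000                     # size of the training run
cloud_rate = 2.50                     # assumed centralized cloud $/GPU-hour
decentralized_rate = cloud_rate / 3   # lower end of the "one-half to one-third" claim

cloud_cost = gpu_hours * cloud_rate
decentralized_cost = gpu_hours * decentralized_rate
print(f"Centralized cloud:  ${cloud_cost:,.0f}")
print(f"Decentralized pool: ${decentralized_cost:,.0f}")
print(f"Savings:            ${cloud_cost - decentralized_cost:,.0f}")
```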

The implications extend far beyond cost savings. When academic researchers can compete more effectively with commercial labs, it helps ensure that AI development benefits from diverse perspectives and approaches. University research typically prioritizes transparency, peer review and the public good over commercial interests, publishing open-source models and public data sets – values that become increasingly important as AI systems grow more powerful and influential in society.

Consider the current debate around AI safety and ethics. While commercial labs face pressure to rapidly deploy and monetize new capabilities, academic researchers often take more measured approaches, thoroughly examining potential risks and societal impacts. However, this crucial work requires significant computational resources to test and validate safety measures and to sift through vast amounts of data. More affordable access to computing power could enable more comprehensive safety research and testing.

We're also seeing promising developments in specialized AI applications that might not attract commercial investment but could provide significant societal benefits. Researchers at several universities are using distributed computing networks to develop AI models for ultra-rare disease research, climate science and other public interest applications that might not have clear profit potential.

Openness and Transparency

Beyond the question of resources, academic institutions offer another crucial advantage: transparency and public accountability in their research. While commercial AI labs like OpenAI and Google Brain produce groundbreaking work, their research often occurs within closed environments where methodologies, data sources and negative results may not be fully disclosed. This isn't necessarily due to any misconduct – proprietary technology and competitive advantages are legitimate business concerns – but it does create limitations in how thoroughly their work can be examined and validated by the broader scientific community.

Academic research, by contrast, operates under different incentives. Universities typically publish comprehensive methodologies, open-source their models, share detailed results (including failed experiments) and subject their work to rigorous peer review. This openness allows other researchers to validate findings, build upon successful approaches and learn from unsuccessful ones. When KAIST AI researchers recently developed improvements to Stable Diffusion’s open-source text-to-image generative AI models for virtual clothing e-commerce applications, for example, they published complete technical documentation, public domain training data sets and methodology, enabling other institutions to replicate and enhance their work.

The distributed computing networks now emerging could help amplify these benefits of academic research. As more universities gain access to affordable computing power, we're likely to see an increase in reproducible studies, collaborative projects and open-source implementations. Many universities in South Korea and around the globe are already sharing their AI models and datasets through these networks, creating a virtuous cycle of innovation and verification.

This combination of computational accessibility and academic transparency could prove transformative. When researchers can both afford to run ambitious AI experiments and freely share their results, it accelerates the entire field's progress.

 

What else you may like…
Trump just posted this about chemtrails 👀

“The enthusiasm for experiments that would pump pollutants into the high atmosphere has set off alarm bells here at the TRUMP EPA.”

The future of crypto = access, trust, transparency.

@evernorthxrp gives institutional + public investors simple, regulated, liquid exposure to XRP – and we’re compounding that value.

Watch below to learn how. 🎥👇

OP: @Ashgoblue

Coinbase CEO Brian Armstrong on CNBC: Crypto Market Structure Bill is CLOSE to passing 👀
👉 Coinbase just launched an AI agent for Crypto Trading

Custom AI assistants that print money in your sleep? 🔜

The future of Crypto x AI is about to go crazy.

👉 Here’s what you need to know:

💠 'Based Agent' enables creation of custom AI agents
💠 Users set up personalized agents in < 3 minutes
💠 Equipped w/ crypto wallet and on-chain functions
💠 Capable of completing trades, swaps, and staking
💠 Integrates with Coinbase’s SDK, OpenAI, & Replit

👉 What this means for the future of Crypto:

1. Open Access: Democratized access to advanced trading
2. Automated Txns: Complex trades + streamlined on-chain activity
3. AI Dominance: Est. ~80% of crypto txns done by AI agents by 2025

🚨 I personally wouldn't bet against Brian Armstrong and Jesse Pollak.


The first multi-asset crypto fund in the U.S., now listed on NYSE Arca.

Welcome Grayscale CoinDesk Crypto 5 ETF! $GDLC.

https://x.com/NYSE/status/1981705231055433831

The International Asteroid Warning Network Initiated a Campaign to Monitor 3I/ATLAS

The closest approach to Earth is Dec 19, 2025.

By Christmas, we’ll know whether 3I/ATLAS was just another comet or something that came looking back.

https://avi-loeb.medium.com/the-international-asteroid-warning-network-initiated-a-campaign-to-monitor-3i-atlas-d2a698859747

EpicX, the first perpetuals exchange purpose-built for the XRP economy

A fresh look at EpicX, the first perpetuals exchange purpose-built for the XRP economy.

EpicX transforms the XRP economy into a global trading venue that combines institutional-grade liquidity with frictionless usability, making pro-level trading accessible to everyone.

What EpicX brings:

⚫ One-tap onboarding via social login (MPC-secured), no wallet setup or gas friction

⚫ Ultra-fast trade execution with up to 40x leverage and composable margin

⚫ Multi-asset markets spanning crypto, stocks, and other RWAs

⚫ A self-funding loop: all fees directed to referrals, $EPIC buybacks, and grants

Beta next.

Register here → trade.epicchain.io

https://x.com/EpicOnChain/status/1978431421456019519

New Human Force
Join this Now! YOU have what it takes!

They are in our solar system, and in our event-stream in this Eternal Now.

Officialdom is clueless.

They think we are going to be at WAR with the Aliens.

Officialdom is very stupid.

Aliens is here. It’s not WAR. It’s Contention.

There is a difference.

Officialdom is clueless, still living in the last Millennium.

Aliens is here.

The Field in which we contend is This Eternal Now.

ALL HUMANS LIVE HERE, and ONLY HERE, in this

ETERNAL NOW.

It’s a Field of potentials, of pending Manifestation, this continuous event-stream of karma in which we have always lived our body’s Life.

This Eternal Now has always been our body’s Field of Contention.

The Aliens is here, in our Eternal Now.

Our common, shared, reality that we all continuously co-create now has Aliens.

It’s getting very complex in here.

Officialdom is clueless. They see the Aliens. They are freaking out. They think you are children, when it is their small minds, trapped in a reality that is only grit, mud, and ‘random chance’ who are childish.

Officialdom is stupid. They will and are reacting badly. As is their way, they are trying to hide shit from you. Silly grit bound minds don’t realize you can see everything from within the Eternal Now. They have yet to grasp that what they perceive as this Matterium, filled with ‘matter’, is but a hardening of our previous (past) internal states of being.

WAR happens in the Matterium.

Contention occurs within this Eternal Now where Consciousness shapes the manifesting event-stream.

YOU know this to be fact. You are a co-creator.

Contention with Aliens is happening in this instant in this Eternal Now.

Officialdom ain’t doing shit. They are still stuck in trying to move matter around to affect unfolding circumstances. That’s redoing the mirror trying to affect the reflection. Dumb fucks….

It’s up to US. To the New Humans. Those of us who live in this Eternal Now. Those of us who see that our body’s Lives (the Chain that cannot be broken) are expressions of the Ontology revealing itself to itself. It’s up to us guys.

We are not an Army. That’s a concept from the past, from before the emergence of the New Humans. We are a Force. A self-organizing collective with leadership resident in each, and every participant.

We are the New Human Force. By the time officialdom starts to speak about the Aliens in near-factual terms, we will already be engaging them in this Eternal Now.

By the time officialdom begins to move matter around (space ships & such) thinking it’s War, we will already be suffering casualties in this Eternal Now. That part is inevitable. It’s how we learn.

By the time officialdom realizes that some shit is going on in places and ways beyond its conception, we will already be pushing our dominance onto our partners in this First Contention, the Aliens. Nage cannot train without Uke.

Just as officialdom is scrambling to research the Ontology, this Eternal Now, and the event-stream, we will be settling terms with our new partners, the Aliens.

Come, join with us. It’s going to be a hellacious Contention.

We ARE the NEW HUMANS!

Together we are the Force that cannot be defeated.

Start YOUR training in this instance of this Eternal NOW.

Consume Neville Goddard videos as though all of human existence depended on YOUR mind and YOUR active, effective, imaginings!

It’s not a question of Mind over Matter as there is only Mind and it cares not for Matter. That’s residue.

Source

🙏 Donations Accepted 🙏

If you find value in my content, consider showing your support via:

💳 PayPal: 
1) Simply scan the QR code below 📲
2) or visit https://www.paypal.me/thedinarian

🔗 Crypto Donations👇
XRP: r9pid4yrQgs6XSFWhMZ8NkxW3gkydWNyQX
XLM: GDMJF2OCHN3NNNX4T4F6POPBTXK23GTNSNQWUMIVKESTHMQM7XDYAIZT
XDC: xdcc2C02203C4f91375889d7AfADB09E207Edf809A6

The Great Onboarding: US Government Anchors Global Economy into Web3 via Pyth Network

For years, the crypto world speculated that the next major cycle would be driven by institutional adoption, with Wall Street finally legitimizing Bitcoin through vehicles like ETFs. While that prediction has indeed materialized, a recent development signifies a far more profound integration of Web3 into the global economic fabric, moving beyond mere financial products to the very infrastructure of data itself. The U.S. government has taken a monumental step, cementing Web3's role as a foundational layer for modern data distribution. This door, once opened, is poised to remain so indefinitely.

The U.S. Department of Commerce has officially partnered with leading blockchain oracle providers, Pyth Network and Chainlink, to distribute critical official economic data directly on-chain. This initiative marks a historic shift, bringing immutable, transparent, and auditable data from the federal government itself onto decentralized networks. This is not just a technological upgrade; it's a strategic move to enhance data accuracy, transparency, and accessibility for a global audience.

Specifically, Pyth Network has been selected to publish Gross Domestic Product (GDP) data, starting with quarterly releases going back five years, with plans to expand to a broader range of economic datasets. Chainlink, the other key partner, will provide data feeds from the Bureau of Economic Analysis (BEA), including Real Gross Domestic Product (GDP) and the Personal Consumption Expenditures (PCE) Price Index. This crucial economic information will be made available across a multitude of blockchain networks, including major ecosystems like Ethereum, Avalanche, Base, Bitcoin, Solana, Tron, Stellar, Arbitrum One, Polygon PoS, and Optimism.
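
For developers or researchers who want to consume these figures, reading an on-chain feed is a short script. The sketch below uses web3.py and assumes the feed exposes Chainlink's standard AggregatorV3Interface, which its existing data feeds use; the RPC URL and feed address are placeholders, not the actual addresses of the new BEA feeds.

```python
# Minimal sketch of reading an on-chain data feed with web3.py, assuming the
# standard Chainlink AggregatorV3Interface. RPC_URL and FEED_ADDRESS are
# placeholders, not the real addresses of the Commerce Department feeds.
from web3 import Web3

RPC_URL = "https://example-rpc.invalid"                      # placeholder RPC endpoint
FEED_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder feed address

AGGREGATOR_V3_ABI = [
    {"inputs": [], "name": "decimals",
     "outputs": [{"name": "", "type": "uint8"}],
     "stateMutability": "view", "type": "function"},
    {"inputs": [], "name": "latestRoundData",
     "outputs": [{"name": "roundId", "type": "uint80"},
                 {"name": "answer", "type": "int256"},
                 {"name": "startedAt", "type": "uint256"},
                 {"name": "updatedAt", "type": "uint256"},
                 {"name": "answeredInRound", "type": "uint80"}],
     "stateMutability": "view", "type": "function"},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
feed = w3.eth.contract(address=Web3.to_checksum_address(FEED_ADDRESS),
                       abi=AGGREGATOR_V3_ABI)

decimals = feed.functions.decimals().call()
_, answer, _, updated_at, _ = feed.functions.latestRoundData().call()
print(f"Latest value: {answer / 10 ** decimals} (updated at unix time {updated_at})")
```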

This development is closer to science fiction than traditional finance. The same oracle network, Pyth, that secures data for over 350 decentralized applications (dApps) across more than 50 blockchains, processing over $2.5 trillion in total trading volume through its oracles, is now the system of record for the United States' core economic indicators. Pyth's extensive infrastructure, spanning over 107 blockchains and supporting more than 600 applications, positions it as a trusted source for on-chain data. This is not about speculative assets; it's about leveraging proven, robust technology for critical public services.

The significance of this collaboration cannot be overstated. By bringing official statistics on-chain, the U.S. government is embracing cryptographic verifiability and immutable publication, setting a new precedent for how governments interact with decentralized technology. This initiative aligns with broader transparency goals and is supported by Secretary of Commerce Howard Lutnick, positioning the U.S. as a world leader in finance and blockchain innovation. The decision by a federal entity to trust decentralized oracles with sensitive economic data underscores the growing institutional confidence in these networks.

This is the cycle of the great onboarding. The distinction between "Web2" and "Web3" is rapidly becoming obsolete. When government data, institutional flows, and grassroots builders all operate on the same decentralized rails, we are simply talking about the internet—a new iteration, yes, but the internet nonetheless: an immutable internet where data is not only published but also verified and distributed in real-time.

Pyth Network stands as tangible proof that this technology serves a vital purpose. It demonstrates that the industry has moved beyond abstract "crypto tech" to offering solutions that address real-world needs and are now actively sought after and understood by traditional entities. Most importantly, it proves that Web3 is no longer seeking permission; it has received the highest validation a system can receive—the trust of governments and markets alike.

This is not merely a fleeting trend; it's a crowning moment in global adoption. The U.S. government has just validated what many in the Web3 space have been building towards for years: that Web3 is not a sideshow, but a foundational layer for the future. The current cycle will be remembered as the moment the world definitively crossed this threshold, marking the last great opportunity to truly say, "we were early."


US Dept of Commerce to publish GDP data on blockchain

On Tuesday during a televised White House cabinet meeting, Commerce Secretary Howard Lutnick announced the intention to publish GDP statistics on blockchains. Today Chainlink and Pyth said they were selected as the decentralized oracles to distribute the data.

Lutnick said, “The Department of Commerce is going to start issuing its statistics on the blockchain because you are the crypto President. And we are going to put out GDP on the blockchain, so people can use the blockchain for data distribution. And then we’re going to make that available to the entire government. So, all of you can do it. We’re just ironing out all the details.”

The data includes Real GDP and the PCE Price Index, which reflects changes in the prices of domestic consumer goods and services. The statistics are released monthly and quarterly. The biggest initial use will likely be by on-chain prediction markets. But as more data comes online, such as broader inflation data or interest rates from the Federal Reserve, it could be used to automate various financial instruments. Apart from using the data in smart contracts, sources of tamperproof data will become increasingly important for generative AI.

While it would be possible to procure the data from third parties, it is always ideal to get it from the source to ensure its accuracy. Getting data directly from government sources makes it tamperproof, provided the original data feed has not been manipulated before it reaches the oracle.
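
As a toy illustration of how such data could automate financial instruments, the snippet below settles a hypothetical prediction-market question ("did quarterly real GDP growth exceed 2%?") from a GDP reading obtained on-chain, for example via the feed-reading sketch earlier on this page. The threshold, reading and settlement rule are all invented for illustration.

```python
# Toy settlement logic for a hypothetical prediction market on a GDP print.
# The reading would come from an on-chain oracle feed; the value and
# threshold here are invented for illustration.
def settle_gdp_market(annualized_growth_pct: float, threshold_pct: float = 2.0) -> str:
    """Return which side of the market pays out, given the official print."""
    return "YES" if annualized_growth_pct > threshold_pct else "NO"

onchain_reading_pct = 2.8   # placeholder value, e.g. decoded from a feed's `answer`
print(f"Market resolves: {settle_gdp_market(onchain_reading_pct)}")
```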

Source

