TheDinarian
News • Business • Investing & Finance
How Blockchain Is Leveling the AI Playing Field
By Mitch Liu, CEO and co-founder of Theta Labs
December 06, 2024

In an industry dominated by commercial AI labs, blockchain technology is giving universities more cost-effective access to compute, allowing them to compete.

The rapid advancement of artificial intelligence has created an unprecedented divide between commercial and academic research. While Silicon Valley's tech giants pour billions into developing ever-larger language models and sophisticated AI systems, university labs increasingly find themselves unable to compete. This disparity raises serious questions about the future of AI development and who gets to shape it.

AI Labs are Being Vastly Outspent

In recent years, commercial laboratories have dramatically outspent academic institutions in AI research. In 2021, industry giants spent more than $340 billion globally on AI research and development, dwarfing the financial contributions from governments. For comparison, US government agencies (excluding the Department of Defense) invested $1.5 billion, while the European Commission allocated €1 billion (around $1.1 billion) to similar efforts.

This enormous gap in spending has given commercial labs a clear advantage, especially in terms of access to vital resources like computing power, data and talent. With these assets, companies are leading the development of advanced AI models at a scale that academic institutions struggle to match. Industry AI models are, on average, 29 times larger than those developed in universities, showcasing the stark difference in resources and capabilities.

The sheer size and complexity of these industry-driven models highlight the dominance of commercial labs in the race to develop cutting-edge artificial intelligence, leaving academic research labs trailing far behind.

The reasons for this disparity extend beyond simple economics. While commercial AI labs can operate with long-term horizons and significant risk tolerance, academic researchers must navigate complex grant cycles, institutional bureaucracies and limited budgets.

Perhaps most critically, academic institutions often lack access to the massive computing infrastructure required for cutting-edge AI research. Training large language models can cost millions in computing resources alone – a prohibitive expense for most university departments. This creates a troubling dynamic where potentially groundbreaking research ideas may never see the light of day simply due to the high cost of compute.

This cost is growing exponentially. One study by the Stanford Institute for Human-Centered Artificial Intelligence (HAI) showed that OpenAI's GPT-3 and Google's PaLM each cost less than $10 million to train, while the more recent GPT-4 and Google Gemini Ultra cost an estimated $78 million and $191 million respectively. If this roughly tenfold annual growth rate persists over the next few years, as estimated, new foundation models could soon cost billions to train.
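The arithmetic behind that projection is simple compound growth. A minimal sketch, using the cost figures cited above and the article's assumed ~10x annual multiplier (illustrative extrapolation only, not a forecast):

```python
# Extrapolate frontier-model training costs at a fixed annual multiplier.
# Base figures from the article: GPT-4 ~$78M, Gemini Ultra ~$191M.

def project_cost(base_cost_musd: float, years: int, annual_multiplier: float = 10.0) -> float:
    """Project a training cost `years` ahead at a fixed yearly multiplier."""
    return base_cost_musd * annual_multiplier ** years

base = 191.0  # Gemini Ultra, in $M
for years_ahead in range(1, 4):
    cost = project_cost(base, years_ahead)
    print(f"+{years_ahead} yr: ~${cost / 1000:.1f}B")  # e.g. +1 yr: ~$1.9B
```

At a 10x rate, even one year of growth pushes a ~$191M model past the billion-dollar mark, which is the "soon costing in the billions" claim above.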

The 2024 AI Index Report from Stanford HAI reinforces this trend, highlighting the skyrocketing costs of training AI models, the potential depletion of high-quality data, the rapid rise of foundation models and the growing shift towards open-source AI—all factors that further entrench the dominance of well-resourced companies and challenge academic institutions in keeping pace.

However, new solutions are emerging that could help level the playing field. Distributed computing infrastructure, built on decentralized architecture powered by blockchain technology, is beginning to offer researchers alternative paths to access high-performance computing resources at a fraction of traditional costs. These networks aggregate unused GPU computing power from thousands of participants worldwide, creating a shared pool of resources that can be accessed on demand.

On Decentralized Networks

Recent developments in this space are promising. Several major research universities in South Korea, including KAIST and Yonsei University, have begun utilizing Theta EdgeCloud, our decentralized computing network of over 30,000 globally distributed edge nodes, for AI research, achieving comparable results to traditional cloud services at one-half to one-third of the costs. Their early successes suggest a viable path forward for other academic institutions facing similar resource constraints.

The implications extend far beyond cost savings. When academic researchers can compete more effectively with commercial labs, it helps ensure that AI development benefits from diverse perspectives and approaches. University research typically prioritizes transparency, peer review and public good over commercial interests in the form of open-source models and public data sets – values that become increasingly important as AI systems grow more powerful and influential in society.

Consider the current debate around AI safety and ethics. While commercial labs face pressure to rapidly deploy new monetization capabilities, academic researchers often take more measured approaches, thoroughly examining potential risks and societal impacts. However, this crucial work requires significant computational resources to test and validate safety measures and sift through vast amounts of data. More affordable access to computing power could enable more comprehensive safety research and testing.

We're also seeing promising developments in specialized AI applications that might not attract commercial investment but could provide significant societal benefits. Researchers at several universities are using distributed computing networks to develop AI models for ultra-rare disease research, climate science and other public interest applications that might not have clear profit potential.

Openness and Transparency

Beyond the question of resources, academic institutions offer another crucial advantage: transparency and public accountability in their research. While commercial AI labs like OpenAI and Google Brain produce groundbreaking work, their research often occurs within closed environments where methodologies, data sources and negative results may not be fully disclosed. This isn't necessarily due to any misconduct – proprietary technology and competitive advantages are legitimate business concerns – but it does create limitations in how thoroughly their work can be examined and validated by the broader scientific community.

Academic research, by contrast, operates under different incentives. Universities typically publish comprehensive methodologies, open-source their models, share detailed results (including failed experiments) and subject their work to rigorous peer review. This openness allows other researchers to validate findings, build upon successful approaches and learn from unsuccessful ones. When KAIST AI researchers recently developed improvements to Stable Diffusion’s open-source text-to-image generative AI models for virtual clothing e-commerce applications, for example, they published complete technical documentation, public domain training data sets and methodology, enabling other institutions to replicate and enhance their work.

The distributed computing networks now emerging could help amplify these benefits of academic research. As more universities gain access to affordable computing power, we're likely to see an increase in reproducible studies, collaborative projects and open-source implementations. Many South Korean and other universities around the globe are already sharing their AI models and datasets through these networks, creating a virtuous cycle of innovation and verification.

This combination of computational accessibility and academic transparency could prove transformative. When researchers can both afford to run ambitious AI experiments and freely share their results, it accelerates the entire field's progress.

 

⚠️ Vietnam has closed 86 million bank accounts...

"Vietnam has closed 86 million bank accounts... because they refused... the digital ID."

"You had to register a digital ID with biometric data... And if you don't do it, we'll take your money, whether you like it or not."

"It's coming here. They're pushing for digital IDs."

"We had it during Covid. Oh, just get one vaccination and you'll be fine. And then they took away your permissions... and said, no, actually, you need another one. And then another one."

"The more we give away our freedom, our power, the more they'll take."

"We are on the edge of a cliff... We have to show that we are not going to allow this or accept this. Because if we all stopped going to work, or we all stopped using our cards, they'd... listen to us rather quickly."

"And that's a choice we still have at the moment. If cash disappears, that choice is gone forever."

"And I wouldn't like to think of the world where we are controlled—where we can drive, where we can fly, what food we can eat—and the bank can choose ...

It's All About The Bloodlines Retaining Control 👁 THEY Are Targeting The Children

Chelsea Clinton has launched a new podcast aimed at “debunking misinformation” on health topics like vaccines and fluoride, featuring a lineup of so-called experts.

The show, That Can’t Be True!, will cover topics like childhood vaccines, fluoride, and raw milk, with Clinton and guests aiming to dismiss “misleading” claims.

Clinton has previously admitted to working with the World Health Organization and the Gates Foundation on a massive childhood immunization campaign 👉 to catch as many kids up as possible. 👩‍👧‍👧

Built On Stellar XLM 💎 😉

Blockchain adoption demands both privacy and transparency. Stellar is built for both.

@tomerweller, SDF's Chief Product Officer, shares the path to privacy on Stellar:

Don't underestimate Stellar.
This is financial advice. 💎

👉 Coinbase just launched an AI agent for Crypto Trading

Custom AI assistants that print money in your sleep? 🔜

The future of Crypto x AI is about to go crazy.

👉 Here’s what you need to know:

💠 'Based Agent' enables creation of custom AI agents
💠 Users set up personalized agents in < 3 minutes
💠 Equipped w/ crypto wallet and on-chain functions
💠 Capable of completing trades, swaps, and staking
💠 Integrates with Coinbase’s SDK, OpenAI, & Replit

👉 What this means for the future of Crypto:

1. Open Access: Democratized access to advanced trading
2. Automated Txns: Complex trades + streamlined on-chain activity
3. AI Dominance: Est. ~80% of crypto txns done by AI agents by 2025

🚨 I personally wouldn't bet against Brian Armstrong and Jesse Pollak.

New Reggie Middleton Video..

Those (both citizens and sovereign nations) who don't own crypto/AI infrastructure (most importantly, the IP) will be relegated to a second class. Look at the forward citations of this patented invention to see how individuals can claim their piece of the pie.

https://x.com/ReggieMiddleton/status/1974107616637825225

Coinbase is excited to be partnering with @Samsung to make crypto even more accessible.

We’re offering 75M+ Samsung Galaxy users in the U.S. free access to Coinbase One to bring them onboard. And we’ve fully integrated Samsung Pay, so every Coinbase user in the U.S. can use that to buy crypto.

https://www.coinbase.com/blog/Samsung-taps-Coinbase-to-bring-crypto-to-more-than-75-million-Galaxy-users


🚨🗞️NEW: Government Shutdown Puts Crypto ETF Approvals On Ice

Routine approvals and filings are delayed while the shutdown limits @SECGov operations, @rstormsf moves to dismiss his conviction, and this week’s top stories. ⬇️

https://www.cryptoinamerica.com/p/government-shutdown-puts-crypto-etf

The Great Onboarding: US Government Anchors Global Economy into Web3 via Pyth Network

For years, the crypto world speculated that the next major cycle would be driven by institutional adoption, with Wall Street finally legitimizing Bitcoin through vehicles like ETFs. While that prediction has indeed materialized, a recent development signifies a far more profound integration of Web3 into the global economic fabric, moving beyond mere financial products to the very infrastructure of data itself. The U.S. government has taken a monumental step, cementing Web3's role as a foundational layer for modern data distribution. This door, once opened, is poised to remain so indefinitely.

The U.S. Department of Commerce has officially partnered with leading blockchain oracle providers, Pyth Network and Chainlink, to distribute critical official economic data directly on-chain. This initiative marks a historic shift, bringing immutable, transparent, and auditable data from the federal government itself onto decentralized networks. This is not just a technological upgrade; it's a strategic move to enhance data accuracy, transparency, and accessibility for a global audience.

Specifically, Pyth Network has been selected to publish Gross Domestic Product (GDP) data, starting with quarterly releases going back five years, with plans to expand to a broader range of economic datasets. Chainlink, the other key partner, will provide data feeds from the Bureau of Economic Analysis (BEA), including Real Gross Domestic Product (GDP) and the Personal Consumption Expenditures (PCE) Price Index. This crucial economic information will be made available across a multitude of blockchain networks, including major ecosystems like Ethereum, Avalanche, Base, Bitcoin, Solana, Tron, Stellar, Arbitrum One, Polygon PoS, and Optimism.

This development is closer to science fiction than traditional finance. The same oracle network, Pyth, that secures data for over 350 decentralized applications (dApps) across more than 50 blockchains, processing over $2.5 trillion in total trading volume through its oracles, is now the system of record for the United States' core economic indicators. Pyth's extensive infrastructure, spanning over 107 blockchains and supporting more than 600 applications, positions it as a trusted source for on-chain data. This is not about speculative assets; it's about leveraging proven, robust technology for critical public services.
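What makes an oracle update trustworthy is that consumers can verify it was not altered between publisher and reader. The sketch below shows that verification pattern in miniature; it uses an HMAC shared secret purely for brevity, whereas real oracle networks like Pyth use public-key signatures, so treat every detail here as a simplified stand-in.

```python
import hashlib
import hmac
import json

PUBLISHER_KEY = b"demo-publisher-key"  # hypothetical; real oracles sign with private keys

def publish(payload: dict) -> dict:
    """Serialize a data point and attach an authentication tag."""
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(PUBLISHER_KEY, body, hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def verify(update: dict) -> dict:
    """Reject any update whose body does not match its tag."""
    expected = hmac.new(PUBLISHER_KEY, update["body"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, update["sig"]):
        raise ValueError("feed update failed verification")
    return json.loads(update["body"])

update = publish({"series": "REAL_GDP", "quarter": "2024Q4", "value_pct": 2.3})
data = verify(update)  # any tampering with `body` would raise instead
```

The "system of record" claim above rests on this property: once a signed update is on-chain, anyone can re-run the check and prove the figure is the one the publisher released.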

The significance of this collaboration cannot be overstated. By bringing official statistics on-chain, the U.S. government is embracing cryptographic verifiability and immutable publication, setting a new precedent for how governments interact with decentralized technology. This initiative aligns with broader transparency goals and is supported by Secretary of Commerce Howard Lutnick, positioning the U.S. as a world leader in finance and blockchain innovation. The decision by a federal entity to trust decentralized oracles with sensitive economic data underscores the growing institutional confidence in these networks.

This is the cycle of the great onboarding. The distinction between "Web2" and "Web3" is rapidly becoming obsolete. When government data, institutional flows, and grassroots builders all operate on the same decentralized rails, we are simply talking about the internet—a new iteration, yes, but the internet nonetheless: an immutable internet where data is not only published but also verified and distributed in real-time.

Pyth Network stands as tangible proof that this technology serves a vital purpose. It demonstrates that the industry has moved beyond abstract "crypto tech" to offering solutions that address real-world needs and are now actively sought after and understood by traditional entities. Most importantly, it proves that Web3 is no longer seeking permission; it has received the highest validation a system can receive—the trust of governments and markets alike.

This is not merely a fleeting trend; it's a crowning moment in global adoption. The U.S. government has just validated what many in the Web3 space have been building towards for years: that Web3 is not a sideshow, but a foundational layer for the future. The current cycle will be remembered as the moment the world definitively crossed this threshold, marking the last great opportunity to truly say, "we were early."

🙏 Donations Accepted 🙏

If you find value in my content, consider showing your support via:

💳 PayPal: 
1) Simply scan the QR code below 📲
2) or visit https://www.paypal.me/thedinarian

🔗 Crypto Donations👇
XRP: r9pid4yrQgs6XSFWhMZ8NkxW3gkydWNyQX
XLM: GDMJF2OCHN3NNNX4T4F6POPBTXK23GTNSNQWUMIVKESTHMQM7XDYAIZT
XDC: xdcc2C02203C4f91375889d7AfADB09E207Edf809A6

US Dept of Commerce to publish GDP data on blockchain

On Tuesday during a televised White House cabinet meeting, Commerce Secretary Howard Lutnick announced the intention to publish GDP statistics on blockchains. Today Chainlink and Pyth said they were selected as the decentralized oracles to distribute the data.

Lutnick said, “The Department of Commerce is going to start issuing its statistics on the blockchain because you are the crypto President. And we are going to put out GDP on the blockchain, so people can use the blockchain for data distribution. And then we’re going to make that available to the entire government. So, all of you can do it. We’re just ironing out all the details.”

The data includes Real GDP and the PCE Price Index, which reflects changes in the prices of domestic consumer goods and services. The statistics are released monthly and quarterly. The biggest initial use will likely be by on-chain prediction markets. But as more data comes online, such as broader inflation data or interest rates from the Federal Reserve, it could be used to automate various financial instruments. Apart from using the data in smart contracts, sources of tamperproof data will become increasingly important for generative AI.
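A prediction market settling on official GDP data is the clearest example of that automation. A minimal sketch of the settlement step, assuming a binary "Real GDP growth above 2%?" market (all names and pool mechanics here are hypothetical, not any specific protocol):

```python
def settle_market(feed_value_pct: float, threshold_pct: float,
                  yes_pool: float, no_pool: float) -> dict:
    """Settle a binary market from an oracle value: the whole pot goes
    to the winning side, pro rata per unit staked."""
    total = yes_pool + no_pool
    winner = "yes" if feed_value_pct > threshold_pct else "no"
    winning_pool = yes_pool if winner == "yes" else no_pool
    return {"winner": winner, "payout_per_unit": total / winning_pool}

# Oracle reports 2.3% growth against a 2.0% threshold:
result = settle_market(feed_value_pct=2.3, threshold_pct=2.0,
                       yes_pool=600.0, no_pool=400.0)
# "yes" wins; each unit staked on yes returns 1000/600 of its stake
```

With the GDP figure delivered on-chain by an oracle, this whole step can run inside a smart contract with no human intervening on the number.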

While it would be possible to procure the data from third parties, it is always ideal to get it from the source to ensure its accuracy. Getting data directly from government sources makes it tamperproof, provided the original data feed has not been manipulated before it reaches the oracle.

Source

List Of Cardano Wallets

Well-known and actively maintained wallets supporting the Cardano blockchain are Eternl, Typhon, Vespr, Yoroi, Lace, ADAlite, NuFi, Daedalus, Gero, LodeWallet, Coin Wallet, ADAWallet, Atomic, Gem Wallet, Trust and Exodus.

Note that in case of issues, usually only queries relating to official wallets can be answered in Cardano groups across Telegram and forums. For third-party wallets, you may need to consult the specific wallet's support team.

Tips

  • It is important to ensure that you're in sole control of your wallet keys, and that the keys used can be restored via alternate wallet providers if a particular one is non-functional. Hence, pay extra attention to the Non-Custodial and Compatibility fields.
  • The score column below is strictly a count of checks against each feature listed; the impact of a specific feature (and thus the score) is up to the reader's discretion.
  • The table represents the current state on the mainnet network; any future roadmap activities are out of scope.
  • Info on individual fields can be found towards the end of the page.
  • Any field that shows partial support (e.g. the open-source field) does not score the point for that field.
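Since the score is defined as a plain count of checks, computing it is one line. A minimal sketch with hypothetical feature flags for a single wallet (partial support recorded as False, per the note above):

```python
# Hypothetical feature checklist for one wallet; field names follow the
# list on this page, values are made up for illustration.
features = {
    "non_custodial": True,
    "compatibility": True,
    "stake_control": True,
    "hardware_wallet": True,
    "open_source": False,      # partial support scores no point
    "testnets_support": False,
}

score = sum(features.values())  # True counts as 1, False as 0
print(f"{score}/{len(features)}")  # 4/6
```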

Brief info on fields above

  • Non-Custodial: wallets where payment as well as stake keys are not shared/reused by the wallet provider, and funds can be transparently verified on an explorer
  • Compatibility: whether the wallet mnemonics/keys can easily (for a non-technical user) be used outside of the specific wallet provider, in major other wallets
  • Stake Control: freedom to elect the stake pool the user delegates to (in a user-friendly way)
  • Transparent Support: easy approachability of a public interactive group (e.g. Discord/Telegram, with non-anonymous users) who can help out with support. Twitter/email support does not count for a check
  • Voting: ability to participate in the Catalyst voting process
  • Hardware Wallet: integration with at least the Ledger Nano device
  • Native Assets: ability to view native assets that belong to the wallet
  • dApp Integration: ability to interact with dApps
  • Stability: whether a large number of users have reported missing tokens/balances due to the wallet backend being out of sync
  • Testnets Support: ability to easily (for the end user) open wallets on at least one of the Cardano testnet networks
  • Custom Backend Support: ability to elect a custom backend URL as an alternate way to submit transactions created on client machines
  • Single/Multi Address Mode: ability to use/import single- as well as multiple-address modes for a wallet
  • Mobile App: availability on at least one of the popular mobile platforms
  • Desktop (app, extension, web): ways to open the wallet app on desktop PCs
  • Open Source: whether the complete wallet (all components) is open source and can be run independently

Source
