TheDinarian
Theta EdgeCloud: from Research Academia to the most Advanced AI Compute Infrastructure
March 18, 2024

Theta Labs CTO Jieyi Long and the Theta engineering team began researching AI, distributed systems, and machine learning applied to blockchain in early 2022, when we invited Zhen Xiao, Professor of Computer Science at Peking University, to join Theta as an academic advisor. Prof. Xiao has published dozens of research papers across various AI disciplines over the past 20+ years; he received his Ph.D. in Computer Science from Cornell University and completed his undergraduate studies at Peking University.

Optimized AI Compute Task Scheduling in EdgeCloud Virtualization Layer

Jieyi co-authored a research paper with Prof. Xiao's team titled "A Dual-Agent Scheduler for Distributed Deep Learning Jobs on Public Cloud via Reinforcement Learning", published in August 2023 at ACM KDD'23, one of the most prestigious international conferences on knowledge discovery and data mining. The learnings and experience from this research have found their way into the implementation of the Theta EdgeCloud virtualization layer, particularly in how AI compute task scheduling algorithms are optimized in a distributed environment.

In summary, the paper describes how cloud computing systems use powerful graphics processing units (GPUs) to train complex artificial intelligence models. To manage these systems efficiently, job scheduling is crucial: deciding the order in which tasks are run, and when and where they run on the available GPUs. This is particularly challenging when the GPUs are highly distributed around the globe. Traditional methods struggle with this complex task; the new machine learning approaches described in the paper show promise.

Traditional rule-based and heuristic-based methods often focus on only one aspect of scheduling and don't consider how tasks and GPUs interact. For instance, they don't account for differences in GPU performance or competition for resources, both of which affect how long tasks take to complete.

To address these challenges, the paper proposes a new approach called a "dual-agent scheduler." It uses two learning agents, a "Placement Agent" and an "Ordering Agent", to jointly decide task order and GPU placement and, more importantly, to account for how these decisions affect each other. It also introduces a method to model and manage differences in GPU performance. The proposed approach is tested on real-world data and shows promising results, improving the efficiency of cloud computing systems.
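To make the decomposition concrete, here is a minimal, self-contained sketch of a dual-agent scheduler in Python. The job durations, GPU speed factors, and the simple heuristic-plus-exploration policies are illustrative assumptions; the paper trains both agents with reinforcement learning rather than the hand-written rules used here.

```python
"""Minimal sketch of a dual-agent scheduler: an ordering agent picks the next
job, a placement agent picks its GPU. Jobs, GPU speeds, and the heuristic
policies below are illustrative assumptions, not the paper's actual setup."""
import random

GPU_SPEEDS = [1.0, 0.7, 1.3]            # relative speed of each (heterogeneous) GPU
JOB_HOURS = [4.0, 2.0, 6.0, 1.0, 3.0]   # GPU-hours each job needs on a speed-1.0 GPU

def makespan(ordering, placement):
    """Time at which the last GPU finishes, given a job order and placement."""
    busy_until = [0.0] * len(GPU_SPEEDS)
    for job, gpu in zip(ordering, placement):
        busy_until[gpu] += JOB_HOURS[job] / GPU_SPEEDS[gpu]
    return max(busy_until)

def run_episode(epsilon=0.2):
    """One scheduling episode: the two agents act in turn, with exploration."""
    remaining = list(range(len(JOB_HOURS)))
    busy_until = [0.0] * len(GPU_SPEEDS)
    ordering, placement = [], []
    while remaining:
        # Ordering agent: longest remaining job first (explore with prob. epsilon).
        job = (random.choice(remaining) if random.random() < epsilon
               else max(remaining, key=lambda j: JOB_HOURS[j]))
        # Placement agent: GPU that would finish this job earliest.
        gpu = (random.randrange(len(GPU_SPEEDS)) if random.random() < epsilon
               else min(range(len(GPU_SPEEDS)),
                        key=lambda g: busy_until[g] + JOB_HOURS[job] / GPU_SPEEDS[g]))
        busy_until[gpu] += JOB_HOURS[job] / GPU_SPEEDS[gpu]
        remaining.remove(job)
        ordering.append(job)
        placement.append(gpu)
    return ordering, placement

best = None
for _ in range(200):                     # stand-in for training: keep the best episode
    order, place = run_episode()
    cost = makespan(order, place)
    if best is None or cost < best[0]:
        best = (cost, order, place)

print(f"best makespan: {best[0]:.2f} hours; order={best[1]}, placement={best[2]}")
```

Even this toy version shows why the two decisions interact: the best GPU for a job depends on which jobs were ordered and placed before it.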

Tree-of-Thought: A New Approach to Solving Complex Reasoning Problems by LLMs powered by EdgeCloud

In May 2023, Jieyi authored an innovative research paper titled "Large Language Model Guided Tree-of-Thought (ToT)"; its preprint is available here. The insights and approach described in this paper could have fundamental implications for improving the reasoning capability of large language models (LLMs) on complex problems. The Tree-of-Thought (ToT) technique is inspired by how the human mind tackles complex reasoning tasks: through trial and error and a tree-like thought process, backtracking when necessary. This insight gives rise to the possibility of mirroring the human mind's strategy in complex problem-solving tasks with large language models.

Since the release of the preprint, the tree-of-thought concept has gained widespread recognition in the LLM community. The paper has already been cited by dozens of research articles, as well as several technical posts, including this Forbes article. Separately, LangChain, a highly popular open-source language model library, officially implemented ToT based on Jieyi's paper (see here for more details). Coincidentally, just two days after Jieyi uploaded his preprint, a research team from Princeton University and Google DeepMind published another widely cited "Tree of Thoughts" paper with very similar ideas. The ToT concept has inspired a line of influential work on improving LLM reasoning abilities, including Graph of Thoughts, Algorithm of Thoughts, and others. In a keynote at Microsoft Build 2023, Andrej Karpathy of OpenAI commented that the ToT approach allows an LLM to explore different reasoning paths and potentially correct its mistakes or avoid dead ends, much as the AlphaGo algorithm uses Monte Carlo Tree Search to explore multiple possible moves in the game of Go before selecting the best one.

The research paper introduces the Tree-of-Thought (ToT) framework, which enhances LLMs for solving mathematical problems such as Sudoku. The framework adds a prompter agent, a checker module, a memory module, and a ToT controller around the LLM. To solve a given problem, these modules engage in a multi-round conversation with the LLM. The memory module records the conversation and the state history of the problem-solving process, which allows the system to backtrack to previous steps of the thought process and explore other directions from there.
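As a rough illustration of how these modules interact, the sketch below implements the control loop with a mocked LLM and a toy constraint task (choosing digits that sum to a target). The module names follow the paper's description, but the prompt text, the toy task, and the backtracking policy are simplified assumptions rather than the paper's actual implementation.

```python
"""Minimal sketch of the ToT control loop: prompter, (mocked) LLM, checker,
memory, and controller. The toy task (pick digits summing to a target) and
the random proposal policy are illustrative stand-ins for a real LLM."""
import random

TARGET = 15      # toy goal: find digits that sum to this value
MAX_DEPTH = 4    # at most four digits may be used

def mock_llm(prompt):
    """Stand-in for the LLM: propose the next 'thought' (here, a digit 1-9)."""
    return random.randint(1, 9)

def checker(partial):
    """Checker module: classify a partial solution."""
    total = sum(partial)
    if total == TARGET:
        return "solved"
    if total > TARGET or len(partial) >= MAX_DEPTH:
        return "dead_end"
    return "continue"

def tot_solve(max_steps=500):
    memory = [[]]                                   # memory module: stack of visited states
    for _ in range(max_steps):
        state = memory[-1]
        prompt = f"Digits so far: {state}. Propose the next digit."   # prompter agent
        candidate = state + [mock_llm(prompt)]
        verdict = checker(candidate)
        if verdict == "solved":
            return candidate
        if verdict == "continue":
            memory.append(candidate)                # ToT controller: go deeper
        elif len(memory) > 1:
            memory.pop()                            # ToT controller: backtrack
    return None

print("solution:", tot_solve())   # e.g. [9, 2, 3, 1], which sums to 15
```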

To enable the ToT system to develop novel problem-solving strategies not found in its training data, we can potentially adopt the "self-play" technique, inspired by game-playing AI agents like AlphaGo. Unlike traditional self-supervised learning, self-play reinforcement learning allows for a broader exploration of the solution space, potentially leading to significant improvements. By introducing a "quizzer" module that generates problem descriptions for training, analogous to AlphaGo's self-played games, the ToT framework can expand its problem-solving capabilities beyond the examples in its training data.
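A heavily simplified stand-in for this idea is sketched below: a toy "quizzer" generates problems and the difficulty is raised as the solver's success rate improves. The curriculum rule and the random-guess solver are illustrative assumptions only; actual self-play training of the ToT modules would use reinforcement learning on the generated problems.

```python
"""Minimal sketch of the quizzer idea: a toy problem generator plus a
curriculum that raises difficulty as the solver improves. Both components
are illustrative stand-ins, not the actual ToT self-play training loop."""
import random

def quizzer(difficulty):
    """Generate a toy problem: a target sum and how many digits may be used."""
    digits = difficulty + 2
    return {"target": random.randint(digits, 9 * digits), "digits": digits}

def solver(problem, attempts=50):
    """Toy solver: randomly sample digit combinations; return True on success."""
    for _ in range(attempts):
        guess = [random.randint(1, 9) for _ in range(problem["digits"])]
        if sum(guess) == problem["target"]:
            return True
    return False

difficulty, history = 1, []
for _ in range(200):
    history.append(solver(quizzer(difficulty)))
    # Curriculum rule: raise difficulty once the recent success rate is high enough.
    recent = history[-20:]
    if len(recent) == 20 and sum(recent) / 20 > 0.8:
        difficulty += 1
        history.clear()

print("final difficulty reached:", difficulty)
```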

To verify the effectiveness of the proposed technique, we implemented a ToT-based solver for the Sudoku Puzzle. Experimental results show that the ToT framework can significantly increase the success rate of Sudoku puzzle solving. Our implementation of the ToT-based Sudoku solver is available on GitHub.
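For Sudoku specifically, the checker module can be purely rule-based: given a partially filled board, it only needs to confirm that no row, column, or box contains a duplicate. The sketch below shows such a checker for a 4x4 board; the 4x4 size is an illustrative simplification of the 9x9 puzzles, and the actual GitHub implementation may differ.

```python
"""A minimal rule-based checker of the kind a ToT Sudoku solver needs: it
verifies that a partially filled board (0 = empty cell) violates no rule."""

def no_duplicates(values):
    filled = [v for v in values if v != 0]
    return len(filled) == len(set(filled))

def check_partial_board(board):
    """Return True if the partial board is still consistent with Sudoku rules."""
    n = len(board)                 # board is n x n, with n a perfect square
    box = int(n ** 0.5)
    for i in range(n):
        row = board[i]
        col = [board[r][i] for r in range(n)]
        if not (no_duplicates(row) and no_duplicates(col)):
            return False
    for br in range(0, n, box):
        for bc in range(0, n, box):
            cells = [board[r][c] for r in range(br, br + box)
                                 for c in range(bc, bc + box)]
            if not no_duplicates(cells):
                return False
    return True

# Example: a consistent 4x4 partial board (0 marks an empty cell).
board = [
    [1, 0, 0, 4],
    [0, 4, 1, 0],
    [0, 1, 4, 0],
    [4, 0, 0, 1],
]
print(check_partial_board(board))  # True
```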

This work has significant implications for Theta, positioning the team as a thought leader in the AI industry and, more importantly, as a pioneer of the largest hybrid cloud decentralized AI compute infrastructure. It will enable EdgeCloud to support next-generation LLMs and other AI systems such as LLM-powered autonomous agents. Such an agent typically requires running multiple deep-learning models orchestrated by a ToT-enhanced LLM acting as the "brain" of the agent. All of these models, including the ToT-enhanced LLM, can potentially run within EdgeCloud, extending the platform's capabilities to support AI systems of ever-growing complexity.

EdgeCloud Future — a Decentralized Data Marketplace for Training AI Models

Lastly, Jieyi and team collaborated with Ali Farahanchi, Theta's lead equity investor, and researchers from USC, Texas A&M, and FedML to co-author "Proof-of-Contribution-Based Design for Collaborative Machine Learning on Blockchain", published at the 2023 IEEE International Conference on Decentralized Applications and Infrastructures, with the full paper available here. This research lays the groundwork for future work on Theta EdgeCloud, with the opportunity to implement a decentralized data marketplace for training AI models. This novel approach ensures that data contributors are fairly compensated with crypto rewards, that training data remains fully private, that the system can withstand malicious parties, and that data contributions and data quality can be verified using zero-knowledge (ZK) proofs.

This paper can be summarized with the following scenario. Imagine you need to create a new AI model, and you need data and computing power to train it. Instead of collecting all the data yourself, you want to collaborate with others who have data to contribute. But you also want to make sure everyone is fairly compensated and that the data stays private. The system needs to protect against any sneaky attempts to sabotage the model, ensure all computations are accurate, and work efficiently for everyone involved. To achieve this, the paper proposes a data marketplace built on blockchain technology, with easy access to a GPU marketplace tailored to the needs of an AI developer. Here's how it works:

Fair Compensation: Trainers are rewarded with crypto based on their contributions to training the model. This ensures everyone gets paid fairly for their input.

Privacy Protection: Data doesn’t need to be moved around, preserving its privacy. Trainers can keep their data secure while still contributing to the model.

Security Against Malicious Behavior: The system is designed to withstand attempts by malicious parties to disrupt or corrupt the model during training.

Verification: All computations in the marketplace, including assessing contributions and detecting outliers, can be verified using zero-knowledge proofs. This ensures the integrity of the process.

Efficiency and Universality: The marketplace is designed to be efficient and adaptable for different projects and participants.

In this system, a blockchain-based marketplace coordinates everything. A special processing node called an aggregator handles tasks like evaluating contributions, filtering out bad data, and adjusting settings for the model. Smart contracts on the blockchain ensure that everyone follows the rules and that honest contributors get their fair share of the rewards. The researchers have implemented and tested the system's components and shown through various experiments how it can be used effectively in real-world situations.
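To illustrate the aggregator's role, here is a minimal sketch of contribution scoring and reward splitting. The outlier threshold and the closeness-based scoring rule are illustrative assumptions, not the paper's actual proof-of-contribution mechanism, and the zero-knowledge verification step is omitted entirely.

```python
"""Minimal sketch of the aggregator role: collect trainer updates, filter
outliers, score contributions, and split a reward pool proportionally.
The scoring rule and threshold are illustrative assumptions only."""
import statistics

# Illustrative per-trainer model updates (1-D for simplicity) and a reward pool.
updates = {"alice": 0.9, "bob": 1.1, "carol": 1.0, "mallory": 25.0}
REWARD_POOL = 100.0  # e.g. tokens escrowed by the marketplace smart contract

def filter_outliers(updates, threshold=3.0):
    """Drop updates far from the median (a crude robustness check)."""
    median = statistics.median(updates.values())
    return {k: v for k, v in updates.items() if abs(v - median) <= threshold}

def contribution_scores(honest_updates):
    """Score each trainer by closeness to the aggregate (illustrative rule)."""
    aggregate = statistics.mean(honest_updates.values())
    return {k: 1.0 / (1.0 + abs(v - aggregate)) for k, v in honest_updates.items()}

def split_rewards(scores, pool):
    total = sum(scores.values())
    return {k: pool * s / total for k, s in scores.items()}

honest = filter_outliers(updates)          # mallory's outlier update is rejected
rewards = split_rewards(contribution_scores(honest), REWARD_POOL)
for trainer, amount in rewards.items():
    print(f"{trainer}: {amount:.2f} tokens")
```

In the paper's design, the honesty of these aggregator computations is what the zero-knowledge proofs and on-chain smart contracts are meant to enforce.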

In summary, Jieyi and the entire Theta engineering team truly value academic research and the experience it brings, and how it directly benefits the Theta EdgeCloud platform and products. It is exciting to see some research concepts, like the AI job scheduler, being rapidly incorporated into the first EdgeCloud release on May 1, while others have longer-term implications, such as the possibility of building ToT-powered autonomous LLM agents and a decentralized AI data marketplace for training AI models. In the meantime, Theta is committed to becoming a thought leader in the AI industry by sharing research and releasing core AI building blocks to the wider community, starting with Jieyi's deep research on Tree-of-Thought algorithms.

 

