TheDinarian
News • Business • Investing & Finance
Lightweight and Flexible Data Access for Algorand
March 24, 2023

Algorand has released a new tool for blockchain data access: Conduit. Conduit is a modular, plugin-based tool and a powerful upgrade to the one-size-fits-all Indexer, allowing dapps to get exactly the data they need in an affordable deployment.

Useful, but bulky: the Indexer

The Indexer is a ready-to-go open-source tool that pulls the data from the blockchain, stores it in a database, and offers an API to serve that data. The existence of the Indexer has been a significant boon for the Algorand ecosystem, allowing anybody to easily read the Algorand blockchain.

However, the Indexer has historically had one major drawback: it is expensive to run. There are two main reasons for this:

  1. Running an Indexer requires also running an archival node that stores every block since the beginning of the blockchain.
  2. The Indexer collects the entire blockchain history (every transaction since block zero) in a Postgres database.

These facts make the Indexer a multi-terabyte deployment. A typical Indexer requires a number of expensive resources, and these costs multiply for production deployments that need redundancy, load balancing, and coverage across multiple regions.

The scale of the Indexer also makes it slow to initialize, and only capable of serving the specific queries for which it is indexed. As the Algorand blockchain has grown, it has become impractical for smaller projects to maintain their own Indexers.

Consequently, the ecosystem mostly relies on a few API/data providers. These providers run Indexers and charge dapps for their API calls. This is more economical and practical than each group running their own Indexer, but it presents other inflexibilities.

Dapps should have an accessible option to own their own data access infrastructure. This is what Conduit was built for.

Conduit, the Basics

Conduit is a new solution with several major advantages:

  1. Conduit does not require running an archival algod node.
  2. Conduit lets users filter incoming blockchain data, allowing them to collect strictly the data they need for their applications.
  3. Conduit offers a data pruning feature that, when enabled, automatically deletes old transactions.
  4. With Conduit, users can build custom data exporters that use the data destination of their choice.
  5. Conduit is designed as an extensible plugin architecture. Any community-contributed plugin can be integrated by anyone.

Conduit allows users to configure their own data pipelines for filtering, aggregation, and storage of transactions and accounts on any Algorand network.

A Conduit pipeline is composed of an importer, optional processor(s), and exporter plugins. Along with the Conduit release, the following noteworthy plugins are made available.

  • Algod importer — fetches blocks from an algod REST API.
  • Filter processor — filters data based on transaction fields.
  • Postgres exporter — writes the data to a Postgres database.
  • File writer exporter — writes the data to a file.

Configuring a Conduit pipeline means declaring which plugins to use and, where necessary, supplying each plugin's settings. For example, the filter processor requires a definition of what to filter.

This is best demonstrated with an example. See a basic walkthrough here.
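For orientation, here is a minimal sketch of what a pipeline configuration can look like, assuming the plugins named above (algod importer, filter processor, Postgres exporter). The plugin names and fields mirror the general shape documented in the Conduit repository, and the address, token, and connection string are placeholders:

```yaml
# conduit.yml: one importer, one optional processor, one exporter
importer:
  name: algod
  config:
    netaddr: "http://127.0.0.1:8080"   # algod REST endpoint (placeholder)
    token: "<algod-api-token>"

processors:
  - name: filter_processor
    config:
      filters:
        - any:
            - tag: txn.type
              expression-type: exact
              expression: "appl"        # keep only application calls

exporter:
  name: postgresql
  config:
    connection-string: "host=localhost user=conduit dbname=conduit"
```

Consult the plugin documentation for the authoritative schema of each plugin's config section.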

Conduit’s Filter Processor

The filter processor is a key new feature introduced with Conduit. It allows users to filter the transaction data based on any transaction field — transaction type, app ID, asset ID, sender, receiver, amount, etc. These filters can also be combined.

Since many transactions are submitted as part of transaction groups, the filter processor lets users choose whether to include the entire transaction group when the filter conditions are met.

The filter processor will always include inner transactions for transactions that match the specified filter conditions.
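As a sketch of combining conditions, the snippet below keeps only payment transactions from one sender above a minimum amount. The tag names and expression types follow the shape of the filter processor docs; the sender address is a placeholder:

```yaml
processors:
  - name: filter_processor
    config:
      filters:
        # "all" requires every sub-expression to match (logical AND);
        # "any" and "none" are also available.
        - all:
            - tag: txn.type
              expression-type: exact
              expression: "pay"
            - tag: txn.snd
              expression-type: exact
              expression: "<SENDER_ADDRESS>"
            - tag: txn.amt
              expression-type: greater-than
              expression: "1000000"   # microAlgos, i.e. more than 1 Algo
```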

Full details on the filter processor are here.

A New Node Configuration for Conduit: Follow Mode

Conduit is used to track data from the blockchain and make it available to the off-chain world. Every time a new block is created on-chain, Conduit is informed about every change to every piece of state since the prior block, such as new accounts created, app states updated, boxes deleted, etc.

Some dapps use an object called ApplyData to track certain kinds of state changes, but this approach is technically limited: not all changes are reflected in the object, and non-archival nodes cache ApplyData for only 4 rounds, so delaying the handling of ApplyData updates by more than roughly 15 seconds results in an unrecoverable state error.

The old Indexer architecture solved these challenges by requiring access to an archival algod node. Indexer used a “local ledger” to track the state changes from round to round, and thus avoided the incomplete ApplyData object. The drawback of this design is the need for an expensive archival node.

Conduit instead requires access to a node in a new lightweight “follow mode” configuration which replaces the need for the archival configuration. Conduit can pause and unpause this node’s round updates as required. The pause functionality ensures that the Conduit process will not miss out on any blockchain state updates. Conduit also makes use of a new “state delta” endpoint introduced in the node to eliminate the requirement for a large local ledger.
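Enabling follow mode is a one-line node configuration change. Assuming the algod configuration key is EnableFollowMode (as in the node configuration reference), a minimal config.json entry in the node's data directory looks like:

```json
{
    "EnableFollowMode": true
}
```

Restart algod after editing the file; Conduit's algod importer can then pause and unpause the node's round updates as described above.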

A node with follow mode enabled cannot participate in consensus, as votes based on paused state information would be rejected. Similarly, submitting transactions to such a node is not possible, as acceptance based on paused, outdated state information might be judged invalid by the rest of the blockchain.

Conduit as an Extensible Tool

Focusing on open-source principles and decentralization, Conduit's design encourages custom-built solutions, setting it apart from the Indexer. In this initial release, we encourage new plugin submissions via PRs to the Conduit repository. We aim for the plugin framework to inspire community involvement, allowing everyone to benefit from shared efforts. We are currently engaging the community to identify the best long-term home for externally supported plugins (join the conversation in the #conduit channel on Discord!).

We have already seen the development of a Kafka plugin by a community member (Iridium#4127 on Discord), who has this to say about Conduit:

“… it [Conduit] allows [you] to choose your … targeted product (e.g. Kafka) to quickly build a plugin and let the data flow. Mainly it’s just importing the correct library — configure your connection and use the library to send messages to your system. Receiving is already handled by Conduit.”

Comparing Deployments: Legacy Indexer vs. Conduit Architecture

Indexer, legacy architecture

  • Requires an archival algod node, which needs at least 1.1 TB of storage.
  • Requires a Postgres database holding full historical data, which needs roughly 1.5 TB of storage.

Source for the above: howbigisalgorand.com

Conduit architecture

  • Requires a node with “follow mode” enabled, which requires 40 GB of storage (like other non-archival nodes).
  • Conduit can use a Postgres database or a different data store, and the user can keep full historical data or only a subset. That is at most roughly 1.5 TB when storing the full history, and can be as little as a few GB.

The costs of these deployments will vary depending on whether users are self-hosted or using cloud providers (and vary greatly by provider). However, the storage costs will be strictly less for a Conduit-backed deployment.

Note that storage will likely be the major cost factor, and bandwidth and compute requirements are similar across both architectures.
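The storage figures above can be put side by side with a quick back-of-the-envelope calculation (using 5 GB as a stand-in for "a few GB" in the pruned case):

```python
TB = 1000  # GB per TB (decimal units, matching the figures above)

# Legacy Indexer: archival node (1.1 TB) + full-history Postgres (1.5 TB)
indexer_gb = 1.1 * TB + 1.5 * TB

# Conduit: follow-mode node (40 GB) + data store (pruned subset .. full history)
conduit_min_gb = 40 + 5          # heavily pruned: a few GB of data
conduit_max_gb = 40 + 1.5 * TB   # storing the full history

print(f"Legacy Indexer: {indexer_gb:.0f} GB")
print(f"Conduit:        {conduit_min_gb:.0f} to {conduit_max_gb:.0f} GB")
```

Even in the worst case (full history retained), the Conduit deployment needs strictly less storage than the legacy architecture.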

Continued Indexer Support

We are continuing to support existing Indexer releases, which use the old architecture (with an archival node), at this time. Users who want to keep using the Indexer but save costs by removing the need for an archival node can run an Indexer backed by Conduit; the Indexer interface remains the same. See our migration guide here.

Conduit Builds Better Apps

Conduit was designed to be flexible and extensible, intended to allow developers to build whatever data solution fits their needs. As such, Conduit has countless applications.

Want to run Conduit to support your dapp reporting needs?

Want to extend the Indexer API?

Want to power an event-driven system based on on-chain events?

Want to scale your API Provider service by using CockroachDB?

Want to dump everything to S3 and just query that?

The limitations imposed by the Indexer’s rigidity no longer apply. While Conduit doesn’t provide everything for free, it offers users the flexibility to build what they need.

Link
