TheDinarian
News • Business • Investing & Finance
Lightweight and Flexible Data Access for Algorand
March 24, 2023

Algorand has released a new tool for blockchain data access: Conduit. Conduit is a modular plugin-based tool and a powerful upgrade to the one-size-fits-all Indexer. Conduit allows dapps to get exactly the data they need in an affordable deployment.

Useful, but bulky: the Indexer

The Indexer is a ready-to-go open-source tool that pulls the data from the blockchain, stores it in a database, and offers an API to serve that data. The existence of the Indexer has been a significant boon for the Algorand ecosystem, allowing anybody to easily read the Algorand blockchain.

However, the Indexer has historically had one major drawback: it is expensive to run. There are two main reasons for this:

  1. Running an Indexer requires also running an archival node that stores every block since the beginning of the blockchain.
  2. The Indexer collects the entire blockchain history (every transaction since block zero) in a Postgres database.

Together, these facts make the Indexer a multi-terabyte deployment. A typical Indexer requires a number of expensive resources, and these multiply for production deployments that need redundancy, load balancing, and multi-region coverage.

The scale of the Indexer also makes it slow to initialize, and only capable of serving the specific queries for which it is indexed. As the Algorand blockchain has grown, it has become impractical for smaller projects to maintain their own Indexers.

Consequently, the ecosystem mostly relies on a few API/data providers. These providers run Indexers and charge dapps for their API calls. This is more economical and practical than each group running their own Indexer, but it presents other inflexibilities.

Dapps should have an accessible option to own their own data access infrastructure. This is what Conduit was built for.

Conduit, the Basics

Conduit is a new solution with several major advantages:

  1. Conduit does not require running an archival algod node.
  2. Conduit lets users filter incoming blockchain data, allowing them to collect strictly the data they need for their applications.
  3. Conduit offers an optional data pruning feature that automatically deletes old transactions.
  4. With Conduit, users can build custom data exporters that use the data destination of their choice.
  5. Conduit is designed as an extensible plugin architecture. Any community-contributed plugin can be integrated by anyone.

Conduit allows users to configure their own data pipelines for filtering, aggregation, and storage of transactions and accounts on any Algorand network.

A Conduit pipeline is composed of an importer, optional processor(s), and exporter plugins. Along with the Conduit release, the following noteworthy plugins are made available.

  • Algod importer — fetches blocks from an algod REST API.
  • Filter processor — filters data based on transaction fields.
  • Postgres exporter — writes the data to a Postgres database.
  • File writer exporter — writes the data to a file.

Configuring a Conduit pipeline means declaring which plugins to use and, where necessary, supplying each plugin's configuration. For example, the filter processor requires a definition of what to filter.

This is best demonstrated with an example. See a basic walkthrough here.
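As a sketch, a minimal pipeline configuration wires one importer, one processor, and one exporter together. The field names below follow the general shape of the shipped plugins but are illustrative; check them against your Conduit release before use, and note that the address, token, and connection string are placeholders.

```yaml
# conduit.yml: illustrative sketch of a pipeline using the algod importer,
# the filter processor, and the postgresql exporter. Verify exact keys
# against the Conduit documentation for your version.
importer:
  name: algod
  config:
    netaddr: "http://localhost:8080"   # REST address of the follower node
    token: "your-algod-api-token"      # placeholder token
processors:
  - name: filter_processor
    config:
      filters:
        - any:
            - tag: txn.type
              expression-type: equal
              expression: pay          # keep only payment transactions
exporter:
  name: postgresql
  config:
    connection-string: "host=localhost user=conduit dbname=conduit"
```

With a file like this in place, Conduit streams each new block through the importer, drops transactions that fail the filter, and writes the remainder to Postgres.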

Conduit’s Filter Processor

The filter processor is a key new feature introduced with Conduit. It allows users to filter the transaction data based on any transaction field — transaction type, app ID, asset ID, sender, receiver, amount, etc. These filters can also be combined.

Since many transactions are submitted in groups, the filter processor lets users choose whether to include the entire transaction group when the filter conditions are met.

The filter processor will always include inner transactions for transactions that match the specified filter conditions.
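To illustrate combined filters, the sketch below matches asset transfers of one specific ASA. The tag names (`txn.type`, `txn.xaid`), the `all` combinator, and the `omit-group-transactions` option follow our reading of the filter plugin's documented shape and should be verified against your Conduit version; the asset ID is a hypothetical placeholder.

```yaml
# Illustrative filter config: keep asset-transfer transactions for one ASA,
# and also emit the rest of any matching transaction's group.
processors:
  - name: filter_processor
    config:
      omit-group-transactions: false   # include the whole group on a match
      filters:
        - all:
            - tag: txn.type
              expression-type: equal
              expression: axfer        # asset transfer transactions
            - tag: txn.xaid
              expression-type: equal
              expression: "31566704"   # hypothetical asset ID
```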

Full details on the filter processor are here.

A New Node Configuration for Conduit: Follow Mode

Conduit is used to track data from the blockchain and make it available to the off-chain world. Every time a new block is created on-chain, Conduit is informed about every change to every piece of state since the prior block, such as new accounts created, app states updated, boxes deleted, etc.

Some dapps use an object called ApplyData to track certain kinds of state changes; however, this approach is technically limited. Not all changes are reflected in the object, and ApplyData is cached for only 4 rounds on non-archival nodes, so a consumer that falls more than roughly 15 seconds behind hits an unrecoverable state error.

The old Indexer architecture solved these challenges by requiring access to an archival algod node. Indexer used a “local ledger” to track the state changes from round to round, and thus avoided the incomplete ApplyData object. The drawback of this design is the need for an expensive archival node.

Conduit instead requires access to a node in a new lightweight “follow mode” configuration which replaces the need for the archival configuration. Conduit can pause and unpause this node’s round updates as required. The pause functionality ensures that the Conduit process will not miss out on any blockchain state updates. Conduit also makes use of a new “state delta” endpoint introduced in the node to eliminate the requirement for a large local ledger.

A node with follow mode enabled cannot participate in consensus, as votes based on paused state information would be rejected. Similarly, submitting transactions to such a node is not possible, as acceptance based on paused, outdated state information might be judged invalid by the rest of the blockchain.
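The pause-and-advance loop described above can be sketched in a few lines of Python. This is a simplified illustration, not Conduit's actual implementation: the endpoint paths (`/v2/ledger/sync/{round}` to pin the node's sync round, `/v2/deltas/{round}` to fetch a round's state delta) are taken from our reading of the algod REST API for follower nodes and should be verified against your node version. The HTTP helpers are injected so the loop can be exercised without a live node.

```python
def sync_loop(post, get, start_round, num_rounds):
    """Drive a follower node from start_round for num_rounds rounds.

    post(path) and get(path) are caller-supplied HTTP helpers; in a real
    deployment they would call the follower node's REST API.
    """
    deltas = []
    rnd = start_round
    for _ in range(num_rounds):
        # Pin ("pause") the node at rnd so it cannot advance past it.
        post(f"/v2/ledger/sync/{rnd}")
        # Fetch every state change produced by round rnd.
        deltas.append(get(f"/v2/deltas/{rnd}"))
        # Moving the sync round forward releases the pin for the next round.
        rnd += 1
    return deltas
```

Because the node never advances past the pinned round until the consumer asks it to, no state delta can be missed, which is exactly the guarantee Conduit relies on.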

Conduit as an Extensible Tool

Focusing on open-source principles and decentralization, Conduit's design encourages custom-built solutions, setting it apart from the Indexer. In this initial release, we encourage new plugin submissions via PRs to the Conduit repository. We aim for the plugin framework to inspire community involvement, allowing everyone to benefit from shared efforts. We are currently engaging the community to identify the best long-term home for externally supported plugins (join the conversation in the #conduit channel on Discord!).

We have already seen the development of a Kafka plugin by a community member (Iridium#4127 on Discord), who has this to say about Conduit:

“… it [Conduit] allows [you] to choose your … targeted product (e.g. Kafka) to quickly build a plugin and let the data flow. Mainly it’s just importing the correct library — configure your connection and use the library to send messages to your system. Receiving is already handled by Conduit.”

Comparing Deployments: Legacy Indexer vs. Conduit Architecture

Indexer, legacy architecture

  • Requires an archival algod node, which requires at least 1.1 TB of storage.
  • Requires a Postgres database holding the full transaction history, roughly 1.5 TB of storage.

Source for the above: howbigisalgorand.com

Conduit architecture

  • Requires a node with “follow mode” enabled, which requires 40 GB of storage (like other non-archival nodes).
  • Conduit can use a Postgres database, or a different data store. The user can store full historical data, or a subset. This is at most 1.5 TB if storing the full history, and could be as little as a few GB.

The costs of these deployments will vary depending on whether users are self-hosted or using cloud providers (and vary greatly by provider). However, the storage costs will be strictly less for a Conduit-backed deployment.

Note that storage will likely be the major cost factor, and bandwidth and compute requirements are similar across both architectures.

Continued Indexer Support

We are continuing to support the existing Indexer releases that use the old archival-node architecture. Users who want to keep the Indexer but cut costs by dropping the archival node can run an Indexer backed by Conduit; the Indexer interface remains the same. See our migration guide here.

Conduit Builds Better Apps

Conduit was designed to be flexible and extensible, intended to allow developers to build whatever data solution fits their needs. As such, Conduit has countless applications.

Want to run Conduit to support your dapp reporting needs?

Want to extend the Indexer API?

Want to power an event-driven system based on on-chain events?

Want to scale your API Provider service by using CockroachDB?

Want to dump everything to S3 and just query that?

The limitations imposed by the Indexer’s rigidity no longer apply. While Conduit doesn’t provide everything for free, it offers users the flexibility to build what they need.

Link
