TheDinarian
News • Business • Investing & Finance
💥 Grass: The First Ever Layer 2 Data Rollup
June 22, 2024

You can't say I didn't warn you! I have been sending you the link for $GRASS for months now. This is the equivalent of a Theta staking node, minus the staking... for now! You can still get in on this before it goes mainstream and the $GRASS token officially launches mainnet on Solana... ~The Dinarian

What Problem Does Grass Solve?

Over the past few weeks, we’ve been releasing content to explain Grass’s role in the AI stack.  As you now know, the protocol performs a number of functions that help builders access web data to train their models with.  This is the crucial first stage of the AI pipeline and the launching point for all development.  

In Grass’s case, residential devices around the world host a network of nodes that scrape and process raw data from the web.  The network cleans and converts that data into structured datasets for use in AI training.  And most importantly, it sources web data in a way that involves - and rewards - the participation of nearly a million people around the world.  Grass single-handedly created the category of AI data provisioning, and it’s the reason some of the largest AI companies in the world have chosen to work with us.  It is the Data Layer of AI.

At the same time, we’ve also spent the past few weeks reflecting on the current state of artificial intelligence.  We’ve asked ourselves about the most pressing issues it faces, and as a prominent piece of AI infrastructure ourselves, what we can do to solve them.  

Our conclusion is that the biggest problem in AI right now is a lack of data transparency.  One glance at the news will tell you why.  Ask yourself, why would an AI model equate Elon Musk with Hitler?  Or erase an entire ethnic group from world history?  Was it trained with bad data?  Or worse, with good data selectively chosen to give bad answers?

The answer is, we don’t know.  And we don’t know because there’s no way to know.  We don’t know what data these models were trained on, because no mechanism exists for proving it.  There’s no way for users to verify data provenance, because there’s no way for builders to verify it themselves.

This is the problem that Grass plans to solve, and we’re now building a layer 2 data rollup to solve it.  How, you may ask?

Allow us to explain. 

How A Layer Two Will Establish Data Provenance 

The world needs a method for proving the origin of AI training data, and that’s what Grass is now building.  Soon, every time data is scraped by Grass nodes, metadata will be recorded to verify the website it was scraped from.  This metadata will then be permanently embedded in every dataset, enabling builders to know its source with total certainty.  They can then share this lineage with their users, who can rest easier knowing that the AI models they interact with were not deliberately trained to give misleading answers.  
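To make the idea concrete, here is a loose sketch of what such a provenance check could look like from a builder's side. The function names and fingerprint layout are our own illustration, not Grass's actual scheme:

```python
import hashlib

def dataset_fingerprint(payload: bytes, source_url: str, timestamp: int) -> str:
    """Bind a dataset to its claimed origin by hashing the data and metadata together."""
    preimage = payload + source_url.encode() + str(timestamp).encode()
    return hashlib.sha256(preimage).hexdigest()

def verify_provenance(payload: bytes, source_url: str, timestamp: int,
                      recorded_fingerprint: str) -> bool:
    # Recompute the fingerprint and compare it to the one embedded at scrape time.
    return dataset_fingerprint(payload, source_url, timestamp) == recorded_fingerprint

fp = dataset_fingerprint(b"training text", "https://example.com", 1719000000)
assert verify_provenance(b"training text", "https://example.com", 1719000000, fp)
assert not verify_provenance(b"tampered text", "https://example.com", 1719000000, fp)
```

Any change to the payload or the claimed source produces a different fingerprint, which is what lets a builder pass a verifiable lineage along to users.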

This will be a big lift and involve a major expansion of our protocol as we prepare for scraping operations to reach tens of millions of web requests per minute.  Each of these will need to be validated, which will take more throughput than any L1 can provide.  That’s why we’re announcing our plan to build a layer 2 solution to handle this significant upgrade to our capabilities.  The L2 will be a sovereign rollup, featuring a ZK processor so that metadata can be batched for validation and used to provide a persistent lineage for every dataset we produce.  This is what it will take for the base layer of all AI development to advance to the next stage.  

The benefits of this are numerous: it will combat data poisoning, empower open source AI, and create a path towards user visibility into the models we interact with every day. 

Below, we'll describe the system’s basic design.

The Architecture of Grass

The easiest way to understand these upgrades is by consulting a diagram of the Grass Data Rollup.  On the left, between Client and Web Server, you see Grass’s network as it’s traditionally been defined.  Clients make web requests, which are sent through a validator and ultimately routed through Grass nodes.  Whichever website the client has requested, its server will respond to the web request, allowing its data to be scraped and sent back up the line. Then it will be cleaned, processed, and prepared for use in training the next generation of AI models.  

Back in the L2 diagram, you’ll see two major additions on the right that will accompany the launch of Grass’s sovereign layer two: The Grass Data Ledger and the ZK processor.  

Each of these has its own function, so we’ll explain them one at a time. 

  • The Grass Data Ledger

The Grass Data Ledger is where all data is ultimately stored.  It is a permanent ledger of every dataset scraped on Grass, now embedded with metadata to document its lineage from the moment of origin.  Proofs of each dataset’s metadata will be stored on Solana’s settlement layer, and the settlement data itself will also be available through the ledger.  It’s worth noting the significance of Grass having a place to store the data it scrapes - we’ll return to this shortly.

  • The ZK Processor

As we described above, the purpose of the ZK processor is to assist in recording the provenance of datasets scraped on Grass’s network.  Picture the process.

When a node on the network - in other words, a user with the Grass extension - sends a web request to a given website, the site’s server returns an encrypted response including all of the data requested by the node.  For all intents and purposes, this is when our dataset is born, and this is the moment of origin that needs to be documented.

And this is exactly the moment that is captured when our metadata is recorded.  It contains a number of fields - session keys, the URL of the website scraped, the IP address of the target website, a timestamp of the transaction, and of course the data itself.  This is all the information necessary to know beyond a shadow of a doubt that a given dataset originated from the website it claims to be from, and therefore that a given AI model is properly - and faithfully - trained.  
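As a rough illustration of a record carrying these fields (our own naming, not Grass's actual schema; we store a hash of the scraped payload rather than the raw data):

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class ScrapeRecord:
    """Hypothetical metadata attached to a single scrape (illustrative only)."""
    session_key: str   # identifies the scraping session
    url: str           # the website the data was scraped from
    target_ip: str     # IP address of the target website's server
    timestamp: int     # Unix time of the transaction
    data_hash: str     # hash of the scraped payload, standing in for the data itself

def make_record(session_key: str, url: str, target_ip: str,
                timestamp: int, payload: bytes) -> ScrapeRecord:
    # Hashing the payload lets provenance be checked without exposing the raw data.
    return ScrapeRecord(session_key, url, target_ip, timestamp,
                        hashlib.sha256(payload).hexdigest())

record = make_record("sess-123", "https://example.com/page",
                     "93.184.216.34", 1719000000, b"<html>...</html>")
print(json.dumps(asdict(record), indent=2))
```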

The ZK processor enters the equation because this data needs to be settled on-chain, yet we don’t want all of it visible to Solana validators.  Moreover, the sheer volume of web requests that will someday be performed on Grass will inevitably overwhelm the throughput capacity of any L1 - even one as capable as Solana.  Grass will soon scale to the point where tens of millions of web requests are performed every minute, and the metadata from every single one of them will need to be settled on-chain.  Committing these transactions to the L1 simply isn’t possible without a ZK processor generating proofs and batching them first.  Hence, the L2 - the only practical way to achieve what we’re setting out to do.
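A minimal sketch of the batching idea, with a plain Merkle tree standing in for the ZK proof system (which is far more involved than this):

```python
import hashlib

def _h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold a batch of metadata records into a single root for on-chain settlement."""
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Thousands of per-request metadata records collapse into one 32-byte commitment,
# which is all that needs to be posted to the L1.
batch = [f"record-{i}".encode() for i in range(10_000)]
root = merkle_root(batch)
print(root.hex())
```

The raw metadata never touches the L1; validators see only the commitment, which is the same privacy-plus-compression trade a ZK rollup makes, just without the validity proof.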

Now, why is this such a big deal?

Layer Two Benefits 

  • The Data Ledger 

The Data Ledger is significant because it marks Grass’s expansion into an additional - and fundamentally different - business model.  While the protocol will continue to vet buyers who send their own web requests and scrape their own data on the network, a growing portion of its activity will involve the data already stored on the ledger.  With this capability, Grass can now scrape data strategically curated for use in LLM training and host it on an ever-widening data repository.

This repository is the data layer of a modular AI stack, from which builders can pick and choose constituent parts to train infinitely differentiated models.  It is a microcosm of the internet itself, supplying training data that is already structured and ready to be ingested by AI.  

  • The ZK Processor 

We’ve already gone into a bit of detail about why the ZK processor matters.  By enabling us to create proofs of the metadata that documents the origin of Grass datasets, it creates a mechanism for builders and users to verify that AI models were actually trained correctly.  This is a huge deal in itself.

There is, however, one piece we didn’t mention earlier.  

In addition to documenting the websites from which datasets originated, the metadata also indicates which node on the network each request was routed through.  Significantly, this means that whenever a node scrapes the web, it can get credit for its work without revealing any identifying information about its operator.

Now, why is this important?

It’s important because once you can prove which nodes have done which work, you can start rewarding them proportionately.  Some nodes are more valuable than others.  Some scrape more data than their peers.  And these are exactly the nodes we need to incentivize to continue the breakneck expansion of the network that we’ve seen over the past few months. We believe this mechanism will significantly boost rewards in the most in-demand locations around the world, ultimately encouraging the people of those locales to sign up and exponentially increase the network’s capacity.  
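The proportional idea itself is simple arithmetic; a toy example with made-up numbers (not a real reward formula):

```python
def proportional_rewards(work: dict[str, int], pool: float) -> dict[str, float]:
    """Split a reward pool across nodes in proportion to their verified work."""
    total = sum(work.values())
    return {node: pool * amount / total for node, amount in work.items()}

# Verified per-node scrape counts become proportional shares of the pool.
rewards = proportional_rewards({"node_a": 600, "node_b": 300, "node_c": 100}, pool=1000.0)
print(rewards)  # node_a earns 60% of the pool, node_b 30%, node_c 10%
```

The proof of provenance is what makes the `work` figures trustworthy enough to pay out against.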

It should go without saying that the larger the network gets, the more capacity we have to scrape and the larger our repository of stored web data will be.  A flywheel will inevitably be produced where more data means we’ll have more to offer AI labs who need training data - thus providing the incentive for the Grass network to keep growing.  

Conclusion

To summarize, most of the high profile issues with AI today stem from a lack of visibility into how models are trained, and we believe this can be addressed by empowering open source AI with a system for verifying data provenance.  Our solution is to build the first ever layer 2 data rollup, which will make it possible to introduce a mechanism for recording metadata documenting the origin of all datasets.  

ZK proofs of this data will be stored on the L1 settlement layer, and the metadata itself will ultimately be tied to its underlying dataset, as the datasets themselves are stored on our own data ledger.  Grass provides the data layer for a modular AI stack, and these developments will lay the groundwork for greater transparency and for node rewards that are proportionate to the amount of work performed.

This update should help to communicate some of the projects we have on the horizon and clarify the thinking that drives our decision making.  We’re happy to play a part in making AI more transparent, and excited to see the many use cases that will arise for our product going forward.  These upgrades will open up a wide range of opportunities for developers, so if you or your team are interested in building on Grass, please reach out on Discord.  Thanks for your support and do stay tuned.  
