Everyone is looking for ways to improve their finances, and we often hear that crypto staking can't be profitable and stable during a bear market. At P2P we think that depends on how effectively your staking provider uses its infrastructure.
Today we want to share the story of how we became a successful Node operator (NOP) on Chainlink by continuously improving our performance metrics. We will also cover Chainlink’s oracle network, its current state and how NOPs can earn stable revenue even during a bear market.
Chainlink is the market-leading decentralised oracle network providing real-world data to smart contracts on any blockchain. Currently, Chainlink supplies data for DeFi consumers across 14 networks:
Ethereum registered the highest number of working oracles during 2022.
Within each network, oracles can provide different types of data:
Chainlink currently provides more than 2000 unique data feeds across different chains. While most of these data feeds are shared across multiple chains, some are unique to a specific chain; for example, the METIS-USD data feed is present only on the Metis network. Here’s the distribution of data feeds per network:
These data feeds are distributed among node operators on every blockchain. This is the first aspect of Chainlink’s decentralisation.
A few technical details:
Oracles continuously generate reports for data feeds by sending requests to data providers (APIs) and aggregating the responses (taking the median). Every time consumers need data, Chainlink asks one of the oracles assigned to that data feed to write the data to the blockchain. Each data-recording session is called a Round, and the chosen oracle is called the Leader of the Round. The Leader gets data from the other oracles assigned to the data feed, calculates the median value and writes it to the blockchain. If for any reason the Leader can’t do this, the next oracle becomes the Leader and takes over.
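The round mechanics above can be sketched in a few lines of Python. This is a minimal illustration, not Chainlink's implementation: the oracle names and prices are invented, and `None` stands in for an unreachable oracle.

```python
import statistics

def run_round(oracles, answers):
    """The first responsive oracle in line becomes the Leader: it collects
    the other oracles' answers, takes the median and writes it on-chain."""
    collected = [a for a in answers.values() if a is not None]
    for leader in oracles:
        if answers.get(leader) is None:
            continue  # this candidate is down; leadership passes to the next oracle
        return leader, statistics.median(collected)
    raise RuntimeError("no oracle could lead the round")

oracles = ["oracle_a", "oracle_b", "oracle_c"]
answers = {"oracle_a": None, "oracle_b": 1212.4, "oracle_c": 1213.1}
leader, value = run_round(oracles, answers)
# oracle_a is down, so oracle_b becomes the Leader and writes the median
# of the remaining answers (1212.75)
```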
It is not enough to just attract a large number of oracles, it is also important to ensure that the oracle’s data is decentralised. This is achieved by using different data sources for different oracles.
For example, the price feed for ETH-USD in Fantom is distributed between oracles and data sources (APIs) as follows:
Note: 1 means that the oracle gets data from that API; 0 means that it doesn’t get data from that source
The degree of decentralisation varies, as it depends on the number of data sources providing the price data and the number of oracles. For understanding Chainlink’s decentralisation, the key point is simply that different oracles send requests to different API services when processing the data Chainlink needs.
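The 0/1 assignment table above can be read as follows: each oracle aggregates only the sources assigned to it, so a single faulty API cannot skew every oracle's answer. Here is a minimal sketch with an invented matrix and invented prices:

```python
import statistics

# Hypothetical 0/1 assignment matrix, as in the ETH-USD example above:
# 1 means the oracle queries that API, 0 means it does not.
APIS = ["api_x", "api_y", "api_z"]
ASSIGNMENT = {
    "oracle_a": [1, 1, 0],
    "oracle_b": [0, 1, 1],
    "oracle_c": [1, 0, 1],
}

def oracle_answer(oracle, api_prices):
    """Each oracle takes the median of the sources assigned to it only."""
    prices = [api_prices[api]
              for api, used in zip(APIS, ASSIGNMENT[oracle]) if used]
    return statistics.median(prices)

api_prices = {"api_x": 1200.0, "api_y": 1202.0, "api_z": 1201.0}
answers = {o: oracle_answer(o, api_prices) for o in ASSIGNMENT}
# Different oracles see different subsets of sources, so their answers
# differ slightly even on the same honest data.
```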
P2P joined the Chainlink oracles network in 2018. We first started as a NOP in Ethereum on several data feeds and we haven't stopped growing since that time. Today we are present in 6 networks and we provide data for more than 150 feeds.
This level of growth was a serious challenge for us as a company. Looking back, it would have been impossible to become a successful Node operator without a data-driven approach. Luckily, by 2020 we had already understood that we needed to collect and analyse data about NOPs’ performance on Chainlink. In this article we will walk you through our path from the beginning.
Here are the 4 steps of a successful data-driven approach to node management:
Let's start with the first step.
The main purpose of the Chainlink protocol is to provide data for users. As we have previously mentioned, “Round” is the act of writing data to the blockchain by an Oracle.
Here is the number of rounds on the Ethereum mainnet in 2022.
This is how we compare the consumption of Chainlink’s data by different chains (for example Avalanche, Fantom, Harmony, Moonriver and Ethereum). You can notice there is a peak in the number of rounds in May (Terra collapse), June (Celsius) and November (FTT/FTX collapse). This is true for every chain:
We can track the number of rounds to measure the consumption of data supported by Chainlink for any chain/feed/oracle. For example here is the number of rounds for ETH-USD (dark blue), FTT-USD (green) and ATOM-USD (bright blue) on Ethereum’s mainnet:
It is not enough to simply provide data; our purpose on Chainlink as a NOP is to provide accurate data. For that, we need an estimate of data quality, calculated from the oracles’ answers every round. We call this metric “Deviation”. It is calculated by comparing a particular oracle’s answer in a Round to the Round’s final value (the median of all oracles’ answers). This way we can track the variance of each oracle every Round, and also calculate each oracle’s average deviation. Here is how we can compare data quality across chains:
It is important to mention that the most common deviation threshold Chainlink defines for a data feed is 0.5%, and a lot of feeds have a threshold as high as 5%.
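The Deviation metric described above reduces to a short calculation. This is a sketch under invented numbers, using a 0.5% threshold to flag answers that strayed too far from the round's final value:

```python
import statistics

def round_deviation(answers):
    """Per-oracle deviation: relative distance between an oracle's answer
    and the round's final (median) value."""
    final = statistics.median(answers)
    return {i: abs(a - final) / final for i, a in enumerate(answers)}

# Hypothetical round: three oracle answers for one price feed.
answers = [1200.0, 1201.0, 1190.0]
devs = round_deviation(answers)

THRESHOLD = 0.005  # 0.5%, the most common threshold mentioned above
outliers = [i for i, d in devs.items() if d > THRESHOLD]
# The third oracle (index 2) is ~0.83% away from the median, so it is
# the only answer flagged against the 0.5% threshold.
```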
It is also interesting to compare the performance of different NOPs by their ability to write Chainlink data on-chain. For this we use the Transaction success rate (TSR): the ratio of successful transactions to the total number of attempted transactions.
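The TSR computation itself is trivial; here is a minimal sketch with invented counts, where TSR is the share of attempted on-chain writes that succeeded:

```python
def transaction_success_rate(successful, failed):
    """TSR: share of attempted on-chain writes that succeeded."""
    total = successful + failed
    return successful / total if total else 0.0

# Hypothetical monthly counts for one NOP.
tsr = transaction_success_rate(successful=970, failed=30)
# 970 / 1000 = 0.97, i.e. a 97% success rate
```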
This is not the main subject of the current article; we plan to cover it in more detail in a future post. Today we will only outline the architecture of our ETL (Extract, Transform, Load) data pipeline:
The first 3 sources are:
Stay tuned if you want to know more about data and indexation in the future.
Every NOP is first and foremost a company, and the main goal of every business is to be profitable. Let’s look deeper into node operator economics.
Oracles get rewarded in LINK for every report to the blockchain, regardless of who was the Leader of the round. The Leader gets additional rewards for writing data to the blockchain.
The main expenses of an oracle are gas costs, infrastructure costs and human resources. An oracle pays the gas cost of writing data to the blockchain whenever it is chosen as Leader of the Round.
We can track rewards and gas costs to estimate the revenue of an oracle’s operation. Here are Chainlink’s financial metrics on Ethereum:
So the total net revenue for all of the oracles on Ethereum was $37.3M in 2022.
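As a rough sketch of this bookkeeping: rewards arrive in LINK, gas is paid in the chain's native token (ETH on Ethereum), and both are converted to USD to estimate net revenue. All amounts and prices below are invented for illustration:

```python
def net_revenue_usd(rewards_link, link_usd, gas_eth, eth_usd):
    """Net revenue = USD value of LINK rewards minus USD value of gas spent."""
    return rewards_link * link_usd - gas_eth * eth_usd

# Hypothetical month for one oracle: 5,000 LINK earned, 12 ETH spent on gas.
monthly = net_revenue_usd(rewards_link=5_000, link_usd=7.0,
                          gas_eth=12.0, eth_usd=1_250.0)
# 5000 * 7 - 12 * 1250 = 35000 - 15000 = 20000 USD
```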
Now we know the Chainlink mechanics. We also know that a bear market in crypto reduces revenue for every project. But 2022 also brought a lot of activity from scandals involving multiple projects: Terra, Celsius, FTT and so on. What if we want to understand how stable an oracle’s revenue can be during such unusual events? We would certainly want to know how much was spent on gas and how many rewards the oracles earned. It would also be useful to look at deviations, to understand the level of consensus on price data between oracles.
Let’s see what was happening with an oracle's net revenue during 2022 across 6 networks: Ethereum, Solana, Fantom, Moonriver, Harmony and Avalanche:
Here's what was happening with revenue during April-May 2022, to understand how the Terra event influenced Chainlink NOPs:
We can see that everything went up: costs, rewards and net revenue. During the commotion around Terra we saw a peak in the number of rounds, as mentioned earlier. This led to higher network utilisation and higher gas costs, but it also brought node operators more rewards and, as a result, higher net revenue.
The FTT/FTX collapse played out slightly differently. We’ve already seen that there were far more rounds for FTT data feeds, and if we dig deeper we can see the same for every asset associated with FTX, such as Solana. But what about net revenue?
It stayed in line with September and October, with no significant differences. But what about P2P specifically?
Our revenue hasn't changed much during the last 3 months.
Besides revenue, every NOP should care about its reputation in the Chainlink network. As we previously mentioned, we track reputation by 2 key metrics:
Here’s the Deviation stat for 5 networks:
The two red vertical lines mark the Terra and FTT/FTX events.
It is expected that the consistency of oracles decreases during a big market event. We can see huge deviations on Avalanche and Moonriver during the Terra collapse.
During the FTX event we can observe a deviation, although much smaller when compared to the earlier one:
We can also compare oracles based on TSR to estimate how successful oracles are in writing data to the blockchain. For example, here is the TSR for Ethereum’s mainnet:
We can observe that during November, and for most of 2022, P2P’s TSR ranged between 90% and 100% on every network except Fantom and Solana (and Moonriver in April). This is because those chains use a different transaction execution mechanism than most EVM chains. You can see that this is a strong result by comparing us to other NOPs.
Here is the distribution of different NOPs TSR for the Fantom network:
We can observe that even at its lowest point P2P was among the top NOPs.
The median TSR of all NOPs on Solana was 6.07%, and P2P’s median TSR was 6% for the same period.
In this section, we will discuss how we solve business problems by using a data-driven approach.
In August 2021 Ethereum released EIP-1559. A significant aspect of this proposal was the overhaul of the transaction fee system. For P2P it meant that we could now use EIP-1559 to prioritise transactions. We weren’t sure how this would affect the transaction success rate; in other words, did miners prefer one type of transaction over the other?
We decided to run an A/B test. The design was to switch the priority fee algorithm between Standard and EIP-1559 every 15 minutes, with both algorithms configured appropriately. As a result, we got the same transaction success rate but significantly different fee caps, as can be seen in the picture below:
This is how data analysis and data-driven approaches are applied in decision-making.
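The test design above (alternating the fee algorithm every 15 minutes) can be sketched as follows. This is a hypothetical illustration of the bucketing and comparison, not our production tooling; the transaction data is invented:

```python
from datetime import datetime, timezone

def variant(ts):
    """Assign a transaction to a variant by its 15-minute window:
    windows alternate between 'standard' and 'eip1559'."""
    window = int(ts.timestamp() // (15 * 60))
    return "eip1559" if window % 2 else "standard"

def success_rates(txs):
    """Compute the per-variant transaction success rate from
    (timestamp, succeeded) pairs."""
    stats = {"standard": [0, 0], "eip1559": [0, 0]}
    for ts, ok in txs:
        bucket = stats[variant(ts)]
        bucket[0] += ok   # successful writes
        bucket[1] += 1    # total attempts
    return {v: s / t for v, (s, t) in stats.items() if t}
```

With enough rounds, comparing the two rates (and the fee caps paid in each window) tells you whether miners treated the transaction types differently.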
Chainlink Staking v0.1 launched on December 6, 2022. In this first version there are two ways to stake LINK:
If you want to see the details, you can read this post on our blog or on Chainlink's official website. But as you can see, the community staking pool is already full and won’t be expanded until v1.0 (in 9-12 months):
That is why we are glad to offer our clients a custodial solution to earn a higher APR through P2P, with a 10% fee. Staking through P2P requires a minimum of 10k LINK and allows up to 50k LINK.
P2P Validator is a world-leading staking provider with the best industry security practices and proven expertise. We provide comprehensive due diligence on digital assets and offer only top-notch staking opportunities. At the time of the latest update, more than 1.5 billion USD is staked with P2P Validator by over 25,000 delegators across 25+ networks.