# Data Streams Architecture
Source: https://docs.chain.link/data-streams/architecture
Chainlink Data Streams delivers low-latency market data offchain that you can verify onchain. This architecture uses a pull-based design, preserving trust-minimization with cryptographic verification while offering greater flexibility compared to traditional push-based oracle designs.
Data Streams offers two implementation approaches to meet different needs:
1. **Standard API Implementation**: The default implementation where applications directly retrieve report data through REST APIs or WebSocket connections and can verify this data onchain when needed.
2. **Streams Trade Implementation**: An alternative implementation that combines Data Streams with Chainlink Automation for automated data retrieval and execution with frontrunning mitigation.
## High Level Architecture
Chainlink Data Streams has the following core components:
- **A Chainlink Decentralized Oracle Network (DON):** This DON operates similarly to the DONs that power [Chainlink Data Feeds](/data-feeds), but the key difference is that it signs and delivers reports to the Chainlink Data Streams Aggregation Network rather than delivering answers onchain directly. This allows the Data Streams DON to deliver reports more frequently for time-sensitive applications. Nodes in the DON retrieve data from many different data providers, reach a consensus about the data, sign a report including that data, and deliver the report to the Data Streams Aggregation Network.
- **The Chainlink Data Streams Aggregation Network:** The Data Streams Aggregation Network stores the signed reports and makes them available for retrieval. The network uses an [active-active multi-site deployment](/data-streams/architecture#active-active-multi-site-deployment) to ensure high availability and robust fault tolerance by operating multiple active sites in parallel. The network delivers these reports directly via API for standard implementations or to Chainlink Automation upon request for [Streams Trade](/data-streams/architecture#streams-trade-architecture) implementations.
- **The Chainlink Verifier Contract:** This contract verifies the signature from the DON to cryptographically guarantee that the report has not been altered from the time that the DON reached consensus to the point where you use the data in your application.
## Standard API Implementation Architecture
The Standard API Implementation is the default way to access Chainlink Data Streams. It provides direct access to report data through REST APIs and WebSocket connections, giving you maximum flexibility in how and when you access market data.
### Example of offchain price updates
Data Streams enables seamless offchain price updates through a mechanism designed for real-time data delivery. Here is an example of how your Client will benefit from low-latency market data directly from the Data Streams Aggregation Network:
1. The Client opens a WebSocket connection to the Data Streams Aggregation Network to subscribe to new reports with low latency.
2. The Data Streams Aggregation Network streams price reports via WebSocket, which gives the Client instant access to updated market data.
3. The Client stores the price reports in a cache for quick access and use, which preserves data integrity over time.
4. The Client regularly queries the Data Streams Aggregation Network for any missed reports to ensure data completeness.
5. The Aggregation Network sends back an array of reports to the Client.
6. The Client updates its cache to backfill any missing reports, ensuring the data set remains complete and current.
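The caching and backfill steps above (3–6) can be sketched as follows. This is a minimal illustration, not the Data Streams SDK: the report dict shape, the `observationsTimestamp` key, and the fixed report interval are assumptions made for the example.

```python
class ReportCache:
    """Client-side report cache (steps 3-6 above): store streamed reports,
    detect gaps, and backfill from a bulk query. Illustrative only -- the
    report dict shape and the `observationsTimestamp` key are assumptions."""

    def __init__(self):
        self._reports = {}  # observation timestamp -> report payload

    def add(self, report):
        # Step 3: cache every report received over the WebSocket stream.
        self._reports[report["observationsTimestamp"]] = report

    def missing_timestamps(self, start, end, interval):
        # Step 4: timestamps in [start, end] with no cached report; these
        # are the gaps to request from the Aggregation Network.
        return [t for t in range(start, end + 1, interval)
                if t not in self._reports]

    def backfill(self, reports):
        # Steps 5-6: merge the returned array of reports into the cache
        # without overwriting reports already received via WebSocket.
        for report in reports:
            self._reports.setdefault(report["observationsTimestamp"], report)
```

With this shape, a client can periodically call `missing_timestamps`, fetch the gaps over REST, and pass the result to `backfill` to keep the local data set complete.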
### Key benefits of the Standard API Implementation
- **Maximum Flexibility**: Access data only when you need it, on your own terms
- **Direct Integration**: Integrate directly with your systems using our SDKs or APIs
- **Customized Usage**: Control exactly when and how data verification happens onchain
- **Reduced Gas Costs**: Only pay for verification when absolutely necessary
- **Real-time Access**: Get sub-second data latency through WebSocket subscriptions
## Streams Trade Architecture
Streams Trade is an alternative implementation that combines Chainlink Data Streams with [Chainlink Automation](/chainlink-automation) to deliver automated trade execution with frontrunning mitigation. Chainlink Automation requests data from the Data Streams Aggregation Network and executes the transaction before the data is recorded onchain, which mitigates frontrunning. Because transactions execute only in response to the data and the verified report, the trade is executed correctly and independently from the decentralized application itself.
### Example trading flow using Streams Trade
One example of how to use Data Streams with Automation is a decentralized exchange. An example flow might work as follows:
1. A user initiates a trade by confirming an `initiateTrade` transaction in their wallet.
2. The onchain contract for the decentralized exchange responds by emitting a [log trigger](/chainlink-automation/concepts/automation-concepts#upkeeps-and-triggers) event.
3. The Automation upkeep monitors the contract for the event. When Automation detects the event, it runs the `checkLog` function specified in the upkeep contract. The upkeep is defined by the decentralized exchange.
4. The `checkLog` function uses a `revert` with a custom error called `StreamsLookup`. This approach aligns with [EIP-3668](https://eips.ethereum.org/EIPS/eip-3668#use-of-revert-to-convey-call-information) and conveys the required information through the data in the `revert` custom error.
5. Automation monitors the `StreamsLookup` custom error that triggers Data Streams to process the offchain data request. Data Streams then returns the requested signed report in the `checkCallback` function for Automation.
6. Automation passes the report to the Automation Registry, which executes the `performUpkeep` function defined by the decentralized exchange. The report is included as a variable in the `performUpkeep` function.
7. The `performUpkeep` function calls the `verify` function on the Data Streams onchain verifier contract and passes the report as a variable.
8. The verifier contract returns a `verifierResponse` bytes value to the upkeep.
9. If the response indicates that the report is valid, the upkeep executes the user's requested trade. If the response is invalid, the upkeep rejects the trade and notifies the user.
This is one example of how you can combine Data Streams and Automation, but the systems are highly configurable. You can write your own log triggers or [custom logic triggers](/chainlink-automation/guides/register-upkeep) to initiate Automation upkeeps for a wide array of events. You can configure the `StreamsLookup` to retrieve multiple reports. You can configure the `performUpkeep` function to perform a wide variety of actions using the report.
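Steps 7–9 of the flow above reduce to a verify-then-execute gate. The sketch below models that control flow in Python for clarity only; the real implementation lives in a Solidity `performUpkeep` function, and the callables here are hypothetical stand-ins for the verifier contract and the exchange's trade logic.

```python
def perform_upkeep(report, verify, execute_trade, reject_trade):
    """Model of the performUpkeep decision (steps 7-9 above). `verify`,
    `execute_trade`, and `reject_trade` are hypothetical stand-ins for the
    onchain verifier call and the exchange's own handlers."""
    response = verify(report)       # steps 7-8: verifier checks the DON signature
    if response is not None:        # step 9: valid report -> execute the trade
        return execute_trade(response)
    return reject_trade()           # invalid report -> reject and notify the user
```

The essential property is that the trade only executes against a report the verifier has accepted, never against raw offchain data.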
### Key benefits of the Streams Trade Implementation
- **Automated Execution**: No need to build execution logic - Automation handles it
- **Frontrunning Protection**: Transaction data and price data revealed atomically onchain
- **Trustless Design**: Fully decentralized solution with no centralized execution components
- **Ease of Integration**: Built-in support for standard DeFi operations
Read the [Getting Started](/data-streams/getting-started) guide to learn how to build your own smart contract that retrieves reports from Data Streams using the Streams Trade implementation.
**Note**: Before implementing Streams Trade, ensure that Chainlink Automation is available on your desired network by checking the [Automation Supported Networks page](/chainlink-automation/overview/supported-networks).
## Active-Active Multi-Site Deployment
Active-active is a system configuration strategy where redundant systems remain active simultaneously to serve requests. Incoming requests are distributed across all active resources and load-balanced to provide high availability, scalability, and fault tolerance. This strategy is the opposite of active-passive where a secondary system remains inactive until the primary system fails.
The Data Streams API services use an active-active setup as a highly available and resilient architecture across multiple distributed and fully isolated origins. This setup ensures that the services are operational even if one origin fails, which provides robust fault tolerance and high availability. This configuration applies to both the [REST API](/data-streams/reference/data-streams-api/interface-api) and the [WebSocket API](/data-streams/reference/data-streams-api/interface-ws). A global load balancer seamlessly manages the system to provide automated and transparent failovers. For advanced use cases, the service publishes available origins using HTTP headers, which enables you to interact directly with specific origin locations if necessary.
### Active-Active Setup
The API services are deployed across multiple distributed data centers. Each active deployment is fully isolated and capable of handling requests independently. This redundancy ensures that the service can withstand the failure of any single site without interrupting service availability.
### Global Load Balancer
A global load balancer sits in front of the distributed deployments. The load balancer directs incoming traffic to the healthiest available site based on real-time health checks and observed load.
- **Automated Failover:** In the event of a site failure, traffic is seamlessly rerouted to operational sites without user intervention.
- **Load Distribution:** Requests are balanced across all active sites to optimize resource usage and response times.
### Origin Publishing
To enable advanced interactions, the service includes the origin information for all available origins in the HTTP headers of API responses. This feature allows customers to explicitly target specific deployments if desired. It also allows for concurrent WebSocket consumption from multiple sites, ensuring fault-tolerant WebSocket subscriptions, low latency, and a minimized risk of report gaps.
### Example Failover Scenarios
Automatic failover handles availability and traffic routing in the following scenarios:
- **Automatic Failover:** If one of the origins becomes unavailable, the global load balancer automatically reroutes traffic to the next available origin. This process is transparent to the user and ensures uninterrupted service. During automatic failover, WebSockets experience a reconnect. Failed REST requests must be retried.
- **Manual Traffic Steering:** If you want to bypass the load balancer's routing and target a specific site, you can use the origin headers to direct your requests. This manual targeting does not disable the automated failover capabilities provided by the load balancer, so a request can still succeed even if the specified origin is unavailable.
- **Multi-origin concurrent WebSocket subscriptions:** To maintain a highly available and fault-tolerant report stream, you can subscribe to up to two available origins simultaneously. Your client then compares the latest consumed timestamp for each stream and discards duplicate reports before merging the report stream locally.
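The timestamp-based merge for concurrent subscriptions can be sketched as below, assuming each origin yields `(timestamp, payload)` tuples in ascending timestamp order (the tuple shape is an assumption for illustration, not the actual wire format):

```python
import heapq

def merge_origin_streams(stream_a, stream_b):
    """Merge reports from two origins into one stream, discarding
    duplicates by tracking the latest consumed timestamp."""
    latest = float("-inf")
    merged = []
    # heapq.merge interleaves the two pre-sorted streams by timestamp.
    for ts, payload in heapq.merge(stream_a, stream_b):
        if ts > latest:  # a report for this timestamp was not yet consumed
            latest = ts
            merged.append((ts, payload))
    return merged
```

If one origin drops reports or disconnects, the other origin's copies fill the gap, so the merged stream stays complete without double-processing any report.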
---
# Data Streams Backed xStock streams
Source: https://docs.chain.link/data-streams/backed-streams
---
# Data Streams Billing
Source: https://docs.chain.link/data-streams/billing
Chainlink Data Streams offers two billing models:
1. **Subscription model**: A subscription-based billing option.
2. **Pay-per-report model**: You pay to verify reports from Data Streams onchain using the verifier contract. You pay per report verified. If you verify multiple reports in a batch, you pay for all of the reports included in that batch.
The verification price is 0.35 USD per report. Chainlink Data Streams supports fee payments in LINK and in alternative assets, which currently include native blockchain gas tokens and their ERC20-wrapped versions. Payments made in alternative assets have a 10% surcharge compared to LINK payments.
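Assuming the pay-per-report model above, the fee for verifying a batch can be sketched as follows. The figures are USD-denominated for illustration; actual payment settles in LINK or the alternative asset at prevailing exchange rates.

```python
VERIFICATION_FEE_USD = 0.35  # per-report verification price from the docs
SURCHARGE = 0.10             # 10% surcharge for non-LINK payments

def verification_fee_usd(report_count, pay_in_link=True):
    """Fee for verifying a batch: every report in the batch is billed,
    and alternative-asset payments carry a 10% surcharge over LINK."""
    fee = VERIFICATION_FEE_USD * report_count
    return fee if pay_in_link else fee * (1 + SURCHARGE)
```

For example, verifying a batch of 10 reports costs 3.50 USD when paying in LINK and 3.85 USD when paying in an alternative asset.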
[Contact us](https://chainlinkcommunity.typeform.com/datastreams?#ref_id=docs) to learn more about Mainnet pricing and subscription options.
---
# Data Streams Best Practices
Source: https://docs.chain.link/data-streams/concepts/best-practices
This page provides best practices and recommendations for using Chainlink Data Streams effectively in your applications. These practices can help you manage risks, optimize performance, and ensure compliance with market standards.
***
## Real-World Assets (RWA)
Apply these RWA best practices when integrating or operating markets that use tokenized real-world assets. Developers and operators are responsible for assessing market integrity, implementing mitigations, and managing application-level risks — see the [Developer Responsibilities](/data-streams/developer-responsibilities) guidance for details.
### Market Hours
Markets for Real-World Assets (RWA) operate during specific hours and are subject to various market conditions that can create risks for applications. The following sections outline common market issues and how to mitigate them.
#### Market gaps
Market gaps occur when there are interruptions in trading or price discovery, leading to periods where the last available price may not reflect current market conditions. These gaps can create risks, particularly around market opens, closures, and unexpected disruptions.
#### Market close
After-hours news can cause large price jumps between trading sessions.
A large price jump at market open could cause sudden liquidations, potentially leaving the perpetual DEX with bad debt if a trader's collateral is insufficient to cover the losses.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until market open.<br>`marketStatus`: `1` (Market Closed).<br>`lastUpdateTimestamp`: Timestamp of the closing price of the last session. | Keep markets closed while `marketStatus = 1` to prevent users from trading at unfair prices.<br>Set available leverage in line with the asset's average volatility to avoid bad debt if a trader's collateral is insufficient to cover the losses. |
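A minimal sketch of this guard, combining the market-status check with a staleness check on `lastUpdateTimestamp` (the `max_staleness` threshold is a protocol-level choice, not part of the stream schema):

```python
MARKET_CLOSED = 1
MARKET_OPEN = 2

def trading_allowed(market_status, last_update, now, max_staleness):
    """Block trading while the market is closed, and treat an open market
    with a stale lastUpdateTimestamp as untradable too. max_staleness is
    a protocol-chosen threshold, not a stream field."""
    if market_status != MARKET_OPEN:
        return False  # marketStatus = 1: keep markets closed
    # Reject quotes that have not updated recently enough.
    return (now - last_update) <= max_staleness
```

The same guard also covers the sudden-failure and trading-halt cases later in this section, where `marketStatus` stays `2` but the price goes flat.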
#### Price formation at open/close
Certain assets (e.g., FX open on Sunday afternoon) experience gradual price discovery due to fragmented liquidity and delayed trading activity.
A perpetual DEX should avoid opening its market at the last close price.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until a bid/ask becomes available or a transaction occurs.<br>`marketStatus`: `2` (Market Open).<br>`lastUpdateTimestamp`: Timestamp of the closing price of the last session. | Wait until `lastUpdateTimestamp` is current before opening the market so traders don't execute on stale quotes. |
#### Sudden failures
Unexpected system outages, order execution failures, or data feed disruptions can occur.
The price will be flat during that period. If a perpetual DEX lacks a mechanism to handle such disruptions, it may struggle to determine fair prices, leading to unpredictable liquidations.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Last mid-price is repeated until a new price is available.<br>`marketStatus`: `2` (Market Open).<br>`lastUpdateTimestamp`: Timestamp of the last mid-price. | Decide whether to allow users to open/close positions when `marketStatus = 2` but `lastUpdateTimestamp` is stale. |
#### Trading halts
Stocks can be halted due to extreme volatility (e.g., limit up/down rules) or regulatory actions.
The price will be flat during that period. If a perpetual DEX lacks a mechanism to handle halts, it may struggle to determine fair prices, leading to unpredictable liquidations.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Last mid-price is repeated until a new price is available.<br>`marketStatus`: `2` (Market Open).<br>`lastUpdateTimestamp`: Timestamp of the last mid-price. | Decide whether to allow users to open/close positions when `marketStatus = 2` but `lastUpdateTimestamp` is stale. |
***
### Volatility & low liquidity
During periods of high volatility or low liquidity, price movements can become unpredictable and exaggerated. These conditions can increase the risk of sudden liquidations and bad debt accumulation, requiring careful risk management strategies.
#### Algorithmic & HFT activity
Rapid-fire trading by algos can create unpredictable price movements.
High volatility can lead to liquidation and potential bad debt accumulation.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Current mid price.<br>`marketStatus`: `2` (Market Open).<br>`lastUpdateTimestamp`: Current timestamp. | Monitor liquidation thresholds closely to prevent accumulating bad debt. |
#### Low liquidity at open/close
Reduced market depth at trading session transitions can lead to higher volatility and spreads.
High volatility can lead to liquidation and potential bad debt accumulation.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Current mid price.<br>`marketStatus`: `2` (Market Open).<br>`lastUpdateTimestamp`: Current timestamp. | Monitor liquidation thresholds closely to prevent accumulating bad debt. |
***
### Corporate actions
Corporate actions are events initiated by publicly traded companies that can significantly impact stock prices and trading behavior. These actions are usually announced outside regular trading hours and can cause substantial price movements when markets reopen. Users should monitor these events closely as they can lead to sudden price adjustments that may trigger unexpected liquidations or require position modifications.
#### Bankruptcy & delisting
Bankruptcy can lead to delisting or complete loss of equity value.
Delisting will zero out prices for the asset.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until a new price is available.<br>`marketStatus`: `1` (Market Closed).<br>`lastUpdateTimestamp`: Closing timestamp of the last session. | Monitor delisting news during `marketStatus = 1` and close markets permanently once confirmed. |
#### Spin-offs
When a company spins off a business unit into a separate publicly traded entity, the parent company's stock may adjust accordingly, while the spun-off company's shares begin trading independently.
Positions may need to be manually adjusted if the DEX doesn't support tracking the new entity.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until the first post-spin trade.<br>`marketStatus`: `1` (Market Closed).<br>`lastUpdateTimestamp`: Last close. | Monitor spin-off and split announcements while `marketStatus = 1`.<br>Auto-pause the market if the first post-event price moves by more than X% from the prior close, update positions, then reopen.<br>If automatic adjustment isn't possible, disable leverage during the event window to prevent unfair liquidations. |
#### Stock splits & reverse splits
A stock split increases the number of shares while reducing the price per share (e.g., 2-for-1 split), often making shares more accessible to investors. A reverse split does the opposite, consolidating shares to increase the price per share.
A 2-for-1 split would reduce the price by 50% from the previous trading session; any leveraged user could get liquidated.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until the split-adjusted price prints.<br>`marketStatus`: `1` (Market Closed).<br>`lastUpdateTimestamp`: Last close. | Monitor spin-off and split announcements while `marketStatus = 1`.<br>Auto-pause the market if the first post-event price moves by more than X% from the prior close, update positions, then reopen.<br>If automatic adjustment isn't possible, disable leverage during the event window to prevent unfair liquidations. |
#### Mergers & acquisitions (M\&A)
If a company is being acquired, its stock price may rise to reflect the acquisition premium. The acquiring company's stock might fluctuate based on investor sentiment regarding the deal's financial and strategic impact.
Announcements can cause sharp price spikes or sustained moves.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until a new price prints.<br>`marketStatus`: `1` (Market Closed).<br>`lastUpdateTimestamp`: Last close. | Monitor liquidation thresholds closely to prevent accumulating bad debt. |
#### Share buybacks & stock issuance
Reduced share supply from a buyback can drive stock prices higher, while an increase in share supply can lead to price dilution.
Announcements can cause sharp price spikes or sustained moves.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until a new price prints.<br>`marketStatus`: `1` (Market Closed).<br>`lastUpdateTimestamp`: Last close. | Monitor liquidation thresholds closely to prevent accumulating bad debt. |
#### Dividends
A company's stock price typically adjusts to reflect dividend payments. For example, when a company declares a 10% dividend, its stock price often drops by a similar amount on the ex-dividend date, as new buyers are no longer entitled to that dividend.
Announcements can cause sharp price spikes or sustained moves.
| Data Stream behavior | User guidance |
| :------------------- | :------------ |
| `midPrice`: Closing price is repeated until the ex-date trade prints.<br>`marketStatus`: `1` (Market Closed).<br>`lastUpdateTimestamp`: Last close. | Monitor liquidation thresholds closely to prevent accumulating bad debt. |
***
### Handling stock splits for tokenized assets
Corporate actions, such as stock splits and reverse splits, require precise handling for tokenized assets to ensure price continuity and avoid disruptions. These events alter per‑share pricing while leaving the underlying economic exposure unchanged. They can produce abrupt per‑share price moves and must be handled carefully to avoid incorrect onchain price computations and unexpected liquidations.
In the [v10 report schema](/data-streams/reference/report-schema-v10), continuity is preserved by staging a multiplier change with a scheduled `activationDateTime` so the Theoretical Price (`price` \* `currentMultiplier`) remains continuous. Split ratios are typically known in advance, but activation may occur while markets are closed, so some external price sources may not reflect the split until trading resumes.
#### Guiding principle
Follow these principles when handling multiplier changes during corporate actions:
1. The protocol considers the Theoretical Price as `price` \* `currentMultiplier`.
2. Ahead of the event, `newMultiplier` and `activationDateTime` are staged.
3. At `activationDateTime` (Unix), `currentMultiplier` becomes `newMultiplier`.
- The underlying `price` from traditional markets should start reflecting the split the next time trading opens, so at the next `price` update, the Theoretical Price should remain continuous.
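The principles above can be sketched as follows. The dict keys mirror the v10 report schema field names, but the values and the dict representation itself are hypothetical, for illustration only:

```python
def theoretical_price(price, current_multiplier):
    """Principle 1: the Theoretical Price is price * currentMultiplier."""
    return price * current_multiplier

def apply_activation(report, now):
    """Principles 2-3: once activationDateTime passes, currentMultiplier
    takes the value of the staged newMultiplier and the activation time
    resets to 0. Keys mirror the v10 schema field names."""
    if report["activationDateTime"] and now >= report["activationDateTime"]:
        report = dict(report,
                      currentMultiplier=report["newMultiplier"],
                      activationDateTime=0)
    return report
```

For a hypothetical 10:1 split: before activation, `price = 200` and `currentMultiplier = 1` give a Theoretical Price of 200. After activation the multiplier becomes 10, and once the market reopens and `price` prints the split-adjusted 20, the Theoretical Price is 200 again, so continuity is preserved.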
#### Example (10:1 split, AAPL)
The following hypothetical scenario demonstrates how a 10:1 AAPL stock split is handled through the staged multiplier system, showing the progression from announcement through protocol reopening with proper price continuity maintained throughout.
The following timeline outlines the key events and actions taken at each stage:
- **T-2**: [Split announcement and multiplier staging](#announcement-t-2)
- **T-1**: [Protocol preparation and monitoring setup](#protocol-engagement-t-1)
- **T0**: [Multiplier activation (split effective date)](#activation-t0)
- **T+1**: [Market reopening with adjusted prices](#market-reopening-t1)
- **T+2**: [Protocol resumption after verification](#protocol-reopening-t2)
##### Announcement (T-2)
A 10:1 AAPL stock split is announced. [The report](/data-streams/reference/report-schema-v10) updates to stage the split:
- `newMultiplier` is set to 10x the value of `currentMultiplier`.
- `activationDateTime` is set to the Unix timestamp of the split.
- `currentMultiplier` is unaffected until activation.
##### Protocol engagement (T-1)
At this stage, monitor for changes to `activationDateTime` and inspect the staged values so you can prepare appropriate action, such as pausing the protocol around `activationDateTime` to ensure the stock split is handled correctly.
##### Activation (T0)
When the provider applies the split, [the report](/data-streams/reference/report-schema-v10) updates:
- `newMultiplier` remains the current value.
- `activationDateTime` is set to `0`.
- `currentMultiplier` is updated to the same value as `newMultiplier`.
If activation occurs while the underlying market is closed, prices may still show the pre‑event last trade. Do not compute the Theoretical Price during this pre-adjustment window. Monitor `marketStatus` and keep the protocol paused until the first post‑event trade prints and the Theoretical Price is continuous.
##### Market reopening (T1)
The stock split has taken effect. Generally, this occurs after the market closes or over the weekend, meaning `price` may not yet reflect the new economic value per share. Upon the market reopening, `price` should start reflecting the split-adjusted value.
##### Protocol reopening (T2)
Users should pause markets before `activationDateTime` and keep them paused until:
- The market has reopened (monitor `marketStatus`)
- `price` has updated in line with the split ratio (e.g., 10:1)
- You have confirmed that the Theoretical Price matches expectations
After all of the above checks have been confirmed, users can unpause their protocol and resume normal operation.
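These reopening checks can be sketched as below, assuming `marketStatus = 2` means open; the `tolerance` band is a protocol-chosen parameter, not part of the stream:

```python
def safe_to_unpause(market_status, price, prior_close, split_ratio, tolerance=0.01):
    """Reopening checks from the list above: the market must be open and
    `price` must have moved in line with the split ratio. split_ratio is
    new shares per old share (10 for a 10:1 split); tolerance is a
    protocol-level choice, not a schema field."""
    if market_status != 2:            # the market must have reopened
        return False
    expected = prior_close / split_ratio  # split-adjusted price
    # Confirm the Theoretical Price inputs match expectations.
    return abs(price - expected) / expected <= tolerance
```

If `price` is still at the pre-event level after the market reopens, the check fails and the protocol stays paused rather than trading on a discontinuous Theoretical Price.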
#### Activation-time convention
Each tokenized asset issuer sets its own activation time. For example, the xStocks default `activationDateTime` is 00:00 UTC on the effective date. Once `activationDateTime` is reached, `currentMultiplier` becomes `newMultiplier`.
Because underlying venues may be closed at activation, some external price sources may not reflect the split immediately. If `activationDateTime` occurs while the underlying market is closed, the report’s `currentMultiplier` will become `newMultiplier`; however, `price` can remain at the pre-event level until the market reopens. During this post-activation, pre-adjustment interval (after the multiplier has changed but before the underlying `price` updates), the Theoretical Price can be incorrect. Use `marketStatus` to pause until `price` reflects the event.
#### Integrator risk & handling
Computing `price` \* `currentMultiplier` when the price has not adjusted (e.g., market closed) can produce large errors. It is critical to ensure that the Theoretical Price is again reflective of actual market conditions before allowing live trading.
Treat any multiplier change (splits, dividends, and so on) and its `activationDateTime` as a maintenance window: pause or guard the protocol, then verify post-activation conditions before resuming.
For broader guidance around market hours and event handling, refer to the [Market Hours](#market-hours) guidance.
---
# DEX State Price Streams
Source: https://docs.chain.link/data-streams/concepts/dex-state-price-streams
DEX State Price Streams are streams tailored for assets that derive most—if not all—of their liquidity from **decentralized exchanges (DEXs)**. While standard crypto price streams often incorporate centralized exchange (CEX) data and focus on assets with high trading volumes, these DEX-centric streams incorporate onchain market data to more accurately reflect the conditions under which certain "long-tail" or newly launched tokens trade.
## Why a Separate Methodology?
- **Liquidity Distribution**: Many emerging or specialized tokens have limited CEX listings but deep onchain liquidity in Automated Market Maker (AMM) pools.
- **DEX-Specific Dynamics**: AMMs use liquidity pools rather than traditional order books, meaning trading volume and liquidity conditions are captured differently than in centralized order-book markets.
- **Volume vs. Liquidity Mismatch**: Some assets may have relatively low trading volume but significant liquidity depth in DEXs. Traditional trade-based pricing algorithms are often not well optimized in these cases due to sparse activity and cannot provide continuous streaming prices. DEX State Price Streams address this by leveraging the current state of onchain liquidity to determine the execution price a trader would receive. This enables more accurate, real-time pricing even for low-volume assets.
- **Mitigating Manipulation**: Observing onchain "end of block" states and applying filters to aggregated DEX data can significantly reduce the impact of block-level price manipulation.
## High-Level Outline of the DEX State Price Methodology
1. **Asset Selection**
- Focuses on DEX-dominant or DEX-exclusive assets, generally those where 80% or more of trading volume is on decentralized exchanges.
- Uses preliminary screens to identify relevant liquidity pools (e.g., top 95% of trading volume).
2. **Onchain Pool Monitoring**
- Selects pools across multiple DEX protocols to capture an aggregate view of token price and liquidity.
- Gathers end-of-block state information (reserves, ticks, or pool-specific data) rather than raw trade-by-trade events to protect against temporary price swings or intra-block MEV.
3. **Weighted Aggregation**
- Aggregates individual pool prices using a volume-based weighting scheme.
- Each pool's long-term trading volume or liquidity depth influences its weight, helping ensure more dominant pools have a proportionally greater impact on the final price.
4. **Outlier & Time-Lag Filters**
- Applies outlier detection to remove anomalous values that deviate significantly from broader market activity.
- Introduces a mild time-lag filter to smooth short-term volatility and mitigate sudden spikes or dips due to low-liquidity trades.
5. **Liquidity Metrics**
- Incorporates advanced liquidity metrics (e.g., market depth, liquidity distribution across pools) to refine the weighting and guard against abrupt liquidity shifts.
- An emergency pause mechanism can temporarily suspend price reporting if a severe liquidity drop is detected.
6. **Final DEX State Price**
- The result is a price that reflects both onchain market activity and relevant liquidity conditions, providing a robust solution for DEX-based assets.
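The weighted-aggregation step (step 3 above) can be sketched as a volume-weighted average across pools. This is a simplified illustration of the idea only; the production methodology also applies the outlier, time-lag, and liquidity filters described in steps 4 and 5, and the pool values here are hypothetical.

```python
def weighted_dex_price(pools):
    """Aggregate per-pool end-of-block prices into one reference price
    using volume-based weights (a simplified sketch of step 3 above).
    `pools` is a list of (price, long-term volume) tuples."""
    total_weight = sum(volume for _, volume in pools)
    return sum(price * volume for price, volume in pools) / total_weight

# Hypothetical (price, long-term volume) observations for three pools:
# the dominant pool carries proportionally more weight in the result.
pools = [(1.00, 800_000), (1.02, 150_000), (0.99, 50_000)]
print(round(weighted_dex_price(pools), 4))  # → 1.0025
```

Note how the dominant pool (800k volume) pulls the aggregate toward its price, which is the intent of the weighting scheme.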
## How to Use DEX State Price Streams
You can use DEX State Price streams like any other Chainlink Data Streams. The key distinction is the report schema you need to configure: [Report Schema V3 (DEX)](/data-streams/reference/report-schema-v3-dex). In this schema, the `bid` and `ask` fields carry the same value as the `price` field, because DEX-based markets rely on continuous liquidity pools rather than order-book mechanics to establish the reference price. Aside from this difference, the integration process follows the same technical steps as any other Chainlink Data Stream.
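A consumer can enforce the schema property above as a cheap sanity check on decoded reports. This is an illustrative sketch over a plain dictionary, not the SDK's decoding API; the field names `price`, `bid`, and `ask` come from Report Schema V3 (DEX).

```python
def check_dex_report(report: dict) -> None:
    """For DEX State Price reports (Report Schema V3 DEX), `bid` and
    `ask` carry the same value as `price`; treat any divergence as a
    malformed or mis-decoded report."""
    if not (report["bid"] == report["ask"] == report["price"]):
        raise ValueError("bid/ask must equal price in a V3 (DEX) report")

check_dex_report({"price": 123_450, "bid": 123_450, "ask": 123_450})  # passes
```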
## Risk Mitigation
Although DEX State Price Streams services are powered by Chainlink decentralized oracle networks—which have a long track record of being highly secure, reliable, and accurate—users must ensure that they understand each feed’s unique update parameters (and the liquidity/volume profiles of the corresponding assets) and implement relevant [risk mitigation techniques](/data-feeds/selecting-data-feeds#risk-mitigation) based on the intended use case.
Utilizing DEX state introduces specific risks inherent to DeFi:
- **Smart Contract Risk**: Smart contracts, operating on certain blockchains like Ethereum, are autonomous code segments designed to execute transactions in a decentralized manner without the need for intermediaries. Despite their simplicity and intended security, they remain susceptible to bugs and exploitations. A flaw within the smart contract code can be exploited by malicious entities to manipulate the state price or other liquidity metrics. Such occurrences may lead to oracle price abnormality and financial losses for lending protocols.
- **Layer 2 and Cross-chain Bridge Hack Risk**: A bridge in the context of DeFi consists of smart contracts that enable the transfer of assets between different blockchains or layers. Bridges hold significant reserves of tokens to facilitate these transfers. However, they can become targets for hacks. An unauthorized withdrawal from the bridge depletes the reserves necessary for users to redeem their assets. This can lead to the implied price of the tokens—essentially the price that is expected based on available reserves and pool ratios—no longer accurately reflecting the true state or market price of the tokens.
- **External Dependency Risk**: Some DEX pool state prices rely on external exchange rates, which can come from an oracle, a ratio of queryable balances, or another calculation method. Bad actors might attempt to manipulate these exchange rates to influence the outcome of the state price or profit from the price discrepancies. This can lead to unintended consequences, such as liquidations, loss of funds, or arbitrage opportunities for attackers.
**Consumers of State Pricing must perform their own risk assessments**. They should consider factors like market depth, underlying asset security, and bridge dependencies, and then adjust protocol parameters (e.g., LTV, liquidation thresholds, supply caps) accordingly. The provided market depth metrics (when available) will aid in this assessment.
---
# Data Streams Liquidity-Weighted Bid-Ask Prices (LWBA)
Source: https://docs.chain.link/data-streams/concepts/liquidity-weighted-prices
Chainlink Data Streams provides [reports](/data-streams/reference/report-schema-v3) with *Mid* and *Liquidity-Weighted Bid and Ask (LWBA)* prices. These three prices form a pricing spread that offers protocols insight into market activity based on the current state of the order books.
**Note**: At the moment, only [Crypto streams](/data-streams/crypto-streams) provide LWBA prices.
## Bid and Ask prices
### What are Bid and Ask prices?
Bid and Ask prices appear in the order book of an exchange and represent the trading orders buyers and sellers actively submit. These prices drive potential market transactions.
On an exchange, the Mid price in the order book is derived from the midpoint between the best Bid and the best Ask price. While the Mid price serves as a useful reference, for instance, in calculating funding rates, it is important to note that it is not a price at which trades should be executed. Actual trades must match an open bid or ask.
- **Bid price**: The Bid price refers to the maximum price that a buyer is willing to pay for an asset. It represents the highest price at which a buyer would agree to buy.
- **Ask price**: The Ask price is the minimum price at which a seller is willing to sell their asset. It represents the lowest price at which a seller would agree to sell.
### Key characteristics
- **Price spread**: The difference between the Ask and the Bid prices is the *spread*. A narrower spread typically indicates a more liquid market, whereas a wider spread can signify less liquidity or higher volatility.
- **Market orders**: When traders place market orders to buy or sell immediately, they accept the best available Ask or Bid prices from the market. This interaction drives immediate transactions but can also impact the market price if the order size is substantial compared to the available volume.
- **Liquidity and volume**: Bid and Ask prices are not static and can fluctuate based on the asset's trading volume and market liquidity. High liquidity generally results in a narrower spread, which enables more trading activity near this spread and facilitates smoother transactions without substantial impacts on the market price.
### Role in Chainlink Data Streams
In Chainlink Data Streams, Bid and Ask prices play a pivotal role in the construction of Liquidity-Weighted Bid and Ask (LWBA) prices.
## What is a Liquidity-Weighted price?
A Liquidity-Weighted price considers the Bid and Ask prices based on the available liquidity at each price level in the order books. This method weights price data by the volume of assets available at each price point, and provides a more accurate reflection of market conditions where larger orders would significantly impact the price.
## What are the benefits of LWBA prices?
Liquidity-weighted Bid and Ask prices help in several key areas:
- **Accuracy**: They offer a more precise measure of the market price that considers the volume available at different price levels and allows for true market sentiment to be reflected.
- **Risk Management**: They consider the market's depth and help protocols assess and manage their risk more effectively and dynamically during periods of high volatility.
- **Efficiency**: Liquidity-weighted prices facilitate smarter order execution, which accurately reflects realistic slippage and enhances the trading experience for traders and protocols by closely mirroring actual market conditions.
## Use case examples
- **Derivatives trading**: Platforms can use LWBA prices to set more accurate strike prices in options markets or fair settlement prices in futures contracts.
- **Liquidity provision**: Automated market makers (AMMs) and other liquidity providers can use these prices to quote more accurate and realistic prices that reflect current market conditions.
- **Loan collateralization**: Lending platforms might use liquidity-weighted prices to determine more realistic valuations for collateral during loan issuance and liquidation processes, thereby reducing the risk of unfair liquidations.
## Practical examples of LWBA prices
To better understand how LWBA prices are calculated and used, consider the following examples that illustrate their application in different market scenarios. Each example explains the calculation, highlights trading and risk management benefits, and discusses the spread between Bid and Ask prices.
### Example 1: Market volatility
**Scenario**: Assume the market experiences a sudden drop, leading to a scenario where most buy orders accumulate at a lower price point. Similarly, sellers might be inclined to lower their ask prices to exit positions quickly. The order book shows 10 units at $0.99 but a more substantial 100 units at $0.90.
**Order book**:
| Type | Price | Quantity |
| ---- | ----- | -------- |
| Bid | $0.99 | 10 |
| Bid | $0.90 | 100 |
| Ask | $1.01 | 15 |
| Ask | $1.05 | 85 |
**LWBA Price Calculation**:
- Bid = ((10 x 0.99) + (100 x 0.90)) / 110 = $0.908
- Ask = ((15 x 1.01) + (85 x 1.05)) / 100 = $1.044
**Spread Calculation**: Ask - Bid = $1.044 - $0.908 = $0.136
The LWBA Bid price of $0.908 and Ask price of $1.044 provide a realistic snapshot of the market under volatile conditions. The spread of $0.136 illustrates a wider gap due to high volatility, which impacts liquidity and indicates a less efficient market. This dual perspective helps traders and protocols manage risk by pricing assets closer to the most liquid market levels and offers a more cautious approach to valuation during market drops.
### Example 2: Market stability
**Scenario**: In a stable market, traders place the bulk of buy and sell orders close to the current market prices, leading to high liquidity near these levels.
**Order book**:
| Type | Price | Quantity |
| ---- | ----- | -------- |
| Bid | $0.98 | 90 |
| Bid | $0.99 | 10 |
| Ask | $1.00 | 50 |
| Ask | $1.01 | 50 |
**LWBA Price Calculation**:
- Bid = ((10 x 0.99) + (90 x 0.98)) / 100 = $0.981
- Ask = ((50 x 1.00) + (50 x 1.01)) / 100 = $1.005
**Spread Calculation**: Ask - Bid = $1.005 - $0.981 = $0.024
The LWBA Bid price of $0.981 and Ask price of $1.005 reflect the concentration of liquidity near the top price points, with a narrow spread of $0.024 that indicates a highly liquid and efficient market. This pricing accuracy allows traders to execute orders close to their desired price points with minimal slippage and enhances transaction cost efficiency and market predictiveness in stable conditions.
### Conclusion
| Example | Weighted Bid Price | Weighted Ask Price | Spread | Market condition | Key benefit |
| ------- | ------------------ | ------------------ | ------ | ---------------- | ---------------------------------------------- |
| 1       | $0.908             | $1.044             | $0.136 | High volatility  | Aligns price with majority liquidity           |
| 2 | $0.981 | $1.005 | $0.024 | Market stability | Minimizes slippage and improves price accuracy |
In these examples, Liquidity-Weighted Bid and Ask prices, along with the spread between them, adjust dynamically based on the actual distribution of orders within the market. By reflecting real-time liquidity and volume, these prices and spreads provide protocols and traders with critical information that enhances pricing accuracy, reduces trading risks, and increases efficiency across various market conditions.
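The calculation used in the examples above is a quantity-weighted average over order-book levels. The sketch below reproduces Example 2's numbers; the function name and data shapes are illustrative, not an SDK API.

```python
def lwba(levels):
    """Liquidity-weighted average of (price, quantity) order-book levels:
    each level's price is weighted by the volume resting at that level."""
    total_qty = sum(qty for _, qty in levels)
    return sum(price * qty for price, qty in levels) / total_qty

# Order book from Example 2 (market stability)
bids = [(0.99, 10), (0.98, 90)]
asks = [(1.00, 50), (1.01, 50)]

bid, ask = lwba(bids), lwba(asks)
print(round(bid, 3), round(ask, 3), round(ask - bid, 3))  # → 0.981 1.005 0.024
```

The same function applied to Example 1's deeper, more dispersed book yields the wider spread discussed there, since the weighting pulls each side toward the levels holding most of the liquidity.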
---
# Data Streams Crypto streams
Source: https://docs.chain.link/data-streams/crypto-streams
---
# Developer Responsibilities: Market Integrity and Application Code Risks
Source: https://docs.chain.link/data-streams/developer-responsibilities
Chainlink Data Streams provides access to high-frequency market data backed by decentralized, fault-tolerant, and transparent infrastructure, where offchain data can be pulled onchain and verified by Chainlink's Verify Contract, as needed by your application. The assets priced by Chainlink Data Streams are subject to market conditions beyond the ability of Chainlink node operators to control. As such, developers are responsible for understanding market conditions and other external risks and how they can impact their products and services.
When integrating Chainlink Data Streams, developers must understand that the performance of individual Streams is subject to risks associated with both market integrity and application code.
- **Market Integrity Risks** are those associated with external market conditions impacting price behavior and data quality in unanticipated ways. Developers are solely responsible for monitoring and mitigating any potential market integrity risks.
- **Application Code Risks** are those associated with the quality, reliability, and dependencies of the code on which an application operates. Developers are solely responsible for monitoring and mitigating any potential application code risks related to their own products and services.
See the [Market Manipulation vs. Oracle Exploits](https://chain.link/education-hub/market-manipulation-vs-oracle-exploits) article for information about market integrity risks and how developers can protect their applications.
## Developer Responsibilities
Developers are responsible for maintaining the security and user experience of their applications. They must also securely manage all interactions between their applications and third-party services.
In particular, developers implementing Chainlink Data Streams in their code and applications are responsible for their application's market integrity and code risks that may cause unanticipated pricing data behavior. These risks are described in more detail below.
For guidance on mitigations, see the [Data Streams Best Practices](/data-streams/concepts/best-practices) — developers remain solely responsible for their integrations.
### Market Integrity Risks
Market conditions can impact the pricing behavior of assets in ways beyond the ability of Chainlink node operators to predict or control.
Market integrity risk factors can include, but are not limited to, [market manipulation](https://chain.link/education-hub/market-manipulation-vs-oracle-exploits) such as Spoofing, Ramping, Bear Raids, Cross-Market Manipulation, Wash Trading, and Frontrunning. Developers are solely responsible for accounting for such risk factors when integrating Chainlink Data Streams into their applications. Developers should understand the market risks around the assets they intend their application to support before integrating associated Chainlink Data Streams and inform their end users about applicable market risks.
Developers should be aware that some assets, particularly those with low liquidity, may experience significant price volatility due to factors such as wider price spreads among exchanges. In such conditions, large trades can significantly move an asset's price, causing unexpected price oscillations.
#### Traditional Market Assets
Among other assets, Data Streams supports foreign exchange (FX) spot markets for major currencies against USD, gold and silver spot against USD, and WTI oil spot contracts. In traditional finance, these markets are some of the most liquid instruments and are traded across most financial centers including London, New York, Tokyo, and Singapore. Unlike Crypto markets, most traditional markets do not trade 24/7 and therefore liquidity and spreads can vary during the day and trading week. For assets traded within traditional markets, Chainlink Data Streams provides developers with an indication of the market's status (either open or closed), such as via a market status flag in price reports, which can be used by applications as applicable.
The market status provided on Streams serves as an indication of the open and close hours for traditional market assets based on historical practice; it is provided for referential purposes only. Developers are responsible for independently assessing the risks associated with trading at these times, particularly at opening and closing price levels. Developers are solely responsible for determining the actual status of markets for any streams they utilize. Protocol developers are advised to proceed with caution and make trading decisions at their own risk.
Under the shared responsibility model, developers must thoroughly understand the methodologies, trading behaviors, and market liquidity patterns related to traditional asset classes and instruments, particularly during and around market opening and closing hours. Developers should be aware of unique risks such as price volatility, sudden liquidity shifts, and trading halts that can affect data reliability and availability. A comprehensive understanding of these factors will help developers better anticipate and address potential issues, resulting in more resilient and trustworthy integrations for end users.
**Data Streams developers are solely responsible for defining and implementing their own risk procedures and systems, including being aware of market open and closing times, and bank holidays, when integrating associated Chainlink Data Streams.**
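One concrete safeguard is gating all price usage on the market status flag mentioned above. The sketch below is an assumption-laden illustration over a plain dictionary: the field name `marketStatus` and the 0/1/2 encoding are assumptions here, so confirm the actual field name and values against the report schema for the streams you consume.

```python
# Hypothetical status codes; verify the real field name and encoding
# in the report schema reference for your stream before relying on them.
MARKET_STATUS_UNKNOWN, MARKET_STATUS_CLOSED, MARKET_STATUS_OPEN = 0, 1, 2

def can_trade(report: dict) -> bool:
    """Only act on a price when the stream flags the market as open;
    treat 'unknown' as conservatively as 'closed'."""
    return report.get("marketStatus") == MARKET_STATUS_OPEN

assert can_trade({"marketStatus": MARKET_STATUS_OPEN, "price": 1_950_00})
assert not can_trade({"marketStatus": MARKET_STATUS_CLOSED, "price": 1_950_00})
assert not can_trade({"marketStatus": MARKET_STATUS_UNKNOWN, "price": 1_950_00})
```

Remember that the flag is referential only; combine it with your own market-hours checks and exchange calendars.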
### Market hours
Some Data Streams reflect assets that trade only during specific market windows or are published with a delay. Developers must not assume 24/7 trading availability for these streams and should implement validations and fallback logic where appropriate:
- Offchain equity and ETF assets are traded only during standard [market hours](/data-streams/market-hours). Do not use these streams outside those windows.
- Forex (foreign exchange) assets trade only during defined market hours for the pair and, for some currencies, primarily during local banking hours. Do not use Forex streams outside the market hours relevant to the currency pair.
- UK ETF price streams may be published with a 15-minute delay from their original source. These assets are traded only during standard market hours — do not use UK ETF streams outside their specified trading windows or assume real-time prices.
- Other assets may have specific market hours defined by their respective exchanges or liquidity providers.
For more information, see the [Market Hours Best Practices](/data-streams/concepts/best-practices#market-hours); developers remain solely responsible for how they use this guidance and for any risks or liabilities resulting from their integrations.
#### DEX-based Assets
Data Streams also provides pricing data related to assets that trade primarily on decentralized exchanges (DEXs). Under the shared responsibility model, it is essential that developers understand the methodology and risks associated with such DEX-based assets. These risks include, but are not limited to:
- Data may be sourced by reading the state of onchain contracts and estimating the price at which trades in a certain asset pool could be executed. The accuracy of prices may be hindered by such factors as:
- Slippage: The movement of prices between (i) the time of observation, and (ii) the time of execution, caused by other transactions changing the state of the respective smart contract.
- Price Impact: The movement of price caused by the volume of trades being settled on a respective pool.
- Certain assets may not trade actively—if data is based on traded prices, it may not reflect the current state of the respective pool, and/or the current realizable price.
- There is a certain level of latency between (i) the observability of price data on DEXs, and (ii) the price data being reflected in our streams. This can increase the risk of frontrunning.
#### New Token Data Streams
When a token is newly launched, the historical data required to implement a rigorous data quality and risk analysis is unavailable. Consistent price discovery may involve an indeterminate amount of time. Developers are responsible for understanding the market and volatility risks inherent with such assets when choosing to integrate them into any products or services.
Newly issued tokens may have limited trading activity at launch. Thin order books can lead to substantial price fluctuation, resulting in significant volatility. Developers are responsible for being aware, when choosing to integrate such tokens into any products or services, of the inherent volatility, potential for illiquidity, and significant price fluctuation that often characterize newly issued tokens as the market attempts to find a clearing level, i.e., an equilibrium price.
Token prices may exhibit oscillations between two or more price points within regular intervals due to price disagreements across exchanges. In early trading sessions, wide bid and ask spreads may reflect uncertainty about the token's market value. Developers are responsible for understanding the possibility of token price oscillation of this nature when choosing to integrate newly issued tokens into any products or services.
Developers implementing Chainlink New Token Data Streams are responsible for independently verifying the liquidity, quality, and stability of asset pricing when integrating these streams into their use cases.
#### Custom Data Streams
Custom Data Streams are built to serve a specific use case and might not be suitable for general use or your use case's risk parameters. Users must evaluate the properties of a feed to make sure it aligns with their intended use case. [Contact the Chainlink Labs team](https://chain.link/contact?ref_id=DataStreams) if you want more detail on any specific feeds in this category.
### Application Code Risks
Developers implementing Chainlink Data Streams are solely responsible for instituting risk mitigations, including, but not limited to, data quality checks, circuit breakers, and appropriate contingency logic for their use case. Some general guidelines include:
- **Code quality and reliability:** Developers must execute code using Chainlink Data Streams only if their code meets the quality and reliability requirements for their use case and application.
- **Code and application audits:** Developers are responsible for auditing their code and applications before deploying to production. Developers must determine the quality of any audits and ensure that they meet the requirements for their application.
- **Code dependencies and imports:** Developers are responsible for ensuring the quality, reliability, and security of any dependencies or imported packages that they use with Chainlink Data Streams, and review and audit these dependencies and packages.
- **Implementing Contingency Logic:** In extreme circumstances, including situations beyond the control of Chainlink node operators, Chainlink Data Streams may experience periods of unavailability or performance degradation. When a WebSocket connection is dropped, user systems must manage reconnections effectively. Developers are responsible for creating contingency plans tailored to their specific application needs, such as:
- Implementing the [Data Streams Architecture](/data-streams/architecture),
- Adopting an [active-active](/data-streams/architecture#active-active-multi-site-deployment) deployment strategy and [configuring the SDK](/data-streams/reference/data-streams-api/go-sdk#config-struct) to support multiple concurrent WebSocket connections,
- Retrieving any potentially missing reports via the [REST API](/data-streams/reference/data-streams-api/interface-api).
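The reconnection requirement above is commonly handled with jittered exponential backoff. The sketch below is a generic pattern, not the Data Streams SDK (the SDKs linked above handle this for you when configured); the function name, attempt limits, and delays are illustrative assumptions.

```python
import random
import time

def reconnect_with_backoff(connect, max_attempts=5, base_delay=0.5, cap=30.0):
    """Generic reconnect loop with jittered exponential backoff, a common
    pattern for recovering a dropped WebSocket connection. `connect` is any
    callable that raises ConnectionError on failure. After reconnecting,
    fetch any reports missed during the gap via the REST API."""
    for attempt in range(max_attempts):
        try:
            return connect()
        except ConnectionError:
            # Exponential delay, capped, with jitter to avoid thundering herds.
            delay = min(cap, base_delay * 2 ** attempt)
            time.sleep(delay * random.uniform(0.5, 1.0))
    raise RuntimeError("exhausted reconnection attempts")
```

Jitter matters when many clients lose connectivity at once: without it, all clients retry on the same schedule and can overload the endpoint they are trying to rejoin.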
### Additional Considerations on Data Usage and Verification
- **Combined Data Sources:** Mixing data from Data Streams with other pricing sources introduces additional complexity and risk. It may result in unforeseen results, potentially impacting application performance or reliability. Developers are solely responsible for any such outcomes, and it is imperative that thorough risk assessments and comprehensive testing protocols are implemented.
- **Bypassing Data Verification:** Verifying against the Chainlink Verifier Contract is essential for ensuring data integrity. Bypassing or improperly implementing it increases the risk of data manipulation and should be avoided.
Developers are responsible for understanding and managing all additional risk factors. Implementing proper risk assessment and mitigation strategies is crucial to maintaining the integrity and reliability of applications that rely on external data sources.
---
# Data Streams Exchange Rate streams
Source: https://docs.chain.link/data-streams/exchange-rate-streams
---
# Chainlink Data Streams
Source: https://docs.chain.link/data-streams
Chainlink Data Streams delivers low-latency market data offchain, which you can verify onchain. This approach provides decentralized applications (dApps) with on-demand access to high-frequency market data backed by decentralized, fault-tolerant, and transparent infrastructure.
Traditional push-based oracles update onchain data at set intervals or when certain price thresholds are met. In contrast, Chainlink Data Streams uses a pull-based design that preserves trust-minimization with onchain verification.
Data Streams are offered in [several report formats](/data-streams/reference/report-schema-overview), each designed for distinct asset classes.
## Sub-Second Data and Commit-and-Reveal
Chainlink Data Streams supports sub-second data resolution for latency-sensitive use cases by retrieving data only when needed. You can combine the data with any transaction in near real time. A "commit-and-reveal" approach mitigates frontrunning by making trade data and stream data visible atomically onchain.
## Comparison to push-based oracles
Chainlink's push-based oracles regularly publish price data onchain. By contrast, Chainlink Data Streams relies on a pull-based design, letting you retrieve a report and verify it onchain whenever you need it. Verification confirms that the decentralized oracle network (DON) agreed on and signed the data. Some applications only need onchain data at fixed intervals, which suits push-based oracles. However, others require higher-frequency updates and lower latency. Pull-based oracles meet these needs and still provide cryptographic guarantees about data accuracy.
Pull-based oracles also operate more efficiently by retrieving data only when necessary. For example, a decentralized exchange might fetch a Data Streams report and verify it onchain only when a user executes a trade, rather than continuously pushing updates that might not be immediately used.
## Comprehensive market insights
Chainlink Data Streams offers price points such as mid prices and [Liquidity-Weighted Bid and Ask](/data-streams/concepts/liquidity-weighted-prices) (LWBA) for Crypto Streams. LWBA prices reflect current order book conditions, providing deeper insight into market liquidity and depth. With additional parameters, such as volatility and liquidity metrics, Data Streams helps protocols enhance trading accuracy, improve onchain risk management, and dynamically adjust margins or settlement conditions in response to real-time market shifts.
## High availability and resilient infrastructure
Data Streams API services use an [active-active multi-site deployment](/data-streams/architecture#active-active-multi-site-deployment) model across multiple distributed and isolated origins. This architecture ensures continuous operations even if one origin fails, delivering robust fault tolerance and high availability.
For real-time streaming applications, the SDKs support **High Availability (HA) mode** that establishes multiple simultaneous connections for zero-downtime operation. When enabled, HA mode provides:
- **Automatic failover** between connections
- **Report deduplication** across connections
- **Automatic origin discovery** to find available endpoints
- **Per-connection monitoring** and statistics
**Learn more:** [Go SDK](/data-streams/reference/data-streams-api/go-sdk#high-availability-ha-mode) | [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk#high-availability-mode) | [TypeScript SDK](/data-streams/reference/data-streams-api/ts-sdk#high-availability-mode)
## Example use cases
Access to low-latency, high-frequency data enables a variety of onchain applications:
- **Perpetual Futures:** Sub-second data and frontrunning mitigation allow onchain perpetual futures protocols to compete with centralized exchanges on performance while retaining transparency and decentralization.
- **Options:** Pull-based oracles provide timely settlement of options contracts with the added benefit of market liquidity data to support dynamic onchain risk management.
- **Prediction Markets:** High-frequency updates let participants act on real-time data, ensuring quick reactions to events and accurate settlement.
## Key capabilities
- **Sub-second Latency:** Pull data on-demand with minimal delay
- **Cryptographic Verification:** Verify data authenticity onchain when needed
- **Multiple Access Methods:** REST API, WebSocket, or SDK integration
- **Comprehensive Market Data:** Mid prices, LWBA prices, volatility, and liquidity metrics
- **High Availability:** Multi-site deployment ensures 99.9%+ uptime
## How to use Data Streams
You can access Chainlink Data Streams through SDKs and APIs, allowing you to build custom solutions with low-latency, high-frequency data. Fetch reports or subscribe to report updates from the Data Streams Aggregation Network and verify their authenticity onchain.
### Integration options
Access data directly through REST APIs or WebSocket connections using our SDKs:
- **[Go SDK](/data-streams/reference/data-streams-api/go-sdk)** - Full-featured SDK with comprehensive examples
- **[Rust SDK](/data-streams/reference/data-streams-api/rust-sdk)** - High-performance SDK for Rust applications
- **[TypeScript SDK](/data-streams/reference/data-streams-api/ts-sdk)** - Type-safe SDK for TypeScript applications
- **[REST API](/data-streams/reference/data-streams-api/interface-api)** or **[WebSocket](/data-streams/reference/data-streams-api/interface-ws)** - Direct access to Data Streams endpoints
### Getting started
1. Understand the Architecture: Review the [system components and data flow](/data-streams/architecture) to understand how Data Streams works.
2. Explore Available Data: Browse [available reports and associated schemas](/data-streams/reference/report-schema-overview) to see what data is available.
3. Try the API: Follow our [hands-on tutorial](/data-streams/tutorials/go-sdk-fetch) to fetch and decode your first report.
4. Implement Verification: Add [onchain verification](/data-streams/reference/data-streams-api/onchain-verification) to ensure data authenticity in your smart contracts.
### Streams Trade: An alternative implementation
For applications that require automated data retrieval and execution, Streams Trade combines Chainlink Data Streams with [Chainlink Automation](/chainlink-automation) to deliver automated trade execution with frontrunning mitigation. This approach suits dApps that require automated, trust-minimized trade execution and high-frequency market data.
[Learn more about Streams Trade](/data-streams/streams-trade).
---
# Data Streams Market Hours
Source: https://docs.chain.link/data-streams/market-hours
Markets for several assets are actively traded only during certain hours.
***
## Cryptocurrency
Cryptocurrency markets operate continuously, with no designated market close.
| Asset class | Hours |
| ----------- | --------------------------- |
| **Crypto** | 24/7/365 — No market close. |
***
## Real-World Asset (RWA) market hours
RWA markets operate during specific hours, with breaks for holidays and sometimes daily pauses.
| Asset class | Weekly Open | Weekly Close | Daily Breaks \* | Bank Holidays \*\* |
| ----------------------------------------------------------- | ------------- | ------------- | ------------------- | -------------------------------------------------------------------------------------- |
| **US Equities** (top-50 by market cap + selected ETFs) | **09:30 Mon** | **16:00 Fri** | — | [NYSE holiday calendar](https://www.nyse.com/markets/hours-calendars) |
| **FX Majors** (G10 + KRW, SGD, HKD, CNH …) | **17:00 Sun** | **17:00 Fri** | — | Jan 1, Dec 25 |
| **Precious Metals (Spot)** (XAU, XAG) | **18:00 Sun** | **17:00 Fri** | 17:00–18:00 Mon-Thu | Jan 1, Good Fri, Dec 25 |
| **Commodities** (WTI Synthetic Spot) | **18:00 Sun** | **17:00 Fri** | 17:00–18:00 Mon-Thu | [NYMEX holiday calendar](https://www.cmegroup.com/tools-information/holiday-calendar/) |
\* Times shown as **HH:MM ET**.\
\*\* Half-day trading may apply on the eve of certain U.S. holidays (e.g., Jul 3, Nov 28). Consult the linked exchange calendars for exact cut-off times.
***
## User Recommendations
For comprehensive guidance on managing risks related to market hours, market gaps, volatility, and corporate actions, see the [Best Practices](/data-streams/concepts/best-practices#market-hours) page.
---
# Data Streams Net Asset Value streams
Source: https://docs.chain.link/data-streams/nav-streams
---
# Data Streams Candlestick API
Source: https://docs.chain.link/data-streams/reference/candlestick-api
The Candlestick API provides open-high-low-close (OHLC) aggregated trading data.
OHLC data is provided in two formats: [the standard OHLCV format](#get-candlestick-data-row-format), which is widely used by retail crypto exchanges and is ideal for human readability and compatibility, and [the columnar format](#get-candlestick-data-column-format), which is preferred by HFT systems and optimized for efficient, large-scale data processing.
The API provides both historical candlestick data through the [history endpoints](#get-candlestick-data-column-format) (updated every minute) and live price updates through the [streaming endpoint](#get-streaming-price-updates) (updated every second).
## Domains
| Description | Testnet URL | Mainnet URL |
| :----------------------- | :---------------------------------------------- | :-------------------------------------- |
| Candlestick API endpoint | https\://priceapi.testnet-dataengine.chain.link | https\://priceapi.dataengine.chain.link |
## API Endpoints
### Authorize and get token
##### Endpoint
**`/api/v1/authorize`**
| Type      | Description                                       | Parameter(s)                                               |
| :-------- | :------------------------------------------------ | :--------------------------------------------------------- |
| HTTP POST | Authorizes a user and returns a JWT access token. | `login`: The user ID.<br />`password`: The user's API key. |
##### Sample request
```bash
curl -X POST \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "login={YOUR_USER_ID}&password={YOUR_API_KEY}" \
https://priceapi.testnet-dataengine.chain.link/api/v1/authorize
```
##### Response
- **Status**: `200`
```json
{
"d": {
"access_token": "[ACCESS_TOKEN]",
"expiration": 1747203979
},
"s": "ok"
}
```
| Field | Type | Description |
| :------------- | :------- | :---------------------------------------- |
| `s` | `string` | The status of the request. |
| `d` | `object` | The data returned by the API call. |
| `access_token` | `string` | The JWT token for subsequent requests. |
| `expiration` | `number` | The expiry timestamp (unix) of the token. |
##### Error Responses
| Status Code | Error Message | Description |
| :---------- | :------------------------------------------------------------------------------ | :---------------------------------------- |
| `400` | `Parse error - Login: missing required field, Password: missing required field` | A required field was missing. |
| `401` | `Unauthorized - Invalid credentials` | The user password (API key) is incorrect. |
| `404` | `Not found - user not found for id {USER_ID}` | The user login was not found. |
| `500` | `Error - Failed to generate token` | The server failed to create a token. |
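The same authorization flow can be sketched in Go. The struct fields mirror the response schema documented above; `newAuthRequest` and `parseAuthResponse` are illustrative helper names, not part of any Chainlink SDK, and the example decodes the sample body rather than calling the live endpoint.

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
	"strings"
)

// authResponse mirrors the documented /api/v1/authorize response body.
type authResponse struct {
	D struct {
		AccessToken string `json:"access_token"`
		Expiration  int64  `json:"expiration"`
	} `json:"d"`
	S string `json:"s"`
}

// newAuthRequest builds the same form-encoded POST as the curl sample above.
func newAuthRequest(baseURL, userID, apiKey string) (*http.Request, error) {
	form := url.Values{"login": {userID}, "password": {apiKey}}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/api/v1/authorize",
		strings.NewReader(form.Encode()))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
	return req, nil
}

// parseAuthResponse decodes a response body into authResponse.
func parseAuthResponse(body []byte) (authResponse, error) {
	var r authResponse
	err := json.Unmarshal(body, &r)
	return r, err
}

func main() {
	// Decode the sample response body from the docs.
	r, err := parseAuthResponse([]byte(`{"d":{"access_token":"[ACCESS_TOKEN]","expiration":1747203979},"s":"ok"}`))
	if err != nil {
		panic(err)
	}
	fmt.Println(r.S, r.D.Expiration) // ok 1747203979
}
```

In a real client you would execute the request with an `http.Client`, check for a `200` status, and cache the token until its `expiration` timestamp.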
### Get list of supported symbols
##### Endpoint
**`/api/v1/symbol_info`**
| Type     | Description                                              | Parameter(s)                                                                   |
| :------- | :------------------------------------------------------- | :----------------------------------------------------------------------------- |
| HTTP GET | Gets a list of all supported symbols on the environment. | `group` (optional): Filter symbols by group. Currently only supports "crypto". |
##### Sample request
```bash
curl -X GET \
-H "Authorization: Bearer {YOUR_ACCESS_TOKEN}" \
https://priceapi.testnet-dataengine.chain.link/api/v1/symbol_info
```
##### Response
- **Status**: `200`
```json
{
"s": "ok",
"symbol": ["ETHUSD", "BTCUSD"],
"currency": ["USD", "USD"],
"base-currency": ["ETH", "BTC"]
}
```
| Field | Type | Description |
| :-------------- | :------- | :------------------------------- |
| `s` | `string` | The status of the request. |
| `symbol` | `array` | Array of supported symbols. |
| `currency` | `array` | Array of symbol currencies. |
| `base-currency` | `array` | Array of symbol base currencies. |
##### Error Responses
| Status Code | Error Message | Description |
| :---------- | :-------------------------------------------------------------------------------------------------------------------------------------------------------- | :----------------------------------------------- |
| `401` | `Unauthorized - Authorization header is required \|\| Invalid authorization header format \|\| token signature is invalid: signature is invalid \|\| ...` | The authorization header was missing or invalid. |
| `500` | `Error - Something went wrong` | An unexpected server error occurred. |
### Get candlestick data (column format)
##### Endpoint
**`/api/v1/history`**
| Type     | Description                             | Parameter(s)                                                                                                                                                                                                                                                         |
| :------- | :-------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| HTTP GET | Gets candlestick data in column format. | `symbol`: The symbol to query.<br />`resolution`: Resolution of the data (required but not used; use "1m").<br />`from`: Unix timestamp of the leftmost required bar (inclusive).<br />`to`: Unix timestamp of the rightmost required bar (inclusive). |
**Note**: The resolution of the data is currently based on the size of the time window:
| Max time window size | Resolution of candles |
| :------------------- | :-------------------- |
| \<= 24 hours | 1 minute |
| \<= 5 days | 5 minutes |
| \<= 30 days | 30 minutes |
| \<= 90 days | 1 hour |
| \<= 6 months | 2 hours |
| > 6 months | 1 day |
##### Sample request
```bash
curl -X GET \
-H "Authorization: Bearer {YOUR_ACCESS_TOKEN}" \
"https://priceapi.testnet-dataengine.chain.link/api/v1/history?symbol=ETHUSD&resolution=1m&from=1746072068&to=1746158468"
```
##### Response
- **Status**: `200`
```json
{
"s": "ok",
"t": [1746158460, 1746158400, 1746158340],
"c": [1.84685e21, 1.848515087189567e21, 1.8490380305e21, 1.8481266e21],
"o": [1.8483674e21, 1.848602513e21, 1.8481267e21],
"h": [1.8498753129131415e21, 1.848875387e21, 1.8490380305e21],
"l": [1.8468008021426886e21, 1.848243519e21, 1.8475677870725296e21],
"v": []
}
```
| Field | Type | Description |
| :---- | :------- | :--------------------------------------------------------------------- |
| `s` | `string` | The status of the request. |
| `t` | `array` | Array of unix timestamps (the time of each candle). |
| `c` | `array` | Array of numbers (the close (last) value of each candle). |
| `o` | `array` | Array of numbers (the open (first) value of each candle). |
| `h` | `array` | Array of numbers (the high (max) value of each candle). |
| `l` | `array` | Array of numbers (the low (min) value of each candle). |
| `v` | `array` | Array of numbers (the volume of each candle. Not currently supported). |
> **Note**: If candles cannot be found for the given symbol/time period, a response with empty arrays will be provided. E.g., `{ "s": "ok", "t": [], "c": [], "o": [], "h": [], "l": [], "v": [] }`
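Because the column format returns parallel arrays, most consumers will want to zip them back into per-candle rows. The sketch below does this in Go; the struct names are illustrative, and the sample values in `main` are simplified placeholders rather than real prices.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// historyResponse mirrors the column-format /api/v1/history body.
type historyResponse struct {
	S string    `json:"s"`
	T []int64   `json:"t"`
	O []float64 `json:"o"`
	H []float64 `json:"h"`
	L []float64 `json:"l"`
	C []float64 `json:"c"`
}

type candle struct {
	Time                   int64
	Open, High, Low, Close float64
}

// toCandles zips the parallel columns into per-candle rows, indexing every
// column by the position of its timestamp in `t`.
func toCandles(r historyResponse) []candle {
	out := make([]candle, len(r.T))
	for i := range r.T {
		out[i] = candle{r.T[i], r.O[i], r.H[i], r.L[i], r.C[i]}
	}
	return out
}

func main() {
	body := `{"s":"ok","t":[1746158460,1746158400],"o":[1,2],"h":[3,4],"l":[0.5,1.5],"c":[2,3],"v":[]}`
	var r historyResponse
	if err := json.Unmarshal([]byte(body), &r); err != nil {
		panic(err)
	}
	fmt.Println(toCandles(r)[0].Close) // 2
}
```

A defensive implementation would also verify that all five arrays have the same length before zipping.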
##### Error Responses
| Status Code | Error Message | Description |
| :---------- | :-------------------------------------------------------------------------------------------------------------------------------------------------------- | :----------------------------------------------- |
| `400` | `Parse error xxx: missing required field` | A required parameter was missing. |
| `401` | `Unauthorized - Authorization header is required \|\| Invalid authorization header format \|\| token signature is invalid: signature is invalid \|\| ...` | The authorization header was missing or invalid. |
| `404` | `Not found - Could not find feedID for symbol` | The provided symbol is not supported. |
| `500` | `Something went wrong. Please try again later.` | An unexpected server error occurred. |
### Get candlestick data (row format)
##### Endpoint
**`/api/v1/history/rows`**
| Type     | Description                          | Parameter(s)                                                                                                                                                                                                                                                         |
| :------- | :----------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| HTTP GET | Gets candlestick data in row format. | `symbol`: The symbol to query.<br />`resolution`: Resolution of the data (required but not used; use "1m").<br />`from`: Unix timestamp of the leftmost required bar (inclusive).<br />`to`: Unix timestamp of the rightmost required bar (inclusive). |
##### Sample request
```bash
curl -X GET \
-H "Authorization: Bearer {YOUR_ACCESS_TOKEN}" \
"https://priceapi.testnet-dataengine.chain.link/api/v1/history/rows?symbol=ETHUSD&resolution=1m&from=1746072068&to=1746158468"
```
##### Response
- **Status**: `200`
```json
{
"s": "ok",
"candles": [
[1746158460, 1.8483674e21, 1.8498753129131415e21, 1.8468008021426886e21, 1.84685e21, 0],
[1746158400, 1.848602513e21, 1.848875387e21, 1.848243519e21, 1.848515087189567e21, 0],
[1746158340, 1.8481267e21, 1.8490380305e21, 1.8475677870725296e21, 1.8490380305e21, 0]
]
}
```
| Field | Type | Description |
| :-------- | :------- | :-------------------------------------------------------------------------------------------------- |
| `s` | `string` | The status of the request. |
| `candles` | `array` | Array of arrays of numbers. Each array candle contains: `[ time, open, high, low, close, volume ]`. |
> **Note**: If candles cannot be found for the given symbol/time period, a response with an empty `candles` array is provided. E.g., `{ "s": "ok", "candles": [] }`
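Decoding the row format in Go is a matter of unmarshaling each inner array (`[time, open, high, low, close, volume]`) into a named struct. `parseRows` and `bar` are illustrative names, and the sample values below are simplified placeholders rather than real prices.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// rowsResponse mirrors the row-format /api/v1/history/rows body; each inner
// array is [ time, open, high, low, close, volume ].
type rowsResponse struct {
	S       string      `json:"s"`
	Candles [][]float64 `json:"candles"`
}

type bar struct {
	Time                           int64
	Open, High, Low, Close, Volume float64
}

// parseRows decodes the body and converts each 6-element row into a bar.
func parseRows(body []byte) ([]bar, error) {
	var r rowsResponse
	if err := json.Unmarshal(body, &r); err != nil {
		return nil, err
	}
	out := make([]bar, 0, len(r.Candles))
	for _, c := range r.Candles {
		if len(c) != 6 {
			return nil, fmt.Errorf("expected 6 fields per candle, got %d", len(c))
		}
		out = append(out, bar{int64(c[0]), c[1], c[2], c[3], c[4], c[5]})
	}
	return out, nil
}

func main() {
	bars, err := parseRows([]byte(`{"s":"ok","candles":[[1746158460,1,3,0.5,2,0]]}`))
	if err != nil {
		panic(err)
	}
	fmt.Println(bars[0].Time, bars[0].Close) // 1746158460 2
}
```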
##### Error Responses
| Status Code | Error Message | Description |
| :---------- | :-------------------------------------------------------------------------------------------------------------------------------------------------------- | :----------------------------------------------- |
| `400` | `Parse error xxx: missing required field` | A required parameter was missing. |
| `401` | `Unauthorized - Authorization header is required \|\| Invalid authorization header format \|\| token signature is invalid: signature is invalid \|\| ...` | The authorization header was missing or invalid. |
| `404` | `Not found - Could not find feedID for symbol` | The provided symbol is not supported. |
| `500` | `Something went wrong. Please try again later.` | An unexpected server error occurred. |
### Get streaming price updates
##### Endpoint
**`/api/v1/streaming`**
| Type     | Description                                               | Parameter(s)                                                                   |
| :------- | :-------------------------------------------------------- | :----------------------------------------------------------------------------- |
| HTTP GET | Gets streaming price updates using HTTP chunked encoding. | `symbol` or `feedId`: A comma-separated list of symbols or feed IDs to stream. |
##### Sample request
```bash
curl -X GET \
-H "Authorization: Bearer {YOUR_ACCESS_TOKEN}" \
-H "Connection: keep-alive" \
"https://priceapi.testnet-dataengine.chain.link/api/v1/streaming?symbol=ETHUSD,BTCUSD"
```
##### Response
- **Status**: `200` (Streaming)
A stream of JSON objects.
**Trade Response:**
```json
{
"f": "t",
"i": "ETHUSD",
"fid": "[FEED_ID]",
"p": 2.68312e21,
"t": 1748525526,
"s": 1
}
```
| Field | Type | Description |
| :---- | :-------- | :------------------------------------------------------------- |
| `f`   | `string`  | Message type. Will always be "t" to indicate a trade response. |
| `i` | `string` | Identifier. The symbol for this response. |
| `fid` | `string` | The hex-encoded feed ID for this response. |
| `p` | `float64` | The latest price update for the symbol/feedId. |
| `t` | `float64` | The time of the price as a unix timestamp. |
| `s` | `float64` | Size of the trade. This will always be 1. |
**Heartbeat (sent every 5 seconds):**
```json
{ "heartbeat": 1748525528 }
```
| Field | Type | Description |
| :---------- | :-------- | :----------------------------------------------------- |
| `heartbeat` | `float64` | The time of the heartbeat message as a unix timestamp. |
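A stream consumer has to distinguish trade messages from heartbeats. The Go sketch below classifies a single JSON object from the stream using the two message shapes documented above; `classify` is a hypothetical helper, and it relies on the assumption that a heartbeat timestamp is never zero.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// tradeMsg mirrors the documented trade response.
type tradeMsg struct {
	F    string  `json:"f"`
	I    string  `json:"i"`
	Fid  string  `json:"fid"`
	P    float64 `json:"p"`
	T    int64   `json:"t"`
	Size float64 `json:"s"`
}

// heartbeatMsg mirrors the documented heartbeat message.
type heartbeatMsg struct {
	Heartbeat int64 `json:"heartbeat"`
}

// classify inspects one JSON object from the stream and reports whether it is
// a heartbeat or a trade ("f" == "t"), returning a short description.
// Assumes a heartbeat timestamp is never zero.
func classify(line []byte) (string, error) {
	var hb heartbeatMsg
	if err := json.Unmarshal(line, &hb); err == nil && hb.Heartbeat != 0 {
		return fmt.Sprintf("heartbeat at %d", hb.Heartbeat), nil
	}
	var t tradeMsg
	if err := json.Unmarshal(line, &t); err != nil {
		return "", err
	}
	if t.F != "t" {
		return "", fmt.Errorf("unknown message type %q", t.F)
	}
	return fmt.Sprintf("trade %s @ %g", t.I, t.P), nil
}

func main() {
	out, _ := classify([]byte(`{"heartbeat":1748525528}`))
	fmt.Println(out) // heartbeat at 1748525528
	out, _ = classify([]byte(`{"f":"t","i":"ETHUSD","fid":"0xabc","p":2683.12,"t":1748525526,"s":1}`))
	fmt.Println(out) // trade ETHUSD @ 2683.12
}
```

A real consumer would read the chunked HTTP body line by line (e.g., with `bufio.Scanner`) and feed each line to a classifier like this.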
##### Error Responses
| Status Code | Error Message | Description |
| :---------- | :-------------------------------------------------------------------------------------------------------------------------------------------------------- | :----------------------------------------------- |
| `400` | `Parse error xxx: One of symbol or feedId must be provided` | A required parameter was missing. |
| `401` | `Unauthorized - Authorization header is required \|\| Invalid authorization header format \|\| token signature is invalid: signature is invalid \|\| ...` | The authorization header was missing or invalid. |
| `404` | `Not found - Could not find feedID for symbol` | The provided symbol is not supported. |
| `500` | `Something went wrong. Please try again later.` | An unexpected server error occurred. |
---
# API Authentication - Go examples
Source: https://docs.chain.link/data-streams/reference/data-streams-api/authentication/go-examples
Below are complete examples for authenticating with the Data Streams API in Go. Each example shows how to properly generate the required headers and make a request.
To learn more about the Data Streams API authentication, see the [Data Streams Authentication](/data-streams/reference/data-streams-api/authentication) page.
**Note**: The Data Streams SDKs handle authentication automatically. If you're using the [Go SDK](/data-streams/reference/data-streams-api/go-sdk), [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk), or [TypeScript SDK](/data-streams/reference/data-streams-api/ts-sdk), you don't need to implement the authentication logic manually.
## API Authentication Example
### Requirements
- [Go](https://go.dev/doc/install) (v1.18 or later recommended)
- API credentials from Chainlink Data Streams
### Running the Example
1. Create a file named `auth-example.go` with the example code shown below
2. Set your API credentials as environment variables:
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
3. Run the example with `go run auth-example.go`
**Example code**:
```go
package main
import (
"context"
"crypto/hmac"
"crypto/sha256"
"encoding/hex"
"encoding/json"
"fmt"
"io"
"log"
"net/http"
"net/url"
"os"
"strconv"
"time"
)
// SingleReport represents a data feed report structure
type SingleReport struct {
FeedID string `json:"feedID"`
ValidFromTimestamp uint32 `json:"validFromTimestamp"`
ObservationsTimestamp uint32 `json:"observationsTimestamp"`
FullReport string `json:"fullReport"`
}
// SingleReportResponse is the response structure for a single report
type SingleReportResponse struct {
Report SingleReport `json:"report"`
}
// GenerateHMAC creates the signature for authentication
func GenerateHMAC(method string, path string, body []byte, apiKey string, apiSecret string) (string, int64) {
// Generate timestamp (milliseconds since Unix epoch)
timestamp := time.Now().UTC().UnixMilli()
// Generate body hash
serverBodyHash := sha256.New()
serverBodyHash.Write(body)
bodyHashString := hex.EncodeToString(serverBodyHash.Sum(nil))
// Create string to sign
stringToSign := fmt.Sprintf("%s %s %s %s %d",
method,
path,
bodyHashString,
apiKey,
timestamp)
// Generate HMAC-SHA256 signature
signedMessage := hmac.New(sha256.New, []byte(apiSecret))
signedMessage.Write([]byte(stringToSign))
signature := hex.EncodeToString(signedMessage.Sum(nil))
return signature, timestamp
}
// GenerateAuthHeaders creates HTTP headers with authentication information
func GenerateAuthHeaders(method string, pathWithParams string, apiKey string, apiSecret string) http.Header {
header := http.Header{}
signature, timestamp := GenerateHMAC(method, pathWithParams, []byte(""), apiKey, apiSecret)
header.Add("Authorization", apiKey)
header.Add("X-Authorization-Timestamp", strconv.FormatInt(timestamp, 10))
header.Add("X-Authorization-Signature-SHA256", signature)
return header
}
// FetchSingleReport retrieves a single report for a specific feed
func FetchSingleReport(ctx context.Context, feedID string) (*SingleReport, error) {
// Get API credentials from environment variables
apiKey := os.Getenv("STREAMS_API_KEY")
apiSecret := os.Getenv("STREAMS_API_SECRET")
// Validate credentials
if apiKey == "" || apiSecret == "" {
return nil, fmt.Errorf("API credentials not set. Please set STREAMS_API_KEY and STREAMS_API_SECRET environment variables")
}
// API connection details
host := "api.testnet-dataengine.chain.link"
path := "/api/v1/reports/latest"
// Build query parameters
params := url.Values{
"feedID": {feedID},
}
// Create the request URL
reqURL := &url.URL{
Scheme: "https",
Host: host,
Path: path,
RawQuery: params.Encode(),
}
// Create the HTTP request
req, err := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil)
if err != nil {
return nil, fmt.Errorf("error creating request: %w", err)
}
// Add authentication headers
req.Header = GenerateAuthHeaders(req.Method, req.URL.RequestURI(), apiKey, apiSecret)
// Execute the request
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return nil, fmt.Errorf("error sending request: %w", err)
}
defer resp.Body.Close()
// Read the response body
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, fmt.Errorf("error reading response: %w", err)
}
// Check for non-success status code
if resp.StatusCode != http.StatusOK {
return nil, fmt.Errorf("API error (status code %d): %s", resp.StatusCode, string(body))
}
// Parse the response
var reportResp SingleReportResponse
if err := json.Unmarshal(body, &reportResp); err != nil {
return nil, fmt.Errorf("error parsing response: %w", err)
}
return &reportResp.Report, nil
}
func main() {
// Create a context with cancellation
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
// Example feed ID (ETH/USD)
feedID := "0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"
fmt.Printf("Fetching latest report for feed ID: %s\n", feedID)
// Fetch the report
report, err := FetchSingleReport(ctx, feedID)
if err != nil {
log.Fatalf("Error: %v", err)
}
// Display the report
fmt.Println("Successfully retrieved report:")
fmt.Printf(" Feed ID: %s\n", report.FeedID)
fmt.Printf(" Valid From: %d\n", report.ValidFromTimestamp)
fmt.Printf(" Observations Timestamp: %d\n", report.ObservationsTimestamp)
fmt.Printf(" Full Report: %s\n", report.FullReport)
}
```
**Expected output**:
```bash
Fetching latest report for feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Successfully retrieved report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747921357
Observations Timestamp: 1747921357
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c[...]bffd8feddabf120e14bf
```
### Production Considerations
While this example demonstrates the authentication mechanism, production applications should consider:
- **HTTP client reuse**: Create a single `http.Client` with timeout settings and reuse it
- **Retry logic**: Implement exponential backoff for transient failures
- **Structured logging**: Use a logging library like `zap` or `logrus` instead of `fmt.Printf`
- **Configuration**: Use a configuration library like `viper` for managing settings
- **Metrics**: Add instrumentation for monitoring API call performance
- **Error types**: Define custom error types for better error handling
- **Testing**: Add unit tests for HMAC generation and integration tests
For production use, consider using the [Go SDK](/data-streams/reference/data-streams-api/go-sdk) which handles authentication automatically and provides built-in fault tolerance.
## WebSocket Authentication Example
### Requirements
- [Go](https://go.dev/doc/install) (v1.18 or later recommended)
- API credentials from Chainlink Data Streams
### Running the Example
1. Create a new directory for the example:
```bash
mkdir ws-go-example && cd ws-go-example
```
2. Initialize a Go module:
```bash
go mod init ws-example
```
3. Create a file named `ws-auth-example.go` with the example code shown below
4. Set your API credentials as environment variables:
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
5. Install the required dependencies:
```bash
go mod tidy
```
6. Run the example
```bash
go run ws-auth-example.go
```
7. Press Ctrl+C to stop the WebSocket stream when you're done
**Example code**:
```go
package main
import (
"context"
"crypto/hmac"
"crypto/sha256"
"encoding/hex"
"encoding/json"
"fmt"
"log"
"net/http"
"os"
"os/signal"
"strconv"
"strings"
"time"
"github.com/gorilla/websocket"
)
// Constants for ping interval, pong timeout, and write timeout
const (
pingInterval = 5 * time.Second
pongTimeout = 10 * time.Second
writeTimeout = 5 * time.Second
)
// FeedReport represents the data structure received from the WebSocket
type FeedReport struct {
Report struct {
FeedID string `json:"feedID"`
FullReport string `json:"fullReport"`
} `json:"report"`
}
// GenerateHMAC creates the signature for authentication
func GenerateHMAC(method, path string, apiKey string, apiSecret string) (string, int64) {
// Generate timestamp (milliseconds since Unix epoch)
timestamp := time.Now().UTC().UnixMilli()
// Generate body hash (empty for this connection)
serverBodyHash := sha256.New()
bodyHashString := hex.EncodeToString(serverBodyHash.Sum(nil))
// Create string to sign
stringToSign := fmt.Sprintf("%s %s %s %s %d",
method,
path,
bodyHashString,
apiKey,
timestamp)
// Generate HMAC-SHA256 signature
signedMessage := hmac.New(sha256.New, []byte(apiSecret))
signedMessage.Write([]byte(stringToSign))
signature := hex.EncodeToString(signedMessage.Sum(nil))
return signature, timestamp
}
// pingLoop sends periodic pings to keep the connection alive
func pingLoop(ctx context.Context, conn *websocket.Conn) {
ticker := time.NewTicker(pingInterval)
defer ticker.Stop()
for {
select {
case <-ctx.Done():
return
case <-ticker.C:
log.Println("Sending ping to keep connection alive...")
if err := conn.WriteControl(websocket.PingMessage, []byte{}, time.Now().Add(writeTimeout)); err != nil {
log.Printf("Error sending ping: %v", err)
return
}
}
}
}
func main() {
// Get API credentials from environment variables
apiKey := os.Getenv("STREAMS_API_KEY")
apiSecret := os.Getenv("STREAMS_API_SECRET")
// Validate credentials
if apiKey == "" || apiSecret == "" {
log.Fatal("API credentials not set. Please set STREAMS_API_KEY and STREAMS_API_SECRET environment variables")
}
// WebSocket connection details
host := "ws.testnet-dataengine.chain.link"
path := "/api/v1/ws"
feedIDs := []string{"0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"} // ETH/USD
// Validate feed IDs
if len(feedIDs) == 0 {
log.Fatal("No feed ID(s) provided")
}
queryParams := fmt.Sprintf("feedIDs=%s", strings.Join(feedIDs, ","))
fullPath := fmt.Sprintf("%s?%s", path, queryParams)
// Generate authentication signature and timestamp
signature, timestamp := GenerateHMAC("GET", fullPath, apiKey, apiSecret)
// Create HTTP header for WebSocket connection
header := http.Header{}
header.Add("Authorization", apiKey)
header.Add("X-Authorization-Timestamp", strconv.FormatInt(timestamp, 10))
header.Add("X-Authorization-Signature-SHA256", signature)
// Create WebSocket URL
wsURL := fmt.Sprintf("wss://%s%s?%s", host, path, queryParams)
fmt.Println("Connecting to:", wsURL)
// Create context for handling cancellation
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
// Set up channel to handle interrupt signal
interrupt := make(chan os.Signal, 1)
signal.Notify(interrupt, os.Interrupt)
// Connect to WebSocket server
conn, resp, err := websocket.DefaultDialer.DialContext(ctx, wsURL, header)
if err != nil {
if resp != nil {
log.Fatalf("WebSocket connection error (HTTP %d): %v", resp.StatusCode, err)
} else {
log.Fatalf("WebSocket connection error: %v", err)
}
}
defer conn.Close()
// Add pong handler to reset read deadline when pong is received
conn.SetPongHandler(func(string) error {
log.Println("Received pong from server")
return conn.SetReadDeadline(time.Now().Add(pongTimeout))
})
// Start the ping loop in a separate goroutine
go pingLoop(ctx, conn)
// Set initial read deadline
err = conn.SetReadDeadline(time.Now().Add(pongTimeout))
if err != nil {
log.Fatalf("Error setting read deadline: %v", err)
}
fmt.Println("WebSocket connection established")
// Create channel for done signal
done := make(chan struct{})
// Handle incoming messages in a separate goroutine
go func() {
defer close(done)
for {
_, message, err := conn.ReadMessage()
if err != nil {
log.Printf("WebSocket read error: %v", err)
return
}
// Parse the message
var report FeedReport
if err := json.Unmarshal(message, &report); err != nil {
log.Printf("Error parsing message: %v", err)
fmt.Println("Raw message:", string(message))
continue
}
fmt.Printf("Received report for Feed ID: %s\n", report.Report.FeedID)
}
}()
// Wait for interrupt signal or error
for {
select {
case <-done:
return
case <-interrupt:
fmt.Println("\nInterrupt signal received, closing connection...")
// Close the WebSocket connection gracefully
err := conn.WriteControl(
websocket.CloseMessage,
websocket.FormatCloseMessage(websocket.CloseNormalClosure, ""),
time.Now().Add(time.Second),
)
if err != nil {
log.Printf("Error sending close message: %v", err)
}
// Wait for message handling to complete or timeout
select {
case <-done:
case <-time.After(time.Second):
}
return
}
}
}
```
**Expected output**:
```bash
Connecting to: wss://ws.testnet-dataengine.chain.link/api/v1/ws?feedIDs=0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
WebSocket connection established
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
2025/05/22 08:34:20 Sending ping to keep connection alive...
2025/05/22 08:34:20 Received pong from server
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Received report for Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
```
### Production Considerations
While this example already includes many production-ready features (keepalive, timeouts, graceful shutdown), production applications should additionally consider:
- **Automatic reconnection**: Implement exponential backoff reconnection logic for network disruptions
- **Message buffering**: Queue outgoing messages during reconnection attempts
- **Structured logging**: Use `zap` or `logrus` with log levels instead of `log.Printf`
- **Metrics collection**: Track connection status, message rates, and latency
- **Configuration management**: Make timeouts and intervals configurable via environment or config files
- **Error categorization**: Define custom error types to distinguish between retriable and fatal errors
- **Health checks**: Expose WebSocket connection status for monitoring systems
- **Testing**: Add unit tests for HMAC generation and mock WebSocket server for integration tests
For production use, consider using the [Go SDK](/data-streams/reference/data-streams-api/go-sdk) which handles authentication automatically and provides built-in fault tolerance for streaming connections.
---
# Data Streams Authentication
Source: https://docs.chain.link/data-streams/reference/data-streams-api/authentication
This page explains how to authenticate with the Chainlink Data Streams API, covering both REST API and WebSocket connections. It includes detailed information about the required authentication headers, how to generate an HMAC signature, and examples in multiple programming languages.
## Authentication Requirements
All requests to the Data Streams API (both REST and WebSocket) require three authentication headers:
| Header | Description |
| ---------------------------------- | -------------------------------------------- |
| `Authorization` | Your API key (UUID format) |
| `X-Authorization-Timestamp` | Current timestamp with millisecond precision |
| `X-Authorization-Signature-SHA256` | HMAC signature generated using SHA-256 |
These headers authenticate your request and ensure data integrity.
### Authorization Header
The `Authorization` header contains your API key, which is provided to you when you sign up for Chainlink Data Streams access. It follows this format:
```
Authorization: YOUR_API_KEY
```
Your API key is a UUID string that identifies your account. Keep it secure and do not share it publicly.
### X-Authorization-Timestamp Header
The `X-Authorization-Timestamp` header contains the current timestamp in milliseconds since the Unix epoch (January 1, 1970, 00:00:00 UTC). It follows this format:
```
X-Authorization-Timestamp: 1716211845123
```
The timestamp must be within 5 seconds of the server time to prevent replay attacks. This means your system clock must be synchronized with a reliable time source.
### X-Authorization-Signature-SHA256 Header
The `X-Authorization-Signature-SHA256` header contains the HMAC-SHA256 signature of your request. It's created using your API secret (provided along with your API key) and follows this format:
```
X-Authorization-Signature-SHA256: YOUR_HMAC_SIGNATURE
```
## Generating the HMAC Signature
To generate the HMAC signature correctly:
1. **Create the string to sign:**
- For REST API: Combine the HTTP method, endpoint path with query parameters, body hash, API key, and timestamp
- For WebSocket: Same format as REST API requests, using `GET` as the method
2. **Generate the signature:**
- Use HMAC-SHA256 algorithm with your API secret as the key
- Sign the string created in step 1
- Hex encode the resulting binary hash
### String to Sign Format
The string to sign follows this format (for both REST API and WebSocket connections):
```
METHOD FULL_PATH BODY_HASH API_KEY TIMESTAMP
```
Where:
- `METHOD`: The HTTP method in uppercase (e.g., `GET`, `POST`) (use `GET` for WebSocket connections)
- `FULL_PATH`: The endpoint path including query parameters (e.g., `/api/v1/reports/latest?feedID=0x123...`)
- `BODY_HASH`: SHA-256 hash of the request body, hex encoded (use empty string hash for GET requests or WebSocket connections)
- `API_KEY`: Your API key (same as in the `Authorization` header)
- `TIMESTAMP`: The timestamp in milliseconds since the Unix epoch (same as in the `X-Authorization-Timestamp` header)
Note that the elements are joined with a single space character, not newlines.
#### Body Hash Calculation
Even for GET requests and WebSocket connections, you need to include a body hash in the string to sign:
1. Calculate the SHA-256 hash of the request body (for GET requests or WebSocket connections, use an empty string)
2. Hex encode the resulting hash
3. Include this hex-encoded hash in the string to sign
## Authentication Examples
Below are complete examples for authenticating with the Data Streams API in various languages. Each example shows how to properly generate the required headers and make a request.
- [JavaScript examples](/data-streams/reference/data-streams-api/authentication/javascript-examples)
- [TypeScript examples](/data-streams/reference/data-streams-api/authentication/typescript-examples)
- [Go examples](/data-streams/reference/data-streams-api/authentication/go-examples)
- [Rust examples](/data-streams/reference/data-streams-api/authentication/rust-examples)
## Common Authentication Errors
When working with the Data Streams API, the most common authentication issues include:
- **Signature Mismatch**: The HMAC signature generated by your client doesn't match what the server expects, often due to incorrect string-to-sign format or hash encoding
- **Timestamp Issues**: Your system time isn't synchronized with the server (must be within 5 seconds)
- **Missing or Malformed Headers**: Required authentication headers are missing or have incorrect format
- **Insufficient Permissions**: Your API key doesn't have access to the requested feed ID
For complete information on all possible error responses:
- [REST API Error Response Codes](/data-streams/reference/data-streams-api/interface-api#error-response-codes)
- [WebSocket Error Response Codes](/data-streams/reference/data-streams-api/interface-ws#error-response-codes)
## Troubleshooting Authentication
If you're experiencing authentication issues, follow these steps:
1. **Check your API credentials**:
- Verify you're using the correct API key and secret
- Ensure there are no extra spaces or special characters
2. **Verify timestamp synchronization**:
- Check if your system clock is accurate
- Consider using Network Time Protocol (NTP) to synchronize your clock
3. **Inspect your signature calculation**:
- Verify the string to sign follows the exact format shown above
- Check that you're using HMAC-SHA256 (not just SHA-256)
- Ensure you're hex encoding the binary hash output (not Base64)
4. **Debug output**:
- Print out the exact string being signed (without the API secret)
- Compare timestamps between your system and a reliable time source
- Check if all required headers are correctly formatted
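One quick way to check timestamp synchronization is to compare your local clock against the `Date` header of any HTTPS response. This helper is an illustration for debugging only, not part of the Data Streams API:

```javascript
// Compute clock drift from an HTTP "Date" response header (RFC 1123 format).
// A positive result means the local clock is ahead of the server.
function clockDriftMs(dateHeader, localNowMs = Date.now()) {
  const serverMs = new Date(dateHeader).getTime()
  return localNowMs - serverMs
}

// Example with a captured Date header and a fixed local timestamp:
const drift = clockDriftMs("Tue, 21 May 2024 12:10:45 GMT", 1716293445123)
console.log(`Clock drift: ${drift} ms`) // 123 ms ahead in this example
if (Math.abs(drift) > 5000) {
  console.warn("Drift exceeds the 5-second window; synchronize your clock via NTP")
}
```

In a real check, capture the `Date` header and `Date.now()` at the same moment; network latency adds some noise, but drift of several seconds will still be obvious.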
---
# API Authentication - JavaScript examples
Source: https://docs.chain.link/data-streams/reference/data-streams-api/authentication/javascript-examples
Below are complete examples for authenticating with the Data Streams API in JavaScript, using Node.js. Each example shows how to properly generate the required headers and make a request.
To learn more about the Data Streams API authentication, see the [Data Streams Authentication](/data-streams/reference/data-streams-api/authentication) page.
**Note**: The Data Streams SDKs handle authentication automatically. If you're using the [Go SDK](/data-streams/reference/data-streams-api/go-sdk), [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk), or [TypeScript SDK](/data-streams/reference/data-streams-api/ts-sdk), you don't need to implement the authentication logic manually.
## API Authentication Example
### Prerequisites
- [Node.js](https://nodejs.org/en/download/) (v20 or later)
- API credentials from Chainlink Data Streams
### Running the Example
1. Create a file named `auth-example.js` with the example code shown below
2. Set your API credentials as environment variables:
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
3. Run the example:
```bash
node auth-example.js
```
**Example code**:
```javascript
const crypto = require("crypto")
const https = require("https")
/**
* SingleReport represents a data feed report structure
* @typedef {Object} SingleReport
* @property {string} feedID - Feed identifier
* @property {number} validFromTimestamp - Timestamp from which the report is valid
* @property {number} observationsTimestamp - Timestamp of the observations
* @property {string} fullReport - Full report data in hex format
*/
/**
* SingleReportResponse is the response structure for a single report
* @typedef {Object} SingleReportResponse
* @property {SingleReport} report - Report data
*/
/**
* Generates HMAC signature for API authentication
* @param {string} method - HTTP method (GET, POST, etc.)
* @param {string} path - Request path including query parameters
* @param {Buffer|string} body - Request body (empty string for GET)
* @param {string} apiKey - API key for authentication
* @param {string} apiSecret - API secret for signature generation
* @returns {Object} Object containing signature and timestamp
*/
function generateHMAC(method, path, body, apiKey, apiSecret) {
// Generate timestamp (milliseconds since Unix epoch)
const timestamp = Date.now()
// Create body hash (empty for GET request)
const bodyHash = crypto
.createHash("sha256")
.update(body || "")
.digest("hex")
// Create string to sign
const stringToSign = `${method} ${path} ${bodyHash} ${apiKey} ${timestamp}`
// Generate HMAC-SHA256 signature
const signature = crypto.createHmac("sha256", apiSecret).update(stringToSign).digest("hex")
return { signature, timestamp }
}
/**
* Generates authentication headers for API requests
* @param {string} method - HTTP method
* @param {string} path - Request path with query parameters
* @param {string} apiKey - API key
* @param {string} apiSecret - API secret
* @returns {Object} Headers object for the request
*/
function generateAuthHeaders(method, path, apiKey, apiSecret) {
const { signature, timestamp } = generateHMAC(method, path, "", apiKey, apiSecret)
return {
Authorization: apiKey,
"X-Authorization-Timestamp": timestamp.toString(),
"X-Authorization-Signature-SHA256": signature,
}
}
/**
* Makes an HTTP request and returns a promise resolving to the response
* @param {Object} options - HTTP request options
* @returns {Promise} Response data as string
*/
function makeRequest(options) {
return new Promise((resolve, reject) => {
const req = https.request(options, (res) => {
let data = ""
res.on("data", (chunk) => {
data += chunk
})
res.on("end", () => {
if (res.statusCode >= 200 && res.statusCode < 300) {
resolve(data)
} else {
reject(new Error(`API error (status code ${res.statusCode}): ${data}`))
}
})
})
req.on("error", (error) => {
reject(new Error(`Request error: ${error.message}`))
})
req.end()
})
}
/**
* Fetches a single report for a specific feed
* @param {string} feedID - The feed ID to fetch the report for
* @returns {Promise} Promise resolving to the report data
*/
async function fetchSingleReport(feedID) {
// Get API credentials from environment variables
const apiKey = process.env.STREAMS_API_KEY
const apiSecret = process.env.STREAMS_API_SECRET
// Validate credentials
if (!apiKey || !apiSecret) {
throw new Error("API credentials not set. Please set STREAMS_API_KEY and STREAMS_API_SECRET environment variables")
}
// API connection details
const method = "GET"
const host = "api.testnet-dataengine.chain.link"
const path = "/api/v1/reports/latest"
const queryString = `?feedID=${feedID}`
const fullPath = path + queryString
// Create request options with authentication headers
const options = {
hostname: host,
path: fullPath,
method: method,
headers: generateAuthHeaders(method, fullPath, apiKey, apiSecret),
}
try {
// Make the request
const responseData = await makeRequest(options)
// Parse the response
const response = JSON.parse(responseData)
return response.report
} catch (error) {
throw new Error(`Failed to fetch report: ${error.message}`)
}
}
// Main execution function to support async/await
async function main() {
try {
// Example feed ID (ETH/USD)
const feedID = "0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"
console.log(`Fetching latest report for feed ID: ${feedID}`)
// Fetch the report
const report = await fetchSingleReport(feedID)
// Display the report
console.log("Successfully retrieved report:")
console.log(` Feed ID: ${report.feedID}`)
console.log(` Valid From: ${report.validFromTimestamp}`)
console.log(` Observations Timestamp: ${report.observationsTimestamp}`)
console.log(` Full Report: ${report.fullReport}`)
} catch (error) {
console.error("Error:", error.message)
process.exit(1)
}
}
// Start the main function
main()
```
**Expected output**:
```bash
Fetching latest report for feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Successfully retrieved report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747922906
Observations Timestamp: 1747922906
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de70cf179baa632f183[...]31208430b586890eff87d12750
```
### Production Considerations
While this example demonstrates the authentication mechanism, production applications should consider:
- **HTTP client libraries**: Use robust libraries like `axios` or `node-fetch` for better error handling and retry capabilities
- **Retry logic**: Implement exponential backoff for transient failures
- **Request timeouts**: Add timeout handling to prevent hanging requests
- **Error types**: Create custom error classes for better error categorization
- **Logging**: Use structured logging libraries like `winston` or `pino` instead of `console.log`
- **Configuration**: Use environment configuration libraries like `dotenv` for managing credentials
- **Input validation**: Validate feed IDs and other inputs before making requests
- **Testing**: Add unit tests for HMAC generation and integration tests for API calls
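As one sketch of the retry-logic point above, a minimal exponential-backoff wrapper could look like the following. The retry count and delay schedule are illustrative; tune them for your application:

```javascript
// Retry an async operation with exponential backoff: 250 ms, 500 ms, 1000 ms, ...
async function withRetry(fn, { retries = 3, baseDelayMs = 250 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      // Give up once the retry budget is exhausted
      if (attempt >= retries) throw err
      const delayMs = baseDelayMs * 2 ** attempt
      await new Promise((resolve) => setTimeout(resolve, delayMs))
    }
  }
}

// Usage with the fetchSingleReport function from the example above:
// const report = await withRetry(() => fetchSingleReport(feedID))
```

In production you would typically retry only transient failures (timeouts, 5xx responses) and surface authentication errors immediately, since retrying a bad signature will never succeed.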
## WebSocket Authentication Example
### Prerequisites
- [Node.js](https://nodejs.org/en/download/) (v20 or later)
- API credentials from Chainlink Data Streams
### Project Setup
1. Initialize a new Node.js project in your directory:
```bash
npm init -y
```
2. Install the required WebSocket dependency:
```bash
npm install ws
```
### Running the Example
1. Create a file named `ws-auth-example.js` with the example code shown below
2. Set your API credentials as environment variables:
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
3. Run the example:
```bash
node ws-auth-example.js
```
4. Press Ctrl+C to stop the WebSocket stream when you're done
**Example code**:
```javascript
const crypto = require("crypto")
const WebSocket = require("ws")
// Constants for ping/pong intervals and timeouts
const PING_INTERVAL = 5000 // 5 seconds
const PONG_TIMEOUT = 10000 // 10 seconds
const CONNECTION_TIMEOUT = 30000 // 30 seconds
/**
* FeedReport represents the data structure received from the WebSocket
* @typedef {Object} FeedReport
* @property {Object} report - The report object
* @property {string} report.feedID - Feed identifier
* @property {string} report.fullReport - Full report data in hex format
*/
/**
* Generates HMAC signature for API authentication
* @param {string} method - HTTP method (GET for WebSocket connections)
* @param {string} path - Request path including query parameters
* @param {string} apiKey - API key for authentication
* @param {string} apiSecret - API secret for signature generation
* @returns {Object} Object containing signature and timestamp
*/
function generateHMAC(method, path, apiKey, apiSecret) {
// Generate timestamp (milliseconds since Unix epoch)
const timestamp = Date.now()
// Create body hash (empty for WebSocket connection)
const bodyHash = crypto.createHash("sha256").update("").digest("hex")
// Create string to sign
const stringToSign = `${method} ${path} ${bodyHash} ${apiKey} ${timestamp}`
// Generate HMAC-SHA256 signature
const signature = crypto.createHmac("sha256", apiSecret).update(stringToSign).digest("hex")
return { signature, timestamp }
}
/**
* Sets up the WebSocket connection with proper authentication
* @param {string} apiKey - API key for authentication
* @param {string} apiSecret - API secret for signature generation
* @param {string[]} feedIDs - Array of feed IDs to subscribe to
* @returns {Promise} Promise resolving to WebSocket connection
*/
function setupWebSocketConnection(apiKey, apiSecret, feedIDs) {
return new Promise((resolve, reject) => {
// Validate feed IDs
if (!feedIDs || feedIDs.length === 0) {
reject(new Error("No feed ID(s) provided"))
return
}
// WebSocket connection details
const host = "ws.testnet-dataengine.chain.link"
const path = "/api/v1/ws"
const queryString = `?feedIDs=${feedIDs.join(",")}`
const fullPath = path + queryString
// Generate authentication signature and timestamp
const { signature, timestamp } = generateHMAC("GET", fullPath, apiKey, apiSecret)
// Create WebSocket URL
const wsURL = `wss://${host}${fullPath}`
console.log("Connecting to:", wsURL)
// Set up the WebSocket connection with authentication headers
const ws = new WebSocket(wsURL, {
headers: {
Authorization: apiKey,
"X-Authorization-Timestamp": timestamp.toString(),
"X-Authorization-Signature-SHA256": signature,
},
timeout: CONNECTION_TIMEOUT,
})
// Handle connection errors
ws.on("error", (err) => {
reject(new Error(`WebSocket connection error: ${err.message}`))
})
// Resolve the promise when the connection is established
ws.on("open", () => {
console.log("WebSocket connection established")
resolve(ws)
})
})
}
/**
* Sets up ping/pong mechanism to keep the connection alive
* @param {WebSocket} ws - WebSocket connection
*/
function setupPingPong(ws) {
// Set up ping interval
const pingInterval = setInterval(() => {
if (ws.readyState === WebSocket.OPEN) {
console.log("Sending ping to keep connection alive...")
ws.ping(crypto.randomBytes(8))
// Set up pong timeout - if we don't receive a pong within timeout, close the connection
ws.pongTimeout = setTimeout(() => {
console.log("No pong received, closing connection...")
ws.terminate()
}, PONG_TIMEOUT)
}
}, PING_INTERVAL)
// Clear pong timeout when pong is received
ws.on("pong", () => {
console.log("Received pong from server")
clearTimeout(ws.pongTimeout)
})
// Clear intervals on close
ws.on("close", () => {
clearInterval(pingInterval)
clearTimeout(ws.pongTimeout)
})
}
/**
* Handle WebSocket messages
* @param {WebSocket} ws - WebSocket connection
*/
function handleMessages(ws) {
ws.on("message", (data) => {
try {
// Parse the message
const message = JSON.parse(data.toString())
// Check if it has the expected report format
if (message.report && message.report.feedID) {
const report = message.report
console.log("\nReceived new report:")
console.log(` Feed ID: ${report.feedID}`)
// Display timestamps if available
if (report.validFromTimestamp) {
console.log(` Valid From: ${report.validFromTimestamp}`)
}
if (report.observationsTimestamp) {
console.log(` Observations Timestamp: ${report.observationsTimestamp}`)
}
// Display the full report with truncation for readability
if (report.fullReport) {
const reportPreview =
report.fullReport.length > 40 ? `${report.fullReport.substring(0, 40)}...` : report.fullReport
console.log(` Full Report: ${reportPreview}`)
}
} else {
console.log("Received message with unexpected format:", message)
}
} catch (error) {
console.error("Error parsing message:", error)
console.log("Raw message:", data.toString())
}
})
}
/**
* Main function to set up and manage the WebSocket connection
*/
async function main() {
try {
// Get API credentials from environment variables
const apiKey = process.env.STREAMS_API_KEY
const apiSecret = process.env.STREAMS_API_SECRET
// Validate credentials
if (!apiKey || !apiSecret) {
throw new Error(
"API credentials not set. Please set STREAMS_API_KEY and STREAMS_API_SECRET environment variables"
)
}
// Example feed ID (ETH/USD)
const feedIDs = ["0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"]
// Set up WebSocket connection
const ws = await setupWebSocketConnection(apiKey, apiSecret, feedIDs)
// Set up ping/pong to keep connection alive
setupPingPong(ws)
// Handle incoming messages
handleMessages(ws)
// Set up graceful shutdown on SIGINT (Ctrl+C)
process.on("SIGINT", () => {
console.log("\nInterrupt signal received, closing connection...")
// Close the WebSocket connection gracefully
if (ws.readyState === WebSocket.OPEN) {
ws.close(1000, "Client shutting down")
}
// Allow some time for the close to be sent, then exit
setTimeout(() => {
console.log("Exiting...")
process.exit(0)
}, 1000)
})
} catch (error) {
console.error("Error:", error.message)
process.exit(1)
}
}
// Start the main function
main()
```
**Expected output**:
```bash
Connecting to: wss://ws.testnet-dataengine.chain.link/api/v1/ws?feedIDs=0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
WebSocket connection established
Received new report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747929684
Observations Timestamp: 1747929684
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
Received new report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747929685
Observations Timestamp: 1747929685
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
Sending ping to keep connection alive...
Received pong from server
^C
Interrupt signal received, closing connection...
Exiting...
```
### Production Considerations
While this example already includes many production-ready features (keepalive, timeouts, graceful shutdown), production applications should additionally consider:
- **Automatic reconnection**: Implement exponential backoff reconnection logic for network disruptions
- **Message queuing**: Buffer outgoing messages during reconnection attempts
- **WebSocket libraries**: Consider using libraries like `socket.io-client` or `reconnecting-websocket` for additional features
- **Structured logging**: Use `winston` or `pino` with log levels instead of `console.log`
- **Metrics collection**: Track connection status, message rates, and latency
- **Configuration management**: Use `dotenv` or similar for managing environment variables and timeouts
- **Error categorization**: Create custom error classes to distinguish between retriable and fatal errors
- **Testing**: Add unit tests for HMAC generation and mock WebSocket server for integration tests
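The automatic-reconnection point above can be sketched as a capped exponential backoff schedule between reconnection attempts. The base and cap values here are illustrative:

```javascript
// Delay before reconnection attempt N: doubles each attempt, capped at 30 s.
// attempt 0 → 1000 ms, 1 → 2000 ms, 2 → 4000 ms, ... capped at 30000 ms
function reconnectDelayMs(attempt, baseMs = 1000, capMs = 30000) {
  return Math.min(baseMs * 2 ** attempt, capMs)
}
```

A reconnect loop would wait `reconnectDelayMs(attempt)` after each failed connection, call `setupWebSocketConnection` again, and reset `attempt` to 0 once a connection is established. Adding random jitter to each delay helps avoid many clients reconnecting in lockstep after an outage.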
---
# API Authentication - Rust examples
Source: https://docs.chain.link/data-streams/reference/data-streams-api/authentication/rust-examples
Below are complete examples for authenticating with the Data Streams API in Rust. Each example shows how to properly generate the required headers and make a request.
To learn more about the Data Streams API authentication, see the [Data Streams Authentication](/data-streams/reference/data-streams-api/authentication) page.
**Note**: The Data Streams SDKs handle authentication automatically. If you're using the [Go SDK](/data-streams/reference/data-streams-api/go-sdk), [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk), or [TypeScript SDK](/data-streams/reference/data-streams-api/ts-sdk), you don't need to implement the authentication logic manually.
## API Authentication Example
### Requirements
- [Rust](https://www.rust-lang.org/tools/install) (v1.70 or later recommended)
- API credentials from Chainlink Data Streams
### Running the Example
1. Create a `Cargo.toml` file:
```toml
[package]
name = "chainlink-streams-direct-auth"
version = "0.1.0"
edition = "2021"
[dependencies]
reqwest = { version = "0.11", features = ["json", "blocking"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
tokio = { version = "1.32", features = ["full"] }
hmac = "0.12"
sha2 = "0.10"
hex = "0.4"
chrono = "0.4"
```
2. Create a `src/main.rs` file:
```rust
use hmac::{Hmac, Mac};
use reqwest::header::{HeaderMap, HeaderValue};
use serde::{Deserialize, Serialize};
use sha2::{Digest, Sha256};
use std::env;
use std::error::Error;
use std::time::{SystemTime, UNIX_EPOCH};
type HmacSha256 = Hmac<Sha256>;
// SingleReport represents a data feed report structure
#[derive(Debug, Deserialize, Serialize)]
struct SingleReport {
#[serde(rename = "feedID")]
feed_id: String,
#[serde(rename = "validFromTimestamp")]
valid_from_timestamp: u32,
#[serde(rename = "observationsTimestamp")]
observations_timestamp: u32,
#[serde(rename = "fullReport")]
full_report: String,
}
// SingleReportResponse is the response structure for a single report
#[derive(Debug, Deserialize, Serialize)]
struct SingleReportResponse {
report: SingleReport,
}
// Generate HMAC signature for API authentication
fn generate_hmac(
method: &str,
path: &str,
body: &[u8],
api_key: &str,
api_secret: &str
) -> Result<(String, u128), Box<dyn Error>> {
// Generate timestamp (milliseconds since Unix epoch)
let timestamp = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("Time went backwards")
.as_millis();
// Generate body hash
let mut hasher = Sha256::new();
hasher.update(body);
let body_hash = hex::encode(hasher.finalize());
// Create string to sign
let string_to_sign = format!("{} {} {} {} {}", method, path, body_hash, api_key, timestamp);
// Generate HMAC-SHA256 signature
let mut mac = HmacSha256::new_from_slice(api_secret.as_bytes())?;
mac.update(string_to_sign.as_bytes());
let signature = hex::encode(mac.finalize().into_bytes());
Ok((signature, timestamp))
}
// Generate authentication headers for API requests
fn generate_auth_headers(
method: &str,
path: &str,
api_key: &str,
api_secret: &str
) -> Result<HeaderMap, Box<dyn Error>> {
let (signature, timestamp) = generate_hmac(method, path, &[], api_key, api_secret)?;
let mut headers = HeaderMap::new();
headers.insert("Authorization", HeaderValue::from_str(api_key)?);
headers.insert(
"X-Authorization-Timestamp",
HeaderValue::from_str(&timestamp.to_string())?
);
headers.insert(
"X-Authorization-Signature-SHA256",
HeaderValue::from_str(&signature)?
);
Ok(headers)
}
// Fetch a single report for a specific feed
async fn fetch_single_report(feed_id: &str) -> Result<SingleReport, Box<dyn Error>> {
// Get API credentials from environment variables
let api_key = env::var("STREAMS_API_KEY")
.map_err(|_| "API credentials not set. Please set STREAMS_API_KEY environment variable")?;
let api_secret = env::var("STREAMS_API_SECRET")
.map_err(|_| "API credentials not set. Please set STREAMS_API_SECRET environment variable")?;
// API connection details
let method = "GET";
let host = "api.testnet-dataengine.chain.link";
let path = "/api/v1/reports/latest";
let full_path = format!("{}?feedID={}", path, feed_id);
// Create headers with authentication
let headers = generate_auth_headers(method, &full_path, &api_key, &api_secret)?;
// Create and execute the request
let url = format!("https://{}{}", host, full_path);
let client = reqwest::Client::new();
let response = client
.get(&url)
.headers(headers)
.send()
.await?;
// Check for non-success status code
if !response.status().is_success() {
let status = response.status();
let error_text = response.text().await?;
return Err(format!("API error (status code {}): {}", status, error_text).into());
}
// Parse the response
let report_resp: SingleReportResponse = response.json().await?;
Ok(report_resp.report)
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
// Example feed ID (ETH/USD)
let feed_id = "0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782";
println!("Fetching latest report for feed ID: {}", feed_id);
// Fetch the report
let report = fetch_single_report(feed_id).await?;
// Display the report
println!("Successfully retrieved report:");
println!(" Feed ID: {}", report.feed_id);
println!(" Valid From: {}", report.valid_from_timestamp);
println!(" Observations Timestamp: {}", report.observations_timestamp);
// Display the full report with truncation for readability
let report_preview = if report.full_report.len() > 40 {
format!("{}...", &report.full_report[..40])
} else {
report.full_report.clone()
};
println!(" Full Report: {}", report_preview);
Ok(())
}
```
3. Set your API credentials as environment variables:
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
4. Run the example with `cargo run`
**Expected output**:
```bash
Fetching latest report for feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Successfully retrieved report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747933113
Observations Timestamp: 1747933113
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
```
### Production Considerations
While this example demonstrates the authentication mechanism, production applications should consider:
- **Connection resilience**: Implement retry logic with exponential backoff for network failures
- **Error handling**: Use custom error types instead of string errors for better error management
- **Logging**: Replace `println!` with structured logging (e.g., `tracing`, `env_logger`)
- **Configuration**: Make API endpoints configurable through environment variables
- **Resource management**: Implement graceful shutdown for long-running connections
- **Testing**: Add unit tests for HMAC generation and integration tests for API calls
For production use, consider using the [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk) which handles authentication automatically and provides built-in fault tolerance.
## WebSocket Authentication Example
### Requirements
- [Rust](https://www.rust-lang.org/tools/install) (v1.70 or later recommended)
- API credentials from Chainlink Data Streams
### Running the Example
1. Create a `Cargo.toml` file:
```toml
[package]
name = "chainlink-streams-direct-auth"
version = "0.1.0"
edition = "2021"
[dependencies]
tokio = { version = "1.32", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
futures-util = "0.3"
hmac = "0.12"
sha2 = "0.10"
hex = "0.4"
chrono = "0.4"
url = "2.4"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
```
2. Create a `src/main.rs` file:
```rust
use hmac::{ Hmac, Mac };
use sha2::{ Digest, Sha256 };
use std::{ env, error::Error, time::{ SystemTime, UNIX_EPOCH } };
use tokio_tungstenite::{
connect_async,
tungstenite::client::IntoClientRequest,
tungstenite::protocol::Message,
};
use futures_util::{ StreamExt, SinkExt };
use serde::{ Deserialize, Serialize };
type HmacSha256 = Hmac<Sha256>;
// Report structure for deserializing WebSocket messages
#[derive(Debug, Deserialize, Serialize)]
struct ReportWrapper {
report: Report,
}
#[derive(Debug, Deserialize, Serialize)]
struct Report {
#[serde(rename = "feedID")]
feed_id: String,
#[serde(rename = "fullReport")]
full_report: String,
#[serde(rename = "validFromTimestamp")]
valid_from_timestamp: u64,
#[serde(rename = "observationsTimestamp")]
observations_timestamp: u64,
}
// Generate HMAC signature for API authentication
fn generate_hmac(
method: &str,
path: &str,
body: &[u8],
api_key: &str,
api_secret: &str
) -> Result<(String, u128), Box<dyn Error>> {
// Generate timestamp (milliseconds since Unix epoch)
let timestamp = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("Time went backwards")
.as_millis();
// Generate body hash
let mut hasher = Sha256::new();
hasher.update(body);
let body_hash = hex::encode(hasher.finalize());
// Create string to sign
let string_to_sign = format!("{} {} {} {} {}", method, path, body_hash, api_key, timestamp);
// Generate HMAC-SHA256 signature
let mut mac = HmacSha256::new_from_slice(api_secret.as_bytes())?;
mac.update(string_to_sign.as_bytes());
let signature = hex::encode(mac.finalize().into_bytes());
Ok((signature, timestamp))
}
// Set up WebSocket connection with proper authentication
async fn setup_websocket_connection(
api_key: &str,
api_secret: &str,
feed_ids: &[&str]
) -> Result<
tokio_tungstenite::WebSocketStream<tokio_tungstenite::MaybeTlsStream<tokio::net::TcpStream>>,
Box<dyn Error>
> {
// Validate feed IDs
if feed_ids.is_empty() {
return Err("No feed ID(s) provided".into());
}
// WebSocket connection details
let host = "ws.testnet-dataengine.chain.link";
let path = "/api/v1/ws";
let feed_ids_joined = feed_ids.join(",");
let full_path = format!("{}?feedIDs={}", path, feed_ids_joined);
// Generate authentication signature and timestamp
let (signature, timestamp) = generate_hmac("GET", &full_path, &[], api_key, api_secret)?;
// Create WebSocket URL
let ws_url = format!("wss://{}{}", host, full_path);
println!("Connecting to: {}", ws_url);
// Create request with auth headers
let mut request = ws_url.into_client_request()?;
request.headers_mut().insert("Authorization", api_key.parse()?);
request.headers_mut().insert("X-Authorization-Timestamp", timestamp.to_string().parse()?);
request.headers_mut().insert("X-Authorization-Signature-SHA256", signature.parse()?);
// Connect to WebSocket server
let (ws_stream, _) = connect_async(request).await?;
println!("WebSocket connection established");
Ok(ws_stream)
}
// Handle incoming WebSocket messages
async fn handle_messages(
mut ws_stream: tokio_tungstenite::WebSocketStream<tokio_tungstenite::MaybeTlsStream<tokio::net::TcpStream>>
) {
println!("Waiting for incoming messages... (press Ctrl+C to exit)");
// Process messages as they arrive
while let Some(msg) = ws_stream.next().await {
match msg {
Ok(msg) => {
match msg {
Message::Text(text) => {
// Try to parse JSON
if let Ok(report_wrapper) = serde_json::from_str::<ReportWrapper>(&text) {
let report = &report_wrapper.report;
println!("\nReceived new report:");
println!(" Feed ID: {}", report.feed_id);
println!(" Valid From: {}", report.valid_from_timestamp);
println!(" Observations Timestamp: {}", report.observations_timestamp);
// Display the full report with truncation for readability
let report_preview = if report.full_report.len() > 40 {
format!("{}...", &report.full_report[..40])
} else {
report.full_report.clone()
};
println!(" Full Report: {}", report_preview);
} else {
println!("Received text message: {}", text);
}
}
Message::Binary(data) => {
// Try to parse binary as JSON
if let Ok(text) = String::from_utf8(data.clone()) {
if let Ok(report_wrapper) = serde_json::from_str::<ReportWrapper>(&text) {
let report = &report_wrapper.report;
println!("\nReceived new report:");
println!(" Feed ID: {}", report.feed_id);
println!(" Valid From: {}", report.valid_from_timestamp);
println!(
" Observations Timestamp: {}",
report.observations_timestamp
);
// Display the full report with truncation
let report_preview = if report.full_report.len() > 40 {
format!("{}...", &report.full_report[..40])
} else {
report.full_report.clone()
};
println!(" Full Report: {}", report_preview);
} else {
println!("Received binary message: {} bytes", data.len());
}
} else {
println!("Received binary message (not UTF-8): {} bytes", data.len());
}
}
Message::Ping(ping_data) => {
println!("Received ping, sending pong response");
// Send a pong with the same data to keep the connection alive
if let Err(e) = ws_stream.send(Message::Pong(ping_data)).await {
eprintln!("Error sending pong: {}", e);
}
}
Message::Pong(_) => println!("Received pong"),
Message::Close(_) => {
println!("Received close message");
break;
}
Message::Frame(_) => println!("Received raw frame"),
}
}
Err(e) => {
eprintln!("Error receiving message: {}", e);
break;
}
}
}
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
// Get API credentials from environment variables
let api_key = env::var("STREAMS_API_KEY")
.map_err(|_| "API credentials not set. Please set STREAMS_API_KEY environment variable")?;
let api_secret = env::var("STREAMS_API_SECRET")
.map_err(|_| "API credentials not set. Please set STREAMS_API_SECRET environment variable")?;
// Example feed IDs (ETH/USD)
let feed_ids = vec!["0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"];
// Set up WebSocket connection
let ws_stream = setup_websocket_connection(&api_key, &api_secret, &feed_ids).await?;
// Set up a task to handle WebSocket communication
let ws_task = tokio::spawn(handle_messages(ws_stream));
// Wait for user to press Ctrl+C
tokio::signal::ctrl_c().await?;
println!("Shutting down...");
// Clean up resources
ws_task.abort();
Ok(())
}
```
3. Set your API credentials as environment variables:
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
4. Run the example:
```bash
cargo run
```
**Expected output**:
```bash
Connecting to: wss://ws.testnet-dataengine.chain.link/api/v1/ws?feedIDs=0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
WebSocket connection established
Waiting for incoming messages... (press Ctrl+C to exit)
Received ping, sending pong response
Received new report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747934358
Observations Timestamp: 1747934358
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
Received new report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747934359
Observations Timestamp: 1747934359
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
^CShutting down...
```
### Production Considerations
While this example demonstrates WebSocket authentication, production applications should consider:
- **Connection resilience**: Implement automatic reconnection with exponential backoff
- **Heartbeat mechanism**: Send periodic pings to detect stale connections
- **Message buffering**: Queue messages during reconnection attempts
- **Error handling**: Use custom error types for better error categorization
- **Logging**: Replace `println!` with structured logging (e.g., `tracing`, `env_logger`)
- **Configuration**: Make WebSocket endpoints and timeouts configurable
- **Graceful shutdown**: Properly close WebSocket connections with close frames
- **Testing**: Add tests for connection handling and message parsing
For production use, consider using the [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk) which handles authentication automatically and provides built-in fault tolerance.
---
# API Authentication - TypeScript examples
Source: https://docs.chain.link/data-streams/reference/data-streams-api/authentication/typescript-examples
Below are complete examples for authenticating with the Data Streams API in TypeScript, using Node.js. Each example shows how to properly generate the required headers and make a request.
To learn more about the Data Streams API authentication, see the [Data Streams Authentication](/data-streams/reference/data-streams-api/authentication) page.
**Note**: The Data Streams SDKs handle authentication automatically. If you're using the [Go SDK](/data-streams/reference/data-streams-api/go-sdk), [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk), or [TypeScript SDK](/data-streams/reference/data-streams-api/ts-sdk), you don't need to implement the authentication logic manually.
## API Authentication Example
### Prerequisites
- [Node.js](https://nodejs.org/en/download/) (v20 or later recommended)
- API credentials from Chainlink Data Streams
### Project Setup
1. Set up your TypeScript environment:
```bash
# Install TypeScript tools
npm install -g ts-node
npm install --save-dev typescript @types/node
```
2. Create a tsconfig.json file in your project folder:
```json
{
"compilerOptions": {
"target": "ES2020",
"module": "commonjs",
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": true,
"skipLibCheck": true
}
}
```
### Running the Example
1. Create a file named auth-example.ts with the example code shown below
2. Set your API credentials as environment variables:
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
3. Run the example:
```bash
ts-node auth-example.ts
```
**Example code**:
```typescript
import crypto from "crypto"
import https from "https"
import { IncomingMessage, ClientRequest } from "http"
/**
* Single report data structure
*/
interface SingleReport {
feedID: string
validFromTimestamp: number
observationsTimestamp: number
fullReport: string
}
/**
* SingleReportResponse is the response structure for a single report
*/
interface SingleReportResponse {
report: SingleReport
}
/**
* Generates HMAC signature for API authentication
* @param method - HTTP method (GET, POST, etc.)
* @param path - Request path including query parameters
* @param body - Request body (empty string for GET)
* @param apiKey - API key for authentication
* @param apiSecret - API secret for signature generation
* @returns Object containing signature and timestamp
*/
function generateHMAC(
method: string,
path: string,
body: string | Buffer,
apiKey: string,
apiSecret: string
): { signature: string; timestamp: number } {
// Generate timestamp (milliseconds since Unix epoch)
const timestamp: number = Date.now()
// Create body hash (empty for GET request)
const bodyHash: string = crypto
.createHash("sha256")
.update(body || "")
.digest("hex")
// Create string to sign
const stringToSign: string = `${method} ${path} ${bodyHash} ${apiKey} ${timestamp}`
// Generate HMAC-SHA256 signature
const signature: string = crypto.createHmac("sha256", apiSecret).update(stringToSign).digest("hex")
return { signature, timestamp }
}
/**
* Generates authentication headers for API requests
* @param method - HTTP method
* @param path - Request path with query parameters
* @param apiKey - API key
* @param apiSecret - API secret
* @returns Headers object for the request
*/
function generateAuthHeaders(method: string, path: string, apiKey: string, apiSecret: string): Record<string, string> {
const { signature, timestamp } = generateHMAC(method, path, "", apiKey, apiSecret)
return {
Authorization: apiKey,
"X-Authorization-Timestamp": timestamp.toString(),
"X-Authorization-Signature-SHA256": signature,
}
}
/**
* Makes an HTTP request and returns a promise resolving to the response
* @param options - HTTP request options
* @returns Response data as string
*/
function makeRequest(options: https.RequestOptions): Promise<string> {
return new Promise((resolve, reject) => {
const req: ClientRequest = https.request(options, (res: IncomingMessage) => {
let data = ""
res.on("data", (chunk: Buffer) => {
data += chunk
})
res.on("end", () => {
if (res.statusCode && res.statusCode >= 200 && res.statusCode < 300) {
resolve(data)
} else {
reject(new Error(`API error (status code ${res.statusCode}): ${data}`))
}
})
})
req.on("error", (error: Error) => {
reject(new Error(`Request error: ${error.message}`))
})
req.end()
})
}
/**
* Fetches a single report for a specific feed
* @param feedID - The feed ID to fetch the report for
* @returns Promise resolving to the report data
*/
async function fetchSingleReport(feedID: string): Promise<SingleReport> {
// Get API credentials from environment variables
const apiKey = process.env.STREAMS_API_KEY
const apiSecret = process.env.STREAMS_API_SECRET
// Validate credentials
if (!apiKey || !apiSecret) {
throw new Error("API credentials not set. Please set STREAMS_API_KEY and STREAMS_API_SECRET environment variables")
}
// API connection details
const method = "GET"
const host = "api.testnet-dataengine.chain.link"
const path = "/api/v1/reports/latest"
const queryString = `?feedID=${feedID}`
const fullPath = path + queryString
// Create request options with authentication headers
const options: https.RequestOptions = {
hostname: host,
path: fullPath,
method: method,
headers: generateAuthHeaders(method, fullPath, apiKey, apiSecret),
}
try {
// Make the request
const responseData = await makeRequest(options)
// Parse the response
const response = JSON.parse(responseData) as SingleReportResponse
return response.report
} catch (error: any) {
throw new Error(`Failed to fetch report: ${error.message}`)
}
}
/**
* Main execution function to support async/await
*/
async function main(): Promise<void> {
try {
// Example feed ID (ETH/USD)
const feedID = "0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"
console.log(`Fetching latest report for feed ID: ${feedID}`)
// Fetch the report
const report = await fetchSingleReport(feedID)
// Display the report
console.log("Successfully retrieved report:")
console.log(` Feed ID: ${report.feedID}`)
console.log(` Valid From: ${report.validFromTimestamp}`)
console.log(` Observations Timestamp: ${report.observationsTimestamp}`)
console.log(` Full Report: ${report.fullReport}`)
} catch (error: any) {
console.error("Error:", error.message)
process.exit(1)
}
}
// Start the main function
main()
```
**Expected output**:
```bash
Fetching latest report for feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Successfully retrieved report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747929367
Observations Timestamp: 1747929367
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8d69000000[...]79dd065e3f83ca7c0ca4aaa
```
### Production Considerations
While this example demonstrates the authentication mechanism, production applications should consider:
- **HTTP client libraries**: Use type-safe libraries like `axios` with TypeScript support for better error handling and retry capabilities
- **Retry logic**: Implement exponential backoff for transient failures with proper type definitions
- **Request timeouts**: Add typed timeout handling to prevent hanging requests
- **Error types**: Create custom error classes extending `Error` for better error categorization
- **Logging**: Use structured logging libraries like `winston` or `pino` with TypeScript definitions
- **Configuration**: Use libraries like `dotenv` with type-safe configuration schemas (e.g., using `zod`)
- **Input validation**: Use runtime type validation libraries for feed IDs and API responses
- **Testing**: Add unit tests with `jest` and `@types/jest` for type-safe testing
## WebSocket Authentication Example
### Prerequisites
- [Node.js](https://nodejs.org/en/download/) (v20 or later recommended)
- API credentials from Chainlink Data Streams
### Project Setup
1. Set up your TypeScript environment:
```bash
# Install TypeScript tools
npm install -g ts-node
npm install --save-dev typescript @types/node
```
2. Install the WebSocket library and its TypeScript definitions:
```bash
npm install ws @types/ws
```
3. Create a tsconfig.json file in your project folder:
```json
{
"compilerOptions": {
"target": "ES2020",
"module": "commonjs",
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": true,
"skipLibCheck": true
}
}
```
### Running the Example
1. Create a file named ws-auth-example.ts with the example code shown below
2. Set your API credentials as environment variables (if not already set):
```bash
export STREAMS_API_KEY="your-api-key"
export STREAMS_API_SECRET="your-api-secret"
```
3. Run the example:
```bash
ts-node ws-auth-example.ts
```
**Example code**:
```typescript
import crypto from "crypto"
import WebSocket from "ws"
// Constants for ping/pong intervals and timeouts
const PING_INTERVAL = 5000 // 5 seconds
const PONG_TIMEOUT = 10000 // 10 seconds
const CONNECTION_TIMEOUT = 30000 // 30 seconds
/**
* WebSocket with custom properties for TypeScript
*/
interface CustomWebSocket extends WebSocket {
pongTimeout?: NodeJS.Timeout
}
/**
* FeedReport represents the data structure received from the WebSocket
*/
interface FeedReport {
report: {
feedID: string
validFromTimestamp?: number
observationsTimestamp?: number
fullReport: string
}
}
/**
* Generates HMAC signature for API authentication
* @param method - HTTP method (GET for WebSocket connections)
* @param path - Request path including query parameters
* @param apiKey - API key for authentication
* @param apiSecret - API secret for signature generation
* @returns Object containing signature and timestamp
*/
function generateHMAC(
method: string,
path: string,
apiKey: string,
apiSecret: string
): { signature: string; timestamp: number } {
// Generate timestamp (milliseconds since Unix epoch)
const timestamp: number = Date.now()
// Create body hash (empty for WebSocket connection)
const bodyHash: string = crypto.createHash("sha256").update("").digest("hex")
// Create string to sign
const stringToSign: string = `${method} ${path} ${bodyHash} ${apiKey} ${timestamp}`
// Generate HMAC-SHA256 signature
const signature: string = crypto.createHmac("sha256", apiSecret).update(stringToSign).digest("hex")
return { signature, timestamp }
}
/**
* Sets up the WebSocket connection with proper authentication
* @param apiKey - API key for authentication
* @param apiSecret - API secret for signature generation
* @param feedIDs - Array of feed IDs to subscribe to
* @returns Promise resolving to WebSocket connection
*/
function setupWebSocketConnection(apiKey: string, apiSecret: string, feedIDs: string[]): Promise<CustomWebSocket> {
return new Promise((resolve, reject) => {
// Validate feed IDs
if (!feedIDs || feedIDs.length === 0) {
reject(new Error("No feed ID(s) provided"))
return
}
// WebSocket connection details
const host = "ws.testnet-dataengine.chain.link"
const path = "/api/v1/ws"
const queryString = `?feedIDs=${feedIDs.join(",")}`
const fullPath = path + queryString
// Generate authentication signature and timestamp
const { signature, timestamp } = generateHMAC("GET", fullPath, apiKey, apiSecret)
// Create WebSocket URL
const wsURL = `wss://${host}${fullPath}`
console.log("Connecting to:", wsURL)
// Set up the WebSocket connection with authentication headers
const ws = new WebSocket(wsURL, {
headers: {
Authorization: apiKey,
"X-Authorization-Timestamp": timestamp.toString(),
"X-Authorization-Signature-SHA256": signature,
},
handshakeTimeout: CONNECTION_TIMEOUT,
}) as CustomWebSocket
// Handle connection errors
ws.on("error", (err) => {
reject(new Error(`WebSocket connection error: ${err.message}`))
})
// Resolve the promise when the connection is established
ws.on("open", () => {
console.log("WebSocket connection established")
resolve(ws)
})
})
}
/**
* Sets up ping/pong mechanism to keep the connection alive
* @param ws - WebSocket connection
*/
function setupPingPong(ws: CustomWebSocket): void {
// Set up ping interval
const pingInterval = setInterval(() => {
if (ws.readyState === ws.OPEN) {
console.log("Sending ping to keep connection alive...")
ws.ping(crypto.randomBytes(8)) // randomBytes already returns a Buffer
// Set up pong timeout - if we don't receive a pong within timeout, close the connection
// Clear any previously armed timeout first; an overwritten timeout could never be
// cleared by the pong handler and would terminate a healthy connection
clearTimeout(ws.pongTimeout)
ws.pongTimeout = setTimeout(() => {
console.log("No pong received, closing connection...")
ws.terminate()
}, PONG_TIMEOUT)
}
}, PING_INTERVAL)
// Clear pong timeout when pong is received
ws.on("pong", () => {
console.log("Received pong from server")
clearTimeout(ws.pongTimeout)
})
// Clear intervals on close
ws.on("close", () => {
clearInterval(pingInterval)
clearTimeout(ws.pongTimeout)
})
}
/**
* Handle WebSocket messages
* @param ws - WebSocket connection
*/
function handleMessages(ws: CustomWebSocket): void {
ws.on("message", (data: Buffer | string) => {
try {
// Parse the message
const message = JSON.parse(data.toString()) as FeedReport
// Check if it has the expected report format
if (message.report && message.report.feedID) {
const report = message.report
console.log("\nReceived new report:")
console.log(` Feed ID: ${report.feedID}`)
// Display timestamps if available
if (report.validFromTimestamp) {
console.log(` Valid From: ${report.validFromTimestamp}`)
}
if (report.observationsTimestamp) {
console.log(` Observations Timestamp: ${report.observationsTimestamp}`)
}
// Display the full report with truncation for readability
if (report.fullReport) {
const reportPreview =
report.fullReport.length > 40 ? `${report.fullReport.substring(0, 40)}...` : report.fullReport
console.log(` Full Report: ${reportPreview}`)
}
} else {
console.log("Received message with unexpected format:", message)
}
} catch (error: any) {
console.error("Error parsing message:", error)
console.log("Raw message:", data.toString())
}
})
}
/**
* Main function to set up and manage the WebSocket connection
*/
async function main(): Promise<void> {
try {
// Get API credentials from environment variables
const apiKey = process.env.STREAMS_API_KEY
const apiSecret = process.env.STREAMS_API_SECRET
// Validate credentials
if (!apiKey || !apiSecret) {
throw new Error(
"API credentials not set. Please set STREAMS_API_KEY and STREAMS_API_SECRET environment variables"
)
}
// Example feed ID (ETH/USD)
const feedIDs = ["0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"]
// Set up WebSocket connection
const ws = await setupWebSocketConnection(apiKey, apiSecret, feedIDs)
// Set up ping/pong to keep connection alive
setupPingPong(ws)
// Handle incoming messages
handleMessages(ws)
// Set up graceful shutdown on SIGINT (Ctrl+C)
process.on("SIGINT", () => {
console.log("\nInterrupt signal received, closing connection...")
// Close the WebSocket connection gracefully
if (ws.readyState === ws.OPEN) {
ws.close(1000, "Client shutting down")
}
// Allow some time for the close to be sent, then exit
setTimeout(() => {
console.log("Exiting...")
process.exit(0)
}, 1000)
})
} catch (error: any) {
console.error("Error:", error.message)
process.exit(1)
}
}
// Start the main function
main()
```
**Expected output**:
```bash
Connecting to: wss://ws.testnet-dataengine.chain.link/api/v1/ws?feedIDs=0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
WebSocket connection established
Received new report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747930054
Observations Timestamp: 1747930054
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
[...]
Received new report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747930059
Observations Timestamp: 1747930059
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
Sending ping to keep connection alive...
Received pong from server
Received new report:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1747930060
Observations Timestamp: 1747930060
Full Report: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de...
^C
Interrupt signal received, closing connection...
Exiting...
```
### Production Considerations
While this example already includes many production-ready features (keepalive, timeouts, graceful shutdown), production applications should additionally consider:
- **Automatic reconnection**: Implement typed exponential backoff reconnection logic for network disruptions
- **Message queuing**: Create typed message buffers during reconnection attempts
- **WebSocket libraries**: Consider type-safe libraries like `socket.io-client` with full TypeScript support
- **Structured logging**: Use `winston` or `pino` with TypeScript definitions for type-safe logging
- **Metrics collection**: Implement typed metrics for connection status, message rates, and latency
- **Configuration management**: Use `dotenv` with type-safe schemas (e.g., `zod`) for environment configuration
- **Error categorization**: Create custom error classes with specific types for different failure scenarios
- **Testing**: Add unit tests with `jest` and WebSocket mocks for type-safe testing
---
# Data Streams SDK (Go)
Source: https://docs.chain.link/data-streams/reference/data-streams-api/go-sdk
This documentation provides a detailed reference for the Data Streams SDK for Go. It implements a client library that offers a domain-oriented abstraction for interacting with the Data Streams API and enables both point-in-time data retrieval and real-time data streaming with built-in fault tolerance capabilities.
## Requirements
- Go 1.22.4 or later
- Valid Chainlink Data Streams credentials
## Installation
```bash
go get github.com/smartcontractkit/data-streams-sdk/go
```
## streams
### Import
```go
import streams "github.com/smartcontractkit/data-streams-sdk/go"
```
### Types
#### `Client` interface
[Interface](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/client.go#L21) for interacting with the Data Streams API. Use the [`New`](/data-streams/reference/data-streams-api/go-sdk#new) function to create a new instance of the client.
###### Interface Methods
- `GetFeeds`: Lists all streams available to you.
```go
GetFeeds(ctx context.Context) (r []*feed.Feed, err error)
```
- `GetLatestReport`: Fetches the latest report available for the specified `FeedID`.
```go
GetLatestReport(ctx context.Context, id feed.ID) (r *ReportResponse, err error)
```
- `GetReports`: Fetches reports for the specified stream IDs and a given timestamp.
```go
GetReports(ctx context.Context, ids []feed.ID, timestamp uint64) ([]*ReportResponse, error)
```
- `GetReportPage`: Paginates the reports for a specified `FeedID` starting from a given timestamp.
```go
GetReportPage(ctx context.Context, id feed.ID, startTS uint64) (*ReportPage, error)
```
- `Stream`: Creates a real-time report stream for specified stream IDs.
```go
Stream(ctx context.Context, feedIDs []feed.ID) (Stream, error)
```
- `StreamWithStatusCallback`: Creates a real-time report stream for specified stream IDs with a callback function for connection status updates.
```go
StreamWithStatusCallback(ctx context.Context, feedIDs []feed.ID, connStatusCallback func(isConnected bool, host string, origin string)) (Stream, error)
```
#### `Config` struct
`Config` specifies the client configuration and dependencies. If you provide the `Logger` function, the client logs informational activity.
```go
type Config struct {
ApiKey string // Client API key
ApiSecret string // Client API secret
RestURL string // Rest API URL
WsURL string // Websocket API URL
WsHA bool // Use concurrent connections to multiple Streams servers
WsMaxReconnect int // Maximum number of reconnection attempts for Stream underlying connections
LogDebug bool // Log debug information
InsecureSkipVerify bool // Skip server certificate chain and host name verification
Logger func(format string, a ...any) // Logger function
// InspectHttpResponse intercepts http responses for rest requests.
// The response object must not be modified.
InspectHttpResponse func(*http.Response)
}
```
**Note**: If `WsMaxReconnect` is not specified, it defaults to 5 attempts.
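For context, a client configuration using these fields might look like the following sketch. This is not a definitive setup: the testnet URLs match the hosts used in the authentication examples above, and your environment may differ.

```go
import (
	"os"

	streams "github.com/smartcontractkit/data-streams-sdk/go"
)

// Minimal configuration sketch; testnet URLs are assumptions based on the
// hosts used in the authentication examples and may differ per environment.
cfg := streams.Config{
	ApiKey:    os.Getenv("STREAMS_API_KEY"),
	ApiSecret: os.Getenv("STREAMS_API_SECRET"),
	RestURL:   "https://api.testnet-dataengine.chain.link",
	WsURL:     "wss://ws.testnet-dataengine.chain.link",
	Logger:    streams.LogPrintf,
}
client, err := streams.New(cfg)
```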
#### `CtxKey` string
Custom context key type used for passing additional headers.
```go
type CtxKey string
```
- Constants:
- `CustomHeadersCtxKey`: Key for passing custom HTTP headers.
```go
const (
// CustomHeadersCtxKey is used as a key in the context.Context object
// to pass custom HTTP headers in an http.Header for the client to use.
// Custom header values overwrite client headers with the same key.
CustomHeadersCtxKey CtxKey = "CustomHeaders"
)
```
#### `ReportPage` struct
Represents a paginated response of reports.
```go
type ReportPage struct {
Reports []*ReportResponse // Slice of ReportResponse, representing individual report entries.
NextPageTS uint64 // Timestamp for the next page, used for fetching subsequent pages.
}
```
#### `ReportResponse` struct
Implements the report envelope that contains the full report payload, its stream ID and timestamps. Use the [`Decode`](/data-streams/reference/data-streams-api/go-sdk#decode) function to decode the report payload.
```go
type ReportResponse struct {
FeedID feed.ID `json:"feedID"`
FullReport []byte `json:"fullReport"`
ValidFromTimestamp uint64 `json:"validFromTimestamp"`
ObservationsTimestamp uint64 `json:"observationsTimestamp"`
}
```
##### Methods
- `MarshalJSON`: Serializes the `ReportResponse` into JSON.
```go
func (r *ReportResponse) MarshalJSON() ([]byte, error)
```
- `String`: Returns the string representation of the `ReportResponse`.
```go
func (r *ReportResponse) String() (s string)
```
- `UnmarshalJSON`: Deserializes the `ReportResponse` from JSON.
```go
func (r *ReportResponse) UnmarshalJSON(b []byte) (err error)
```
#### `Stats` struct
Operational statistics for a `Stream`.
```go
type Stats struct {
Accepted uint64 // Total number of accepted reports
Deduplicated uint64 // Total number of deduplicated reports when in HA
TotalReceived uint64 // Total number of received reports
PartialReconnects uint64 // Total number of partial reconnects when in HA
FullReconnects uint64 // Total number of full reconnects
ConfiguredConnections uint64 // Number of configured connections if in HA
ActiveConnections uint64 // Current number of active connections
}
```
##### Methods
- `String`: Returns a string representation of the `Stats`.
```go
func (s Stats) String() (st string)
```
#### `Stream` interface
[Interface](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/stream.go#L39) for managing a real-time data stream.
##### Interface Methods
- `Read`: Reads the next available report from the stream. This method will pause (block) the execution until one of the following occurs:
- A report is successfully received.
- The provided context is canceled, typically due to a timeout or a cancel signal.
- An error state is encountered in any of the underlying connections, such as a network failure.
```go
Read(context.Context) (*ReportResponse, error)
```
- `Stats`: Returns statistics about the stream operations as `Stats`.
```go
Stats() Stats
```
- `Close`: Closes the stream.
```go
Close() error
```
### High Availability (HA) Mode
The Data Streams SDK supports High Availability (HA) mode for WebSocket connections. When enabled with `WsHA: true`, the SDK maintains concurrent connections to multiple server instances to ensure high availability and fault tolerance and to minimize the risk of report gaps.
#### HA Mode Behavior
When high availability mode is enabled:
- The client queries the server for available origins using the `X-Cll-Available-Origins` header
- Multiple WebSocket connections are established to different server instances
- Reports are deduplicated when received across connections
- Partial reconnects occur when some connections fail while others remain active
- Full reconnects occur when all connections fail
#### HA Configuration
HA mode is controlled by the `WsHA` field in the client configuration:
```go
config := streams.Config{
WsHA: true, // Use concurrent connections to multiple Streams servers
}
```
### Functions
#### `New`
Creates a new client instance with the specified configuration.
```go
func New(cfg Config) (c Client, err error)
```
**Note**: `New` does not initialize any connections to the Data Streams service.
#### `LogPrintf`
Utility function for logging.
```go
func LogPrintf(format string, a ...any)
```
### Errors
#### `ErrStreamClosed`
Error returned when attempting to use a closed stream.
```go
var ErrStreamClosed = fmt.Errorf("client: use of closed Stream")
```
## feed
### Import
```go
import feed "github.com/smartcontractkit/data-streams-sdk/go/feed"
```
### Types
#### `Feed` struct
Identifies the report stream ID.
```go
type Feed struct {
FeedID ID `json:"feedID"`
}
```
Where `ID` is the unique identifier for the stream.
#### `FeedVersion` uint16
Represents the stream [report schema](/data-streams/reference/report-schema-overview) version.
```go
type FeedVersion uint16
```
##### Constants
- `FeedVersion1`: Version 1 schema
- `FeedVersion2`: Version 2 schema
- `FeedVersion3`: Version 3 schema
- `FeedVersion4`: Version 4 schema
- `FeedVersion5`: Version 5 schema
- `FeedVersion6`: Version 6 schema
- `FeedVersion7`: Version 7 schema
- `FeedVersion8`: Version 8 schema
- `FeedVersion9`: Version 9 schema
- `FeedVersion10`: Version 10 schema
#### `ID` [32]byte
Represents a unique identifier for a stream.
```go
type ID [32]byte
```
##### Methods
- `FromString`: Converts a string into an `ID`.
```go
func (f *ID) FromString(s string) (err error)
```
- `MarshalJSON`: Serializes the `ID` into JSON.
```go
func (f *ID) MarshalJSON() (b []byte, err error)
```
- `String`: Returns the string representation of the `ID`.
```go
func (f *ID) String() (id string)
```
- `UnmarshalJSON`: Deserializes the `ID` from JSON.
```go
func (f *ID) UnmarshalJSON(b []byte) (err error)
```
- `Version`: Returns the version of the stream.
```go
func (f *ID) Version() FeedVersion
```
## report
### Import
```go
import report "github.com/smartcontractkit/data-streams-sdk/go/report"
```
### Types
#### `Data` interface
[Interface](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/report/report.go#L20) that represents the actual report data and attributes.
```go
type Data interface {
v1.Data | v2.Data | v3.Data | v4.Data | v5.Data | v6.Data | v7.Data | v8.Data | v9.Data | v10.Data
Schema() abi.Arguments
}
```
#### `Report` struct
Represents the report envelope that contains the full report payload, its Feed ID, and timestamps.
```go
type Report[T Data] struct {
Data T
ReportContext [3][32]byte
ReportBlob []byte
RawRs [][32]byte
RawSs [][32]byte
RawVs [32]byte
}
```
### Functions
#### `Decode`
Decodes the serialized report bytes and the typed data they contain.
```go
func Decode[T Data](fullReport []byte) (r *Report[T], err error)
```
Example:
```go
payload, _ := hex.DecodeString(
`0006bd87830d5f336e205cf5c63329a1dab8f5d56812eaeb7c69300e66ab8e22000000000000000000000000000000000000000000000000000000000cf7ed13000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e0000000000000000000000000000000000000000000000000000000000000022000000000000000000000000000000000000000000000000000000000000003000101000101000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000012000030ab7d02fbba9c6304f98824524407b1f494741174320cfd17a2c22eec1de0000000000000000000000000000000000000000000000000000000066a8f5c60000000000000000000000000000000000000000000000000000000066a8f5c6000000000000000000000000000000000000000000000000000057810653dd9000000000000000000000000000000000000000000000000000541315da76d6100000000000000000000000000000000000000000000000000000000066aa474600000000000000000000000000000000000000000000000009a697ee4230350400000000000000000000000000000000000000000000000009a6506d1426d00000000000000000000000000000000000000000000000000009a77d03ae355fe0000000000000000000000000000000000000000000000000000000000000000672bac991f5233df89f581dc02a89dd8d48419e3558b247d3e65f4069fa45c36658a5a4820dc94fc47a88a21d83474c29ee38382c46b6f9a575b9ce8be4e689c03c76fac19fbec4a29dba704c72cc003a6be1f96af115e322321f0688e24720a5d9bd7136a1d96842ec89133058b888b2e6572b5d4114de2426195e038f1c9a5ce50016b6f5a5de07e08529b845e1c622dcbefa0cfa2ffd128e9932ecee8efd869bc56d09a50ceb360a8d366cfa8eefe3f64279c88bdbc887560efa9944238eb000000000000000000000000000000000000000000000000000000000000000060e2a800f169f26164533c7faff6c9073cd6db240d89444d3487113232f9c31422a0993bb47d56807d0dc26728e4c8424bb9db77511001904353f1022168723010c46627c890be6e701e766679600696866c888ec80e7dbd428f5162a24f2d8262f846bdb06d9e46d295dd8e896fb232be80534b0041660fe4450a7ede9bc3b230722381773a4ae81241568867a759f53c2bdd05d32b209e78845fc58203949e50a608942b270c456001e578227ad00861cf5f47b27b09137a0c4b7f8b4746cef`)
report, err := report.Decode[v3.Data](payload)
if err != nil {
streams.LogPrintf("error decoding report: %s", err)
os.Exit(1)
}
streams.LogPrintf(
"FeedID: %s, FeedVersion: %d, Bid: %s, Ask: %s, BenchMark: %s, LinkFee: %s, NativeFee: %s, ValidFromTS: %d, ExpiresAt: %d",
report.Data.FeedID.String(),
report.Data.FeedID.Version(),
report.Data.Bid.String(),
report.Data.Ask.String(),
report.Data.BenchmarkPrice.String(),
report.Data.LinkFee.String(),
report.Data.NativeFee.String(),
report.Data.ValidFromTimestamp,
report.Data.ExpiresAt,
)
```
## Report Format and Schema Versions
### Schema Versions
The SDK supports multiple report schema versions (v1-v10), each optimized for different use cases:
- **v2**: Feed IDs starting with `0x0002`
- **v3**: Feed IDs starting with `0x0003` (Crypto Streams)
- **v4**: Feed IDs starting with `0x0004` (Real-World Assets)
- **v5**: Feed IDs starting with `0x0005`
- **v6**: Feed IDs starting with `0x0006` (Multiple Price Values)
- **v7**: Feed IDs starting with `0x0007` (Exchange Rate)
- **v8**: Feed IDs starting with `0x0008` (Non-OTC RWA)
- **v9**: Feed IDs starting with `0x0009` (NAV Fund Data)
- **v10**: Feed IDs starting with `0x000a` (Tokenized Equity)
### Common Fields
All report versions include standard metadata:
- `FeedID`: Unique identifier for the data stream
- `ValidFromTimestamp`: When the report becomes valid
- `ExpiresAt`: When the report expires
- `LinkFee`: Fee in LINK tokens
- `NativeFee`: Fee in native blockchain currency
- `ObservationsTimestamp`: When the data was observed
### Usage Pattern
Each version follows the same decoding pattern:
```go
import v3 "github.com/smartcontractkit/data-streams-sdk/go/report/v3"
report, err := report.Decode[v3.Data](fullReport)
```
Use the appropriate version import based on your feed's schema version, which can be determined by calling `feedID.Version()`.
For complete field definitions and decoding examples, see the [report schema overview](/data-streams/reference/report-schema-overview) and [fetch tutorial](/data-streams/tutorials/go-sdk-fetch).
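The schema version is encoded in the first two bytes of the 32-byte feed ID, which is why the `0x0002`/`0x0003`/… prefixes above map directly to versions. As an illustration, the version can also be read with nothing but the standard library; the helper name `schemaVersion` below is hypothetical, and in practice the SDK's `feedID.Version()` is the supported way:

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// schemaVersion extracts the report schema version from a feed ID.
// The version lives in the first two bytes of the 32-byte identifier,
// e.g. "0x0003..." is a v3 (crypto) stream.
func schemaVersion(feedID string) (uint16, error) {
	raw, err := hex.DecodeString(strings.TrimPrefix(feedID, "0x"))
	if err != nil {
		return 0, err
	}
	if len(raw) != 32 {
		return 0, fmt.Errorf("feed ID must be 32 bytes, got %d", len(raw))
	}
	return uint16(raw[0])<<8 | uint16(raw[1]), nil
}

func main() {
	v, err := schemaVersion("0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782")
	if err != nil {
		panic(err)
	}
	fmt.Println(v) // 3
}
```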
---
# Data Streams Reference
Source: https://docs.chain.link/data-streams/reference/data-streams-api
### API Interfaces
- [REST API](/data-streams/reference/data-streams-api/interface-api) - HTTP-based interface for simple integrations
- [WebSocket](/data-streams/reference/data-streams-api/interface-ws) - Real-time data streaming via WebSocket connection
### SDK Integration
- [Go SDK](/data-streams/reference/data-streams-api/go-sdk) - Native Go language integration
- [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk) - Native Rust language integration
### Authentication
- [Authentication](/data-streams/reference/data-streams-api/authentication) - Learn how to authenticate with the Data Streams API (not required if using the SDKs)
- [JavaScript examples](/data-streams/reference/data-streams-api/authentication/javascript-examples)
- [TypeScript examples](/data-streams/reference/data-streams-api/authentication/typescript-examples)
- [Go examples](/data-streams/reference/data-streams-api/authentication/go-examples)
- [Rust examples](/data-streams/reference/data-streams-api/authentication/rust-examples)
### Verification
#### EVM chains
- [Onchain report data verification](/data-streams/reference/data-streams-api/onchain-verification) - Verify the authenticity of received data on EVM chains
#### Solana
- [Verify reports using the onchain integration method](/data-streams/tutorials/solana-onchain-report-verification)
- [Verify reports using the offchain integration method](/data-streams/tutorials/solana-offchain-report-verification)
---
# Data Streams REST API
Source: https://docs.chain.link/data-streams/reference/data-streams-api/interface-api
## Domains
| Description | Testnet URL | Mainnet URL |
| :------------ | :----------------------------------------- | :--------------------------------- |
| REST endpoint | https\://api.testnet-dataengine.chain.link | https\://api.dataengine.chain.link |
## Authentication
All requests to the Data Streams REST API require authentication. For comprehensive documentation on generating authentication headers and implementing the HMAC authentication process, please refer to the [Data Streams Authentication](/data-streams/reference/data-streams-api/authentication) page.
**Note**: If you're using a [Data Streams SDK](/data-streams/reference/data-streams-api/go-sdk), you don't need to manually generate authentication headers.
### Headers
All routes require the following three headers for user authentication:
| Header | Description |
| ---------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `Authorization` | The user's unique identifier, provided as a UUID (Universally Unique Identifier). |
| `X-Authorization-Timestamp` | The current timestamp, in milliseconds. The timestamp must be closely synchronized with the server time, with a maximum allowed discrepancy of 5 seconds (by default). |
| `X-Authorization-Signature-SHA256` | The HMAC (Hash-based Message Authentication Code) signature, generated by hashing parts of the request and its metadata using SHA-256 with a shared secret key. |
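As a concrete sketch of how the `X-Authorization-Signature-SHA256` value is produced, the snippet below assumes the string-to-sign described on the Authentication page (HTTP method, full path with query, hex-encoded SHA-256 hash of the body, client ID, and timestamp, joined by spaces). Treat it as illustrative and confirm the exact format on that page:

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// generateHMAC computes the X-Authorization-Signature-SHA256 header value.
// The string-to-sign layout used here is an assumption based on the
// Authentication page; verify it there before relying on this sketch.
func generateHMAC(method, path string, body []byte, clientID string, timestampMs int64, secret string) string {
	bodyHash := sha256.Sum256(body)
	stringToSign := fmt.Sprintf("%s %s %s %s %d",
		method, path, hex.EncodeToString(bodyHash[:]), clientID, timestampMs)
	mac := hmac.New(sha256.New, []byte(secret))
	mac.Write([]byte(stringToSign))
	return hex.EncodeToString(mac.Sum(nil))
}

func main() {
	// Example values only; use your own credentials and the current time.
	sig := generateHMAC("GET", "/api/v1/reports/latest?feedID=0x0003...", nil,
		"example-client-id", 1700000000000, "example-secret")
	fmt.Println(sig)
}
```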
## API endpoints
### Return a single report at a given timestamp
##### Endpoint
**`/api/v1/reports`**
| Type | Description | Parameter(s) |
| -------- | ---------------------------------------------- | ------------------------------------------------------------------------------------------------------------- |
| HTTP GET | Returns a single report for a given timestamp. | `feedID`: A Data Streams stream ID.<br />`timestamp`: The Unix timestamp for the report. |
##### Sample request
```http
GET /api/v1/reports?feedID=&timestamp=
```
##### Sample response
```json
{
"report": {
"feedID": "Hex encoded feedId.",
"validFromTimestamp": "Report's earliest applicable timestamp (in seconds).",
"observationsTimestamp": "Report's latest applicable timestamp (in seconds).",
"fullReport": "A blob containing the report context and body. Encode the fee token into the payload before passing it to the contract for verification."
}
}
```
### Return a single report with the latest timestamp
##### Endpoint
**`/api/v1/reports/latest`**
| Type | Parameter(s) |
| -------- | ----------------------------------- |
| HTTP GET | `feedID`: A Data Streams stream ID. |
##### Sample request
```http
GET /api/v1/reports/latest?feedID=
```
##### Sample response
```json
{
"report": {
"feedID": "Hex encoded feedId.",
"validFromTimestamp": "Report's earliest applicable timestamp (in seconds).",
"observationsTimestamp": "Report's latest applicable timestamp (in seconds).",
"fullReport": "A blob containing the report context and body. Encode the fee token into the payload before passing it to the contract for verification."
}
}
```
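If you are not using an SDK, the request above can be assembled with Go's standard library. The helper name `buildLatestReportRequest` is illustrative, and the three header values are placeholders that must be generated as described on the Authentication page:

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
)

// buildLatestReportRequest assembles the GET request for /api/v1/reports/latest.
// The header values set here are placeholders; compute real values per the
// Authentication page before sending the request.
func buildLatestReportRequest(baseURL, feedID string) (*http.Request, error) {
	u, err := url.Parse(baseURL + "/api/v1/reports/latest")
	if err != nil {
		return nil, err
	}
	q := u.Query()
	q.Set("feedID", feedID)
	u.RawQuery = q.Encode()

	req, err := http.NewRequest(http.MethodGet, u.String(), nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "<your-client-id>")
	req.Header.Set("X-Authorization-Timestamp", "<current-unix-ms>")
	req.Header.Set("X-Authorization-Signature-SHA256", "<hmac-signature>")
	return req, nil
}

func main() {
	req, err := buildLatestReportRequest(
		"https://api.testnet-dataengine.chain.link", "0x0003...")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.URL.String())
}
```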
### Return a report for multiple FeedIDs at a given timestamp
##### Endpoint
**`/api/v1/reports/bulk`**
| Type | Description | Parameter(s) |
| -------- | ---------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------- |
| HTTP GET | Return a report for multiple FeedIDs at a given timestamp. | `feedIDs`: A comma-separated list of Data Streams stream IDs.<br />`timestamp`: The Unix timestamp for the reports. |
##### Sample request
```http
GET /api/v1/reports/bulk?feedIDs=,,...&timestamp=
```
##### Sample response
```json
{
"reports": [
{
"feedID": "Hex encoded feedId.",
"validFromTimestamp": "Report's earliest applicable timestamp (in seconds).",
"observationsTimestamp": "Report's latest applicable timestamp (in seconds).",
"fullReport": "A blob containing the report context and body. Encode the fee token into the payload before passing it to the contract for verification."
}
//...
]
}
```
### Return multiple sequential reports for a single stream ID, starting at a given timestamp
##### Endpoint
**`/api/v1/reports/page`**
| Type | Description | Parameter(s) |
| -------- | ----------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| HTTP GET | Return multiple sequential reports for a single stream ID, starting at a given timestamp. | `feedID`: A Data Streams stream ID.<br />`startTimestamp`: The Unix timestamp for the first report.<br />`limit` (optional): The number of reports to return. |
##### Sample request
```http
GET /api/v1/reports/page?feedID=&startTimestamp=&limit=
```
##### Sample response
```json
{
"reports": [
{
"feedID": "Hex encoded feedId.",
"validFromTimestamp": "Report's earliest applicable timestamp (in seconds).",
"observationsTimestamp": "Report's latest applicable timestamp (in seconds).",
"fullReport": "A blob containing the report context and body. Encode the fee token into the payload before passing it to the contract for verification."
}
//...
]
}
```
## Error response codes
| Status Code | Description |
| ---------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| 400 Bad Request | This error is triggered when:<br />- There is any missing/malformed query argument.<br />- Required headers are missing or provided with incorrect values. |
| 401 Unauthorized User | This error is triggered when:<br />- Authentication fails, typically because the HMAC signature provided by the client doesn't match the one expected by the server.<br />- A user requests access to a stream without the appropriate permission or that does not exist. |
| 500 Internal Server | Indicates an unexpected condition encountered by the server, preventing it from fulfilling the request. This error typically points to issues on the server side. |
| 206 Missing data (`/bulk` endpoint only) | Indicates that data for at least one stream ID is missing from the report. E.g., you requested a report for stream IDs ``, ``, and `` at a given timestamp. If data for `` is missing from the report (not available yet at the specified timestamp), you get `[, ]` and a 206 response. |
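When a `/bulk` request returns 206, you can determine which streams were missing by diffing the IDs you requested against the feed IDs present in the response. A small illustrative helper (`missingFeedIDs` is not part of any SDK):

```go
package main

import "fmt"

// missingFeedIDs returns the requested stream IDs that are absent from the
// response. A 206 status from /bulk means this set is non-empty.
func missingFeedIDs(requested, returned []string) []string {
	seen := make(map[string]bool, len(returned))
	for _, id := range returned {
		seen[id] = true
	}
	var missing []string
	for _, id := range requested {
		if !seen[id] {
			missing = append(missing, id)
		}
	}
	return missing
}

func main() {
	requested := []string{"0x0003aaaa", "0x0003bbbb", "0x0003cccc"}
	returned := []string{"0x0003aaaa", "0x0003cccc"}
	fmt.Println(missingFeedIDs(requested, returned)) // [0x0003bbbb]
}
```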
---
# Data Streams WebSocket
Source: https://docs.chain.link/data-streams/reference/data-streams-api/interface-ws
## Domains
| Description | Testnet URL | Mainnet URL |
| :----------------- | :-------------------------------------- | :------------------------------ |
| WebSocket endpoint | wss\://ws.testnet-dataengine.chain.link | wss\://ws.dataengine.chain.link |
## Authentication
All connections to the Data Streams WebSocket API require authentication. For comprehensive documentation on generating authentication headers and implementing the HMAC authentication process, please refer to the [Data Streams Authentication](/data-streams/reference/data-streams-api/authentication) page.
**Note**: If you're using a [Data Streams SDK](/data-streams/reference/data-streams-api/go-sdk), you don't need to manually generate authentication headers.
### Headers
All routes require the following three headers for user authentication:
| Header | Description |
| ---------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `Authorization` | The user's unique identifier, provided as a UUID (Universally Unique Identifier). |
| `X-Authorization-Timestamp` | The current timestamp, in milliseconds. The timestamp must be closely synchronized with the server time, with a maximum allowed discrepancy of 5 seconds (by default). |
| `X-Authorization-Signature-SHA256` | The HMAC (Hash-based Message Authentication Code) signature, generated by hashing parts of the request and its metadata using SHA-256 with a shared secret key. |
## WebSocket Connection
Establish a streaming WebSocket connection that sends reports for the given stream ID(s) after they are verified.
##### Endpoint
**`/api/v1/ws`**
| Type | Parameter(s) |
| --------- | ------------------------------------------------------------- |
| WebSocket | `feedIDs`: A comma-separated list of Data Streams stream IDs. |
##### Sample request
```http
GET /api/v1/ws?feedIDs=,,...
```
##### Sample response
```json
{
"report": {
"feedID": "hex encoded feedId",
"fullReport": "a blob containing the report context + body, can be passed unmodified to the contract for verification"
}
}
```
## Error response codes
| Status Code | Description |
| --------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| 400 Bad Request | This error is triggered when:<br />- There is any missing/malformed query argument.<br />- Required headers are missing or provided with incorrect values. |
| 401 Unauthorized User | This error is triggered when:<br />- Authentication fails, typically because the HMAC signature provided by the client doesn't match the one expected by the server.<br />- A user requests access to a stream without the appropriate permission or that does not exist. |
| 500 Internal Server | Indicates an unexpected condition encountered by the server, preventing it from fulfilling the request. This error typically points to issues on the server side. |
---
# Onchain report verification (EVM chains)
Source: https://docs.chain.link/data-streams/reference/data-streams-api/onchain-verification
## Verify reports onchain
To verify data onchain on EVM chains, Data Streams requires several Solidity interfaces.
The primary onchain interaction occurs between the `IVerifierProxy` interface and your protocol's client contract. Find the Verifier proxy address for each stream on the [Stream Addresses](/data-streams/crypto-streams) page.
### Interfaces
- IVerifierProxy
- IFeeManager
In the current code example for verifying reports onchain, these interfaces are defined directly within the example itself. Importable versions of these interfaces will be available in the future.
### Contract example to verify report data onchain
This contract example allows you to verify reports and pay the verification fee in LINK tokens. Your contract must have sufficient LINK tokens to pay for the verification fee. Learn how to [fund your contract with LINK tokens](/resources/fund-your-contract).
---
# Data Streams SDK (Rust)
Source: https://docs.chain.link/data-streams/reference/data-streams-api/rust-sdk
The [Data Streams SDK for Rust](https://github.com/smartcontractkit/data-streams-sdk/tree/main/rust) provides a client library for interacting with Chainlink Data Streams. It offers both point-in-time data retrieval and real-time data streaming capabilities with built-in fault tolerance.
## Requirements
- Rust 1.70 or later
- Valid Data Streams API credentials
## Features
- **REST API Client**: Fetch point-in-time data from Data Streams
- **WebSocket Client**: Stream real-time data with automatic reconnection
- **Report Decoding**: Built-in support for decoding and validating multiple report formats (V3, V8)
- **High Availability**: WebSocket connection management with failover support
- **Tracing Support**: Optional logging via the [`tracing`](https://crates.io/crates/tracing) crate
- **Async/Await**: Built on Tokio for efficient async operations
## Installation
Add the SDK to your project by including it in your `Cargo.toml`:
```toml
[dependencies]
chainlink-data-streams-sdk = "1.0.3"
chainlink-data-streams-report = "1.0.3"
```
### Feature Flags
The SDK provides several feature flags to customize its functionality:
- `"rest"`: Enables the REST API client (enabled by default)
- `"websocket"`: Enables the WebSocket client for real-time streaming (enabled by default)
- `"tracing"`: Enables logging with the `tracing` crate (optional)
- `"full"`: Enables all features (default)
## Report Types
The Rust SDK supports multiple report formats, including **V3 (Crypto)** and **V8 (RWA)**. For the complete list of fields and their detailed descriptions, refer to the dedicated schema pages:
- [V3 (Cryptocurrency) Report Schema](/data-streams/reference/report-schema-v3)
- [V8 (RWA) Report Schema](/data-streams/reference/report-schema-v8)
Below are basic code snippets for decoding these reports with the SDK:
### V3 Reports (Crypto Streams)
```rust
use chainlink_data_streams_report::report::{ decode_full_report, v3::ReportDataV3 };
// After you get 'report_blob' (for example, from 'decode_full_report' or a contract call):
let report_data = ReportDataV3::decode(&report_blob)?;
// Access whichever fields you need:
println!("Benchmark Price: {}", report_data.benchmark_price);
println!("Bid: {}", report_data.bid);
println!("Ask: {}", report_data.ask);
// ... etc.
```
For more details on every field in V3 (Crypto) reports, see the [V3 report schema page](/data-streams/reference/report-schema-v3).
### V8 Reports (RWA Streams)
```rust
use chainlink_data_streams_report::report::v8::ReportDataV8;
let report_data = ReportDataV8::decode(&report_blob)?;
// Example usage:
println!("Mid Price: {}", report_data.mid_price);
println!("Market Status: {}", report_data.market_status);
// ... etc.
```
For more details on every field in v8 (RWA) reports, see the [V8 report schema page](/data-streams/reference/report-schema-v8).
## Authentication
The SDK uses HMAC authentication for all API requests. Configure your credentials:
```rust
use chainlink_data_streams_sdk::config::Config;
use std::env;
let api_key = env::var("API_KEY").expect("API_KEY must be set");
let user_secret = env::var("USER_SECRET").expect("USER_SECRET must be set");
let config = Config::new(
    api_key,
    user_secret,
    "https://api.testnet-dataengine.chain.link".to_owned(),
    "wss://ws.testnet-dataengine.chain.link".to_owned(),
)
.build()?;
```
Security best practices:
- Store credentials in environment variables
- Avoid hardcoding credentials in source code
- Use separate credentials for development and production
- Rotate credentials periodically
## WebSocket Features
### High Availability Mode
```rust
use chainlink_data_streams_sdk::config::{Config, WebSocketHighAvailability};
let ws_urls = "wss://ws1.dataengine.chain.link,wss://ws2.dataengine.chain.link";
let config = Config::new(api_key, api_secret, rest_url, ws_urls)
    // Enable WebSocket HA mode
    .with_ws_ha(WebSocketHighAvailability::Enabled)
    // Increase the max reconnection attempts (optional, default is 5)
    .with_ws_max_reconnect(10)
    .build()?;
```
- Multiple WebSocket endpoints
- Automatic failover on connection loss
- Parallel connections to reduce gaps in data
### Connection Management
The SDK allows you to:
- Set `ws_max_reconnect`: The maximum number of reconnection attempts (default: 5)
- Enable HA for multiple WebSocket endpoints
- Optional `insecure_skip_verify` for TLS
Example:
```rust
use chainlink_data_streams_sdk::config::{Config, InsecureSkipVerify, WebSocketHighAvailability};
let ws_urls = "wss://ws.testnet-dataengine.chain.link";
let config = Config::new(api_key, api_secret, rest_url, ws_urls)
    .with_ws_ha(WebSocketHighAvailability::Enabled)
    .with_ws_max_reconnect(5)
    .with_insecure_skip_verify(InsecureSkipVerify::Enabled)
    .build()?;

// Create and initialize the stream
let mut stream = Stream::new(&config, feed_ids).await?;
stream.listen().await?;
```
## Error Handling
The SDK defines distinct error types:
- `ClientError` for REST-based issues (e.g., HTTP request failures)
- `StreamError` for WebSocket streaming issues
- `HmacError` for authentication/HMAC generation problems
Example:
```rust
use chainlink_data_streams_sdk::client::Client;
use chainlink_data_streams_sdk::client::ClientError;
match client.get_latest_report(feed_id).await {
    Ok(report_response) => {
        println!("Report: {:?}", report_response.report);
    }
    Err(ClientError::ApiError(e)) => {
        eprintln!("Server returned an error: {}", e);
    }
    Err(e) => {
        eprintln!("Some other request error: {}", e);
    }
}
```
## Examples
### Step-by-Step Guides
- [Fetch and decode reports using the REST API](/data-streams/tutorials/rust-sdk-fetch)
- [Stream and decode reports via WebSocket](/data-streams/tutorials/rust-sdk-stream)
### More Examples
The [SDK repository](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/docs/examples/) includes additional examples for common use cases:
- Fetching a single report, bulk reports, or paginated reports
- Compressing report data
- Simple WebSocket streaming
- Multiple WebSocket endpoints (HA)
## Performance Considerations
- Reuse the same `Client` instance whenever possible
- Gracefully close WebSocket `Stream` objects when not in use (`stream.close().await?`)
- Monitor memory usage if you expect large volumes of data
- Use timeouts or retry logic in your own application where needed
## Configuration Reference
Below is a detailed guide to the available builder methods for the `ConfigBuilder`. Each method is **optional** but can help you tailor the SDK’s behavior for your specific needs.
```rust
impl ConfigBuilder {
    // Enables High Availability (HA) for WebSocket connections
    pub fn with_ws_ha(mut self, ha: WebSocketHighAvailability) -> Self;
    // Sets the max number of WebSocket reconnection attempts before giving up
    pub fn with_ws_max_reconnect(mut self, attempts: usize) -> Self;
    // Allows skipping TLS certificate verification (use with caution in production)
    pub fn with_insecure_skip_verify(mut self, skip: InsecureSkipVerify) -> Self;
    // Provides a way to inspect HTTP responses for logging or debugging (REST calls only)
    pub fn with_inspect_http_response(mut self, inspector: fn(&Response)) -> Self;
    // Finalizes and validates the config
    pub fn build(self) -> Result;
}
```
### `with_ws_ha(WebSocketHighAvailability)`
- Purpose: Enables or disables HA mode for WebSocket streaming.
- Values:
- `WebSocketHighAvailability::Enabled`: Maintains multiple WebSocket connections to different endpoints simultaneously.
- `WebSocketHighAvailability::Disabled`: Maintains a single connection.
- Default: `Disabled`.
- Example Use Case:
```rust
// When you want uninterrupted streaming even if one server goes down:
let config = Config::new(api_key, api_secret, rest_url, "wss://ws1,...,wss://wsN")
    .with_ws_ha(WebSocketHighAvailability::Enabled)
    .build()?;
```
### `with_ws_max_reconnect(usize)`
- Purpose: Sets the maximum number of reconnection attempts before the SDK stops trying.
- Default: `5`.
- Parameter: `usize` indicates how many times you want the SDK to attempt reconnecting after a connection failure.
- Example Use Case:
```rust
// If you want to be very persistent:
.with_ws_max_reconnect(20)
```
### `with_insecure_skip_verify(InsecureSkipVerify)`
- Purpose: When set, it allows skipping TLS certificate verification.
- Values:
- `InsecureSkipVerify::Enabled`: Skips verification of certificates — useful for local dev or self-signed certs.
- `InsecureSkipVerify::Disabled`: Normal, secure certificate handling (recommended for production).
- Default: `Disabled`.
- Example Use Case:
```rust
// Useful in development when using self-signed certs or containers
.with_insecure_skip_verify(InsecureSkipVerify::Enabled)
```
### `with_inspect_http_response(fn(&Response))`
- Purpose: Allows you to provide a callback function that inspects any HTTP `Response` received by the REST client.
- Default: `None`. (No inspection)
- Example:
```rust
// Log every HTTP response's status code
.with_inspect_http_response(|response| {
    println!("Received HTTP status: {}", response.status());
})
```
---
# Data Streams SDK (TypeScript)
Source: https://docs.chain.link/data-streams/reference/data-streams-api/ts-sdk
The Data Streams SDK for TypeScript provides access to Chainlink Data Streams, with real-time streaming and historical data retrieval.
## Requirements
- Node.js >= 20.0.0
- TypeScript >= 5.3.x
- Valid Chainlink Data Streams credentials
## Features
- **[Real-time streaming](#streaming)** via WebSocket connections
- **[High Availability mode](#high-availability-mode)** with multiple connections and automatic failover
- **Historical data access** via [REST API](#rest-api)
- **[Automatic report decoding](#schema-auto-detection)** for all supported formats (V2, V3, V4, V5, V6, V7, V8, V9, V10)
- **[Metrics](#observability-logs--metrics)** for monitoring and observability
- **Type-safe** with full TypeScript support
- **Event-driven architecture** for complete developer control
## Installation
```bash
npm install @chainlink/data-streams-sdk
```
## Configuration
### Configuration Interface
```typescript
interface Config {
  // Required
  apiKey: string // API key for authentication
  userSecret: string // User secret for authentication
  endpoint: string // REST API URL
  wsEndpoint: string // WebSocket URL

  // Optional - Request & Retry
  timeout?: number // Request timeout (default: 30000ms)
  retryAttempts?: number // Retry attempts (default: 3)
  retryDelay?: number // Retry delay (default: 1000ms)

  // Optional - High Availability
  haMode?: boolean // Enable HA mode (default: false)
  haConnectionTimeout?: number // HA connection timeout (default: 10000ms)
  connectionStatusCallback?: (isConnected: boolean, host: string, origin: string) => void

  // Optional - Logging
  logging?: LoggingConfig // See Logging Configuration section
}
```
### Basic Usage
```typescript
const client = createClient({
  apiKey: process.env.API_KEY,
  userSecret: process.env.USER_SECRET,
  endpoint: "https://api.dataengine.chain.link",
  wsEndpoint: "wss://ws.dataengine.chain.link",
})
```
### High Availability Example
```typescript
const haClient = createClient({
  apiKey: process.env.API_KEY,
  userSecret: process.env.USER_SECRET,
  endpoint: "https://api.dataengine.chain.link", // Mainnet only
  wsEndpoint: "wss://ws.dataengine.chain.link", // Single endpoint with origin discovery
  haMode: true,
})
```
**Note:** [High Availability mode](#high-availability-mode) is only available on mainnet, not testnet.
## Examples
**Quick Commands:**
```bash
# Real-time streaming
npx ts-node examples/stream-reports.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
# High Availability streaming
npx ts-node examples/stream-reports.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 --ha
# Get latest report
npx ts-node examples/get-latest-report.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
# List all available feeds
npx ts-node examples/list-feeds.ts
```
**Complete examples**
See the [SDK repo examples](https://github.com/smartcontractkit/data-streams-sdk/tree/main/typescript/examples) for detailed usage and setup. Available examples include:
- **Streaming:** Basic streaming, HA mode, metrics monitoring
- **REST API:** Latest reports, historical data, bulk operations, feed management
- **Configuration:** Logging setup, debugging, monitoring integration
## API Reference
### Streaming
```typescript
// Create stream
const stream = client.createStream(feedIds, options?);
// Events
stream.on('report', (report) => { ... });
stream.on('error', (error) => { ... });
stream.on('disconnected', () => { ... });
stream.on('reconnecting', (info) => { ... });
// Control
await stream.connect();
await stream.close();
// Metrics
const metrics = stream.getMetrics();
```
### Stream Options
```typescript
interface StreamOptions {
  maxReconnectAttempts?: number // Default: 5

  // Base delay (in ms) for exponential backoff.
  // Actual delay grows as: base * 2^(attempt-1) with jitter, capped at 10000ms.
  // Default: 1000ms; user-provided values are clamped to the safe range [200ms, 10000ms].
  reconnectInterval?: number
}
```
### REST API
```typescript
// Get feeds
const feeds = await client.listFeeds();
// Get latest report
const report = await client.getLatestReport(feedId);
// Get historical report
const report = await client.getReportByTimestamp(feedId, timestamp);
// Get report page
const reports = await client.getReportsPage(feedId, startTime, limit?);
// Get bulk reports
const reports = await client.getReportsBulk(feedIds, timestamp);
```
## Report Format
### Quick Decoder Usage
```typescript
import { decodeReport } from "@chainlink/data-streams-sdk"
const decoded = decodeReport(report.fullReport, report.feedID)
```
### Schema Auto-Detection
The SDK automatically detects and decodes all report versions based on Feed ID patterns:
- **V2**: Feed IDs starting with `0x0002`
- **V3**: Feed IDs starting with `0x0003` (Crypto Streams)
- **V4**: Feed IDs starting with `0x0004` (Real-World Assets)
- **V5**: Feed IDs starting with `0x0005`
- **V6**: Feed IDs starting with `0x0006` (Multiple Price Values)
- **V7**: Feed IDs starting with `0x0007`
- **V8**: Feed IDs starting with `0x0008` (Non-OTC RWA)
- **V9**: Feed IDs starting with `0x0009` (NAV Fund Data)
- **V10**: Feed IDs starting with `0x000a` (Tokenized Equity)
### Common Fields
All reports include standard metadata:
```typescript
interface BaseFields {
  version: "V2" | "V3" | "V4" | "V5" | "V6" | "V7" | "V8" | "V9" | "V10"
  nativeFee: bigint
  linkFee: bigint
  expiresAt: number
  feedID: string
  validFromTimestamp: number
  observationsTimestamp: number
}
```
### Schema-Specific Fields
- **V2/V3/V4**: `price: bigint` - Standard price data
- **V3**: `bid: bigint, ask: bigint` - Crypto bid/ask spreads
- **V4**: `marketStatus: MarketStatus` - Real-world asset market status
- **V5**: `rate: bigint, timestamp: number, duration: number` - Interest rate data with observation timestamp and duration
- **V6**: `price: bigint, price2: bigint, price3: bigint, price4: bigint, price5: bigint` - Multiple price values in a single payload
- **V7**: `exchangeRate: bigint` - Exchange rate data
- **V8**: `midPrice: bigint, lastUpdateTimestamp: number, marketStatus: MarketStatus` - Non-OTC RWA data
- **V9**: `navPerShare: bigint, navDate: number, aum: bigint, ripcord: number` - NAV fund data
- **V10**: `price: bigint, lastUpdateTimestamp: number, marketStatus: MarketStatus, currentMultiplier: bigint, newMultiplier: bigint, activationDateTime: number, tokenizedPrice: bigint` - Tokenized equity data
For complete field definitions, see the [complete list of available reports and their schemas](/data-streams/reference/report-schema-overview).
## High Availability Mode
HA mode establishes multiple simultaneous connections for zero-downtime operation:
- **Automatic failover** between connections
- **Report deduplication** across connections
- **Automatic origin discovery** to find available endpoints
- **Per-connection monitoring** and statistics
```typescript
const client = createClient({
  // ...config
  haMode: true,
  wsEndpoint: "wss://ws.dataengine.chain.link", // Single endpoint (mainnet only)
})
```
**How it works:** When `haMode` is `true`, the SDK automatically discovers multiple origin endpoints behind the single URL and establishes separate connections to each origin.
**Connection monitoring:** The optional `connectionStatusCallback` can be used to integrate with external monitoring systems. The SDK already provides comprehensive connection logs, so this callback is primarily useful for custom alerting or metrics collection. See [`examples/metrics-monitoring.ts`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/typescript/examples/metrics-monitoring.ts) for a complete implementation example.
**Important:** HA mode is only available on mainnet endpoints.
## Error Handling
### Error Types Overview
| **Error Type** | **When Thrown** | **Key Properties** |
| ------------------------------- | ------------------------------------------- | --------------------------------------------- |
| `ValidationError` | Invalid feed IDs, timestamps, parameters | `message` |
| `AuthenticationError` | Invalid credentials, HMAC failures | `message` |
| `APIError` | HTTP 4xx/5xx, network timeouts, rate limits | `statusCode`, `message` |
| `ReportDecodingError` | Corrupted report data, unsupported versions | `message` |
| `WebSocketError` | Connection failures, protocol errors | `message` |
| `OriginDiscoveryError` | HA discovery failures | `cause`, `message` |
| `MultiConnectionError` | All HA connections failed | `message` |
| `PartialConnectionFailureError` | Some HA connections failed | `failedConnections`, `totalConnections` |
| `InsufficientConnectionsError` | HA degraded performance | `availableConnections`, `requiredConnections` |
### Usage Examples
```typescript
import {
ValidationError,
AuthenticationError,
APIError,
ReportDecodingError,
WebSocketError,
OriginDiscoveryError,
MultiConnectionError,
} from "@chainlink/data-streams-sdk"
// REST API error handling
try {
const report = await client.getLatestReport(feedId)
} catch (error) {
if (error instanceof ValidationError) {
// Invalid feed ID or parameters
} else if (error instanceof AuthenticationError) {
// Check API credentials
} else if (error instanceof APIError) {
// Server error - check error.statusCode (429, 500, etc.)
} else if (error instanceof ReportDecodingError) {
// Corrupted or unsupported report format
}
}
// Streaming error handling
stream.on("error", (error) => {
if (error instanceof WebSocketError) {
// Connection issues - retry or fallback
} else if (error instanceof OriginDiscoveryError) {
// HA discovery failed - falls back to static config
} else if (error instanceof MultiConnectionError) {
// All HA connections failed - critical
}
})
```
**Catch-all error handling:**
```typescript
import { DataStreamsError } from "@chainlink/data-streams-sdk"
try {
// Any SDK operation
} catch (error) {
if (error instanceof DataStreamsError) {
// Handles ANY SDK error (base class for all error types above)
console.log("SDK error:", error.message)
} else {
// Non-SDK error (network, system, etc.)
console.log("System error:", error)
}
}
```
## Observability (Logs & Metrics)
The SDK is designed to plug into your existing observability stack.
### Logging (Pino/Winston/Console)
Pass your logger to the SDK and choose a verbosity level. For deep WS diagnostics, enable connection debug.
### Quick Start
```typescript
import { createClient, LogLevel } from "@chainlink/data-streams-sdk"
// Silent mode (default) - Zero overhead
const client = createClient({
/* ... config without logging */
})
// Basic console logging
const loggingClient = createClient({
// ... other config
logging: {
logger: {
info: console.log,
warn: console.warn,
error: console.error,
},
},
})
```
Using Pino (structured JSON):
```typescript
import pino from "pino"
import { createClient, LogLevel } from "@chainlink/data-streams-sdk"
const root = pino({ level: process.env.PINO_LEVEL || "info" })
const sdk = root.child({ component: "sdk" })
const client = createClient({
// ...config
logging: {
logger: {
info: sdk.info.bind(sdk),
warn: sdk.warn.bind(sdk),
error: sdk.error.bind(sdk),
debug: sdk.debug.bind(sdk),
},
logLevel: LogLevel.INFO,
// For very verbose WS diagnostics, set DEBUG + enableConnectionDebug
// logLevel: LogLevel.DEBUG,
// enableConnectionDebug: true,
},
})
```
Command-line with pretty output:
```bash
PINO_LEVEL=info npx ts-node examples/metrics-monitoring.ts | npx pino-pretty
```
### Log Levels
#### 🔴 ERROR
**Critical failures only**
- Authentication failures
- Network connection errors
- Report decoding failures
- API request failures
- Unexpected crashes
**Example Use:** Production alerts & monitoring
***
#### 🟡 WARN
**Everything in ERROR +**
- Partial reconnections
- Fallback to static origins
- Retry attempts
- Connection timeouts
- Invalid data warnings
**Example Use:** Production environments
***
#### 🔵 INFO
**Everything in WARN +**
- Client initialization
- Successful API calls
- Stream connections
- Report retrievals
- Connection status changes
- Connection mode determination
**Example Use:** Development & staging
***
#### 🔍 DEBUG
**Everything in INFO +**
- Feed ID validation
- Report decoding steps
- Auth header generation
- Request/response details
- WebSocket ping/pong
- Origin discovery process
- Configuration validation
- Origin tracking (HA mode)
**Example Use:** Debugging & development only
### Logging Configuration Options
```typescript
interface LoggingConfig {
/** External logger functions (console, winston, pino, etc.) */
logger?: {
debug?: (message: string, ...args: any[]) => void
info?: (message: string, ...args: any[]) => void
warn?: (message: string, ...args: any[]) => void
error?: (message: string, ...args: any[]) => void
}
/** Minimum logging level - filters out lower priority logs */
logLevel?: LogLevel // DEBUG (0) | INFO (1) | WARN (2) | ERROR (3)
/** Enable WebSocket ping/pong and connection state debugging logs */
enableConnectionDebug?: boolean
}
```
**Compatible with:** console, winston, pino, and any logger with `debug/info/warn/error` methods. See `examples/logging-basic.ts` for complete integration examples.
**For debugging:** Use `LogLevel.DEBUG` for full diagnostics and `enableConnectionDebug: true` to see WebSocket ping/pong messages and connection state transitions.
**Origin tracking** in HA mode shows which specific endpoint received each report.
### Metrics (`stream.getMetrics()`)
The `stream.getMetrics()` API provides a complete snapshot for dashboards and alerts:
```typescript
const m = stream.getMetrics()
// m.accepted, m.deduplicated, m.totalReceived
// m.partialReconnects, m.fullReconnects
// m.activeConnections, m.configuredConnections
// m.originStatus: { [origin]: ConnectionStatus }
```
Simple periodic print (example):
```typescript
setInterval(() => {
const m = stream.getMetrics()
console.log(`accepted=${m.accepted} dedup=${m.deduplicated} active=${m.activeConnections}/${m.configuredConnections}`)
}, 30000)
```
Refer to `examples/metrics-monitoring.ts` for a full metrics dashboard example.
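These counters lend themselves to simple alerting rules. The sketch below derives a health verdict from the connection counts; the `StreamMetricsSnapshot` interface mirrors the fields listed above but is an assumption for illustration, not the SDK's exported type.

```typescript
// Hypothetical shape mirroring the getMetrics() fields shown above;
// confirm the exact exported type against the SDK before relying on it.
interface StreamMetricsSnapshot {
  accepted: number
  deduplicated: number
  activeConnections: number
  configuredConnections: number
}

// Simple health verdict for alerting:
// "ok" when every configured connection is active,
// "degraded" when some are down, "critical" when none are.
function connectionHealth(m: StreamMetricsSnapshot): "ok" | "degraded" | "critical" {
  if (m.activeConnections === 0) return "critical"
  if (m.activeConnections < m.configuredConnections) return "degraded"
  return "ok"
}
```

A "degraded" verdict maps naturally to a warning-level alert, while "critical" should page: with zero active connections the stream is no longer receiving reports.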
## Testing
```bash
npm test # All tests
npm run test:unit # Unit tests only
npm run test:integration # Integration tests only
```
## Feed IDs
For available feed IDs, select your desired report [from the report schema overview](/data-streams/reference/report-schema-overview).
---
# Data Streams Reference
Source: https://docs.chain.link/data-streams/reference/overview
Chainlink Data Streams offers two distinct solutions for accessing low-latency market data. Choose the solution that best fits your use case.
### Which solution should I use?
| Feature | [Data Streams API](#data-streams-api) | [Candlestick API](#candlestick-api) |
| ----------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **Primary Use Case** | Low-latency data for smart contracts | OHLC data for analytics, charting, and dashboards |
| **Data Format** | Signed, verifiable data reports | Aggregated OHLC (candlestick) data |
| **Interfaces** | SDKs ([Go](/data-streams/reference/data-streams-api/go-sdk), [Rust](/data-streams/reference/data-streams-api/rust-sdk), [TypeScript](/data-streams/reference/data-streams-api/ts-sdk)), [REST API](/data-streams/reference/data-streams-api/interface-api) & [WebSocket](/data-streams/reference/data-streams-api/interface-ws) | [REST API](/data-streams/reference/candlestick-api) (including a [streaming endpoint](/data-streams/reference/candlestick-api#get-streaming-price-updates)) |
| **Authentication** | [HMAC Signature](/data-streams/reference/data-streams-api/authentication) (automatic with SDKs) | [JWT](/data-streams/reference/candlestick-api#authorize-and-get-token) (token-based) |
| **Cryptographic Proof** | ✅ Yes (See [Verification methods](#verification)) | ❌ No |
***
## Data Streams API
The [Data Streams API](/data-streams/reference/data-streams-api) provides cryptographically signed, verifiable data reports that can be used for onchain consumption by smart contracts. This comprehensive solution offers multiple integration paths to support applications that require low-latency, tamper-proof data.
### Integration Methods
#### SDKs (Recommended)
- [Go SDK](/data-streams/reference/data-streams-api/go-sdk) - Native Go language integration
- [Rust SDK](/data-streams/reference/data-streams-api/rust-sdk) - Native Rust language integration
- [TypeScript SDK](/data-streams/reference/data-streams-api/ts-sdk) - Native TypeScript language integration
#### Direct API Access
- [REST API](/data-streams/reference/data-streams-api/interface-api) - HTTP-based interface for simple integrations
- [WebSocket](/data-streams/reference/data-streams-api/interface-ws) - Real-time data streaming via WebSocket connection
#### Authentication
- [Authentication Guides](/data-streams/reference/data-streams-api/authentication) - HMAC signature implementation examples (not required when using SDKs)
#### Verification
##### EVM chains
- [Onchain report data verification](/data-streams/reference/data-streams-api/onchain-verification) - Verify the authenticity of received data on EVM chains
##### Solana
- [Verify reports using the onchain integration method](/data-streams/tutorials/solana-onchain-report-verification) - Verify reports directly within your Solana program
- [Verify reports using the offchain integration method](/data-streams/tutorials/solana-offchain-report-verification) - Verify reports client-side using the Rust SDK
**[View the Data Streams API Reference →](/data-streams/reference/data-streams-api)**
***
## Candlestick API
The [Candlestick API](/data-streams/reference/candlestick-api) provides historical and real-time OHLC (Open-High-Low-Close) data designed for offchain applications. This API has been optimized for analytics, charting, and dashboard applications.
### Integration Methods
#### Direct API Access
- [REST API](/data-streams/reference/candlestick-api) - Access historical OHLC data via HTTP
- [Streaming Endpoint](/data-streams/reference/candlestick-api#get-streaming-price-updates) - Get real-time price updates
### Authentication
- [Authentication Endpoint](/data-streams/reference/candlestick-api#authorize-and-get-token) - JWT token-based authentication
**[View the Candlestick API Reference →](/data-streams/reference/candlestick-api)**
---
# Overview
Source: https://docs.chain.link/data-streams/reference/report-schema-overview
## Available Report Schemas
Below is a summary of all available Data Streams report schemas, their main use cases, and key fields.
| Report Schema | Version | Use Case / Purpose | Key Fields |
| :------------------------------------------------------------ | :--------------------------------------------------------------- | :---------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------- |
| [Cryptocurrency](#cryptocurrency-report-schema) | [View schema (v3)](/data-streams/reference/report-schema-v3) | Crypto price streams | `price`, `bid`, `ask` |
| [DEX State Price](#dex-state-price-report-schema) | [View schema (v3)](/data-streams/reference/report-schema-v3-dex) | DEX state crypto price streams | `price`, `bid`, `ask` (All fields equal; [details](/data-streams/concepts/dex-state-price-streams#how-to-use-dex-state-price-streams)) |
| [Exchange Rate](#exchange-rate-report-schema) | [View schema (v7)](/data-streams/reference/report-schema-v7) | Tokenized asset exchange rates | `exchangeRate` |
| [Real World Asset (RWA)](#real-world-asset-rwa-report-schema) | [View schema (v8)](/data-streams/reference/report-schema-v8) | Real World Asset (RWA) price streams | `midPrice`, `marketStatus`, `lastUpdateTimestamp` |
| [Net Asset Value (NAV)](#net-asset-value-nav-report-schema) | [View schema (v9)](/data-streams/reference/report-schema-v9) | Net Asset Value (NAV) streams | `aum`, `navPerShare`, `navDate`, `ripcord` |
| [Backed xStock](#backed-xstock-report-schema) | [View schema (v10)](/data-streams/reference/report-schema-v10) | Tokenized stocks and backed asset streams | `price`, `tokenizedPrice`, `marketStatus`, `currentMultiplier`, `newMultiplier` |
## Cryptocurrency Report Schema
[Chainlink Cryptocurrency Reports](/data-streams/reference/report-schema-v3) provide market data for digital assets, supporting high-frequency, onchain use cases. Each report includes a consensus mid price (`price`), as well as simulated bid (`bid`) and ask (`ask`) prices that estimate the impact of buying or selling at a specified liquidity depth. These values help protocols and applications understand current market conditions and potential slippage for larger trades.
For a deeper explanation of how liquidity-weighted bid and ask prices work, see [Liquidity-Weighted Bid-Ask Prices (LWBA)](/data-streams/concepts/liquidity-weighted-prices).
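One practical use of the `bid` and `ask` fields is estimating slippage via the spread. The sketch below computes the spread in basis points from decoded report values; it assumes the values arrive as `bigint` with 18 decimals, which should be confirmed for your specific stream.

```typescript
// Sketch: bid-ask spread in basis points from a decoded v3 report.
// bid and ask are int192 values; an 18-decimal scale is assumed here --
// confirm the scale for your specific stream before production use.
function spreadBps(bid: bigint, ask: bigint): bigint {
  const mid = (bid + ask) / 2n
  // Multiply by 10_000 before dividing to keep basis-point precision in integers
  return ((ask - bid) * 10_000n) / mid
}
```

A widening spread at a given liquidity depth signals worsening market conditions and higher expected slippage for large trades.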
## DEX State Price Report Schema
[Chainlink DEX State Price Reports](/data-streams/reference/report-schema-v3-dex) are designed for assets that derive most or all of their liquidity from decentralized exchanges (DEXs). Unlike standard crypto price streams, these reports use onchain market data to reflect the unique conditions of AMM pools and DEX-dominant tokens. In this report, the `price`, `bid`, and `ask` fields are all equal, representing the execution price a trader would receive based on the current state of onchain liquidity pools, rather than order-book mechanics. This approach enables accurate, real-time pricing, even for long-tail or newly launched tokens in low-volume environments.
[The DEX State Price methodology](/data-streams/concepts/dex-state-price-streams#high-level-outline-of-the-dex-state-price-methodology) aggregates data from multiple DEX pools, applies volume and TVL-based weighting, and uses filters to reduce manipulation and smooth volatility. Users should be aware of the specific risks inherent to DeFi, such as smart contract vulnerabilities, bridge dependencies, and external price manipulation. It is important to review [risk mitigation guidance](/data-streams/concepts/dex-state-price-streams#risk-mitigation) and adjust protocol parameters accordingly when integrating DEX State Price Reports.
For more details on the methodology and risk considerations, see [DEX State Price Streams](/data-streams/concepts/dex-state-price-streams).
## Exchange Rate Report Schema
[Chainlink Exchange Rate Data Streams](/data-streams/reference/report-schema-v7) provide tamper-proof access to the exchange rate of tokenized assets, delivered over the low-latency, high-frequency Chainlink Data Streams infrastructure. Exchange rates are typically read directly from onchain smart contracts. These exchange rates generally have a single calculating agent: the issuing protocol. This report is designed specifically for Exchange Rate feeds and reports one main value: the exchange rate (`exchangeRate`).
While the Data Streams architecture is designed to deliver updates at least once per second, the exchange rate itself is often updated only once per day by the data source, in line with standard market practices. Data Streams ensures that any Exchange Rate update, whenever it occurs, is captured and made available immediately and at low latency, allowing seamless integration into onchain applications alongside other real-time data streams.
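Because the underlying rate may change only once per day, freshness is best judged from the report's timestamp fields rather than from value changes. The following sketch assumes a decoded v7 report; `maxAgeSeconds` is an application-chosen tolerance, not an SDK or protocol value.

```typescript
// Sketch: deciding whether a decoded exchange-rate report is usable.
// The timestamp fields come from the v7 schema (seconds); maxAgeSeconds
// is an application-chosen staleness bound (assumption).
function isReportUsable(
  report: { observationsTimestamp: number; expiresAt: number },
  nowSeconds: number,
  maxAgeSeconds: number
): boolean {
  if (nowSeconds >= report.expiresAt) return false // report has expired
  return nowSeconds - report.observationsTimestamp <= maxAgeSeconds
}
```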
## Real World Asset (RWA) Report Schema
[Chainlink RWA Data Streams](/data-streams/reference/report-schema-v8) provide fresh, reliable, and accurate financial market data for real-world assets, enabling DeFi users to gain onchain exposure to physical assets. Each report includes a staleness measure (`lastUpdateTimestamp`), consensus median price (`midPrice`) and market status (`marketStatus`).
RWA assets trade on traditional exchanges during [market hours](/data-streams/market-hours). These market hours vary by asset class and can be subject to unexpected halts, pauses and other behaviors affecting traditional markets. For this reason, this class of Data Streams contains a market hours flag and a staleness measure to equip our users to handle these events correctly. It is critical that users implement correct safeguards on their end to pause markets, add more conservative risk caps, or implement other measures appropriate for their application.
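The safeguards described above can be sketched as a simple gate over the decoded report fields. The field names follow the v8 schema; `maxAgeNs` is an application-chosen staleness bound, not a protocol value.

```typescript
// Sketch of the safeguards described above, assuming a decoded v8 report.
// marketStatus: 0 = Unknown, 1 = Closed, 2 = Open.
// lastUpdateTimestamp is in nanoseconds; maxAgeNs is an application-chosen
// staleness bound (assumption).
function canUseRwaPrice(
  report: { marketStatus: number; lastUpdateTimestamp: bigint },
  nowNs: bigint,
  maxAgeNs: bigint
): boolean {
  if (report.marketStatus !== 2) return false // Unknown or Closed: pause, cap risk, etc.
  return nowNs - report.lastUpdateTimestamp <= maxAgeNs
}
```

A conservative consumer treats anything other than an open market with fresh data as unusable, falling back to pausing its own market or tightening risk parameters.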
## Net Asset Value (NAV) Report Schema
[Chainlink NAV Data Streams](/data-streams/reference/report-schema-v9) provide real-time, tamper-proof access to the Net Asset Value (`navPerShare`) of tokenized assets, funds, or portfolios, delivered over the low-latency, high-frequency Chainlink Data Streams infrastructure. Each report includes NAV per share (`navPerShare`), NAV date (`navDate`), assets under management (`aum`), and ripcord status (`ripcord`). The ripcord is set to true (`1`) by the asset issuer when the consumer should ignore the value being sent (for cases such as maintenance, upstream data source outages, etc.). The feed data will remain stale until the ripcord returns false (`0`).
NAV is a fundamental financial metric that represents the value of an investment vehicle such as a mutual fund or ETF and is calculated as the total assets minus the total liabilities.
Data Streams ensures that any NAV update, whenever it occurs, is captured and made available immediately and at low latency, allowing for seamless integration with onchain applications alongside other real-time data streams. Although the NAV value may not change frequently, Data Streams provides the most recent NAV as soon as it is published by the source.
## Backed xStock Report Schema
[Chainlink Backed xStock Data Streams](/data-streams/reference/report-schema-v10) provide fresh, reliable, and accurate financial market data for Tokenized Equities such as [xStock](https://xstocks.com/us) assets, enabling DeFi users to gain onchain exposure to tokenized stocks. These Streams are a unique product provided by Chainlink Labs for partners such as xStocks. They combine data from our US equity Data Streams with data from the tokenization service, which enables users to correctly handle corporate actions affecting the underlying equities. Each report includes the staleness measure (`lastUpdateTimestamp`), consensus mid price (`price`), market status (`marketStatus`), current multiplier (`currentMultiplier`, the number of underlying shares each xStock is redeemable for), new multiplier (`newMultiplier`, the future number of shares after a scheduled corporate action), activation date/time of the corporate action (`activationDateTime`) and the tokenized price if available on primary or secondary markets (`tokenizedPrice`).
The underlying US equities trade on traditional exchanges during [market hours](/data-streams/market-hours). These market hours vary by asset class and can be subject to unexpected halts, pauses and other behaviors affecting traditional markets. For this reason, this class of Data Streams contains a market hours flag and a staleness measure to equip our users to handle these events correctly. It is critical that users implement correct safeguards on their end to pause markets, add more conservative risk caps, or implement other measures appropriate for their application.
[The schema](/data-streams/reference/report-schema-v10) is designed specifically for tokenized equities such as xStocks and contains data from the Chainlink US equities streams, combined with data provided by the tokenizer to properly handle corporate actions. With this enhanced data users are able to handle expected and unexpected market events such as pauses, halts and market off hours. The tokenization provider layers in data around the `currentMultiplier`, the `newMultiplier` and the `activationDateTime` of the new multiplier to handle corporate actions.
---
# Backed xStock Report Schema (v10)
Source: https://docs.chain.link/data-streams/reference/report-schema-v10
Chainlink Backed xStock Data Streams adhere to the report schema outlined below.
### Schema Fields
| Field | Type | Description |
| ----------------------- | --------- | ------------------------------------------------------------------------------------------------------------ |
| `feedId` | `bytes32` | Unique identifier for the Data Streams feed |
| `validFromTimestamp` | `uint32` | Earliest timestamp when the price is valid (seconds) |
| `observationsTimestamp` | `uint32` | Latest timestamp when the price is valid (seconds) |
| `nativeFee` | `uint192` | Cost to verify report onchain (native token) |
| `linkFee` | `uint192` | Cost to verify report onchain (LINK) |
| `expiresAt` | `uint32` | Expiration date of the report (seconds) |
| `lastUpdateTimestamp` | `uint64` | Timestamp of the last valid price update (nanoseconds) |
| `price` | `int192` | Last traded price from the real-world equity market |
| `marketStatus` | `uint32` | Status of the real-world equity market. Possible values: `0` (Unknown), `1` (Closed), `2` (Open) |
| `currentMultiplier` | `int192` | Currently applied multiplier accounting for past corporate actions |
| `newMultiplier`         | `int192`  | Multiplier to be applied at the `activationDateTime` (set to `0` if none is scheduled)                       |
| `activationDateTime` | `uint32` | When the next corporate action takes effect (set to `0` if none is scheduled) (seconds) |
| `tokenizedPrice` | `int192` | 24/7 tokenized equity price as traded on supported exchanges (In development; currently returns `0`). |
**Notes:**
- Future Backed xStock streams may use different report schemas.
- `price` updates in real time during market open, but may become stale during market closed periods.
- `tokenizedPrice` will be available in an upcoming release of Backed xStock Data Streams. Currently, it will always return `0`.
- `currentMultiplier` reflects all past corporate actions and is updated only when a new action is activated.
- `activationDateTime` and `newMultiplier` provide advance notice of upcoming corporate actions, allowing protocols to prepare.
- See more detailed guidance for handling stock splits in the [Best Practices](/data-streams/concepts/best-practices#handling-stock-splits-for-tokenized-assets) documentation.
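The multiplier fields above let a consumer anticipate a scheduled corporate action. The sketch below is illustrative only: the 18-decimal multiplier scale is an assumption to confirm against the live feed, and the `activationDateTime !== 0` check follows the "set to `0` if none is scheduled" convention in the schema.

```typescript
// Assumed multiplier scale (18 decimals) -- confirm against the live feed.
const MULTIPLIER_SCALE = 10n ** 18n

// Multiplier a consumer should expect at a given time, anticipating a
// scheduled corporate action. activationDateTime is 0 when none is scheduled.
function expectedMultiplier(
  report: { currentMultiplier: bigint; newMultiplier: bigint; activationDateTime: number },
  atSeconds: number
): bigint {
  const scheduled = report.activationDateTime !== 0
  return scheduled && atSeconds >= report.activationDateTime ? report.newMultiplier : report.currentMultiplier
}

// Value of one xStock token in terms of the underlying equity price.
function tokenValue(equityPrice: bigint, multiplier: bigint): bigint {
  return (equityPrice * multiplier) / MULTIPLIER_SCALE
}
```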
---
# Report Schemas
Source: https://docs.chain.link/data-streams/reference/report-schema-v3-dex
DEX State Price streams adhere to the report schema outlined below.
## Schema Fields
| Field | Type | Description |
| ----------------------- | --------- | ---------------------------------------------------------------------------------------------------------- |
| `feedID` | `bytes32` | Unique identifier for the data stream |
| `validFromTimestamp` | `uint32` | Start timestamp of price validity period (seconds) |
| `observationsTimestamp` | `uint32` | End timestamp of price validity period (seconds) |
| `nativeFee` | `uint192` | Verification cost in native blockchain tokens |
| `linkFee` | `uint192` | Verification cost in LINK tokens |
| `expiresAt` | `uint32` | Timestamp when this report expires (seconds) |
| `price` | `int192` | DON consensus median [DEX state price](/data-streams/concepts/dex-state-price-streams) (18 decimal places) |
| `bid` | `int192` | N/A, equals `price`. |
| `ask` | `int192` | N/A, equals `price`. |
**Notes**:
- Future DEX State Price streams may use different report schemas.
- The `bid` and `ask` fields exist but contain the same value as the `price` field.
---
# Report Schemas
Source: https://docs.chain.link/data-streams/reference/report-schema-v3
Cryptocurrency streams adhere to the report schema outlined below.
## Schema Fields
| Field | Type | Description |
| ----------------------- | --------- | -------------------------------------------------- |
| `feedID` | `bytes32` | Unique identifier for the data stream |
| `validFromTimestamp` | `uint32` | Start timestamp of price validity period (seconds) |
| `observationsTimestamp` | `uint32` | End timestamp of price validity period (seconds) |
| `nativeFee` | `uint192` | Verification cost in native blockchain tokens |
| `linkFee` | `uint192` | Verification cost in LINK tokens |
| `expiresAt` | `uint32` | Timestamp when this report expires (seconds) |
| `price` | `int192` | DON consensus median price |
| `bid` | `int192` | Simulated buy impact price at X% liquidity depth |
| `ask` | `int192` | Simulated sell impact price at X% liquidity depth |
**Note**: Future Cryptocurrency streams may use different report schemas.
---
# Report Schemas
Source: https://docs.chain.link/data-streams/reference/report-schema-v4
Real World Asset (RWA) streams adhere to the report schema outlined below.
## Schema Fields
| Field | Type | Description |
| ----------------------- | --------- | ---------------------------------------------------------------------------------------------------------------------------- |
| `feedID` | `bytes32` | The unique identifier for the stream |
| `validFromTimestamp` | `uint32` | The earliest timestamp during which the price is valid (seconds) |
| `observationsTimestamp` | `uint32` | The latest timestamp during which the price is valid (seconds) |
| `nativeFee` | `uint192` | The cost to verify this report onchain when paying with the blockchain's native token |
| `linkFee` | `uint192` | The cost to verify this report onchain when paying with LINK |
| `expiresAt` | `uint32` | The expiration date of this report (seconds) |
| `price` | `int192` | The DON's consensus median price |
| `marketStatus` | `uint32` | The DON's consensus on whether the market is currently open. Possible values: `0` (`Unknown`), `1` (`Closed`), `2` (`Open`). |
**Notes**:
- Future RWA streams may use different report schemas.
- [Bid and Ask](/data-streams/concepts/liquidity-weighted-prices) values are not available for the first iteration of the RWA report schema (v4).
---
# Report Schemas
Source: https://docs.chain.link/data-streams/reference/report-schema-v7
Exchange Rate streams adhere to the report schema outlined below.
## Schema Fields
| Field | Type | Description |
| ----------------------- | --------- | ---------------------------------------------------- |
| `feedId` | `bytes32` | Unique identifier for the Data Streams feed |
| `validFromTimestamp` | `uint32` | Earliest timestamp when the price is valid (seconds) |
| `observationsTimestamp` | `uint32` | Latest timestamp when the price is valid (seconds) |
| `nativeFee` | `uint192` | Cost to verify report onchain (native token) |
| `linkFee` | `uint192` | Cost to verify report onchain (LINK) |
| `expiresAt` | `uint32` | Expiration date of the report (seconds) |
| `exchangeRate` | `int192` | DON's consensus median exchange rate |
**Notes**:
- Future Exchange Rate streams may use different report schemas.
---
# Report Schemas
Source: https://docs.chain.link/data-streams/reference/report-schema-v8
RWA streams adhere to the report schema outlined below.
### Schema Fields
| Value | Type | Description |
| ----------------------- | --------- | ----------------------------------------------------------------------------------------------------------- |
| `feedID` | `bytes32` | Unique identifier for the Data Streams feed |
| `validFromTimestamp` | `uint32` | Earliest timestamp when the price is valid (seconds) |
| `observationsTimestamp` | `uint32` | Latest timestamp when the price is valid (seconds) |
| `nativeFee` | `uint192` | Cost to verify report onchain (native token) |
| `linkFee` | `uint192` | Cost to verify report onchain (LINK) |
| `expiresAt` | `uint32` | Expiration date of the report (seconds) |
| `lastUpdateTimestamp` | `uint64` | Timestamp of the last valid price update (nanoseconds) |
| `midPrice` | `int192` | DON's consensus median price |
| `marketStatus` | `uint32` | [Market status](/data-streams/market-hours). Possible values: `0` (`Unknown`), `1` (`Closed`), `2` (`Open`) |
**Notes**:
- `marketStatus`:
- Users are responsible for handling market status changes in their applications.
- For further guidance, refer to the [Market Hours Best Practices](/data-streams/concepts/best-practices#market-hours) documentation.
- Future RWA streams may use different report schemas.
---
# Report Schemas
Source: https://docs.chain.link/data-streams/reference/report-schema-v9
Chainlink NAV Data Streams adhere to the report schema outlined below.
### Schema Fields
| Value | Type | Description |
| ----------------------- | --------- | ----------------------------------------------------------------------------------------------------------------------------------------------- |
| `feedID` | `bytes32` | Unique identifier for the Data Streams feed |
| `validFromTimestamp` | `uint32` | Earliest timestamp when the price is valid (seconds) |
| `observationsTimestamp` | `uint32` | Latest timestamp when the price is valid (seconds) |
| `nativeFee` | `uint192` | Cost to verify report onchain (native token) |
| `linkFee` | `uint192` | Cost to verify report onchain (LINK) |
| `expiresAt` | `uint32` | Expiration date of the report (seconds) |
| `navPerShare` | `int192` | DON consensus NAV Per Share value as reported by the Fund Manager |
| `navDate`               | `uint64`  | Timestamp of the NAV report's publication date (nanoseconds)                                                                                     |
| `aum` | `int192` | DON consensus Total USD value of Assets Under Management |
| `ripcord`               | `uint32`  | Whether the provider paused NAV reporting. Possible values: `0` (normal state), `1` (paused state). [More details](#ripcord-status)              |
**Notes:**
- Future NAV streams may use different report schemas.
##### `ripcord` Status
- 0 (false) - **Data Provider is OK**. The Fund’s data provider is reporting as expected and the data is accurate.
- 1 (true) - **Data Provider is flagging a Pause**. The Fund’s data provider detected outliers, breached deviation thresholds, or a management- or operations-related pause. **During `ripcord=1`, do not consume any NAV data**.
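The rule above can be enforced with a small guard over the decoded report. Field names follow the v9 schema; holding the last trusted value (rather than, say, halting entirely) is an application choice, shown here only as one option.

```typescript
// Sketch: while ripcord is 1, hold the last trusted NAV rather than
// consuming the incoming value. The holding strategy is an application
// choice, not part of the schema.
function selectNav(
  report: { ripcord: number; navPerShare: bigint },
  lastTrustedNav: bigint | null
): bigint | null {
  if (report.ripcord === 1) return lastTrustedNav // paused: do not consume report data
  return report.navPerShare
}
```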
---
# Data Streams Real World Asset streams
Source: https://docs.chain.link/data-streams/rwa-streams
---
# Streams Trade Implementation
Source: https://docs.chain.link/data-streams/streams-trade
The Streams Trade implementation combines Chainlink Data Streams with [Chainlink Automation](/chainlink-automation) to enable automated trade execution. This implementation allows decentralized applications to automate trade execution, mitigate frontrunning, and limit bias or adverse incentives in executing non-user-triggered orders.
Read more about the [Streams Trade Architecture](/data-streams/architecture#streams-trade-architecture) and an [example trading flow](/data-streams/architecture#example-trading-flow-using-streams-trade).
## Getting Started
To implement Streams Trade in your application:
1. First, ensure Streams Trade is available on your desired network by checking the [Supported Networks](/data-streams/supported-networks#streams-trade-implementation-onchain-lookup) page.
2. Review the [Architecture Documentation](/data-streams/architecture#streams-trade-architecture) to understand the system components.
3. See an [Example Trading Flow](/data-streams/architecture#example-trading-flow-using-streams-trade) to understand how trades are executed.
4. Follow our [Getting Started Guide](/data-streams/getting-started) to set up your first integration.
## Common Use Cases
Streams Trade is particularly well-suited for:
- **Perpetual Futures Protocols**: Enable high-performance onchain perpetual futures that can compete with centralized exchanges
- **Automated Market Making**: Implement sophisticated market making strategies with real-time price updates
- **Options Protocols**: Enable precise and timely settlement of options contracts
- **Prediction Markets**: Allow quick responses to real-time events with accurate settlement data
---
# Interfaces
Source: https://docs.chain.link/data-streams/streams-trade/interfaces
Data Streams requires several interfaces to retrieve and verify reports.
- Automation interfaces:
- [StreamsLookupCompatibleInterface](/chainlink-automation/reference/automation-interfaces#streamslookupcompatibleinterface)
- [ILogAutomation](/chainlink-automation/reference/automation-interfaces#ilogautomation)
- Data Streams interfaces:
- IVerifierProxy
- IReportHandler
In the current code example for using Data Streams with Automation, these interfaces are declared directly in the example. Importable versions of these interfaces will be available in the future.
---
# Supported Networks
Source: https://docs.chain.link/data-streams/supported-networks
Chainlink Data Streams provides data access directly via API or WebSocket for offchain use cases. You can [verify report integrity](/data-streams/tutorials/evm-onchain-report-verification) against onchain verifier proxy contracts.
Data Streams is available on any network listed on the following pages, where you can find the necessary Verifier Proxy addresses:
- [Cryptocurrency Streams](/data-streams/crypto-streams)
- [Real World Asset (RWA) Streams](/data-streams/rwa-streams)
## Streams Trade implementation (Onchain Lookup)
Streams Trade, the alternative implementation, allows smart contracts to access Data Streams onchain using the [`StreamsLookup`](/data-streams/getting-started) capability integrated with [Chainlink Automation](/chainlink-automation).
Streams Trade is currently available on the following networks:
- Arbitrum
- Avalanche
- Base
- BNB Chain
- Ethereum
- Optimism
- Polygon
---
# Verify report data onchain (EVM)
Source: https://docs.chain.link/data-streams/tutorials/evm-onchain-report-verification
In this tutorial, you'll learn how to verify the integrity of reports onchain by confirming their authenticity as signed by the Decentralized Oracle Network (DON). You'll use a verifier contract to verify the data onchain and pay the verification fee in LINK tokens.
## Before you begin
Make sure you understand how to fetch reports via the [REST API](/data-streams/reference/data-streams-api/interface-api) or [WebSocket](/data-streams/reference/data-streams-api/interface-ws) connection. Refer to the following tutorials for more information:
- [Fetch and decode reports via a REST API](/data-streams/tutorials/go-sdk-fetch)
- [Stream and decode reports via WebSocket](/data-streams/tutorials/go-sdk-stream)
## Requirements
- This tutorial requires testnet ETH and LINK on *Arbitrum Sepolia*. Both are available at [faucets.chain.link](https://faucets.chain.link/arbitrum-sepolia).
- Learn how to [Fund your contract with LINK](/resources/fund-your-contract).
## Tutorial
### Deploy the verifier contract
Deploy a `ClientReportsVerifier` contract on *Arbitrum Sepolia*. This contract can verify reports and pays the verification fee in LINK tokens.
1. [Open the ClientReportsVerifier.sol](https://remix.ethereum.org/#url=https://docs.chain.link/samples/DataStreams/ClientReportsVerifier.sol) contract in Remix.
2. Select the `ClientReportsVerifier.sol` contract in the **Solidity Compiler** tab.
3. Compile the contract.
4. Open MetaMask and set the network to *Arbitrum Sepolia*. If you need to add Arbitrum Sepolia to your wallet, you can find the chain ID and the LINK token contract address on the [LINK Token Contracts](/resources/link-token-contracts#arbitrum-sepolia-testnet) page.
5. On the **Deploy & Run Transactions** tab in Remix, select *Injected Provider - MetaMask* in the **Environment** list. Remix will use the MetaMask wallet to communicate with *Arbitrum Sepolia*.
6. In the **Contract** section, select the `ClientReportsVerifier` contract and fill in the Arbitrum Sepolia **verifier proxy address**: `0x2ff010DEbC1297f19579B4246cad07bd24F2488A`. You can find the verifier proxy addresses on the [Stream Addresses](/data-streams/crypto-streams) page.
7. Click the **Deploy** button to deploy the contract. MetaMask prompts you to confirm the transaction. Check the transaction details to ensure you deploy the contract to *Arbitrum Sepolia*.
8. After you confirm the transaction, the contract address appears under the **Deployed Contracts** list in Remix. Save this contract address for the next step.
### Fund the verifier contract
In this example, the client contract pays for onchain verification of reports in LINK tokens when fees are required. The contract automatically detects whether the target network requires fees:
- **Networks with `FeeManager` deployed**: Verification requires token payments. These networks include: Arbitrum, Avalanche, Base, Blast, Bob, Ink, Linea, OP, Scroll, Soneium, and ZKSync.
- **Networks without `FeeManager`**: No funding is needed since you can verify reports without fees. The contract skips the fee calculation and approval steps.
For this tutorial on *Arbitrum Sepolia*, fees are required, so you need to fund the contract with LINK tokens. Open MetaMask and send 1 testnet LINK on *Arbitrum Sepolia* to the verifier contract address you saved earlier.
### Verify a report onchain
1. In Remix, on the **Deploy & Run Transactions** tab, expand your verifier contract under the **Deployed Contracts** section.
2. Fill in the `verifyReport` function input parameter with the report payload you want to verify. You can use the following full report payload obtained in the [Fetch and decode report via a REST API](/data-streams/tutorials/go-sdk-fetch) tutorial as an example:
```
0x000660403d36be006d0c15d9b306f93c8660c5cfeab7db8e28c78ba316d395970000000000000000000000000000000000000000000000000000000032c3780a000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e000000000000000000000000000000000000000000000000000000000000002200000000000000000000000000000000000000000000000000000000000000280010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001200003c8e550d2fc5304993010112de9b69798297e4cc11990ee6250e464daf760000000000000000000000000000000000000000000000000000000006706e595000000000000000000000000000000000000000000000000000000006706e595000000000000000000000000000000000000000000000000000025bd3eb74c080000000000000000000000000000000000000000000000000021c6a95c654c7400000000000000000000000000000000000000000000000000000000670837150000000000000000000000000000000000000000000000079a2ab4077fc8fc6000000000000000000000000000000000000000000000000799fcb42536dfd8300000000000000000000000000000000000000000000000079a59496c3f29a0000000000000000000000000000000000000000000000000000000000000000002bd4acd37ce3cd5799de05d156ab328a5effd94468ebbaf2ff18d13d9631259cbe66cca01af6a8bb36e79d2d731a44e16791ee31e46ce27ed6530f1590cd7734c0000000000000000000000000000000000000000000000000000000000000002391562f1f2e4986bdb978fbf5ee27f7012992a79301af42d3473761ef2ede6271a61fbf4b32ac5be68a598bcfa523e035b624dab3b3d9a46276834f824ee592a
```
3. Click the `verifyReport` button to call the function. MetaMask prompts you to accept the transaction.
4. Click the `lastDecodedPrice` getter function to view the decoded price from the verified report. The answer of `3257579704051546000000` indicates an ETH/USD price of 3,257.579704051546. Each stream uses a different number of decimal places for answers. See the [Stream Addresses](/data-streams/crypto-streams) page for more information.
## Examine the code
The example code you deployed has all the interfaces and functions required to verify Data Streams reports onchain.
### Initializing the contract
When deploying the contract, you define the verifier proxy address for the stream you want to read from. You can find this address on the [Stream Addresses](/data-streams/crypto-streams) page. The verifier proxy address provides functions that are required for this example:
- The `s_feeManager` function to estimate the verification fees.
- The `verify` function to verify the report onchain.
### Verifying a report
The `verifyReport` function is the core function that handles onchain report verification. Here's how it works:
- **Report data extraction**:
- The function decodes the `unverifiedReport` to extract the report data.
- It then extracts the report version by reading the first two bytes of the report data, which correspond to the schema version encoded in the stream ID:
- Schema version `0x0003` corresponds to the [report schema v3](/data-streams/reference/report-schema-v3) (Crypto assets).
- Schema version `0x0007` corresponds to the [report schema v7](/data-streams/reference/report-schema-v7) (Exchange Rates).
- Schema version `0x0008` corresponds to the [report schema v8](/data-streams/reference/report-schema-v8) (Real World Assets).
- Schema version `0x0009` corresponds to the [report schema v9](/data-streams/reference/report-schema-v9) (NAV data).
- Schema version `0x000A` corresponds to the [report schema v10](/data-streams/reference/report-schema-v10) (Backed xStock data).
- If the report version is unsupported, the function reverts with an `InvalidReportVersion` error.
For more information, see the [Report Schemas](/data-streams/reference/report-schema-overview) documentation.
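As a rough offchain illustration of the version extraction described above (the contract itself does this in Solidity), the schema version can be read from the first two bytes of a stream ID. The `schemaVersion` helper below is an assumption for illustration, not part of any SDK:

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// schemaVersion returns the report schema version encoded in the
// first two bytes of a stream ID (e.g. 0x0003... -> 3).
func schemaVersion(streamID string) (uint16, error) {
	raw, err := hex.DecodeString(strings.TrimPrefix(streamID, "0x"))
	if err != nil {
		return 0, err
	}
	if len(raw) < 2 {
		return 0, fmt.Errorf("stream ID too short")
	}
	return uint16(raw[0])<<8 | uint16(raw[1]), nil
}

func main() {
	// ETH/USD testnet stream ID used elsewhere in this documentation
	id := "0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"
	v, err := schemaVersion(id)
	if err != nil {
		panic(err)
	}
	fmt.Printf("schema version: v%d\n", v) // prints: schema version: v3
}
```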
- **Fee calculation and handling**:
- The function first checks if a `FeeManager` contract exists by querying `s_feeManager()` on the verifier proxy.
- **If a `FeeManager` exists** (non-zero address):
- It calculates the fees required for verification using the `getFeeAndReward` function.
- It approves the `RewardManager` contract to spend the calculated amount of LINK tokens from the contract's balance.
- It encodes the fee token address into the `parameterPayload` for the verification call.
- `FeeManager` contracts are currently deployed on: Arbitrum, Avalanche, Base, Blast, Bob, Ink, Linea, OP, Scroll, Soneium, and ZKSync.
- **If no `FeeManager` exists** (zero address):
- The function skips the fee calculation and approval steps entirely.
- It passes an empty `parameterPayload` to the verification call.
- This automatic detection makes the contract compatible with any network, regardless of whether fee management is deployed.
- **Report verification**:
- The `verify` function of the verifier proxy is called to perform the actual verification.
- It passes the `unverifiedReport` and the `parameterPayload` (which contains either the encoded fee token address or empty bytes) as parameters.
- **Data decoding**:
- Depending on the [report version](/data-streams/reference/report-schema-overview), the function decodes the verified report data into the appropriate struct (`ReportV3`, `ReportV8`, `ReportV9`, etc.).
- It emits a `DecodedPrice` event with the price extracted from the verified report.
- The `lastDecodedPrice` state variable is updated with the new price.
### Additional functionality
The contract also includes:
- **Owner-only token withdrawal**: The `withdrawToken` function allows the contract owner to withdraw any ERC-20 tokens (including LINK) from the contract.
- **Enhanced error handling**: The contract includes specific error types (`InvalidReportVersion`, `NotOwner`, `NothingToWithdraw`) for better debugging and user experience.
- **Cross-chain compatibility**: The automatic `FeeManager` detection makes the same contract code work on any supported network, whether fees are required or not.
---
# Fetch and decode Data Streams reports using the Go SDK
Source: https://docs.chain.link/data-streams/tutorials/go-sdk-fetch
In this tutorial, you'll learn how to use the [Data Streams SDK](/data-streams/reference/data-streams-api/go-sdk) for Go to fetch and decode [reports](/data-streams/reference/report-schema-overview) from the Data Streams Aggregation Network. You'll set up your Go project, retrieve reports, decode them, and log their attributes.
## Requirements
- **Git**: Make sure you have Git installed. You can check your current version by running `git --version` in your terminal and download the latest version from the official [Git website](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) if necessary.
- **Go Version**: Make sure you have Go version 1.21 or higher. You can check your current version by running `go version` in your terminal and download the latest version from the official [Go website](https://go.dev/) if necessary.
- **API Credentials**: Access to Data Streams requires API credentials. If you haven't already, [contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to request mainnet or testnet access.
## Tutorial
You'll start by setting up your Go project, installing the SDK, and pasting in example code. This will let you decode reports for both single and multiple [streams](/data-streams/crypto-streams), logging their attributes to your terminal.
### Set up your Go project
1. Create a new directory for your project and navigate to it:
```bash
mkdir my-data-streams-project
cd my-data-streams-project
```
2. Initialize a new Go module:
```bash
go mod init my-data-streams-project
```
3. Install the [Data Streams SDK](https://github.com/smartcontractkit/data-streams-sdk/tree/main/go):
```bash
go get github.com/smartcontractkit/data-streams-sdk/go
```
### Fetch and decode a report with a single stream
1. Create a new Go file, `single-stream.go`, in your project directory:
```bash
touch single-stream.go
```
2. Insert the following code example and save your `single-stream.go` file:
```go
package main
import (
"context"
"fmt"
"os"
"time"
streams "github.com/smartcontractkit/data-streams-sdk/go"
feed "github.com/smartcontractkit/data-streams-sdk/go/feed"
report "github.com/smartcontractkit/data-streams-sdk/go/report"
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
v3 "github.com/smartcontractkit/data-streams-sdk/go/report/v3"
)
func main() {
// Validate command-line arguments
if len(os.Args) < 2 {
fmt.Printf("Usage: go run main.go [FeedID]\nExample: go run main.go 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782\n")
os.Exit(1)
}
feedIDInput := os.Args[1]
// Get API credentials from environment variables
apiKey := os.Getenv("API_KEY")
apiSecret := os.Getenv("API_SECRET")
if apiKey == "" || apiSecret == "" {
fmt.Printf("API_KEY and API_SECRET environment variables must be set\n")
os.Exit(1)
}
// Define the configuration for the SDK client
cfg := streams.Config{
ApiKey: apiKey,
ApiSecret: apiSecret,
RestURL: "https://api.testnet-dataengine.chain.link",
Logger: streams.LogPrintf,
}
// Initialize the SDK client
client, err := streams.New(cfg)
if err != nil {
cfg.Logger("Failed to create client: %v\n", err)
os.Exit(1)
}
// Create context with timeout
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
// Parse the feed ID
var feedID feed.ID
if err := feedID.FromString(feedIDInput); err != nil {
cfg.Logger("Invalid feed ID format '%s': %v\n", feedIDInput, err)
os.Exit(1)
}
// Fetch the latest report
reportResponse, err := client.GetLatestReport(ctx, feedID)
if err != nil {
cfg.Logger("Failed to get latest report: %v\n", err)
os.Exit(1)
}
// Log the raw report data
cfg.Logger("Raw report data: %+v\n", reportResponse)
// Decode the report
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
decodedReport, err := report.Decode[v3.Data](reportResponse.FullReport)
if err != nil {
cfg.Logger("Failed to decode report: %v\n", err)
os.Exit(1)
}
// Format and display the decoded report
// NOTE: Adjust for your report and desired output
fmt.Printf("\nDecoded Report for Stream ID %s:\n"+
"------------------------------------------\n"+
"Observations Timestamp: %d\n"+
"Benchmark Price : %s\n"+
"Bid : %s\n"+
"Ask : %s\n"+
"Valid From Timestamp : %d\n"+
"Expires At : %d\n"+
"Link Fee : %s\n"+
"Native Fee : %s\n"+
"------------------------------------------\n",
feedIDInput,
decodedReport.Data.ObservationsTimestamp,
decodedReport.Data.BenchmarkPrice.String(),
decodedReport.Data.Bid.String(),
decodedReport.Data.Ask.String(),
decodedReport.Data.ValidFromTimestamp,
decodedReport.Data.ExpiresAt,
decodedReport.Data.LinkFee.String(),
decodedReport.Data.NativeFee.String(),
)
}
```
3. Download the required dependencies and update the `go.mod` and `go.sum` files:
```bash
go mod tidy
```
4. Set up your API credentials as environment variables:
```bash
export API_KEY=""
export API_SECRET=""
```
Set both values to your actual credentials.
The Go script uses `os.Getenv("API_KEY")` and `os.Getenv("API_SECRET")` in its `streams.Config` to read these values. From the code sample above:
```go
cfg := streams.Config{
ApiKey: os.Getenv("API_KEY"),
ApiSecret: os.Getenv("API_SECRET"),
RestURL: "https://api.testnet-dataengine.chain.link",
Logger: streams.LogPrintf,
}
```
This configuration also specifies the `RestURL`, which is the base URL for the API. In this case, it is already set to the testnet URL for Data Streams.
See the [SDK Reference](/data-streams/reference/data-streams-api/go-sdk#config-struct) page for more configuration options.
5. Read from a [testnet crypto stream](/data-streams/crypto-streams?page=1\&testnetPage=1#testnet-crypto-streams). The example below runs the application, reading from the `ETH/USD` crypto stream:
```bash
go run single-stream.go 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
```
Expect output similar to the following in your terminal:
```bash
Raw report data: {"fullReport":"0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff1578000000000000000000000000000000000000000000000000000000004f8e8a11000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba78200000000000000000000000000000000000000000000000000000000675e0a5b00000000000000000000000000000000000000000000000000000000675e0a5b00000000000000000000000000000000000000000000000000001787ff5c6fb8000000000000000000000000000000000000000000000000000c01807477ecd000000000000000000000000000000000000000000000000000000000675f5bdb0000000000000000000000000000000000000000000000d1865f8f627c4113300000000000000000000000000000000000000000000000d18572c6cdc3b915000000000000000000000000000000000000000000000000d1879ab8e98f743ad00000000000000000000000000000000000000000000000000000000000000002f3316e5c964d118f6683eecda454985fcc696e4ba34d65edb4a71a8d0cfe970676f465618c7d01196e433cc35b6994e7ad7b8189b0462b51458e663d601fdfaa0000000000000000000000000000000000000000000000000000000000000002219a4493fdf311421d664e0c8d69efa74b776461f8e252d191eda7edb980ab9a5cce69ec0ad35ba210cf60a201ceff6771b35b44860fda859f4aaba242c476bf","feedID":"0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782","validFromTimestamp":1734216283,"observationsTimestamp":1734216283}
Decoded Report for Stream ID 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782:
------------------------------------------
Observations Timestamp: 1734216283
Benchmark Price : 3865052126782320350000
Bid : 3864985478146740000000
Ask : 3865140837060103650000
Valid From Timestamp : 1734216283
Expires At : 1734302683
Link Fee : 3379350941986000
Native Fee : 25872872271800
------------------------------------------
```
Your application has successfully decoded the report data.
[Learn more about the decoded report details](#decoded-report-details).
### Fetch and decode reports for multiple streams
1. Create a new Go file, `multiple-streams.go`, in your project directory:
```bash
touch multiple-streams.go
```
2. Insert the following code example in your `multiple-streams.go` file:
```go
package main
import (
"context"
"fmt"
"os"
"time"
streams "github.com/smartcontractkit/data-streams-sdk/go"
feed "github.com/smartcontractkit/data-streams-sdk/go/feed"
report "github.com/smartcontractkit/data-streams-sdk/go/report"
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
v3 "github.com/smartcontractkit/data-streams-sdk/go/report/v3"
)
func main() {
// Validate command-line arguments
if len(os.Args) < 3 {
fmt.Printf("Usage: go run multiple-streams.go [StreamID1] [StreamID2] ...\n"+
"Example: go run multiple-streams.go "+
"0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 "+
"0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265\n")
os.Exit(1)
}
// Get API credentials from environment variables
apiKey := os.Getenv("API_KEY")
apiSecret := os.Getenv("API_SECRET")
if apiKey == "" || apiSecret == "" {
fmt.Printf("API_KEY and API_SECRET environment variables must be set\n")
os.Exit(1)
}
// Define the configuration for the SDK client
cfg := streams.Config{
ApiKey: apiKey,
ApiSecret: apiSecret,
RestURL: "https://api.testnet-dataengine.chain.link",
Logger: streams.LogPrintf,
}
// Initialize the SDK client
client, err := streams.New(cfg)
if err != nil {
cfg.Logger("Failed to create client: %v\n", err)
os.Exit(1)
}
// Create context with timeout
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
// Parse Feed IDs
var ids []feed.ID
for _, arg := range os.Args[1:] {
var fid feed.ID
if err := fid.FromString(arg); err != nil {
cfg.Logger("Invalid feed ID format '%s': %v\n", arg, err)
continue
}
ids = append(ids, fid)
}
if len(ids) == 0 {
cfg.Logger("No valid feed IDs provided\n")
os.Exit(1)
}
// Fetch reports for all streams
timestamp := uint64(time.Now().Unix())
reportResponses, err := client.GetReports(ctx, ids, timestamp)
if err != nil {
cfg.Logger("Failed to get reports: %v\n", err)
os.Exit(1)
}
// Process reports
for _, reportResponse := range reportResponses {
// Log the raw report data
cfg.Logger("Raw report data for Stream ID %s: %+v\n",
reportResponse.FeedID.String(), reportResponse)
// Decode the report
decodedReport, err := report.Decode[v3.Data](reportResponse.FullReport)
if err != nil {
cfg.Logger("Failed to decode report for Stream ID %s: %v\n",
reportResponse.FeedID.String(), err)
continue // Skip to next report if decoding fails
}
// Format and display the decoded report
// NOTE: Adjust for your report and desired output
fmt.Printf("\nDecoded Report for Stream ID %s:\n"+
"------------------------------------------\n"+
"Observations Timestamp: %d\n"+
"Benchmark Price : %s\n"+
"Bid : %s\n"+
"Ask : %s\n"+
"Valid From Timestamp : %d\n"+
"Expires At : %d\n"+
"Link Fee : %s\n"+
"Native Fee : %s\n"+
"------------------------------------------\n",
reportResponse.FeedID.String(),
decodedReport.Data.ObservationsTimestamp,
decodedReport.Data.BenchmarkPrice.String(),
decodedReport.Data.Bid.String(),
decodedReport.Data.Ask.String(),
decodedReport.Data.ValidFromTimestamp,
decodedReport.Data.ExpiresAt,
decodedReport.Data.LinkFee.String(),
decodedReport.Data.NativeFee.String(),
)
}
}
```
3. Before running the example, verify that your API credentials are still set in your current terminal session:
```bash
echo $API_KEY
echo $API_SECRET
```
If the commands above don't show your credentials, set them again:
```bash
export API_KEY=""
export API_SECRET=""
```
4. Read from two [testnet crypto streams](/data-streams/crypto-streams?page=1\&testnetPage=1#testnet-crypto-streams). For this example, you will read from the ETH/USD and LINK/USD crypto streams. Run your application:
```bash
go run multiple-streams.go 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265
```
Expect output similar to the following in your terminal:
```bash
2024-12-14T17:49:06-05:00 Raw report data for Stream ID 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782: {"fullReport":"0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff1578000000000000000000000000000000000000000000000000000000004f8eb301000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028001010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba78200000000000000000000000000000000000000000000000000000000675e0b6100000000000000000000000000000000000000000000000000000000675e0b610000000000000000000000000000000000000000000000000000178bcfba6d60000000000000000000000000000000000000000000000000000c1536af09b42c00000000000000000000000000000000000000000000000000000000675f5ce10000000000000000000000000000000000000000000000d1646f5fd0b1e6e6a00000000000000000000000000000000000000000000000d1635f8df6b0fce2600000000000000000000000000000000000000000000000d1677c13a6c9d89620000000000000000000000000000000000000000000000000000000000000000222412e1bd137097dc97def8914c72ae7305179eedc5c15e344bc119a06f1db76ef20f3e6493a97c2be2ab831199bfc00dbbf02551f2a27a70cfd55653270acac0000000000000000000000000000000000000000000000000000000000000002343957d73014b446eb0b0436072e16261a643102e502529d3b97412027c3468977bf0806e8971a37e6487b855243957a57749cde2ac92ebc2bda412d94981251","feedID":"0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782","validFromTimestamp":1734216545,"observationsTimestamp":1734216545}
Decoded Report for Stream ID 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782:
------------------------------------------
Observations Timestamp: 1734216545
Benchmark Price : 3862606619881446500000
Bid : 3862530109428509500000
Ask : 3862826368095386900000
Valid From Timestamp : 1734216545
Expires At : 1734302945
Link Fee : 3401024329593900
Native Fee : 25889252994400
------------------------------------------
2024-12-14T17:49:06-05:00 Raw report data for Stream ID 0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265: {"fullReport":"0x00060a2676459d14176b64106fcf3246631d3a03734171737eb082fe79c956e0000000000000000000000000000000000000000000000000000000005437220a000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e0000000000000000000000000000000000000000000000000000000000000022000000000000000000000000000000000000000000000000000000000000002800001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000012000036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d026500000000000000000000000000000000000000000000000000000000675e0b6100000000000000000000000000000000000000000000000000000000675e0b610000000000000000000000000000000000000000000000000000178bc5ba4c04000000000000000000000000000000000000000000000000000c153d429e703c00000000000000000000000000000000000000000000000000000000675f5ce1000000000000000000000000000000000000000000000001980b3d7d3ec2df6000000000000000000000000000000000000000000000000197f69e569cfba588000000000000000000000000000000000000000000000001981e643ba148a6040000000000000000000000000000000000000000000000000000000000000002ae4d6d1a241622f6b9c5cc53c5ac6abc4e46759c7bda4942936e02f2fc9ba7374869e71ce1572ae581049e6e9463537056031e30b0c11f84ff44e45363e3fa9300000000000000000000000000000000000000000000000000000000000000021ab422f2202e0f59016a29b37c2795b47d78c19ba2358808794c6f0cd3b04bde7adb2f30669b051ddde1059f106c161c65c401909b78d76bd13ad593e31ab13e","feedID":"0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265","validFromTimestamp":1734216545,"observationsTimestamp":1734216545}
Decoded Report for Stream ID 0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265:
------------------------------------------
Observations Timestamp: 1734216545
Benchmark Price : 29402662200351580000
Bid : 29396857712545605000
Ask : 29408052824047658500
Valid From Timestamp : 1734216545
Expires At : 1734302945
Link Fee : 3401052575395900
Native Fee : 25889085213700
------------------------------------------
```
Your application has successfully fetched and decoded data for both streams.
[Learn more about the decoded report details](#decoded-report-details).
### Decoded report details
The decoded [crypto v3 report](/data-streams/reference/report-schema-v3) details include:
| Attribute | Value | Description |
| ---------------------- | -------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Stream ID | `0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782` | The unique identifier for the stream. In this example, the stream is for ETH/USD. |
| Observations Timestamp | `1734216283` | The timestamp indicating when the data was captured. |
| Benchmark Price | `3865052126782320350000` | The observed price in the report, with 18 decimals. For readability: `3,865.0521267823204` USD per ETH. |
| Bid | `3864985478146740000000` | The highest price a buyer is willing to pay for an asset, with 18 decimals. For readability: `3,864.9854781467400` USD per ETH. Learn more about the [Bid price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| Ask | `3865140837060103650000` | The lowest price a seller is willing to accept for an asset, with 18 decimals. For readability: `3,865.1408370601037` USD per ETH. Learn more about the [Ask price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| Valid From Timestamp | `1734216283` | The start validity timestamp for the report, indicating when the data becomes relevant. |
| Expires At | `1734302683` | The expiration timestamp of the report, indicating the point at which the data becomes outdated. |
| Link Fee | `3379350941986000` | The fee to pay in LINK tokens for the onchain verification of the report data. With 18 decimals. For readability: `0.03379350941986` LINK. **Note:** This example fee is not indicative of actual fees. |
| Native Fee | `25872872271800` | The fee to pay in the native blockchain token (e.g., ETH on Ethereum) for the onchain verification of the report data. With 18 decimals. **Note:** This example fee is not indicative of actual fees. |
For descriptions and data types of other report schemas, see the [Report Schema Overview](/data-streams/reference/report-schema-overview).
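The 18-decimal fixed-point values in the table above can be converted to readable decimals with Go's `math/big`. This is a small sketch; `toDecimal` is a helper defined here for illustration, not part of the SDK:

```go
package main

import (
	"fmt"
	"math/big"
)

// toDecimal renders a fixed-point integer string (as reported by
// crypto v3 streams, which use 18 decimals) as a decimal string.
func toDecimal(raw string, decimals int) string {
	n, ok := new(big.Int).SetString(raw, 10)
	if !ok {
		return "invalid"
	}
	denom := new(big.Int).Exp(big.NewInt(10), big.NewInt(int64(decimals)), nil)
	return new(big.Rat).SetFrac(n, denom).FloatString(decimals)
}

func main() {
	// Benchmark Price from the decoded report above
	fmt.Println(toDecimal("3865052126782320350000", 18))
	// prints: 3865.052126782320350000
}
```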
### Payload for onchain verification
In this tutorial, you logged and decoded the `full_report` payloads to extract the report data. However, in a production environment, you should verify the data to ensure its integrity and authenticity.
Refer to the [Verify report data onchain](/data-streams/tutorials/evm-onchain-report-verification) tutorial to learn more.
## Adapting code for different report schema versions
When working with different versions of [Data Streams reports](/data-streams/reference/report-schema-overview), you'll need to adapt your code to the specific report schema version each stream uses:
1. Import the correct schema version. Examples:
- For v3 schema (as used in this example):
```go
v3 "github.com/smartcontractkit/data-streams-sdk/go/report/v3"
```
- For v8 schema:
```go
v8 "github.com/smartcontractkit/data-streams-sdk/go/report/v8"
```
2. Update the decode function to use the correct schema version. Examples:
- For v3 schema (as used in this example):
```go
decodedReport, err := report.Decode[v3.Data](reportResponse.FullReport)
```
- For v8 schema:
```go
decodedReport, err := report.Decode[v8.Data](reportResponse.FullReport)
```
3. Access fields according to the schema version structure. Refer to the [Report Schemas](/data-streams/reference/report-schema-overview) documentation for complete field references for each version.
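The per-version decoding described above can also be organized as a version-keyed dispatch table. In this sketch, the `decoder` funcs are hypothetical stand-ins for the SDK's generic `report.Decode[v3.Data]` and `report.Decode[v8.Data]` calls, kept stdlib-only so the pattern stands on its own:

```go
package main

import "fmt"

// decoder stands in for a schema-specific decode call such as
// report.Decode[v3.Data] from the Data Streams SDK.
type decoder func(fullReport []byte) (string, error)

// decoders maps a report schema version to its decoder.
var decoders = map[uint16]decoder{
	3: func(b []byte) (string, error) { return "decoded with v3.Data", nil },
	8: func(b []byte) (string, error) { return "decoded with v8.Data", nil },
}

// decodeByVersion dispatches on the schema version (encoded in the
// first two bytes of the stream ID) and decodes the raw report.
func decodeByVersion(version uint16, fullReport []byte) (string, error) {
	d, ok := decoders[version]
	if !ok {
		return "", fmt.Errorf("unsupported schema version v%d", version)
	}
	return d(fullReport)
}

func main() {
	out, err := decodeByVersion(3, nil)
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // prints: decoded with v3.Data
}
```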
## Explanation
### Initializing the client and configuration
The Data Streams client is initialized in two steps:
1. Configure the client with [`streams.Config`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/config.go#L10):
```go
cfg := streams.Config{
ApiKey: os.Getenv("API_KEY"),
ApiSecret: os.Getenv("API_SECRET"),
RestURL: "https://api.testnet-dataengine.chain.link",
Logger: streams.LogPrintf,
}
```
The configuration requires:
- `ApiKey` and `ApiSecret` for authentication (required)
- `RestURL` for the API endpoint (required)
- `Logger` for debugging and error tracking (optional, defaults to `streams.LogPrintf`)
See the [SDK Reference](/data-streams/reference/data-streams-api/go-sdk#config-struct) page for more configuration options.
2. Create the client with [`streams.New`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/client.go#L56):
```go
client, err := streams.New(cfg)
```
The client handles:
- Authentication with HMAC signatures
- Connection management and timeouts
- Error handling and retries
### Fetching reports
The SDK provides two main methods to fetch reports:
1. Latest report for a single stream with [`GetLatestReport`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/client.go#L130):
```go
reportResponse, err := client.GetLatestReport(ctx, feedID)
```
- Takes a context and feed ID
- Returns a single `ReportResponse` with the most recent data
- No timestamp parameter needed
- Useful for real-time price monitoring
2. Latest reports for multiple streams with [`GetReports`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/client.go#L208):
```go
reportResponses, err := client.GetReports(ctx, ids, timestamp)
```
- Takes context, feed IDs array, and Unix timestamp
- Returns array of `ReportResponse`, one per feed ID
- Timestamp determines the point in time for the reports
- Efficient for monitoring multiple assets simultaneously
Each API request automatically:
- Handles authentication with API credentials
- Manages request timeouts via context
- Processes responses into structured types
### Decoding reports
Reports are decoded in two steps:
1. Report decoding with [`report.Decode`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/report/report.go#L30):
```go
decodedReport, err := report.Decode[v3.Data](reportResponse.FullReport)
```
This step:
1. Takes the raw `FullReport` bytes from the response
2. Uses `v3.Data` schema for Crypto streams (use the appropriate schema for your stream)
3. Validates the format and decodes using Go generics
4. Returns a structured report with typed data
2. Data access:
```go
data := decodedReport.Data
price := data.BenchmarkPrice.String() // Convert big number to string
bid := data.Bid.String()
ask := data.Ask.String()
validFrom := data.ValidFromTimestamp // Unix timestamp
expiresAt := data.ExpiresAt // Unix timestamp
// ...additional or different fields depending on the report schema
```
Provides access to:
- Benchmark price, bid, and ask prices (as big numbers)
- **Note:** For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), which also use the V3 schema, the `bid` and `ask` fields contain the same value as `BenchmarkPrice`.
- Fee information (LINK and native token fees)
- Timestamp data (validity period)
- All numeric values require `.String()` for display
### Error handling
The SDK uses Go's standard error handling patterns with some enhancements:
1. Context management:
```go
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
```
- Sets request timeouts for API calls
- `defer cancel()` ensures cleanup of resources
- Same context pattern for both single and multiple reports
2. Error checking:
```go
if err != nil {
cfg.Logger("Failed to decode report: %v\n", err)
os.Exit(1) // Fatal errors: exit the program
// or
continue // Non-fatal errors: skip this report
}
```
- Fatal errors (client creation, no valid feeds) use `os.Exit(1)`
- Non-fatal errors (single report decode) use `continue`
- All errors are logged before handling
3. SDK logging:
```go
cfg.Logger("Raw report data: %+v\n", reportResponse)
```
- Uses configured logger for SDK operations
- `fmt.Printf` for user-facing output
- Debug information includes raw report data
- Structured error messages with context
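The fatal vs. non-fatal pattern described above can be sketched as a standalone loop. Here `decode` is a hypothetical stand-in for the SDK's report decoding, used only to illustrate the control flow:

```go
package main

import (
	"fmt"
	"os"
)

// decode is a hypothetical stand-in for report.Decode: it fails on an
// empty input so the example can demonstrate the non-fatal path.
func decode(raw string) (string, error) {
	if raw == "" {
		return "", fmt.Errorf("empty report")
	}
	return "decoded:" + raw, nil
}

func main() {
	reports := []string{"r1", "", "r3"}
	if len(reports) == 0 {
		fmt.Fprintln(os.Stderr, "no reports to process")
		os.Exit(1) // fatal: nothing to do, exit the program
	}
	for _, r := range reports {
		d, err := decode(r)
		if err != nil {
			fmt.Printf("skipping report: %v\n", err)
			continue // non-fatal: skip this report, keep processing
		}
		fmt.Println(d)
	}
}
```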
The decoded data can be used for further processing or display in your application. For production environments, you must verify the data onchain using the provided `fullReport` payload.
---
# Stream and decode Data Streams reports via WebSocket using the Go SDK
Source: https://docs.chain.link/data-streams/tutorials/go-sdk-stream
In this tutorial, you'll learn how to use the [Data Streams SDK](/data-streams/reference/data-streams-api/go-sdk) for Go to subscribe to real-time [reports](/data-streams/reference/report-schema-overview) via a [WebSocket connection](/data-streams/reference/data-streams-api/interface-ws). You'll set up your Go project, listen for real-time reports from the Data Streams Aggregation Network, decode the report data, and log their attributes to your terminal.
## Requirements
- **Git**: Make sure you have Git installed. You can check your current version by running `git --version` in your terminal and download the latest version from the official [Git website](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) if necessary.
- **Go Version**: Make sure you have Go version 1.21 or higher. You can check your current version by running `go version` in your terminal and download the latest version from the official [Go website](https://go.dev/) if necessary.
- **API Credentials**: Access to Data Streams requires API credentials. If you haven't already, [contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to request mainnet or testnet access.
## Tutorial
First, you'll set up a basic Go project, installing the SDK and pasting example code. This will let you stream reports for [crypto streams](/data-streams/crypto-streams), logging their attributes to your terminal.
### Set up your Go project
1. Create a new directory for your project and navigate to it:
```bash
mkdir my-data-streams-project
cd my-data-streams-project
```
2. Initialize a new Go module:
```bash
go mod init my-data-streams-project
```
3. Install the Data Streams SDK:
```bash
go get github.com/smartcontractkit/data-streams-sdk/go
```
### Establish a WebSocket connection and listen for real-time reports
1. Create a new Go file, `stream.go`, in your project directory:
```bash
touch stream.go
```
2. Insert the following code example and save your `stream.go` file:
```go
package main
import (
"context"
"fmt"
"os"
"time"
streams "github.com/smartcontractkit/data-streams-sdk/go"
feed "github.com/smartcontractkit/data-streams-sdk/go/feed"
report "github.com/smartcontractkit/data-streams-sdk/go/report"
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
v3 "github.com/smartcontractkit/data-streams-sdk/go/report/v3"
)
func main() {
if len(os.Args) < 2 {
fmt.Println("Usage: go run stream.go [StreamID1] [StreamID2] ...")
os.Exit(1)
}
// Set up the SDK client configuration
cfg := streams.Config{
ApiKey: os.Getenv("API_KEY"),
ApiSecret: os.Getenv("API_SECRET"),
WsURL: "wss://ws.testnet-dataengine.chain.link",
Logger: streams.LogPrintf,
}
// Create a new client
client, err := streams.New(cfg)
if err != nil {
cfg.Logger("Failed to create client: %v\n", err)
os.Exit(1)
}
// Parse the feed IDs from the command line arguments
var ids []feed.ID
for _, arg := range os.Args[1:] {
var fid feed.ID
if err := fid.FromString(arg); err != nil {
cfg.Logger("Invalid stream ID %s: %v\n", arg, err)
os.Exit(1)
}
ids = append(ids, fid)
}
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()
// Subscribe to the feeds
stream, err := client.Stream(ctx, ids)
if err != nil {
cfg.Logger("Failed to subscribe: %v\n", err)
os.Exit(1)
}
defer stream.Close()
for {
reportResponse, err := stream.Read(context.Background())
if err != nil {
cfg.Logger("Error reading from stream: %v\n", err)
continue
}
// Log the contents of the report before decoding
cfg.Logger("Raw report data: %+v\n", reportResponse)
// Decode each report as it comes in
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
decodedReport, decodeErr := report.Decode[v3.Data](reportResponse.FullReport)
if decodeErr != nil {
cfg.Logger("Failed to decode report: %v\n", decodeErr)
continue
}
// Log the decoded report
// NOTE: Adjust for your report and desired output
cfg.Logger("\n--- Report Stream ID: %s ---\n" +
"------------------------------------------\n" +
"Observations Timestamp : %d\n" +
"Benchmark Price : %s\n" +
"Bid : %s\n" +
"Ask : %s\n" +
"Valid From Timestamp : %d\n" +
"Expires At : %d\n" +
"Link Fee : %s\n" +
"Native Fee : %s\n" +
"------------------------------------------\n",
reportResponse.FeedID.String(),
decodedReport.Data.ObservationsTimestamp,
decodedReport.Data.BenchmarkPrice.String(),
decodedReport.Data.Bid.String(),
decodedReport.Data.Ask.String(),
decodedReport.Data.ValidFromTimestamp,
decodedReport.Data.ExpiresAt,
decodedReport.Data.LinkFee.String(),
decodedReport.Data.NativeFee.String(),
)
// Also, log the stream stats
cfg.Logger("\n--- Stream Stats ---\n" +
stream.Stats().String() + "\n" +
"--------------------------------------------------------------------------------------------------------------------------------------------\n",
)
}
}
```
3. Download the required dependencies and update the `go.mod` and `go.sum` files:
```bash
go mod tidy
```
4. Set up your API credentials as environment variables:
```bash
export API_KEY=""
export API_SECRET=""
```
Replace the empty values with your actual API key and secret.
The Go script uses `os.Getenv("API_KEY")` and `os.Getenv("API_SECRET")` in its `streams.Config` to read these values. From the code sample above:
```go
cfg := streams.Config{
ApiKey: os.Getenv("API_KEY"),
ApiSecret: os.Getenv("API_SECRET"),
WsURL: "wss://ws.testnet-dataengine.chain.link",
Logger: streams.LogPrintf,
}
```
This configuration also specifies the `WsURL`, which is the [WebSocket URL for the API](/data-streams/reference/data-streams-api/interface-ws#domains). In this case, it is already set to the testnet URL for Data Streams.
See the [SDK Reference](/data-streams/reference/data-streams-api/go-sdk#config-struct) page for more configuration options.
5. Subscribe to a [testnet crypto stream](/data-streams/crypto-streams?page=1\&testnetPage=1#testnet-crypto-streams). The example below executes the application, subscribing to the `ETH/USD` crypto stream:
```bash
go run stream.go 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
```
Expect output similar to the following in your terminal:
```bash
2024-07-31T15:34:27-05:00 Raw report data: {"fullReport":"0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff15780000000000000000000000000000000000000000000000000000000035252f11000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000066aa9fd30000000000000000000000000000000000000000000000000000000066aa9fd300000000000000000000000000000000000000000000000000001c23cdce76d0000000000000000000000000000000000000000000000000001ba0a27c8b79d40000000000000000000000000000000000000000000000000000000066abf1530000000000000000000000000000000000000000000000af35b91cbc421fe2800000000000000000000000000000000000000000000000af354910dbd1830c200000000000000000000000000000000000000000000000af3629289cb2be3f800000000000000000000000000000000000000000000000000000000000000002e03c8b14707a80c59922eeb6b89c79dd8ac6b4b925203b3fe2f0903ba6765934aaf6c4170522c0e54abecb90e7ba7b26e27a83b12740e6a6fd5835c5ece9034c000000000000000000000000000000000000000000000000000000000000000252088e89df570d7022fd2bfc71eb53bfe72423ccba1834a785d80c278b334fab65d4acced307504358554844c2007ab0322b7ab2b7bfa2bc39563bf823654a36","feedID":"0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782","validFromTimestamp":1722458067,"observationsTimestamp":1722458067}
2024-07-31T15:34:27-05:00
--- Report Stream ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 ---
------------------------------------------
Observations Timestamp : 1722458067
Benchmark Price : 3232051369848762000000
Bid : 3232019831592780500000
Ask : 3232082908104743600000
Valid From Timestamp : 1722458067
Expires At : 1722544467
Link Fee : 7776444105849300
Native Fee : 30940102293200
------------------------------------------
2024-07-31T15:34:27-05:00
--- Stream Stats ---
accepted: 1, deduplicated: 0, total_received 1, partial_reconnects: 0, full_reconnects: 0, configured_connections: 1, active_connections 1
--------------------------------------------------------------------------------------------------------------------------------------------
2024-07-31T15:34:28-05:00 Raw report data: {"fullReport":"0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff15780000000000000000000000000000000000000000000000000000000035252f14000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000066aa9fd40000000000000000000000000000000000000000000000000000000066aa9fd400000000000000000000000000000000000000000000000000001c23c416de34000000000000000000000000000000000000000000000000001ba0909c32d3c00000000000000000000000000000000000000000000000000000000066abf1540000000000000000000000000000000000000000000000af35f59d91552300000000000000000000000000000000000000000000000000af34696c66686640800000000000000000000000000000000000000000000000af3c6a5680c2a6000000000000000000000000000000000000000000000000000000000000000000020d3c5953a51793330c4bb5082d0e82eca98281d340d56088b5707dbc77e5c106311585b943ced71c62a3e6b100dc9316c3580354aee92626280228dd9b6a2c3900000000000000000000000000000000000000000000000000000000000000026398ed0026b877ba17280888f1c7f93f42ca4c3148cf33761412af03b19c08880e4ee75f222eb928db5429fc4339aa1e275bf5a5ffeb6345aa0acef594024abc","feedID":"0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782","validFromTimestamp":1722458068,"observationsTimestamp":1722458068}
2024-07-31T15:34:28-05:00
--- Report Stream ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 ---
------------------------------------------
Observations Timestamp : 1722458068
Benchmark Price : 3232068400000000000000
Bid : 3231956881848792400000
Ask : 3232533600000000000000
Valid From Timestamp : 1722458068
Expires At : 1722544468
Link Fee : 7776367327499200
Native Fee : 30939939266100
------------------------------------------
2024-07-31T15:34:28-05:00
--- Stream Stats ---
accepted: 2, deduplicated: 0, total_received 2, partial_reconnects: 0, full_reconnects: 0, configured_connections: 1, active_connections 1
--------------------------------------------------------------------------------------------------------------------------------------------
2024-07-31T15:34:29-05:00 Raw report data: {"fullReport":"0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff15780000000000000000000000000000000000000000000000000000000035252f19000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000066aa9fd50000000000000000000000000000000000000000000000000000000066aa9fd500000000000000000000000000000000000000000000000000001c2232164340000000000000000000000000000000000000000000000000001ba02d9e17e83c0000000000000000000000000000000000000000000000000000000066abf1550000000000000000000000000000000000000000000000af3fbd367bea5ac0000000000000000000000000000000000000000000000000af3f1f78eb5653c0000000000000000000000000000000000000000000000000af405a99196de7800000000000000000000000000000000000000000000000000000000000000000020a7b2c4de654a6fb2e0b9c3706521a94bb852c705fe03e682da43301986c229f99b40a47c34b2d23c51e6323274d68b5c6d0d36dbc02586233d50dfc3ef193700000000000000000000000000000000000000000000000000000000000000002364b1b5d922cfe20faa94011a22324ed452fe17a0c1d1475a468974a39419aae33a027865c4a2738fbd59f2ce3a1c72435054a72084b4802f205fe7a690d1ecc","feedID":"0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782","validFromTimestamp":1722458069,"observationsTimestamp":1722458069}
2024-07-31T15:34:29-05:00
--- Report Stream ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 ---
------------------------------------------
Observations Timestamp : 1722458069
Benchmark Price : 3232773100000000000000
Bid : 3232728700000000000000
Ask : 3232817400000000000000
Valid From Timestamp : 1722458069
Expires At : 1722544469
Link Fee : 7775942157527100
Native Fee : 30933194785600
------------------------------------------
2024-07-31T15:34:29-05:00
--- Stream Stats ---
accepted: 3, deduplicated: 0, total_received 3, partial_reconnects: 0, full_reconnects: 0, configured_connections: 1, active_connections 1
--------------------------------------------------------------------------------------------------------------------------------------------
[...]
```
Your application has successfully subscribed to the report data.
[Learn more about the decoded report details](#decoded-report-details).
### Decoded report details
The decoded report details include:
| Attribute | Value | Description |
| ------------------------ | -------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `Stream ID` | `0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782` | The unique identifier for the stream. In this example, the stream is for ETH/USD. |
| `Observations Timestamp` | `1722458069` | The timestamp indicating when the data was captured. |
| `Benchmark Price` | `3232773100000000000000` | The observed price in the report. For readability: `3,232.7731000000000` USD per ETH. |
| `Bid` | `3232728700000000000000` | The highest price a buyer is willing to pay for an asset. For readability: `3,232.7287000000000` USD per ETH. Learn more about the [Bid price](/data-streams/concepts/liquidity-weighted-prices). |
| `Ask` | `3232817400000000000000` | The lowest price a seller is willing to accept for an asset. For readability: `3,232.8174000000000` USD per ETH. Learn more about the [Ask price](/data-streams/concepts/liquidity-weighted-prices). |
| `Valid From Timestamp` | `1722458069` | The start validity timestamp for the report, indicating when the data becomes relevant. |
| `Expires At` | `1722544469` | The expiration timestamp of the report, indicating the point at which the data becomes outdated. |
| `Link Fee` | `7775942157527100` | The fee to pay in LINK tokens for the onchain verification of the report data. For readability: `0.0077759421575271` LINK. |
| `Native Fee` | `30933194785600` | The fee to pay in the native blockchain token (e.g., ETH on Ethereum) for the onchain verification of the report data. **Note:** This example fee is not indicative of actual fees. |
For descriptions and data types of other report schemas, see the [Report Schema Overview](/data-streams/reference/report-schema-overview).
### Subscribing to multiple streams
You can subscribe to multiple streams by providing additional stream IDs as command-line arguments:
```bash
go run stream.go 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265
```
This will subscribe to both ETH/USD and BTC/USD streams.
### High Availability (HA) mode
The example above demonstrates streaming data from a single crypto stream. For production environments, especially when subscribing to multiple streams, it's recommended to enable [High Availability (HA) mode](/data-streams/reference/data-streams-api/go-sdk#high-availability-ha-mode).
High Availability (HA) mode creates multiple WebSocket connections to different origin endpoints for improved reliability. When HA mode is enabled, the Stream maintains at least two concurrent connections to different instances, ensuring high availability and fault tolerance while minimizing the risk of report gaps.
#### Enabling HA mode
To enable HA mode in your streaming application, make these changes to the basic example:
```go
// ... existing code ...
// Enable HA mode with mainnet endpoint
cfg := streams.Config{
ApiKey: os.Getenv("API_KEY"),
ApiSecret: os.Getenv("API_SECRET"),
WsURL: "wss://ws.dataengine.chain.link", // Use mainnet endpoint for HA mode
WsHA: true, // Enable High Availability mode
Logger: streams.LogPrintf,
}
client, err := streams.New(cfg)
if err != nil {
cfg.Logger("Failed to create client: %v\n", err)
os.Exit(1)
}
// ... existing code ...
// Optional: Change streams subscription to use StreamWithStatusCallback for connection monitoring
stream, err := client.StreamWithStatusCallback(
ctx, ids,
func(isConnected bool, host string, origin string) {
status := "DISCONNECTED"
if isConnected {
status = "CONNECTED"
}
cfg.Logger("Host: %s, Origin: %s, Status: %s\n", host, origin, status)
},
)
// ... existing code ...
```
When `WsHA` is `true`, the SDK automatically discovers multiple origin endpoints behind the single URL and establishes separate connections to each origin. You must also use a mainnet endpoint, as HA mode is not currently supported on testnet.
The optional `StreamWithStatusCallback` can be used to monitor individual connection status. The SDK already provides comprehensive connection logs through `stream.Stats().String()`, so this callback is primarily useful for custom alerting or connection monitoring.
See more details about HA mode in the [SDK Reference](/data-streams/reference/data-streams-api/go-sdk#high-availability-ha-mode).
### Payload for onchain verification
In this tutorial, you logged and decoded the `full_report` payloads to extract the report data. However, in a production environment, you should verify the data to ensure its integrity and authenticity.
Refer to the [Verify report data onchain](/data-streams/tutorials/evm-onchain-report-verification) tutorial to learn more.
## Adapting code for different report schema versions
When working with different versions of [Data Streams reports](/data-streams/reference/report-schema-overview), you'll need to adapt your code to the specific report schema version your stream uses:
1. Import the correct schema version. Examples:
- For v3 schema (as used in this example):
```go
v3 "github.com/smartcontractkit/data-streams-sdk/go/report/v3"
```
- For v8 schema:
```go
v8 "github.com/smartcontractkit/data-streams-sdk/go/report/v8"
```
2. Update the decode function to use the correct schema version. Examples:
- For v3 schema (as used in this example):
```go
decodedReport, err := report.Decode[v3.Data](reportResponse.FullReport)
```
- For v8 schema:
```go
decodedReport, err := report.Decode[v8.Data](reportResponse.FullReport)
```
3. Access fields according to the schema version structure. Refer to the [Report Schemas](/data-streams/reference/report-schema-overview) documentation for complete field references for each version.
## Explanation
### Establishing a WebSocket connection and listening for reports
Your application uses the [Stream](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/client.go#L103) function in the [Data Streams SDK](/data-streams/reference/data-streams-api/go-sdk)'s client package to establish a real-time WebSocket connection with the Data Streams Aggregation Network.
Once the WebSocket connection is established, your application subscribes to one or more streams by passing an array of `feed.IDs` to the `Stream` function. This subscription lets the client receive real-time updates whenever new report data is available for the specified streams.
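Before passing IDs to the SDK, you may want a quick client-side sanity check of their shape: a `0x` prefix followed by 32 bytes of hex, as in the stream IDs used in this tutorial. A minimal sketch using only the standard library (this is a convenience pre-check only; the SDK performs its own parsing in `feed.ID.FromString`):

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// isValidStreamID checks that a stream ID is a 0x-prefixed,
// 32-byte hex string before handing it to the SDK.
func isValidStreamID(s string) bool {
	if !strings.HasPrefix(s, "0x") {
		return false
	}
	b, err := hex.DecodeString(s[2:])
	return err == nil && len(b) == 32
}

func main() {
	fmt.Println(isValidStreamID("0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782")) // true
	fmt.Println(isValidStreamID("0x1234"))                                                              // false
}
```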
### Decoding a report
As data reports arrive via the established WebSocket connection, they are processed in real-time:
- Reading streams: The [`Read`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/stream.go#L292) method on the returned Stream object is continuously called within a loop. This method blocks until new data is available, ensuring that all incoming reports are captured as soon as they are broadcasted.
- Decoding reports: For each received report, the SDK's [`Decode`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/go/report/report.go#L36) function parses and transforms the raw data into a structured format (in this tutorial, `v3.Data` for [crypto streams](/data-streams/crypto-streams)). This decoded data includes data such as the benchmark price and [bid and ask](/data-streams/concepts/liquidity-weighted-prices) prices.
### Handling the decoded data
In this example, the application logs the structured report data to the terminal. However, this data can be used for further processing, analysis, or display in your own application.
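For example, one simple form of further processing is maintaining a moving average over the latest benchmark prices. The sketch below is illustrative and uses `float64` for brevity; production code should keep the report's big-number values:

```go
package main

import "fmt"

// movingAvg maintains a simple moving average over the last n
// decoded benchmark prices.
type movingAvg struct {
	n      int
	window []float64
}

// add appends a price, trims the window to n entries, and returns
// the current average.
func (m *movingAvg) add(p float64) float64 {
	m.window = append(m.window, p)
	if len(m.window) > m.n {
		m.window = m.window[1:]
	}
	sum := 0.0
	for _, v := range m.window {
		sum += v
	}
	return sum / float64(len(m.window))
}

func main() {
	m := &movingAvg{n: 3}
	// Prices drawn from this tutorial's example output, scaled for readability.
	for _, p := range []float64{3232.77, 3232.07, 3232.05} {
		fmt.Printf("%.2f\n", m.add(p))
	}
}
```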
---
# Data Streams Tutorials
Source: https://docs.chain.link/data-streams/tutorials/overview
Explore several tutorials to learn how to use the Data Streams.
## Fetch, Stream, and Decode Reports
- [Fetch and decode reports](/data-streams/tutorials/go-sdk-fetch): Learn how to fetch and decode reports from the Data Streams Aggregation Network, using the [Go](/data-streams/reference/data-streams-api/go-sdk), [Rust](/data-streams/reference/data-streams-api/rust-sdk), or [TypeScript](/data-streams/reference/data-streams-api/ts-sdk) SDKs.
- [Stream and decode reports (WebSocket)](/data-streams/tutorials/go-sdk-stream): Learn how to listen for real-time reports from the Data Streams Aggregation Network, decode the report data, and log their attributes, using the [Go](/data-streams/reference/data-streams-api/go-sdk), [Rust](/data-streams/reference/data-streams-api/rust-sdk), or [TypeScript](/data-streams/reference/data-streams-api/ts-sdk) SDKs.
## Verify Reports
- [EVM onchain](/data-streams/tutorials/evm-onchain-report-verification): Learn how to verify onchain the integrity of reports by confirming their authenticity as signed by the Decentralized Oracle Network (DON).
- [Solana onchain integration](/data-streams/tutorials/solana-onchain-report-verification): Learn how to verify Chainlink Data Streams reports directly within your Solana program using Cross-Program Invocation (CPI) to ensure data integrity.
- [Solana offchain integration](/data-streams/tutorials/solana-offchain-report-verification): Learn how to verify Chainlink Data Streams reports client-side using the Rust SDK on Solana to ensure data authenticity before using it in your application.
---
# Fetch and decode Data Streams reports using the Rust SDK
Source: https://docs.chain.link/data-streams/tutorials/rust-sdk-fetch
In this tutorial, you'll learn how to use the [Data Streams SDK](/data-streams/reference/data-streams-api/rust-sdk) for Rust to fetch and decode [reports](/data-streams/reference/report-schema-overview) from the Data Streams Aggregation Network. You'll set up your Rust project, retrieve reports, decode them, and log their attributes.
## Requirements
- **Rust**: Make sure you have Rust installed. You can install Rust by following the instructions on the official [Rust website](https://www.rust-lang.org/tools/install).
- **API Credentials**: Access to Data Streams requires API credentials. If you haven't already, [contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to request mainnet or testnet access.
## Tutorial
You'll start by setting up your Rust project, installing the SDK and pasting example code. Next, you'll fetch and decode reports for crypto streams and log their attributes to your terminal.
### Set up your Rust project
1. Create a new directory for your project and navigate to it:
```bash
mkdir my-data-streams-project && cd my-data-streams-project
```
2. Initialize a new Rust project:
```bash
cargo init
```
3. Add the following dependencies to your `Cargo.toml` file:
```toml
[dependencies]
chainlink-data-streams-sdk = "1.0.3"
chainlink-data-streams-report = "1.0.3"
tokio = { version = "1.4", features = ["full"] }
hex = "0.4"
```
### Fetch and decode a report with a single stream
1. Replace the contents of `src/main.rs` with the following code:
```rust
use chainlink_data_streams_report::feed_id::ID;
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
use chainlink_data_streams_report::report::{ decode_full_report, v3::ReportDataV3 };
use chainlink_data_streams_sdk::client::Client;
use chainlink_data_streams_sdk::config::Config;
use std::env;
use std::error::Error;
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
// Get feed ID from command line arguments
let args: Vec<String> = env::args().collect();
if args.len() < 2 {
eprintln!("Usage: cargo run -- [FeedID]");
std::process::exit(1);
}
let feed_id_input = &args[1];
// Get API credentials from environment variables
let api_key = env::var("API_KEY").expect("API_KEY must be set");
let api_secret = env::var("API_SECRET").expect("API_SECRET must be set");
// Initialize the configuration
let config = Config::new(
api_key,
api_secret,
"https://api.testnet-dataengine.chain.link".to_string(),
"wss://api.testnet-dataengine.chain.link/ws".to_string()
).build()?;
// Initialize the client
let client = Client::new(config)?;
// Parse the feed ID
let feed_id = ID::from_hex_str(feed_id_input)?;
// Fetch the latest report
let response = client.get_latest_report(feed_id).await?;
println!("\nRaw report data: {:?}\n", response.report);
// Decode the report
let full_report = hex::decode(&response.report.full_report[2..])?;
let (_report_context, report_blob) = decode_full_report(&full_report)?;
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
let report_data = ReportDataV3::decode(&report_blob)?;
// Print decoded report details
// NOTE: Adjust for your report and desired output
println!("\nDecoded Report for Stream ID {}:", feed_id_input);
println!("------------------------------------------");
println!("Observations Timestamp: {}", response.report.observations_timestamp);
println!("Benchmark Price : {}", report_data.benchmark_price);
println!("Bid : {}", report_data.bid);
println!("Ask : {}", report_data.ask);
println!("Valid From Timestamp : {}", response.report.valid_from_timestamp);
println!("Expires At : {}", report_data.expires_at);
println!("Link Fee : {}", report_data.link_fee);
println!("Native Fee : {}", report_data.native_fee);
println!("------------------------------------------");
Ok(())
}
```
2. Set up your API credentials as environment variables:
```bash
export API_KEY=""
export API_SECRET=""
```
Replace the empty values with your actual API key and secret.
The Rust code sample reads these environment variables using `std::env::var("API_KEY")` and `std::env::var("API_SECRET")` when building the client configuration:
```rust
let api_key = std::env::var("API_KEY").expect("API_KEY must be set");
let api_secret = std::env::var("API_SECRET").expect("API_SECRET must be set");
// Initialize the configuration
let config = Config::new(
api_key,
api_secret,
"https://api.testnet-dataengine.chain.link".to_string(),
"wss://api.testnet-dataengine.chain.link/ws".to_string()
).build()?;
```
This configuration also specifies the `rest_url`, which is the base URL for the API, along with the WebSocket endpoint [for subscribing to a streamed data report](rust-sdk-stream). In this example, both are set to the testnet URLs for Data Streams.
See the [Rust SDK Reference](/data-streams/reference/data-streams-api/rust-sdk#configuration-reference) page for more configuration options.
3. Read from a [testnet crypto stream](/data-streams/crypto-streams?page=1\&testnetPage=1#testnet-crypto-streams). The following example runs the application, reading from the `ETH/USD` crypto stream:
```bash
cargo run -- 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
```
Expect output similar to the following in your terminal:
```bash
Raw report data: Report { feed_id: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782, valid_from_timestamp: 1734124400, observations_timestamp: 1734124400, full_report: "0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff1578000000000000000000000000000000000000000000000000000000004f56930f000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028001010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba78200000000000000000000000000000000000000000000000000000000675ca37000000000000000000000000000000000000000000000000000000000675ca3700000000000000000000000000000000000000000000000000000174be1bd8758000000000000000000000000000000000000000000000000000cb326ce8c3ea800000000000000000000000000000000000000000000000000000000675df4f00000000000000000000000000000000000000000000000d3a30bcc15e207c0000000000000000000000000000000000000000000000000d3a1557b5e634060200000000000000000000000000000000000000000000000d3ab99a974ff10f400000000000000000000000000000000000000000000000000000000000000000292bdd75612560e46ed9b0c2437898f81eb0e18b6b902a161b9708e9177175cf3b8ef2b279f230f766fb29306250ee90856516ee349ca42b2d7fb141deb006745000000000000000000000000000000000000000000000000000000000000000221c156e80276827e1bfeb6542ab064dfa958f5be955f516fb62b1c93437472c31cc65fcaba68c9d661701190bc32025a0690af0eefe027ac218fd15c588dd4d5" }
Decoded Report for Stream ID 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782:
------------------------------------------
Observations Timestamp: 1734124400
Benchmark Price : 3904011708000000000000
Bid : 3903888333211164500000
Ask : 3904628100124598400000
Valid From Timestamp : 1734124400
Expires At : 1734210800
Link Fee : 3574678975954600
Native Fee : 25614677280600
------------------------------------------
```
Your application has successfully decoded the report data.
[Learn more about the decoded report details](#decoded-report-details).
#### Decoded report details
The decoded report details include:
| Attribute | Value | Description |
| ------------------------ | -------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `Stream ID` | `0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782` | The unique identifier for the stream. In this example, the stream is for ETH/USD. |
| `Observations Timestamp` | `1734124400` | The timestamp indicating when the data was captured. |
| `Benchmark Price` | `3904011708000000000000` | The observed price in the report, with 18 decimals. For readability: `3,904.0117080000000` USD per ETH. |
| `Bid` | `3903888333211164500000` | The highest price a buyer is willing to pay for an asset, with 18 decimals. For readability: `3,903.8883332111645` USD per ETH. Learn more about the [Bid price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| `Ask` | `3904628100124598400000` | The lowest price a seller is willing to accept for an asset, with 18 decimals. For readability: `3,904.6281001245984` USD per ETH. Learn more about the [Ask price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| `Valid From Timestamp` | `1734124400` | The start validity timestamp for the report, indicating when the data becomes relevant. |
| `Expires At` | `1734210800` | The expiration timestamp of the report, indicating the point at which the data becomes outdated. |
| `Link Fee` | `3574678975954600` | The fee to pay in LINK tokens for the onchain verification of the report data. With 18 decimals. For readability: `0.003574678975954600` LINK. **Note:** This example fee is not indicative of actual fees. |
| `Native Fee` | `25614677280600` | The fee to pay in the native blockchain token (e.g., ETH on Ethereum) for the onchain verification of the report data. With 18 decimals. For readability: `0.0000256146772806000` ETH. **Note:** This example fee is not indicative of actual fees. |
For descriptions and data types of other report schemas, see the [Report Schema Overview](/data-streams/reference/report-schema-overview).
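The raw integer values in the table use 18-decimal fixed-point encoding. As a minimal, std-only sketch of how you might render them for display (the `to_decimal_string` helper is illustrative and not part of the SDK, which exposes these fields as its own numeric types):

```rust
// Illustrative helper (not part of the SDK): render an integer with a
// fixed number of implied decimal places as a decimal string.
fn to_decimal_string(raw: u128, decimals: u32) -> String {
    let base = 10u128.pow(decimals);
    // Zero-pad the fractional part to the full number of decimal places.
    format!("{}.{:0width$}", raw / base, raw % base, width = decimals as usize)
}

fn main() {
    // Benchmark Price from the decoded report above (18 decimals)
    let price: u128 = 3_904_011_708_000_000_000_000;
    println!("{} USD per ETH", to_decimal_string(price, 18));
    // -> 3904.011708000000000000 USD per ETH
}
```

For production use, prefer a decimal arithmetic crate over string formatting if you need to do math on these values, since floating-point conversion loses precision at 18 decimals.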
#### Payload for onchain verification
In this tutorial, you logged and decoded the `full_report` payloads to extract the report data. However, in a production environment, you should verify the data to ensure its integrity and authenticity.
Refer to the [Verify report data onchain](/data-streams/tutorials/evm-onchain-report-verification) tutorial to learn more.
## Adapting code for different report schema versions
When working with different versions of [Data Stream reports](/data-streams/reference/report-schema-overview), you'll need to adapt your code to the specific report schema version your stream uses:
1. Import the correct schema version module. Examples:
- For v3 schema (as used in this example):
```rust
use chainlink_data_streams_report::report::{ decode_full_report, v3::ReportDataV3 };
```
- For v8 schema:
```rust
use chainlink_data_streams_report::report::{ decode_full_report, v8::ReportDataV8 };
```
2. Update the decode function to use the correct schema version. Examples:
- For v3 schema (as used in this example):
```rust
let report_data = ReportDataV3::decode(&report_blob)?;
```
- For v8 schema:
```rust
let report_data = ReportDataV8::decode(&report_blob)?;
```
3. Access fields according to the schema version structure. Refer to the [Report Schemas](/data-streams/reference/report-schema-overview) documentation for complete field references for each version.
## Explanation
### Initializing the API client and configuration
The API client is initialized in two steps:
1. [`Config::new`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/config.rs#L131) creates a configuration with your API credentials and endpoints. This function:
- Validates your API key and secret
- Sets up the REST API endpoint for data retrieval
- Configures optional settings like TLS verification
2. [`Client::new`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/client.rs#L131) creates the HTTP client with your configuration. This client:
- Handles authentication automatically
- Manages HTTP connections
- Implements retry logic for failed requests
### Fetching reports
The SDK provides several methods to fetch reports through the REST API:
1. Latest report: [`get_latest_report`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/client.rs#L165) retrieves the most recent report for a feed:
- Takes a feed ID as input
- Returns a single report with the latest timestamp
- Useful for applications that need the most current data
2. Historical report: [`get_report`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/client.rs#L242) fetches a report at a specific timestamp:
- Takes both feed ID and timestamp
- Returns the report closest to the requested timestamp
- Helpful for historical analysis or verification
Each API request automatically:
- Generates HMAC authentication headers
- Includes proper timestamps
- Handles HTTP response status codes
- Deserializes the JSON response into Rust structures
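As a rough illustration of the HMAC signing step, the client concatenates request details into a single string and signs it with your API secret. The exact field order below is an assumption for illustration only; the SDK builds these headers for you, so rely on it (or the API interface reference) rather than hand-rolling authentication:

```rust
// Rough illustration only: the SDK generates authentication headers
// automatically. The field order here is an assumed sketch of the
// HMAC-SHA256 input, not an authoritative specification.
fn string_to_sign(
    method: &str,
    path: &str,
    body_hash_hex: &str,
    client_id: &str,
    timestamp_ms: u64,
) -> String {
    format!("{method} {path} {body_hash_hex} {client_id} {timestamp_ms}")
}

fn main() {
    // Hypothetical values for demonstration
    let msg = string_to_sign("GET", "/api/v1/reports/latest", "e3b0c442", "my-client-id", 1_734_124_400_000);
    println!("{msg}");
    // -> GET /api/v1/reports/latest e3b0c442 my-client-id 1734124400000
}
```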
### Decoding reports
Reports are decoded in three stages:
1. Hex decoding: The `full_report` field comes as a hex string prefixed with "0x":
```rust
let full_report = hex::decode(&response.report.full_report[2..])?;
```
2. Report separation: [`decode_full_report`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/report/src/report.rs#L83) splits the binary data:
- Extracts the report context (metadata)
- Isolates the report blob (actual data)
- Validates the report format
3. Data extraction: [`ReportDataV3::decode`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/report/src/report/v3.rs#L80) parses the report blob into structured data:
- Benchmark price
- Bid and ask prices for [liquidity-weighted pricing](/data-streams/concepts/liquidity-weighted-prices)
- **Note:** For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), which also use the V3 schema, the `bid` and `ask` fields contain the same value as `benchmark_price`.
- Fee information for onchain verification
- Timestamp information
### Error handling
The example demonstrates Rust's robust error handling:
1. Type-safe errors:
- Uses custom error types for different failure scenarios
- Implements the `Error` trait for proper error propagation
- Provides detailed error messages for debugging
2. Error propagation:
- Uses the `?` operator for clean error handling
- Converts errors between types when needed
- Bubbles up errors to the main function
3. Input validation:
- Checks command-line arguments
- Validates environment variables
- Verifies feed ID format
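The validation and error-propagation pattern can be sketched with a self-contained example. The `FeedIdError` type and `validate_feed_id` helper below are hypothetical illustrations, not the SDK's own error types (the SDK performs this validation internally via `ID::from_hex_str`):

```rust
use std::error::Error;
use std::fmt;

// Hypothetical error type for illustration; the SDK defines its own.
#[derive(Debug)]
struct FeedIdError(String);

impl fmt::Display for FeedIdError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "invalid stream ID: {}", self.0)
    }
}

impl Error for FeedIdError {}

// A stream ID is "0x" followed by 64 hex characters (32 bytes).
fn validate_feed_id(s: &str) -> Result<&str, FeedIdError> {
    let hex_part = s.strip_prefix("0x").ok_or_else(|| FeedIdError(s.into()))?;
    if hex_part.len() == 64 && hex_part.chars().all(|c| c.is_ascii_hexdigit()) {
        Ok(s)
    } else {
        Err(FeedIdError(s.into()))
    }
}

fn main() -> Result<(), Box<dyn Error>> {
    // The `?` operator propagates the custom error up to main.
    let id = validate_feed_id("0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782")?;
    println!("valid stream ID: {id}");
    Ok(())
}
```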
The decoded data can be used for further processing, analysis, or display in your application. For production environments, you must verify the data onchain using the provided `full_report` payload.
---
# Stream and decode Data Streams reports via WebSocket using the Rust SDK
Source: https://docs.chain.link/data-streams/tutorials/rust-sdk-stream
In this tutorial, you'll learn how to use the [Data Streams SDK](/data-streams/reference/data-streams-api/rust-sdk) for Rust to subscribe to real-time [reports](/data-streams/reference/report-schema-overview) via a [WebSocket connection](/data-streams/reference/data-streams-api/interface-ws). You'll set up your Rust project, listen for real-time reports from the Data Streams Aggregation Network, decode the report data, and log their attributes to your terminal.
## Requirements
- **Rust**: Make sure you have Rust installed. You can install Rust by following the instructions on the official [Rust website](https://www.rust-lang.org/tools/install).
- **API Credentials**: Access to Data Streams requires API credentials. If you haven't already, [contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to request mainnet or testnet access.
## Tutorial
### Set up your Rust project
1. Create a new directory for your project and navigate to it:
```bash
mkdir my-data-streams-project && cd my-data-streams-project
```
2. Initialize a new Rust project:
```bash
cargo init
```
3. Add the following dependencies to your `Cargo.toml` file:
```toml
[dependencies]
chainlink-data-streams-sdk = "1.0.3"
chainlink-data-streams-report = "1.0.3"
tokio = { version = "1.4", features = ["full"] }
hex = "0.4"
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["time"] }
```
Note: The `tracing` feature is required for logging functionality.
### Establish a WebSocket connection and listen for real-time reports
1. Replace the contents of `src/main.rs` with the following code:
```rust
use chainlink_data_streams_report::feed_id::ID;
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
use chainlink_data_streams_report::report::{ decode_full_report, v3::ReportDataV3 };
use chainlink_data_streams_sdk::config::Config;
use chainlink_data_streams_sdk::stream::Stream;
use std::env;
use std::error::Error;
use tracing::{ info, warn };
use tracing_subscriber::fmt::time::UtcTime;
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
// Initialize logging with UTC timestamps
tracing_subscriber
::fmt()
.with_timer(UtcTime::rfc_3339())
.with_max_level(tracing::Level::INFO)
.init();
// Get feed IDs from command line arguments
let args: Vec<String> = env::args().collect();
if args.len() < 2 {
eprintln!("Usage: cargo run [StreamID1] [StreamID2] ...");
std::process::exit(1);
}
// Get API credentials from environment variables
let api_key = env::var("API_KEY").expect("API_KEY must be set");
let api_secret = env::var("API_SECRET").expect("API_SECRET must be set");
// Parse feed IDs from command line arguments
let mut feed_ids = Vec::new();
for arg in args.iter().skip(1) {
let feed_id = ID::from_hex_str(arg)?;
feed_ids.push(feed_id);
}
// Initialize the configuration
let config = Config::new(
api_key,
api_secret,
"https://api.testnet-dataengine.chain.link".to_string(),
"wss://ws.testnet-dataengine.chain.link".to_string()
).build()?;
// Create and initialize the stream
let mut stream = Stream::new(&config, feed_ids).await?;
stream.listen().await?;
info!("WebSocket connection established. Listening for reports...");
// Process incoming reports
loop {
match stream.read().await {
Ok(response) => {
info!("\nRaw report data: {:?}\n", response.report);
// Decode the report
let full_report = hex::decode(&response.report.full_report[2..])?;
let (_report_context, report_blob) = decode_full_report(&full_report)?;
// NOTE: Use the report version (v3, v8, etc.) that matches your stream
let report_data = ReportDataV3::decode(&report_blob)?;
// Print decoded report details
// NOTE: Adjust for your report and desired output
info!(
"\n--- Report Stream ID: {} ---\n\
------------------------------------------\n\
Observations Timestamp : {}\n\
Price : {}\n\
Bid : {}\n\
Ask : {}\n\
Valid From Timestamp : {}\n\
Expires At : {}\n\
Link Fee : {}\n\
Native Fee : {}\n\
------------------------------------------",
response.report.feed_id.to_hex_string(),
response.report.observations_timestamp,
report_data.benchmark_price,
report_data.bid,
report_data.ask,
response.report.valid_from_timestamp,
report_data.expires_at,
report_data.link_fee,
report_data.native_fee
);
// Print stream stats
info!(
"\n--- Stream Stats ---\n{:#?}\n\
--------------------------------------------------------------------------------------------------------------------------------------------",
stream.get_stats()
);
}
Err(e) => {
warn!("Error reading from stream: {:?}", e);
}
}
}
// Note: In a production environment, you should implement proper cleanup
// by calling stream.close() when the application is terminated.
// For example:
//
// tokio::select! {
// _ = tokio::signal::ctrl_c() => {
// info!("Received shutdown signal");
// stream.close().await?;
// }
// result = stream.read() => {
// // Process result
// }
// }
}
```
2. Set up your API credentials as environment variables:
```bash
export API_KEY=""
export API_SECRET=""
```
Replace the empty strings with your API key and API secret.
The Rust code sample reads these environment variables using `std::env::var("API_KEY")` and `std::env::var("API_SECRET")` when building the client configuration:
```rust
let api_key = std::env::var("API_KEY").expect("API_KEY must be set");
let api_secret = std::env::var("API_SECRET").expect("API_SECRET must be set");
// Initialize the configuration
let config = Config::new(
api_key,
api_secret,
"https://api.testnet-dataengine.chain.link".to_string(),
"wss://ws.testnet-dataengine.chain.link".to_string()
).build()?;
```
This configuration also specifies the `rest_url`, which is the base URL for the API, along with the WebSocket endpoint [for subscribing to a streamed data report](rust-sdk-stream). In this example, both are set to the testnet URLs for Data Streams.
See the [Rust SDK Reference](/data-streams/reference/data-streams-api/rust-sdk#configuration-reference) page for more configuration options.
3. Subscribe to a [testnet crypto stream](/data-streams/crypto-streams?page=1\&testnetPage=1#testnet-crypto-streams). The following example runs the application, subscribing to the `ETH/USD` crypto stream:
```bash
cargo run -- 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
```
Expect output similar to the following in your terminal:
```bash
2024-12-13T23:07:56.463719Z INFO my_data_streams_project: WebSocket connection established. Listening for reports...
2024-12-13T23:07:56.463824Z INFO data_streams_sdk::stream::monitor_connection: Received ping: [49]
2024-12-13T23:07:56.463868Z INFO data_streams_sdk::stream::monitor_connection: Responding with pong: [49]
2024-12-13T23:07:57.060504Z INFO data_streams_sdk::stream::monitor_connection: Received new report from Data Streams Endpoint.
2024-12-13T23:07:57.061078Z INFO my_data_streams_project:
Raw report data: Report { feed_id: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782, valid_from_timestamp: 1734131277, observations_timestamp: 1734131277, full_report: "0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff1578000000000000000000000000000000000000000000000000000000004f5ac90d000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028001010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba78200000000000000000000000000000000000000000000000000000000675cbe4d00000000000000000000000000000000000000000000000000000000675cbe4d000000000000000000000000000000000000000000000000000017489b06e8fc000000000000000000000000000000000000000000000000000c7badb615bd1400000000000000000000000000000000000000000000000000000000675e0fcd0000000000000000000000000000000000000000000000d3c0d34ca0d14d85600000000000000000000000000000000000000000000000d3bda64c97c9f3a3a00000000000000000000000000000000000000000000000d3c1a08e0cffd77690000000000000000000000000000000000000000000000000000000000000000238102110cad488ecf151a17276fcfad6ef1f05593edfe80f6823b729416f826972ba32d085525b1d7ab79e6ae8188928c86051a4fc75f500bffabda2acd1d1f900000000000000000000000000000000000000000000000000000000000000024dddbc660abf75c30cb3c2aa375c87d228b2ee8735e339f59c5214897c0b89af39a7602df754364cce029f6eb7699ee02ffded96d0c46b5919e81ee4f650d1cb" }
2024-12-13T23:07:57.062344Z INFO my_data_streams_project:
--- Report Stream ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 ---
------------------------------------------
Observations Timestamp : 1734131277
Price : 3906157533081673500000
Bid : 3905928693886829700000
Ask : 3906215307384792250000
Valid From Timestamp : 1734131277
Expires At : 1734217677
Link Fee : 3513685734964500
Native Fee : 25600606005500
------------------------------------------
2024-12-13T23:07:57.062489Z INFO my_data_streams_project:
--- Stream Stats ---
StatsSnapshot {
accepted: 1,
deduplicated: 0,
total_received: 1,
partial_reconnects: 0,
full_reconnects: 0,
configured_connections: 1,
active_connections: 1,
}
--------------------------------------------------------------------------------------------------------------------------------------------
2024-12-13T23:07:58.065686Z INFO data_streams_sdk::stream::monitor_connection: Received new report from Data Streams Endpoint.
2024-12-13T23:07:58.066315Z INFO my_data_streams_project:
Raw report data: Report { feed_id: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782, valid_from_timestamp: 1734131278, observations_timestamp: 1734131278, full_report: "0x0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff1578000000000000000000000000000000000000000000000000000000004f5ac911000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba78200000000000000000000000000000000000000000000000000000000675cbe4e00000000000000000000000000000000000000000000000000000000675cbe4e00000000000000000000000000000000000000000000000000001748ee300af4000000000000000000000000000000000000000000000000000c7b8b51304fbc00000000000000000000000000000000000000000000000000000000675e0fce0000000000000000000000000000000000000000000000d3bddf08d0b10a28e00000000000000000000000000000000000000000000000d3bb84af9f92f963c00000000000000000000000000000000000000000000000d3bf1d6bf14e501fc000000000000000000000000000000000000000000000000000000000000000021402b6b82c20826315384d74b3235b95f136ac65bba8c9e97c24d786e499894f298b51ae4aeba55cce0f85f2463e49e0a5e001b9a66f5b7b91e8be37d81d6cc5000000000000000000000000000000000000000000000000000000000000000217895cb599abc88d7b695edafed5ca5a5fc970f079b48bc2218888eec1fcccb0430c1ba2aa13b0d10f6c6b19a43cdb770029f4fb5804b0e2ef5ba3e73ca710f8" }
2024-12-13T23:07:58.067395Z INFO my_data_streams_project:
--- Report Stream ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 ---
------------------------------------------
Observations Timestamp : 1734131278
Price : 3905944663438106700000
Bid : 3905775117434634200000
Ask : 3906034281472429400000
Valid From Timestamp : 1734131278
Expires At : 1734217678
Link Fee : 3513538013319100
Native Fee : 25602001210100
------------------------------------------
2024-12-13T23:07:58.067633Z INFO my_data_streams_project:
--- Stream Stats ---
StatsSnapshot {
accepted: 2,
deduplicated: 0,
total_received: 2,
partial_reconnects: 0,
full_reconnects: 0,
configured_connections: 1,
active_connections: 1,
}
[...]
```
Your application has successfully subscribed to the report data.
[Learn more about the decoded report details](#decoded-report-details).
### Subscribing to multiple streams
You can subscribe to multiple streams by providing additional stream IDs as command-line arguments:
```bash
cargo run -- \
0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 \
0x00037da06d56d083fe599397a4769a042d63aa73dc4ef57709d31e9971a5b439
```
This will subscribe to both ETH/USD and BTC/USD streams.
### High Availability (HA) mode
The example above demonstrates streaming data from a single crypto stream. For production environments, especially when subscribing to multiple streams, it's recommended to enable [High Availability (HA) mode](/data-streams/reference/data-streams-api/rust-sdk#high-availability-mode).
High Availability (HA) mode creates multiple WebSocket connections to different origin endpoints for improved reliability. When HA mode is enabled, the stream maintains at least two concurrent connections to different instances to ensure high availability and fault tolerance, and to minimize the risk of report gaps.
#### Enabling HA mode
To enable HA mode in your streaming application, make the following changes to the basic example. You must also use a mainnet endpoint, because HA mode is not currently supported on testnet.
```rust
// ... existing code ...
use chainlink_data_streams_sdk::config::{Config, WebSocketHighAvailability}; // Import WebSocketHighAvailability
// ... existing code ...
// Initialize the configuration with HA mode
let config = Config::new(
api_key,
api_secret,
"https://api.dataengine.chain.link".to_string(), // Mainnet endpoint
"wss://ws.dataengine.chain.link,wss://ws.dataengine.chain.link".to_string(), // Multiple WebSocket endpoints
)
.with_ws_ha(WebSocketHighAvailability::Enabled) // Enable WebSocket High Availability Mode
.build()?;
// ... existing code ...
```
See more details about HA mode in the [SDK Reference](/data-streams/reference/data-streams-api/rust-sdk#high-availability-mode).
### Decoded report details
The decoded report details include:
| Attribute | Value | Description |
| ------------------------ | -------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `Stream ID` | `0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782` | The unique identifier for the stream. In this example, the stream is for ETH/USD. |
| `Observations Timestamp` | `1734131277` | The timestamp indicating when the data was captured. |
| `Price` | `3906157533081673500000` | The observed price in the report. For readability: `3,906.1575330816735` USD per ETH. |
| `Bid` | `3905928693886829700000` | The highest price a buyer is willing to pay for an asset. For readability: `3,905.9286938868297` USD per ETH. Learn more about the [Bid price](/data-streams/concepts/liquidity-weighted-prices). |
| `Ask` | `3906215307384792250000` | The lowest price a seller is willing to accept for an asset. For readability: `3,906.2153073847923` USD per ETH. Learn more about the [Ask price](/data-streams/concepts/liquidity-weighted-prices). |
| `Valid From Timestamp` | `1734131277` | The start validity timestamp for the report, indicating when the data becomes relevant. |
| `Expires At` | `1734217677` | The expiration timestamp of the report, indicating the point at which the data becomes outdated. |
| `Link Fee` | `3513685734964500` | The fee to pay in LINK tokens for the onchain verification of the report data. For readability: `0.0035136857349645` LINK. **Note:** This example fee is not indicative of actual fees. |
| `Native Fee` | `25600606005500` | The fee to pay in the native blockchain token (e.g., ETH on Ethereum) for the onchain verification of the report data. For readability: `0.0000256006060055` ETH. **Note:** This example fee is not indicative of actual fees. |
For descriptions and data types of other report schemas, see the [Report Schema Overview](/data-streams/reference/report-schema-overview).
### Payload for onchain verification
In this tutorial, you logged and decoded the `full_report` payloads to extract the report data. However, in a production environment, you should verify the data to ensure its integrity and authenticity.
Refer to the [Verify report data onchain](/data-streams/tutorials/evm-onchain-report-verification) tutorial to learn more.
## Adapting code for different report schema versions
When working with different versions of [Data Stream reports](/data-streams/reference/report-schema-overview), you'll need to adapt your code to the specific report schema version your stream uses:
1. Import the correct schema version module. Examples:
- For v3 schema (as used in this example):
```rust
use chainlink_data_streams_report::report::{ decode_full_report, v3::ReportDataV3 };
```
- For v8 schema:
```rust
use chainlink_data_streams_report::report::{ decode_full_report, v8::ReportDataV8 };
```
2. Update the decode function to use the correct schema version. Examples:
- For v3 schema (as used in this example):
```rust
let report_data = ReportDataV3::decode(&report_blob)?;
```
- For v8 schema:
```rust
let report_data = ReportDataV8::decode(&report_blob)?;
```
3. Access fields according to the schema version structure. Refer to the [Report Schemas](/data-streams/reference/report-schema-overview) documentation for complete field references for each version.
## Explanation
### Establishing a WebSocket connection and listening for reports
The WebSocket connection is established in two steps:
1. [`Stream::new`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/stream.rs#L131) initializes a new stream instance with your configuration and feed IDs. This function prepares the connection parameters but doesn't establish the connection yet.
2. [`stream.listen()`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/stream.rs#L162) establishes the actual WebSocket connection and starts the background tasks that maintain the connection. These tasks handle:
- Automatic reconnection if the connection is lost
- Ping/pong messages to keep the connection alive
- Message queueing and delivery
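The SDK's reconnection logic is internal, but the general idea behind spacing out reconnect attempts can be sketched with a capped exponential backoff. This helper is illustrative only and is not the SDK's actual implementation:

```rust
// Illustrative only: capped exponential backoff between reconnect attempts.
// Doubles the delay on each attempt, up to a fixed ceiling.
fn backoff_ms(attempt: u32, base_ms: u64, cap_ms: u64) -> u64 {
    base_ms.saturating_mul(1u64 << attempt.min(16)).min(cap_ms)
}

fn main() {
    for attempt in 0..8 {
        println!("attempt {attempt}: wait {} ms", backoff_ms(attempt, 100, 5_000));
    }
    // Delays grow 100, 200, 400, ... and are capped at 5000 ms.
}
```

In practice you would also add jitter so that many clients reconnecting at once don't retry in lockstep.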
### Decoding a report
As data reports arrive via the WebSocket connection, they are processed in real-time through several steps:
1. Reading streams: The [`read`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/stream.rs#L218) method on the Stream object is called within a loop. This asynchronous method:
- Awaits the next report from the WebSocket connection
- Handles backpressure automatically
- Returns a [`WebSocketReport`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/stream.rs#L51) containing the report data
2. Decoding reports: Each report is decoded in two stages:
- [`decode_full_report`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/report/src/report.rs#L83) parses the raw hexadecimal data, separating the report context (containing metadata) from the report blob
- [`ReportDataV3::decode`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/report/src/report/v3.rs#L80) transforms the report blob into a structured format containing:
- The benchmark price
- Bid and ask prices for [liquidity-weighted pricing](/data-streams/concepts/liquidity-weighted-prices)
- Fee information for onchain verification
- Timestamp information
### Handling the decoded data
The example demonstrates several best practices for handling the decoded data:
1. Logging:
- Uses the [`tracing`](https://github.com/tokio-rs/tracing) crate for structured logging
- Configures UTC timestamps for consistent time representation
- Includes both raw report data and decoded fields for debugging
2. Error handling:
- Uses Rust's `Result` type for robust error handling
- Implements the `?` operator for clean error propagation
- Logs errors with appropriate context using `warn!` macro
3. Stream monitoring:
- Tracks stream statistics through [`get_stats()`](https://github.com/smartcontractkit/data-streams-sdk/blob/main/rust/crates/sdk/src/stream.rs#L253)
- Monitors connection status and reconnection attempts
- Reports message acceptance and deduplication counts
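For example, the snapshot fields shown in the stream stats output above can drive a simple health check. The `StatsSnapshot` struct below is a hand-written mirror of those fields for illustration; in practice, use the type returned by the SDK's `get_stats()`, and note that the "no unexplained loss" invariant is an assumption made for this sketch:

```rust
// Hand-written mirror of the fields shown in the stream stats output;
// the real type comes from the SDK.
struct StatsSnapshot {
    accepted: u64,
    deduplicated: u64,
    total_received: u64,
    configured_connections: u32,
    active_connections: u32,
}

// Illustrative health check: all configured connections are up, and every
// received message was either accepted or deduplicated.
fn is_healthy(s: &StatsSnapshot) -> bool {
    s.active_connections == s.configured_connections
        && s.total_received == s.accepted + s.deduplicated
}

fn main() {
    // Values taken from the example output above
    let snap = StatsSnapshot {
        accepted: 2,
        deduplicated: 0,
        total_received: 2,
        configured_connections: 1,
        active_connections: 1,
    };
    println!("healthy: {}", is_healthy(&snap));
}
```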
The decoded data can be used for further processing, analysis, or display in your application. For production environments, it's recommended to verify the data onchain using the provided `full_report` payload.
---
# Verify report data - Offchain integration (Solana)
Source: https://docs.chain.link/data-streams/tutorials/solana-offchain-report-verification
To verify a Data Streams report, you must confirm the report integrity signed by the Decentralized Oracle Network (DON).
You have two options to verify Data Streams reports on Solana:
1. **Onchain integration**: Verify reports directly within your Solana program using [Cross-Program Invocation (CPI)](https://solana.com/docs/core/cpi) to the verifier program. Learn more about this method in the [onchain verification tutorial](/data-streams/tutorials/solana-onchain-report-verification).
2. **Offchain integration**: Verify reports client-side using an SDK. You'll learn how to implement this method in this tutorial.
Both methods use the same underlying verification logic and security guarantees, differing only in where the verification occurs.
## Offchain integration
The offchain integration allows you to verify the authenticity of Data Streams reports from your client-side application. While this method requires sending a transaction to the verifier program, the verification logic and processing of results happens in your client application rather than in a Solana program.
In this tutorial, you'll learn how to:
- Set up an Anchor project
- Configure the necessary dependencies
- Create a command-line tool to verify reports
- Process and display the verified report data
### Prerequisites
Before you begin, you should have:
- Familiarity with [Rust](https://www.rust-lang.org/learn) programming
- Understanding of [Solana](https://solana.com/docs) concepts:
- [RPC Clients](https://solana.com/docs/rpc) and network interaction
- [Transactions](https://solana.com/docs/core/transactions) and signing
- [Keypairs](https://solana.com/docs/intro/wallets#keypair) and account management
- An allowlisted account in the Data Streams Access Controller ([contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to get started)
### Requirements
To complete this tutorial, you'll need:
- **Rust and Cargo**: Install the latest version using [rustup](https://rustup.rs/). Run `rustc --version` to verify your installation.
- **Solana CLI tools**: Install the latest version following the [official guide](https://docs.solana.com/cli/install-solana-cli-tools). Run `solana --version` to verify your installation.
- **Anchor Framework**: Follow the [official installation guide](https://www.anchor-lang.com/docs/installation). Run `anchor --version` to verify your installation.
- **Devnet SOL**: You'll need devnet SOL for transaction fees. Use the [Solana CLI](https://docs.solana.com/cli/transfer-tokens#airdrop-some-tokens-to-get-started) or the [Solana Faucet](https://faucet.solana.com/) to get devnet SOL. Check your balance with `solana balance`.
- **Allowlisted Account**: Your account must be allowlisted in the Data Streams Access Controller.
> **Note**: While this tutorial uses the Anchor framework for project structure, you can integrate the verification using any Rust-based Solana project setup. The verifier SDK and client libraries are written in Rust, but you can integrate them into your preferred Rust project structure.
### Implementation tutorial
#### 1. Create a new Anchor project
1. Create a new Anchor project:
```bash
anchor init example_verify
cd example_verify
```
2. Create a binary target for the verification tool:
```bash
mkdir -p programs/example_verify/src/bin
touch programs/example_verify/src/bin/main.rs
```
#### 2. Configure your project's dependencies
Update your program's manifest file (`programs/example_verify/Cargo.toml`):
```toml
[package]
name = "example_verify"
version = "0.1.0"
description = "Created with Anchor"
edition = "2021"
[lib]
crate-type = ["cdylib", "lib"]
[[bin]]
name = "example_verify"
path = "src/bin/main.rs"
[features]
no-entrypoint = []
no-idl = []
no-log-ix-name = []
cpi = ["no-entrypoint"]
default = []
[dependencies]
data-streams-report = { git = "https://github.com/smartcontractkit/data-streams-sdk.git" }
sdk-off-chain = { git = "https://github.com/smartcontractkit/smart-contract-examples.git", branch = "data-streams-solana-integration", package = "sdk-off-chain"}
solana-program = "1.18.26"
solana-sdk = "1.18.26"
solana-client = "1.18.26"
hex = "0.4.3"
borsh = "0.10.3"
```
#### 3. Implement the verification library
Create `programs/example_verify/src/lib.rs` with the verification function:
```rust
// NOTE: Adjust for your desired report
use data_streams_report::report::v3::ReportDataV3;
use sdk_off_chain::VerificationClient;
use solana_client::rpc_client::RpcClient;
use solana_sdk::{
commitment_config::CommitmentConfig,
pubkey::Pubkey,
signature::read_keypair_file,
signer::Signer,
};
use std::{ path::PathBuf, str::FromStr };
pub fn default_keypair_path() -> String {
let mut path = PathBuf::from(std::env::var("HOME").unwrap_or_else(|_| ".".to_string()));
path.push(".config/solana/id.json");
path.to_str().unwrap().to_string()
}
pub fn verify_report(
signed_report: &[u8],
program_id: &str,
access_controller: &str
// NOTE: Adjust for your desired report
) -> Result<ReportDataV3, Box<dyn std::error::Error>> {
// Initialize RPC client with confirmed commitment level
let rpc_client = RpcClient::new_with_commitment(
"https://api.devnet.solana.com",
CommitmentConfig::confirmed()
);
// Load the keypair that will pay for and sign verification transactions
let payer = read_keypair_file(default_keypair_path())?;
println!("Using keypair: {}", payer.pubkey());
// Convert to Pubkey
let program_pubkey = Pubkey::from_str(program_id)?;
let access_controller_pubkey = Pubkey::from_str(access_controller)?;
println!("Program ID: {}", program_pubkey);
println!("Access Controller: {}", access_controller_pubkey);
// Create a verification client instance
let client = VerificationClient::new(
program_pubkey,
access_controller_pubkey,
rpc_client,
payer
);
// Verify the report
println!("Verifying report of {} bytes...", signed_report.len());
let result = client.verify(signed_report.to_vec()).map_err(|e| {
println!("Verification error: {:?}", e);
e
})?;
// Decode the returned data into a ReportDataV3 struct
let return_data = result.return_data.ok_or("No return data")?;
// NOTE: Adjust for your desired report
let report = ReportDataV3::decode(&return_data)?;
Ok(report)
}
```
> This example uses the [V3 schema](/data-streams/reference/report-schema-v3) for [crypto streams](/data-streams/crypto-streams) to decode the report. If you verify reports for [RWA streams](/data-streams/rwa-streams), import and use the [V8 schema](/data-streams/reference/report-schema-v8) from the [report crate](https://github.com/smartcontractkit/data-streams-sdk/tree/main/rust/crates/report) instead.
#### 4. Create the command-line interface
Create `programs/example_verify/src/bin/main.rs`:
```rust
use example_verify::verify_report;
use std::env;
use std::str::FromStr;
use hex;
use solana_sdk::pubkey::Pubkey;
fn main() {
let args: Vec<String> = env::args().collect();
if args.len() != 4 {
eprintln!(
"Usage: {} <program-id> <access-controller> <hex-encoded-report>",
args[0]
);
std::process::exit(1);
}
let program_id_str = &args[1];
let access_controller_str = &args[2];
let hex_report = &args[3];
// Validate program_id and access_controller
if Pubkey::from_str(program_id_str).is_err() {
eprintln!("Invalid program ID provided");
std::process::exit(1);
}
if Pubkey::from_str(access_controller_str).is_err() {
eprintln!("Invalid access controller address provided");
std::process::exit(1);
}
// Decode the hex string for the signed report
let signed_report = match hex::decode(hex_report) {
Ok(bytes) => bytes,
Err(e) => {
eprintln!("Failed to decode hex string: {}", e);
std::process::exit(1);
}
};
// Perform verification off-chain
match verify_report(&signed_report, program_id_str, access_controller_str) {
Ok(report) => {
println!("\nVerified Report Data:");
println!("Feed ID: {}", report.feed_id);
println!("Valid from timestamp: {}", report.valid_from_timestamp);
println!("Observations timestamp: {}", report.observations_timestamp);
println!("Native fee: {}", report.native_fee);
println!("Link fee: {}", report.link_fee);
println!("Expires at: {}", report.expires_at);
println!("Benchmark price: {}", report.benchmark_price);
println!("Bid: {}", report.bid);
println!("Ask: {}", report.ask);
}
Err(e) => {
eprintln!("Verification failed: {}", e);
std::process::exit(1);
}
}
}
```
#### 5. Build and run the verifier
1. Build the project:
```bash
cargo build
```
2. Make sure you are connected to Devnet with `solana config set --url https://api.devnet.solana.com`.
3. Run the verifier with your report:
```bash
cargo run -- <program-id> <access-controller> <hex-encoded-report>
```
Replace the placeholders with:
- `<program-id>`: The Verifier Program ID (find it on the [Stream Addresses](/data-streams/crypto-streams) page)
- `<access-controller>`: The Access Controller Account (find it on the [Stream Addresses](/data-streams/crypto-streams) page)
- `<hex-encoded-report>`: Your hex-encoded signed report (without the `0x` prefix)
Example:
```bash
cargo run -- Gt9S41PtjR58CbG9JhJ3J6vxesqrNAswbWYbLNTMZA3c 2k3DsgwBoqrnvXKVvd7jX7aptNxdcRBdcd5HkYsGgbrb 0006f9b553e393ced311551efd30d1decedb63d76ad41737462e2cdbbdff1578000000000000000000000000000000000000000000000000000000004f56930f000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028001010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba78200000000000000000000000000000000000000000000000000000000675ca37000000000000000000000000000000000000000000000000000000000675ca3700000000000000000000000000000000000000000000000000000174be1bd8758000000000000000000000000000000000000000000000000000cb326ce8c3ea800000000000000000000000000000000000000000000000000000000675df4f00000000000000000000000000000000000000000000000d3a30bcc15e207c0000000000000000000000000000000000000000000000000d3a1557b5e634060200000000000000000000000000000000000000000000000d3ab99a974ff10f400000000000000000000000000000000000000000000000000000000000000000292bdd75612560e46ed9b0c2437898f81eb0e18b6b902a161b9708e9177175cf3b8ef2b279f230f766fb29306250ee90856516ee349ca42b2d7fb141deb006745000000000000000000000000000000000000000000000000000000000000000221c156e80276827e1bfeb6542ab064dfa958f5be955f516fb62b1c93437472c31cc65fcaba68c9d661701190bc32025a0690af0eefe027ac218fd15c588dd4d5
```
Expect the output to be similar to the following:
```bash
Using keypair: <YOUR_PUBKEY>
Program ID: Gt9S41PtjR58CbG9JhJ3J6vxesqrNAswbWYbLNTMZA3c
Access Controller: 2k3DsgwBoqrnvXKVvd7jX7aptNxdcRBdcd5HkYsGgbrb
Verifying report of 736 bytes...
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid from timestamp: 1734124400
Observations timestamp: 1734124400
Native fee: 25614677280600
Link fee: 3574678975954600
Expires at: 1734210800
Benchmark price: 3904011708000000000000
Bid: 3903888333211164500000
Ask: 3904628100124598400000
```
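The price fields in this output are fixed-point integers. As a hypothetical display helper (the 18-decimal scaling assumed here is common for crypto streams, but confirm the decimals for your specific stream), you could render them like this:

```rust
// Hypothetical helper: format a raw price string (18-decimal fixed point)
// into a human-readable decimal. The 18-decimal assumption is illustrative;
// verify the decimals for your stream before relying on it.
fn format_price_18(raw: &str) -> String {
    let negative = raw.starts_with('-');
    let digits = raw.trim_start_matches('-');
    // Left-pad to at least 19 digits: 1 integer digit + 18 fractional digits.
    let padded = format!("{:0>19}", digits);
    let (int_part, frac_part) = padded.split_at(padded.len() - 18);
    format!("{}{}.{}", if negative { "-" } else { "" }, int_part, frac_part)
}

fn main() {
    // "Benchmark price" from the sample output above.
    println!("{}", format_price_18("3904011708000000000000")); // 3904.011708000000000000
}
```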
### Best practices
When implementing verification in production:
1. **Error Handling**:
- Implement robust error handling for network issues
- Add proper logging and monitoring
- Handle report expiration gracefully
2. **Security**:
- Securely manage keypairs and never expose them
- Validate all input parameters
- Implement rate limiting for verification requests
3. **Performance**:
- Cache verified reports when appropriate
- Implement retry mechanisms with backoff
- Use connection pooling for RPC clients
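The "handle report expiration gracefully" point above can be sketched as a client-side freshness check. This is a minimal sketch with a hypothetical struct mirroring the V3 timestamp fields, not part of the SDK; adapt the names to the schema version you decode.

```rust
use std::time::{SystemTime, UNIX_EPOCH};

// Hypothetical struct mirroring the timestamp fields of a decoded V3 report.
struct DecodedReport {
    valid_from_timestamp: u64,
    observations_timestamp: u64,
    expires_at: u64,
}

// Reject expired or internally inconsistent reports before acting on them.
fn check_report_freshness(report: &DecodedReport, now: u64) -> Result<(), String> {
    if now >= report.expires_at {
        return Err(format!("report expired at {}", report.expires_at));
    }
    if report.observations_timestamp < report.valid_from_timestamp {
        return Err("observations precede the validity window".to_string());
    }
    Ok(())
}

fn main() {
    // Timestamps taken from the sample report in this tutorial.
    let report = DecodedReport {
        valid_from_timestamp: 1_734_124_400,
        observations_timestamp: 1_734_124_400,
        expires_at: 1_734_210_800,
    };
    let now = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .map(|d| d.as_secs())
        .unwrap_or(0);
    match check_report_freshness(&report, now) {
        Ok(()) => println!("report is fresh"),
        Err(e) => println!("report rejected: {}", e),
    }
}
```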
## Adapting code for different report schema versions
When working with different versions of [Data Stream reports](/data-streams/reference/report-schema-overview), you'll need to adapt your code to handle the specific report schema version they use:
1. **Import the correct schema version module.** Examples:
- For v3 schema (as used in this example):
```rust
use data_streams_report::report::v3::ReportDataV3;
```
- For v8 schema:
```rust
use data_streams_report::report::v8::ReportDataV8;
```
2. **Update the function return type and decode function to use the correct schema version.** Examples:
- For v3 schema (as used in this example):
```rust
// Function signature
fn verify_report(...) -> Result<ReportDataV3, Box<dyn std::error::Error>> { ... }
// Decoding
let report = ReportDataV3::decode(&return_data)?;
```
- For v8 schema:
```rust
// Function signature
fn verify_report(...) -> Result<ReportDataV8, Box<dyn std::error::Error>> { ... }
// Decoding
let report = ReportDataV8::decode(&return_data)?;
```
3. **Adjust report field access and logic for the schema version.**
- The fields you access and log (e.g., `feed_id`, `benchmark_price`, `bid`, `ask`, etc.) must match the schema version's structure and available fields.
Refer to the [Report Schemas](/data-streams/reference/report-schema-overview) documentation for details on each version's fields and usage.
---
# Verify report data - Onchain integration (Solana)
Source: https://docs.chain.link/data-streams/tutorials/solana-onchain-report-verification
To verify a Data Streams report, you must confirm the integrity of the report signed by the Decentralized Oracle Network (DON).
You have two options to verify Data Streams reports on Solana:
1. **Onchain integration**: Verify reports directly within your Solana program using [Cross-Program Invocation (CPI)](https://solana.com/docs/core/cpi) to the Verifier program. You'll learn how to implement this method in this tutorial.
2. **Offchain integration**: Verify reports client-side using an SDK. Learn more about this method in the [offchain verification tutorial](/data-streams/tutorials/solana-offchain-report-verification).
Both methods use the same underlying verification logic and security guarantees, differing only in where the verification occurs.
## Onchain integration
You can verify Data Streams reports directly within your Solana program using this integration. This method ensures atomic verification and processing of report data.
In this tutorial, you'll learn how to:
- Integrate with the Chainlink Data Streams Verifier program
- Create and invoke the verification instruction
- Retrieve the verified report data
### Prerequisites
Before you begin, you should have:
- Familiarity with [Rust](https://www.rust-lang.org/learn) programming
- Understanding of [Solana](https://solana.com/docs) concepts:
- [Accounts](https://solana.com/docs/core/accounts)
- [Instructions](https://solana.com/docs/core/transactions#instruction)
- [Program Derived Addresses (PDAs)](https://solana.com/docs/core/pda)
- Knowledge of the [Anchor](https://www.anchor-lang.com/) framework
- An allowlisted account in the Data Streams Access Controller. ([Contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to get started).
### Requirements
To complete this tutorial, you'll need:
- **Rust and Cargo**: Install the latest version using [rustup](https://rustup.rs/). Run `rustc --version` to verify your installation.
- **Solana CLI tools**: Install the latest version following the [official guide](https://docs.solana.com/cli/install-solana-cli-tools). Run `solana --version` to verify your installation.
- **Anchor Framework**: Follow the [official installation guide](https://www.anchor-lang.com/docs/installation). Run `anchor --version` to verify your installation.
- **Node.js and npm**: [Install Node.js 20 or later](https://nodejs.org/). Verify your installation with `node --version`.
- **ts-node**: Install globally using npm: `npm install -g ts-node`. Verify your installation with `ts-node --version`.
- **Devnet SOL**: You'll need devnet SOL for deployment and testing. Use the [Solana CLI](https://docs.solana.com/cli/transfer-tokens#airdrop-some-tokens-to-get-started) or the [Solana Faucet](https://faucet.solana.com/) to get devnet SOL.
> **Note**: While this tutorial uses the Anchor framework for project structure, you can integrate the verification using any Rust-based Solana program framework. The verifier SDK is written in Rust, but you can integrate it into your preferred Rust program structure.
### Implementation tutorial
#### 1. Create a new Anchor project
1. Open your terminal and run the following command to create a new Anchor project:
```bash
anchor init example_verify
```
This command creates a new directory named `example_verify` with the basic structure of an Anchor project.
2. Navigate to the project directory:
```bash
cd example_verify
```
#### 2. Configure your project for devnet
Open your `Anchor.toml` file at the root of your project and update it to use devnet:
```toml
[features]
seeds = false
skip-lint = false
[programs.devnet]
# Replace with your program ID
example_verify = "<YOUR_PROGRAM_ID>"
[registry]
url = "https://api.apr.dev"
[provider]
cluster = "devnet"
wallet = "~/.config/solana/id.json"
[scripts]
test = "yarn run ts-mocha -p ./tsconfig.json -t 1000000 tests/**/*.ts"
```
Replace `<YOUR_PROGRAM_ID>` with your program ID. You can run `solana-keygen pubkey target/deploy/example_verify-keypair.json` to get your program ID.
#### 3. Set up your program's dependencies
In your program's manifest file (`programs/example_verify/Cargo.toml`), add the Chainlink Data Streams client and the report crate as dependencies:
```toml
[dependencies]
chainlink_solana_data_streams = { git = "https://github.com/smartcontractkit/chainlink-solana", branch = "develop", subdir = "contracts/crates/chainlink-solana-data-streams" }
data-streams-report = { git = "https://github.com/smartcontractkit/data-streams-sdk.git", subdir = "rust/crates/report" }
# Additional required dependencies
anchor-lang = "0.29.0"
```
#### 4. Write the program
Navigate to your program main file (`programs/example_verify/src/lib.rs`). This is where you'll write your program logic. Replace the contents of `lib.rs` with the following example code:
```rust
// Import required dependencies for Anchor, Solana, and Data Streams
use anchor_lang::prelude::*;
use anchor_lang::solana_program::{
program::{get_return_data, invoke},
pubkey::Pubkey,
instruction::Instruction,
};
// NOTE: Adjust for your report version
use data_streams_report::report::v3::ReportDataV3;
use chainlink_solana_data_streams::VerifierInstructions;
declare_id!("<YOUR_PROGRAM_ID>");
#[program]
pub mod example_verify {
use super::*;
/// Verifies a Data Streams report using Cross-Program Invocation to the Verifier program
/// Returns the decoded report data if verification succeeds
pub fn verify(ctx: Context<ExampleProgramContext>, signed_report: Vec<u8>) -> Result<()> {
let program_id = ctx.accounts.verifier_program_id.key();
let verifier_account = ctx.accounts.verifier_account.key();
let access_controller = ctx.accounts.access_controller.key();
let user = ctx.accounts.user.key();
let config_account = ctx.accounts.config_account.key();
// Create verification instruction
let chainlink_ix: Instruction = VerifierInstructions::verify(
&program_id,
&verifier_account,
&access_controller,
&user,
&config_account,
signed_report,
);
// Invoke the Verifier program
invoke(
&chainlink_ix,
&[
ctx.accounts.verifier_account.to_account_info(),
ctx.accounts.access_controller.to_account_info(),
ctx.accounts.user.to_account_info(),
ctx.accounts.config_account.to_account_info(),
],
)?;
// Decode and log the verified report data
if let Some((_program_id, return_data)) = get_return_data() {
msg!("Report data found");
// NOTE: Adjust for your report version (V3, V4, V8, etc.)
let report = ReportDataV3::decode(&return_data)
.map_err(|_| error!(CustomError::InvalidReportData))?;
// Log report fields
// NOTE: Adjust for your report and desired output
msg!("FeedId: {}", report.feed_id);
msg!("Valid from timestamp: {}", report.valid_from_timestamp);
msg!("Observations Timestamp: {}", report.observations_timestamp);
msg!("Native Fee: {}", report.native_fee);
msg!("Link Fee: {}", report.link_fee);
msg!("Expires At: {}", report.expires_at);
msg!("Benchmark Price: {}", report.benchmark_price);
msg!("Bid: {}", report.bid);
msg!("Ask: {}", report.ask);
} else {
msg!("No report data found");
return Err(error!(CustomError::NoReportData));
}
Ok(())
}
}
#[error_code]
pub enum CustomError {
#[msg("No valid report data found")]
NoReportData,
#[msg("Invalid report data format")]
InvalidReportData,
}
#[derive(Accounts)]
pub struct ExampleProgramContext<'info> {
/// The Verifier Account stores the DON's public keys and other verification parameters.
/// This account must match the PDA derived from the verifier program.
/// CHECK: The account is validated by the verifier program.
pub verifier_account: AccountInfo<'info>,
/// The Access Controller Account
/// CHECK: The account structure is validated by the verifier program.
pub access_controller: AccountInfo<'info>,
/// The account that signs the transaction.
pub user: Signer<'info>,
/// The Config Account is a PDA derived from a signed report
/// CHECK: The account is validated by the verifier program.
pub config_account: UncheckedAccount<'info>,
/// The Verifier Program ID specifies the target Chainlink Data Streams Verifier Program.
/// CHECK: The program ID is validated by the verifier program.
pub verifier_program_id: AccountInfo<'info>,
}
```
Replace `<YOUR_PROGRAM_ID>` with your program ID in the `declare_id!` macro. You can run `solana-keygen pubkey target/deploy/example_verify-keypair.json` to get your program ID.
Note how the `VerifierInstructions::verify` helper method automatically handles the PDA computations internally. Refer to the [Program Derived Addresses (PDAs)](#program-derived-addresses-pdas) section for more information.
> This example uses the [V3 schema](/data-streams/reference/report-schema-v3) for [crypto streams](/data-streams/crypto-streams) to decode the report. If you verify reports for [RWA streams](/data-streams/rwa-streams), import and use the [V8 schema](/data-streams/reference/report-schema-v8) from the [report crate](https://github.com/smartcontractkit/data-streams-sdk/tree/main/rust/crates/report) instead.
#### 5. Deploy your program
1. Run the following command to build your program:
```bash
anchor build
```
**Note**: If you run into the following error, set the `version` field at the top of your `Cargo.lock` file to `3`:
```bash
warning: virtual workspace defaulting to `resolver = "1"` despite one or more workspace members being on edition 2021 which implies `resolver = "2"`
note: to keep the current resolver, specify `workspace.resolver = "1"` in the workspace root's manifest
note: to use the edition 2021 resolver, specify `workspace.resolver = "2"` in the workspace root's manifest
note: for more details see https://doc.rust-lang.org/cargo/reference/resolver.html#resolver-versions
warning: .../example_verify/programs/example_verify/Cargo.toml: unused manifest key: dependencies.data-streams-report.subdir
error: failed to parse lock file at: .../example_verify/Cargo.lock
Caused by:
lock file version 4 requires `-Znext-lockfile-bump`
```
2. Deploy your program to a Solana cluster (devnet in this example) using:
```bash
anchor deploy
```
Expect an output similar to the following:
```bash
Deploying cluster: https://api.devnet.solana.com
Upgrade authority: ~/.config/solana/id.json
Deploying program "example_verify"...
Program path: ~/example_verify/target/deploy/example_verify.so...
Program Id: 8XcUbDgY2UaUYNHkirKsWqXJtzPXezBSyj5Yh87dXums
Signature: 3ky6VkpebDGq7x1n8JB32daybmjvbRBsD4yR2uCCussSWhokaEESTXuSa5s8NMvKTz2NZjoq9aoQ9pvuw9bYoibt
Deploy success
```
#### 6. Interact with the Verifier Program
In this section, you'll write a client script to interact with your deployed program, which will use [Cross-Program Invocation (CPI)](https://solana.com/docs/core/cpi) to verify reports through the Chainlink Data Streams Verifier Program.
1. In the `tests` directory, create a new file `verify_test.ts` to interact with your deployed program.
2. Populate your `verify_test.ts` file with the example client script below.
- Replace `<YOUR_PROGRAM_ID>` with your program ID.
- This example provides a report payload. To use your own report payload, update the `hexString` variable.
```typescript
import * as anchor from "@coral-xyz/anchor"
import { Program } from "@coral-xyz/anchor"
import { PublicKey } from "@solana/web3.js"
import { ExampleVerify } from "../target/types/example_verify"
import * as snappy from "snappy"
// Data Streams Verifier Program ID on Devnet
const VERIFIER_PROGRAM_ID = new PublicKey("Gt9S41PtjR58CbG9JhJ3J6vxesqrNAswbWYbLNTMZA3c")
async function main() {
// Setup connection and provider
const provider = anchor.AnchorProvider.env()
anchor.setProvider(provider)
// Initialize your program using the IDL and your program ID
const program = new Program(
require("../target/idl/example_verify.json"),
"<YOUR_PROGRAM_ID>",
provider
)
// Convert the hex string to a Uint8Array
// This is an example report payload for a crypto stream
const hexString =
"0x00064f2cd1be62b7496ad4897b984db99243e0921906f66ded15149d993ef42c000000000000000000000000000000000000000000000000000000000103c90c000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e000000000000000000000000000000000000000000000000000000000000002200000000000000000000000000000000000000000000000000000000000000280000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001200003684ea93c43ed7bd00ab3bb189bb62f880436589f1ca58b599cd97d6007fb0000000000000000000000000000000000000000000000000000000067570fa40000000000000000000000000000000000000000000000000000000067570fa400000000000000000000000000000000000000000000000000004c6ac85bf854000000000000000000000000000000000000000000000000002e1bf13b772a9c0000000000000000000000000000000000000000000000000000000067586124000000000000000000000000000000000000000000000000002bb4cf7662949c000000000000000000000000000000000000000000000000002bae04e2661000000000000000000000000000000000000000000000000000002bb6a26c3fbeb80000000000000000000000000000000000000000000000000000000000000002af5e1b45dd8c84b12b4b58651ff4173ad7ca3f5d7f5374f077f71cce020fca787124749ce727634833d6ca67724fd912535c5da0f42fa525f46942492458f2c2000000000000000000000000000000000000000000000000000000000000000204e0bfa6e82373ae7dff01a305b72f1debe0b1f942a3af01bad18e0dc78a599f10bc40c2474b4059d43a591b75bdfdd80aafeffddfd66d0395cca2fdeba1673d"
// Remove the '0x' prefix if present
const cleanHexString = hexString.startsWith("0x") ? hexString.slice(2) : hexString
// Validate hex string format
if (!/^[0-9a-fA-F]+$/.test(cleanHexString)) {
throw new Error("Invalid hex string format")
}
// Convert hex to Uint8Array
const signedReport = new Uint8Array(cleanHexString.match(/.{1,2}/g).map((byte) => parseInt(byte, 16)))
// Compress the report using Snappy
const compressedReport = await snappy.compress(Buffer.from(signedReport))
// Derive necessary PDAs using the SDK's helper functions
const verifierAccount = PublicKey.findProgramAddressSync([Buffer.from("verifier")], VERIFIER_PROGRAM_ID)
const configAccount = PublicKey.findProgramAddressSync([signedReport.slice(0, 32)], VERIFIER_PROGRAM_ID)
// The Data Streams access controller on devnet
const accessController = new PublicKey("2k3DsgwBoqrnvXKVvd7jX7aptNxdcRBdcd5HkYsGgbrb")
try {
console.log("\n📝 Transaction Details")
console.log("━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━")
console.log("🔑 Signer:", provider.wallet.publicKey.toString())
const tx = await program.methods
.verify(compressedReport)
.accounts({
verifierAccount: verifierAccount[0],
accessController: accessController,
user: provider.wallet.publicKey,
configAccount: configAccount[0],
verifierProgramId: VERIFIER_PROGRAM_ID,
})
.rpc({ commitment: "confirmed" })
console.log("✅ Transaction successful!")
console.log("🔗 Signature:", tx)
// Fetch and display logs
const txDetails = await provider.connection.getTransaction(tx, {
commitment: "confirmed",
maxSupportedTransactionVersion: 0,
})
console.log("\n📋 Program Logs")
console.log("━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━")
let indentLevel = 0
let currentProgramId = ""
txDetails.meta.logMessages.forEach((log) => {
// Handle indentation for inner program calls
if (log.includes("Program invoke")) {
const programIdMatch = log.match(/Program (.*?) invoke/)
if (programIdMatch) {
currentProgramId = programIdMatch[1]
// Remove "Unknown Program" prefix if present
currentProgramId = currentProgramId.replace("Unknown Program ", "")
// Remove parentheses if present
currentProgramId = currentProgramId.replace(/[()]/g, "")
}
console.log(" ".repeat(indentLevel) + "🔄", log.trim())
indentLevel++
return
}
if (log.includes("Program return") || log.includes("Program consumed")) {
indentLevel = Math.max(0, indentLevel - 1)
}
// Add indentation to all logs
const indent = " ".repeat(indentLevel)
if (log.includes("Program log:")) {
console.log(indent + "📍", log.replace("Program log:", "").trim())
} else if (log.includes("Program data:")) {
console.log(indent + "📊", log.replace("Program data:", "").trim())
}
})
console.log("━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\n")
} catch (error) {
console.log("\n❌ Transaction Failed")
console.log("━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━")
console.error("Error:", error)
console.log("━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\n")
}
}
main()
```
**Note**: The Program IDs and Access Controller Accounts are available on the [Stream Addresses](/data-streams/crypto-streams) page.
3. Add the `snappy` dependency to your project:
```bash
yarn add snappy
```
`snappy` is a compression library that is used to compress the report data.
4. Execute the test script to interact with your program:
```bash
ANCHOR_PROVIDER_URL="https://api.devnet.solana.com" ANCHOR_WALLET="~/.config/solana/id.json" ts-node tests/verify_test.ts
```
Replace `~/.config/solana/id.json` with the path to your Solana wallet (e.g., `/Users/username/.config/solana/id.json`).
5. Verify the output logs to ensure the report data is processed correctly. Expect to see the decoded report fields logged to the console:
```bash
📝 Transaction Details
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
🔑 Signer: 1BZZU8cJsrMSBaQQGUxTE4LQYX2SU2jjs97pkrz7rHD
✅ Transaction successful!
🔗 Signature: 2CTZ7kgAxTogvMgb7QFDJUAq9xFBUVTEvyjf7UuhoVrHDhYKtHpQmd8hEy9XvLhfgWMdVTpCRvdf18r1ixgtncUc
📋 Program Logs
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📍 Instruction: Verify
📍 Instruction: Verify
📍 Report data found
📍 FeedId: 0x0003684ea93c43ed7bd00ab3bb189bb62f880436589f1ca58b599cd97d6007fb
📍 Valid from timestamp: 1733758884
📍 Observations Timestamp: 1733758884
📍 Native Fee: 84021511714900
📍 Link Fee: 12978571827423900
📍 Expires At: 1733845284
📍 Benchmark Price: 12302227135960220
📍 Bid: 12294760000000000
📍 Ask: 12304232715632312
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
```
### Learn more
#### Program Derived Addresses (PDAs)
The verification process relies on two important PDAs that are handled automatically by the [Chainlink Data Streams Solana SDK](https://github.com/smartcontractkit/chainlink-solana/tree/develop/contracts/crates/chainlink-solana-data-streams):
- **Verifier config account PDA**:
- Derived using the verifier program ID as a seed
- Stores verification-specific configuration
- Used to ensure consistent verification parameters across all reports
- **Report config account PDA**:
- Derived using the feed ID (first 32 bytes of the uncompressed signed report) as a seed
- Contains feed-specific configuration and constraints
- Ensures that each feed's verification follows its designated rules
The SDK's `VerifierInstructions::verify` helper method performs these steps:
1. Extracts the necessary seeds
2. Computes the PDAs using `Pubkey::find_program_address`
3. Includes these derived addresses in the instruction data
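For illustration, the seed extraction in step 1 can be sketched with plain slicing (the PDA computation itself uses `Pubkey::find_program_address` from `solana_program` and is already handled for you by `VerifierInstructions::verify`, so this helper is hypothetical):

```rust
// Sketch of the seed extraction only: the report config PDA is seeded with the
// feed ID, which occupies the first 32 bytes of the uncompressed signed report.
fn report_config_seed(uncompressed_report: &[u8]) -> Option<&[u8]> {
    uncompressed_report.get(..32)
}

fn main() {
    // Dummy payload matching the report size seen earlier in this tutorial.
    let report = vec![0u8; 736];
    match report_config_seed(&report) {
        Some(seed) => println!("report config seed length: {}", seed.len()), // 32
        None => println!("report too short to contain a feed ID"),
    }
}
```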
#### Best practices
This tutorial provides a basic example of how to verify reports. When you implement report verification, consider the following best practices:
- Implement robust error handling:
- Handle verification failures and invalid reports comprehensively
- Implement proper error reporting and logging for debugging
- Add custom error types for different failure scenarios
- Add appropriate validations:
- Price threshold checks to prevent processing extreme values
- Timestamp validations to ensure data freshness
- Custom feed-specific validations based on your use case
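As a hedged sketch of the validation ideas above (the helper names and thresholds are illustrative, not part of the SDK), price and spread checks might look like:

```rust
// Illustrative sanity checks on decoded report prices. The bounds and the
// basis-point spread calculation are placeholders; tune them per feed.
fn within_bounds(price: i128, min: i128, max: i128) -> bool {
    price >= min && price <= max
}

// Basis-point spread between bid and ask; None if the quote is malformed.
fn spread_bps(bid: i128, ask: i128) -> Option<i128> {
    if bid <= 0 || ask < bid {
        return None;
    }
    Some((ask - bid) * 10_000 / bid)
}

fn main() {
    // Bid/ask taken from the sample output earlier in this tutorial.
    let bid: i128 = 3_903_888_333_211_164_500_000;
    let ask: i128 = 3_904_628_100_124_598_400_000;
    println!("price in bounds: {}", within_bounds(bid, 1, i128::MAX));
    println!("bid/ask spread: {:?} bps", spread_bps(bid, ask));
}
```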
## Adapting code for different report schema versions
When working with different versions of [Data Stream reports](/data-streams/reference/report-schema-overview), you'll need to adapt your code to handle the specific report schema version they use:
1. **Import the correct schema version module.** Examples:
- For v3 schema (as used in this example):
```rust
use data_streams_report::report::v3::ReportDataV3;
```
- For v8 schema:
```rust
use data_streams_report::report::v8::ReportDataV8;
```
2. **Update the decode function to use the correct schema version.** Examples:
- For v3 schema (as used in this example):
```rust
let report = ReportDataV3::decode(&return_data)?;
```
- For v8 schema:
```rust
let report = ReportDataV8::decode(&return_data)?;
```
3. **Adjust report field access and logic for the schema version.**
- The fields you access and log (e.g., `feed_id`, `benchmark_price`, `bid`, `ask`, etc.) must match the schema version's structure and available fields.
Refer to the [Report Schemas](/data-streams/reference/report-schema-overview) documentation for details on each version's fields and usage.
---
# Getting Started with Chainlink Data Streams (Hardhat CLI)
Source: https://docs.chain.link/data-streams/tutorials/streams-trade/getting-started-hardhat
---
# Getting Started with Chainlink Data Streams (Remix)
Source: https://docs.chain.link/data-streams/tutorials/streams-trade/getting-started
---
# Streams Trade guides
Source: https://docs.chain.link/data-streams/tutorials/streams-trade
Explore several guides to learn how to use the [Streams Trade](/data-streams/streams-trade) implementation of Data Streams.
- [Getting Started](/data-streams/tutorials/streams-trade/getting-started): Learn how to read data from a Data Streams stream, verify the answer onchain, and store it.
- [Handle StreamsLookup errors](/data-streams/tutorials/streams-trade/streams-trade-lookup-error-handler): Learn how to handle potential errors or edge cases in StreamsLookup upkeeps.
**Note**: Before implementing Streams Trade, ensure that Chainlink Automation is available on your desired network by checking the [Automation Supported Networks page](/chainlink-automation/overview/supported-networks).
---
# Using the StreamsLookup error handler
Source: https://docs.chain.link/data-streams/tutorials/streams-trade/streams-trade-lookup-error-handler
---
# Fetch and decode Data Streams reports using the TypeScript SDK
Source: https://docs.chain.link/data-streams/tutorials/ts-sdk-fetch
In this tutorial, you'll learn how to use the [Data Streams SDK](/data-streams/reference/data-streams-api/ts-sdk) for TypeScript to fetch and decode [reports](/data-streams/reference/report-schema-overview) from the Data Streams Aggregation Network. You'll set up your TypeScript project, retrieve reports, decode them, and log their attributes.
## Requirements
- **Git**: Make sure you have Git installed. You can check your current version by running `git --version` in your terminal and download the latest version from the official [Git website](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) if necessary.
- **Node.js**: Make sure you have Node.js 20.0 or higher. You can check your current version by running `node --version` in your terminal and download the latest version from the official [Node.js website](https://nodejs.org/) if necessary.
- **TypeScript**: Make sure you have TypeScript 5.3 or higher. You can check your current version by running `npx tsc --version` in your terminal and install or update TypeScript by running `npm install -g typescript` if necessary.
- **API Credentials**: Access to Data Streams requires API credentials. If you haven't already, [contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to request mainnet or testnet access.
## Tutorial
You'll start by setting up your TypeScript project, installing the SDK, and pasting in the example code. This will let you decode reports for both single and multiple [streams](/data-streams/crypto-streams), logging their attributes to your terminal.
### Set up your TypeScript project
1. Create a new directory for your project and navigate to it:
```bash
mkdir my-data-streams-project
cd my-data-streams-project
```
2. Initialize a new Node.js project:
```bash
npm init -y
```
3. Install the TypeScript SDK and other required packages:
```bash
npm install @chainlink/data-streams-sdk dotenv
npm install -D tsx
```
4. Set your API credentials:
Option 1 - Environment variables:
```bash
export API_KEY="your_api_key_here"
export USER_SECRET="your_user_secret_here"
```
Option 2 - `.env` file:
```bash
# Create .env file
touch .env
# Add your credentials
API_KEY="your_api_key_here"
USER_SECRET="your_user_secret_here"
```
### Fetch and decode a report with a single stream
1. Create a new TypeScript file, `singleStream.ts`, in your project directory:
```bash
touch singleStream.ts
```
2. Insert the following code example and save your `singleStream.ts` file:
```typescript
import { createClient, decodeReport, LogLevel, getReportVersion, formatReport } from "@chainlink/data-streams-sdk"
import "dotenv/config"
async function main() {
if (process.argv.length < 3) {
console.error("Please provide a feed ID as an argument")
console.error(
"Example: npx tsx singleStream.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782"
)
process.exit(1)
}
const feedId = process.argv[2]
const version = getReportVersion(feedId)
try {
const config = {
apiKey: process.env.API_KEY || "YOUR_API_KEY",
userSecret: process.env.USER_SECRET || "YOUR_USER_SECRET",
endpoint: "https://api.testnet-dataengine.chain.link",
wsEndpoint: "wss://ws.testnet-dataengine.chain.link",
// Comment to disable SDK logging:
logging: {
logger: console,
logLevel: LogLevel.INFO,
},
}
const client = createClient(config)
console.log(`\nFetching latest report for feed ${feedId} (${version})...\n`)
// Get raw report data
const report = await client.getLatestReport(feedId)
console.log(`Raw Report Blob: ${report.fullReport}`)
// Decode the report
const decodedData = decodeReport(report.fullReport, report.feedID)
// Combine decoded data with report metadata
const decodedReport = {
...decodedData,
feedID: report.feedID,
validFromTimestamp: report.validFromTimestamp,
observationsTimestamp: report.observationsTimestamp,
}
console.log(formatReport(decodedReport, version))
} catch (error) {
if (error instanceof Error) {
console.error("Error:", error.message)
} else {
console.error("Unknown error:", error)
}
process.exit(1)
}
}
main()
```
3. Read from a [testnet crypto stream](/data-streams/crypto-streams?page=1&testnetPage=1#testnet-crypto-streams). The example below runs the application, reading from the `ETH/USD` crypto stream:
```bash
npx tsx singleStream.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
```
Expect output similar to the following in your terminal:
```bash
Fetching latest report for feed 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 (V3)...
[2025-09-23T00:09:49.042Z] [DataStreams] Request successful: GET https://api.testnet-dataengine.chain.link/api/v1/reports/latest?feedID=0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 - 200
Raw Report Blob: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de70cf179baa632f18313e54bd690000000000000000000000000000000000000000000000000000000001f6f486000000000000000000000000000000000000000000000000000000030000000100000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000068d1e54c0000000000000000000000000000000000000000000000000000000068d1e54c00000000000000000000000000000000000000000000000000004539f757bc7c0000000000000000000000000000000000000000000000000034754304206ea30000000000000000000000000000000000000000000000000000000068f9724c0000000000000000000000000000000000000000000000e3e84d950bcd8d80000000000000000000000000000000000000000000000000e3e48f23626b5660000000000000000000000000000000000000000000000000e3eb7a12d8af1b00000000000000000000000000000000000000000000000000000000000000000002e7c71643e93efb8e759b1d1a8826579853d3aa2c96f59a2813e833a374c786f8f13a497a753af14c6b7329f704f148779b20aae62ed450167c61b9b5d8fcb0e100000000000000000000000000000000000000000000000000000000000000024905f1b4a6313246988a33d2aa923e3023065bbc8c874956aa5df25ec8b7b3e918a25b251713158aae6be85a188c1292e6b5288f2f96e75d632c4e6b659dfa53
Report Metadata:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1758586188
Observations: 1758586188
Decoded Data:
Native Fee: 76115265174652
LINK Fee: 14765629481447075
Expires At: 1761178188
Price: 4204150104000000000000
Bid Price: 4203880326000000000000
Ask Price: 4204378800000000000000
--------------------------------------------------
```
Your application has successfully fetched and decoded a report for the stream.
[Learn more about the decoded report details](#decoded-report-details).
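Before acting on a report in your own application, you may also want to check that it's still within its validity window. Below is a minimal sketch, assuming `validFromTimestamp` and `expiresAt` are Unix timestamps in seconds (as in the decoded output above); the helper name `isReportFresh` and the staleness window are illustrative, not part of the SDK:

```typescript
// Hypothetical freshness check for a decoded report. Rejects reports that
// are not yet valid, already expired, or older than an acceptable window.
function isReportFresh(
  validFromTimestamp: number,
  expiresAt: number,
  maxAgeSeconds: number,
  nowSeconds: number = Math.floor(Date.now() / 1000)
): boolean {
  // Outside the report's own validity window
  if (nowSeconds < validFromTimestamp || nowSeconds >= expiresAt) return false
  // Stricter application-level staleness bound
  return nowSeconds - validFromTimestamp <= maxAgeSeconds
}
```

You could call this right after decoding, e.g. `isReportFresh(Number(decodedReport.validFromTimestamp), Number(decodedData.expiresAt), 60)`, and skip reports that fail the check.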
### Fetch and decode reports for multiple streams
1. Create a new TypeScript file, `multipleStreams.ts`, in your project directory:
```bash
touch multipleStreams.ts
```
2. Insert the following code example in your `multipleStreams.ts` file:
```typescript
import { createClient, decodeReport, LogLevel, getReportVersion, formatReport } from "@chainlink/data-streams-sdk"
import "dotenv/config"
async function main() {
if (process.argv.length < 3) {
console.error("Please provide feed IDs as arguments")
console.error("Get latest reports for multiple feeds:")
console.error(" npx tsx multipleStreams.ts <feedID1> <feedID2> [feedID3...]")
console.error("\nExample:")
console.error(
" npx tsx multipleStreams.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265"
)
process.exit(1)
}
const feedIds = process.argv.slice(2)
try {
const config = {
apiKey: process.env.API_KEY || "YOUR_API_KEY",
userSecret: process.env.USER_SECRET || "YOUR_USER_SECRET",
endpoint: "https://api.testnet-dataengine.chain.link",
wsEndpoint: "wss://ws.testnet-dataengine.chain.link",
// Comment to disable SDK logging:
logging: {
logger: console,
logLevel: LogLevel.INFO,
},
}
const client = createClient(config)
console.log(`\nFetching latest reports for ${feedIds.length} feed(s):`)
feedIds.forEach((feedId) => {
const version = getReportVersion(feedId)
console.log(`- ${feedId} (${version})`)
})
console.log()
// Get latest reports for each feed ID
const reports = []
for (const feedId of feedIds) {
try {
const report = await client.getLatestReport(feedId)
reports.push(report)
} catch (error) {
console.error(`Failed to get report for ${feedId}:`, error)
continue
}
}
console.log(`Found ${reports.length} reports:\n`)
// Process reports
reports.forEach((report, index) => {
const version = getReportVersion(report.feedID)
console.log(`Raw Report Blob #${index + 1}: ${report.fullReport}`)
try {
// Decode the report
const decodedData = decodeReport(report.fullReport, report.feedID)
// Combine decoded data with report metadata
const decodedReport = {
...decodedData,
feedID: report.feedID,
validFromTimestamp: report.validFromTimestamp,
observationsTimestamp: report.observationsTimestamp,
}
console.log(formatReport(decodedReport, version))
} catch (error) {
console.error(`Failed to decode report for ${report.feedID}:`, error)
}
})
} catch (error) {
if (error instanceof Error) {
console.error("Error:", error.message)
} else {
console.error("Unknown error:", error)
}
process.exit(1)
}
}
main()
```
3. Before running the example, verify that your API credentials are still set in your current terminal session:
```bash
echo $API_KEY
echo $USER_SECRET
```
If the commands above don't show your credentials, set them again:
```bash
export API_KEY="your_api_key_here"
export USER_SECRET="your_user_secret_here"
```
4. Read from two testnet crypto streams (ETH/USD and LINK/USD) by running:
```bash
npx tsx multipleStreams.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265
```
Expect output similar to the following in your terminal:
```bash
[2025-09-24T01:50:28.313Z] [DataStreams] Data Streams client initialized
Fetching latest report for feed 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 (V3)...
[2025-09-24T01:50:28.607Z] [DataStreams] Request successful: GET https://api.testnet-dataengine.chain.link/api/v1/reports/latest?feedID=0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782 - 200
Raw Report Blob: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de70cf179baa632f18313e54bd690000000000000000000000000000000000000000000000000000000001fb09db000000000000000000000000000000000000000000000000000000030000000100000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000068d34e640000000000000000000000000000000000000000000000000000000068d34e64000000000000000000000000000000000000000000000000000045686a2caed300000000000000000000000000000000000000000000000000347613062ce6c40000000000000000000000000000000000000000000000000000000068fadb640000000000000000000000000000000000000000000000e34fc8e3afa8f400000000000000000000000000000000000000000000000000e34cf02d97047e60000000000000000000000000000000000000000000000000e3533bbd1e9ba3400000000000000000000000000000000000000000000000000000000000000000021ae965e613bfb4580ea819f8c12736562222e515b412054c49ec77ded163d9ee4493fb7dfb713181ea1a0d4e1a7fc4b7a3618484f9c989c4262c201e749df2cd000000000000000000000000000000000000000000000000000000000000000217c165a50d34910db8667c1229128ea4110fba082eb0308b75282fffdb594f68467797ce8a8b515f8e3ab08e4723a1fe1a9cfa5079968cd4bb7a9ec1a7a943cc
Report Metadata:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1758678628
Observations: 1758678628
Decoded Data:
Native Fee: 76314760228563
LINK Fee: 14766522869016260
Expires At: 1761270628
Price: 4193160000000000000000
Bid Price: 4192954886000000000000
Ask Price: 4193408500000000000000
--------------------------------------------------
```
### Decoded report details
The decoded [crypto v3 report](/data-streams/reference/report-schema-v3) details include:
| Attribute | Value | Description |
| ---------------------- | -------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Stream ID | `0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782` | The unique identifier for the stream. In this example, the stream is for ETH/USD. |
| Observations Timestamp | `1734216283` | The timestamp indicating when the data was captured. |
| Benchmark Price | `3865052126782320350000` | The observed price in the report, with 18 decimals. For readability: `3,865.0521267823204` USD per ETH. |
| Bid | `3864985478146740000000` | The highest price a buyer is willing to pay for an asset, with 18 decimals. For readability: `3,864.9854781467400` USD per ETH. Learn more about the [Bid price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| Ask | `3865140837060103650000` | The lowest price a seller is willing to accept for an asset, with 18 decimals. For readability: `3,865.1408370601037` USD per ETH. Learn more about the [Ask price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| Valid From Timestamp | `1734216283` | The start validity timestamp for the report, indicating when the data becomes relevant. |
| Expires At | `1734302683` | The expiration timestamp of the report, indicating the point at which the data becomes outdated. |
| Link Fee | `3379350941986000` | The fee to pay in LINK tokens for the onchain verification of the report data. With 18 decimals. For readability: `0.03379350941986` LINK. **Note:** This example fee is not indicative of actual fees. |
| Native Fee | `25872872271800` | The fee to pay in the native blockchain token (e.g., ETH on Ethereum) for the onchain verification of the report data. With 18 decimals. **Note:** This example fee is not indicative of actual fees. |
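The 18-decimal fixed-point values in the table above can be converted to human-readable strings with plain `bigint` arithmetic. A small sketch (the helper name `formatPrice18` is illustrative, not part of the SDK):

```typescript
// Convert an 18-decimal fixed-point value, as used for prices and fees in
// the decoded report, into a readable decimal string.
function formatPrice18(raw: bigint, displayDecimals = 4): string {
  const SCALE = 10n ** 18n
  const whole = raw / SCALE
  const frac = raw % SCALE
  // Pad the fractional part to 18 digits, then trim to the display precision
  const fracStr = frac.toString().padStart(18, "0").slice(0, displayDecimals)
  return `${whole}.${fracStr}`
}

// Benchmark price from the table above:
// formatPrice18(3865052126782320350000n) → "3865.0521"
```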
### Payload for onchain verification
In this tutorial, you logged and decoded the `full_report` payloads to extract the report data. However, in a production environment, you should verify the data to ensure its integrity and authenticity.
Refer to the [Verify report data onchain](/data-streams/tutorials/evm-onchain-report-verification) tutorial to learn more.
## Explanation
### Initializing the client and configuration
The Data Streams TypeScript client is initialized in two steps:
1. Configure the client with a config object:
```typescript
const config = {
apiKey: process.env.API_KEY || "YOUR_API_KEY",
userSecret: process.env.USER_SECRET || "YOUR_USER_SECRET",
endpoint: "https://api.testnet-dataengine.chain.link",
wsEndpoint: "wss://ws.testnet-dataengine.chain.link",
// Optional logging:
logging: {
logger: console,
logLevel: LogLevel.INFO,
},
}
```
The configuration requires:
- `apiKey` and `userSecret` for authentication (required)
- `endpoint` for the API endpoint (required)
- `logging` for debugging and error tracking (optional)
See the [SDK Reference](/data-streams/reference/data-streams-api/ts-sdk) page for more configuration options.
2. Create the client with `createClient`:
```typescript
const client = createClient(config)
```
The client handles:
- Authentication with HMAC signatures
- Connection management and timeouts
- Error handling and retries
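Rather than falling back to placeholder strings such as `YOUR_API_KEY`, you can fail fast at startup when a credential is missing. A minimal sketch (`requireEnv` is an illustrative helper, not part of the SDK):

```typescript
// Throw immediately if a required environment variable is unset, so a
// misconfigured client fails at startup instead of at request time.
function requireEnv(name: string, env: Record<string, string | undefined> = process.env): string {
  const value = env[name]
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`)
  }
  return value
}

// Usage in the config object:
// const config = { apiKey: requireEnv("API_KEY"), userSecret: requireEnv("USER_SECRET"), ... }
```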
### Fetching reports
The TypeScript SDK provides two main methods to fetch reports:
1. Latest report for a single stream with `getLatestReport`:
```typescript
const report = await client.getLatestReport(feedId)
```
- Takes a feed ID string
- Returns a single `ReportResponse` with the most recent data
- No timestamp parameter needed
- Useful for real-time price monitoring
2. Latest reports for multiple streams by calling `getLatestReport` in a loop:
```typescript
for (const feedId of feedIds) {
const report = await client.getLatestReport(feedId)
reports.push(report)
}
```
- Takes an array of feed ID strings
- Calls `getLatestReport` for each feed ID individually
- Returns the most recent data for each stream
- Useful for monitoring multiple assets simultaneously
- Each request gets the latest available data without timestamp constraints
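If ordering between requests doesn't matter, the sequential loop can be parallelized with `Promise.allSettled`, so one failing feed doesn't block the others. A sketch, written generically over anything that exposes `getLatestReport` (such as the SDK client); the interface and function names are illustrative:

```typescript
// Anything that can fetch the latest report for a feed ID.
interface ReportClient<R> {
  getLatestReport(feedId: string): Promise<R>
}

// Fetch the latest reports for several feeds concurrently, logging and
// skipping any feeds that fail (mirroring the loop in the example above).
async function fetchLatestReports<R>(client: ReportClient<R>, feedIds: string[]): Promise<R[]> {
  const results = await Promise.allSettled(feedIds.map((id) => client.getLatestReport(id)))
  const reports: R[] = []
  results.forEach((result, i) => {
    if (result.status === "fulfilled") {
      reports.push(result.value)
    } else {
      console.error(`Failed to get report for ${feedIds[i]}:`, result.reason)
    }
  })
  return reports
}
```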
Each API request automatically:
- Handles authentication with API credentials
- Manages request timeouts via the Node/SDK configuration
- Processes responses into structured types
### Decoding reports
Reports are decoded in two steps using the TypeScript SDK helper functions:
1. Report decoding with `decodeReport` (auto-detects report version by feed ID):
```typescript
const decodedData = decodeReport(report.fullReport, report.feedID)
```
This step:
1. Takes the raw `fullReport` bytes/string from the response
2. Uses the report schema that matches the feed (SDK detects version)
3. Validates the format and decodes into a structured object
4. Returns decoded data that can be combined with report metadata and formatted using `formatReport()`
For more details, see the [Report Format section of the SDK Reference](/data-streams/reference/data-streams-api/ts-sdk#report-format).
### Error handling
The TypeScript examples use standard try/catch patterns and optional timeouts:
1. Request timeouts / cancellation
Use your application's timeout/cancellation mechanism (for example, AbortController) when making requests to the SDK or wrap calls in a manual timeout.
2. Error checking
```typescript
try {
const report = await client.getLatestReport(feedId)
} catch (err) {
console.error("Failed to fetch report:", err)
process.exit(1) // fatal error
}
```
- Fatal errors (client creation, missing credentials) typically exit the process
- Non-fatal errors (single report decode) can be logged and skipped when processing multiple feeds
- All errors should be logged with context for easier debugging
Learn more about SDK error handling in the [SDK Reference](/data-streams/reference/data-streams-api/ts-sdk#error-handling).
3. SDK logging
The SDK can log requests and responses when `logging` is enabled in the config. In the example above we pass `console` as the logger and set `LogLevel.INFO`.
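The manual request timeout mentioned above (step 1) can be sketched with `Promise.race`; `withTimeout` is an illustrative helper, not an SDK API, and works with any promise when the call doesn't accept an `AbortSignal`:

```typescript
// Race a promise against a timer so a hung request fails fast instead of
// blocking indefinitely.
async function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout>
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms)
  })
  try {
    return await Promise.race([promise, timeout])
  } finally {
    // Always clear the timer so it doesn't keep the process alive
    clearTimeout(timer!)
  }
}

// Usage: const report = await withTimeout(client.getLatestReport(feedId), 5000)
```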
The decoded data can be used for further processing or display in your application. For production environments, you must verify the data onchain using the provided `fullReport` payload.
For more information about SDK logging configuration and monitoring options, see the [SDK Reference](/data-streams/reference/data-streams-api/ts-sdk#observability-logs--metrics).
---
# Stream and decode Data Streams reports via WebSocket using the TypeScript SDK
Source: https://docs.chain.link/data-streams/tutorials/ts-sdk-stream
In this tutorial, you'll learn how to use the [Data Streams SDK](/data-streams/reference/data-streams-api/ts-sdk) for TypeScript to subscribe to real-time [reports](/data-streams/reference/report-schema-overview) via a [WebSocket connection](/data-streams/reference/data-streams-api/interface-ws). You'll set up your TypeScript project, listen for real-time reports from the Data Streams Aggregation Network, decode the report data, and log their attributes to your terminal.
## Requirements
- **Git**: Make sure you have Git installed. You can check your current version by running `git --version` in your terminal and download the latest version from the official [Git website](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) if necessary.
- **Node.js**: Make sure you have Node.js 20.0 or higher. You can check your current version by running `node --version` in your terminal and download the latest version from the official [Node.js website](https://nodejs.org/) if necessary.
- **TypeScript**: Make sure you have TypeScript 5.3 or higher. You can check your current version by running `npx tsc --version` in your terminal and install or update TypeScript by running `npm install -g typescript` if necessary.
- **API Credentials**: Access to Data Streams requires API credentials. If you haven't already, [contact us](https://chainlinkcommunity.typeform.com/datastreams?typeform-source=docs.chain.link#ref_id=docs) to request mainnet or testnet access.
## Tutorial
First, you'll set up a basic TypeScript project, installing the SDK and pasting in the example code. This will let you stream reports from [crypto streams](/data-streams/crypto-streams), logging their attributes to your terminal.
### Set up your TypeScript project
1. Create a new directory for your project and navigate to it:
```bash
mkdir my-data-streams-project
cd my-data-streams-project
```
2. Initialize a new Node.js project:
```bash
npm init -y
```
3. Install the TypeScript SDK and other required packages:
```bash
npm install @chainlink/data-streams-sdk dotenv
npm install -D tsx
```
4. Set your API credentials:
Option 1 - Environment variables:
```bash
export API_KEY="your_api_key_here"
export USER_SECRET="your_user_secret_here"
```
Option 2 - `.env` file:
```bash
# Create .env file
touch .env
# Add your credentials
API_KEY="your_api_key_here"
USER_SECRET="your_user_secret_here"
```
### Establish a WebSocket connection and listen for real-time reports
1. Create a new TypeScript file, `stream.ts`, in your project directory:
```bash
touch stream.ts
```
2. Insert the following code example and save your `stream.ts` file:
```typescript
import { createClient, LogLevel, decodeReport, getReportVersion, formatReport } from "@chainlink/data-streams-sdk"
import "dotenv/config"
async function main() {
if (process.argv.length < 3) {
console.error("Please provide one or more feed IDs as arguments")
console.error("\nExamples:")
console.error(" Single feed:")
console.error(" npx tsx stream.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782")
console.error(" Multiple feeds:")
console.error(
" npx tsx stream.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782,0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265"
)
process.exit(1)
}
const feedIds = process.argv[2].split(",")
console.log("Chainlink Data Streams - Report Streaming")
console.log("=".repeat(60))
console.log(`📊 Feeds: ${feedIds.length} feed(s)`)
console.log("=".repeat(60))
try {
const client = createClient({
apiKey: process.env.API_KEY || "YOUR_API_KEY",
userSecret: process.env.USER_SECRET || "YOUR_USER_SECRET",
endpoint: "https://api.testnet-dataengine.chain.link",
wsEndpoint: "wss://ws.testnet-dataengine.chain.link",
// Comment to disable SDK logging:
logging: {
logger: console,
logLevel: LogLevel.INFO,
enableConnectionDebug: false, // Enable WebSocket ping/pong and connection state logs (logLevel should be DEBUG)
},
})
let reportCount = 0
// Create stream with custom options
const stream = client.createStream(feedIds, {
maxReconnectAttempts: 10,
reconnectInterval: 3000,
})
// Event: Process incoming reports
stream.on("report", (report) => {
reportCount++
try {
console.log(`\n📈 Report #${reportCount} - ${new Date().toISOString()}`)
// Show raw report blob
console.log(`\nRaw Report Blob: ${report.fullReport}`)
// Decode the report
const decodedData = decodeReport(report.fullReport, report.feedID)
const version = getReportVersion(report.feedID)
// Combine decoded data with report metadata
const decodedReport = {
...decodedData,
feedID: report.feedID,
validFromTimestamp: report.validFromTimestamp,
observationsTimestamp: report.observationsTimestamp,
}
console.log(formatReport(decodedReport, version))
} catch (error) {
console.error(`❌ Error processing report: ${error instanceof Error ? error.message : error}`)
}
// Display stats every 5 reports
if (reportCount % 5 === 0) {
const stats = stream.getMetrics()
console.log(
`\n📊 Stats: ${stats.accepted} reports | ${stats.activeConnections}/${stats.configuredConnections} connections`
)
}
})
// Event: Handle errors
stream.on("error", (error) => {
console.error(`\n❌ Error: ${error.message}`)
if (error.message.includes("authentication")) {
console.error("💡 Check your API_KEY and USER_SECRET environment variables")
}
})
// Event: Handle disconnections
stream.on("disconnected", () => {
console.log("\n🔴 Stream disconnected - reconnecting...")
})
// Event: Monitor reconnections
stream.on("reconnecting", (info: { attempt: number; delayMs: number; origin?: string; host?: string }) => {
console.log(
`🔄 Reconnecting... attempt ${info.attempt} in ~${info.delayMs}ms${info.host ? ` (${info.host})` : ""}`
)
})
console.log("⏳ Connecting...\n")
await stream.connect()
console.log("✅ Connected! Listening for reports...\n")
// Graceful shutdown
const shutdown = async () => {
console.log("\n🛑 Shutting down...")
await stream.close()
console.log("✅ Shutdown complete")
process.exit(0)
}
process.on("SIGINT", shutdown)
process.on("SIGTERM", shutdown)
} catch (error) {
console.error("❌ Failed to start stream:", error instanceof Error ? error.message : error)
process.exit(1)
}
}
main()
```
3. Subscribe to a [testnet crypto stream](/data-streams/crypto-streams?page=1&testnetPage=1#testnet-crypto-streams). The example below runs the application, subscribing to the `ETH/USD` crypto stream:
```bash
npx tsx stream.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
```
Expect output similar to the following in your terminal:
```bash
Chainlink Data Streams - Report Streaming
============================================================
📊 Feeds: 1 feed(s)
============================================================
[2025-09-24T01:52:36.464Z] [DataStreams] Data Streams client initialized
[2025-09-24T01:52:36.465Z] [DataStreams] Initializing stream in single mode
[2025-09-24T01:52:36.465Z] [DataStreams] Stream created successfully for 1 feed(s)
⏳ Connecting...
[2025-09-24T01:52:36.465Z] [DataStreams] Connecting stream in single mode
[2025-09-24T01:52:36.465Z] [DataStreams] Initializing in single connection mode { origin: 'ws.testnet-dataengine.chain.link' }
[2025-09-24T01:52:36.919Z] [DataStreams] Connection conn-0 established to ws.testnet-dataengine.chain.link {
connectionId: 'conn-0',
host: 'ws.testnet-dataengine.chain.link',
oldState: 'connecting',
newState: 'connected',
reason: 'WebSocket connection established'
}
[2025-09-24T01:52:36.921Z] [DataStreams] Stream connected successfully with 1 origins: wss://ws.testnet-dataengine.chain.link
✅ Connected! Listening for reports...
📈 Report #1 - 2025-09-24T01:52:37.639Z
Raw Report Blob: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de70cf179baa632f18313e54bd690000000000000000000000000000000000000000000000000000000001fb0b3d000000000000000000000000000000000000000000000000000000030000000100000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000068d34ee50000000000000000000000000000000000000000000000000000000068d34ee500000000000000000000000000000000000000000000000000004566c7d15bf000000000000000000000000000000000000000000000000000346f5bdd1b7f7f0000000000000000000000000000000000000000000000000000000068fadbe50000000000000000000000000000000000000000000000e3535f4d688742fe400000000000000000000000000000000000000000000000e35216a3ce81ab00000000000000000000000000000000000000000000000000e355718e650ceb8c00000000000000000000000000000000000000000000000000000000000000000264deb1e4d7485843f79f802f8ffd29fa395c45a8f4c10d7771d91fb5c6c7f55bdf3b496949277bdba413cf9a07adf62d4ffac8d04572455c8707a578eec7988f000000000000000000000000000000000000000000000000000000000000000221d03f390a3a8bf14406a5026bca8a569908abd0e3bfc1152e0ee94cf8087b29648cffd6e505f392678ad3c2488862dc6016755aec3076e64fa68d13705a5980
Report Metadata:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1758678757
Observations: 1758678757
Decoded Data:
Native Fee: 76307741367280
LINK Fee: 14759139131228031
Expires At: 1761270757
Price: 4193418510271345000000
Bid Price: 4193326000000000000000
Ask Price: 4193567763462320000000
--------------------------------------------------
📈 Report #2 - 2025-09-24T01:52:38.352Z
Raw Report Blob: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de70cf179baa632f18313e54bd690000000000000000000000000000000000000000000000000000000001fb0b3f000000000000000000000000000000000000000000000000000000030000000100000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000068d34ee60000000000000000000000000000000000000000000000000000000068d34ee600000000000000000000000000000000000000000000000000004567035aadb500000000000000000000000000000000000000000000000000346f5006b572370000000000000000000000000000000000000000000000000000000068fadbe60000000000000000000000000000000000000000000000e354601f8dd7842d400000000000000000000000000000000000000000000000e35343330ef5aa80000000000000000000000000000000000000000000000000e3576b7b7e0d95bc0000000000000000000000000000000000000000000000000000000000000000029bbebb9390dfd84074ef0078b3b354a8c3f53915d24b664095b4360b20e495cfc1ee156fb32ffae29390a891b905b96fec0e949dbca19b7eded60a2178a0b7570000000000000000000000000000000000000000000000000000000000000002119306af79fb631fe3d281475b32e9e2c293fd6b544d4f5d44d74d9827d8897729c2bc6d5df3033d6b90aeb22df31f484f5ed7da9602c790213d3e12144544e1
Report Metadata:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1758678758
Observations: 1758678758
Decoded Data:
Native Fee: 76308740222389
LINK Fee: 14759088289575479
Expires At: 1761270758
Price: 4193490798923085000000
Bid Price: 4193410600000000000000
Ask Price: 4193710169017200000000
--------------------------------------------------
📈 Report #3 - 2025-09-24T01:52:39.274Z
Raw Report Blob: 0x00090d9e8d96765a0c49e03a6ae05c82e8f8de70cf179baa632f18313e54bd690000000000000000000000000000000000000000000000000000000001fb0b42000000000000000000000000000000000000000000000000000000030000000100000000000000000000000000000000000000000000000000000000000000e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000028000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000120000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba7820000000000000000000000000000000000000000000000000000000068d34ee70000000000000000000000000000000000000000000000000000000068d34ee70000000000000000000000000000000000000000000000000000456689e0e86300000000000000000000000000000000000000000000000000346d7b2d006b3a0000000000000000000000000000000000000000000000000000000068fadbe70000000000000000000000000000000000000000000000e355ee07f0ab7f00000000000000000000000000000000000000000000000000e353c9352421f6e2000000000000000000000000000000000000000000000000e3565e1f7be6e49700000000000000000000000000000000000000000000000000000000000000000245ddf834b20cdd13c559fdd131c27f3c52d38ba6a456d6064a64269b703bc2012d0ffbb4ab3df623c0bdfba83004ef4db50ea351314b648153d2f6002f78cb4e00000000000000000000000000000000000000000000000000000000000000026a6bbed7e57969d56e6485682afc481f8f8127f292e4bd300f6bb0a57b77b21a2ca0e3338ba16b0afbbae567fc4a728c4b95c2092169f412ae7b8fc53c81342a
Report Metadata:
Feed ID: 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782
Valid From: 1758678759
Observations: 1758678759
Decoded Data:
Native Fee: 76306702198883
LINK Fee: 14757074592361274
Expires At: 1761270759
Price: 4193602800000000000000
Bid Price: 4193448319936840000000
Ask Price: 4193634351084156000000
--------------------------------------------------
[...additional reports...]
```
Your application has successfully subscribed to the report data.
[Learn more about the decoded report details](#decoded-report-details).
### Decoded report details
The decoded report details include:
| Attribute | Value | Description |
| ------------------------ | -------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `Stream ID` | `0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782` | The unique identifier for the stream. In this example, the stream is for ETH/USD. |
| `Observations Timestamp` | `1758588884` | The timestamp indicating when the data was captured. |
| `Benchmark Price` | `4195298740000000000000` | The observed price in the report, with 18 decimals. For readability: `4,195.29874` USD per ETH. |
| `Bid` | `4194910430000000000000` | The highest price a buyer is willing to pay for an asset, with 18 decimals. For readability: `4,194.91043` USD per ETH. Learn more about the [Bid price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| `Ask` | `4195448050000000000000` | The lowest price a seller is willing to accept for an asset, with 18 decimals. For readability: `4,195.44805` USD per ETH. Learn more about the [Ask price](/data-streams/concepts/liquidity-weighted-prices). (For [DEX State Price streams](/data-streams/concepts/dex-state-price-streams), this value equals `Benchmark Price`.) |
| `Valid From Timestamp` | `1758588884` | The start validity timestamp for the report, indicating when the data becomes relevant. |
| `Expires At` | `1761180884` | The expiration timestamp of the report, indicating the point at which the data becomes outdated. |
| `Link Fee` | `14805490063735767` | The fee to pay in LINK tokens for the onchain verification of the report data, with 18 decimals. For readability: `0.014805490063735767` LINK. **Note:** This example fee is not indicative of actual fees. |
| `Native Fee` | `76275680556912` | The fee to pay in the native blockchain token (e.g., ETH on Ethereum) for the onchain verification of the report data, with 18 decimals. **Note:** This example fee is not indicative of actual fees. |
For descriptions and data types of other report schemas, see the [Report Schema Overview](/data-streams/reference/report-schema-overview).
### Subscribing to multiple streams
You can subscribe to multiple streams by providing a comma-separated list of stream IDs as a command-line argument:
```bash
npx tsx stream.ts 0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782,0x00036fe43f87884450b4c7e093cd5ed99cac6640d8c2000e6afc02c8838d0265
```
This will subscribe to both ETH/USD and BTC/USD streams.
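Before passing the stream IDs to the SDK, the comma-separated argument needs to be split into an array. A minimal sketch of such a helper (`parseStreamIds` is a hypothetical name for illustration, not part of the SDK):

```typescript
// Hypothetical helper: split a comma-separated CLI argument into stream IDs
// and validate that each one is a 32-byte hex feed ID (0x + 64 hex characters).
function parseStreamIds(arg: string): string[] {
  const ids = arg
    .split(",")
    .map((id) => id.trim())
    .filter((id) => id.length > 0)
  for (const id of ids) {
    if (!/^0x[0-9a-fA-F]{64}$/.test(id)) {
      throw new Error(`Invalid stream ID: ${id}`)
    }
  }
  return ids
}
```

The resulting array can then be passed to the stream creation call, so the same script works with one stream or several.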
### High Availability (HA) mode
The example above demonstrates streaming data from a single crypto stream. For production environments, especially when subscribing to multiple streams, it's recommended to enable [High Availability (HA) mode](/data-streams/reference/data-streams-api/ts-sdk#high-availability-mode).
High Availability (HA) mode creates multiple WebSocket connections to different origin endpoints for improved reliability. When HA mode is enabled, the stream maintains at least two concurrent connections to different instances to ensure high availability and fault tolerance, and to minimize the risk of report gaps.
#### Enabling HA mode
To enable HA mode in your streaming application, make these changes to the basic example:
```typescript
// ... existing code ...
const client = createClient({
apiKey: process.env.API_KEY || "YOUR_API_KEY",
userSecret: process.env.USER_SECRET || "YOUR_USER_SECRET",
endpoint: "https://api.dataengine.chain.link", // Mainnet endpoint
wsEndpoint: "wss://ws.dataengine.chain.link", // Single endpoint (mainnet only)
haMode: true, // Enable High Availability mode
// Optional: Advanced connection monitoring with origin tracking
connectionStatusCallback: (isConnected, host, origin) => {
const timestamp = new Date().toISOString().substring(11, 19)
const status = isConnected ? "🟢 UP" : "🔴 DOWN"
console.log(`[${timestamp}] ${status} ${host}${origin || ""}`)
// Example: Send alerts for specific origins
if (!isConnected && origin) {
console.warn(`⚠️ Alert: Origin ${origin} on ${host} went offline`)
}
},
logging: {
logger: console,
logLevel: LogLevel.INFO,
},
})
// ... existing code ...
```
When `haMode` is `true`, the SDK automatically discovers multiple origin endpoints behind the single URL and establishes separate connections to each origin. You must also use a mainnet endpoint, as HA mode is not currently supported on testnet.
The optional `connectionStatusCallback` can be used to integrate with external monitoring systems. The SDK already provides comprehensive connection logs, so this callback is primarily useful for custom alerting or metrics collection.
See more details about HA mode in the [SDK Reference](/data-streams/reference/data-streams-api/ts-sdk#high-availability-mode).
### Payload for onchain verification
In this tutorial, you logged and decoded the `full_report` payloads to extract the report data. However, in a production environment, you should verify the data to ensure its integrity and authenticity.
Refer to the [Verify report data onchain](/data-streams/tutorials/evm-onchain-report-verification) tutorial to learn more.
## Explanation
### Establishing a WebSocket connection and listening for reports
Your application uses the `createClient` function from the [Data Streams SDK](/data-streams/reference/data-streams-api/ts-sdk) to create a client, then uses `client.createStream()` to establish a real-time WebSocket connection with the Data Streams Aggregation Network.
Once the WebSocket connection is established, your application subscribes to one or more streams by passing an array of feed IDs to the `createStream` function. This subscription lets the client receive real-time updates whenever new report data is available for the specified streams.
For further reference, see the [WebSocket Interface](/data-streams/reference/data-streams-api/interface-ws) section of the SDK Reference.
### Event-driven streaming
The TypeScript SDK uses an event-driven approach for handling streaming data:
- **Connection events**: The [stream emits `connected`, `disconnected`, and `reconnecting` events](/data-streams/reference/data-streams-api/ts-sdk#streaming) to track connection status.
- **Report events**: When new reports arrive, the `report` [event is triggered](/data-streams/reference/data-streams-api/ts-sdk#streaming) with the decoded report data.
- **Error handling**: The stream [emits `error` events](/data-streams/reference/data-streams-api/ts-sdk#error-handling) for any issues that occur during streaming.
- **Metrics**: [The `getMetrics()` method provides](/data-streams/reference/data-streams-api/ts-sdk#metrics-streamgetmetrics) real-time statistics about the stream performance.
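The event-driven pattern above can be illustrated with Node's built-in `EventEmitter`. This is a stand-in sketch only: the emitter below mimics the stream object returned by `client.createStream()`, and the simulated payload shape is illustrative, not the SDK's exact report type.

```typescript
import { EventEmitter } from "node:events"

// Stand-in for the stream returned by client.createStream(); the event
// names ("connected", "report", "error") match those listed above.
const stream = new EventEmitter()
const log: string[] = []

stream.on("connected", () => log.push("connected"))
stream.on("report", (report: { feedId: string; price: bigint }) => {
  // Process each decoded report as it arrives.
  log.push(`report ${report.feedId} price=${report.price}`)
})
stream.on("error", (err: Error) => {
  // Errors are surfaced as events; the stream keeps running.
  log.push(`error ${err.message}`)
})

// Simulate the lifecycle the real SDK would drive over WebSocket.
stream.emit("connected")
stream.emit("report", {
  feedId: "0x000359843a543ee2fe414dc14c7e7920ef10f4372990b79d6361cdc0dd1ba782",
  price: 4193418510271345000000n,
})
stream.emit("error", new Error("transient network issue"))
```

Because handlers are registered per event, connection monitoring, report processing, and error handling stay decoupled and can each evolve independently.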
### Decoding and processing reports
As data reports arrive via the established WebSocket connection, they are processed in real-time:
- **Automatic decoding**: The SDK's `decodeReport` function automatically [detects the report version](/data-streams/reference/data-streams-api/ts-sdk#report-format) and parses the raw data into a structured format.
- **Real-time processing**: Each received report triggers the `report` event handler, where you can process, log, or store the decoded data.
- **Error resilience**: Individual report processing errors don't interrupt the stream, allowing continuous operation.
### Handling the decoded data
In this example, the application logs the structured report data to the terminal. However, this data can be used for further processing, analysis, or display in your own application. The decoded data includes essential information such as benchmark prices, bid/ask spreads, and fee data for onchain verification.
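Since prices in the decoded reports carry 18 decimals, a common processing step is converting them into human-readable strings. A minimal sketch, assuming the price arrives as a `bigint` (`formatPrice` is a hypothetical helper, not part of the SDK):

```typescript
// Convert a non-negative 18-decimal integer price, as delivered in decoded
// reports, into a human-readable decimal string.
function formatPrice(raw: bigint, decimals = 18): string {
  const base = 10n ** BigInt(decimals)
  const whole = raw / base
  // Pad the fractional part to full width, then trim trailing zeros.
  const frac = (raw % base).toString().padStart(decimals, "0").replace(/0+$/, "")
  return frac.length > 0 ? `${whole}.${frac}` : whole.toString()
}

formatPrice(4195298740000000000000n) // "4195.29874"
```

Keeping the raw `bigint` for computation and formatting only at the display boundary avoids the precision loss of converting to a JavaScript `number`.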
For more information about SDK streaming configuration and advanced options, see the [SDK Reference](/data-streams/reference/data-streams-api/ts-sdk).