v0.3.0 is live

Query live Stellar data with one command.

Unix pipes for blockchain indexing. No complex infrastructure required.

user@obsrvr:~
nebu fetch 60200000 --follow | token-transfer | nats-sink
# Streaming live events to NATS...
{ "type": "transfer", "asset": "USDC", "amount": 100.00 }
{ "type": "swap", "pair": "XLM/USDC", "price": 0.114 }

Blazing Fast

Written in Go. Decouples fetching from processing. Backfill 5 years of history in hours, not months, by parallelizing fetch workers.
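The parallel backfill pattern can be sketched with standard shell tools. The ledger range, chunk size, and worker count below are illustrative, and nebu fetch START END follows the usage shown in Getting Started:

```shell
# Split a large ledger range into chunks and run 8 fetch workers in
# parallel. Chunk size and worker count here are illustrative.
START=60000000
END=60200000
CHUNK=50000
seq "$START" "$CHUNK" "$((END - 1))" | \
  xargs -P 8 -I{} sh -c 'nebu fetch {} $(( {} + '"$CHUNK"' - 1 )) > "chunk-{}.xdr"'
```

Each worker writes its own chunk file, so a failed chunk can be re-fetched without restarting the whole backfill.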

The Unix Way

No heavy databases required. nebu respects stdin and stdout. Pipe directly into DuckDB for instant SQL analytics, into jq for filtering, or into any tool you want.

SaaS Ready (NATS)

Bridge the gap between CLI and Cloud. Use nats-sink to turn your local pipeline into a distributed, real-time firehose for your API.
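A minimal sketch of that bridge, assuming a reachable NATS server; the subject name is illustrative, and the consumer side uses the standard nats CLI rather than anything nebu-specific:

```shell
# Publish transfer events from the local pipeline to a NATS subject
# (subject name is illustrative):
nebu fetch 60200000 --follow | token-transfer | \
  nats-sink --subject "stellar.transfers"

# Anywhere else -- another terminal, another machine -- consume the firehose:
nats sub "stellar.transfers"
```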

The Pipeline Architecture

SOURCE
nebu fetch
Pulls XDR from RPC/GCS
PROCESSOR
token-transfer
Parses XDR -> JSON
SINK
nats-sink
Publishes to Subject
Pipe anything. Replace any component. Build what you need.
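The three stages compose with ordinary pipes, and any stage can be swapped independently. A sketch using components from this page, writing to a local JSONL file instead of NATS:

```shell
# Same source and processor, different sink: swap nats-sink for
# json-file-sink to land events in a local file.
nebu fetch 60200000 --follow | token-transfer | \
  json-file-sink --out transfers.jsonl
```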

Installation

Get nebu installed on your system in seconds

$ go install github.com/withObsrvr/nebu/cmd/nebu@latest

Requires Go 1.23 or later. After installation, the nebu command will be available in your $GOPATH/bin.
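If your shell can't find nebu after installing, make sure Go's bin directory is on your PATH. A common setup step, assuming Go's default GOPATH of $HOME/go when the variable is unset:

```shell
# Add Go's bin directory to PATH; $HOME/go is the default when GOPATH is unset.
export PATH="$PATH:${GOPATH:-$HOME/go}/bin"
```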

Install processors on demand with nebu install <processor>

Getting Started

Common commands to get you up and running with nebu

Fetch raw ledgers

Separate fetch from processing

$ nebu fetch 60200000 60200100 > ledgers.xdr
$ cat ledgers.xdr | token-transfer | jq

Historical archives

Read from GCS/S3 for your data lakehouse

$ nebu fetch --mode archive \
--bucket-path "my-bucket/stellar/ledgers" \
60200000 60300000 | gzip > historical.xdr.gz

Premium RPC

Use authenticated endpoints

$ export NEBU_RPC_AUTH="Api-Key YOUR_API_KEY"
$ token-transfer --start-ledger 60200000 --end-ledger 60200100 \
--rpc-url https://rpc-pubnet.nodeswithobsrvr.co

Complete pipeline

Extract → filter → dedupe → store

$ token-transfer --start-ledger 60200000 --follow | \
jq -c 'select(.transfer.asset.issuedAsset.assetCode == "USDC")' | \
dedup --key meta.txHash | \
json-file-sink --out usdc-transfers.jsonl

Manage processors

List and install components

$ nebu list
$ nebu install token-transfer
$ nebu install json-file-sink

Extract token transfers

Stream events from Stellar ledgers

$ token-transfer --start-ledger 60200000 --end-ledger 60200100

Filter with jq

USDC transfers only

$ token-transfer --start-ledger 60200000 --end-ledger 60200100 | \
jq 'select(.transfer.asset.issuedAsset.assetCode == "USDC")'

Stream continuously

Like tail -f

$ token-transfer --start-ledger 60200000 --follow

Multiple destinations

Fan-out with tee

$ token-transfer --start-ledger 60200000 --end-ledger 60200100 | \
tee >(nats-sink --subject "stellar.transfers" --jetstream) | \
tee >(json-file-sink --out transfers.jsonl) | \
jq -r '"Ledger \(.meta.ledgerSequence): \(.transfer.amount)"'

SQL analytics

Aggregate with DuckDB

$ token-transfer --start-ledger 60200000 --end-ledger 60200100 | \
duckdb -c "
SELECT
json_extract_string(transfer, '$.asset.issuedAsset.assetCode') as asset,
COUNT(*) as count,
SUM(CAST(json_extract_string(transfer, '$.amount') AS DOUBLE)) as volume
FROM read_json('/dev/stdin')
WHERE transfer IS NOT NULL
GROUP BY asset
ORDER BY volume DESC
"