Build Your First Custom Indexer
Refer to Custom Indexing Framework and Indexer Pipeline Architecture for a conceptual overview of the indexer framework.
Refer to Access Sui Data for an overview of options to access Sui network data.
To build a complete custom indexer, you use the sui-indexer-alt-framework. The steps that follow demonstrate how to create a sequential pipeline that extracts transaction digests from Sui checkpoints and stores them in a local PostgreSQL database. You can find the source code for the framework in the Sui repo on GitHub.
While this example uses PostgreSQL with Diesel (a popular Rust ORM and query builder) for minimalism and out-of-the-box support, the sui-indexer-alt-framework is designed for flexible storage. You can use different databases (such as MongoDB, CouchDB, or similar) or utilize other database clients if you prefer not to use Diesel. To achieve this, implement the framework's Store and Connection traits and define your database write logic directly within your Handler::commit() method.
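The exact Store, Connection, and Handler trait signatures live in the framework source, but the following minimal sketch (with simplified, hypothetical trait and function shapes) illustrates where that custom write logic sits:
use anyhow::Result;
use async_trait::async_trait;

// Hypothetical, simplified stand-in for the framework's Connection trait.
// Check the sui-indexer-alt-framework source for the real signatures.
#[async_trait]
trait Connection: Send {
    async fn write_digests(&mut self, rows: &[(String, i64)]) -> Result<usize>;
}

// Your commit step receives a batch of processed values and a connection
// from your custom store, and decides how to persist them. Any client
// works here: a MongoDB collection, a key-value store, and so on.
async fn commit(values: &[(String, i64)], conn: &mut impl Connection) -> Result<usize> {
    conn.write_digests(values).await
}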
Prerequisites
Before you begin, make sure that:
- PostgreSQL is installed and running locally.
- The Diesel CLI is installed with PostgreSQL support.
- You have read the Custom Indexing Framework and Indexer Pipeline Architecture topics to understand the overall architecture and concurrent pipeline concepts.
Check installation
If you're unsure whether your system has the necessary software properly installed, you can verify installation with the following commands:
$ psql --version
$ diesel --version
What you build
The following steps show how to create an indexer that:
- Connects to Sui Testnet: Uses the remote checkpoint store at https://checkpoints.testnet.sui.io.
- Processes checkpoints: Streams checkpoint data continuously.
- Extracts transaction data: Pulls transaction digests from each checkpoint.
- Stores in local PostgreSQL: Commits data to a local PostgreSQL database.
- Implements sequential pipeline: Uses in-order processing with batching for optimal consistency and performance.
In the end, you have a working indexer that demonstrates all core framework concepts and can serve as a foundation for more complex custom indexers.
Sui provides checkpoint stores for both Mainnet and Testnet.
- Testnet: https://checkpoints.testnet.sui.io
- Mainnet: https://checkpoints.mainnet.sui.io
Step 1: Project setup
First, open your console to the directory where you want to store your indexer project. Use the cargo new command to create a new Rust project, then navigate to its directory.
$ cargo new simple-sui-indexer
$ cd simple-sui-indexer
Step 2: Configure dependencies
Replace the contents of your Cargo.toml file with the following configuration and save it.
[package]
name = "simple-sui-indexer"
version = "0.1.0"
edition = "2021"
[dependencies]
# Core framework dependencies
sui-indexer-alt-framework = { git = "https://github.com/MystenLabs/sui.git", branch = "testnet" }
# Async runtime
tokio = { version = "1.0", features = ["full"] }
# Error handling
anyhow = "1.0"
# Diesel PostgreSQL
diesel = { version = "2.0", features = ["postgres", "r2d2"] }
diesel-async = { version = "0.5", features = ["bb8", "postgres", "async-connection-wrapper"] }
diesel_migrations = "2.0"
# Async traits
async-trait = "0.1"
# URL parsing
url = "2.0"
# Use .env file
dotenvy = "0.15"
# Command line parsing
clap = { version = "4.0", features = ["derive"] }
The manifest now includes the following dependencies:
- sui-indexer-alt-framework: Core framework providing pipeline infrastructure.
- diesel / diesel-async: Type-safe database ORM with asynchronous support.
- tokio: Async runtime required by the framework.
- clap: Command-line argument parsing for configuration.
- anyhow: Error handling.
- async-trait: Trait implementations with async methods.
- url: URL parsing.
- dotenvy: Loads the .env file that stores your PostgreSQL URL.
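For example, dotenvy lets your indexer read the PostgreSQL URL from a .env file at startup. A minimal sketch, assuming a DATABASE_URL entry in .env (the variable name here is an example, not a framework requirement):
// Loads the PostgreSQL URL from a .env file, falling back to the
// process environment if the file is absent.
fn database_url() -> anyhow::Result<String> {
    dotenvy::dotenv().ok(); // Ignore a missing .env file.
    Ok(std::env::var("DATABASE_URL")?)
}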
Step 3: Create database
Before configuring migrations, create and verify your local PostgreSQL database:
$ createdb sui_indexer
Get your connection details:
$ psql sui_indexer -c "\conninfo"
If successful, your console should display a message similar to the following:
You are connected to database "sui_indexer" as user "username" via socket in "/tmp" at port "5432".
If you receive a createdb error similar to the following:
createdb: error: connection to server on socket "/tmp/.s.PGSQL.5432" failed: FATAL: role "username" does not exist
then you need to create the user (replace username with the name provided in your error message):
$ sudo -u postgres createuser --superuser username
Enter your system account password when prompted, then try the createdb command again.
You can now set a shell variable to your database URL, as it's used in the following commands. Make sure to change username to your actual username.
$ PSQL_URL=postgres://username@localhost:5432/sui_indexer
You can now test your connection with the following command:
$ psql $PSQL_URL -c "SELECT 'Connected';"
If successful, your console or terminal should respond with a message similar to the following:
?column?
-----------
Connected
(1 row)
Step 4: Database setup
Before you start coding, make sure you have set up the local PostgreSQL database from the previous step. This is required for the indexer to store the extracted transaction data.
In the following database setup steps, you:
- Create a database table to store the data.
- Use Diesel to manage the process.
- Generate Rust code that maps to the database table.
Step 4.1: Configure Diesel
First, create a diesel.toml file (in the same folder as Cargo.toml) to configure database migrations.
$ touch diesel.toml
Update and save the file with the following code:
[print_schema]
file = "src/schema.rs"
[migrations_directory]
dir = "migrations"
Step 4.2: Create database table using Diesel migrations
Diesel migrations are a way of creating and managing database tables using SQL files. Each migration has two files:
- up.sql: Creates and changes the table.
- down.sql: Removes and undoes the changes.
Use the diesel setup command to create the necessary directory structure, passing your database URL with the --database-url argument.
$ diesel setup --database-url $PSQL_URL
Then, from the root of your project, use the diesel migration command to generate the migration files.
$ diesel migration generate transaction_digests
You should now have a migrations folder in your project. There should be a subdirectory in this folder with the name format YYYY-MM-DD-HHMMSS_transaction_digests. This folder should contain the up.sql and down.sql files.
Open the up.sql file inside that timestamped folder and replace its contents with the following code:
CREATE TABLE IF NOT EXISTS transaction_digests (
    tx_digest TEXT PRIMARY KEY,
    checkpoint_sequence_number BIGINT NOT NULL
);
This example uses the TEXT data type for tx_digest, but best practice for a production indexer is to use the BYTEA data type.
The TEXT type is used to make the transaction digest easily readable and directly usable with external tools. Digests are Base58 encoded, and because PostgreSQL cannot natively display BYTEA data in this format, storing it as TEXT allows you to copy the digest from a query and paste it into an explorer like SuiScan.
For a production environment, however, BYTEA is strongly recommended. It offers superior storage and query efficiency by storing the raw byte representation, which is more compact and significantly faster for comparisons than a string. Refer to Binary data performance in PostgreSQL on the CYBERTEC website for more information.
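If you do opt for BYTEA, you would decode the Base58 digest into raw bytes before writing it. A minimal sketch, assuming the bs58 crate (which is not part of this tutorial's Cargo.toml):
// Sui transaction digests are Base58-encoded 32-byte values; decode
// them to raw bytes before storing them in a BYTEA column.
fn digest_to_bytes(digest: &str) -> anyhow::Result<Vec<u8>> {
    Ok(bs58::decode(digest).into_vec()?)
}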
Save up.sql, then open down.sql to edit. Replace the contents of the file with the following code and save it:
DROP TABLE IF EXISTS transaction_digests;
Step 4.3: Apply migration and generate Rust schema
From the root of your project, use the diesel migration command to create tables.
$ diesel migration run --database-url $PSQL_URL
Then use the diesel print-schema command to generate the schema.rs file from the actual database.
$ diesel print-schema --database-url $PSQL_URL > src/schema.rs
Your src/schema.rs file should now look like the following:
// @generated automatically by Diesel CLI.
diesel::table! {
    transaction_digests (tx_digest) {
        tx_digest -> Text,
        checkpoint_sequence_number -> Int8,
    }
}
After running the previous commands, your project is set up for the next steps:
- PostgreSQL now has a transaction_digests table with the defined columns.
- src/schema.rs contains automatically generated Rust code that represents this table structure.
- You can now write type-safe Rust code that talks to this specific table, as the sketch after this list shows.
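For example, a minimal sketch of such type-safe code using diesel-async (the struct and function names here are illustrative, not part of the framework):
use diesel::prelude::*;
use diesel_async::{AsyncPgConnection, RunQueryDsl};

use crate::schema::transaction_digests;

// Maps one row of the transaction_digests table to a Rust struct.
#[derive(Insertable)]
#[diesel(table_name = transaction_digests)]
pub struct StoredTransactionDigest {
    pub tx_digest: String,
    pub checkpoint_sequence_number: i64,
}

// Inserts a batch of rows, skipping digests that are already stored.
pub async fn insert_digests(
    conn: &mut AsyncPgConnection,
    rows: Vec<StoredTransactionDigest>,
) -> anyhow::Result<usize> {
    Ok(diesel::insert_into(transaction_digests::table)
        .values(rows)
        .on_conflict_do_nothing()
        .execute(conn)
        .await?)
}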
Diesel's migration system evolves the database schema over time in a structured and version-controlled way. For a complete walkthrough, see the official Diesel Getting Started guide.