REST API service that provides endpoints for querying indexed blockchain data. Built with Fiber framework and includes comprehensive Swagger documentation.
Features:
- Account data queries
- Block information retrieval
- Module transaction tracking
- NFT data and transaction history
- Governance proposal information
- Transaction details and history
- Validator information and metrics
- Health check endpoints
- CORS support and request logging
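The health-check and CORS features can be sketched framework-agnostically with Go's standard net/http; the actual service is built with Fiber, and the route, port, and handler names here are illustrative assumptions, not the service's real API.

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/http/httptest"
)

// healthPayload is the body returned by the health endpoint; a real
// check might also verify database connectivity before reporting "ok".
func healthPayload() map[string]string {
	return map[string]string{"status": "ok"}
}

// healthHandler serves the liveness payload as JSON.
func healthHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(healthPayload())
}

// withCORS wraps a handler with permissive CORS headers and answers
// preflight OPTIONS requests directly.
func withCORS(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Access-Control-Allow-Origin", "*")
		w.Header().Set("Access-Control-Allow-Methods", "GET, OPTIONS")
		if r.Method == http.MethodOptions {
			w.WriteHeader(http.StatusNoContent)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/health", healthHandler)
	// Exercise the stack against a test server instead of binding a port.
	srv := httptest.NewServer(withCORS(mux))
	defer srv.Close()
	resp, err := http.Get(srv.URL + "/health")
	if err != nil {
		panic(err)
	}
	fmt.Println(resp.Status)
}
```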
Specialized indexer for blockchain events with comprehensive event tracking capabilities. Processes transaction events, block events, and Move smart contract events.
Features:
- Transaction event processing and indexing
- Block finalization event capture
- Move smart contract event tracking
- Database migration management
- Data pruning with cloud storage backup
- Command-line interface with multiple modes
Commands:
- indexer - Main indexing process
- migrate - Database migration operations
- prunner - Data pruning and archival
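A minimal sketch of how the three modes above might be dispatched from the command line; the mode names match the list, but the dispatch function and its messages are placeholders, not the real implementations.

```go
package main

import (
	"fmt"
	"os"
)

// dispatch maps a subcommand name to its entry point. The command names
// mirror the modes listed above; the bodies are illustrative placeholders.
func dispatch(cmd string) (string, error) {
	switch cmd {
	case "indexer":
		return "running main indexing process", nil
	case "migrate":
		return "applying database migrations", nil
	case "prunner":
		return "pruning and archiving old data", nil
	default:
		return "", fmt.Errorf("unknown command: %s", cmd)
	}
}

func main() {
	cmd := "indexer"
	if len(os.Args) > 1 {
		cmd = os.Args[1]
	}
	msg, err := dispatch(cmd)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(msg)
}
```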
General-purpose blockchain data indexer with both continuous and scheduled processing capabilities. Handles comprehensive blockchain state tracking and account management.
Features:
- Continuous block processing
- Cron-based batch operations
- Account data indexing
- Block result processing
- Validator state tracking
- Batch insertion optimization
- Multi-mode operation support
Commands:
- indexer - Continuous indexing mode
- indexercron - Scheduled batch processing mode
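The two modes can be contrasted in a small sketch: continuous indexing processes blocks back-to-back, while the cron mode batches work on a fixed tick. All names, heights, and intervals below are illustrative.

```go
package main

import (
	"fmt"
	"time"
)

// processBlock stands in for the real per-block indexing work.
func processBlock(height int64) {}

// runContinuous indexes blocks back-to-back until stop is closed,
// returning the next height that would have been processed.
func runContinuous(start int64, stop <-chan struct{}) int64 {
	height := start
	for {
		select {
		case <-stop:
			return height
		default:
			processBlock(height)
			height++
		}
	}
}

// runCron performs one batch per tick instead of per block; the interval
// is in milliseconds so the schedule stays configurable.
func runCron(intervalMS, rounds int) int {
	ticker := time.NewTicker(time.Duration(intervalMS) * time.Millisecond)
	defer ticker.Stop()
	done := 0
	for i := 0; i < rounds; i++ {
		<-ticker.C
		done++ // one scheduled batch per tick
	}
	return done
}

func main() {
	stop := make(chan struct{})
	go func() { time.Sleep(time.Millisecond); close(stop) }()
	fmt.Println("stopped at height", runContinuous(1, stop))
	fmt.Println("cron batches run:", runCron(1, 3))
}
```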
Comprehensive blockchain data processor with specialized module processors for different blockchain components. Features advanced state tracking and caching mechanisms.
Features:
- Modular processor architecture (auth, bank, IBC, move, OPinit, gov, staking)
- State tracking and management
- Data caching for performance optimization
- Genesis block processing
- Event processing utilities
- Validator uptime tracking
- Batch state updates
Commands:
- indexer - Main processing engine
- migrate - Database schema management
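The modular processor architecture can be sketched as a registry keyed by module name, routing each event to the processor that owns it. The Processor interface and the bank/gov stand-ins below are illustrative, not the real processor APIs.

```go
package main

import "fmt"

// Event is a minimal stand-in for a decoded blockchain event.
type Event struct {
	Module string // e.g. "bank", "gov", "staking"
	Data   string
}

// Processor handles events for one blockchain module.
type Processor interface {
	Name() string
	Process(e Event) string
}

type bankProcessor struct{}

func (bankProcessor) Name() string           { return "bank" }
func (bankProcessor) Process(e Event) string { return "bank: indexed transfer " + e.Data }

type govProcessor struct{}

func (govProcessor) Name() string           { return "gov" }
func (govProcessor) Process(e Event) string { return "gov: indexed proposal " + e.Data }

// Registry routes each event to the processor owning its module.
type Registry map[string]Processor

func (r Registry) Register(p Processor) { r[p.Name()] = p }

// Route dispatches the event; ok is false for unregistered modules.
func (r Registry) Route(e Event) (string, bool) {
	p, ok := r[e.Module]
	if !ok {
		return "", false
	}
	return p.Process(e), true
}

func main() {
	reg := Registry{}
	reg.Register(bankProcessor{})
	reg.Register(govProcessor{})
	out, _ := reg.Route(Event{Module: "gov", Data: "42"})
	fmt.Println(out)
}
```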
High-performance data collection service that polls RPC endpoints for new blockchain data and distributes it via message queues.
Features:
- RPC endpoint polling
- Block data retrieval
- Message queue publishing
- Database migration on startup
- Error handling and retry logic
- Configurable polling intervals
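The retry behaviour can be sketched as exponential backoff around a flaky fetch; the attempt count, base delay, and fetchFunc signature are assumptions for illustration, not the sweeper's actual API.

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// errRPC stands in for a transient RPC failure.
var errRPC = errors.New("rpc timeout")

// fetchFunc stands in for an RPC call that may fail transiently.
type fetchFunc func(height int64) (string, error)

// fetchWithRetry retries a flaky fetch with exponential backoff
// between attempts, giving up after maxAttempts failures.
func fetchWithRetry(fetch fetchFunc, height int64, maxAttempts int) (string, error) {
	backoff := time.Millisecond
	var lastErr error
	for attempt := 0; attempt < maxAttempts; attempt++ {
		block, err := fetch(height)
		if err == nil {
			return block, nil
		}
		lastErr = err
		time.Sleep(backoff)
		backoff *= 2 // double the delay after each failure
	}
	return "", fmt.Errorf("giving up after %d attempts: %w", maxAttempts, lastErr)
}

func main() {
	failures := 2
	flaky := func(h int64) (string, error) {
		if failures > 0 {
			failures--
			return "", errRPC
		}
		return fmt.Sprintf("block %d", h), nil
	}
	block, err := fetchWithRetry(flaky, 100, 5)
	fmt.Println(block, err)
}
```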
Message queue consumer that processes transaction response data and uploads it to cloud storage systems with support for large message handling.
Features:
- Kafka message consumption
- Cloud storage integration (GCS)
- Claim check pattern support for large messages
- Dead letter queue error handling
- Retry mechanisms with exponential backoff
- Transaction response archival
- Sentry integration for monitoring
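The claim check pattern can be sketched with an in-memory stand-in for the bucket: oversized payloads are stored externally and replaced on the queue by a reference, which the consumer resolves back to the full payload. The threshold, key scheme, and ObjectStore type below are illustrative.

```go
package main

import "fmt"

// claimCheckThreshold: payloads above this size are stored externally and
// replaced by a reference (the "claim check"); the value is illustrative.
const claimCheckThreshold = 64

// ObjectStore is a stand-in for a cloud bucket such as GCS.
type ObjectStore map[string][]byte

// publish stores oversized payloads in the object store and returns the
// body to put on the queue: either the payload itself or a claim key.
func publish(store ObjectStore, key string, payload []byte) (body string, claimed bool) {
	if len(payload) <= claimCheckThreshold {
		return string(payload), false
	}
	store[key] = payload
	return "claim:" + key, true
}

// consume resolves a claim check back to the full payload if needed.
func consume(store ObjectStore, body string) []byte {
	if len(body) > 6 && body[:6] == "claim:" {
		return store[body[6:]]
	}
	return []byte(body)
}

func main() {
	store := ObjectStore{}
	big := make([]byte, 200)
	body, claimed := publish(store, "tx/123", big)
	fmt.Println(claimed, len(consume(store, body)))
}
```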
The indexer ecosystem consists of multiple specialized services that work together to provide comprehensive blockchain data indexing and querying capabilities.
Core Data Flow:
- Sweeper retrieves new block data from RPC endpoints and publishes it to message queues
- Indexers (Event, Generic, Informative) consume messages and process blockchain data into the database
- API serves the indexed data through REST endpoints
- TX Response Uploader handles transaction response storage in cloud services
- Pruners manage data lifecycle and storage optimization
Sweeper
- On startup, applies any pending database migrations.
- Retrieves the latest indexed block from the database.
- Fetches data for the next block using the RPC methods /block and /block_results.
- Publishes the block data as a message to the message queue.
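The fetch step above can be sketched as building the two RPC queries for each height; the URL shapes follow common CometBFT-style RPC conventions, and the host is illustrative.

```go
package main

import "fmt"

// blockEndpoints builds the two RPC queries the sweeper issues per height.
func blockEndpoints(rpcHost string, height int64) (string, string) {
	blockURL := fmt.Sprintf("%s/block?height=%d", rpcHost, height)
	resultsURL := fmt.Sprintf("%s/block_results?height=%d", rpcHost, height)
	return blockURL, resultsURL
}

// nextHeight returns the block the sweeper should fetch next,
// given the latest height already indexed in the database.
func nextHeight(latestIndexed int64) int64 { return latestIndexed + 1 }

func main() {
	h := nextHeight(1041)
	blockURL, resultsURL := blockEndpoints("http://localhost:26657", h)
	fmt.Println(blockURL)
	fmt.Println(resultsURL)
}
```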
Indexers
- Subscribe to and read messages from the queue.
- Process each message using specialized processors for different blockchain modules.
- Insert processed data into the database with batch operations for efficiency.
- Handle state tracking and caching for optimized performance.
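The batch-insertion step can be sketched as a buffer that flushes once full, trading a little latency for far fewer database round trips; the Batch type and its flush callback are illustrative stand-ins, not the real database layer.

```go
package main

import "fmt"

// Batch accumulates processed rows and flushes them in one operation
// once full; flush stands in for a multi-row INSERT.
type Batch struct {
	size  int
	rows  []string
	flush func(rows []string)
}

func NewBatch(size int, flush func([]string)) *Batch {
	return &Batch{size: size, flush: flush}
}

// Add buffers a row, flushing automatically when the batch is full.
func (b *Batch) Add(row string) {
	b.rows = append(b.rows, row)
	if len(b.rows) >= b.size {
		b.Flush()
	}
}

// Flush writes any buffered rows and clears the buffer.
func (b *Batch) Flush() {
	if len(b.rows) == 0 {
		return
	}
	b.flush(b.rows)
	b.rows = nil
}

func main() {
	inserts := 0
	b := NewBatch(2, func(rows []string) { inserts++ })
	for _, r := range []string{"tx1", "tx2", "tx3"} {
		b.Add(r)
	}
	b.Flush() // drain the final partial batch
	fmt.Println("insert statements:", inserts)
}
```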
Prunner
- Triggers at predefined intervals.
- Checks whether the database requires pruning.
- If pruning is needed:
- Fetches prunable rows from the database.
- Uploads the data to a cloud storage service.
- Deletes the fetched rows from the database.
To run the Informative Indexer with Docker locally, follow this guide.