API Comparison

To choose the right Bitquery API offering, it helps to understand their differences. The table below summarizes key features and capabilities of GraphQL Query vs GraphQL Subscription vs Kafka Streams:

| Feature / Method | GraphQL Query | GraphQL Subscription (WebSocket) | Kafka Streams (JSON / Protobuf) |
|---|---|---|---|
| Use case | On-demand historical + real-time via polling | Real-time pushes for new on-chain events | High-throughput, event-driven pipelines |
| Data model | Flexible GraphQL with filtering, aggregation | Same GraphQL query syntax, live updates | Topic-based structured streams with JSON or Protobuf |
| Latency | ~1 sec | ~1 sec | Sub-second (streamed within 500 ms); supports HFT |
| Delivery model | Pull via REST/GraphQL | Push over WebSocket (GraphQL subscription) | Consumer-driven pull from Kafka broker |
| Schema granularity | Customizable projections | Customizable projections | Kafka topics per type (e.g., dextrades, transactions) |
| Ordering & duplication | Strong consistency in query response | No guaranteed order; WebSocket delivery | Consumer logic needed |
| Authentication | OAuth / API key | OAuth token over WebSocket | SASL username + password |
| Ideal implementation language | Any language | Any language except curl | Go, Java, Python, Rust |
| Best for | Interactive apps, ad-hoc querying | Monitoring, alerts, dashboards, mempool events | Streaming pipelines, HFT, data lakes, analytics workloads |

When to Use Each API

GraphQL Queries

Best for on-demand, ad-hoc, and historical data needs:

  • When you need to fetch past blockchain activity (trades, transfers, balances)
  • For dashboards, reports, and analytics that rely on filtering, sorting, and pagination
  • When requests are driven by user actions or scheduled jobs
  • Ideal for combining historical + near-real-time data via polling
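As a minimal sketch of the query workflow, the snippet below packages a GraphQL query as a standard HTTP POST. The endpoint URL, the schema fields in the query, and the token placement are assumptions for illustration; check the Bitquery docs for the exact schema of the chain you target.

```python
import json
import urllib.request

# Assumed endpoint for illustration; confirm against the Bitquery docs.
ENDPOINT = "https://streaming.bitquery.io/graphql"

# Hypothetical query shape: fetch the most recent DEX trades on Ethereum.
TRADES_QUERY = """
query RecentTrades($limit: Int) {
  EVM(network: eth) {
    DEXTrades(limit: {count: $limit}, orderBy: {descending: Block_Time}) {
      Block { Time }
      Trade { Buy { Amount Currency { Symbol } } }
    }
  }
}
"""

def build_request(token: str, limit: int = 10) -> urllib.request.Request:
    """Package the query and variables as a GraphQL POST request."""
    payload = json.dumps({"query": TRADES_QUERY, "variables": {"limit": limit}})
    return urllib.request.Request(
        ENDPOINT,
        data=payload.encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # OAuth token / API key
        },
    )

if __name__ == "__main__":
    req = build_request("YOUR_TOKEN")
    # Uncomment to actually send the request:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp))
```

The same request can be re-run on a schedule to approximate near-real-time data via polling, as noted above.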

GraphQL Subscriptions (WebSocket)

Ideal for lightweight real-time updates and UI integration:

  • When you want push-based delivery of live on-chain events (e.g., swaps, transfers, new blocks)
  • For wallet trackers, price alerts, or monitoring dashboards
  • When you need low-latency updates (around a second) but don’t require ultra-high throughput
  • Perfect for embedding live feeds into web or mobile interfaces
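A subscription rides on a WebSocket handshake. The helpers below build the two frames of the widely used graphql-transport-ws protocol (`connection_init`, then `subscribe`); the subscription fields and the token placement are assumptions here, so verify the exact auth mechanism in the Bitquery docs before wiring them into a WebSocket client.

```python
import json

# Hypothetical subscription shape: live DEX trades on Ethereum.
SWAP_SUBSCRIPTION = """
subscription {
  EVM(network: eth) {
    DEXTrades { Trade { Buy { Amount Currency { Symbol } } } }
  }
}
"""

def init_message(token: str) -> str:
    """First frame sent after the WebSocket opens (connection_init)."""
    return json.dumps({"type": "connection_init", "payload": {"token": token}})

def subscribe_message(sub_id: str = "1") -> str:
    """Starts the subscription; the server then pushes a 'next' frame per event."""
    return json.dumps({
        "id": sub_id,
        "type": "subscribe",
        "payload": {"query": SWAP_SUBSCRIPTION},
    })
```

With a client such as the `websockets` package, you would send `init_message(...)`, wait for the server's `connection_ack`, then send `subscribe_message(...)` and consume pushed events in a loop.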

Kafka Streams (JSON / Protobuf)

Perfect for high-throughput, fault-tolerant streaming pipelines:

  • When you need sub-second end-to-end latency for trades, mempool events, or order books
  • For building scalable data lakes, HFT bots, or real-time analytics systems
  • When you require integration with Kafka-based ecosystems
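The Kafka path is plain consumer-driven pull with SASL authentication, per the table above. The sketch below shows one way to configure that with the `confluent-kafka` client; the broker address, topic name, and SASL mechanism are placeholders, so substitute the values Bitquery provisions for your account.

```python
# Placeholders throughout: broker host, topic, and SASL mechanism are
# illustrative, not Bitquery's actual values.

def consumer_config(username: str, password: str, group_id: str) -> dict:
    """SASL settings matching the authentication row in the comparison table."""
    return {
        "bootstrap.servers": "kafka.example.bitquery.io:9093",  # placeholder
        "group.id": group_id,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "SCRAM-SHA-512",  # assumed mechanism
        "sasl.username": username,
        "sasl.password": password,
        "auto.offset.reset": "latest",
    }

def run(username: str, password: str, topic: str = "eth.dextrades") -> None:
    # Third-party client (pip install confluent-kafka); imported lazily so the
    # config helper above stays stdlib-only.
    from confluent_kafka import Consumer

    consumer = Consumer(consumer_config(username, password, "my-group"))
    consumer.subscribe([topic])
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            # Each record is a JSON- or Protobuf-encoded event; decode per topic.
            print(msg.value()[:120])
    finally:
        consumer.close()
```

Because delivery is consumer-driven, your application controls offsets and parallelism via consumer groups, which is what makes this path suit fault-tolerant, high-throughput pipelines; note the table's caveat that ordering and deduplication are the consumer's responsibility.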

Pick Queries for flexibility and history, Subscriptions for easy real-time UI feeds, and Kafka Streams when you need industrial-scale, ultra-low-latency pipelines.