
X Reply Agent

Our project is an AI agent that uses n8n to catch mentions on X and automatically replies with real-time Hyperliquid insights. By integrating the CoinGecko MCP and Covalent's GoldRush MCP, it delivers price, funding-rate, and open-interest data directly into conversations. This solves the problem of fragmented, delayed information by bringing HL-native analytics to the platforms where the community already engages, such as X. It strengthens Hyperliquid's visibility and culture while giving traders instant, reliable answers.
Learn more on TAIKAI

Team

  • Cem Denizsel (Owner)
  • Dogukan Ali Gundogan (Member)

Categories

  • 02. 🚀 Hyperliquid Frontier Track
  • 17. Best use of GoldRush

Why

  • Faster support: Answer crypto data questions at mention time.
  • Tool-orchestration: Use multiple MCP servers (CoinGecko, GoldRush) via an agent.
  • Hands-free posting: Reply under the original tweet through an X MCP.

Hyperliquid extension goal 🧭

  • Ask-on-X: Community members can mention your account and ask about Hyperliquid topics (e.g., HLP price, token movements, transaction details), receiving answers in-reply, in seconds.
  • Right tool for the job: The agent routes price/market questions to CoinGecko MCP and on‑chain activity questions (balances, transfers, gas, transaction lookups) to the GoldRush MCP.
  • Frictionless onboarding: No dashboards or query builders—natural‑language questions on X become live, contextual responses, increasing Hyperliquid awareness and engagement.
  • Extensible: New Hyperliquid‑specific tools can be added as MCP endpoints, and the agent will automatically consider them.
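The "right tool for the job" routing can be sketched as a simple keyword heuristic. This is a hypothetical illustration only: in the actual agent, the LLM chooses among tools based on their MCP descriptions rather than a hard-coded rule.

```go
package main

import (
	"fmt"
	"strings"
)

// routeTool is a hypothetical heuristic mirroring the routing described above:
// on-chain activity questions go to the GoldRush MCP, price/market questions
// to the CoinGecko MCP. The real agent delegates this choice to the LLM.
func routeTool(question string) string {
	q := strings.ToLower(question)
	for _, kw := range []string{"balance", "transfer", "transaction", "gas", "nft"} {
		if strings.Contains(q, kw) {
			return "goldrush"
		}
	}
	for _, kw := range []string{"price", "market cap", "volume", "funding"} {
		if strings.Contains(q, kw) {
			return "coingecko"
		}
	}
	return "coingecko" // default to market data
}

func main() {
	fmt.Println(routeTool("What's the price of HLP right now?"))     // coingecko
	fmt.Println(routeTool("Show the last 3 ERC20 transfers of HLP")) // goldrush
}
```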

Examples the agent can answer for Hyperliquid users:

  • “What’s the price of HLP right now?” → CoinGecko MCP.
  • “Show the last 3 ERC20 transfers of the HLP contract on ethereum; brief summary.” → GoldRush MCP.
  • “Give me the native balance and recent activity for 0x… on base; 1 line.” → GoldRush MCP.

Components 🧩

  • cmd/mcp-servers/general/coingecko/xmcp (X MCP)
    • Tool: "twitter.post_reply" (posts under a tweet). Requires X auth (Bearer or OAuth1).
  • cmd/mcp-servers/general/coingecko/cgproxy (CoinGecko Proxy MCP)
    • Bridges HTTP MCP to an upstream stdio CoinGecko MCP (local via "npx @coingecko/coingecko-mcp" or remote via "mcp-remote").
    • Discovers and forwards all upstream tools.
  • cmd/mcp-servers/general/goldrush (GoldRush MCP)
    • Adds tools for on-chain data (Covalent GoldRush): balances, transactions, gas, NFTs, token holders, etc.
    • Requires "GOLDRUSH_AUTH_TOKEN" and often an allow-list (IP) on Covalent.
  • cmd/agent (LangChainGo ReAct agent)
    • Discovers tools from HTTP MCP servers: CoinGecko + GoldRush + X poster.
    • Runs with OpenAI (or compatible) model. Sanitizes final answer to avoid tool chatter.
    • If `-reply-to` is provided, posts the answer via X MCP and exits non-zero on failures.
  • cmd/bot (HTTP server)
    • Endpoints: `GET /healthz`, `POST /mentions`.
    • Always runs in agent mode: set `AGENT_CMD`; the bot spawns the agent per mention.

n8n integration 🧰

Payloads accepted by "POST /mentions"

  • Single object ("MentionsPayload") or array of objects.

Example (single):

{
  "count": 1,
  "mentions": [
    {
      "tweet_id": "1958197635065463114",
      "text": "what’s the price of $HLP on Hyperliquid right now?",
      "author_id": "123",
      "author_username": "alice",
      "conversation_id": "1956374656836907309",
      "created_at": "2025-08-15T10:00:00.000Z"
    }
  ],
  "meta": {}
}
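The payload above maps onto Go types like the following sketch. Field names are inferred from the example JSON; the repository's actual `MentionsPayload` definition may differ.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Mention mirrors one entry of the "mentions" array in the example payload.
type Mention struct {
	TweetID        string `json:"tweet_id"`
	Text           string `json:"text"`
	AuthorID       string `json:"author_id"`
	AuthorUsername string `json:"author_username"`
	ConversationID string `json:"conversation_id"`
	CreatedAt      string `json:"created_at"`
}

// MentionsPayload mirrors the single-object form accepted by POST /mentions.
type MentionsPayload struct {
	Count    int            `json:"count"`
	Mentions []Mention      `json:"mentions"`
	Meta     map[string]any `json:"meta"`
}

// parsePayload decodes a single-object payload body.
func parsePayload(raw string) (MentionsPayload, error) {
	var p MentionsPayload
	err := json.Unmarshal([]byte(raw), &p)
	return p, err
}

func main() {
	raw := `{"count":1,"mentions":[{"tweet_id":"1958197635065463114","text":"what's the price of $HLP?","author_username":"alice"}],"meta":{}}`
	p, err := parsePayload(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(p.Count, p.Mentions[0].AuthorUsername) // 1 alice
}
```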

Quick Setup 🚀

To get started with X Reply Agent:

  1. Clone the repository:
    git clone https://github.com/DogukanGun/XReplyAgent.git
  2. Set up environment variables:
    export AGENT_CMD="$(pwd)/agent"
    export AGENT_CG_MCP_HTTP="http://localhost:8082/mcp"
    export AGENT_X_MCP_HTTP="http://localhost:8081/mcp"
    export AGENT_GOLDRUSH_MCP_HTTP="http://localhost:8083/mcp"
    export OPENAI_API_KEY="your_openai_key"
  3. Install dependencies and build the project
  4. Start the bot server and required MCP servers

For detailed setup instructions and configuration options, visit our GitHub repository.