Scuba Docs

Segment integration cookbook

This applies to v2.24

Segment allows you to manage integrations between your data sources and your analytics platform (Interana). You can use Segment's API and libraries to track events and actors, then send that data to Interana.

See Ingest data from Segment into Interana for information about integrating Interana version 3.x with Segment.

In Interana version 2.x, you can use webhooks to connect your Segment data source to Interana for data ingestion as follows:

  1. Install and configure your Interana cluster, including the Listener node. You must assign a public IP address to your Listener node (or nodes, if you have multiple).
  2. Run the following at the command line to create your dataset and set up the data ingest pipeline: import_server/tools/ -t <table_name> -k <topic_name> --shard_column userId --time_column timestamp --shard_type <int or string> --time_format "%Y-%m-%dT%H:%M:%S.%fZ"
    • Where <table_name> is the name of the dataset that you want to create
    • <topic_name> is any name that is meaningful to you. It is used by the Interana Listener node, which stores incoming data in categories called topics.
    • userId and timestamp are the default columns that Segment creates
    • <shard_type> is either int or string, depending on the datatype of the shard column (userId)
  3. Log in to your Segment account.
  4. Set up a Segment Source, if you don't already have one.
  5. Select Add Integration, then click the Webhooks tile in the Integrations Catalog.
  6. Go to the Settings tab.
  7. Edit the Webhooks connection settings to set the Webhook URL: http://<listener_node_public_ip>/add_events
  8. Add a header in the format topic : <topic_name>
    • Where <topic_name> is the same value that you used in step 2.
  9. Click Save.
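Before enabling the Segment integration, you can verify that the Listener endpoint accepts events by POSTing a test event to it yourself. The sketch below is illustrative only: the IP address (203.0.113.10) and topic name (segment_events) are hypothetical placeholders, and the payload mimics a minimal Segment track call carrying the userId and timestamp fields that the import command in step 2 shards and time-sorts on.

```python
import json
import urllib.request

# Hypothetical values -- substitute your Listener node's public IP
# and the topic name you chose in step 2.
LISTENER_URL = "http://203.0.113.10/add_events"
TOPIC = "segment_events"

# Minimal Segment-style track event; userId and timestamp match the
# --shard_column and --time_column used when creating the dataset.
event = {
    "type": "track",
    "event": "Test Event",
    "userId": "user-123",
    "timestamp": "2017-05-01T12:34:56.789Z",
}

# Build the request with the topic header, as configured in step 8.
req = urllib.request.Request(
    LISTENER_URL,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json", "topic": TOPIC},
    method="POST",
)

print(req.get_header("Topic"))  # urllib capitalizes header names
print(req.data.decode("utf-8"))
# Once the cluster is reachable, sending it is a one-liner:
# urllib.request.urlopen(req)
```

If the Listener is reachable and the topic exists, the event should appear in your dataset after the next import cycle.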

Now follow Segment's documentation to send data to your Source.
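Whatever library you use to send events, the timestamp Segment attaches to each one must parse with the time format passed via --time_format in step 2. A quick sanity check (the timestamp value here is made up):

```python
from datetime import datetime

# Format string passed via --time_format in step 2.
TIME_FORMAT = "%Y-%m-%dT%H:%M:%S.%fZ"

# Example ISO-8601 timestamp in the shape Segment emits.
ts = "2017-05-01T12:34:56.789Z"

# If this raises ValueError, the ingest pipeline will not be able to
# parse the event's time column either.
parsed = datetime.strptime(ts, TIME_FORMAT)
print(parsed.year, parsed.microsecond)
```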

Backing up your data

Because this is not a high-availability integration, don't rely on it as your primary event data store. We recommend backing up your data to an Amazon S3 bucket or a Microsoft Azure archive.
