TimescaleDB Integration

This tutorial shows how to connect ALPON X4 to TimescaleDB for storing real-time sensor data.


TimescaleDB is a powerful time-series database built on PostgreSQL, designed for efficiently storing and querying sensor and event data. With ALPON X4, it can be deployed in a fully containerized environment via ALPON Cloud, enabling you to continuously ingest and manage time-series data from your device fleet.

This guide walks through creating a simple TimescaleDB data ingestion container and deploying it on ALPON X4 using ALPON Cloud.

For more details on TimescaleDB, visit the Timescale documentation.

Prerequisites

Before getting started, ensure the following:

  • ALPON X4 Device: Powered on, connected to ALPON Cloud and operational.
  • ALPON Cloud Account: Access to ALPON Cloud for container management.
  • Docker Installed: On your development machine for building the container image.
  • TimescaleDB Service: A running TimescaleDB instance (hosted on Timescale or another PostgreSQL-compatible service).

Step 1: Prepare the Dockerfile

On your development machine, create a file named Dockerfile that defines a Python-based ingestion environment:

FROM python:3.11-slim

RUN apt-get update && apt-get install -y \
    gcc \
    libpq-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

RUN pip install --no-cache-dir psycopg2-binary==2.9.9

COPY main.py .

CMD ["python", "main.py"]

This container will run a Python script to send data into your TimescaleDB instance.

Step 2: Write the Data Ingestion Script

Create a file named main.py in the same directory as your Dockerfile.
This script will:

  • Connect to your TimescaleDB instance.
  • Create a sensor_data table if it doesn’t exist.
  • Insert simulated temperature and humidity readings every 10 seconds.

import psycopg2
import os
import random
import time
from datetime import datetime

class PostgresClient:
    def __init__(self):
        self.conn = psycopg2.connect(
            f"postgres://{os.getenv('DB_USER')}:{os.getenv('DB_PASSWORD')}@"
            f"{os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}/{os.getenv('DB_NAME')}?sslmode=require"
        )
        print("Database connected!")
    
    def setup_table(self):
        with self.conn.cursor() as cur:
            cur.execute("""
                CREATE TABLE IF NOT EXISTS sensor_data (
                    id SERIAL PRIMARY KEY,
                    sensor_name VARCHAR(50),
                    temperature DECIMAL(5,2),
                    humidity DECIMAL(5,2),
                    timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP
                )
            """)
        self.conn.commit()
        print("Table created!")
    
    def insert_data(self):
        sensors = ['TEMP_001', 'TEMP_002', 'HUMID_001']
        
        with self.conn.cursor() as cur:
            cur.execute("""
                INSERT INTO sensor_data (sensor_name, temperature, humidity)
                VALUES (%s, %s, %s)
            """, (
                random.choice(sensors),
                round(random.uniform(15.0, 35.0), 2),
                round(random.uniform(30.0, 80.0), 2)
            ))
        self.conn.commit()
        print(f"{datetime.now().strftime('%H:%M:%S')} - Data inserted")
    
    def show_data(self):
        with self.conn.cursor() as cur:
            cur.execute("SELECT * FROM sensor_data ORDER BY timestamp DESC LIMIT 5")
            for row in cur.fetchall():
                print(f"ID: {row[0]}, Sensor: {row[1]}, Temp: {row[2]}°C, Humidity: {row[3]}%")
    
    def run_continuous(self):
        print("Starting continuous data generation... (Ctrl+C to stop)")
        try:
            while True:
                self.insert_data()
                time.sleep(10)
        except KeyboardInterrupt:
            print("\nStopped by user")
        finally:
            self.conn.close()

def main():
    db = PostgresClient()
    db.setup_table()
    db.show_data()
    db.run_continuous()

if __name__ == "__main__":
    main()
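As written, the script stores readings in a plain PostgreSQL table. To benefit from TimescaleDB's time-series features (chunked storage, compression, faster time-range queries), you can optionally convert sensor_data into a hypertable. A minimal sketch, assuming the same psycopg2 connection used by PostgresClient; note that TimescaleDB requires the partitioning column to appear in every unique index, so the id-only primary key is dropped first:

```python
def enable_hypertable(conn):
    """Optionally convert sensor_data into a TimescaleDB hypertable."""
    with conn.cursor() as cur:
        # TimescaleDB requires the time column in every unique index,
        # so drop the id-only primary key if it exists.
        cur.execute(
            "ALTER TABLE sensor_data DROP CONSTRAINT IF EXISTS sensor_data_pkey"
        )
        # if_not_exists avoids an error on re-runs; migrate_data moves any
        # rows already inserted into the hypertable's chunk layout.
        cur.execute(
            "SELECT create_hypertable('sensor_data', 'timestamp', "
            "if_not_exists => TRUE, migrate_data => TRUE)"
        )
    conn.commit()
```

If you adopt this, calling it once after setup_table() is enough; the insert and select statements elsewhere in the script work unchanged against a hypertable.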

Step 3: Build the Docker Image

From the directory containing your Dockerfile and main.py, run:

docker buildx build --platform linux/arm64 --load -t timescale-ingestor .

This creates a container image compatible with ALPON X4’s ARM64 architecture.


Step 4: Push the Image to ALPON Cloud

  • Log in to the Sixfab Connect platform and navigate to the Sixfab Registry page.
  • Click + Add Container and follow the prompts to push the container to the Sixfab Registry.

    Note: Visit the Manage & Deploy Applications page for all the necessary details on pushing your container image to the Sixfab Registry.



Step 5: Deployment Configuration

  1. Go to the Application section of your asset on Sixfab Connect.

  2. Click the + Deploy button to configure and deploy the container.

  3. In the Deploy Container window, use the following settings:

    • Container Name: timescale

    • Image: Select the timescale-ingestor image and tag pushed to the Sixfab Registry.

    • Environment: Click "+ Add More" in the environment section and add the following values:

      Key          Value
      DB_USER      your_timescale_username
      DB_PASSWORD  your_timescale_password
      DB_HOST      your_timescale_host
      DB_PORT      5432
      DB_NAME      your_database_name

    • Click the "+ Deploy" button.
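At startup, main.py combines these environment values into a single PostgreSQL connection string. The assembly can be sketched as pure string formatting, using the same placeholder values as the table above:

```python
def build_dsn(env: dict) -> str:
    """Assemble the PostgreSQL DSN from the DB_* variables, as main.py does."""
    return (
        f"postgres://{env['DB_USER']}:{env['DB_PASSWORD']}@"
        f"{env['DB_HOST']}:{env['DB_PORT']}/{env['DB_NAME']}?sslmode=require"
    )

# Placeholder values matching the deployment configuration above.
example = {
    "DB_USER": "your_timescale_username",
    "DB_PASSWORD": "your_timescale_password",
    "DB_HOST": "your_timescale_host",
    "DB_PORT": "5432",
    "DB_NAME": "your_database_name",
}
print(build_dsn(example))
# → postgres://your_timescale_username:your_timescale_password@your_timescale_host:5432/your_database_name?sslmode=require
```

If the container fails to connect after deployment, printing the assembled DSN (with the password masked) is a quick way to spot a mistyped host or port.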

Final Step: Verify Data in TimescaleDB

Once the container is running, data will be continuously sent to your TimescaleDB instance.
To check:

SELECT * FROM sensor_data ORDER BY timestamp DESC LIMIT 5;

You should see the latest simulated readings from your ALPON X4.

With TimescaleDB now integrated, your ALPON X4 can store and query massive volumes of time-series data for analytics, monitoring, and visualization.