Efergy Hub Local Data Logger

This project provides a local emulation server for legacy Efergy Engage hubs (v1, v2, and v3). It allows you to intercept and log your home's energy data to a local SQLite database, completely bypassing the decommissioned Efergy cloud servers.

This is designed for anyone who wants to keep their devices from becoming e-waste.

How it Works

The Efergy Hub is hard-coded to send its data to sensornet.info over HTTPS. Unfortunately, these old devices use the deprecated SSLv3 protocol, which modern web servers and Python libraries will not accept.

This project solves the problem with a two-service system managed by Docker Compose:

  1. legacy-nginx Service: A custom-built Nginx server acts as a reverse proxy. It uses an old version of OpenSSL (1.0.2u) specifically compiled to accept the hub's SSLv3 connection. It then terminates the SSL and forwards the decrypted HTTP traffic to the Python application.
  2. hub-server Service: A lightweight Python 3 server (hub_server.py) that listens for the forwarded requests. It receives the plain HTTP request from the Nginx proxy, emulates the Efergy API, and logs the data to a SQLite database (readings.db) using the db.py script, as sketched below.
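
Purely as an illustration of this flow (not the project's actual hub_server.py or db.py), a stripped-down version of the idea might look like the following; the raw_posts table and port 8080 are placeholders:

# Illustration only: the real hub_server.py / db.py differ.
# A minimal HTTP listener that stores whatever the proxy forwards into SQLite.
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

DB_PATH = "readings.db"  # same file name the project uses

def init_db():
    with sqlite3.connect(DB_PATH) as db:
        db.execute(
            "CREATE TABLE IF NOT EXISTS raw_posts ("
            "ts TEXT DEFAULT CURRENT_TIMESTAMP, path TEXT, body TEXT)"
        )

class HubHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8", errors="replace")
        with sqlite3.connect(DB_PATH) as db:
            db.execute("INSERT INTO raw_posts (path, body) VALUES (?, ?)",
                       (self.path, body))
        # The real server answers with whatever the Efergy API expects;
        # a bare 200 is enough to show the idea.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    init_db()
    HTTPServer(("0.0.0.0", 8080), HubHandler).serve_forever()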

Setup with Docker Compose

1. Generate SSL Certificates

The legacy-nginx service requires SSL certificates to run. A helper script is provided to generate self-signed certificates.

  1. Make the script executable:
chmod +x ./generate-certs.sh
  2. Run the script from the project's root directory:
./generate-certs.sh

This will create server.key and server.crt inside the legacy-nginx directory, where the docker-compose.yml file expects to find them.
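
To double-check the result, OpenSSL can print the certificate's subject and validity period (the path assumes you are still in the project root):

# Inspect the generated self-signed certificate
openssl x509 -in legacy-nginx/server.crt -noout -subject -dates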

2. Run the Services

With the certificates in place, you can start both services using Docker Compose.

# Build and start the containers in detached mode
docker-compose up --build -d

This will:

  • Build the hub-server image from its Dockerfile.
  • Build the legacy-nginx image from its Dockerfile.
  • Start both containers. The legacy-nginx service is exposed on port 443.
  • Automatically create the SQLite database on first run and mount the data directory for persistence.
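
To confirm that both services came up and to watch requests arrive, you can check their status and follow the logs (service names as defined in docker-compose.yml):

# Check that both containers are running
docker-compose ps

# Follow the logs of both services
docker-compose logs -f legacy-nginx hub-server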

3. Redirect the Efergy Hub

Finally, you must trick your Efergy Hub into sending its data to your new server. Because the hub is hard-coded to contact sensornet.info, the easiest way to do this is DNS spoofing on your router (e.g., using dnsmasq, Pi-hole, or similar):

Create DNS entries that map your hub's specific domain(s) to the local IP address of the machine running your Docker container (e.g., 10.0.0.213).

Domain patterns by hub version:

  • V1 hubs: [MAC].sensornet.info and [MAC].keys.sensornet.info (e.g., aa.bb.cc.dddddd.sensornet.info)
  • V2/V3 hubs: [MAC].[h2/h3].sensornet.info, where h2 is for v2 hubs and h3 for v3 hubs (e.g., 41.0a.04.001ec0.h2.sensornet.info)

Once the hub is rebooted, it will contact sensornet.info, be directed to your legacy-nginx proxy, and your hub-server should start logging data to readings.db.

Pi-hole example

Navigate to Settings -> Local DNS Records and add entries for your hub:

For v2/v3 hubs:

Domain                                 IP
[device mac].[h2/h3].sensornet.info    [server ip]
41.0a.04.001ec0.h2.sensornet.info      10.0.0.213

For v1 hubs:

Domain                								 IP
[device mac].sensornet.info            [server ip]
[device mac].keys.sensornet.info       [server ip]
aa.bb.cc.dddddd.sensornet.info         10.0.0.213
aa.bb.cc.dddddd.keys.sensornet.info    10.0.0.213
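
dnsmasq example

If your router runs plain dnsmasq, the same redirect can be expressed with address= entries in dnsmasq.conf (standard dnsmasq syntax, shown here with the example IP and MACs from above; substitute your own). A single wildcard line covers every hub version, because address= also matches subdomains:

# Wildcard: redirect sensornet.info and all of its subdomains
address=/sensornet.info/10.0.0.213

Or for specific domains:

# V2/V3 hubs
address=/41.0a.04.001ec0.h2.sensornet.info/10.0.0.213
# V1 hubs
address=/aa.bb.cc.dddddd.sensornet.info/10.0.0.213
address=/aa.bb.cc.dddddd.keys.sensornet.info/10.0.0.213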

pfSense example

Navigate to Services -> DNS Resolver -> Custom Options and add entries for your hub:

For all hubs (wildcard approach):

server:
    local-zone: "sensornet.info" redirect
    local-data: "sensornet.info 86400 IN A 10.0.0.213"

Or for specific domains:

server:
    # V2/V3 hubs
    local-data: "41.0a.04.001ec0.h2.sensornet.info 86400 IN A 10.0.0.213"
    # V1 hubs
    local-data: "aa.bb.cc.dddddd.sensornet.info 86400 IN A 10.0.0.213"
    local-data: "aa.bb.cc.dddddd.keys.sensornet.info 86400 IN A 10.0.0.213"

Data Formats and Units

MQTT payloads:

  • Power: Raw sensor values (h1/h2: milliamps; h3: deciwatts).
    • See below for conversion to Watts.
  • Energy: Kilowatt-hours (kWh)

For Home Assistant users: When MQTT discovery is enabled, Home Assistant automatically converts Power readings to watts using the value_template included in discovery messages.

For other MQTT consumers: You must apply the conversion yourself. Formulas are in config.py.
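
As a sketch of what that looks like, the snippet below subscribes to the hub topics and applies the conversions described above. The broker host and credentials are placeholders, the 230 V mains voltage is an assumption for h1/h2 hubs, the payloads are assumed to be plain numbers, and the formulas in config.py remain authoritative:

# Minimal sketch of a non-Home-Assistant MQTT consumer.
import paho.mqtt.client as mqtt  # paho-mqtt 2.x

MAINS_VOLTAGE = 230.0  # assumption; e.g. 120.0 in North America

def on_message(client, userdata, msg):
    raw = float(msg.payload.decode())  # payloads assumed to be plain numbers
    if msg.topic.endswith("/power"):
        # h1/h2 hubs report milliamps, h3 hubs report deciwatts (see above)
        print(f"{msg.topic}: {raw / 1000 * MAINS_VOLTAGE:.0f} W (if mA) "
              f"or {raw / 10:.1f} W (if dW)")
    else:
        print(f"{msg.topic}: {raw} kWh")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.username_pw_set("mqtt-broker-username-here", "your-password-here")
client.on_message = on_message
client.connect("homeassistant.local", 1883)
client.subscribe("home/efergy/#")
client.loop_forever()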

Home Assistant Integration

Home Assistant Operating System hosted

If your Efergy Hub server is running on HA OS, you can integrate the readings into Home Assistant via MQTT.

  1. Configure Environment Variables for MQTT

Update your environment variables in the docker-compose.yml file:

# Optional: logging level (DEBUG, INFO, WARN, ERROR)
LOG_LEVEL=INFO

# Enable MQTT (true/false)
MQTT_ENABLED=true

# MQTT broker details
MQTT_BROKER=homeassistant.local
MQTT_PORT=1883
MQTT_USER=mqtt-broker-username-here
MQTT_PASS=your-password-here

# Home Assistant MQTT Discovery
HA_DISCOVERY=true

  2. Home Assistant Auto-Discovery

With HA_DISCOVERY=true, the hub-server will automatically publish Home Assistant MQTT discovery payloads. This creates two sensors per Efergy device:

Sensor                                 Topic                                     Unit   Device Class   State Class
sensor.efergy_hub_power_<sid>          home/efergy/efergy_h<X>_<SID>/power       W      power          measurement
sensor.efergy_hub_energy_consumption   home/efergy/energy_consumption/energy     kWh    energy         total_increasing

Note: MQTT payloads for power are raw values, but Home Assistant applies the necessary conversion to Watts automatically. See the Data Formats and Units section for details.

Home Assistant will pick up these sensors automatically, making them available for dashboards, automations, and the Energy Dashboard.

  3. Add Sensors to Energy Dashboard

Once discovered, the sensor.efergy_hub_energy_consumption sensor can be added to Home Assistant’s Energy Dashboard under Grid Consumption, allowing you to track daily, weekly, and monthly usage.

Container hosted

You can integrate your local energy data into Home Assistant using the SQL Sensor integration. This allows Home Assistant to directly query the readings.db file.

The provided sensors.yaml file is a configuration snippet you can add to your Home Assistant setup.

  1. Ensure Home Assistant can access the database. Make sure your readings.db file is located somewhere Home Assistant can read it (e.g., in your /config directory).
  2. Add the SQL integration to your configuration.yaml if you haven't already.
  3. Add the sensor configuration. You can copy the contents of sensors.yaml into your Home Assistant's configuration.yaml (under a sql: key) or, if you have a split configuration, !include it.

configuration.yaml example:

# Loads default set of integrations. Do not remove.
default_config:

# Load frontend themes from the themes folder
frontend:
  themes: !include_dir_merge_named themes

automation: !include automations.yaml
script: !include scripts.yaml
scene: !include scenes.yaml
sql: !include sensors.yaml

  4. Update the db_url in sensors.yaml.
  5. Restart Home Assistant.

You will now have two sensors:

  • sensor.efergy_hub_power_<sid>: The instantaneous power reading in W.
  • sensor.efergy_hub_energy_consumption: A running total of energy consumed in kWh, which can be added directly to your Home Assistant Energy Dashboard.
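
For reference, a SQL sensor entry generally follows the shape below. This is only an illustrative sketch of the Home Assistant sql integration format, not the contents of the provided sensors.yaml; the table name (readings) and the power/id columns are hypothetical placeholders:

# Illustrative shape only - use the provided sensors.yaml.
# The readings table and the power/id columns are hypothetical placeholders.
- name: Efergy Hub Power (example)
  db_url: sqlite:////config/readings.db
  query: "SELECT power FROM readings ORDER BY id DESC LIMIT 1;"
  column: power
  unit_of_measurement: W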

Other

Efergy Data Format

Documentation about the known hub payload formats and data structures:
