As a solar installer, I think it’s great that solar inverters and related accessories have, in the last few years, started to get connected to the Internet. Being able to see a graph of production versus consumption, for example, is an enormous help in reducing a power bill and environmental footprint.
The average user is fascinated with their new solar graph the first day they see it. They excitedly check it every hour. Then every day for the first week. Then once a week for the next month. Then it’s forgotten. But I’m not the average user. I’ve had solar at home for 3 years now and (perhaps sadly) I still find myself checking what it’s doing several times each day. If you’re like me, perhaps you can relate to my frustration with the limited nature of these bundled monitoring products.
The Enphase Envoy S is capable of measuring (or calculating from measured values) real power, reactive power, apparent power, voltage, current, power factor and frequency. And it does this every second, for up to 3 phases, and on up to two meters. Impressive. Yet, the proprietary Enphase Enlighten web portal only stores what is deemed to be relevant to most users—power only, and at a 15 minute resolution.
This is fine for marrying up electrical loads with solar production, but what about monitoring for voltage dips? Intermittent loads? This is where local access to the data the Envoy collects comes in.
An Old System
For the past 3 years I have used a Bash script, incorporating curl and jq, to achieve this. My script was crude, inefficient, and for some reason I couldn’t get it to work with a systemd service. Instead, I had to run it in a screen session and manually restart it if either the Envoy or server were restarted.
Python is built for this kind of job, so I decided a long time ago that one day I would rewrite it in Python.
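For reference, running a long-lived script like this under systemd only needs a small unit file. A minimal sketch, where the script path and unit name are placeholders of my own choosing, not taken from the original setup:

```
[Unit]
Description=Envoy to MQTT bridge
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/bin/python3 /opt/envoy-mqtt/envoy-mqtt.py
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```

The Restart=always directive also takes care of restarting the bridge if the Envoy or server reboots.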
What The Data Is Used For
The individual measurements collected by the Python script are forwarded on to MQTT topics via a self-hosted Mosquitto service.
Some of these topics are subscribed to by a Node-RED service, which may process them, re-publish calculated values based on them, and/or save them to an Influx database.
The Influx database allows for graphing the data to any desired resolution.
I use MQTT Dash to monitor this data live, and from anywhere.
Perhaps more importantly, I also have the OpenEVSE electric vehicle charging station, which out of the box is capable of adjusting the charging power according to the excess solar energy available that would otherwise be exported to the grid. This uses an MQTT feed, hence the need to feed the data from the Envoy into an MQTT topic.
I have found that the 1 second resolution is enough to work out exactly what appliances have been running, and when. It even shows a spike in current when the compressor in the fridge starts. However, while the data is sent every second, readings appear to be averages over the previous 5 seconds.
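To illustrate, spotting when a large appliance switches on can be as simple as watching for step changes between consecutive one-second readings. A minimal sketch; the threshold and sample figures are invented for illustration, not real Envoy data:

```python
def find_steps(readings, threshold=100.0):
    """Return (index, delta) pairs where consecutive real-power
    readings (watts, one per second) jump by more than threshold."""
    steps = []
    for i in range(1, len(readings)):
        delta = readings[i] - readings[i - 1]
        if abs(delta) > threshold:
            steps.append((i, delta))
    return steps

# Example: something like a fridge compressor starting at t=3
power = [210.0, 212.0, 208.0, 361.0, 355.0, 358.0]
print(find_steps(power))  # [(3, 153.0)]
```

In practice this logic would sit in a Node-RED flow or a subscriber on the relevant power topic, rather than in the bridge script itself.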
Requirements
- Enphase Envoy S Metered. The Envoy I have has both production and grid metering enabled, and measures 3 phase power on both the production and grid meters. If your Envoy is configured for single phase or only one of production or grid metering, your output may vary from what I get and the script may need to be adjusted accordingly.
- Installer password from Envoy S. This can be found using the Envoy serial number, and either:
- the Android app developed by thecomputerperson at this link
- the Python script by Marcus Fritze at this link.
- A server on the same trusted network as the Envoy. I use an LXC container, however a Raspberry Pi, Linux server or virtual machine would do the job just as well. The server requires:
- ssl (for the certificate authority certificates)
- python-pip (to install paho-mqtt)
- paho-mqtt (installed with pip install paho-mqtt)
- Mosquitto service. This can be hosted anywhere on the Internet, however I use another LXC container for this.
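For completeness, the broker side only needs a TLS listener and password authentication to match the script’s settings (port 8883, username and password). A sketch of the relevant mosquitto.conf lines, where the certificate and password file paths are placeholders for your own:

```
listener 8883
cafile /etc/mosquitto/certs/ca.crt
certfile /etc/mosquitto/certs/server.crt
keyfile /etc/mosquitto/certs/server.key
allow_anonymous false
password_file /etc/mosquitto/passwd
```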
How it works
Data is extracted from the Envoy at http://envoy/stream/meter as a continuous stream of data that is—almost—in JSON format.
The stream is broken up into lines. A new line arrives in the stream every second. Each line is trimmed of some superfluous characters to make it legitimate JSON data, then converted into a Python dictionary.
Finally, using paho-mqtt, each value in the set is published on an MQTT topic.
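As an example of the trimming step: each line appears to arrive as a server-sent-events style record with a 6-character "data: " prefix, which must be removed before json.loads() will accept it. The sample payload below is made up and heavily abbreviated:

```python
import json

# A simplified, made-up line as it might arrive from the Envoy stream
line = b'data: {"production": {"ph-a": {"p": 1234.5}}}'

# Strip the 6-character "data: " prefix, then parse the rest as JSON
payload = json.loads(line[6:])
print(payload["production"]["ph-a"]["p"])  # 1234.5
```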
Implementation
In the script below, envoy.local is the DNS name or IP address of your Envoy.
# Capture data from the Enphase Envoy S, and publish to MQTT.
# Constants
mqttBroker = 'mqtt.example.com'
mqttPort = 8883
mqttClientId = 'python-mqtt'
mqttUsername = 'username'
mqttPassword = 'mqttP@$$w0rd'
envoyPassword = 'envoyP@$$w0rd'
caCertificates = "/etc/ssl/certs/ca-certificates.crt"
envoyUrl = "http://envoy.local/stream/meter"
# Iterate
def streaming():
    import requests
    from requests.auth import HTTPDigestAuth
    session = requests.Session()
    req = requests.Request("GET", envoyUrl, auth=HTTPDigestAuth('installer', envoyPassword))
    reqPrep = req.prepare()
    resp = session.send(reqPrep, stream=True)
    mqttClient = connect_mqtt()
    for line in resp.iter_lines():
        if line:
            lineToMqtt(line, mqttClient)
    mqttClient.disconnect()
# Tidy up a line and publish to MQTT
def lineToMqtt(line, mqttClient):
    # Remove the first 6 characters from the line, and convert to JSON
    import json
    j = json.loads(line[6:])
    # Publish all values in the set to MQTT
    c = mqttClient
    publish(c, "stuart/0/power-real/watt/msb/production/a", j["production"]["ph-a"]["p"])
    publish(c, "stuart/0/power-real/watt/msb/production/b", j["production"]["ph-b"]["p"])
    publish(c, "stuart/0/power-real/watt/msb/production/c", j["production"]["ph-c"]["p"])
    publish(c, "stuart/0/power-real/watt/msb/import/a", j["net-consumption"]["ph-a"]["p"])
    publish(c, "stuart/0/power-real/watt/msb/import/b", j["net-consumption"]["ph-b"]["p"])
    publish(c, "stuart/0/power-real/watt/msb/import/c", j["net-consumption"]["ph-c"]["p"])
    publish(c, "stuart/0/power-apparent/volt-amp/msb/production/a", j["production"]["ph-a"]["s"])
    publish(c, "stuart/0/power-apparent/volt-amp/msb/production/b", j["production"]["ph-b"]["s"])
    publish(c, "stuart/0/power-apparent/volt-amp/msb/production/c", j["production"]["ph-c"]["s"])
    publish(c, "stuart/0/power-apparent/volt-amp/msb/import/a", j["net-consumption"]["ph-a"]["s"])
    publish(c, "stuart/0/power-apparent/volt-amp/msb/import/b", j["net-consumption"]["ph-b"]["s"])
    publish(c, "stuart/0/power-apparent/volt-amp/msb/import/c", j["net-consumption"]["ph-c"]["s"])
    publish(c, "stuart/0/power-reactive/volt-amp-reactive/msb/production/a", j["production"]["ph-a"]["q"])
    publish(c, "stuart/0/power-reactive/volt-amp-reactive/msb/production/b", j["production"]["ph-b"]["q"])
    publish(c, "stuart/0/power-reactive/volt-amp-reactive/msb/production/c", j["production"]["ph-c"]["q"])
    publish(c, "stuart/0/power-reactive/volt-amp-reactive/msb/import/a", j["net-consumption"]["ph-a"]["q"])
    publish(c, "stuart/0/power-reactive/volt-amp-reactive/msb/import/b", j["net-consumption"]["ph-b"]["q"])
    publish(c, "stuart/0/power-reactive/volt-amp-reactive/msb/import/c", j["net-consumption"]["ph-c"]["q"])
    publish(c, "stuart/0/potential/volt/msb/a", j["production"]["ph-a"]["v"])
    publish(c, "stuart/0/potential/volt/msb/b", j["production"]["ph-b"]["v"])
    publish(c, "stuart/0/potential/volt/msb/c", j["production"]["ph-c"]["v"])
    publish(c, "stuart/0/frequency/hertz/msb", j["production"]["ph-a"]["f"])
# Create an MQTT client object
def connect_mqtt():
    # Callback function
    def on_connect(client, userdata, flags, rc):
        if rc == 0:
            print("Connected to MQTT Broker!")
        else:
            print("Failed to connect, return code %d" % rc)
    from paho.mqtt import client as mqtt_client
    # Set client attributes
    client = mqtt_client.Client(mqttClientId)
    client.tls_set(caCertificates)
    client.username_pw_set(mqttUsername, mqttPassword)
    client.on_connect = on_connect
    # Connect, start the background network loop, and return the client
    client.connect(mqttBroker, mqttPort)
    client.loop_start()
    return client
# Publish an individual topic to MQTT
def publish(client, topic, msg):
    result = client.publish(topic, msg)
# Top Function
streaming()
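As written, the script exits if the Envoy reboots or the stream otherwise drops. A simple retry wrapper, my own addition rather than part of the script above, keeps it running without a screen session:

```python
import time

def run_with_retry(stream_func, delay=10, max_runs=None):
    """Call stream_func repeatedly, sleeping delay seconds between
    runs; max_runs=None means retry forever (use that in production)."""
    runs = 0
    while max_runs is None or runs < max_runs:
        try:
            stream_func()
        except Exception as exc:
            print("Stream ended: %s; retrying in %ss" % (exc, delay))
        runs += 1
        time.sleep(delay)

# In place of the bare streaming() call:
# run_with_retry(streaming)
```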
Conclusion
There’s something so satisfying about having access to this information without having to purchase any extra equipment or plans, or deal with the restrictions of closed ecosystems. Hopefully the above story and script can help you likewise turn your Envoy into a much more powerful energy monitoring device.