Passive House Project: Introducing Muon
As mentioned in the introductory post, I’ll be using a toolkit called Muon to connect my services and handle all of the communication between them. I chose Muon because its use of event-sourcing and CSP fits in with how my brain tends to work, not to mention that the whole concept of events is a JavaScript staple.
I’ll be writing the ReactJS/Redux UX in ES6, which means I’ll need a little bit of transpiling at that point: all the usual stuff, like webpack, babel and so on. The other services are in whatever works at the time. This is a proof of concept more than anything, so I will be writing naively. There will be no tests. If this scares and/or enrages you, look away now.
The Basics
Because this is a proof of concept I won’t be working with live data of any sort, but instead simulating it. I will have a bunch of sensors recording temperature, humidity, sound and whatever else I can think of. In the real world they’d be passing their data through a gateway, which would emit events that would be recorded in a data store and consumed by the control panel or other services listening to their streams.
For the moment I’ll need two services:
- Sensor Activity
- Sensor Registration
Sensor Activity
The Sensor Activity service will emit a bunch of sensor data for each sensor. If I were being lazy this is all I would need, as I could interpret the incoming data in the control panel and populate my display of sensors as data for them arrives. This isn’t necessarily an ideal state, however, as it can lead to unexpected UX changes.
Therefore, I introduce a Sensor Registration service. If this were a real-world application, Sensor Registration might analyse the output of Sensor Activity and then generate a list of sensor activations and de-activations based on that, which the control panel would then use instead of doing that work by itself.
For this example I will instead use the Sensor Registration service to populate from a static list of sensor IDs. The Sensor Activity service will consume that list and use it to emit simulated sensor data.
Already we’re taking advantage of one of the inherently great things about event-sourcing, by creating a relationship between services without having any requirement for the services to know about each other. All they care about are events. When we get more advanced, we can look into service discovery and introspection to modify service behaviour, but for now this will do just fine.
Sensor Service
So, first things first, the Sensor service.
Event-sourcing and data streams can be a little odd to work with. Events are emitted onto a stream that is persisted into an event store (Photon). Other services can then subscribe to this stream.
In the case of our Sensor service, when the service starts up, we’re going to emit a bunch of sensor-connected events to populate our sensors.
Right away there is a problem here: what if the service drops and restarts? Isn’t it going to emit the same events onto the stream?
It is, and that can be an issue if you don’t plan out how to interact with your streams.
So, before we emit anything, the first thing we’re going to do is subscribe to the stream we’re working with and see what’s already there. If there are any existing sensors, we’ll make a record of them for later use.
Then, with that out of the way, we’re going to emit our sensors - assuming we need to - and then… nothing. But that’s ok, it’s all just proof of concept.
Data Structure
Before we can build our services, we need to understand what data will be traversing them. For the moment we need two things.
A ‘Sensor’:
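A minimal sketch of what a sensor might look like; the field names and the little factory function are my assumptions, not necessarily the shape the project ends up with:

```javascript
// A hypothetical sensor record: an id plus the kind of reading it produces.
// Field names are assumptions for illustration.
function makeSensor(type, n) {
  return { id: `${type}-${n}`, type };
}
```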
… and ‘Sensor Activity’:
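Again a sketch with assumed field names, a reading tied back to a sensor by its id:

```javascript
// A hypothetical activity reading for a sensor; field names are assumptions.
// value is a random integer between 1 and 100.
function makeActivity(sensorId) {
  return {
    sensorId,
    value: Math.floor(Math.random() * 100) + 1,
    timestamp: Date.now()
  };
}
```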
I’m using a random value between 1 and 100 for the value, though I could vary that range based on the sensor type if I wanted. I might do that in the future.
The Code
Our service begins thus:
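A sketch of the bootstrap, assuming names of my own choosing: the broker-url resolution is the part I can show concretely, while the muon-core construction itself is paraphrased in a comment rather than asserted, since I’m not reproducing its exact API here:

```javascript
// Resolve the AMQP broker url. The default assumes a docker environment
// with a container called rabbitmq; a MUON_URL environment variable
// overrides it, which is handy for local runs. The credentials shown and
// the bootstrap call in the comment below are assumptions.
function brokerUrl(env) {
  return env.MUON_URL || 'amqp://muon:microservices@rabbitmq';
}

// The service would then stand Muon up with something along these lines:
// const muon = require('muon-core').create('sensor-service', brokerUrl(process.env));
```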
The keen-eyed amongst you will have noticed amqp and rabbitmq lurking in there. Muon communicates over amqp and recommends rabbitmq as its amqp broker. Because of this, our service’s default assumption is that it’s operating in a docker environment with a container called rabbitmq to provide this service. I’m also giving the opportunity to override that behaviour with a MUON_URL environment variable, which makes test-runs a lot easier.
While I’m operating outside of a docker container, I have the MUON_URL set to amqp://muon:microservices@localhost:5672, with rabbitmq listening on that port on localhost.
The first thing we need to do is subscribe to the stream we’ll be using. As outlined before, this is so we can check for any existing sensors on the stream and avoid duplicating information. Subscribing to your stream isn’t always necessary, but it’s useful more often than not.
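As a stand-in for the sensor set, with id names that are my assumption:

```javascript
// Nine simulated sensors, three of each type. The id format is an assumption.
const sensorIds = ['temp', 'motion', 'humidity']
  .flatMap(type => [1, 2, 3].map(n => `${type}-${n}`));
```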
Three temp sensors, three motion sensors, and three humidity sensors. It’ll do for now.
The nomenclature is easy enough to understand. You subscribe to an Event Store or other stream uri of the form stream://&lt;servicename&gt;/&lt;stream&gt;, though only Photon exists as an event store service at the moment. Along with this you pass parameters such as the stream-name, an event callback, an error callback and a complete callback.
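I won’t reproduce muon-core’s exact signature from memory, so as an illustration of that contract, here is a self-contained stand-in: a stream uri plus event, error and complete callbacks, fed from an in-memory array instead of a live stream:

```javascript
// An in-memory stand-in for the subscribe contract described above.
// This is NOT the muon-core API, just the shape of the interaction.
function subscribe(uri, handlers, events) {
  if (!uri.startsWith('stream://')) {
    return handlers.onError(new Error(`bad stream uri: ${uri}`));
  }
  events.forEach(e => handlers.onEvent(e)); // one callback per event
  handlers.onComplete();                    // then signal end-of-stream
}

// Usage: replay two fake sensor-connected events through the callbacks.
const seen = [];
subscribe('stream://photon/sensors', {
  onEvent: e => seen.push(e['event-type']),
  onError: err => { throw err; },
  onComplete: () => seen.push('done')
}, [{ 'event-type': 'sensor-connected' }, { 'event-type': 'sensor-connected' }]);
```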
Another way to subscribe to streams is to use replay, which abstracts most of the effort away by making assumptions about where your event store lives:
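I won’t guess at the real signature; the idea, sketched as a stand-in, is that the caller names only the stream, and the helper fills in the event store location (Photon being the only one available):

```javascript
// Stand-in for the idea behind replay: default the event store location so
// the caller only names the stream. Not the real muon-core signature.
function replayUri(streamName, eventStore = 'photon') {
  return `stream://${eventStore}/${streamName}`;
}
```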
In the event callback we’re looking for the sensor-connected event-type. When it arrives we process it with run:
If the stream already contains sensors that we want to populate, we remove them from our list. Here, poptimer lets us delay the population of the sensor data until we’re sure we’ve received all of the existing sensor events on the stream.
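A sketch of run along those lines; the event shape, the one-second quiet period and the sensor ids are all assumptions:

```javascript
// Sensors we still intend to emit (ids are assumptions)...
let pendingSensors = ['temp-1', 'temp-2', 'temp-3'];
// ...and sensors already found on the stream.
const inSensors = [];

let poptimer = null;

function run(event) {
  // Record the existing sensor and drop it from the pending list, so a
  // restart doesn't emit duplicate sensor-connected events.
  inSensors.push(event.payload.id);
  pendingSensors = pendingSensors.filter(id => id !== event.payload.id);
  // Reset the quiet-period timer: populate only once no existing sensor
  // events have arrived for a second.
  clearTimeout(poptimer);
  poptimer = setTimeout(populateSensors, 1000);
}

function populateSensors() {
  // Emit a sensor-connected event for whatever remains in pendingSensors.
}
```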
Now we get to the fun part: Emitting an Event.
We’ve wrapped this in a function in order to ensure that we can delay populating the sensors until we’ve checked that none exist.
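Something along these lines; the event field names follow the discussion that comes next, but the exact shape muon-core expects - and its emit call - are assumptions, so the emitter is passed in as a parameter here:

```javascript
// Build a sensor-connected event; field names are assumptions.
function sensorConnectedEvent(id) {
  return {
    'event-type': 'sensor-connected',
    'stream-name': 'sensors',
    'service-id': 'sensor-service',
    payload: { id }
  };
}

// Emit an event for each sensor we didn't already find on the stream.
// emit is a stand-in for muon's real emit call.
function populateSensors(pending, emit) {
  pending.forEach(id => emit(sensorConnectedEvent(id)));
}

// Usage with a collecting stand-in for emit:
const emitted = [];
populateSensors(['temp-1', 'motion-1'], e => emitted.push(e));
```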
Again, the parameters are fairly self-explanatory. service-id identifies the service across Muon, allowing other services to introspect it. You can also add tags to give more context for your service and make it easier for other services to search and identify it.
We iterate over the array and emit our events one by one. Back at the subscribe block, we could then react to these events in some way if we wanted. For now all our subscription does is record them into inSensors. You could provide an rpc end-point to request the entire list of recorded sensors, but that’s outside of the scope of this article.
One of the most obvious benefits of this pattern is that multiple instances of this service can automatically maintain eventual consistency with one another through the stream subscription.
The Result
Assuming you’ve got the Muon environment set up, when you run this service you will end up with a set of events persisted into a stream in the Event Store. Now you can subscribe to that stream in another service, replay the events and perform whatever actions you want with them.
Muon offers ways to query and time-window the stream, along with creating snapshots called Projections, though at the time of writing that feature isn’t available in Muon Node. This opens up all sorts of potentially interesting ways to manipulate and analyse data persisted to the Event Store, which can be useful for everything from simple visualisations to deep audits, including event correlation for trend analysis.
But all of that is for the future.
In the next post I’ll take the persisted sensors and use them to generate our simulated sensor data.