The Raspberry Pi Foundation brought out a new product the other week, the Raspberry Pi Zero W, which seemed pretty cool, so I bought three. I decided I should *actually* make something with them rather than, say, making an LED flash and then exiling them for all eternity to the electronics box (guilty glance at my original RPi and Arduino). So I set out to make a Facebook Messenger bot I could use to interact with my LIFX smart bulb.
As it turns out, both Facebook and LIFX have pretty well documented HTTP APIs, so getting started wasn't particularly difficult. Just by following the documentation on the LIFX site, I quickly had a simple Python interface up and running which could turn my bulb on and off, adjust the brightness, change the colour and query state information.
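To give a flavour of how simple the LIFX side is, here's a minimal sketch of that kind of interface, using only the standard library. It assumes a personal access token (the `LIFX_TOKEN` placeholder) and the cloud API's `PUT /v1/lights/:selector/state` endpoint; my actual script may be organised differently.

```python
import json
import urllib.request

LIFX_TOKEN = "your-lifx-token"  # placeholder: personal access token from cloud.lifx.com
BASE = "https://api.lifx.com/v1/lights"

def build_state(power=None, brightness=None, color=None):
    """Build the JSON body for the state endpoint, dropping unset fields."""
    fields = {"power": power, "brightness": brightness, "color": color}
    return {k: v for k, v in fields.items() if v is not None}

def set_state(selector="all", **state):
    """Push a state change (power/brightness/colour) to the LIFX cloud API."""
    req = urllib.request.Request(
        f"{BASE}/{selector}/state",
        data=json.dumps(build_state(**state)).encode(),
        headers={
            "Authorization": f"Bearer {LIFX_TOKEN}",
            "Content-Type": "application/json",
        },
        method="PUT",
    )
    return urllib.request.urlopen(req)

# e.g. set_state(power="on", brightness=0.3)  # requires a valid token
```

Querying state is the same idea with a `GET` on the same base URL.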
Creating a Facebook bot was also pretty simple. There are a few tutorials online, but essentially it just boils down to forwarding messages to your server (set up through the Facebook developer portal) and then handling them with your HTTP request library of choice. Again, this was just another Python script. After putting those two things together, it was straightforward to make an integrated Messenger-message processor and light controller.
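The message-forwarding side boils down to two pieces: answering Facebook's one-off `GET` verification handshake, and pulling the text out of each webhook `POST`. A framework-agnostic sketch (the payload shape follows the Messenger Platform webhook format; `VERIFY_TOKEN` is a placeholder for whatever you set in the developer portal — wire these into Flask, Bottle or plain `http.server` as you prefer):

```python
VERIFY_TOKEN = "my-verify-token"  # placeholder: must match the token in the developer portal

def handle_verification(params):
    """GET handshake: echo hub.challenge back if the verify token matches."""
    if params.get("hub.verify_token") == VERIFY_TOKEN:
        return params.get("hub.challenge")
    return None

def extract_messages(body):
    """Pull (sender_id, text) pairs out of a webhook POST body."""
    out = []
    for entry in body.get("entry", []):
        for event in entry.get("messaging", []):
            if "message" in event and "text" in event["message"]:
                out.append((event["sender"]["id"], event["message"]["text"]))
    return out
```

Each extracted `(sender_id, text)` pair is then what gets fed to the classifiers and, ultimately, the bulb.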
Demo! Slightly sped up to make it a bit less dull. From sending a message to receiving the response takes about a second, with the light change taking effect just before the response comes back to the user.
OK, so the probability bit I mentioned in the title. I'm using a fairly simple machine learning algorithm called Naive Bayes, which is based on Bayes' rule:

P(A|B) = P(B|A) · P(A) / P(B)
I'll try to keep this bit brief; just remember that A is some label and B is a word in the Facebook message.
The above formula reads: the probability of label A, given that word B appears, equals the probability of B given A, multiplied by the prior probability of A (i.e. our initial guess, before seeing B), all divided by the prior probability of B. I actually skip that final division: P(B) is the same for every label, so it has no effect on which label scores highest. What I'm really calculating is therefore a likelihood score proportional to the posterior, rather than the true probability.
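Concretely, the score for each label is P(label) · Π P(word | label) over the words in the message, computed in log space with add-one smoothing so an unseen word doesn't zero out the whole product. A minimal sketch of such a classifier (my actual implementation may differ in the details):

```python
import math
from collections import Counter

class NaiveBayes:
    """Tiny Naive Bayes text classifier.

    Scores each label as P(label) * product of P(word | label) in log
    space, skipping the division by P(words) since it is identical for
    every label and so cannot change the winner.
    """

    def __init__(self, examples):
        # examples: list of (sentence, label) pairs
        self.labels = Counter(label for _, label in examples)
        self.word_counts = {label: Counter() for label in self.labels}
        for sentence, label in examples:
            self.word_counts[label].update(sentence.lower().split())
        self.vocab = {w for counts in self.word_counts.values() for w in counts}

    def classify(self, sentence):
        total = sum(self.labels.values())

        def score(label):
            s = math.log(self.labels[label] / total)  # log prior P(label)
            # Laplace (add-one) smoothing so unseen words don't zero the product
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for word in sentence.lower().split():
                s += math.log((self.word_counts[label][word] + 1) / denom)
            return s

        return max(self.labels, key=score)
```

Training is just counting words per label, which is why the training sets can live in plain text files.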
Basically, I have a set of these classifiers trained to work out different things from the sentence. The basic walkthrough is: is this sentence a query, or is it trying to change the state? Which object does it refer to? And what operation does it describe? Each point is a pre-trained Naive Bayes classifier representing a node in a big decision tree.
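Chaining the nodes together looks roughly like the sketch below. The node layout (intent → object → operation) and the label names are illustrative; to keep the example self-contained, each node here is a trivial keyword-overlap stand-in rather than a trained Naive Bayes model.

```python
def make_keyword_classifier(keyword_sets):
    """Stand-in for a trained classifier node: picks the label whose keyword
    set overlaps the sentence most (each real node is a Naive Bayes model)."""
    def classify(sentence):
        words = set(sentence.lower().split())
        return max(keyword_sets, key=lambda label: len(words & keyword_sets[label]))
    return classify

# Hypothetical nodes of the decision tree
classify_intent = make_keyword_classifier({
    "query": {"is", "what", "how"},
    "change": {"turn", "dim", "set", "make"},
})
classify_object = make_keyword_classifier({
    "power": {"on", "off"},
    "brightness": {"dim", "dimmer", "brighter", "brightness"},
    "colour": {"red", "blue", "green", "colour"},
})
classify_operation = make_keyword_classifier({
    "decrease": {"dim", "lower", "darker"},
    "increase": {"brighten", "raise", "brighter"},
})

def interpret(sentence):
    """Walk the decision tree: first intent, then object, then operation."""
    intent = classify_intent(sentence)
    obj = classify_object(sentence)
    if intent == "query":
        return ("query", obj)
    return ("change", obj, classify_operation(sentence))
```

The output tuple is then easy to translate into the corresponding LIFX API call.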
This means that to add another operation I just have to add additional labels to the training set. Since the training sets are just text files, I could in theory add a command to the Facebook Messenger interface itself to retrain the classifiers dynamically, making it easy to add more labels to each node (for instance, additional colours).
So, for example, if I said 'dim the lights please', the sentence first goes through the classifier that distinguishes queries from state changes. The word 'dim' appears in the change-state training examples, so that label ends up with the higher likelihood; the remaining words appear in both sets or neither, so they contribute little. The next classifier works out which object the sentence refers to: again, 'dim' appears in the brightness set but not the others. Finally, a third classifier determines the intention of the statement, i.e. which operation to perform.
As it turns out, this is all pretty fast to compute; the HTTP requests are the big bottleneck by comparison. The examples I gave could be implemented far more efficiently by just searching for certain strings in the messages, but my hope is to build up the dataset to handle more complex sentences like "if the light is on, turn it off" and sentences with more ambiguity.
And all this is running on this tiny computer.