Less wizardry, more automation

“I want to book an appointment to renew my license in Arima at your nearest available date in the morning”

I wonder if I can make a chatbot that does this?

It’s a reworking of this:



The TweetWhisperer

Today is GDG DevFest 2019 in Trinidad. The organizers put out a call for sessions, and I was happy to share one of the ideas that had been rolling around in my head for a while.

I Facebook in pirate, don’t @ me.

So, here’s the TL;DR: my idea was to take my likes on @Twitter and funnel them into Google Keep. Along the way, I’ll automatically categorize the tweets and then confirm that categorization via a chatbot. Simple stuff.

So simple, I didn’t even use Visio to diagram it.

What I actually did:

Twitter Likes

I made an Azure Function that periodically polls my Twitter account for the latest tweets I liked. To do this, I had to create a developer account on Twitter to get the appropriate credentials. The function was pretty simple:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace TweetWhisperer.Function
{
    public static class TwitterFavoritesReader
    {
        static readonly HttpClient client = new HttpClient();

        [FunctionName("TwitterFavoritesReader")]
        public static async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log,
            ExecutionContext context)
        {
            var config = GetConfig(context);
            var token = config["TwitterAccessToken"] ?? await GetToken(config["TwitterKey"], config["TwitterSecret"]);
            var screenName = config["ScreenName"];

            // TODO: ONLY GET last minute of faves
            var uri = $"https://api.twitter.com/1.1/favorites/list.json?screen_name={screenName}";

            try
            {
                client.DefaultRequestHeaders.Add("Authorization", $"Bearer {token}");
                var favoritesResponse = await client.GetAsync(uri);
                var favesJson = await favoritesResponse.Content.ReadAsStringAsync();
                // TwitterFave is a small POCO (defined elsewhere) holding just the tweet fields we care about.
                var faves = JsonConvert.DeserializeObject<List<TwitterFave>>(favesJson);
                await CategorizeThem(config, faves, log);
                log.LogInformation(faves.Count() + " favorites found");
            }
            catch (Exception err)
            {
                log.LogError(err, "it happened");
            }
        }

        private static async Task CategorizeThem(IConfigurationRoot config, List<TwitterFave> faves, ILogger log)
        {
            var uri = config["CategorizeThemUrl"];
            var keepItResponse = await client.PostAsJsonAsync(uri, faves);
            var resp = await keepItResponse.Content.ReadAsStringAsync();
            log.LogTrace("Was it kept??? " + resp);
        }

        private static IConfigurationRoot GetConfig(ExecutionContext context)
        {
            var config = new ConfigurationBuilder()
                .SetBasePath(context.FunctionAppDirectory)
                .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
                .AddEnvironmentVariables()
                .Build();
            return config;
        }

        private static async Task<string> GetToken(string key, string secret)
        {
            // Exchange the consumer key and secret for an app-only bearer token.
            var uri = "https://api.twitter.com/oauth2/token";
            var bearerTokenCredentials = $"{key}:{secret}";
            var authHeader = "Basic " + Convert.ToBase64String(
                new UTF8Encoding().GetBytes(bearerTokenCredentials));
            client.DefaultRequestHeaders.Add("Authorization", authHeader);

            var nameValuePair = new KeyValuePair<string, string>("grant_type", "client_credentials");
            var encodedContent = new FormUrlEncodedContent(
                new List<KeyValuePair<string, string>> { nameValuePair });

            var result = await client.PostAsync(uri, encodedContent);
            var resultContent = await result.Content.ReadAsStringAsync();
            var tokenHolder = JsonConvert.DeserializeObject<AccessTokenHolder>(resultContent);
            return tokenHolder.access_token;
        }
    }

    public class AccessTokenHolder
    {
        public string token_type { get; set; }
        public string access_token { get; set; }
    }
}
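The token exchange in GetToken is the standard OAuth 2.0 client-credentials flow: concatenate the consumer key and secret, base64-encode the pair, and send that as a Basic authorization header. A minimal sketch of just the header construction, with made-up credentials rather than real Twitter keys:

```python
import base64

def basic_auth_header(key: str, secret: str) -> str:
    """Build the Basic auth header value for an OAuth2 client-credentials request."""
    credentials = f"{key}:{secret}"
    return "Basic " + base64.b64encode(credentials.encode("utf-8")).decode("ascii")

# Example with made-up credentials:
print(basic_auth_header("myKey", "mySecret"))
# → Basic bXlLZXk6bXlTZWNyZXQ=
```

The service answers with a bearer token, which then goes in an `Authorization: Bearer …` header on subsequent requests.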

Categorizing Likes

In the .NET Conf keynote a few weeks ago, I saw an ML.NET demo, and I got the idea to use it here, too.

ML.Net to build models (easy peasy)

All my notes

I pulled all my notes from Keep to train an ML model. It was very easy, particularly because I used gkeepapi, an unsupported library for interacting with Keep.

Doing this made me glad that I could think in terms of a bunch of cooperating functions, because the function to extract the notes from Keep was written in Python, while almost everything else is in C#.

import logging
import os

import azure.functions as func
import gkeepapi
import jsonpickle


class KeepNote(object):
    """A slim, serializable view of a Google Keep note."""

    def __init__(self, label, text, category):
        self.label = label
        self.text = text
        self.category = category


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    keep = gkeepapi.Keep()
    clientId = os.environ["gclientid"]
    secret = os.environ["gkey"]
    keep.login(clientId, secret)

    body = ""
    title = ""
    category = ""
    try:
        req_body = req.get_json()
        body = req_body.get("body", "")
        title = req_body.get("title", "")
        category = req_body.get("category", "")
    except ValueError:
        pass

    # If a category was posted, store the note in Keep under that label.
    if len(category) > 0:
        currentLabel = None
        for label in keep.labels():
            if label.name == category:
                currentLabel = label
        if currentLabel is None:
            currentLabel = keep.createLabel(category)

        note = keep.createNote(title, body)
        note.labels.add(currentLabel)
        note.color = gkeepapi.node.ColorValue.Pink
        keep.sync()
        return func.HttpResponse("kept")

    # Otherwise, export every labelled note, to be used as training data.
    keepNotes = []
    for note in keep.all():
        for label in note.labels.all():
            keepNotes.append(KeepNote(note.title, note.text, label.name))

    return func.HttpResponse(jsonpickle.encode(keepNotes))
KeepIt: A function to get my notes from Google Keep
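ML.NET handled the actual model, so the sketch below is not what I shipped; it is just a language-neutral illustration of the bag-of-words idea behind categorizing short texts from labelled examples. The notes and categories here are invented:

```python
from collections import Counter, defaultdict

def train(samples):
    """Count how often each word appears under each category."""
    counts = defaultdict(Counter)
    for text, category in samples:
        counts[category].update(text.lower().split())
    return counts

def predict(counts, text):
    """Pick the category whose word counts best overlap the text."""
    words = text.lower().split()
    def score(category):
        return sum(counts[category][w] for w in words)
    return max(counts, key=score)

# Made-up training notes:
model = train([
    ("azure functions timer trigger", "cloud"),
    ("deploy app service plan", "cloud"),
    ("slow cooker stewed chicken", "recipes"),
])
print(predict(model, "azure app service"))   # → cloud
```

A real pipeline would add featurization (n-grams, TF-IDF) and a proper classifier, which is exactly the boilerplate ML.NET takes care of.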

The funny thing is, I didn’t really need the model. Most of the things I stored in Keep fell into one or two categories – not very good for modelling. I guess I hadn’t really realized that. To be honest, I was probably only storing things in Keep that I placed a very high priority on, which turned out to be mostly cloud things. Go figure.
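That kind of imbalance is easy to spot before training, just by counting the category column of the exported notes. A quick sketch, with a made-up distribution shaped like mine:

```python
from collections import Counter

# Hypothetical category column from the exported Keep notes:
categories = ["cloud", "cloud", "cloud", "cloud", "recipes", "cloud"]

distribution = Counter(categories)
print(distribution.most_common())   # → [('cloud', 5), ('recipes', 1)]
```

When one label dominates like this, a classifier learns very little beyond “guess the majority class”.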

How the bot will help change things

So, I’m grabbing my tweets, categorizing them based on history and preference, and I’m ready to store them. Except, as I’ve shown above, my categorization is whack. Thus, I also made a chatbot that takes my liked tweets and asks me to confirm or adjust the category each one was assigned.

TweetWhisperer: Helping me categorize my tweets better.
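The bot itself is built on the Bot Framework, but the heart of the confirmation turn is tiny: show the predicted category, then either keep it on an affirmative reply or take the reply as the correction. A hedged sketch – the function name and reply handling here are mine, not a Bot Framework API:

```python
def resolve_category(predicted: str, user_reply: str) -> str:
    """Keep the predicted category on an affirmative reply; otherwise
    treat the reply itself as the corrected category."""
    reply = user_reply.strip().lower()
    if reply in {"yes", "y", "ok", "correct"}:
        return predicted
    return reply

print(resolve_category("cloud", "yes"))       # → cloud
print(resolve_category("cloud", "DevOps"))    # → devops
```

Each corrected pair of (tweet text, final category) can then be fed back as a fresh training example, which is how the categorizer should improve over time.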

So, with these three components (a likes-harvester, a categorizer and a chatbot), maybe I’ll get better at returning to the topics I have an interest in.


Lightning in a Hansard bottle

Some of the technology team that brings the Hansard online in Trinidad & Tobago

When we built the Hansard Speaks chatbot in 2017, I was super excited and told all my friends about it. One of them now works in IT at the Parliament and he invited me to talk with the team about it.

At the brief talk, I spoke about the motivations for building the chatbot, how we thought it was a great way to win arguments about who said what in Parliament, and that we liked how easy it was to bring an automated conversational experience into that world.

I think the team at the Parliament does a great job. I’ve always liked that they were among the early movers in bringing Internet technology into governance. They’ve been online for a long time, they make copies of the Hansard available on their site, and they stream proceedings. They’re also on Twitter and are pretty active.

We spoke about how much Hansard Speaks leverages cloud technology, and about the fact that, though the government is progressing, the policy on public cloud means they have to find ways to accomplish similar functionality with on-prem tech. HS uses the Microsoft Bot Framework, Azure Media Services and Azure App Service. They could do that off the cloud, but it would be a bit harder.

I’d love it if they shared more about what they do, in terms of all the teams that go into making the Hansard come alive. There’s a real production team ensuring those documents get generated and that the Parliament team can handle requests from MPs about who said what, when.

It’s been two years since we first built the chatbot, so I described to them one key change we might make if we were doing it again: we would run a tool like Video Indexer over the videos they create.

Video Indexer’s insights on a Parliament video from Trinidad and Tobago.

It would let us do more than simply get a transcript: we would be able to see who said what, how much each member contributed to a session, and the key topics that were discussed.
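Given the speaker-labelled transcript segments an indexer like that produces, the contribution numbers fall out of a simple aggregation. A sketch over made-up segments:

```python
from collections import defaultdict

# Hypothetical (speaker, start_seconds, end_seconds) transcript segments:
segments = [
    ("Speaker 1", 0, 90),
    ("Speaker 2", 90, 150),
    ("Speaker 1", 150, 180),
]

# Sum up how long each speaker held the floor.
talk_time = defaultdict(int)
for speaker, start, end in segments:
    talk_time[speaker] += end - start

total = sum(talk_time.values())
for speaker, seconds in sorted(talk_time.items()):
    print(f"{speaker}: {seconds}s ({100 * seconds / total:.0f}%)")
# → Speaker 1: 120s (67%)
# → Speaker 2: 60s (33%)
```

The same pass could group segments by topic keywords instead of speakers, which is where the “key topics discussed” insight would come from.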

So, it was great to speak with some of the guys behind the Hansard and share with them ideas on services they can leverage to make their offering even more compelling, insightful and relevant.