Agents of TEAMS

Hey!

Last week I presented at DevFest Caribbean. It was a Google Developer Group event where GDG chapters from multiple territories came together, including the groups from Trinidad and Tobago, Guyana, Jamaica and others. There were some really good presentations, and you can check them out here.

My presentation focused on virtual agents I created in Microsoft Teams. I demonstrated a messaging extension from Microsoft that I extended to work with BambooHR’s platform.

Exercise is a big deal at Teleios. So, when Geon demonstrated his Power App, it inspired me to make a virtual agent to help with updating it. In my presentation, I showed a bot that uploads data for the Power App to access … plus, it pushes updates to Strava!

The Ministry of Planning has a website for checking air quality in T&T. I wrote an API to talk to that site and then a bot that works directly in Teams. The bigger challenge with this agent was getting something, anything really, up and running on my Google Home Mini. And I did! So, I was very glad.

Finally, I’ve started experimenting with virtual agents that can interact with in-progress meetings on Microsoft Teams. I leaned heavily on the Microsoft Graph samples library related to accessing meetings. I got a zany bot working: it can inject audio into live meetings without anyone having to share media from their devices. It’s great for sound effects, like a play-off sound for people who are taking up too much time in a meeting.

All told, presenting at DevFest was fun yet again. It was my third time presenting, and my third year in a row talking about conversational user interfaces. You can catch the whole talk here:

The TweetWhisperer

Today is GDG DevFest 2019 in Trinidad. The organizers put out a call for sessions, and I was happy to share one of the ideas that had been rolling around in my head for a while.

I Facebook in pirate, don’t @ me.

So, here’s the TL;DR: my idea was to take my likes on @Twitter and funnel them into Google Keep. Along the way, I’d automatically categorize the tweets and then confirm that categorization via a chatbot. Simple stuff.

So simple, I didn’t even use Visio to diagram it.
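In lieu of a diagram, the flow can be sketched in a few lines of Python. This is purely illustrative: the function names and the sample data below are placeholders for the components described in the rest of this post, not real APIs.

```python
# A hypothetical end-to-end sketch of the TweetWhisperer pipeline.
# Each stage corresponds to a component built later in this post.

def fetch_likes():
    # Stage 1: an Azure Function polls Twitter for recently liked tweets.
    return [{"text": "Azure Functions tips", "category": None}]

def categorize(tweet):
    # Stage 2: a trained model guesses a category for the tweet text.
    tweet["category"] = "cloud"
    return tweet

def confirm_via_bot(tweet):
    # Stage 3: a chatbot asks me to confirm or correct the guess.
    # In this sketch we just accept the guess as-is.
    return tweet

def save_to_keep(tweet):
    # Stage 4: the confirmed tweet is stored as a labelled Keep note.
    return f"saved '{tweet['text']}' under '{tweet['category']}'"

for liked in fetch_likes():
    print(save_to_keep(confirm_via_bot(categorize(liked))))
```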

What I actually did:

Twitter Likes

I made an Azure Function that periodically polls my Twitter account for the most recent tweets I liked. To do this, I had to create a developer account on Twitter to get the appropriate creds. The function was pretty simple:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace TweetWhisperer.Function
{
    public static class TwitterFavoritesReader
    {
        static readonly HttpClient client = new HttpClient();

        [FunctionName("TwitterFavoritesReader")]
        public static async Task Run([TimerTrigger("0 */1 * * * *")]TimerInfo myTimer, ILogger log,
            ExecutionContext context)
        {
            var config = GetConfig(context);
            var token = config["TwitterAccessToken"] ?? await GetToken(config["TwitterKey"], config["TwitterSecret"]);
            var screenName = config["ScreenName"];
            //TODO: ONLY GET last minute of faves
            var uri = $"https://api.twitter.com/1.1/favorites/list.json?count=5&screen_name={screenName}";

            // The shared HttpClient keeps headers between timer runs, so clear
            // any stale Authorization value before setting the bearer token.
            client.DefaultRequestHeaders.Remove("Authorization");
            client.DefaultRequestHeaders.Add("Authorization", $"Bearer {token}");

            var favoritesResponse = await client.GetAsync(uri);
            var favesJson = await favoritesResponse.Content.ReadAsStringAsync();
            var faves = JsonConvert.DeserializeObject<List<TwitterFave>>(favesJson);
            await CategorizeThem(config, faves, log);
            log.LogInformation(faves.Count + " favorites found");
        }

        private static async Task CategorizeThem(IConfigurationRoot config, List<TwitterFave> faves, ILogger log)
        {
            var uri = config["CategorizeThemUrl"];
            var keepItResponse = await client.PostAsJsonAsync(uri, faves);
            var resp = await keepItResponse.Content.ReadAsStringAsync();
            log.LogTrace("Was it kept??? " + resp);
        }

        private static IConfigurationRoot GetConfig(ExecutionContext context)
        {
            var config = new ConfigurationBuilder()
                .SetBasePath(context.FunctionAppDirectory)
                .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
                .AddEnvironmentVariables()
                .Build();
            return config;
        }

        private static async Task<string> GetToken(string key, string secret)
        {
            // OAuth2 client_credentials flow: exchange the app key/secret
            // for an app-only bearer token.
            var uri = "https://api.twitter.com/oauth2/token";
            var bearerTokenCredentials = $"{key}:{secret}";
            var authHeader = "Basic " + System.Convert.ToBase64String(
                new UTF8Encoding().GetBytes(bearerTokenCredentials));
            client.DefaultRequestHeaders.Add("Authorization", authHeader);
            var nameValuePair = new KeyValuePair<string, string>("grant_type", "client_credentials");
            var encodedContent = new FormUrlEncodedContent(
                new List<KeyValuePair<string, string>> { nameValuePair });
            var result = await client.PostAsync(uri, encodedContent);
            var resultContent = await result.Content.ReadAsStringAsync();
            var e = JsonConvert.DeserializeObject<AccessTokenHolder>(resultContent);
            return e.access_token;
        }
    }

    public class AccessTokenHolder
    {
        public string token_type { get; set; }
        public string access_token { get; set; }
    }
}

Categorizing Likes

In the DotNetConf keynote a few weeks ago, I saw an ML.NET demo and I got the idea to use it here, too.

ML.Net to build models (easy peasy)

All my notes

I pulled all my notes in Keep to train an ML model. It was very easy, particularly because I used gkeepapi, an unsupported library for interacting with Keep.

Doing this made me glad that I could think in terms of a bunch of cooperating functions, because the function to extract the notes from Keep was written in Python, while almost everything else is in C#.

import logging
import os

import azure.functions as func
import gkeepapi
import jsonpickle


class KeepNote(object):
    """A flattened Keep note: title, text and the label used as its category."""

    def __init__(self, label, text, category):
        self.label = label
        self.text = text
        self.category = category


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    keep = gkeepapi.Keep()
    clientId = os.environ["gclientid"]
    secret = os.environ["gkey"]
    keep.login(clientId, secret)

    body = ""
    title = ""
    category = ""
    try:
        req_body = req.get_json()
    except ValueError:
        pass
    else:
        body = req_body.get('body', '')
        title = req_body.get('title', '')
        category = req_body.get('category', '')

    # If a category was supplied, store the note under that label
    # (creating the label if it doesn't exist yet) and return its id.
    if len(category) > 0:
        currentLabel = None
        for label in keep.labels():
            if label.name == category:
                currentLabel = label
                break
        if currentLabel is None:
            currentLabel = keep.createLabel(category)
        note = keep.createNote(title, body)
        note.color = gkeepapi.node.ColorValue.Pink
        note.labels.add(currentLabel)
        keep.sync()
        return func.HttpResponse(note.id)

    # Otherwise, dump every labelled note as JSON to use as training data.
    keepNotes = []
    for note in keep.all():
        try:
            for label in note.labels.all():
                keepNotes.append(KeepNote(note.title, note.text, label.name))
        except Exception:
            pass

    json = jsonpickle.encode(keepNotes)
    logging.info(json)
    return func.HttpResponse(json)
KeepIt: A function to get my notes from Google Keep
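With the notes exported, the categorizer itself is just multiclass text classification over note text and labels. The real model was built with ML.NET, but the core idea can be sketched as a tiny naive Bayes classifier in plain Python; the training rows below are made-up stand-ins for my Keep notes.

```python
import math
from collections import Counter, defaultdict

# Toy training set standing in for exported Keep notes: (text, category).
notes = [
    ("azure functions timer trigger", "cloud"),
    ("deploy app service slots", "cloud"),
    ("ml.net model builder demo", "ml"),
    ("train a text classifier", "ml"),
]

# Count words per category and documents per category.
word_counts = defaultdict(Counter)
cat_counts = Counter()
for text, cat in notes:
    cat_counts[cat] += 1
    word_counts[cat].update(text.split())

def classify(text):
    """Pick the category with the highest (smoothed) naive Bayes score."""
    words = text.split()
    vocab = len({w for c in word_counts for w in word_counts[c]})
    best, best_score = None, float("-inf")
    for cat in cat_counts:
        total = sum(word_counts[cat].values())
        # log P(cat) + sum of log P(word | cat), with add-one smoothing
        score = math.log(cat_counts[cat] / len(notes))
        for w in words:
            score += math.log((word_counts[cat][w] + 1) / (total + vocab))
        if score > best_score:
            best, best_score = cat, score
    return best

print(classify("timer trigger for azure"))  # leans toward "cloud"
```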

The funny thing is, I didn’t really need the model. Most of the things I stored in Keep were in one or two categories – not very good for modelling. I guess I hadn’t really realized that. To be honest, I was probably storing the things I put the highest priority on, which turned out to be mostly cloud things. Go figure.

How the bot will help change things

So, I’m grabbing my tweets, categorizing them based on history and preference, and I’m ready to store them – except, as I’ve shown above, my categorization is whack. Thus, I also made a chatbot that takes my liked tweets and asks me to adjust the category they’ve been labelled with.
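The bot runs on a conversational framework, but the confirmation logic itself boils down to something like the hand-rolled sketch below (the function name and reply conventions are illustrative, not the framework’s API):

```python
def confirm_category(tweet_text, guessed, reply):
    """Apply a user's reply to a guessed category.

    `reply` is what the user typed back to the bot: an affirmative answer
    keeps the guess, anything else is taken as a corrected label.
    """
    if reply.strip().lower() in ("yes", "y", "ok"):
        return guessed
    return reply.strip()

# The bot would ask something like: "I filed this under 'cloud' -- keep it?"
print(confirm_category("Azure tips thread", "cloud", "yes"))
print(confirm_category("Recipe for pelau", "cloud", "cooking"))
```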

TweetWhisperer: Helping me categorize my tweets better.

So, with these three components (a likes-harvester, a categorizer and a chatbot), maybe I’ll get better at returning to the topics I’m interested in.

Bots State

Ok, in 2016/2017 these were the bots I made:

Nurse Carter

Hansard Speaks

Time for Water

For some reason, I feel like there were more. Most likely, that’s just because I kept iterating on the ones above. I did make a few PoCs for work, like collaborating on the Teleios Code Jam one with our intern at the time, Joshua.

I also made a few that we used for demos with clients; those tied together things like QuikWorx, our low-code solution creator at Teleios, with SharePoint and Cortana.

This year, there are a few I’m going to go after in addition to iterating on the ones above. A friend of mine asked me to make a hybrid QnA CUI application. This tweet by Gary Pretty about a new way to sync QnAs might bring that back up.

My next new bot will be one that uses the Consumer Affairs Division data in some way. I hope to finish that over this long weekend in Trinidad.

One of the changes I’ve not been on top of has been the Microsoft Bot Framework. It’s gone to General Availability, and bots on the Bot Framework developer portal need to be moved over to the Azure portal by March 31. I’ll move mine and update their dependencies at the same time, to keep current with how things are done on the framework.

So, that’s it. I’m hoping for more collaborations this year, and perhaps more frequent updates too.