Project VILMA

Sometimes, I get home and just wish I could say to my virtual machine in the cloud, “magico presto, turn on!” and it’s on and ready for me to remote into and do things. I had wanted to build something to make that happen, but time and procrastination happen to us all. Thankfully, there was an upcoming developer gathering that I used as the catalyst to actually build a system that would work, almost like magic.

So, last Sunday, the Port of Spain chapter of GDG (Google Developer Groups) held a developer event, #GDGPOS #DevFest. They reached out to the local developer community for interesting projects and I responded with a proposal to build something that would work in the way I described.

[Image: GDGPOS Presenters]

My proposal got accepted and I spent a few weeks building out the idea. My whole solution involved using my Google Home Mini to turn my virtual machine on or off.

To do that, I created a Google Action on the Google Actions console. I had played around with Actions before, but this would be different. I have been making most of my conversational agents using Microsoft’s Bot Framework, so a lot of the concepts were familiar to me, from Intents to Utterances and even the use of webhooks. For this Action, I largely had to focus on just one Intent – the one that would hear a command for a VM state change and execute it. Overall, the system would look like this:

[Diagram: VILMA system overview]

Creating the action

So, creating that custom Intent took me over to Dialogflow, Google’s interactive tool for building conversational interfaces. There, I created a custom intent, ChangeVMState.

ChangeVMState would receive messages and figure out whether to turn the VM on or off. The messages could be in a range of formats like:

  • turn on/off
  • power on/off
  • shutdown/start up the vm

They would all resolve to the ChangeVMState intent. All messages sent to ChangeVMState were then forwarded to my webhook. I deployed the webhook as a function in Azure.

The code to execute the functions is pretty straightforward. One function receives the request and queues it on an Azure Storage Queue. Azure Functions provides a really simple infrastructure for doing just that.

I mean, this is the whole method: 
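A minimal sketch of such a method, assuming an HTTP-triggered Azure Function with a Storage Queue output binding (the function name, queue name, and request parsing below are illustrative, not the actual code), could look like this:

```csharp
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class ChangeVMStateFunction
{
    // Receives the webhook call from Dialogflow and drops the desired
    // VM state onto a Storage Queue for another function to act on.
    [FunctionName("ChangeVMState")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
        [Queue("vm-state-requests")] IAsyncCollector<string> stateQueue)
    {
        // Parsing of the Dialogflow payload is simplified here: assume the
        // body is just the desired state ("start" or "stop")
        string desiredState = await req.Content.ReadAsStringAsync();

        // Queue it; the queue-triggered function does the actual work
        await stateQueue.AddAsync(desiredState);

        return new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent($"Queued request to {desiredState} the VM.")
        };
    }
}
```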

The item being put on the queue – the desired VM state – is just a variable being set. 

Another function in Azure then takes up the values on the queue and starts or stops the VM based on the desired state. Again, a pretty simple bit of code.

I’m using the Azure Fluent Management SDK to start/stop the VM.
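As a rough sketch only (the queue name, resource group, VM name, and credentials file below are placeholders), that queue-triggered function could look something like this with the Fluent SDK:

```csharp
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class ApplyVMStateFunction
{
    // Picks up the desired state from the Storage Queue and applies it to the VM.
    [FunctionName("ApplyVMState")]
    public static void Run(
        [QueueTrigger("vm-state-requests")] string desiredState,
        TraceWriter log)
    {
        // Authenticate with a service principal described in an auth file
        var credentials = SdkContext.AzureCredentialsFactory.FromFile("azureauth.json");

        var azure = Azure.Configure()
            .Authenticate(credentials)
            .WithDefaultSubscription();

        // Look up the VM by resource group and name (placeholders)
        var vm = azure.VirtualMachines.GetByResourceGroup("vilma-rg", "vilma-vm");

        if (desiredState == "start")
            vm.Start();        // boot the VM
        else
            vm.Deallocate();   // stop it and release the compute

        log.Info($"Applied VM state change: {desiredState}");
    }
}
```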

So, finally, after the VM is put into the desired state, an email is sent saying either that the VM is off, or that it’s on, with an RDP file attached. Ideally, I wanted to have the Google Assistant I was using notify me when the VM got up and running, but I just couldn’t get push notifications working – which is why I ended up with email.
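One way to do that email step (purely a sketch; the SMTP details and addresses are made up, and a minimal .rdp file is really just a couple of lines of text) is something like:

```csharp
using System.IO;
using System.Net;
using System.Net.Mail;
using System.Text;

public static class Notifier
{
    // Sends a notification email; when the VM is on, attaches a minimal RDP file
    // pointing at its public IP. Addresses and credentials here are placeholders.
    public static void SendStatusEmail(string vmIpAddress, bool vmIsOn)
    {
        var message = new MailMessage("vilma@example.com", "me@example.com")
        {
            Subject = vmIsOn ? "VM is up" : "VM is off",
            Body = vmIsOn ? "Your VM is running. RDP file attached." : "Your VM has been stopped."
        };

        if (vmIsOn)
        {
            // A minimal .rdp file: just the address to connect to
            var rdp = $"full address:s:{vmIpAddress}:3389\r\nprompt for credentials:i:1\r\n";
            var stream = new MemoryStream(Encoding.UTF8.GetBytes(rdp));
            message.Attachments.Add(new Attachment(stream, "vilma-vm.rdp", "application/x-rdp"));
        }

        using (var smtp = new SmtpClient("smtp.example.com") { EnableSsl = true,
            Credentials = new NetworkCredential("user", "password") })
        {
            smtp.Send(message);
        }
    }
}
```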

Thus, I ended up with a Google Action that I eventually called VILMA Agent (at first, I was calling it Shelly-Ann). I could say to my Google Mini, “OK, Google, tell VILMA Agent, turn on” and I’d get an email with an RDP file.

The code for the functions part of VILMA is up on GitHub.


Provisioning some test storage accounts for class

I wanted to create a few storage accounts for students in my class to complete an assignment featuring Event Sourcing and Materialized Views.

So, here’s what I did.

Download/install the latest Azure command-line interface (CLI).
(While doing this, I realized I could have just used the Cloud Shell. I soldiered on with the download.)

Create a resource group to contain the accounts we’d need.
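With the CLI, that’s a one-liner; the group name and region here are just examples:

```bash
az group create --name class-storage-rg --location eastus
```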

Create the accounts and output the storage account keys
The command to make a single storage account is pretty straightforward:
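Something along these lines, with an illustrative account name, resource group, and SKU:

```bash
az storage account create --name classstudent01 --resource-group class-storage-rg \
    --location eastus --sku Standard_LRS
```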

But I wanted to also output the keys and display them on a single line. The command to get the keys after the account is created is this:
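For example (again with placeholder names), the keys come back as a JSON array:

```bash
az storage account keys list --account-name classstudent01 --resource-group class-storage-rg
```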

So, I used the jq program in bash to parse the JSON result and display both keys on a line. Thus, I created a script that would create the accounts and then output their storage account keys.
This is the script that produced the accounts and keys:
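A sketch of what such a script could look like, assuming the account names are generated from a simple counter and jq joins both key values onto one line:

```bash
#!/bin/bash
# Create one storage account per student and print "<account> <key1> <key2>" per line.
# The resource group, region, and naming scheme are illustrative.
RESOURCE_GROUP="class-storage-rg"
LOCATION="eastus"

for i in $(seq 1 10); do
    ACCOUNT="classstudent$(printf '%02d' "$i")"

    # Create the storage account
    az storage account create --name "$ACCOUNT" --resource-group "$RESOURCE_GROUP" \
        --location "$LOCATION" --sku Standard_LRS > /dev/null

    # List the keys and flatten both values onto a single line with jq
    KEYS=$(az storage account keys list --account-name "$ACCOUNT" \
        --resource-group "$RESOURCE_GROUP" | jq -r '[.[].value] | join(" ")')

    echo "$ACCOUNT $KEYS"
done
```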

Overall, the longest part of the exercise was dealing with the way the files were being saved in Windows versus how they were being saved and read by bash. But the accounts were created and the class can get on with assignment 2.

Teaching Cloud Technologies (2016/2017)

We just concluded another run of the Cloud Technologies course at the University of the West Indies. This course is part of the MSc Computer Science program.

As lecturer, I had to come up with the course outline as well as the content. In so doing, I got the opportunity to refresh what we talk about as well as how we approach assignments.

This course comprised ten modules:

  1. Intro to Cloud Technologies
  2. Cloud computing infrastructure
  3. Virtualization
  4. Big data
  5. Cloud development patterns
  6. Cloud resource management
  7. IaaS Automation
  8. Microservices
  9. IoT
  10. Cloud for Research

Though our primary cloud platform is Microsoft Azure, students are free to use other cloud providers for their assignments and project submissions.

One of the assignments involves virtual machine scale sets and containers. The draft goes like this:

Z. Zanko Systems provide sales processing systems for large commercial banks.

They receive more than 5 million JSON requests per hour (revised to 300,000).
Each request must be stored in permanent storage. The format of the request is:
{"TransactionID":"1","UserId":"A1","SellerID":"S1","Product Name":"Financial Trap","Sale Price":1000000,"Transaction Date":" "}

You have been hired as a System Developer by Zanko. You have access to VMs whose capacity is equal to that of A1 VMs in Azure IaaS or Containers of similar capacity.

  • Develop a mechanism to generate the requests your system faces.
  • Design and implement a solution using a container-based approach or a virtual machine-based one to process 5 million requests in an hour.
  • For your receivers, introduce a failure rate.
  • Store the occurrence of failures.
  • Justify how you chose to store and monitor failures.
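To give a sense of scale, 5 million requests an hour is roughly 1,400 requests per second, and the revised 300,000 is about 83 per second. A bare-bones generator that a student might start from, posting the sample payload to a hypothetical receiver endpoint, could look like this:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class RequestGenerator
{
    private static readonly HttpClient Client = new HttpClient();

    public static async Task Main()
    {
        // The receiver URL is a placeholder for wherever the VMs/containers listen
        var receiverUrl = "http://my-receiver.example.com/api/sales";
        var random = new Random();

        for (int i = 0; ; i++)
        {
            // Build a request in the assignment's JSON format
            var json = "{\"TransactionID\":\"" + i + "\"," +
                       "\"UserId\":\"A" + random.Next(1, 100) + "\"," +
                       "\"SellerID\":\"S" + random.Next(1, 50) + "\"," +
                       "\"Product Name\":\"Financial Trap\"," +
                       "\"Sale Price\":1000000," +
                       "\"Transaction Date\":\"" + DateTime.UtcNow.ToString("o") + "\"}";

            await Client.PostAsync(receiverUrl,
                new StringContent(json, Encoding.UTF8, "application/json"));
        }
    }
}
```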

Though most students built this out using Azure, one enterprising student chose to use AWS, and reading his submission gave a nice view of getting this done with Amazon’s resources versus Azure’s.

This year, Microsoft put a halt to the Azure for Education academic grant, but still had a number of other ways for students to get into the cloud, including DreamSpark and other offers.

The project component this year changed a bit, too. In the two years prior, we asked students to build working cloud services themselves. This year, we asked them to propose a cloud service that demonstrated understanding of:

  • Cloud service definition
  • Cloud service models
  • Cloud delivery models
  • Cloud for research
  • Cloud development with a regional focus

We saw some excellent solutions that we hope to hear more about in the future.