Categories
Cloud TrinidadAndTobago

If a tree falls in the forest…

At about 5:00 am, the fans stopped spinning. And we knew there was a power outage. We rolled back to sleep in the embers of the night and thought, “oh well, they’ll sort themselves out”.

We were jolted out of sleep two hours later by the loud crash of something down the road.

A massive tree had fallen. It made the electricity company seem far more prescient than I had ever given it credit for.

The falling tree pulled down wires from two poles, bent one of them over at an acute angle and pushed cables into the nearby river.

Early morning, early January disaster.

By the time I walked down to check out what was going on, with only my phone in hand, the community response was well underway.

The community seemed battle-hardened by these events. My wide-eyed, city-boy confusion melted away. A man in a van turned up with not one, not two, but three chainsaws. Others turned up with rope, and van man sent for gasoline.

The army was on the scene relatively quickly too. They closed the road and essentially kept people who weren’t helpful at a useful distance. Me? They kept me away.

The men of the neighbourhood started cutting, and when the fire services arrived, with their coordination and support, the tree was eventually moved aside.

Cars could pass once again, though of course, slowly. By the time the electricity company arrived, the road was clear enough to let them begin the repair process.

The situation reminded me of the need for status updates from utilities. There’s clearly a chain of events needed here. The community response was an amazing, welcome first step, but that seemed to depend on having a proactive neighbourhood. The baton was passed to the fire services, which made way for the team from the electricity company.

Who would tell the other service providers? I didn’t see any communication utilities on the scene. Were they aware? Would they spring into action like the men with the chainsaws? This is doubtful.

Also, my family and I temporarily decamped to a place to get power, Internet and some chill. When should we go back? Again, it would be great to be able to check something like “status.utility.co.tt” to find out.

For now, I’d actually settle for an SMS or WhatsApp from the providers. To date, we’ve gotten none. It seems like the best response will remain that of individuals and neighbours, who proactively set up their own systems, limited as they are, until better can be done.

Categories
Cloud Tracks

Save (your data from) Endomondo Month!

Heh.

I hereby dub December, 2020, “Save your data from Endomondo” month. Why?

Endomondo’s retiring from the game.

So, given this state of affairs, it would be wise to ensure your data on the Endomondo platform is exported somewhere. I made a request via their site to get all 789 of my workouts and, a few days later, I got an archive that included this folder structure:

I wanted to do some analysis on my workout data, so I created a really simple ingestion tool that takes the data from the JSON documents in Workouts/ and inserts it into a SQL Server database.

The tool can be found in this repo.

The key thing about this tool is that I had to fiddle with Endomondo’s JSON output to get it to play nice with my approach to serialization:

https://github.com/irwinwilliams/endomondo-json-to-sql/blob/master/WorkoutExtractor.cs

I’m not super-proud of it, because it can be finicky, but it got the job done for my purposes. I deliberately skipped pulling in the available lat-lon data from the runs, because I wasn’t interested in it for the moment, but a slight modification to the approach I’ve taken would accommodate that.
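For a sense of what the ingestion boils down to, here’s a minimal sketch. It’s not the actual WorkoutExtractor.cs: the table shape, the JSON field names and the connection string are all placeholders I’ve assumed for illustration, and the JSONPath probing stands in for the serialization fiddling mentioned above.

// A minimal sketch, not the actual WorkoutExtractor.cs. The table, column and
// JSON field names here are assumptions for illustration only.
using System;
using System.IO;
using System.Linq;
using Microsoft.Data.SqlClient;
using Newtonsoft.Json.Linq;

class WorkoutIngest
{
    static void Main()
    {
        const string connectionString =
            "Server=.;Database=Fitness;Integrated Security=true;TrustServerCertificate=true";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        foreach (var file in Directory.EnumerateFiles("Workouts", "*.json"))
        {
            // The export doesn't map cleanly onto a flat POCO, so probe each document
            // for the handful of fields we care about (field names assumed).
            var doc = JToken.Parse(File.ReadAllText(file));
            var sport = (string)doc.SelectTokens("$..sport").FirstOrDefault() ?? "unknown";
            var distanceKm = (double?)doc.SelectTokens("$..distance_km").FirstOrDefault() ?? 0.0;
            var startTime = (DateTime?)doc.SelectTokens("$..start_time").FirstOrDefault() ?? DateTime.MinValue;

            using var command = new SqlCommand(
                "INSERT INTO dbo.Workouts (Sport, DistanceKm, StartTime) VALUES (@sport, @distanceKm, @startTime)",
                connection);
            command.Parameters.AddWithValue("@sport", sport);
            command.Parameters.AddWithValue("@distanceKm", distanceKm);
            command.Parameters.AddWithValue("@startTime", startTime);
            command.ExecuteNonQuery();
        }
    }
}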

So, I’m glad the data is ingestible now, and I hope to do some cool stuff with it soon.

Categories
Advocacy

“Never code for free”

A post on the Caribbean Developers group

I can get where this advice is coming from.

For newer devs, there can be a lot of “opportunities” to write code that benefits people other than the developer producing it. I mean, it might be someone’s “killer app idea”, or a code-for-equity something, or one of those “hackathons” intent on engaging innovative people to help some firm or cause figure things out.

But a lot of the code you write, especially when starting out, is going to be “free” code. Finished a tutorial and want to explore some aspects of the language? That’s free code. Spent some time considering some technology and want to see how it works if you put something together, quickly? That’s free code, too.

You might have even seen an implementation of a solution and thought, “perhaps I can reason about that differently”. And you spend some time hacking together that approach. That’s free code.

As it turns out, what some people call free code is just a part of how we developers learn, build and grow. Not always in that order. Ultimately, a more nuanced perspective is that one should learn to ask: why is this code I’m going to write valuable to me?

The answer to that should help determine if you want to press into an idea via code, or not.

Categories
Uncategorized

Less wizardry, more automation

“I want to book an appointment to renew my license in Arima at your nearest available date in the morning.”

I wonder if I can make a chatbot that does this?

It’s a reworking of this: https://licensingappointment.mowt.gov.tt/

#TODO

Categories
Uncategorized

Agents of TEAMS

Hey!

Last week I presented at DevFest Caribbean. It was a Google Developer Group event where GDG chapters from multiple territories came together, including the groups from Trinidad and Tobago, Guyana and Jamaica, among others. There were some really good presentations, and you can check them out here.

My presentation focused on virtual agents I created in Microsoft Teams. I demonstrated a messaging extension from Microsoft, which I extended to work with BambooHR’s platform.

Exercise is a big deal at Teleios. So, when Geon demonstrated his Power App, it inspired me to make a virtual agent to help with the updates. In my presentation, I showed a bot that uploads data the Power App can access … plus, it pushes updates to Strava!
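The Strava piece is mostly one authenticated HTTP call. Here’s a rough sketch of the kind of request involved, assuming you already have an OAuth access token with the activity:write scope; it isn’t the bot’s actual code and the activity values are made up.

// Rough sketch of pushing a manual activity to Strava (not the bot's actual code).
// Assumes an OAuth access token with the activity:write scope is already available.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class StravaPush
{
    static async Task Main()
    {
        var accessToken = Environment.GetEnvironmentVariable("STRAVA_ACCESS_TOKEN");

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // Strava's "create activity" endpoint takes form-encoded fields.
        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["name"] = "Morning run (logged via Teams bot)",
            ["type"] = "Run",
            ["start_date_local"] = DateTime.Now.AddMinutes(-30).ToString("s"),
            ["elapsed_time"] = "1800",   // seconds
            ["distance"] = "5000"        // metres
        });

        var response = await http.PostAsync("https://www.strava.com/api/v3/activities", form);
        Console.WriteLine($"Strava responded: {(int)response.StatusCode}");
    }
}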

The Ministry of Planning has a website for checking air quality in T&T. I wrote an API to talk to that site and then a bot that works in Teams directly. The bigger challenge with this agent was getting something, anything really, up and running on my Google Home Mini. And I did! So, I was very glad.
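On the Teams side, the bot itself doesn’t need much. Here’s a minimal Bot Framework sketch of how an air quality agent like that can answer in a channel; the wrapper API URL and its query parameter are hypothetical stand-ins for the API I wrote.

// Minimal Bot Framework sketch of a Teams bot that answers air quality questions.
// The API URL and query parameter below are hypothetical stand-ins.
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Schema;

public class AirQualityBot : ActivityHandler
{
    private static readonly HttpClient Http = new HttpClient();

    protected override async Task OnMessageActivityAsync(
        ITurnContext<IMessageActivity> turnContext,
        CancellationToken cancellationToken)
    {
        // e.g. "air quality in Port of Spain" -> pass the text through to the API
        var query = turnContext.Activity.Text?.Trim() ?? string.Empty;

        // Hypothetical wrapper API over the Ministry of Planning site.
        var reading = await Http.GetStringAsync(
            $"https://example-airquality-api.azurewebsites.net/api/aqi?location={System.Uri.EscapeDataString(query)}");

        await turnContext.SendActivityAsync(
            MessageFactory.Text($"Latest reading: {reading}"),
            cancellationToken);
    }
}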

Finally, I’ve started experimenting with virtual agents that can interact with in-progress meetings on Microsoft Teams. I relied heavily on the Microsoft Graph samples related to accessing meetings. I got a zany bot to work. It can inject audio into live meetings, without anyone having to share media from their devices. It’s great for sound effects, like a play-off sound for people who are taking up too much time in a meeting.

All told, presenting at DevFest was fun, yet again. It was my third time presenting, and third year in a row talking about conversational user interfaces. You can catch the whole talk here:

Categories
Uncategorized

A Software Engineering Course

A good conversation started because of that post on Caribbean Developers. (Also, my Facebook is in Pirate, don’t hate, lol).

When I saw this question, my first response was similar to quite a few others on the thread. “Teach a combination of these things: CI/CD; Source Control; DevOps; Observability and Telemetry; UX…”.

But I didn’t post it, because I had an inkling that something more was needed. That a three-month course, even for masters-level students, might not be sufficient to hit the mark. Then this hit me:

Which true Engineering discipline takes three months to understand?

I mean, Mendez’ question asked about a Software Engineering course. I get where he’s coming from. I did my undergrad at the University of the West Indies. And there was a Software Engineering course back then. It had a textbook and everything. That’s where I learnt about Waterfall, XP and UML. Things I barely used over the last 15 years. I have drawn a lot of boxes on whiteboards, though.

I think, back then, the thinking was that for a bunch of “programmers” (really, it was a computer science programme), all that mattered was knowing that software is “engineered” via “a process”.

But. If that university wants to play a role in shaping future software engineers, I think a course of that duration isn’t enough. Just like with other engineering disciplines, there’s a certain base of understanding a student needs before the concerns of modern software engineering are discussed, but then the diet of considerations needs to be relevant and balanced.

At Teleios, our software engineering involves using Scrum, Kanban, various forms of test automation, and front-end, back-end and distributed system development. We are continuing to deepen our investment in DevOps, telemetry & observability, and have CI/CD as firm goals. We have dedicated design teams who communicate customer intents in a variety of ways. We believe in continuously developing skill as engineers through training and experimentation. We use design patterns, and we’ve made up our own.

All that to say, if your software engineering programme doesn’t contemplate issues like the ones above, what are you doing?

To go back to the original question, enough time should be given to properly understanding the difference between git rebase and git merge. And even why git is so popular. Students of software engineering should be given sight of the whole and what part code plays in it all (hint: not everything). In terms of a software development process, whichever one is selected, you don’t really learn by simply reading the relevant manifesto. You really learn by doing.

You do Scrum. Or Kanban. Or whatever process. And it takes time for that doing to get meaningful. So, I think Mendez should return to the drawing board with fresh eyes on the goal and perhaps come up with a more valuable way to help his students understand the world of software engineering.

Going a bit further, when I first conceptualized this response, it took two paths. Either offer two to four different courses focused on elements of software engineering, so the whole MSc has a Software Engineering sub-focus, or embed SE concerns throughout the programme, with the deliberate goal of having students graduate understanding what is involved in modern software engineering, because almost every course had a component or a focus on some capability needed to participate effectively in software engineering.

Categories
Uncategorized

Back to the Sky: Processing Satellite Data Using Cloud Computing

From time to time, I work with researchers on projects outside of the day-to-day, forms-over-databases stuff. Mind you, I like a nice form over a cool data repository as much as the next guy, but it’s definitely cool to stretch your arms and do something more.

So, when Dr. Ogundipe approached me about cloudifying her satellite data processing, I had to do a lot of research. She had a processing pipeline that featured ingesting satellite images, doing some data cleanup, then analyzing those results using machine learning. Everything ran on local machines in her lab, and she knew she would run into scaling problems.

Early on, I decided to use a containerized approach for some of the scripting that was performed. The Python scripts were meant to run on Windows, but I had an easier go at the time getting Linux containers up and running, so I went with that. Once the container was in good order, I stored the image in the Azure Container Registry and then fired it up using an Azure Container Instance.

Like a good story, I had started in the middle – with the data processing. I didn’t know how I would actually get non-test data into the container. Eventually, I settled on using Azure Files: Dr. Ogundipe would upload the satellite images via a network drive mapped to storage in the cloud. Since I’d had some fun with the fluent SDK in Azure a while back, I used it to build an orchestrator of sorts.
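The core of that orchestrator looks something like the sketch below: authenticate, then define a container group that pulls the processing image from the registry and mounts the Azure Files share the images land on. Treat it as a sketch built on the fluent SDK; the resource names, region, credentials and mount path are placeholders rather than the real setup.

// Sketch of the orchestrator's core step using the Azure fluent management SDK.
// Resource names, region, credentials and paths are placeholders, not the real setup.
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;

class SatelliteOrchestrator
{
    static void Main()
    {
        var credentials = SdkContext.AzureCredentialsFactory
            .FromFile("azureauth.properties");   // service principal auth file

        IAzure azure = Azure.Authenticate(credentials).WithDefaultSubscription();

        // Run the containerized processing scripts against the file share where the
        // satellite images are uploaded (the mapped network drive on-premises).
        var containerGroup = azure.ContainerGroups
            .Define("satellite-processing")
            .WithRegion(Region.USEast)
            .WithExistingResourceGroup("satellite-rg")
            .WithLinux()
            .WithPrivateImageRegistry("myregistry.azurecr.io", "registryUser", "registryPassword")
            .WithNewAzureFileShareVolume("imagery", "satellite-images")
            .DefineContainerInstance("processor")
                .WithImage("myregistry.azurecr.io/satellite-processor:latest")
                .WithoutPorts()
                .WithCpuCoreCount(2)
                .WithMemorySizeInGB(4)
                .WithVolumeMountSetting("imagery", "/mnt/images")
                .Attach()
            .Create();

        System.Console.WriteLine($"Started container group: {containerGroup.Name}");
    }
}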

Once the orchestrator had run, it would have fed the satellite images into the container. Output from the container was used to run models stored in Azure ML. Instead of detailing all the steps, this helpful diagram explains the process well:

Super simple.

No, not that diagram.

The various cloud resources used to process satellite data in Azure.

So, I shared some of this process at a seminar Dr. Ogundipe held to talk about the work she does, and how her company, Global Geo-Intelligence Solutions Ltd, uses a pipeline like this to detect locust movement in Kenya, assess the impact of natural disasters, and support a host of other applications of the data available from satellite images.

Categories
Uncategorized

status.[utility].co.tt

Recently, there was thunder and lightning and heavy rainfall in Trinidad. Trees fell, and there was a range of property damage.

In cases like this, the aftermath can involve service outages across a number of service providers. For me, it was power. There was a bright spark, a loud bang and then silence.

Right after the first jumble of thoughts, I remembered you can call the power company to find out about outages. In this case, it was 800-BULB. The operator was prompt; he helpfully indicated that it was “a big problem that will take a long time to fix”.

When we awoke hours later, there was still no power, and added to that, I could not reach the power company on the phone. Either a lot of ringing or busy signals. And it made me think: calling to find out about the status of a service is an approach we should retire.

From a cloud perspective, we can readily find out if a given provider is up or down by checking one of their status pages. Both Azure and GitHub come to mind.

If we had access to the numbers, I expect we’d see that the power company’s lines didn’t stop ringing until the day after the event. How did they handle the increased call load? It could have been lessened if there were an automated way for their customers to get answers, both at the phone level and on the web.

I expect that some of the more advanced utility companies here in Trinidad already have internal systems that can readily provide this information; it’s therefore a matter of going the last mile and making it accessible to the public.

Of course, if utilities used a standard mechanism for reporting, then the status of a number of public utilities could be known to the nation without anyone having to make a call.
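To make that concrete, the “last mile” could be as small as a public endpoint returning a standard payload. Here’s a hypothetical sketch using ASP.NET Core minimal APIs (assuming a .NET 6+ project); the route and the fields are a suggested shape, not any utility’s actual format.

// Hypothetical sketch of a public status endpoint for a utility, e.g.
// status.utility.co.tt/status. The payload fields are a suggestion only.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/status", () => new
{
    provider = "Power Co",                       // placeholder name
    overall = "degraded",                        // e.g. operational | degraded | outage
    updatedAt = DateTime.UtcNow,
    incidents = new[]
    {
        new
        {
            area = "Santa Cruz",
            status = "outage",
            summary = "Fallen tree took down lines; crews dispatched",
            estimatedRestoration = DateTime.UtcNow.AddHours(6)
        }
    }
});

app.Run();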

What a day that would be.

Categories
reference counting

Build 2020: Top 5

So, Microsoft’s Build 2020 Conference was held a few weeks ago. And at Teleios, we normally try to check it out because you get a good sense of where they’re going with some of the technologies and platforms we use to build solutions.

This year’s effort didn’t fail to deliver, even though the format had to be drastically changed because of the strict social distancing rules most of the planet was under due to COVID-19.

On a run a few days ago, I heard Jeff Sandquist, CVP of developer relations at Microsoft, talking about what it took to put on the event, likening it to a production by a movie studio. It was a great interview on a podcast called Screaming in the Cloud, in which Sandquist shared his top 5 Build announcements (starts at 41:40):

  1. Microsoft Teams: See it as more than a communication channel; it’s a development platform.
    “Single largest development opportunity for developers across the planet”
  2. Developer Productivity: A number of changes and new resources were announced to make developing solutions easier, faster and more productive.
  3. Power Apps: A continuing commitment to enabling citizen developers to build no-code solutions, freeing up systems developers and other software developers to focus on other concerns while the organization gets things done. There was a lot to see about Power Apps and the Power Platform at Build.
  4. “Making the best damn developer box” – Scott Hanselman’s keynote highlighted many new improvements to Windows itself and to the development experience there, underscoring Microsoft’s goals around improving the tools that support the development process. From the new Windows Terminal, to making 2020 (finally) the year of Linux on the desktop.
  5. Microsoft Learn: Learning how to get into Azure and other Microsoft development technologies can be a challenge, to put it lightly. But Microsoft has recognized this and deliberately made learning about its products a strategic asset. From easy on-ramps via videos on the new Microsoft Learn site, to very low barrier-to-entry access to the cloud through Azure, Learn is where it’s at. More seasoned developers will be happy to hear, too, that documentation is seen as a key part of the developer journey, and this is reflected in the way they approach it.

It was helpful to hear Sandquist talk about Microsoft’s founding moment: that they started as a developer-first company and they’re staying true to their roots.

Categories
Uncategorized

Caribbean Developers’ Salary Survey – Redux

I saw on the Caribbean Developers’ Facebook group someone looking to run a salary survey:

I dig Matthew’s approach, because instead of doing the survey and trying to see if there’s interest, he’s looking to see if there’s any interest first and then leaning into the work.

From the looks of it, interest is low, maybe it will grow, but I wouldn’t be surprised if it doesn’t.

In 2018, I had a similar idea, for the same Caribbean Devs group. I may have been inspired by the StackOverflow Developer Survey and my own curiosity as it relates to the job market.

Here are my survey results from back then:

Just about 40 submissions were made, which was instructive. I came to appreciate that if you do a salary survey in a small pool, it can be too revealing if you have access to each submission, even if the submissions are anonymized. If I were to do this again, I would bear that in mind: I might leave out a Company field, for example, in favor of Industry. I’d also not include a Job Title field and instead leave it at Category and perhaps Years in the industry. I might also include questions that get to the heart of project diversity, too.

So, this is mostly a cautionary tale about surveys like this. There is good guidance to be had from economists and people in the social sciences on how to design these instruments, beyond simply capturing the data.