I hereby dub December 2020 “Save your data from Endomondo” month. Why? Because the Endomondo platform is being retired at the end of the year.
So, given this state of affairs, it would be wise to get your data off the Endomondo platform and exported somewhere safe. I made a request via their site to get all 789 of my workouts, and a few days later I got an archive that included this folder structure:
I wanted to do some analysis on my workout data, so I created a really simple ingestion tool that takes the data from the JSON documents in Workouts/ and inserts them into a SQL Server database.
I’m not super-proud of it, because it can be finicky, but it got the job done for my purposes. I deliberately skipped pulling in the available lat-lon data from the runs, because I wasn’t interested in it for the moment, but a slight modification to the approach I’ve taken would accommodate it.
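The ingestion amounts to a loop over the JSON documents, one row per workout. Here’s a minimal sketch of the idea; the field names are hypothetical (the actual Endomondo export schema differs), and sqlite3 stands in for SQL Server:

```python
import json
import sqlite3
from pathlib import Path

# Hypothetical field names -- the real Endomondo export schema is different.
FIELDS = ("sport", "start_time", "distance_km", "duration_s")

def ingest(workouts_dir, conn):
    """Read each workout JSON document in the folder and insert one row per workout."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS Workouts "
        "(sport TEXT, start_time TEXT, distance_km REAL, duration_s REAL)"
    )
    count = 0
    for path in Path(workouts_dir).glob("*.json"):
        doc = json.loads(path.read_text())
        row = tuple(doc.get(f) for f in FIELDS)  # lat-lon track points are skipped entirely
        conn.execute("INSERT INTO Workouts VALUES (?, ?, ?, ?)", row)
        count += 1
    conn.commit()
    return count
```

In the real tool, the connection would come from a SQL Server driver such as pyodbc, and the table would carry more columns, but the shape of the loop is the same.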
So, I’m glad the data is ingestible now, and I hope to do some cool stuff with it soon.
For newer devs, there can be a lot of “opportunities” to write code that benefits people other than the developer producing it. I mean, it might be someone’s “killer app idea”, a code-for-equity arrangement, or one of those “hackathons” intent on engaging innovative people to help some firm or cause figure things out.
But a lot of the code you write, especially when starting out, is going to be “free” code. Finished a tutorial and want to explore some aspects of the language? That’s free code. Spent some time considering some technology and want to see how it works if you quickly put something together? That’s free code, too.
You might have even seen an implementation of a solution and thought, “perhaps I can reason about that differently”. And you spend some time hacking together that approach. That’s free code.
As it turns out, what some people call free code is just part of how we developers learn, build and grow. Not always in that order. Ultimately, the more nuanced perspective is to learn to ask: why is this code I’m about to write valuable to me?
The answer to that should help determine if you want to press into an idea via code, or not.
Last week I presented at DevFest Caribbean, a Google Developer Group event where the GDG chapters of multiple regions came together, including the groups from Trinidad and Tobago, Guyana and Jamaica. There were some really good presentations, and you can check them out here.
My presentation focused on virtual agents I created in Microsoft Teams. I demonstrated a messaging extension from Microsoft that I extended to work with BambooHR’s platform.
Exercise is a big deal at Teleios. So, when Geon demonstrated his Power App, it inspired me to make a virtual agent to help with updating it. In my presentation, I showed a bot that uploads data for access by the Power App … plus, it pushes updates to Strava!
The Ministry of Planning has a website for checking air quality in T&T. I wrote an API to talk to that site and then a bot that works directly in Teams. The bigger challenge in this agent was getting something, anything really, up and running on my Google Home Mini. And I did! So, I was very glad.
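The scraping half of an API like that can be sketched in a few lines. The pattern below is entirely hypothetical (the Ministry’s actual page markup will differ), but it shows the shape: fetch, parse, expose:

```python
import re
import urllib.request

# Hypothetical markup assumption -- the real page's layout will differ.
AQI_PATTERN = re.compile(r"Air Quality Index[^0-9]*(\d+)", re.IGNORECASE)

def parse_aqi(html: str):
    """Pull the first AQI reading out of a page of HTML, or None if absent."""
    match = AQI_PATTERN.search(html)
    return int(match.group(1)) if match else None

def fetch_aqi(url: str):
    """Fetch the page and parse it; a bot would call this behind an API endpoint."""
    with urllib.request.urlopen(url) as resp:
        return parse_aqi(resp.read().decode("utf-8", errors="replace"))
```

The Teams bot and the Google Home agent would then both talk to that one API rather than scraping the site themselves.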
Finally, I’ve started experimenting with virtual agents that can interact with in-progress meetings on Microsoft Teams. I relied heavily on the Microsoft Graph samples library related to accessing meetings. I got a zany bot to work: it can inject audio into live meetings, without anyone having to share media from their devices. It’s great for sound effects, like a play-off sound for people who are taking up too much time in a meeting.
All told, presenting at DevFest was fun, yet again. It was my third time presenting, and third year in a row talking about conversational user interfaces. You can catch the whole talk here:
When I saw this question, my first response was similar to quite a few others on the thread. “Teach a combination of these things: CI/CD; Source Control; DevOps; Observability and Telemetry; UX…”.
But I didn’t post it, because I had an inkling that something more was needed. That a three-month course to even masters-level students might not be sufficient to hit the mark. Then this hit me:
Which true Engineering discipline takes three months to understand?
I mean, Mendez’ question asked about a Software Engineering course. I get where he’s coming from. I did my undergrad at the University of the West Indies. And there was a Software Engineering course back then. It had a textbook and everything. That’s where I learnt about Waterfall, XP and UML. Things I barely used over the last 15 years. I have drawn a lot of boxes on whiteboards, though.
I think back then the assumption was that for a bunch of “programmers” (really, it was a computer science programme), all that mattered was knowing that software is “engineered” via “a process”.
But. If that university wants to play a role in shaping future software engineers, I think a course of that duration isn’t enough. Just like with other engineering disciplines, there’s a certain base of understanding a student needs before the concerns of modern software engineering are discussed, but then the diet of consideration needs to be relevant and balanced.
At Teleios, our software engineering involves Scrum, Kanban, various forms of test automation, and front-end, back-end and distributed system development. We are continuing to deepen our investment in DevOps, telemetry and observability, and we have CI/CD as a firm goal. We have dedicated design teams who communicate customer intents in a variety of ways. We believe in continuously developing skill as engineers through training and experimentation. We use design patterns, and we’ve made up our own.
All that to say, if your software engineering programme doesn’t contemplate issues like those above, what are you doing?
To go back to the original question, enough time should be given to properly understand the difference between git rebase and git merge. And why git’s so popular in the first place. Students of software engineering should be given sight of the whole and what part code plays in it all (hint: not everything). In terms of a software development process, whichever one is selected, you don’t really learn by simply reading the relevant manifesto. You really learn by doing.
You do Scrum. Or Kanban. Or whatever process. And it takes time for that doing to get meaningful. So, I think Mendez should return to the drawing board with fresh eyes on the goal and perhaps come up with a more valuable way to help his students understand the world of software engineering.
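As for the git point above, the merge/rebase difference is easy to demonstrate on a throwaway repository. A rough sketch, assuming git is available on the PATH:

```python
import pathlib
import subprocess
import tempfile

# Identity flags so commits work in a bare environment.
GIT = ["git", "-c", "user.name=demo", "-c", "user.email=demo@example.com"]

def git(repo, *args):
    """Run a git command inside `repo` and return its stdout."""
    return subprocess.run(GIT + list(args), cwd=repo, check=True,
                          capture_output=True, text=True).stdout.strip()

def commit_file(repo, name, msg):
    pathlib.Path(repo, name).write_text(msg)
    git(repo, "add", name)
    git(repo, "commit", "-m", msg)

def linear_after_rebase():
    """True when rebasing `feature` onto the mainline leaves no merge commits."""
    repo = tempfile.mkdtemp()
    git(repo, "init")
    commit_file(repo, "a.txt", "base")
    trunk = git(repo, "rev-parse", "--abbrev-ref", "HEAD")  # master or main
    git(repo, "checkout", "-b", "feature")
    commit_file(repo, "b.txt", "feature work")
    git(repo, "checkout", trunk)
    commit_file(repo, "c.txt", "mainline moved on")
    # `git merge feature` here would record a commit with two parents;
    # `git rebase` instead replays feature's commits on top of the new tip:
    git(repo, "checkout", "feature")
    git(repo, "rebase", trunk)
    return git(repo, "log", "--merges", "--oneline") == ""
```

Running the same scenario with `git merge` instead of `git rebase` leaves a merge commit in the log, which is exactly the kind of hands-on contrast a course should walk students through.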
Going a bit further, when I first conceptualized this response, it took two paths. Either offer two to four different courses focused on elements of software engineering, so the whole MSc has a Software Engineering sub-focus, or embed SE concerns throughout the programme, with the deliberate goal of having students graduate understanding what is involved in modern software engineering because almost every course had a component focused on some capability needed to participate effectively in it.
From time to time, I work with researchers on projects outside of the day-to-day, forms-over-databases stuff. Mind you, I like a nice form over a cool data repository as much as the next guy, but it’s definitely cool to stretch your arms and do something more.
So, when Dr. Ogundipe approached me about cloudifying her satellite data processing, I had to do a lot of research. She had a processing pipeline that featured ingesting satellite images, doing some data cleanup, then analyzing those results using machine learning. Everything ran on local machines in her lab, and she knew she would run into scaling problems.
Early on, I decided on a containerized approach for some of the scripting. The Python scripts were meant to run on Windows, but I had an easier time getting Linux containers up and running, so I went with that. Once the container was in good order, I stored the image in Azure Container Registry and then fired it up using an Azure Container Instance.
Like a good story, I had started in the middle – with the data processing. I didn’t know how I would actually get non-test data into the container. Eventually, I settled on using Azure Files. Dr. Ogundipe would upload the satellite images via a network drive mapped to storage in the cloud. Since I got to have some fun with the fluent SDK in Azure a while back, I used it to build an orchestrator of sorts.
Once the orchestrator ran, it fed the satellite images into the container. Output from the container was then used to run models stored in Azure ML. Instead of detailing all the steps, this helpful diagram explains the process well:
No, not that diagram.
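In lieu of a diagram, the flow reduces to a plain skeleton. Every function below is a hypothetical placeholder for the Azure Files, container and Azure ML pieces described above; nothing here is real SDK code:

```python
def upload_images(share, images):
    """Satellite images are dropped onto a mapped network drive,
    which lands them in an Azure Files share."""
    share.extend(images)
    return share

def run_container(share):
    """The orchestrator starts the container instance, which cleans up
    the raw images and emits processed results."""
    return [f"cleaned:{img}" for img in share]

def score(results):
    """Processed output is fed to the machine-learning models for analysis."""
    return [f"scored:{r}" for r in results]

def pipeline(images):
    """Upload, process in the container, then score with the ML models."""
    share = upload_images([], images)
    return score(run_container(share))
```

The real orchestrator did the same three steps, just with the Azure fluent SDK wiring up storage, the container instance and the models.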
So, I shared some of this process at a seminar Dr. Ogundipe held to talk about the work she does, and how her company, Global Geo-Intelligence Solutions Ltd, uses a pipeline like this to detect locust movement in Kenya, assess the impact of natural disasters, and support a host of other applications of the data available from satellite images.
Recently, there was thunder and lightning and heavy rainfall in Trinidad. Trees fell, and there was a range of property damage.
In cases like this, the aftermath can involve service outages across a number of service providers. For me, it was power. There was a bright spark, a loud bang and then silence.
Right after the first jumble of thoughts, I remembered you can call the power company to find out about outages. In this case, it was 800-BULB. The operator was prompt; he helpfully indicated that it was “a big problem that will take a long time to fix”.
When we awoke hours later, there was still no power, and added to that, I could not reach the power company on the phone. It was either endless ringing or busy signals. And it made me think: calling to find out about the status of a service is an approach we should retire.
From a cloud perspective, we can readily find out if a given provider is up or down by checking one of their status pages. Both Azure and GitHub come to mind.
If we had access to the numbers, I expect the power company’s lines didn’t stop ringing until the day after the event. How did they handle the increased call load? It could have been lessened if there were an automated way for their customers to get answers, both over the phone and on the web.
I expect that some of the more advanced utility companies here in Trinidad already have internal systems that can readily provide this information; it’s therefore a matter of going the last mile and making it accessible to the public.
Of course, if utilities used a standard mechanism for reporting, then the status of a number of public utilities could be known to the nation without anyone having to make a call.
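As a sketch of what a shared format could look like (nothing below is an existing standard; the field names are my own invention):

```python
import json

def make_status(utility, service, state, detail=""):
    """One record per provider; `state` is one of up / degraded / down."""
    assert state in ("up", "degraded", "down")
    return {"utility": utility, "service": service,
            "state": state, "detail": detail}

def national_dashboard(statuses):
    """Aggregate every provider's feed into a single JSON view,
    the kind of thing a national status page could serve."""
    return json.dumps(sorted(statuses, key=lambda s: s["utility"]))
```

A phone-level IVR and a public web page could then both read from the same feed, which is the whole point of standardizing it.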
So, Microsoft’s Build 2020 Conference was held a few weeks ago. And at Teleios, we normally try to check it out because you get a good sense of where they’re going with some of the technologies and platforms we use to build solutions.
This year’s effort didn’t fail to deliver, even though the format had to be drastically changed because of the strict social distancing rules most of the planet was under due to COVID-19.
On a run a few days ago, I heard Jeff Sandquist, CVP of Developer Relations at Microsoft, talking about what it took to put on the event, likening it to a production by a movie studio. It was a great interview on a podcast called Screaming in the Cloud, in which Sandquist shared his top 5 Build announcements (starts at 41:40):
Microsoft Teams: See it as more than a communication channel; it’s a development platform. “Single largest development opportunity for developers across the planet”
Developer Productivity: A number of changes and new resources were announced to make developing solutions easier, faster and more productive.
Power Apps: A continuing commitment to enabling citizen developers to build no-code solutions that free up systems and other software developers to focus on other concerns while the organization gets things done. There was a lot to see about Power Apps and the Power Platform at Build.
“Making the best damn developer box” – Scott Hanselman’s keynote highlighted many new improvements to Windows itself and the development experience there, underscoring Microsoft’s goals around improving the tools that support the development process. From the new Windows Terminal to making 2020 (finally) the year of Linux on the desktop.
Microsoft Learn: Learning how to get into Azure and other Microsoft development technologies can be a challenge, to put it lightly. But Microsoft has recognized this and deliberately made learning about its products a strategic asset. From easy on-ramps via videos on the new Microsoft Learn site to very low barrier-to-entry access to the cloud through Azure, Learn is where it’s at. More seasoned developers will be happy to hear, too, that documentation is seen as a key part of the developer journey, and this is reflected in the way they approach it.
It was helpful to hear Sandquist talk about Microsoft’s founding moment: they started as a developer-first company, and they’re staying true to their roots.
Just about 40 submissions were made, which was instructive. I came to appreciate that if you do a salary survey in a small pool, it can be too revealing if you have access to each submission, even when the submissions are anonymized. If I were to do this again, I would bear that in mind: I might leave out a Company field, for example, in favor of Industry. I’d also drop the Job Title field and instead leave it at Category, and perhaps Years in the industry. I might include questions that get at project diversity, too.
So, this is mostly a cautionary tale about surveys like this. There is good guidance to be had from economists and people in the social sciences on how to design instruments like this, beyond simply capturing the data.
So, when I built Nurse Carter years ago, I got the data from the Ministry of Health’s website in Trinidad and Tobago. I hadn’t updated the data since. Also, I built it with version 3 of the Microsoft Bot Framework.
The Bot Framework has been updated a lot since then, and so has the health facility schedule. I took the opportunity to hit two birds with one code: I updated to the latest Bot Framework bits, changed from Node to dotnet, and refreshed the health facility data.
To make it COVID-19 relevant, I included information about that as a menu option in Nurse Carter.
When I first did Nurse Carter, I feel like I had a lot more time – lol, I did, because I was on “paternity leave” (it was regular vacation that I’m calling that). Now, working in snatches of time, I found myself thinking, “this is rell work!”
Nevertheless, I got to the submission line and entered Nurse Carter into a Devpost hackathon, and it was one of the top projects out of the 1500 that were submitted! COVID-19 has brought a lot of heartache, but I’m hoping that innovative approaches like Nurse Carter get more and more opportunities to deliver meaningful value in people’s lives.