Sometimes I think it makes sense to keep very simple tasks outside core application code. For instance, suppose a particular individual stakeholder wants the results of an SQL query emailed to them daily. In the past, many such reports have been implemented as scheduled tasks within heavyweight monolithic Spring applications. The main problem with this approach was that the simplicity of the task was not a good match for the turnaround time. Especially in the days before continuous delivery, adding to and tweaking these reports was an unnecessarily slow and unwieldy process.
Changes to core application code need to be carefully managed, with requirements gathering, automated tests, user acceptance testing and so on across a number of different environments before finally being deployed to production. It gets worse when you have a tight release schedule with lots of stakeholders seeking to have new features released at the same time.
As such, even before moving to our new software stack, we had started moving away from this approach. Where possible, we would handle these reports as standalone scripts which would be handed off to the DBA team for automated execution and distribution outside the scope of the main application. Now that we have moved to Azure, we can handle all of this quite easily from within an Azure Logic App.
INTRODUCING AZURE LOGIC APPS
In this post I don’t aim to give a comprehensive overview of what Azure Logic Apps are about and what they can do. That’s easy enough for you to learn for yourself. I also can’t claim to be an expert or to have researched this very deeply. What I’m hoping to do instead is to make you aware of the existence of this tool and to explain how I used it to solve a problem, so that you’ll have it ready in your back pocket next time you face a similar problem.
Azure Logic Apps allow you to quickly build workflows where inputs and outputs from heterogeneous sources and destinations are wired together with a variety of Connectors and standard control flow structures (ifs, loops etc.). Connectors include interfaces to tools, software and protocols as varied as
- Proprietary file formats such as Excel
- Open file formats such as JSON, XML, CSV etc.
- Data sources such as SQL Server, MySQL, Azure Cosmos DB (possibly the only NoSQL option — no MongoDB or Cassandra)
- Social media platforms such as Twitter, Facebook, Instagram
- File hosting such as FTP, SFTP, Dropbox, Google Drive, Azure File Storage
- Collaboration tools such as SharePoint, Slack, GitHub, JIRA
- Email tools such as Outlook, Gmail, SendGrid, Mailchimp
- Web search tools such as Bing Search (no Google Search, it seems)
There are hundreds of the things (233 at the time of writing).
Wiring it together
It starts with a trigger to kick off the job. Again there are lots of these, with many Connectors providing several options. To give you a flavour, possible triggers include
- a recurring schedule
- a tweet addressed to a certain Twitter account
- a certain HTTP Request
- a file being added to an SFTP server
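Under the hood, every Logic App is backed by a JSON workflow definition — the same thing the drag-and-drop designer is editing for you. As a rough, abbreviated sketch only (the step name is my own, and a real definition also carries a `$schema` reference and connection parameters), a daily recurrence trigger boils down to something like this:

```json
{
  "triggers": {
    "Run_every_morning": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "schedule": { "hours": [ "7" ] }
      }
    }
  }
}
```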
From there, you can add further steps. Each step can be a control flow structure, a variable manipulation, or an action provided by one of the aforementioned connectors. Many steps can write to and read from what are effectively global variables, so that the output from one connector can provide the input to another. It’s all done with a drag-and-drop user interface, which means there is very little barrier to entry. It’s easy to learn and easy to get started, which to be honest is one of the major selling points in my view.
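In the underlying JSON definition, that wiring shows up as `runAfter` dependencies between steps and expressions like `body('Step_name')` that reference an earlier step’s output. An illustrative sketch, with step names invented by me and the connector-specific inputs of the first step elided:

```json
{
  "actions": {
    "Execute_query": {
      "type": "ApiConnection",
      "runAfter": {},
      "inputs": {}
    },
    "Email_results": {
      "type": "ApiConnection",
      "runAfter": { "Execute_query": [ "Succeeded" ] },
      "inputs": {
        "body": {
          "Subject": "Daily report",
          "Body": "@{body('Execute_query')}"
        }
      }
    }
  }
}
```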
AN EXAMPLE — DAILY COMPLIANCE REPORTS
A stakeholder in a particular business wanted daily reports on which customers had been conducting transactions lately. This was a new report, and the requirements were not settled. The customer in this case hadn’t yet figured out exactly what she wanted, but she did need to see results soon. Azure Logic Apps were a good fit because they would allow us to collaboratively and iteratively build the report she needed, with very rapid turnaround times and no constraints from a busy core application release schedule.
These screenshots of the user interface show some features of the final Azure Logic App.
Performing a query and accessing the results
Sending the results
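Screenshots aside, the shape of the finished app can be sketched in its JSON definition form. To be clear, this is an illustrative reconstruction, not the real definition: the overall structure (recurrence trigger, SQL query, CSV formatting, email) matches the app, but the connection paths, step names, addresses and query below are all stand-ins:

```json
{
  "triggers": {
    "Daily": {
      "type": "Recurrence",
      "recurrence": { "frequency": "Day", "interval": 1 }
    }
  },
  "actions": {
    "Execute_query": {
      "type": "ApiConnection",
      "runAfter": {},
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['sql']['connectionId']" } },
        "method": "post",
        "path": "/datasets/default/query/sql",
        "body": { "query": "SELECT CustomerId, TransactionDate FROM Transactions WHERE TransactionDate >= DATEADD(day, -1, GETDATE())" }
      }
    },
    "Create_CSV_table": {
      "type": "Table",
      "runAfter": { "Execute_query": [ "Succeeded" ] },
      "inputs": {
        "format": "CSV",
        "from": "@body('Execute_query')?['resultsets']?['Table1']"
      }
    },
    "Send_email": {
      "type": "ApiConnection",
      "runAfter": { "Create_CSV_table": [ "Succeeded" ] },
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['office365']['connectionId']" } },
        "method": "post",
        "path": "/v2/Mail",
        "body": {
          "To": "stakeholder@example.com",
          "Subject": "Daily compliance report",
          "Body": "@{body('Create_CSV_table')}"
        }
      }
    }
  }
}
```

The nice thing is that you never have to hand-write any of this; the designer generates it, and the JSON view is mainly useful for review and version tracking.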
WHEN TO USE, AND WHEN TO AVOID
Opinions on this will vary. I’m sure Microsoft would be keen to promote this as a way to build all sorts of complex software. Maybe that could work, but that’s not what I’m recommending it for. We already have a sophisticated computing platform for producing quality software and delivering it robustly and safely. That’s not what this is about, at least for me. If you’re waiting for me to get into unit testing and static analysis, you’re going to be disappointed.
What this is about is having a tool that can deliver new non-critical functionality from scratch within hours, when the customer values rapidity over robustness and bugs or downtime are no big deal. And when I say “customer”, I might well mean a project manager or member of the development team. This is for trivial quick’n’dirty low-stakes throwaway stuff where the official procedures are overkill: the kind of stuff you might otherwise deal with using something like a scheduled shell script rather than as a core application feature. It might also be useful for prototyping.
Pros & Cons
The advantages this has over a shell script are many.
- In the cloud: rather than running on a developer’s machine, the task runs in the cloud, where any developer can tinker with it as needed. It’s also dirt cheap (at least for the kind of thing I was using it for).
- Serverless deployment: you could run a shell script in the cloud, but you would need to run it in a container or VM. You don’t have to worry about that here. You just define the task and Azure handles running it.
- Integration options: you only have to define very simple behaviours where data flows through steps A, B, C, D. 99% of the work each step does is handled by one of the many built-in connectors.
- Built-in version tracking
- Built-in execution history (time, success or failure).
- Low barrier to entry: it’s all built with drag-and-drop and by filling in forms. Simple tasks will be much quicker to implement than in a shell script, unless you’re a serious bash wizard (which I’m not).
There are also disadvantages of course. As mentioned, it is not yet appropriate for any serious development of quality software by our team. Perhaps you could build quality software with Azure Logic Apps, but at the moment this is not a part of our approved technology stack, and for good reason. Without any formal testing or delivery process, it’s best not to have critical code relying on anything implemented in Logic Apps.
I would also note that using Azure Logic Apps ties us ever more deeply into Microsoft’s technology ecosystem. If we had a lot of complex stuff implemented with this, it would make it that much harder for us to move to AWS or IBM Cloud or even back to in-house hosting. As such, I feel this is a good tool for simple stuff only. Don’t use Logic Apps if it would be a big problem to reimplement the same functionality with another tool.