Issue 068 – October 16, 2021

ParallelCluster & API opportunities

Hey there,

It’s Robin from CFD Engine & I wanted to share a recent update to AWS ParallelCluster – their tool for creating, managing & using clusters of instances. Not immediately exciting, but hear me out.

It’s not a core part of my CFD toolkit, but I pay attention to this kind of stuff & this release has one particular development that could have wide-reaching implications for cloud CFD. I thought you might want to know about it.

This is my hot take on AWS ParallelCluster v3 & how it might signpost where cloud CFD is heading.

Quick Recap

If you wanted to build a cluster of instances on AWS & have it behave like the compute clusters you’re used to, then ParallelCluster is the tool you’d use.

It falls under the Infrastructure as Code umbrella – a family of tools that let us define a computing environment as code & then stand it up (or knock it down) with ease, often with just a single shell command.

In this case the compute environment is an HPC-ready cluster of instances, including a head/management node, a job scheduler & (optionally) high-performance storage, fast interconnects & even a remote desktop.

That’s all straight off the bat – from there you can go on to customise it to include your favourite flavour of OpenFOAM & all the other tools you need.

When I take a moment to think about it, this is bonkers. Standing up a new compute cluster in five minutes, with one shell command 🤯 Every piece of this puzzle used to be a multi-person, multi-day job – bananas.
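For the curious, that single command looks something like this with the v3 CLI (the cluster name is yours to choose & cluster.yaml is a config file like the one sketched in the next section):

    # Stand up a cluster from a config file
    pcluster create-cluster --cluster-name my-cfd-cluster \
        --cluster-configuration cluster.yaml

    # ...and knock it down again when you're done
    pcluster delete-cluster --cluster-name my-cfd-cluster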

What’s New?

The release notes for the new version of ParallelCluster list a whole host of changes. Some are bigger than others, but most of them will only affect existing users.

There’s a new config file format, some new options, some deprecated options (including the Torque & SGE schedulers). There are new ways to build custom images, more granular security controls and lots, lots more.
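To give you a flavour of the new format, here's a minimal, hypothetical v3 config. It's YAML now – the instance types, subnet & key pair below are placeholders, & I'd double-check the details against the docs before trusting any of it:

    Region: eu-west-1
    Image:
      Os: alinux2                            # Amazon Linux 2 base image
    HeadNode:
      InstanceType: c5.xlarge                # the head/management node
      Networking:
        SubnetId: subnet-0123456789abcdef0   # placeholder
      Ssh:
        KeyName: my-keypair                  # placeholder
    Scheduling:
      Scheduler: slurm                       # Torque & SGE are no more
      SlurmQueues:
        - Name: cfd
          ComputeResources:
            - Name: compute
              InstanceType: c5n.18xlarge     # the compute nodes
              MinCount: 0                    # scale to zero when idle
              MaxCount: 16
          Networking:
            SubnetIds:
              - subnet-0123456789abcdef0     # placeholder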

But there’s one change that might be of interest even if you don’t currently use AWS & it might indicate a new direction for cloud CFD.

The main event

You can now create web endpoints for your ParallelClusters (thanks to some behind-the-scenes AWS magic) 🎉

That didn’t float your boat? OK, let me try harder…

Your HPC cluster can now respond to events that originate outside AWS, away from the command-line or the web console.

How about some examples? As I understand it, with the new endpoints…

  • You could create a new cluster (& schedule jobs) when new models are uploaded to Dropbox or checked into GitHub;
  • You could interact with your cluster from outside AWS – “Alexa, how are my jobs doing?” or via a Slack/Teams/WhatsApp bot (there’s a rough sketch of this kind of query below);
  • You could start/stop clusters on a given schedule using timed triggers, for overnight code testing perhaps.
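To make that a little more concrete, here's a rough Python sketch of asking a deployed ParallelCluster API how your clusters are doing, from outside AWS. I'm assuming the API has already been stood up from AWS's template, that the endpoint URL below (a placeholder) points at it & that requests need SigV4 signing – check the docs before leaning on any of the details:

    # Rough sketch: query a deployed ParallelCluster API from outside AWS.
    # Assumptions: the API is up & running, ENDPOINT is its API Gateway URL
    # (placeholder below) & requests are signed with SigV4 for the
    # "execute-api" service.
    import json

    import boto3
    import requests
    from botocore.auth import SigV4Auth
    from botocore.awsrequest import AWSRequest

    ENDPOINT = "https://abc123.execute-api.eu-west-1.amazonaws.com/prod"  # placeholder
    REGION = "eu-west-1"

    def list_clusters():
        """Ask the ParallelCluster API for a summary of our clusters."""
        url = f"{ENDPOINT}/v3/clusters?region={REGION}"
        # Sign the request with whatever AWS credentials boto3 can find
        aws_request = AWSRequest(method="GET", url=url)
        credentials = boto3.Session().get_credentials()
        SigV4Auth(credentials, "execute-api", REGION).add_auth(aws_request)
        response = requests.get(url, headers=dict(aws_request.headers))
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        print(json.dumps(list_clusters(), indent=2))

Wrap something like that in a Lambda behind a timed EventBridge rule & you've got the scheduled start/stop idea from the list above.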

The possibilities, while not endless, are certainly more numerous than they were before this release.

And I think this more closely resembles where cloud CFD is heading – API-driven, programmatic & scriptable. For example, SimScale have had an API into their world for a while, with clients already building their own custom cloud CFD workflows.

This offers similar opportunities, but outside the SimScale platform. It’s not for everyone, but it should make it easier to build interesting new cloud CFD offerings – I can’t wait to see what people come up with on the back of this new ParallelCluster API.

What do you reckon?

It’s interesting that AWS are actively developing this tool. They’re paying attention to running HPC in the public cloud (as opposed to leaving us to cobble things together for ourselves) and including features that make it “easy” to build on top of it.

I like this direction of travel & I’m keen to take the revamped ParallelCluster for a spin. The web endpoint looks a bit beyond me at the moment, but never say never.

What do you reckon? Does this pique your interest? Could this be a key piece of that new CFD service you’re itching to build? I’ll be keeping an eye out for your Alexa skills & Slack bots too 🤖

Until next week, stay safe

Robin