When you run temporary workloads you may not want to leave old machine instances lingering behind. Preemptible VM instances let you run workloads at a significantly discounted price, but they are shut down automatically after 24 hours, or earlier if Google needs the resources back. The instances still remain in a stopped state, though, and can be restarted later. What if you wanted to delete the machines automatically when you're done with them?
If you've read my Gentle Introduction to GitHub Actions you should already have a good understanding of the GitHub Actions basics. Creating a workflow is usually quite simple as long as you can find suitable action implementations for your steps. In some cases you may need to write your own actions, but you can actually get quite far with workflow commands.
When you need to return complex data from a function you typically think of two options:
- put the values in a dictionary
- create a new object/class
The first option is simple to implement but you need to access the individual values by their keys. The second option allows you to access data via attributes and do custom calculations behind the scenes, but then you need to implement yet another class.
Is there something in Python that could give us easy attribute access without having to bother with custom classes?
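One built-in middle ground, sketched here with made-up field names (the post may land on a different tool), is `types.SimpleNamespace`: it gives attribute access without defining a custom class.

```python
from types import SimpleNamespace

def get_stats():
    # Return related values as one object with attribute access,
    # without writing a dedicated class for them.
    return SimpleNamespace(count=3, total=60, mean=20.0)

stats = get_stats()
print(stats.mean)  # attribute access instead of stats["mean"]
```

You lose nothing from the dictionary approach here except the key-based lookup syntax, and you gain readable `stats.mean`-style access.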
With the ECMAScript modules definition we can use the import and export keywords to load and publish library code. However, if you try to use them in your Node.js project you might encounter the following error:
There are lots and lots of events that can be used to trigger GitHub Actions. But if you want to control your actions programmatically from the outside, you will need to use the repository dispatch event. In this post I will go through the essential things you need to know to trigger actions programmatically.
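As a rough sketch of the shape of it (the event name `trigger-build` is a placeholder): a workflow subscribes to a `repository_dispatch` event type, and an authenticated POST to the GitHub REST API fires it.

```yaml
# .github/workflows/dispatch.yml — listens for an external trigger
on:
  repository_dispatch:
    types: [trigger-build]   # placeholder event name

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - run: echo "Triggered by ${{ github.event.action }}"

# Fire it from the outside with a token that has repo access:
# curl -X POST \
#   -H "Authorization: token $GITHUB_TOKEN" \
#   -H "Accept: application/vnd.github+json" \
#   https://api.github.com/repos/OWNER/REPO/dispatches \
#   -d '{"event_type": "trigger-build"}'
```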
Would you like to be able to schedule the blog posts for your static site into the future but don't want to use up all of your free Netlify build quota? In this post we'll create a GitHub Actions workflow that triggers a site rebuild when a post has been scheduled to be published.
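As a minimal sketch of the idea (the secret name `NETLIFY_BUILD_HOOK` and the schedule are assumptions, not the post's final workflow): a scheduled workflow can POST to a Netlify build hook to kick off a rebuild.

```yaml
# Hypothetical sketch: rebuild the site once a day via a Netlify build hook
on:
  schedule:
    - cron: '0 6 * * *'   # every day at 06:00 UTC

jobs:
  rebuild:
    runs-on: ubuntu-latest
    steps:
      # NETLIFY_BUILD_HOOK is an assumed secret holding the build hook URL
      - run: curl -X POST -d '{}' "$NETLIFY_BUILD_HOOK"
        env:
          NETLIFY_BUILD_HOOK: ${{ secrets.NETLIFY_BUILD_HOOK }}
```

The real workflow would add a condition so the hook only fires when a scheduled post is actually due, saving build quota.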
GitHub Actions gives you lots of freedom to define custom workflows by combining different actions and running command line programs. Sometimes you might want to run small snippets of code, and that is already possible by running scripts from the command line with the
run keyword. What if you could write your Python script inside the workflow YAML file instead?
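That is in fact possible: a run step can pick its interpreter with the shell keyword, so a step can contain Python directly. A minimal sketch:

```yaml
jobs:
  greet:
    runs-on: ubuntu-latest
    steps:
      - name: Run inline Python
        shell: python
        run: |
          import platform
          print(f"Hello from Python {platform.python_version()}")
```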
When you define an object in Python you usually give it some attributes that hold the necessary pieces of information in a place that makes sense. However, Python does not limit the use of attributes to the set that was defined at object creation time.
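For example, a new attribute can be attached to an instance long after `__init__` has run (a minimal illustration; the class and attribute names are made up):

```python
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(1, 2)
p.label = "start"  # a new attribute, never declared in __init__
print(p.label)
```

This works because regular instances store their attributes in a per-object `__dict__`, which accepts new entries at any time.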
Can you chain comparison operations in Python? Yes you can: each comparison is evaluated pairwise, so you can chain together as many of them as you want.
It can be easy to forget basic features like this if you come from a language that doesn't support chained comparisons, or if you've never seen them used in the wild. I've been using Python professionally for years, but I have to admit I didn't really know about this feature until recently.
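For example, `a < b < c` is evaluated as `a < b and b < c`, with `b` evaluated only once:

```python
x = 5

# Equivalent to (1 <= x) and (x < 10), but x is evaluated only once
in_range = 1 <= x < 10
print(in_range)  # True

# Chains can be as long as you like
ascending = 0 < 1 < 2 < 3 < 4
print(ascending)  # True

# Different operators also chain pairwise
mixed = 3 == 3 < 5  # (3 == 3) and (3 < 5)
print(mixed)  # True
```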
When you work with other people, you may have unwritten rules that define how you work as a team. When you write those rules down, you get what are called team agreements. They help define how the everyday work is done in your team: the core hours when people should be available, how everyone gets a say in decisions, and so on.
This made me ask: Could I adapt the concept to my blog? Today I’m talking about an idea I came up with, called blog agreements.