
Tuesday, August 20, 2024

Pain with Names (of computing resources)

 

A good naming convention for computing resources (APIs, functions, ...) is an important aspect of code usability. Ideally, names are short but descriptive, follow some kind of convention, and allow people who read or use the code or product to work intuitively and search the documentation easily. The Terraform (or OpenTofu) GCP (Google Cloud Platform) provider is an example of how small 'paper cuts' can make the experience painful.

Resources in GCP, at least some of them, can be global or regional. Each 'type' has its own API call (e.g. there is 'healthChecks' and 'regionHealthChecks'). Terraform follows the same convention. That's the first design decision to discuss. As an end user, I would prefer GCP, or at least Terraform as a higher-level language, to merge them into one resource type (e.g. healthCheck) with an option inside the resource to switch between them. However, Terraform covers a lot of providers, so I can guess (I haven't searched for the answer) that there is a policy to translate the underlying API in the most direct way.

The real pain is that there is no easy, general way to distinguish between regional and global resources. As mentioned above, the default health check is global and the regional one has the 'region' prefix, but for addresses the 'address' endpoint is regional and the global one has the prefix ('globalAddress'). Terraform follows suit. I don't know why the API introduces this chaos, but I would much appreciate it if Terraform introduced an order, for example: everything regional gets the 'region' prefix. If the authors were afraid of introducing problems by cross-naming API endpoints and Terraform resources (e.g. 'globalAddress' becoming 'address', and 'address' becoming 'regionAddress'), the solution could be to introduce a prefix for both types. Then we would have 'global_address' and 'region_address'.
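To make the inconsistency concrete, here is a minimal sketch of the four resource types as they exist in the Terraform google provider (the names and the region value are arbitrary examples):

resource "google_compute_address" "example" {
  name   = "example-address"             # the unprefixed type is the regional one
  region = "europe-west1"
}

resource "google_compute_global_address" "example" {
  name = "example-global-address"        # the global variant carries the 'global' prefix
}

resource "google_compute_health_check" "example" {
  name = "example-health-check"          # here the unprefixed type is the global one
  http_health_check {
    port = 80
  }
}

resource "google_compute_region_health_check" "example" {
  name   = "example-region-health-check" # and the regional variant carries the 'region' prefix
  region = "europe-west1"
  http_health_check {
    port = 80
  }
}

For addresses the bare name means regional; for health checks the bare name means global.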

Another problem is the use of prefixes rather than suffixes. As a user, I first look at creating a load balancer and only then decide whether I want to make it global or regional. The API and Terraform resources should make that easier for me by exposing the more important feature of the resource first, on the left, e.g. 'address_global' and 'address_region'. Maybe 'global_address' sounds better, but this is not poetry and users don't read the code aloud. Using a prefix also makes searching and reading the documentation harder. Methods and objects are usually listed in alphabetical order, and with suffixes similar resources end up next to each other: 'healthCheck' next to 'healthCheckRegion', etc. In the case of GCP, it would also make API usage a bit easier. The GCP API follows the camelCase naming convention, so we have 'address' but 'globalAddress' ('a' vs 'A' in 'address'). With a suffix, it would be 'address' and 'addressGlobal', and the 'a' in 'address' would always be lowercase.
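A quick shell illustration of the sorting argument, using a few real endpoint names in the first listing and hypothetical suffixed ones in the second:

> printf '%s\n' addresses backendServices globalAddresses healthChecks regionHealthChecks | sort
addresses
backendServices
globalAddresses
healthChecks
regionHealthChecks

> printf '%s\n' addresses addressesGlobal backendServices healthChecks healthChecksRegion | sort
addresses
addressesGlobal
backendServices
healthChecks
healthChecksRegion

In the first listing, 'addresses' and 'globalAddresses' are separated by unrelated endpoints; in the second, each pair sits together.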

So what went wrong for me with all of this? Recently, I was working on a scenario deploying a load balancer pointing at a test VM. It took me far too much time. The GCP documentation is rich, and quite a lot of APIs have Terraform resource examples, but not all of them. So I was patching them together, with extra steps translating Console instructions into Terraform using the GCP provider documentation. Some examples were for regional resources, others for global ones, and the provider docs do not link between them. It took me too long to figure out the right resource names. It went like this: "The error says that the regional resource cannot point at the global one. But why? I don't have any global resources. Oh, in this case, you have to add the word region." I fixed them one by one. The last error was the address. On the Terraform provider docs page, I put "compute_address" in the search bar, and I read the help page over and over, looking for how to force the resource to be global. Finally, not sure how, maybe by looking at one of the examples, I noticed that the address endpoint is regional by default, and I needed to add the 'global' prefix. Of course, I could have typed only 'address' into the search bar. Unfortunately, for many resource keywords a search returns so many results that they are not helpful, especially when you are starting the adventure with a new product.



Monday, April 22, 2024

Open geth (and other) binaries on macOS

Recently, I had a problem opening the geth binary (one of the Ethereum execution clients) on macOS.


 

After a short internet search, I found it was caused by an extended attributes check. A problem like this occurs when unsigned software is downloaded from the internet, but only when using a web browser. When the curl or wget commands are used, no extended attributes are assigned to the file.

To check if a file has extended attributes, you can run a simple `ls -l` command and look for the `@` character.

> ls -l
-rwxr-xr-x@ 1 wawrzek  staff  45986920 17 Apr 07:06 geth

The list of attributes can be obtained with the `xattr` command:

> xattr geth
com.apple.quarantine

The same command can be used to remove the attribute:

> xattr -d com.apple.quarantine geth

which enables the application to run.
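For a whole application bundle (a directory) rather than a single binary, the same command can be run recursively; a sketch, assuming the hypothetical bundle name Some.app:

> xattr -r -d com.apple.quarantine Some.app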

Tuesday, April 09, 2024

Logitech MX Keys S on macOS and Linux (Crux)

 Before I forget again.

- On macOS, the UK keyboard is recognised as the ISO (International) one. That may cause a problem with the location of the ['`','~'] key: it gets mixed up with the ['§','±'] one. I wrote 'may', because today I manually set it to the ANSI option, and that fixed the problem. But then, as a test, I switched back to ISO, and it still works fine. Maybe my problem came from connecting the keyboard through a USB switch, rather than directly.

- Officially, Logitech supports the Logi Options+ software on Windows and Mac, but on Linux we have Solaar. It works, and if you are a Crux user, I created a port for it.

Sunday, February 04, 2024

Open files from the command line (on Linux and macOS)

One of the nice features of macOS is the open command. It allows opening files directly from the command line without knowing which application is linked to the file type. For example:

 open interesting.pdf 

opens the interesting.pdf file using whatever program is assigned to open PDF files. (If you want more examples of the open command, you can check this link.)

For some time I had wondered about a Linux equivalent. Recently I decided to look for it more actively and to check if AI might help. It did, and pointed at the gio command from the GNOME Input/Output library. After adding one of the following blocks of code, an alias or a function, to .zshrc, I have a Linux equivalent.

- alias:

alias open="gio open"

- function:

open () {
    gio open "$1"
}

And finally, xdg-open is an alternative to "gio open".
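The two can be combined; a minimal sketch (zsh or bash) of a function that prefers gio and falls back to xdg-open:

open () {
    # use gio if it is installed, otherwise fall back to xdg-open;
    # both take a single file (or URL) argument
    if command -v gio > /dev/null; then
        gio open "$1"
    else
        xdg-open "$1"
    fi
}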

Saturday, December 30, 2023

Glow or Grip for MD files (from GitHub)

If you ever need to render an MD file locally, e.g. when reading some documentation, you can use the glow [1] program. It renders an MD file in the terminal.

In the case of a GitHub repository, an alternative is to use the grip [2] project. It sets up a local web server using the GitHub markdown API and produces a local view of MD files as they would appear on the GitHub website.
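Basic usage of both, assuming a README.md in the current directory (grip's default address is localhost:6419):

> glow README.md
> grip README.md

The first renders the file in the terminal; the second serves it, so you point a browser at http://localhost:6419.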

 

Links

  1. https://github.com/charmbracelet/glow
  2. https://github.com/joeyespo/grip

Sunday, November 12, 2023

Summary of a Terraform plan output

One of the most annoying things when working with Terraform is the size of the output of the terraform plan command. For more complex environments, it can easily reach many thousands of lines, even for what seems to be a small change. That makes it very hard to confirm that a code change does not have side effects.

It would be nice to have a summary option showing only the resources and modules changed. I guess such a feature will be added one day. In the meantime, I thought of using the grep command on the terraform plan output. It wasn't easy, because the output contains a few control characters. After quite a few attempts, I found that the following regex works as a substitute.

terraform plan | grep -E "^[[:cntrl:]][[:print:]]+[[:space:]]+#\ "
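The control characters come from the coloured output, so an alternative, if you don't mind losing the colours, is to disable them with the -no-color flag and use a simpler pattern (a sketch; the resource header lines in plan output start with '# '):

terraform plan -no-color | grep -E '^[[:space:]]+# '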

Wednesday, May 17, 2023

How to find an S3 bucket in multiple accounts (with awk and multiple field separators)

Imagine you have quite a few AWS accounts. In one of them, you don't know which one, there is an S3 bucket. The AWS CLI with awk and zsh can help to find it.

In the first step, let's prepare a list of all accounts, or rather profiles, from the AWS CLI config (the ~/.aws/config file).

accounts=($(awk -F "( |])" '/profile sso/ {print $2}'  ~/.aws/config))
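For reference, here is a hypothetical excerpt of ~/.aws/config that this command would parse (the profile names and account IDs are made up):

[profile sso-prod]
sso_account_id = 111111111111

[profile sso-dev]
sso_account_id = 222222222222

For a line like "[profile sso-prod]", awk prints the second field: "sso-prod".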

In the example, we limit the list to profiles with the prefix "sso". The command uses awk to find any line containing the string "profile sso" and to print the second field from it. However, it does not use the standard field separator: the -F option is given the regular expression "( |])", which makes either a space or "]" act as a separator (the "(", "|" and ")" are regex syntax, not separators themselves). Please also note that the awk command is wrapped in two pairs of parentheses: the inner "$(...)" captures the command output, and the outer "(...)" turns it into a zsh array.

The list is saved in the accounts variable and used in the second command, which lists all S3 buckets in each account and greps for the selected string, which of course can be the whole bucket name.

bucket=my-company-not-so-important-bucket
for account ($accounts) {echo $account; aws s3 ls --profile $account | grep $bucket}
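The "for ... ( ... ) { ... }" construct is zsh-specific short syntax; the same loop in the standard form reads:

for account in $accounts; do
    echo $account
    aws s3 ls --profile $account | grep $bucket
done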