Saturday, February 21, 2026

IvyNet DevOps Projects

Before IvyNet shut down, we decided to open-source the code. There are a few DevOps bits that might be useful to others.

I believe documentation is important, and the README is a good starting point to see what's there. Good documentation doesn't have to be long or very detailed: you'll find links to the actual code (e.g. Ansible, GitHub Actions, pre-commit, OpenTofu/Terraform, or Packer) plus some extra explanations.

Other things I'd like to point out:

The repository with OpenTofu/Terraform modules includes pre-commit and GitHub Actions (GHA). Each module has tests, documentation, and proper versioning; a sketch of running these checks locally follows this list. Some of the test scenarios are quite complex, because they require extra components to run. https://github.com/ivy-net/otofu-modules

The OpenTofu infrastructure definition is separated from the modules. This separation makes it easy to upgrade each environment independently, so you can test changes in staging or dev before touching production. https://github.com/ivy-net/infra

The applications are written in Rust, and the GitHub Actions and pre-commit setup could be a good starting point for Rust project automation.
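
For completeness, the checks each module goes through can also be run locally. A minimal sketch, assuming OpenTofu 1.6+ (which ships the `tofu test` command) and a made-up module name:

pre-commit run --all-files   # the same hygiene checks the GHA workflows run
cd modules/vpc               # 'vpc' is a hypothetical module name
tofu init -backend=false     # fetch providers without configuring a backend
tofu test                    # run the module's test scenarios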

Tuesday, January 06, 2026

Hash of the latest git commit

From time to time I need not only to know, but also to paste the hash of the latest git commit somewhere else. I prepared a one-liner to get the hash in a format good for copy & paste:
git log -1 --pretty=oneline |\
 awk '{print $1}'
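
If the goal is copy & paste, the hash can go straight to the clipboard (a sketch assuming macOS's `pbcopy`; on Linux, `xclip -selection clipboard` plays the same role):

git log -1 --pretty=oneline |\
 awk '{print $1}' |\
 pbcopy

For what it's worth, `git rev-parse HEAD` prints the same full hash directly.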

Wednesday, November 26, 2025

Check status of your GitHub PR from CLI

This GitHub CLI (`gh`) command lists each PR authored by you. The output contains the PR number, title, and all reviews. They are printed as JSON and piped to `jq`, which extracts the number, title, and state of each review and interpolates them into a string.

gh pr list \
 --author @me \
 --json "number,title,reviews" \
 | jq -r \
 '.[] | "PR \(.number): \(.title) is \(.reviews[].state)"'
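
Building on the same filter, a variant that lists only PRs with at least one approving review might look like this (a sketch; APPROVED is the state GitHub reports for an approving review):

gh pr list \
 --author @me \
 --json "number,title,reviews" \
 | jq -r \
 '.[] | select(any(.reviews[]?; .state == "APPROVED")) | "PR \(.number): \(.title)"'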

Monday, November 03, 2025

IP address one-liner

Command

The command below extracts the public IP address from the output of the UNIX `ip addr` command:

ip -4 -o addr | \
 awk -F "( |/)" '$0 !~ " (10|127|172|192)\\." {print $7}'

The `ip` command options make it print only IPv4 addresses (`-4`) and output each one on a single line (`-o`).

The `awk` program uses space and `/` as field separators (`-F "( |/)"`) and skips lines containing a space followed by 10, 127, 172, or 192 and a dot (`$0 !~ " (10|127|172|192)\\."`), a rough filter for the loopback and private ranges. For the remaining lines it prints the seventh field (`{print $7}`), which is the address itself.

Example of ip command output

To better understand the pattern used in the `awk` command, below is an example of the output of the `ip` command.

1: lo    inet 127.0.0.1/8 scope host lo\       valid_lft forever preferred_lft forever
2: enp2s0f0np0    inet 56.112.121.59/32 metric 100 scope global dynamic enp2s0f0np0\       valid_lft 64241sec preferred_lft 64241sec
4: docker0    inet 172.17.0.1/16 brd 172.17.255.255 scope global docker0\       valid_lft forever preferred_lft forever
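
To see why the address lands in the seventh field, feed the public-address sample line above through the same separator. Each extra space between the interface name and `inet` produces an empty field, which is how the count reaches seven:

echo "2: enp2s0f0np0    inet 56.112.121.59/32 metric 100" | \
 awk -F "( |/)" '{print $7}'

This prints 56.112.121.59.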

Wednesday, September 24, 2025

JSON in logs and how AWK can help

Checking systemd service logs emitted in JSON format might be a bit annoying. The output is far from readable, but AWK comes to the rescue. The key is to use the "{" character as a field separator and to reuse it in the printf command.

In my case, I had to check mev-boost, which logs in JSON format, and used the following command to make the output more readable.

journalctl -u mev-boost --since "2025-09-22 00:00:00" | \
 awk -F'{' \
 '$0 !~ /io.ReadCloser/ { printf("%s%s\n", FS, $2) }' | \
 jq .

The first line limits the journal output to the `mev-boost` unit from 22nd September 2025. The second and third lines direct AWK to split each line on the "{" character and to process only lines which do not contain the string "io.ReadCloser" (the lines with that string are not proper JSON). Next, the second field is printed, prefixed with the field separator (`FS`); without it, the output would not be valid JSON, because the opening "{" was consumed as the separator. Each line formatted this way is then sent to jq. In the example, jq just pretty-prints the output, but it can also be used to query it.
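
As a quick sanity check, here is the same transformation applied to a single made-up log line (the timestamp and JSON payload are hypothetical):

echo 'Sep 22 10:15:00 host mev-boost[42]: {"level":"info","msg":"hello"}' | \
 awk -F'{' '$0 !~ /io.ReadCloser/ { printf("%s%s\n", FS, $2) }' | \
 jq .

One caveat: because every "{" is a separator, a nested JSON object in the payload would be split across $2 and $3, so the trick assumes flat JSON messages.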

Wednesday, September 17, 2025

Crux Port Automation

Crux ports 

To simplify keeping my Crux port repository up to date, I set up GitHub Actions. There are two at the moment. The first one is a simple check of basic rules (trailing white spaces, no big files, etc.) implemented with pre-commit.

The second one provides the crucial automation. The script takes the repository code and downloads the cruxpy library (see below for details). Then it gets the Crux httpup-repgen script from a private S3 bucket (stored there to avoid 3rd-party dependencies). It uses a very simple Python script (portspage.py) to create a page for the ports, and then port-sync.sh to create the REPO file and upload the whole repository to the server. The SSH keys required to log in to the remote machine are stored in GitHub secrets.

CruxPy

Initially, my automation was based on the standard Crux scripts. I was copying the repository from my desktop, even though the source code was stored in GitHub. Then I introduced GitHub Actions. The automation kept working fine from the remote machine, but there was a small issue: the last-update date of each port in the web UI was populated with the timestamp of the last file update on the GHA runner. The dates were all the same, because the code was downloaded fresh every time. To fix it, I decided to prepare CruxPy, a simple Python module.

Initially, I thought of preparing an equivalent of the portspage.sh script, but with the ports' update dates taken from git file metadata. While drafting the solution, I decided that an object-oriented approach suits the problem well: a port is an object, and so is the repository website. In the future, the port class can also be used to prepare a repo class and to write CLI tools.

Having the basic classes in place, I thought it would be a great opportunity to prepare my first Crux package and add some automation, which deserves a separate article.

Thursday, August 28, 2025

Check the time of the current block in a Cosmos blockchain

echo BLOCK: $(curl -s 127.0.0.1:26657/status |\
 jq -r '.result.sync_info.latest_block_time');\
echo "NOW:   $(LC_TIME=C date -u +%Y-%m-%dT%X.%NZ)" 

Blockchain perspective 

If you deal with a Cosmos-based blockchain, this command (actually, these are two commands) helps to visualise the difference between the time of the latest block and the current time. That's helpful when a node needs to catch up (e.g. after restarting from a snapshot).

UNIX perspective 

The command has a few interesting "UNIX features". First, the curl to localhost is silent (the -s option) to avoid showing download stats. Its output is piped to jq, which prints the raw value (the -r option) of the latest block time (the .result.sync_info.latest_block_time field).

The second command returns the current time. LC_TIME=C sets the locale to a standard, non-localized format, ensuring the output always uses a 24-hour clock rather than a 12-hour AM/PM format, which can vary by location. The -u option forces the output to be in UTC rather than local time. Finally, +%Y-%m-%dT%X.%NZ formats the date identically to the Cosmos endpoint.
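
If you prefer a single number to eyeballing two timestamps, the lag can be computed directly. A minimal sketch, assuming GNU date (which can parse the ISO timestamp returned by the endpoint):

# difference in whole seconds between now and the latest block
BLOCK=$(curl -s 127.0.0.1:26657/status |\
 jq -r '.result.sync_info.latest_block_time')
echo "LAG: $(( $(date -u +%s) - $(date -u -d "$BLOCK" +%s) ))s"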