Sunday, April 26, 2026

Raspberry Pi hardware-related problems and the settings adjustments that address them

Using a Raspberry Pi 4 as a desktop computer, I faced a few hardware-related issues. I diagnosed and fixed them with the help of AI assistants, which suggested specific commands and configurations.

Low Monitor Resolution

The first was the low resolution of the monitor. I have a BenQ PD2500Q with a native resolution of 2560x1440, but the RPi recognised it only as Full HD (1920x1080).

After some research, I found a suggestion to decode the EDID data that the kernel exposes under /sys:

sudo cat /sys/class/drm/card*/card*HDMI*/edid | edid-decode

The output revealed the issue: the monitor was advertising 2560×1440 as a DTD (Detailed Timing Descriptor) in Block 0, but the CTA-861 extension block (Block 1) only listed standard HD modes up to 1920×1080 as native. The Pi's Wayland compositor (the display server protocol used by Raspberry Pi OS) was prioritizing the CTA block and stopping at 1080p.

The DTD timing can be explicitly set in the /boot/firmware/config.txt:


[HDMI:0]
hdmi_group=2
hdmi_mode=87
hdmi_force_hotplug=1
hdmi_timings=2560 1 47 32 81 1440 1 3 5 33 0 0 0 60 0 241500000 6

However, after a restart, the resolution reverted to 1080p. Further investigation revealed that the window manager (labwc) also needed the custom mode to be set. I added the following line to ~/.config/labwc/autostart:

wlr-randr --output HDMI-A-1 --custom-mode 2560x1440@59.95 &
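
After logging back in, it is worth confirming that the compositor actually switched to the custom mode. Running wlr-randr with no arguments lists every output together with its available modes and marks the current one (the output name HDMI-A-1 matches the line above):

```shell
# List all outputs and their modes; the active mode is marked as current
wlr-randr
```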

Very Slow Mouse

My mouse was lagging significantly. I first tried adjusting the mouse speed in the desktop environment settings, but it didn’t help.

After some research, I found that the issue could be resolved by adjusting the usbhid.mousepoll parameter in /boot/firmware/cmdline.txt. This parameter controls how often the USB mouse is polled for input (in milliseconds). Lower values mean more frequent polling and smoother cursor movement, but setting it too low can cause system instability.

I ended up with the following setting:

usbhid.mousepoll=4
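
After a reboot, the active value can be read back from sysfs to confirm the kernel picked up the parameter (a value of 0 means the device's own default interval is used):

```shell
# Current USB HID mouse polling interval in milliseconds
cat /sys/module/usbhid/parameters/mousepoll
```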

Network Card Dropping Connection

The third problem was much easier to diagnose. I left the RPi running, but when I returned, the WiFi connection had dropped. The logs revealed repeated failures during the 4-way handshake authentication process, with NetworkManager unable to retrieve the stored WiFi password.

The root cause was that the connection details were saved into a user session keyring (e.g., GNOME Keyring) rather than the system keyring. A user keyring unlocks when you log in and stays unlocked only while your session is active; once the session ends, the keyring locks again and NetworkManager cannot retrieve the password.

The solution was to set the password storage to system-level keyring using the nmcli tool:

sudo nmcli connection modify "YOUR_network" wifi-sec.psk "your-password"
sudo nmcli connection modify "YOUR_network" connection.permissions ""

The first command sets the WiFi password for the specified network connection, while the second removes any permission restrictions, allowing all users to access the connection.

Note: For security, avoid hardcoding passwords in commands. Instead, use nmcli interactively or a secure password manager.
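
To double-check where the secret ended up, nmcli can print it back: if the command below (run as root) shows the password, it is stored by NetworkManager itself rather than in a user keyring. As above, YOUR_network is a placeholder for the connection name:

```shell
# -s shows secrets, -g extracts a single property value
sudo nmcli -s -g 802-11-wireless-security.psk connection show "YOUR_network"
```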


Saturday, April 18, 2026

Crux cnijfilter2 package (Canon cups drivers)

I no longer have a Canon printer, so I removed the cnijfilter2 package from my repo. To help any potential Canon printer user, the Pkgfile is below. There is also a patch required by the build process.

Pkgfile

# Description: Drivers for Canon printers
# URL: https://www.canon-europe.com/support/consumer_products/operating_system_information/#linux
# Maintainer: Wawrzek Niewodniczanski, main at wawrzek dot name
# Depends on: cups

name=cnijfilter2
version=6.80-1
release=1
source=(https://gdlp01.c-wss.com/gds/2/0100012302/02/$name-source-$version.tar.gz
	add-missing-import.patch
	)

dirs="cmdtocanonij2 cmdtocanonij3 cnijbe2 lgmon3 rastertocanonij tocanonij tocnpwg"
build() {
	patch -p0 -i add-missing-import.patch
	cd $name-source-$version
	sed -i '/po\/Makefile.in/d' lgmon3/configure.in
	sed -i /SUBDIRS/s/po// lgmon3/Makefile.am
	sed -i '/GET_PROTOCOL/s/^int /static int/' lgmon3/src/cnij{lgmon3,ifnet2}.c
	export LDFLAGS="-L../../com/libs_bin_x86_64"
	for dir in $dirs
	do
	        cd $dir
	        ./autogen.sh \
		                --prefix=/usr \
		                --enable-progpath=/usr/bin \
		                --datadir=/usr/share
	        make
	        make DESTDIR=$PKG install
	        cd ../
	done
	rm -rf $PKG/usr/share/locale
	cd com/libs_bin_x86_64/
	rm lib*.so
	install -c lib*.so* $PKG/usr/lib
	declare -a libs
	libs=$(ls -1 lib*.so.*)
	for baselib in $libs
	do
	        shortlib=$baselib
	        while extn=$(echo $shortlib | sed -n '/\.[0-9][0-9]*$/s/.*\(\.[0-9][0-9]*\)$/\1/p')
	        [ -n "$extn" ]
	        do
		                shortlib=$(basename $shortlib $extn)
		                ln -s $baselib $PKG/usr/lib/$shortlib
	        done
	done
	cd -
	mkdir -p $PKG/usr/lib/bjlib2
	install -c -m 644 com/ini/cnnet.ini $PKG/usr/lib/bjlib2
	mkdir -p $PKG/usr/share/ppd/$name
	install -c -m 644 ppd/*.ppd $PKG/usr/share/ppd/$name
}
Patch

--- cnijfilter2-source-6.80-1/lgmon3/src/keytext.c	2024-09-20 07:28:40.000000000 +0100
+++ cnijfilter2-source-6.80-1/lgmon3/src/keytext.c.fix	2025-06-11 23:21:28.361664234 +0100
@@ -37,6 +37,7 @@
 #include <unistd.h>
 #include <libxml/parser.h>  /* Ver.2.80 */
 #include <string.h>
+#include <stdlib.h>

 #include "keytext.h"

Monday, March 30, 2026

More jq from journalctl

I've found another use case for `jq` when parsing service logs stored with journald. This time, I want to extract all non-INFO level logs from the service called slinky. In the previous example, I used awk to print only the part of a line that is valid JSON. However, sed might be better suited for this task. The following rule removes (replaces with an empty string) the beginning of the line up to the colon followed by a space ": ", which separates the timestamp from the log entry (JSON):

's/^.\+\]:\ //'
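
A quick way to see the rule in action is to feed it a made-up journald-style line (the timestamp, host, and PID below are invented for the demo):

```shell
# Strip everything up to and including "]: ", leaving only the JSON payload
echo 'Apr 26 10:00:00 host slinky[123]: {"level":"warn","msg":"disk full"}' \
 | sed -e 's/^.\+\]:\ //'
# prints: {"level":"warn","msg":"disk full"}
```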

Examples

Below are two examples of such command pipelines.

The first one checks the logs from the last 2 hours:


journalctl --since "2 hours ago" -u slinky.service\
 | sed -e 's/^.\+\]:\ //'\
 | jq 'select(.level != "info") '

The second one continuously prints new entries:


journalctl -f -u slinky.service\
 | stdbuf -oL sed -e 's/^.\+\]:\ //'\
 | jq 'select(.level != "info") '

The journalctl command is nearly identical in both examples (--since vs. -f). The jq select statement and the sed string replacement are the same. The main difference is that the latter pipeline uses the stdbuf command, which runs the command that follows it with modified buffering. The -oL option flushes the standard output of the sed command line by line, enabling each entry to be passed immediately to jq.

Monday, March 02, 2026

Download GitHub Actions logs

I've been using GitHub CLI more and more lately. Recently, I had to debug a failing GitHub Actions run. Browsing long logs in the WebUI is a bit clunky, so I started downloading the full logs via the CLI.

It is a straightforward `gh` command if you already know the run number -- and that information can also be obtained from `gh`, with different options.

I ended up with the following two-part shell snippet:

VIEW=$(\
 gh run list \
 | grep $(git branch --show-current) \
 | head -1\
 | awk '{print $(NF-2)}') \
&& \
gh run view ${VIEW} --log > ~/Downloads/${VIEW}.log
 

The first command assigns the run number to the VIEW variable. It parses the output of the `gh run list` command by:

  • filtering run for the current git branch (`grep`)
  • taking the most recent one (`head`)
  • extracting the run number (third column from the end via `awk`).

The VIEW variable is then used to fetch the logs for the specific run and save them to a uniquely named file in the Downloads folder.

Assumptions:

  • the workflow run belongs to the current git branch
  • it's the latest run for the branch 
  • The Downloads folder doesn't already contain a file with the same number (it would be overwritten)
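
A possible alternative is to let `gh` do the filtering itself. Recent versions of `gh run list` support --branch, --limit, and --json/--jq flags, which would replace the grep/head/awk chain. A sketch, assuming a reasonably new gh:

```shell
# Sketch: pick the latest run for the current branch and grab its id
VIEW=$(gh run list \
  --branch "$(git branch --show-current)" \
  --limit 1 \
  --json databaseId \
  --jq '.[0].databaseId') \
&& gh run view "${VIEW}" --log > ~/Downloads/"${VIEW}".log
```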

Saturday, February 21, 2026

IvyNet DevOps Projects

Before IvyNet shut down, we decided to open-source the code. There are a few DevOps bits, which might be useful for others.

I believe documentation is important, and the README is a good starting point to see what's there. Good documentation doesn't have to be long or very detailed, so you'll find links to the actual code (e.g. Ansible, GitHub Actions, pre-commit, OpenTofu/Terraform, or Packer) plus some extra explanations.

Other things I'd like to point out:

The repository with OpenTofu/Terraform modules includes pre-commit and GitHub Actions (GHA). Each module has tests, documentation, and proper versioning. Some of the test scenarios are quite complex because they require extra components to run. https://github.com/ivy-net/otofu-modules 

The OpenTofu infrastructure definition is separated from the modules. This separation lets each environment be upgraded independently, so you can test things in staging or dev before touching production. https://github.com/ivy-net/infra

The applications are written in Rust, and the GitHub Actions and pre-commit setup could be a good starting point for Rust project automation. 

Tuesday, January 06, 2026

Hash of the latest git commit

From time to time I need not only to know, but also to paste the hash of the latest git commit somewhere else. I prepared a one-liner to get the hash in a format convenient to copy & paste:
git log -1 --pretty=oneline |\
 awk '{print $1}'
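
As an aside, git can print the same value directly with rev-parse, which avoids parsing log output. The snippet below demos it in a throwaway repository (the paths, identity, and commit message are made up):

```shell
# Create a throwaway repo with a single empty commit
repo=$(mktemp -d) && cd "$repo" && git init -q
git -c user.email=me@example.com -c user.name=me \
    commit -q --allow-empty -m 'demo'

# Full hash of the latest commit on the current branch
git rev-parse HEAD
# Abbreviated form
git rev-parse --short HEAD
```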

Wednesday, November 26, 2025

Check status of your GitHub PR from CLI

This GitHub CLI command (`gh`) lists each PR authored by you. The output contains the PR number, title and all reviews. They are printed as JSON and piped to `jq`. From the output, jq extracts the number and title of each PR and the state of each review, and interpolates them into a string.

gh pr list \
 --author @me \
 --json "number,title,reviews"\
 | jq \
 '.[] | "PR \(.number): \(.title) is \(.reviews[].state)"'
