In Praise of Tinkering

If you're reading this, it's a good bet that you're a tinkerer like me. I love tweaking and tailoring just about everything I can, to the point that it's probably confusing that I'm not a Plasma user. To which I say: I believe that form follows function, not that function negates the need for form completely (you can send your complaints to $NAME_OF_OTHER_PODCAST_HOST). And though it's a perfectly defensible position, epistemically speaking, I don't believe that everyone is a bot except me; instead, I tend to believe that my fellow developers like to tinker as well, to the point that I'm always rather surprised, and a little hurt, when I hear others say that they'd rather stick to the beaten path; that they don't want to customize their operating systems more than they absolutely have to. Don't get me wrong, I'm not saying you're a bad person if you've never felt the desire to make your windows wobbly (@Plasma users); I'm with you there, but I'm not going to subject myself to using bash as my default shell just because I have to SSH into servers at least weekly and worry about being slightly uncomfortable in that context. That's a fraction of my time, and I'm significantly more productive the rest of the time using Fish. Not only that, but my comfort with Fish was what encouraged me to dive deeper into shell scripting, something I had previously avoided to a fault, writing ridiculous Python scripts that could have been implemented trivially with a handful of pipes in a one-liner.

This is a common manifestation of the most frequently cited reason I see given against tinkering: the fear of losing the fruits of one's labor, be it because one is forced to use a stock system, as in the case of the aforementioned SSH session, or through data loss, such as after setting up a new OS. To the former I say: don't copy-paste configs if you don't know what they do. Any tinkering you do ought to be done reversibly, i.e., only do it if you at least generally understand what you've done and how your modified configuration differs from the original, and have some kind of version control in place (more on that below). Don't remap ":" to ";" if you're worried that you won't be able to remember how to use vanilla vi.

To the people worried about losing their configurations: this is a little more understandable, as it requires some actual effort to rectify, but not much. There are myriad solutions available for backing up and deploying your configurations (also known as "dotfiles" because they're frequently stored in hidden files and directories), such as GNU Stow or (my personal favorite) Dotdrop. You can also write installation scripts for the parts that can't be handled by symlinking dotfiles from a git repo. I have a small collection for mine, and GitHub has an entire page dedicated to documenting some of the most masterful dotfile setups. Even if you aren't a DevOps engineer, maintaining a git repo and drafting a few shell scripts isn't bad practice for exactly the kinds of hard skills that every developer needs.
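To sketch the idea (the repo URL and package names here are hypothetical, and Dotdrop works on a similar principle with a YAML config in place of raw symlinks), a Stow-based setup is little more than a git repo of per-program directories that mirror your home directory and get symlinked into place:

$
git clone https://github.com/you/dotfiles ~/dotfiles
cd ~/dotfiles
stow --target="$HOME" fish git nvim

Each directory (fish, git, nvim) contains files laid out exactly as they should appear under your home directory, so on a fresh machine, cloning the repo and re-running stow is all it takes to put everything back.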

Finally, I'd like to address the idea of our machines being mere tools, disposable if anything goes wrong. I completely agree with the latter half of this sentiment. I don't know if I should be admitting this, but I tend to hose my machine at least once or twice a year. I do something so incredibly stupid that I either have to completely reinstall the OS, or worse, I've accidentally deleted it while trying to reformat another partition or something. It used to be a horrific event, with repercussions rippling outward days into the future as I slaved over restoring every bit back to its proper place, clicking checkboxes and re-downloading installers. Now, everything I need is backed up onto multiple media in multiple physical locations and my current configuration is a shell script away. I could throw my computer into a lake and be able to pretend like nothing ever happened within a couple of hours. My computer has been effectively disposable for the past year and a half now, and tinkering with configuration management is the reason why.

To the people who want to see their machines as mere tools, I say they already are. They always have been! They're functionally equivalent to a hammer in the grand scheme of things, the only difference being that instead of providing mechanical leverage, theirs is computational. As an implementation, they're actually worse than their physical counterparts. If you've ever handled old tools, you'll have noticed how they wear. Their handles erode in such a way that your hand fits them perfectly. They've been sculpted over years of use. Computers, namely operating systems, don't exhibit such a property. Without the effort, their sharp edges will remain as sharp as the day they were compiled, until the bit rot finally breaks them. They're almost brittle in this regard. We can smooth those edges, though, with a little effort and a bit of tinkering.

Since we're all stuck inside for an indeterminate length of time into the future, use some of it to sand some sharp edges. Try a new shell! Make some aliases for frequently-used commands! Write a program that nags you when you inevitably don't use them for the first few weeks! Ruminate about why you didn't do this sooner!

But above all, stay safe. We'll see you on the other side.

#andràtuttobene 🇮🇹

❋❋❋

New Year, New Password Manager

The walled garden of the Apple ecosystem is certainly a cozy one, but good luck if you ever want to venture out into other realms, such as Linux. For some use cases, iCloud is not only perfectly fine, but even preferable. I don't foresee myself needing access to my Shortcuts on any non-iOS devices, so cordoning them off onto their own platform saves some hassle. My documents, on the other hand? That's another story. Likewise, what about photos and, probably most importantly, passwords? To put it succinctly, I want a solution that lets me have access to exactly what I need on each device/OS that I use, and nothing more. 2020 has brought some new themes for me, one of which is action. I've decided that the first step I'm going to take towards achieving ~~world domination~~ my goal is tackling the most important domain: passwords.

As aforementioned, things are pretty great within the confines of the walled garden, though by no means is it a panacea. The Keychain app has some pretty gaping holes with regard to its capabilities on each of the various platforms. On macOS, it's great! There's a good concept of the different kinds of information that can be securely stored within it: login passwords, keys, notes, etc. Sadly, this notion does not translate over to iOS, and that's a problem. Reinstalling macOS while the encryption key for your backup is stored in a secure note? You'd better have written it down ahead of time or have another Mac around. May the gods have pity on you if you need to access any passwords on a device that isn't logged into your iCloud account, or *gasp* isn't running an Apple OS.

There are myriad third-party password managers. I prefer open source solutions where possible, so I first tried Bitwarden. It's... okay. It ticks all the big boxes that Apple's first-party solutions do, with the added benefits of having a web app (!) and being much more cross-platform. I understand why all of the clients are Electron/React-based, but I much prefer native apps. Since it's open source, and there were some existing attempts at native clients, I thought I'd try to revive one. In doing so, I saw why it had stagnated: there's little to no documentation, either outside of or within the source.

Enter: KeePass

KeePass is more of a format than a platform, and this is where it shines! There are native clients for just about every OS (at least every one that I use), and syncing is as simple as passing a single file, the database, around through whatever means you please (in my case, Nextcloud). Clients can take care of merging or reverting changes as needed.

Hello ~~darkness~~ AppleScript, my old friend...

Migrating between proprietary applications is never without hiccups, and Apple usually does a better job than most at enabling interoperability *cough* Microsoft *cough*. Keychain.app, though, is the absolute antithesis of this. I understand that a password manager should probably have additional safeguards in place around data export, but crippling that functionality is not a safeguard. Login passwords, such as internet passwords, cannot be exported. The category of secure items that I most likely need available to me on other platforms is the hardest thing to extract. With almost 250 login items in Keychain, I was not about to type my password in for each. Thankfully, there's an App[leScript] for That™. I had to modify it to work on 10.14.6+, and if you're reading this more than six months from the date of publication, I'm sorry, but you'll probably have to modify it further to get it working yourself. Here, in all of its horrendous glory, is a script to copy a given number of items from Keychain.app to a CSV file.

set x to 1
tell application "Keychain Access"
    activate
    repeat while (x ≤ NUMBER_OF_ITEMS_TO_EXPORT)
        tell application "System Events"
            -- Click into the item list and open the selected item's window
            click text field 1 of row 1 of outline 1 of scroll area 1 of splitter group 1 of window "Keychain Access" of application process "Keychain Access"
            keystroke return
            delay 0.2
            -- Tab over to the item's name field and append it to the CSV (column 1)
            keystroke tab
            delay 0.2
            keystroke tab
            delay 0.2
            keystroke tab
            delay 0.2
            keystroke "c" using command down
            delay 0.2
            do shell script "/usr/bin/pbpaste | xargs -I {} echo -n {}\",\" >> ~/exports.csv"
            delay 0.2
            -- Back up one field and append the account name (column 2)
            keystroke tab using shift down
            delay 0.2
            keystroke "c" using command down
            delay 0.2
            do shell script "/usr/bin/pbpaste | xargs -I {} echo -n {}\",\" >> ~/exports.csv"
            delay 0.2
            -- Reveal the password, authenticating with the keychain password
            click checkbox "Show password:" of tab group 1 of window 1 of application process "Keychain Access"
            delay 0.2
            keystroke KEYCHAIN_PASSWORD
            delay 0.2
            keystroke return
            delay 0.2
            -- Tab back to the now-visible password field and append it, ending the row (column 3)
            keystroke tab using shift down
            delay 0.2
            keystroke tab using shift down
            delay 0.2
            keystroke tab using shift down
            delay 0.2
            keystroke "c" using command down
            delay 0.2
            do shell script "/usr/bin/pbpaste | xargs -I {} echo {}\",\" >> ~/exports.csv"
            delay 0.2
            -- Dismiss the item's window and arrow down to the next item in the list
            click UI element 2 of window 1 of application process "Keychain Access"
            delay 0.2
            key code 125 -- down arrow
            delay 0.2
        end tell
        set x to (x + 1)
    end repeat
end tell

You'll want to substitute in the number of items as well as your keychain password, and since it relies on UI scripting, you'll likely need to grant whatever runs it (Script Editor, say) control of your computer under the Accessibility privacy settings. Beware that it obviously exports everything as plaintext (in plain text), so don't save the output to your Documents or Desktop folders if those are synced via iCloud. Additionally, you can sort items by "Kind" if you're only interested in exporting certain types, such as login items. Then sit back, relax, maybe prepare a beverage, because it might take a while (about half an hour, in my case).
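For reference, given the flags used in the conversion step below, the first column doubles as both the title and the URL, the second is the username, and the third is the password, so a (completely made-up) exported row looks something like this:

github.com,my_username,correct-horse-battery-staple,

The trailing comma just leaves an empty fourth column, which the converter shouldn't care about given the explicit column indices.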

Part the Second: Converting to KeePass

KeePass has a diverse ecosystem of scripts and utilities for importing a variety of data formats through a client. For our purposes, there's csv2keepassxml, an incredibly flexible Ruby utility that converts data from, you guessed it, CSV into an importable XML file. If, unlike me, you're a Ruby expert, or you don't care about littering your system with packages you'll install once and probably never use again, feel free to skip the next two sections.

rbenv

Because I wasn't developing web apps in 2008, I'm not a big fan of Ruby, but I am a fan of using virtual environments, especially for languages that ship as part of the system on macOS (for now), such as Python and, you guessed it, Ruby. If you don't care about such things, feel free to skip this section. For everyone else, there's rbenv. Much like its Python counterpart, pyenv, installation is simple and can be accomplished via brew.

$
brew install rbenv

Follow the instructions given by the init command to set up your shell.

$
rbenv init

Finally, source your config or restart your terminal to make the changes take effect.
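For reference, on bash or zsh this boils down to adding something like the following line to your shell's rc file (rbenv init prints the exact incantation for whichever shell you're using, Fish included):

eval "$(rbenv init -)"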

Usage is equally simple. To install a far more up-to-date version of Ruby than what macOS may (or, in the future, may not) even ship with, simply use the install command with the version you're after (rbenv install -l lists what's available).

$
rbenv install <version>
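For example, installing a Ruby and pinning it for the current directory looks something like this (the version number is purely illustrative; check the list for whatever is current when you read this):

$
rbenv install 2.7.1
rbenv local 2.7.1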

rbenv-gemset

Virtual environments aren't terribly useful without dependency management, and for rbenv, that's where rbenv-gemset comes in. A plugin for rbenv, it allows dependencies to be managed per project and even to reside within the project directory.

Installation is as easy as rbenv's, as it's also available via brew.

$
brew install rbenv-gemset

In the root of the project, create a gemset using the current version of Ruby.

$
rbenv gemset init

You can now use gem as you would normally, with the gems being installed to a directory in the root of your project.
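If you want to sanity-check that the gemset is being picked up, asking RubyGems where it installs things should point somewhere inside the project rather than into the Ruby version's global gem directory:

$
gem env home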

csv2keepassxml

If you're joining us from the beginning of this part, welcome back! csv2keepassxml is not available via brew, so let's clone the repository from GitHub.

$
git clone https://github.com/lifepillar/csv2keepassxml

Create a gemset, and install csv2keepassxml's dependencies into it.

$
cd csv2keepassxml
rbenv gemset init
gem install htmlentities

If you used the format of the aforementioned AppleScript, the following will create a KeePass 2 XML file from the exported CSV, ready to be imported by a client.

$
./csv2keepassxml -t 1 -U 1 -u 2 -p 3 ~/exports.csv

Note: Indices are 1-based. Unfortunately.

Clients, Clients Everywhere!

There are a plethora of KeePass clients out there. Here are the ones that I use, and for the most part recommend. For an added bonus, they're all open source!

Conclusion

This is the first step in my journey towards building a private infrastructure for my personal data. Over the upcoming months, I'll be looking into solutions for file storage and syncing, and then eventually services such as email, contacts, and calendars.

❋❋❋

LightDM with an External Display and the Art of Pseudocode

Today I found the solution to an incredibly first-world problem that's been plaguing me for the majority of the months I've been using Arch (I use Arch, btw) full-time: logging into a session when my laptop is connected to an external display and in *gasp!* clamshell mode. Signal-boosting a post that solved a problem I'd been procrastinating on for lo these many weeks probably shouldn't warrant a whole blog post, but I wanted to praise the author for walking the reader through a step that we all probably do, or at least probably should do more deliberately: pseudocoding.

Much like the process of writing my first paper in college, where I started drafting an outline without even thinking about it (a credit to the years of high school English teachers who made me do it, even though the topics of many of the papers I wrote in those classes weren't so complex that I couldn't adequately hold it all in memory), there are times when we find ourselves with a problem sufficiently complex that we need to utilize swap--I mean scratch paper. We start making (quite literally, if you're me) back-of-the-napkin diagrams and rough pseudocode with horrendously-named functions that we swear we'll fix when it comes time to implement. If the code you ultimately write never leaves the confines of your hard drive, it's fine if the chicken scratch you've scribbled to yourself never makes it off the page, but if you share that solution with anyone, and especially if you share it in the form of a blog post, it's really to everyone's benefit (including your own) that you share the pseudocode as well.

For those consuming your code, it greatly improves the likelihood that they can conceptually understand and appreciate what you've made, even if they can't understand the implementation very well (for example, if it's in an unfamiliar or obtuse language, like bash). It benefits the author as well, as it gives them an opportunity to think through the problem more generally, and perhaps even explore multiple solutions without wasting keystrokes implementing each one. I particularly like this problem of telling LightDM which display to use as an example of the merits of explaining through pseudocode, as the problem itself can be presented rather trivially; maybe so much so that one would just start hacking away at it inside of a text editor without giving it much thought. Upon closer inspection, though, the problem is revealed to be a little more complex, with multiple sub-problems branching out like the length of a coastline, growing with each step in magnification.
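To make that concrete, here's roughly the shape the pseudocode takes once you start pulling on those threads. This is a hedged sketch of my own, not the linked post's actual solution: the output names (eDP-1, HDMI-1) and the script path are placeholders, and it assumes the common approach of pointing LightDM's display-setup-script option at an xrandr wrapper.

#!/bin/sh
# Hypothetical /etc/lightdm/display-setup.sh, referenced from lightdm.conf:
#   [Seat:*]
#   display-setup-script=/etc/lightdm/display-setup.sh
#
# Sub-problems hiding inside the "simple" problem:
#   1. Which outputs are actually connected right now?
#   2. Is the lid closed, i.e. are we in clamshell mode?
#   3. Which display should the greeter land on, and what happens to the other one?

INTERNAL="eDP-1"  # the laptop panel; the name varies by machine
EXTERNAL="$(xrandr | awk '/ connected/ && $1 != "'"$INTERNAL"'" {print $1; exit}')"

if [ -n "$EXTERNAL" ] && grep -q closed /proc/acpi/button/lid/*/state 2>/dev/null; then
    # Clamshell mode: send the greeter to the external display and turn the panel off
    xrandr --output "$EXTERNAL" --primary --auto --output "$INTERNAL" --off
else
    # Otherwise, make sure the laptop panel is on and primary
    xrandr --output "$INTERNAL" --primary --auto
fi

Even at this level of detail, half the decisions (what counts as "connected", how to detect the lid, what to do when both displays are present and the lid is open) are already visible before a single line has been committed to a real script.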