Categories
linux

Cleaning $HOME: XDG Base Directory

The freedesktop.org XDG Base Directory Specification is the main document at hand here.

Every now and again the folks who work on free software decide to rearrange a specified or de facto standard directory/file structure with an eye toward improving the state of the platform. XDG-basedir is one such attempt, but there have been others, like the advent of /run.

The XDG-basedir specification stipulates three main per-user locations:

  • $XDG_CONFIG_HOME, defaulting to $HOME/.config
  • $XDG_DATA_HOME, defaulting to $HOME/.local/share
  • $XDG_CACHE_HOME, defaulting to $HOME/.cache

These are meant for your configuration files, data, and caches. These locations provide some flexibility and improved organization over throwing everything into your $HOME. For example, your caches may be kept in a virtual path that points to RAM or on an SSD that is faster than your regular storage. Or your configurations may live on some networked disk.
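Applications that honor the specification fall back to those defaults whenever the variables are unset or empty. In shell terms, the lookup amounts to something like this (variable names on the left are my own):

```shell
# Resolve the XDG base directories, falling back to the defaults
# that apply when the variables are unset or empty.
config_home="${XDG_CONFIG_HOME:-$HOME/.config}"
data_home="${XDG_DATA_HOME:-$HOME/.local/share}"
cache_home="${XDG_CACHE_HOME:-$HOME/.cache}"

echo "config: $config_home"
echo "data:   $data_home"
echo "cache:  $cache_home"
```

The `${VAR:-default}` form covers both the unset and the empty cases, which matches the specification's wording.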

Lots of applications support this specification in some way, though some more than others. Taking advantage of it requires first looking to see what dot folders and dot files (i.e., those named like .foo) exist in your $HOME. While you’re at it, you may look in the above-mentioned folders to see what applications have already set up shop in their new locations.

Some applications will have moved their baggage themselves. Other applications implement the standard to some degree but give precedence to the old location, the old dot files. They do this out of pragmatism: they don’t have to write a migration scheme, and they don’t confuse their long-term users, who may not know or want to know about these new locations.

For these applications, moving the files yourself (or if the data/configuration/cache does not matter, simply deleting the existing file(s)) works fine.

For others, those that do not support the specification, you can often still move them by defining their environment to point to the proper location(s).

An example of the latter is Mercurial, the distributed source control system. It does not support XDG-basedir, but it does support the environment variable HGRCPATH. By setting it properly, export HGRCPATH="${XDG_CONFIG_HOME}/hg/hgrc" (or the full path if you do not have XDG_CONFIG_HOME defined), you get the benefit of the specification without Mercurial actually supporting it.

In cleaning my own $HOME I found this sort of fix was useful for about six applications including vim and gnupg.
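As an illustration of what those redirections can look like, here is a sketch. GNUPGHOME and VIMINIT are variables those programs do honor, and HGRCPATH is Mercurial’s; the particular paths, and putting gnupg under the data directory, are my own choices rather than anything the spec dictates:

```shell
# Pin the XDG locations explicitly so the exports below are unambiguous.
export XDG_CONFIG_HOME="$HOME/.config"
export XDG_DATA_HOME="$HOME/.local/share"

export HGRCPATH="$XDG_CONFIG_HOME/hg/hgrc"          # Mercurial config
export GNUPGHOME="$XDG_DATA_HOME/gnupg"             # GnuPG home directory
export VIMINIT="source $XDG_CONFIG_HOME/vim/vimrc"  # vim runs this instead of ~/.vimrc
```

Check each program’s documentation for the variable it actually reads; the pattern is the same in every case.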

Still other applications do not support an environment variable to specify where their files live. Some of these will accept configuration file locations from the command line. For these, setting shell aliases or functions may help. I found this useful for only one or two applications.
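A sketch of that approach, using wget as the example: its --hsts-file option (a real flag in the versions I have seen) is the only way to relocate its HSTS database, so a small wrapper function passes it on every invocation. Substitute whatever flag your application documents:

```shell
# Shadow the wget command with a function that relocates its HSTS
# database out of $HOME; "command" calls the real binary underneath.
wget() {
    command wget --hsts-file="${XDG_CACHE_HOME:-$HOME/.cache}/wget-hsts" "$@"
}
```

An alias works the same way for interactive shells; a function also covers scripts that source your shell setup.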

Some applications have support for it in versions newer than I have installed. I let these wait, glad to know of the effort.

And quite a few applications have open (or closed) bugs for supporting XDG-basedir. In most cases this is less about the technical work for the specification and more about deciding if and how to support it. At least a handful of the applications I looked at were reluctant to support it at all. Others said support would be welcomed, given enough background to show what would and would not break, and with a patch available.

But several argued about how far they would support it. This mostly came down to applications willing to move their lot into $HOME/.config (or wherever $XDG_CONFIG_HOME points), but not to split out cache and/or data. This was most often argued against for portability reasons (i.e., they support Microsoft Windows operating systems and want to allow their users to move their files between systems, or keep them on one common drive, without issue).

And in rare cases files are hard to nail down. They sort of configure, but they’re sort of data. Or they’re sort of data, but they’re sort of cache. (Hopefully never all three in one.)

On the whole I was able to reduce my number of dot files by about 15, and the number of dot directories by about 60.

At least half of the files and folders I lost were obsolete entries from applications I no longer use. Another chunk came from applications that wrote their configurations even though I used all default options. The rest were from applications that now support XDG-basedir or had other acceptable workarounds for moving their files.

I find this an interesting topic to look at given that the change to applications is not overly complex, but it affects a wide variety of applications. I also like having a cleaner $HOME as it makes finding things easier.

Maybe my next step will be to audit my XDG directories as I’m sure they contain some files from applications I no longer use.

Categories
linux

Introduction to Linux

One of the things that holds back Linux adoption is that the average user doesn’t understand how it works compared to Microsoft Windows or Apple Macintosh OS X.  In some ways Apple Macintosh has reused some of the best ideas of free operating systems (e.g., package management).  Note that I’ve not actually touched a modern Microsoft Windows installation, so my current knowledge is based on the little I hear through the various grapevines of the Internet Vineyard.

On Microsoft Windows, most programs come with their own installer.  That installer is meant to look at your system and decide where everything goes and how to configure it.  It still relies upon certain services the OS provides, like the Registry for storing settings or for discovering existing configurations.  A more Linux-like installation mechanism exists, but most programs still ship as regular executable installers.

On Apple Macintosh OS X, programs use a more Linux-like installation method.  The main difference is that they are typically acquired like Windows programs, with the only repository available to the package manager being the official Apple repository.

On most Linux distributions, such as Debian and Fedora, the vast majority of packages are installed through a package manager.  A package manager flips the control over installation a bit.  Each package has certain rules associated with its installation, but installation begins by running the package manager, which is responsible for executing those rules.

An example will be useful. I’ll use Firefox since it’s widespread on all three platforms.

On Windows, you download the Firefox installer executable.  When you run it, it checks the system out and copies its files into the installation folder (usually somewhere like C:\Program Files).

On OS X, you download a disk image file.  This contains the .app structure for installation, along with some metadata.

On Linux, you might download a Debian archive.  This also contains the program’s files in an existing structure and some metadata.

The main difference is that Linux packages typically declare their dependencies: the other packages required for them to work.  On Windows, most dependencies are bundled or mentioned in documentation, with no formal facility to account for them.  On OS X you get that too, but developers also tend to rely on the facilities provided by the OS itself rather than requiring anything separate.  On Linux, the expectation is that the computer should figure out what to install and in what order (though ultimately humans still make those calls for now).
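To make that declaration concrete: a Debian package carries a control file naming the packages it needs.  A trimmed sketch follows; the field names are real, but the package name and version constraints here are invented for illustration:

```
Package: someapp
Depends: libc6 (>= 2.17), libgtk-3-0 (>= 3.10)
Recommends: someapp-doc
```

The package manager reads these fields and arranges for the listed packages to be installed first.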

You can go download the archived build of Firefox for Linux from the official site.  Most people don’t do this, and Mozilla doesn’t provide distribution packages directly.  They know that the distributions repackage it, in some cases splitting up the components into separate libraries so that other applications can use them.  The GNOME 3 desktop, GNOME Shell, uses the JavaScript engine that Firefox uses.  The Mozilla e-mail client Thunderbird typically shares certain dependencies with Firefox, meaning they are installed only once.  On Windows, the same code might be installed separately for both programs.

Okay, back to the beaten path.  The takeaway for a Windows user is that most of the software you will install on a Linux system comes from the distribution and is managed by it.  If you want to edit some images, you might install GIMP, which means you ask the package manager to install it for you, rather than downloading it from the GIMP website.
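On a Debian-based system, for example, that request is a one-liner; Fedora’s dnf has an equivalent:

```shell
# Ask the distribution's package manager for GIMP; it resolves and
# installs any needed dependencies as well.  (Requires root.)
sudo apt install gimp
```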

If you want to modify GIMP, you can go the traditional Windows route of downloading the source or checking it out from the repository, but you can also tell the package manager you want the source.  This lets you take advantage of the work of the package managers to make the package buildable and packageable.  A few simple commands and you’ve got the source ready to build (or modify first, then build).
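On Debian-based systems, that request looks something like this (the build step is one common route; packages may document others):

```shell
apt-get source gimp          # unpack the packaged source into ./gimp-*
sudo apt-get build-dep gimp  # install everything needed to build it
cd gimp-*
dpkg-buildpackage -us -uc    # rebuild unsigned binary packages locally
```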

That’s an important fact about the Linux universe: the ability to easily rebuild the software is essential to its health.  It means that a bug can be fixed without too much hassle (other than the actual debugging) by a user.  When you have a platform of tens of thousands of pieces of software that need to work together, someone who’s only casually using one piece doesn’t want to learn the world to fix it.

I would be interested to know how much non-OS software the average person installs on their computer based on platform.  My guess is that it’s not that much.  And my guess is that most of the third-party software that goes on OS X and Windows has free alternatives that are in most Linux distribution repositories.

Improving the rough edges, making the concepts accessible, and giving people confidence in support channels are the main challenges to adoption.