Personal Linux Setup with Git Repos and Stow
Created: 2022-02-03 | Updated: 2023-02-25 (Added scripts, more about living in pseudoterminals, link to slackware setup)
I had a dream
- A low power, always-on computer I could SSH into from any other computer in the house.
- All of my projects and data in Git repos available for cloning and updating from any computer in the house.
- My personal Linux/UNIX configuration ("dotfiles") available to any computer in the house for instant and granular installation.
- No dependencies on any computer outside my home network.
Time for a new setup!
Welcome to "setup2". Don’t bother looking for a "setup1" page. There isn’t one. But there was a previous setup: a Syncthing-managed ~/sync/ dir that lived in my home directory on all of the POSIX-compatible machines I use semi-regularly around the house.
One machine to rule them all

The cornerstone of the new setup is a fanless mini PC running a low-power Celeron (N3160) processor with 4GB RAM. It is always on and uses about 5W just sitting there (less than a household LED light bulb!), which I can live with. Main storage is the 60GB NVMe SSD that came with the PC. (I also have an external USB SSD plugged in for bulk storage, but only use that occasionally.)
It’s headless and lives in the basement, though I can plug a keyboard and monitor into it if I have to for some reason.
Rather than keep synced "perfect" setups across multiple machines, the idea is that I now have just one perfect setup on this one machine. To go to my "true" home (to edit my wiki, etc.), I SSH into this machine. (And a web server can be an amazing thing to have on your local network if you are creative with it.)
It is called Phobos because it sounds cool and gives me happy DOOM-related memories.
Bare Git Repos
Okay, so I’ll SSH into Phobos as much as possible to keep a simple, secure "home" for myself. But I’m still going to want a consistent environment (such as settings for Vim and Bash) and a common personal ~/bin/ directory for my most used scripts.
I could continue to use Syncthing for this. But here’s my issue:
- Syncthing is wonderful software, but it’s "magic", which has positive and negative connotations.
- I want to keep my "dotfiles" and "bin" scripts in a distributed version control system.
I had my dotfiles and bin under version control. But since I only used it as a sort of backup mechanism, I only committed to the repos "whenever I felt like it", which was pretty rare. Also, conflicts between edits on different systems were rare, but when they did happen, Syncthing couldn’t help much to resolve them other than silently putting a conflict file in the directory.
So this time, I resolved to go "all in" on sharing between computers via Git repos. This is done with "bare" repos, which I’d always avoided because I didn’t understand them and they seemed redundant on a local system like mine.
Turns out bare repos are nothing to fear and it’s quick and painless to set up the ability to serve a repo from a central "server".
On the "server" (in my case, Phobos) create a bare repo from existing local repo foo
with:
cd /home/dave/
git clone --bare foo repos/foo.git
Now you have:
/home/dave/
    foo/
        <your source files>
    repos/
        foo.git
Set the upstream for the local repo on the "server" (Phobos):
cd foo
git remote add origin /home/dave/repos/foo.git
git push --set-upstream origin master
Now on the "server" (Phobos), you can push and pull from the foo repo locally. The server is done.
Then on any other machine on the network, clone it from the "server" (this automatically sets the upstream):
git clone phobos:repos/foo.git
(In the above example, the /home/dave/repos directory exists only on the Phobos computer. Note also that I’m user dave on all of my machines. Otherwise, I might need dave@phobos….)
Now I can push/pull from the cloned repo on any machine (including the server Phobos) to the "bare" repo and keep everything in sync.
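If my usernames did differ between machines, the standard Git-over-SSH syntax would let me spell out the remote user in the clone URL. A hedged example using the same foo repo:

# only needed if the local and remote usernames differ
git clone dave@phobos:repos/foo.git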
GNU Stow for "dotfile" management
If you’ve shared setups across multiple Linux/BSD/UNIX machines for a while, you know the "dotfiles" issue: how do you quickly and painlessly "install" files such as .bashrc, .vimrc, and .ssh/config across computers without losing your mind?
You can see a lot of solutions for this problem on github.com. I had my own dotfiles/setup.sh to do this (it created symlinks).
But I really like the philosophy of using older, general purpose tools whenever it makes sense.
GNU Stow was created to solve a different problem in 1993: managing symlinks to software packages. These days, it’s found a new purpose: managing symlinks to people’s "dotfiles". Most distros seem to have a Stow package available, so it’s just an install away (in my case, Slackware has a slackbuild; Alpine and Debian both have "official" packages).
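For the distros I just mentioned, installation looks roughly like this (run as root; package names to the best of my knowledge, so double-check on your distro):

# Slackware (via the SlackBuild, as used later on this page)
sbopkg -i stow

# Debian
apt install stow

# Alpine
apk add stow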
The key to understanding Stow lies in these principles:
- Stow manages "packages" of files (so you can group your dotfiles into categories like vim, bash, etc.)
- A package contains the exact tree of files you wish to symlink.
- The target directory is the parent of the stow root directory.
- Stow will attempt to symlink whole directories if it can. Otherwise it will symlink individual files.
This is best demonstrated with an example:
/home/dave/                        - stow target directory (parent of stow root)
    .vimrc --> dotfiles/vim/.vimrc - stow-managed symlink to file
    .ssh --> dotfiles/ssh/.ssh     - stow-managed symlink to dir
    dotfiles/                      - stow root directory
        vim/                       - the "vim" package
            .vimrc                 - the actual file
        ssh/                       - the "ssh" package
            .ssh/
                config
To install the above "vim" package:
~$ cd dotfiles
~/dotfiles$ stow vim
And then the "ssh" package:
~/dotfiles$ stow ssh
If the ~/.ssh directory already exists, Stow will create a config symlink in it. Otherwise, Stow will make a symlink for the entire ~/.ssh directory!
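If you’re ever unsure which of those two things Stow is about to do, its --no (simulate) and --verbose flags will print the plan without touching anything. A quick sketch:

~/dotfiles$ stow --no --verbose ssh   # dry run: shows what would be linked
~/dotfiles$ stow ssh                  # do it for real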
Shared Stow "dotfile" configuration
Now what’s interesting with the above ~/.ssh symlink is that every machine needs its own separate ~/.ssh/known_hosts file (you could share them, but that would be horrendous to manage). At first, I solved this by running mkdir .ssh && touch .ssh/known_hosts and then running Stow. That works because Stow is smart enough to see the existing directory and only create symlinks for the individual files inside.
But then I realized that it would be even easier to let every machine just go ahead and write to its local copy of the file in the symlinked stow directory (~/dotfiles/ssh/.ssh/known_hosts). How would that work when dotfiles is a shared Git repo? Easy! Just add known_hosts to the .gitignore file. Now every machine can write to a separate known_hosts in the stow repo without the repo being bothered by it at all.
My current .gitignore in the dotfiles repo:
*~
*.swp
pass/.gnupg/random_seed
ssh/.ssh/known_hosts
As you can see, .gnupg also has a file, random_seed, which should not be shared between machines.
Per-computer customization
Of course, if you’re like me, your various computer setups are only 99% similar, not identical. I’ve taken to using the hostname of the computer in my scripts to do additional things. It works really well.
From my .xinitrc:
host=$(hostname -s)

# rotate the two portrait orientation screens on callisto
if [[ 'callisto' == $host ]]
then
    xrandr --output HDMI-0 --rotate left
    xrandr --output DVI-1 --rotate left
fi
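The same trick works anywhere in the shared dotfiles. Here’s a minimal sketch of how it might look in a shared .bashrc (the settings are hypothetical examples, not my actual config):

# per-host tweaks in an otherwise shared .bashrc
# (the settings below are hypothetical examples)
case "$(hostname -s)" in
    phobos)
        # the always-on server: plain console editor
        export EDITOR=vim
        ;;
    callisto)
        # the desktop with the rotated screens
        export EDITOR=gvim
        ;;
esac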
Setting up a new computer
First, I’ll be installing Slackware Linux, which I describe in detail in How I set up new computers with Slackware Linux.
Now I just need to install Stow (note that sbopkg is Slackware-specific), get my dotfiles repo, and select which items to have Stow manage. Then I’ve already got most of my stuff available:
# sbopkg -i stow
~$ git clone phobos:repos/dotfiles.git
~$ cd dotfiles
~/dotfiles$ stow bash
~/dotfiles$ stow vim
~/dotfiles$ stow tmux
...
I let Stow manage my .ssh info, and I find that I need to manually correct the permissions for these files. I’ve got a script to do that. Then I test my 'p' SSH config alias for Phobos:
~$ cd .ssh
~/.ssh$ ./fix_perms.sh
~$ ssh p
Welcome to Phobos!
dave@phobos~$
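The fix_perms.sh script itself isn’t reproduced on this page, but a minimal sketch of the general idea (not necessarily the exact script) would be:

#!/usr/bin/bash
# Sketch: tighten ~/.ssh permissions after stowing.
# ssh refuses to use private keys and configs that other users can read.
chmod 700 "$HOME/.ssh"
chmod 600 "$HOME/.ssh/config"
# any private keys should also be 600, e.g.:
#   chmod 600 "$HOME/.ssh/id_ed25519"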
Now I can clone other repos (my home bin/ dir being the most important, of course) and test out the stowed Bash profile/rc:
~$ git clone phobos:repos/bin.git
~$ source .bash_profile
Repos all good! ( bin dotfiles )
dave@cygnus~$
And Vim needs my plugins installed. I’ve got this at the top of my .vimrc:
" To install plugins, get Vundle: " " git clone https://github.com/VundleVim/Vundle.vim.git ~/.vim/bundle/Vundle.vim " :BundleInstall
That’s it. A new machine all set up in as little as 5 minutes after reaching a command prompt!
Checking for changes
The advantage of something like Syncthing is that it is always working on your behalf. I wondered: was remembering to make my manual commits, pushes, and pulls going to be a problem?
It was. I rarely remembered to push my changes and never remembered to pull them until I realized I was missing something I’d done on another computer.
I think I’ve just solved that! It’s hardly perfect, but I’ve created a checker that I run every time I log in. I put it in my .bash_profile script:
# run my sweet ~/bin/check-repos script on login
check-repos
It tells me which local repo has uncommitted changes or needs to be pushed or pulled.
Output is as simple as
Repos all good! ( bin dotfiles test )
Or as complicated as
Repo bin:
  Local changed files:
    check-repos
Repo dotfiles:
  Local changed files:
    bash/.bash_profile
    bash/.bashrc
    xinit/.xinitrc
  2 remote commits to merge. (please do a pull)
Repo test:
  Local changed files:
    foo.txt
    new_stuff.log
    things/thing1
  1 remote commits to merge. (please do a pull)
  1 local commits unpushed. (please do a push)
I’ve put the entirety of the script at the bottom of this page. See check-repos below!
Time will tell if this is enough or if I’ll want to further automate the git portions. Since I’m not collaborating with others, merge conflicts do happen (from my own edits on different machines), but they’re not too frequent. I’m tempted to make a script that can do a complete commit (with message)/pull/push for the most common cases. But I’ll wait until I feel like I need it.
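If I do write that, it would probably be just a few lines. A rough sketch of the idea (a hypothetical helper, not yet part of my setup):

#!/usr/bin/bash
# Hypothetical "sync" helper: commit everything with a message, then pull and push.
# Usage: sync <repo> "commit message"
set -e
cd "$HOME/$1"
git add --all
# only commit if there is actually something staged
git diff --cached --quiet || git commit -m "$2"
git pull
git push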
Update 1 month later: Yup, check-repos has worked perfectly. I’m now keeping everything in sync much more regularly. The only thing that still annoyed me was having to cd <repo> ; git pull. It’s a small thing, but with a tiny script, now I can just type pull dotfiles after logging in:
#!/usr/bin/bash

# exit on failure (can't cd to directory)
set -e

repo="$HOME/$1"
cd $repo
git pull
Living the pseudoterminal life!
(At first, I started typing "living the terminal life" before I realized how that sounded. I don’t use real hardware terminals anyway, so "pseudoterminal" has the advantage of being more correct as well as not making it sound like I have an incurable illness.)
Being able to live out of a virtual terminal means incredible flexibility because dang near any computer in my house can become a temporary workstation.
It’s also largely what allows me to update my local environment via "dotfiles".
I’ve split this whole topic off into a separate page here: Computer Terminals.
check-repos
As promised, here’s my entire check-repos script. I’ll try to update this if I make any important changes.
#!/usr/bin/bash

# Repos shared between machines (hosted at phobos:/home/dave/repos)
known_repos=(bin dotfiles dwm test)

checked_repos=""
all_good=yes

# Check each repo for local or remote changes
for repo in "${known_repos[@]}"
do
    repo_dir="$HOME/$repo"

    if test ! -d $repo_dir
    then
        # directory doesn't exist, repo not on this machine
        continue
    fi

    checked_repos="$checked_repos $repo"

    cd $repo_dir

    # fetch changes from master
    git fetch --quiet

    # commits on the remote machine that we haven't pulled yet
    remote_commits=$(git rev-list HEAD..origin/master --count)

    # commits that we haven't pushed to remote yet
    local_commits=$(git rev-list origin/master..HEAD --count)

    # gets all untracked and staged/unstaged changes with full paths
    local_changed_files=$(git status --untracked-files=all --porcelain | cut -c 4-)
    count_local_changes=$(echo $local_changed_files | wc -w)

    counts="$remote_commits $local_commits $count_local_changes"

    if test "$counts" != "0 0 0"
    then
        # something is non-zero in this repo, we'll print output
        echo "Repo ${repo}:"

        # and don't print the all good message
        all_good=no
    fi

    if test "$count_local_changes" -gt 0
    then
        echo "  Local changed files:"
        for lcf in $local_changed_files
        do
            echo "    $lcf"
        done
    fi

    if test "$remote_commits" -gt 0
    then
        echo "  $remote_commits remote commits to merge. (please do a pull)"
    fi

    if test "$local_commits" -gt 0
    then
        echo "  $local_commits local commits unpushed. (please do a push)"
    fi
done

# Done with repo checks, all good?
if test $all_good == yes
then
    echo "Repos all good! ($checked_repos )"
fi
Enjoy! And let me know if you have any improvements on any part of my setup. :-)