AN XCELLENT IDEA
No I haven't made another spelling mistake in the title of a post.
Today I'm talking about X, or X11, or the X Window System, or is it
the X Windowing System, or X Windows System, no that sounds too
Microsoft... Well you know, Xorg, or X.Org, or XFree86, or Xming,
or...
OK, OK, just as long as the system itself isn't as confusing as
what name to call it... [peeks into the source code] JESUS CHRIST!
Well alright, so long as you don't look under the hood or, you
know, actually try to write software that deals with it directly,
it's pretty nice from a user perspective. In fact a lot of that
complexity in code is to allow for the flexibility of a
client-server architecture that actually allows software running on
one computer to display a window on another connected via TCP. Very
handy back in the 80s when it was written, back when giving lots of
people computers able to run graphical software was prohibitively
expensive compared to the cost of just buying one mainframe and a
fleet of dedicated X terminals.
Of course nowadays anyone sane probably just has their X clients
(the applications they're running) talking to an X server on the
same machine, little thought of until one day it fails to start at
boot-up. But what I've realised is that in this largely ignored
capability might be the solution to my long-term headaches
connected with software updates.
To explain these headaches, I'll start with news on a long running
saga that I've been complaining about since this phlog first went
online. I had a final go at getting OpenSSH to compile in the
configuration that I wanted, this time after studying forum threads
from people who had similar problems (clearly this is something
that the developers are in no rush to fix). Armed with a few more
additions to my mile-long ./configure line, I finally got it to
find the OpenSSL library and build the Makefile. But then gcc
couldn't find OpenSSL. I was out of time and well and truly out of
patience, so even though there was clearly a path to follow from
there, I refused to take it.
I'd ruled out alternative SSH clients such as Dropbear and Lsh
because just as important as shell access is SFTP for uploading to
my sites like Free Thoughts (though I have discovered that shell
access to aussies.space is actually more bearable over my slow
connection with SSH compression enabled (-C), so composing things
on the remote server isn't entirely out of the question anymore),
and the other projects didn't include an SFTP client. But then I
discovered that PuTTY, a popular open-source SSH and
anything-else-terminal client for Windows, which I already had
installed on my old Windows XP PC, is available for Linux too, and
it includes an SFTP client! I downloaded the source and, without
needing any separate encryption library, it compiled without a
hitch. Plink is actually the command-line version of the terminal
client program and uses many of the same commands as OpenSSH's
client, while psftp is the command-line SFTP client. The graphical
PuTTY can be built too, and uses GTK, but allows you the choice of
building against GTK 1, 2, or 3! Why does one never get such choice
when building popular software for Linux?! Sure, sure it might be a
pain to maintain, but geeze it would make my life easier if
everyone respected compatibility like that.
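For anyone wanting to try the same thing, the command-line tools slot in roughly where the OpenSSH ones would (the hostname here is just my own example, and the flags are from memory, so check the PuTTY docs):

```shell
# plink is PuTTY's command-line terminal client. -ssh forces the
# SSH protocol and -C enables compression, which is a big help on
# a slow connection.
plink -ssh -C freet@aussies.space

# psftp is the command-line SFTP client; once connected, commands
# like cd, put and get work much as in OpenSSH's sftp.
psftp freet@aussies.space
```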
Anyway this has been a good example of the sort of troubles that
I'm having keeping my computers set up the way _I_ like, often
using outdated hardware because I like to stick with what works
(and I can't afford to buy the latest anyway), while maintaining
access to the internet. It's especially the case now that
everyone's going encryption-nuts and requiring things like HTTPS
everywhere even though I don't have a personal need for it most of
the time, and those who do can use things like the HTTPS Everywhere
browser add-on. With the web specifically, of course there's also
the nightmare of Javascript and the corresponding need to run ever
more bloated web browsers in order to access many parts of the web.
It recently became impossible to browse YouTube with Javascript
disabled, even just to find video links to download with
youtube-dl. Plenty of other sites had gone that way already of
course.
As I mentioned in a previous post, "The Need for New" (oh boy,
almost four months and I still haven't got the new laptop set up),
updating internet software (especially things like Firefox that
update regularly and aren't practical to compile myself) leads to
updating the OS, which leads to breaking compatibility with other
software that I want to use (unless diving in to more difficult
compiling and patching), and breaking compatibility with hardware
that is no longer supported, or now has bugs in the drivers, or is
just less efficient so needs more processing power, or RAM, or disk
space. I don't need all that if I just want to edit text files,
compile software, listen to tracker modules, draw diagrams, write
documents, view and edit photos, design electronics, play old
computer games, record myself humming, etc. For all that I've got
my computers working how I like them and that's as far as things
need to go. It's for one reason that I'm expected to throw all of
that neat arrangement into complete disorder and rebuild it
according to the ways of the moment. That reason is accessing the
internet.
So, back to X, a system that's been mercifully static for at least
a couple of decades now, and available for pretty much any Linux
distro new and old, as well as for Windows if you install it. The
answer to my troubles (maybe) is to install all of the internet
software that I use on one headless modern (but cheap) computer
connected to my LAN, and run that internet software with the
display on my old outdated systems, via telnet for terminal
software, and X for the more difficult things like Firefox. Then I
have only one system to keep up to date, only one set of internet
software to keep updating on it, and also only one configuration to
keep set up how I like (an important consideration given that
there's a lot that I like to change in Firefox's about:config).
It's also a solution for sharing bookmarks over multiple machines
(I know there are a lot of answers for that, but none that I
managed to stick with personally so far).
I'm not saying this is all that radical or clever, it's what X was
designed for after all, but it does involve a bit of planning and
the details have been bogging me down a lot more than I expected so
far.
Just a warning: from now on this will be a bit more of a log than a
readable writeup because I'm still working on this and undecided
about how, or whether, it will work. On the other hand that's the
only reason why I'm devoting the time to writing this, because I've
ended up with tons of vague dot-point notes scribbled down on paper
and I need to condense my thoughts somewhere or else I'm likely to
get lost. It might as well be more chow for the hungry Gopher.
First off, a list of internet software that I would want to run on
the "internet client" computer. Note that this is only including
stuff that's likely to become obsolete (not inc. Gopher or NNTP for
example), though _ideally_ I should install that too and block
direct internet access from other computers on my LAN for increased
security given that their OSs will have known security
vulnerabilities because they don't get updated much / at all.
* ---WEB BROWSERS--- - This is the main thing, and Firefox in
particular. Chromium too if I can't avoid it.
* POP, IMAP, SMTP - technically it's alright if the Mail User Agent
software runs on the other computers, but mail is received and sent
by the internet client.
* SSH
* SFTP and FTPS
* Gemini browsers
* Downloaders, eg. Wget and Curl (also youtube-dl). - Need to wrap
these in a script on the other computers so that eg. wget -O
/somewhere/file has wget on the internet client stream the
downloaded data over the terminal connection and write that to the
file on the local machine. Otherwise lots of scripts might break.
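A wrapper along those lines might look something like this; "netbox" is a stand-in hostname for the internet client, and the whole thing is just a sketch of the idea:

```shell
#!/bin/sh
# wget-remote: hypothetical wrapper to live on the old machines.
# Runs wget on the internet client and streams the download back
# over the SSH connection, writing it to a local file so that
# existing scripts keep working unchanged.
url="$1"
out="$2"
# -q keeps wget's progress chatter out of the data stream;
# -O - sends the downloaded file to stdout instead of to disk.
ssh netbox wget -q -O - "$url" > "$out"
```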
That last one touches on the issue of dealing with saving downloads
to file in general. As it is I mostly already subscribe to the web
browser convention of saving to a "downloads" directory and sorting
them out from there. All I need is to set up NFS so that a
downloads (and uploads for SFTP) directory on the internet client
is accessible from all other computers on the LAN. But then there's
the problem of file formats. Standards like PDF, for example, keep
changing and I've already had trouble compiling newer versions of
PDF viewers. Such changes are completely needless in my opinion,
and if there was any logic in the world we'd be using DjVu format
anyway, but that's the internet for you. The same goes for video
and audio formats, and even old reliable image formats are
apparently under threat with new alternatives for JPEG being pushed
by the likes of Apple and Google. Then there are Microsoft Office
documents, but I'm mostly in a position to tell anyone who's
likely to send me one of them to either convert it to RTF, or PDF,
or HTML, or get stuffed.
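As for the NFS side of it, the setup should only amount to an export on the internet client and a mount on each old machine (the paths, hostname and subnet here are just examples):

```shell
# On the internet client: export the shared directories by adding
# lines like these to /etc/exports, then run "exportfs -ra":
#   /home/freet/downloads  192.168.1.0/24(rw,sync,no_subtree_check)
#   /home/freet/uploads    192.168.1.0/24(rw,sync,no_subtree_check)

# On each old machine: mount them somewhere convenient.
mount -t nfs netbox:/home/freet/downloads /mnt/downloads
```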
So I really want software on the internet client that can convert
these horrible modern formats to something that I know and love,
ideally running from a script that finds and converts them
automatically.
* Image converters - ImageMagick should be up to date enough for
this, I hope.
* Audio converters - Maybe also a player rigged up to some buttons
so that I can use the internet client as a little music box? Or is
this needless feature creep... Well playing streaming audio would
be complicated if it didn't have speakers.
* Video converters
* PDF converters
* Microsoft Office document converters
One other thing is that I need to have all of my printers on the
network, so that I can print web pages from the internet client.
Again nothing radical, but a pain to set up because so far I've
worked on the logic that USB (or LPT) connected printers are fine,
after all you need to be next to the printer to grab the paper it
prints anyway.
Yes and speaking of hardware hassles, it's time to look at what
computer I'd actually use for the internet client. Well jumping
straight to my (first) conclusion, The Raspberry Pi 4. Poor as I
claim I am, I can let $100 (Pis are more expensive in Australia -
pissed me off when I heard about the $5 Pi Zero and then paid a
little over $10, seems to be a bit more than just the exchange rate
difference going on) go towards one of these without losing much
sleep. Besides being cheap to buy, it should use a lot less power
than a full PC, which is important given that such power would be
in addition to that used by the PC I'm accessing it from.
But I've got a laptop, and on rare occasions I actually take it out
of the house. What if I want to access the internet from it while
I'm away from home where my LAN is? I could SSH in over the
internet and use X port forwarding (ssh/plink -X option) I
suppose...
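That would look something like this (with an invented hostname for my home connection; some X setups want the trusted -Y option instead of -X):

```shell
# From the laptop, over the internet: SSH in with X forwarding so
# remote X clients display locally. -C compresses the connection,
# which matters a lot for X traffic on a slow link.
ssh -X -C freet@home.example.net

# Then, within that session, any X program shows up on the laptop:
firefox &
```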
Experiment time (well a few weeks/months ago actually, but I'm
trying to make this exciting)!
I don't think the idea of tildes is really about running X, but
nevertheless aussies.space does have the Xlib library installed
(though not the development headers) and I found one lonely X
program on its (I'm pretty sure virtual) hard disk:
$ xev &
Whoopie. A little box in a window and a whole lot of logged info in
the terminal. Xev is actually just a debugging program, but it
proved the point. Except it actually took ages after running the
command for it to appear, and it doesn't really do anything so it's
hard to gauge responsiveness. Turning SSH compression on helped get
it to start up much faster (at this point I realised that it made
everything faster, but it seems to be especially the case with X
port forwarding), but I needed a real program to test properly.
The easy way would be to compile an X program as a static binary,
then just copy it over to aussies.space. But all of the PCs that I
use are still 32bit and aussies.space is x86_64. So I copied
various Xlib library files over from my computer and wrestled them
into makefiles until gcc was finally satisfied. Whereafter it built
for me the timeless image viewer "xv", and the probably fairly
forgotten game Xinvaders 3D (which uses Xlib's line drawing
commands rather than bitmaps, so it wouldn't have as much
information to transmit as other games). I set the permissions so
that anyone on aussies.space should be able to run them too:
$ /home/freet/bin/xv &
$ /home/freet/bin/xinv3d &
As for the result, well xv takes a few seconds to refresh even a
small image after moving the window, and Xinvaders 3D is just about
playable but very laggy. There's no way that browsing websites in
Firefox would be practical this way. Of course that's my slow
internet connection, I haven't been to anyone with fast (wired)
internet for ages due to the virus (as I'm in the only Australian
state where it's really got away, but death and economic ruin isn't
important enough to warrant a phlog post on that compared to
configuring an unusual software environment), but the point is that
I'd be connecting to the internet client at home, so this is the
sort of performance I'd expect doing that.
I did configure xv so that it's now used for viewing images in both
Lynx and ELinks though. ELinks was a real bugger to configure for
this without any examples to copy, by the way (this post is already
too long, but ask and I can share the config I ended up with).
So with that option ruled out, the next one is to take the internet
client with me instead of leaving it at home. But then I've got to
set that up as well as the laptop, and it won't be battery powered
unless I build a battery supply for it, and that's too much trouble.
What I want is to be able to power it from the laptop, but a Pi4
would draw too much current. But a Raspberry Pi Zero W doesn't! It
peaks at a little over 300mA, which is fine for USB. Plus it
supports a USB "On The Go" mode where you can actually plug it into
a USB port and it emulates a USB network adapter connected to
itself. There are even boards on Ebay that allow you to solder on a
USB-A Male connector. Raspberry Pi OS (no longer called Raspbian)
is compatible with both the Pi Zero and the Pi 4. So when I leave
the house I just have to pull the SD card out of the Pi4, put it in
the Pi0w, then later stick the Pi0w into the laptop, a network
connection is set up between them, then I configure the Pi0w wifi
and use all the internet software on it like I do on the Pi4 at
home. Great!
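From memory the gadget-mode setup on Raspberry Pi OS is only a couple of config lines, but do check the current documentation rather than trusting my notes:

```shell
# /boot/config.txt - enable the USB device-mode controller overlay:
#   dtoverlay=dwc2

# /boot/cmdline.txt - load the gadget modules at boot (appended
# after "rootwait", all on the one line):
#   modules-load=dwc2,g_ether

# After that, plugging the Pi0w into the laptop's USB port should
# make it appear as a USB ethernet device (usb0) at both ends.
```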
Now sure the Pi0 with a 1GHz ARM processor and 512MB of RAM isn't
much of a performer these days. But my current laptop (the one that
I still haven't got around to upgrading) is a 1GHz Pentium III with
768MB RAM, and it runs Firefox at a mostly bearable pace with my
configuration that disables as much junk as possible from both the
browser and the sites that it browses. For the rare times that I
want to browse the web away from home, it should suffice.
So yesterday (yeah things move slowly on this) I tried that out on
the Pi0w that I had already. If all went well I'd set it up fully
then buy a Pi4 to boost performance into the modern era. Mostly it
did go well, but Firefox didn't.
First off, I did experiment earlier accessing Dillo running on the
Pi0w over SSH with X port forwarding on my LAN. It was pretty laggy
and the sshd process on the Pi was eating more than 50% of CPU time
while scrolling around a page. Useless encryption again - nobody is
snooping on my LAN (if they were close enough to do so, I'd see
them). There's a simple HOWTO at TLDP that covers the basics of
setting remote X windows up properly, and there's not much to it:
http://www.tldp.org/HOWTO/Remote-X-Apps.html
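The recipe from that HOWTO boils down to something like the following (hostnames are my own examples, and allowing hosts with xhost is the lazy option; the HOWTO covers xauth for anything less trusting than a home LAN):

```shell
# On the PC running the X server: allow TCP connections by removing
# "-nolisten tcp" from xserverrc (or the display manager config),
# then permit the Pi to connect:
xhost +pi0w

# On the Pi: point X clients at the PC's display and run them.
DISPLAY=mypc:0 dillo &
```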
Having removed "-nolisten tcp" from my xserverrc and allowed the Pi
with xhost, Dillo could connect directly without any SSH nonsense,
and it was quite responsive enough for my needs, while CPU usage on
the Pi0 now only peaked at about 3% while scrolling around a
webpage wildly. Good, now on to Firefox. I was running the current
Raspberry Pi OS "lite", and a mountain of dependencies needed to be
installed for Firefox, all up almost 900MB of space taken up!
Anyway, it installed, so I ran:
$ firefox-esr &
"Illegal instruction"
Hmm...
Well the Raspberry Pi OS package archive will install Firefox on a
Pi0, but not a binary that's compatible with the Pi0. The story is
roughly that the Pi0 CPU implements the ARMv6 architecture, with
certain additions for hardware floating point operations (and maybe
more besides). Later models of "big boy" Pis after the Pi2 are
based on ARMv7 CPUs which all have hardware floating point. Debian
supports ARMv6 (but assuming no hardware floating point support)
with the "armel" target, and ARMv7 with the "armhf" target.
Raspberry Pi OS is apparently built for ARMv6 with hardware
floating point, which they then call "armhf" and should be
compatible with both ARMv6 and ARMv7 Pis while still using the
hardware floating point features for maximum performance (for some
things, web browsing not really being the first that comes to
mind). Therefore you'd expect that all Raspberry Pi OS packages
should run on the Pi0.
https://wiki.debian.org/RaspberryPi#Raspberry_Pi_issues
The truth of the matter is that they're probably copying the
Firefox-esr package from Debian armhf (the Debian armel Firefox-esr
version is way out of date (v. 60.9.0esr) for some (frustrating)
reason). As such it uses ARMv7 instructions, which don't work on an
ARMv6 Pi0. Bloody bastards!
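One way to check what you've actually been handed is to read the ARM build attributes out of the binary (readelf comes with binutils; the Firefox path is how I believe Debian lays it out, so adjust as needed):

```shell
# Ask readelf for the ARM build attributes of the binary. An
# ARMv6-safe build reports "Tag_CPU_arch: v6", while a Debian
# armhf build reports v7 - which is what dies with "Illegal
# instruction" on a Pi Zero.
readelf -A /usr/lib/firefox-esr/firefox-esr | grep Tag_CPU_arch
```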
If they can't be bothered keeping the Firefox package working on
the Pi0 with the official Linux distro (which I didn't want to use
anyway due to Systemd), there's not much hope of any other distros
doing so. Firefox being the most significant motivator for setting
up this internet client system, this basically rules out the Pi0w
as a hardware option.
But (sorry folks, this will just keep going on, my software
projects tend to go this way), there's more to Pi than Raspberry
(come on advertising agencies, there's talent going to waste
here!), and indeed there is a whole world of Raspberry Pi clones
and alternatives. Not quite so much for the Pi Zero, but there is
the Banana Pi Zero:
http://wiki.banana-pi.org/Banana_Pi_BPI-M2_ZERO
Faster RAM, quad-core, USB OTG, and ARMv7 (or at least armhf binary
compatible - that's all that I bothered to note down)! But enabling
USB OTG seems to require recompiling the kernel, and it's not clear
if the same kernel would work for a Pi4, and finding a distro for
it without Systemd will probably be hard, and nobody seems to have
measured the actual current draw so I can't tell if it can be USB
powered. Sigh.
Then while grumbling about this in my sleep last night I remembered
the Atomic Pi, an x86_64 board that is apparently actual New Old
Stock from a failed robot vacuum cleaner product, bought up by some
mob who made an I/O board for it and slapped on a "Pi" brand. With
postage to Australia they're more expensive than a
Pi4, while offering similar, if not worse, performance from what I
can tell. But if I get my "new" laptop set up, it is x86_64. So
when leaving home I could take the SD card out of the Atomic Pi
internet client, then wrangle the laptop's BIOS, or failing that
Grub and a copy of the kernel on the HDD, to boot from the SD. It
would mean that I would have to choose between booting into the
internet client OS or the laptop OS with all of the non-internet
software, but again I only rarely want to access the internet away
from home so that inconvenience shouldn't be very great.
Though as it's basically a one-off, it's not clear whether an
equivalent cheap, low-power, x86_64 board will be available to
upgrade to later when Firefox and the web becomes even more bloated
(the thought of what that might look like hardly bears
contemplation). Also, by needing modern Firefox to run on the
laptop, that means that the laptop hardware will keep needing to be
upgraded over time as well, which is one of the things I was trying
to get away from by isolating the internet software.
Another option might be to use the Pi4, then emulate it in QEMU on
the laptop, ideally keeping the whole client-server thing working
(though the internet client's internet would be via the laptop's
OS). Unless QEMU's emulation performance is really really good
though, I struggle to imagine this running fast enough for Firefox
to be usable, at least on the old laptop that I'll be upgrading to.
Also even more so than the Atomic Pi option, this will cause
pressure to upgrade the laptop later on (if not immediately).
https://wiki.debian.org/RaspberryPi/qemu-user-static
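For the record, the user-mode flavour described on that page looks roughly like this (package names as in Debian; whether it's anywhere near fast enough is another question entirely):

```shell
# On the x86_64 laptop: install user-mode emulation, which lets ARM
# binaries run transparently via the kernel's binfmt_misc.
apt-get install qemu-user-static binfmt-support

# Then an ARM root filesystem (eg. the Pi's SD card, mounted at
# /mnt/pi-root) can be entered as a chroot:
cp /usr/bin/qemu-arm-static /mnt/pi-root/usr/bin/
chroot /mnt/pi-root /bin/bash
```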
Yep, as usual with these things there's no good option. I was
kind-of hoping that it would look better with everything spelt out
in text. But no, since Firefox on the Pi0 turned out to be a lie
I'm not sure if I like any of the other paths that are left to
take. Or maybe carrying around a battery-powered Pi4 along with a
laptop wouldn't be so bad after all...
Hmm, two posts in a row that don't reach a conclusion, I must be
dealing with too much personal stuff here lately.
- The Free Thinker.