Wednesday, October 1, 2014


Why I don't use tablets

The world is filled with tablets and smartphones. I've resisted the temptation to jump on the bandwagon (mostly; I did buy a used Zaurus).

My main computer activities consist of running a server, programming, and researching information on the internet. Writing programs on a tablet just seems silly. There's no tactile feedback from an on-screen keyboard, and even a tablet with a real keyboard would be nowhere near full size. The grunt-and-point interface just doesn't cut it.

I've always wondered just how difficult it is to repair a tablet. Fortunately the engineers at iFixit have ranked tablets by repairability:

Tablet Repairability

The Microsoft Surface Pro ranked the worst and the Dell XPS 10 took top ranking as the most repairable device on the list.

The advantages of using a desktop or tower computer over a tablet are numerous: no battery life to worry about, proper keyboard, repairable (replacement parts are easy to come by), large monitor, and it's just nicer to sit at a desk with a good chair. Last but not least: you're probably not going to drop your desktop computer.

Portable devices such as cameras and ereaders do have their uses, but to my mind tablets have far less utility than a standard desktop. One cannot help but wonder what the sociologist Vance Packard (author of The Waste Makers) would make of tablets. It's hard to imagine he would be in favour of them.

Friday, September 5, 2014


Why the Computer Experience is Often Poor


The most common cause of damage to computers is heat. A lot of
computers are not designed well enough to deal with 100% CPU use over long
periods of time. This has been a problem since computers began with
vacuum tubes, and it remains to this day: a lot of modern laptops still
overheat, and there are many examples of video cards in desktop machines
being damaged by excessive heat.

Intel has tried to deal with this problem with its SpeedStep technology.
If the sensors say the computer is getting too hot, the CPU scales its clock frequency down to a lower value. Other solutions for overly hot laptops include cleaning out the fan vents and buying a cooling pad, of which there are active and passive types.
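On Linux you can watch this scaling happen through the cpufreq interface in sysfs. A minimal sketch, assuming a reasonably modern kernel (the sysfs paths vary between kernels and may be absent entirely on older systems):

```shell
# Read the current and maximum CPU clock from the cpufreq sysfs interface
# to see whether the kernel has scaled the processor down.
CPUFREQ_DIR=/sys/devices/system/cpu/cpu0/cpufreq
if [ -r "$CPUFREQ_DIR/scaling_cur_freq" ]; then
    cur_khz=$(cat "$CPUFREQ_DIR/scaling_cur_freq")
    max_khz=$(cat "$CPUFREQ_DIR/scaling_max_freq")
    msg="CPU0 at ${cur_khz} kHz of a ${max_khz} kHz maximum"
else
    msg="no cpufreq interface on this system"
fi
echo "$msg"
```

If the current frequency sits well below the maximum under load, thermal throttling is a likely suspect.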

Generally a desktop with good air flow should not overheat.

Note: I've never had a Panasonic Toughbook CF-48 overheat.


Unfortunately just about all computer components fail at some point, so it's
a good idea to watch for signs of impending failure. One can make
component failure less likely by using high-quality parts, particularly
power supplies, motherboards and hard drives. Some external hard drives
are ruggedized and have anti-shock protection.
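For hard drives in particular, the early warning signs can be read via SMART. A sketch using smartctl from the smartmontools package (the device name /dev/sda is an assumption; substitute your own drive):

```shell
# Ask the drive for its SMART self-assessment. A "FAILED" verdict, or a
# climbing reallocated-sector count in the full "smartctl -a" output,
# means the drive should be replaced before it dies outright.
disk=/dev/sda
if command -v smartctl >/dev/null 2>&1; then
    verdict=$(smartctl -H "$disk" 2>/dev/null | grep -i 'overall-health')
    : "${verdict:=no SMART data readable from $disk}"
else
    verdict="smartmontools not installed"
fi
echo "$verdict"
```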

The DC adapters used with a lot of routers and cable modems are cheaply made
and are a common point of failure. A cable modem can also degrade without
failing completely, showing up as increased packet loss. The easiest
solution in that case is to swap out the cable modem entirely for another
one. With routers, merely swapping out the DC adapter (or wall wart) will
usually fix the problem. Be sure to match the physical connector, polarity,
voltage and amperage of the original unit.


It's one of the oldest tricks in the book for manufacturers of printers
and scanners: keep the protocol the device communicates in a secret, thus
making life difficult for software programmers (especially FOSS ones).
Often there is little or no technical documentation to support the development of free and open-source device drivers for their products.


As computer motherboards become more and more integrated they become less and
less repairable. In the days of the original IBM PC the motherboard had no disk controllers
or I/O ports; just the memory chips, the CPU, and external ports for the keyboard and cassette deck. It was a relatively simple matter to replace defective components. Not so with a modern motherboard that integrates sound, networking, hard drive controllers, USB, and who knows what else.

One wonders what level of repair is possible with tablets and other tiny devices built
with surface-mount technology. On the other side of the coin, a modern LCD HDTV is often quite easy to repair: the large case ensures a reasonable distance between components, so one can at least do replacement at the component level.

The road to unrepairable computers began with Very Large Scale Integration (VLSI) technology in the 1970s. Before the microcomputer appeared, a CPU was a collection
of separate cards. The DEC PDP-11/20, for example, was built from integrated-circuit flip-chip modules, and individual flip-chips could be repaired.


The final issue is software creep, by which I mean the continual replacement of older
software with newer software. It seems unavoidable, but it's usually unnecessary.
How often does one really need to update a word processor? I'm using abiword on
Linux and it seems adequate for my purposes. KDE versions after 3.5.10 do not interest
me. In fact I've switched to the lighter IceWM and I'm quite happy with it.


In conclusion, we can see why the computer experience is often poor: mostly poor design and a lack of knowledge of computer maintenance. It's not
the user's fault; manufacturers want unknowledgeable users and unrepairable devices so they can keep selling you a new computer or television or $ELECTRONIC_DEVICE every four years or so. That's bad for your pocketbook and bad for the environment. All that e-waste has to go somewhere, and somewhere is usually a landfill site.

The solutions to these problems are somewhat unclear, but I favour using older, more
repairable computers as part of the answer. I've seen Panasonic Toughbooks last over ten
years, and there's no reason one couldn't keep such a machine working for at least another decade.

Tuesday, August 12, 2014


Unix version 5 demo

A short video of Unix version 5 from Bell Labs, circa 1974, running on a PDP-11/70.

Friday, July 25, 2014


We Have Strayed from the Original Ideas of Unix

When Ken Thompson and Dennis Ritchie created Unix and the C language, they also created
a philosophy of Unix.

Some of the original ideas were:

    Small is beautiful.
    Make each program do one thing well.
    Build a prototype as soon as possible.
    Choose portability over efficiency.
    Store data in flat text files.
    Use software leverage to your advantage.
    Use shell scripts to increase leverage and portability.
    Avoid captive user interfaces.
    Make every program a filter.

In some ways we have actually improved on the Unix philosophy, with Richard Stallman's GPL. We also have a mostly standardized graphical system in the X Window System. I can't find any overt references to the sharing of source code in the early days at Bell Labs, but it clearly did happen, even if it was de facto
rather than de jure.

But the idea that "small is beautiful" has faltered rather a lot. Unix and Unix-like distros have become rather bloated, and no one would describe programs like The Gimp or Photoshop as "doing one thing well". I'd be willing to grant that we have chosen portability over efficiency. Leveraging software to our advantage: yes, we have done that.

As for "avoid captive user interfaces", well, there's lots of room for improvement on that score. Storing data in flat text files isn't always possible: no one would think of storing a video as a text file. Ditto for making every program a filter. A lot of programs are graphical and interactive, and I can't really see any way to make filters out of those.
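For plain text, though, the filter model still works exactly as intended. A small illustrative pipeline (the input words are made up):

```shell
# Each stage reads stdin, transforms it, and writes stdout; the shell
# connects them. Here: count how often each word occurs.
counts=$(printf 'apple\nbanana\napple\n' |
    sort |       # bring identical lines together
    uniq -c |    # collapse each run into "count word"
    sort -rn)    # most frequent first
echo "$counts"
```

Three small tools that each do one thing well, composed into something none of them does alone.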

Looking at Unix version 5 we can truly see that "small is beautiful". The kernel of the original Unix v5 is 25,802 bytes in size. The entire operating system could be stored on a DEC RK05 magnetic disk pack, which held 2.5 megabytes. That includes the C and Fortran compilers, the kernel, the assembler, the device drivers, the userland programs, and the source code for all of the above. It was a truly amazing accomplishment. To be fair, you would have needed a second RK05 disk pack for the man pages and a little extra breathing room.

Another big advantage of Unix version 5 was that it was quite possible for a determined programmer to read the source code of the entire system. That really isn't possible with a modern Linux distro. It would have been great if every programmer's introduction to programming included Unix version 5, and thanks to simh and the Unix Heritage Society, you can still do exactly that.
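A sketch of what that takes under simh's pdp11 simulator, assuming a v5 RK05 disk image obtained from the Unix Heritage Society archive (the file name unixv5.dsk is hypothetical):

```
; pdp11 configuration for booting Unix v5 (a sketch, not a tested setup)
set cpu 11/40            ; v5 ran on 11/40-class machines
attach rk0 unixv5.dsk    ; RK05 disk image from the TUHS archive (hypothetical name)
boot rk0                 ; at the "@" prompt, type "unix" to load the kernel
```

Once logged in, the kernel source should be sitting right there on the disk under /usr/sys.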

In my previous blog entry I talked about using less memory. Unix version 5 ran quite well in 256 kilobytes of RAM.

Saturday, June 7, 2014


Using Less Memory

In an effort to make older computers more viable, one must consider ways of
using less memory. One of the most effective changes I've made on my
quasi-old P3 system running Vector Linux was to switch from KDE 3.5.10
to IceWM. KDE 3.5.10 needs roughly 41 megabytes of memory, while the much
lighter IceWM uses only 4.5 megabytes. That's a significant saving, and
there are other advantages as well.
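Figures like these can be reproduced by summing the resident set sizes of the relevant processes. A sketch (the process name icewm is what my system reports; KDE's usage would be spread across several processes, so take this as an approximation):

```shell
# Sum the resident set size (RSS, in kilobytes) of every process whose
# command name matches the window manager.
wm=icewm
if command -v ps >/dev/null 2>&1; then
    rss_kb=$(ps -eo rss=,comm= | awk -v name="$wm" '$2 == name {sum += $1} END {print sum + 0}')
else
    rss_kb=0
fi
echo "$wm resident memory: ${rss_kb} kB"
```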

Web browsing and even games were found to be more responsive after the switch
to IceWM.

Kmail was replaced by Gmail and the text-based email client mutt.
The scanning program Kooka has been replaced by xscanimage, as Kooka doesn't
work with newer Linux systems that use libusb.

The idea of using Trinity or XFCE as a way of using less memory is only
effective if switching from KDE 4 or Unity, both of which are memory pigs.
IceWM has a far smaller memory footprint and a better cost/benefit ratio.

I still use some parts of KDE 3.5.10, mainly klipper, kmix and konsole.

Unload kernel modules which are not needed, for example:
sudo /sbin/modprobe -r bluetooth
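To see which modules are worth removing, list the loaded ones sorted by size (lsmod, from the usual module utilities, is assumed to be present):

```shell
# lsmod prints: module name, size in bytes, use count, dependent modules.
# Sorting by the size column puts the biggest candidates on top; a use
# count of 0 (third column) means the module can be unloaded safely.
if command -v lsmod >/dev/null 2>&1; then
    biggest=$(lsmod | tail -n +2 | sort -rn -k2 | head -5)
    : "${biggest:=no modules listed}"
else
    biggest="lsmod not available here"
fi
echo "$biggest"
```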

Repeated information requests can be automated using links or wget, both
of which use far less memory than firefox.

links can be used in scripts for quick stock checks, where $1 is the stock symbol and QUOTE_URL stands in for the quote site's address (elided in the original):

links "${QUOTE_URL}$1"

Generally, text-based programs use far less memory than their graphical counterparts.

In many ways firefox is one of the biggest memory pigs of them all, and dillo is
my first choice among light web browsers that use X. The main problem with dillo is
that it cannot handle https or javascript. Assuming one needs neither of those things,
dillo is quite suitable as a web browser. Also, when the computer is unattended
I always exit firefox so that it doesn't consume CPU time, which can be
very high if the web page you are on has flash-based ads!

Reaping defunct (zombie) processes, by killing or restarting their parent
processes, will also help to free up resources: the zombie itself holds little
more than a process-table slot, but a parent that never reaps its children is
often leaking memory too. It is worth checking periodically for dead processes.
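Zombies announce themselves with a state of Z in ps output. A sketch that lists each one along with the parent that has failed to reap it, so you know what to restart:

```shell
# A defunct process shows state "Z"; the fix is to restart its parent
# (the PPID), since the zombie itself cannot be killed.
zombies=$(ps -eo pid=,ppid=,stat=,comm= 2>/dev/null |
    awk '$3 ~ /^Z/ {print "pid", $1, "parent", $2, $4}')
report=${zombies:-"no zombie processes found"}
echo "$report"
```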

Saturday, March 8, 2014


Pollution Caused by Chip Fabrication

It is a sobering fact that the chip fabrication industry which is so vital to our modern society is also the cause of a lot of pollution. This unglamorous topic doesn't get much media attention. No one wants to be reminded that the hi-tech world of computers isn't possible without the use of a lot of caustic chemicals.

To summarize, some of the substances used in the various processes of chip fabrication include acetone, arsenic, arsine, benzene, cadmium, hydrochloric acid, lead, methyl chloroform, toluene and trichloroethylene. More details on these substances can be found here: Environmental Impact.

The cost of cleaning up contaminated soil was one of the reasons why Commodore Semiconductor Group went out of business. More information on this can be found at the EPA site, but suffice it to say that groundwater in Norristown, Pennsylvania was contaminated with high levels of trichloroethylene. This also contributed to the eventual demise of Commodore Computers.

This leads me to propose the idea of using older computers as long as possible instead of continually buying new ones. For the last 10 years I have done this, although initially it was more of a cost-saving measure. My computer philosophy has shifted towards the idea of 'computer minimalism'. I've already remarked on the longevity of certain computer models, including ones using the Pentium 3 Coppermine CPU.

To apply the idea of minimalism to computers we must select an operating system that is efficient and adjustable. My short-list of possible operating systems for minimal computing include: OpenBSD, Vector Linux Classic, and Fedora Core 1. One might be a bit surprised to see FC1 on that list due to its (relative) old age, but I must admit to a certain amount of inclusionism in my philosophy on computers. One could conceive of examples of far greater levels of minimalism but I wanted to be able to at least do mundane things like running a web browser and web server.

Thus we could fix, or at least reduce, the pollution caused by chip fabrication by buying or fixing up old computers that have already been manufactured. Naturally we would select old computers that are the most easily repairable and have a ready supply of replacement parts available. One example of a computer that fits this description is the IBM Personal Computer 300GL (slot 1, P3 running at 400 MHz).

Saturday, January 11, 2014


DNS Propagation and Disappearing Services

I had to change the nameservers on a few of my domains. Formerly I was using the free service dnsever to supply nameservers for my domains; that stopped working yesterday. It's an unfortunate reality that free internet services disappear, or cease to be free.

Since I have more than one domain pointing to my server, it's still possible to access it via another domain. The new DNS records should propagate within 24 to 48 hours, so things should be back to normal in a day or two.

It's been a bit of a battle keeping web services running. The now-infamous ice storm that began on Dec. 22nd, 2013 disrupted power and telephone lines through much of Ontario. One of my phone lines stopped working, but it wasn't too bad as it was restored in a couple of days. I also experienced two power failures, of about 5 hours and 2 hours respectively. My UPSes kept the computers going for half an hour each time, but it's clear that I need a longer-lasting UPS than that.
