Saturday, July 25, 2015

 

Not Learning Unix is a Mistake


Back in 1986 my first exposure to Unix was Venix, made by VenturCom, running on an IBM XT.

It wasn't a great experience, but I learned about awk (we called it "awkward" at the time) and most of the basic Unix commands. Venix for the IBM XT was based on Unix version 7 with some programs from BSD, such as vi. I still run Venix 2.0 on xhomer on Linux via the magic of emulation; in this case xhomer is emulating a DEC Pro 350. There was almost no source code available for Venix.

I realized that closed source software was very undesirable from a programmer's point of view after I discovered Unix version 7 (much, much later) on the PDP-11/45, also via the magic of emulation. Unfortunately my initial exposure to Unix consisted of various closed source variants such as Venix and (in the SCO era) Xenix. Xenix was actually the second version of Unix I encountered, and although I learned about Unix System V there was still mostly no source code available.

There was little enjoyment to be had with these closed source Unix versions, but learning elements of Unix System V ended up being important because Linux largely modelled itself on SysV. For reasons of cost we didn't see personal computers with an MMU capable of running a full Unix until later; MicroVAXes and Sun workstations were well over $10K in price. It wasn't until the appearance of the Intel 80386 that we started to see affordable Unix machines, and by the late 1980s most folks could afford one.

It has occurred to me that not learning Unix is a grave mistake. My relatively early exposure to Unix was important: I may not have appreciated Linux as much, or even at all, if I hadn't had the ability to experiment at home with Xenix. Learning Unix develops new mental muscles, like playing a musical instrument or learning a new language, and learning these new skills becomes more difficult with age. To me the exact technical details are less important. It does not really matter whether you use Linux, one of the BSDs, or even something more exotic like Plan 9. The important thing is that you can learn new concepts from what I will broadly refer to as the Unix/Internet Community.

One way to metaphorically dip your big toe into the Unix pool is to set up Linux (or some other FOSS operating system) on a second computer. The secondary computer can be an old Pentium III machine with 256 MB of RAM; a new machine is not necessary. The major advantage of setting up a Linux computer is that you have access to tons of source code and tons of application software. The process of learning Unix is similar to physical exercise, only in this case your brain is getting the exercise. Although this process will require considerable effort on your part, it will be worth it.
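
If your second machine runs a Debian-based distro, getting at all that source code really is a one-liner. This is just a sketch (it assumes deb-src lines are enabled in /etc/apt/sources.list, and coreutils is simply an example package):

# fetch and unpack the source code for a package into the current directory
apt-get source coreutils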

My first exposure to Unix wasn't ideal; in fact I could even say that I disliked Unix initially, partly because of the lack of source code. The big realization of the benefits of Unix came later: Unix is the thinking person's operating system, and you will find many scientific utilities available. If your interests include physics, mathematics, chemistry, or astronomy then you will find something of interest.

After using Xenix on my computer at home in the late 1980s I made several attempts to use other operating systems. In 1994 I tried Darkstar Linux, but I didn't make it my main operating system. Around the same time I tried Minix 1.5, but it seemed quite difficult to use. It wasn't until I had broadband internet access that I finally made the transition to Linux in 2003, adopting Red Hat Linux version 8 as my main operating system.

Monday, March 9, 2015

 

Easy Way to Get Coreboot


   Replacing the proprietary BIOS firmware on most computers is a process that can often be frustrating, and it's possible that your computer could be rendered unusable along the way. Back in 2010 I managed to get coreboot working on the Gigabyte GA-6BCX motherboard, and although the process went fairly smoothly it did consume a fair bit of time. Fortunately we now have an inexpensive way of obtaining a ready-to-go coreboot computer.

   There are sites that now offer ThinkPad X60, T60 and X200 series laptops with coreboot preinstalled, although they often seem to be out of stock. The simplest way to obtain coreboot is to buy a Chromebook or a Chromebox, which are readily available from most computer stores.

   If you wish to replace ChromeOS with a Linux distribution you should read the info here.

   I had an occasion to use a Chromebook and I thought the keyboard was rather cheap, so I recommend getting a Chromebox and supplying it with a good quality keyboard. The ASUS CHROMEBOX-M115U has an Intel Celeron 2955U processor, 2 GB of DDR3 memory, a 16 GB SSD for storage and Intel HD Graphics, and it sells for as low as $200 Canadian.

   There is also a Coreboot on Chromebooks community here.


Saturday, February 21, 2015

 

AntiX Linux: A Brief Review


 Certain factors like systemd are polarizing the Linux community: it seems that either you like it or you hate it. Some of the Debian developers are getting nervous, and so a fork of Debian called Devuan has been announced.

   I'm always looking at distros that emphasize compactness and the ability to run on old hardware. I was also intrigued by the Debian controversy over systemd, so when I saw that AntiX 13.2 was based on Debian Wheezy I had to give it a try. AntiX comes on a single CD, so installing it was easy enough.

   The hardware used to test AntiX was an old IBM ThinkPad R30 with 1 gigabyte of memory. AntiX ran fairly well on this machine, well enough to run fbreader and iceweasel. It had a lot of good attributes, like fast boot speed, and it used IceWM, which I was already familiar with (fluxbox and JWM were also available). It also comes with LibreOffice and some multimedia software, and it has a fairly modern kernel: 3.7.10.

   As for the systemd issue: yes, Debian Wheezy has systemd on it, so this version of AntiX has it as well. All you need to do to remove it is:

apt-get remove systemd

What could be simpler? Things get a bit murkier with future releases like Jessie, which will have systemd on it by default.
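
If you want to confirm which init system is actually in charge after the change, a quick sanity check along these lines should work on most Linux systems (output and paths vary by distro):

# show the name of process 1, the init that is actually running
ps -p 1 -o comm=
# on systemd distros /sbin/init is usually a symlink to systemd;
# on a sysvinit system this prints nothing or a different target
readlink /sbin/init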

   I don't want to get into empty histrionics about systemd. I don't like it because I feel it's too big a departure from Unix principles. People are worried that systemd is "taking over", but so far that's not really the case. If you don't like systemd, try AntiX, Slackware or one of its derivatives, or if you prefer a more traditional Unix try one of the BSDs.

Tuesday, January 20, 2015

 

Mind Pollution


   The time has come to talk about a different kind of pollution. I'm not talking about the pollution that is created by manufacturing computers. This time I'm talking about pollution of the mind.

   Mind pollution has permeated the internet. Computer articles are mostly written from the point of view of a person who uses Microsoft products. The assumption is that everyone uses Microsoft Windows and Windows software. To be fair there are a few OSX and Linux articles. Heaven help you if you use something really unusual like Plan 9 or HaikuOS. I can only assume those folks have a lot of patience and a strong will.

   When I was in high school Microsoft Windows did not exist and we just used what was available at the time, which was a Commodore PET. For the folks too young to remember, a Commodore PET was an 8-bit computer with a 6502 CPU and 32 kilobytes of memory. This was in the early 1980s, so my perspective is different. Moving from one computer ecosystem to another is relatively easy for me since I've done it so often, and I naturally think in more abstract terms such as "presentation program" rather than "PowerPoint".

   Back then no one would have assumed you used one specific software package for word processing. You either used a dedicated machine that only did word processing or you went to the computer store to find some software that would run on the computer you owned. Since everything was so new, no one had any preconceptions about which software package was the "best". One usually went along with what one's school used; e.g. if your school happened to use a Commodore computer you probably would have bought a Commodore computer.

   Even in the mid-90s, when people were running to the mall to buy Windows 95, computer articles on the net didn't carry the overbearing assumption that you owned a Windows box and should now type "format c: /s" or "ipconfig" (it had been ifconfig in the Unix world since 1982). Or my personal favourite, "Get your friends off XP", which initially sounds good until you realize the article is about installing Windows 8.1 on your computer.

   That's not to say there aren't articles about Linux and BSD; there are lots of them. And even with articles about Firefox there are enough similarities between the Windows and Linux versions that one can often extract useful information from them. I suppose one could still complain about the assumption that everyone uses Firefox.

   It's really all about mind share. You might think "Microsoft doesn't control me", but what about the computing masses? What about the schools and the governments and the banks? We should have platform independence, and people should be able to do their online activities with the platform of their choice. The media should NOT assume that all their viewers use Microsoft software, because that is certainly not the case.

   Mostly I can use my Linux and BSD machines to do everything I want to do. There are a few edge cases where it takes quite a lot of effort to do certain things, but for the most part there are no serious problems. Walking into a computer store, though, reaffirms that you're in the Windows world. There are Microsoft stores now, but I haven't bothered checking them out as there's just nothing there for me.

   I'd like to think that with the internet we have bypassed rigid and inflexible thinking, as people are free to tap into any part of the web. Still, it behooves journalists and teachers to write about computer topics in a platform-independent way to the greatest possible degree. If one must write about platform-dependent topics, the article's title should at least make that dependence immediately clear. The general concepts and the broad interchange of information are the most important things.

Friday, January 2, 2015

 

BSD Community is Too Insular


First of all let me say I really like BSD. I enjoy studying its history, which extends back to 1978 when it was a mere add-on to Bell Labs Unix version 6. The longest uptime I've ever had on a computer was with OpenBSD. It's a fine piece of work.

On the other hand, when I look at the BSD community I see a less than friendly environment. It is rather like a gated community where you need to be invited in. Often when one goes to BSD forums one gets some mysterious error message and no access. IRC channels related to BSD are also invite-only.

When I re-entered the Linux community in 2003 there was a strong feeling that I was less than welcome. As an old Xenix user I thought I would be treated well enough; instead I was barely tolerated. Eventually I felt I was accepted, at least to a certain degree (I also developed a thicker skin). Some of the rudeness seemed to be part and parcel of being online. Manners on the internet have never been very good, but lately things have really deteriorated.

In the case of the BSD community things really do need some improvement. It's true that in the Linux world people are treated with faint rudeness and a certain amount of condescension, but that is still preferable to being utterly locked out of the BSD community. One wonders what rites of initiation are necessary to be accepted into the BSD world, and whether it's worth the trouble.

I consider the various BSDs to be closer to the original Bell Labs Unix. In some ways I prefer the BSD way of doing things. It's good that we have a varied ecosystem of software. Nevertheless the BSD community really needs to try to become more welcoming to newcomers. This is also true of the Linux community to a lesser degree.

The original programmers of Bell Labs Unix were undoubtedly writing an operating system for other programmers. Programmers are not the friendliest group of people in the world, and sometimes this can be very discouraging for young people who are getting used to Unix-type operating systems for the first time. An extra effort needs to be made for the BSD and Linux communities to become more welcoming and less insular.

Sunday, December 7, 2014

 

Order Linux Discs


Avoid Systemd by trying AntiX, Slackware or OpenBSD.
For older computers try Puppy Linux.


Buy Linux and BSD Discs



Wednesday, November 19, 2014

 

Thoughts on Systemd


First a short description of systemd:

Systemd is a collection of system management daemons, libraries, and utilities designed for Linux.

One of the main reasons why some folks don't like systemd is that it stores logs in a binary format, which goes against the Unix philosophy of storing data in flat text files. The old init method was sysvinit, which is described here:

http://en.wikipedia.org/wiki/Init#SysV-style
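
To see the logging difference in practice, compare how you read logs on the two systems. The
exact flat log file name varies by distro (it might be /var/log/messages or /var/log/syslog),
so treat this as a sketch:

# systemd: logs live in a binary journal and need a dedicated tool
journalctl --since today
# traditional syslog: plain text, readable with the usual Unix tools
tail -50 /var/log/messages
grep sshd /var/log/messages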

Of course we can go further back, all the way back to Research Unix v5 from 1974:

http://www.maxhost.org/other/text/original-unix-v5-init.c.txt

Prior to 4.3BSD, BSD init was the same as Research Unix's init; after that things diverged,
so there was a BSD way and an AT&T SysV way of running init. In the ancient days
of Unix v5 there were no runlevels: init simply ran /etc/rc in the Thompson shell and
then launched getty.
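
For a sense of how small that was, here is a rough sketch of the kind of thing an early
/etc/rc amounted to. This is illustrative only, not a verbatim copy of the v5 script:

# an early-Unix style /etc/rc, sketched for illustration (not the real v5 file)
rm -f /etc/mtab          # clear the mount table left over from the last boot
/etc/update              # start the daemon that periodically syncs the disks
date > /dev/console      # announce the boot time on the console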

Some folks say that systemd is the svchost.exe of Linux, essentially making Linux more
like Microsoft Windows. It is a monolithic entity that hides what's happening behind the
scenes. It stomps on the Unix philosophy (again) of doing one thing and doing it well.
With systemd we have one large Swiss army knife of a tool that isn't very good at
anything in particular.

For people who want a modern distro that stays much closer to the original Unix philosophy
we have the BSDs: NetBSD, FreeBSD and OpenBSD. Another solution would be to fork an
existing Linux distro into sysvinit and systemd versions, but that is hardly ideal.
I'm not sure which Linux distros will avoid systemd in the future, as it seems many
of them are jumping on the systemd bandwagon. Slackware appears to be resisting the
systemd tide, and "Patrick has stated he intends to stick to the BSD-stylized SysVInit design."

Another solution to this problem is to do what I do, i.e. use an older Linux distro
that still uses sysvinit and upgrade it as necessary. This method isn't very popular, but
there are many retro-computing specialists who use older versions of Linux and Research
Unix. Some of the systems we use include FC1, 2.11BSD and Unix v5, v6 and v7. Of course I
do expect more forks of distros to appear in the future; there are just too many
different opinions on how things should be done in the Linux community.

Saturday, October 25, 2014

 

Using Older Software and Hardware


If we look at the history of Unix we can see that the kernel, libraries and userland programs have all increased in size as time passes. Here is a summary of the sizes of libc.a, the main library for the C language:

Unix version 2 from 1972:  31 functions and 5,242 bytes
Unix version 5 from 1974:  85 functions and 21,542 bytes
Unix version 6 from 1975:  74 functions and 22,042 bytes
(note that with v6 the math functions were moved to liba.a)
Unix version 7 from 1979: 154 functions, and 77,690 bytes

2.11BSD from 1992: 347 functions, 233,788 bytes

Slackware Linux 4.0 from 1999: libc.so.5.44 has 580,816 bytes

Vector Linux Classic from 2007: 1,425 functions and 2,979,634 bytes

Similarly we can see that basic commands like ps for process status have also increased in size:

ps for Unix v5:      2,660 bytes
ps for modern linux: 63,948 bytes (a 24-fold increase!)
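
If you want to repeat this kind of measurement on your own system, something like the following will do. Paths differ between distros, and counting defined text symbols in a static libc.a is only a rough proxy for "number of functions":

# sizes of the shared C library and of ps (adjust the paths for your distro)
ls -l /lib/libc.so.* /bin/ps
# rough function count for a static libc.a, if one is installed
nm --defined-only /usr/lib/libc.a | grep ' T ' | wc -l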

One could easily wonder, "at what point does the feature set of a given part of an operating system reach completion?" It seems we never reach that point. On the other hand we don't necessarily have to continually upgrade our operating systems. We could reach back into the past, pick an older starting point, and upgrade that distro as needed.

Over a decade ago I started running a web server based on Fedora Core 1, and it still runs to this day. There's no doubt in my mind that FC1, while leaner than most of the current distros from 2014, is still bloated with tons of functions I never use. In other words the ideal operating system, one which does exactly what I need and no more, does not exist. I will give an honourable mention to OpenBSD though.

There has been some good work done to counter the bloat. Tiny Core Linux does a reasonably good job of being lean and mean. It isn't what I would call low memory use though; it still needs at least 46 MB of RAM (compare this to Unix v5, which can run easily in 256K of RAM).

Others will argue that as the feature set of an operating system increases it is inevitable that its size will also increase. That is true, but I can't help but wonder exactly why libc.a has to be almost 3 megabytes in size. There has been work done to make a leaner libc with the MUSL project: MUSL's libc.a is a mere 1.6 megabytes, which is significantly smaller.
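
If you want to see the difference for yourself, musl ships a gcc wrapper (packaged as musl-tools on Debian-style systems, as far as I know) that lets you build the same small program against glibc and against musl and compare the results:

# build a small hello.c statically against glibc and against musl,
# then compare the resulting binary sizes
cc -static hello.c -o hello-glibc
musl-gcc -static hello.c -o hello-musl
ls -l hello-glibc hello-musl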

What else could one do to counter the ever increasing bloat of operating systems? Well, you could roll your own software distribution or even write your own operating system. You could also pick a distro which uses the MUSL version of libc, e.g. Snowflake or Sabotage.

Unfortunately even the so-called light distros seem bloated to me. The true champion of lightweight distros is Basic Linux 3.5, which is based on Slackware 3.5 from 1999 and can use Slackware 4.0 packages. It can run within 16 MB of memory, and there are still folks who use it on older hardware. Of course we can rule out Firefox, Chrome and most other modern web browsers while constrained by such a severe memory limit, although we can still use text based web browsers like lynx.

Using a truly ancient operating system like Unix v5 would seem too restrictive to all but the most hard-core PDP-11 aficionados. We can safely assume that everybody wants an operating system that supports Ethernet and TCP/IP. Also, a working PDP-11 is a rarity these days, and they aren't efficient in terms of power use.

It seems to me that bloated operating systems go hand in hand with corporations that want to sell everyone endless updates of software and hardware. Computer users need to make a judgement call on just how often they want to upgrade and how much money they want to spend. I've already reached the point of upgrade fatigue, and I'm putting most of my efforts into maintaining and repairing my existing computers. The truth of the matter is that chip fabrication pollutes the world with toxic chemicals like trichloroethylene, and the endless production of plastic isn't doing our environment any good either.

Wednesday, October 1, 2014

 

Why I don't use tablets


The world is filled with tablets and smartphones. I've resisted the temptation to jump on the bandwagon (mostly; I did buy a used Zaurus).

My main computer activities consist of running a server, programming, and researching information on the internet. Writing programs on a tablet just seems silly: there's no tactile feedback from an on-screen keyboard, and even if the tablet had a real keyboard it would be nowhere near full size. The grunt-and-point interface just doesn't cut it.

I've always wondered just how difficult it is to repair a tablet. Fortunately the engineers at ifixit.com have ranked the tablets by repairability:

Tablet Repairability

The Microsoft Surface Pro ranked the worst and the Dell XPS 10 took top ranking as the most repairable device on the list.

The advantages of using a desktop or tower computer over a tablet are numerous: no battery life to worry about, proper keyboard, repairable (replacement parts are easy to come by), large monitor, and it's just nicer to sit at a desk with a good chair. Last but not least: you're probably not going to drop your desktop computer.

Portable devices such as cameras and ereaders do have their uses, but to my mind tablets have far less utility than a standard desktop. One cannot help but wonder what the sociologist Vance Packard (author of The Waste Makers) would make of tablets. It's hard to imagine he would be in favour of them.


Friday, September 5, 2014

 

Why the Computer Experience is Often Poor


HEAT

The most common reason computers become damaged is heat. A lot of computers are not
designed well enough to deal with 100% CPU use over long periods of time. This has been
a problem since the earliest computers built with vacuum tubes, and it remains to this
day: a lot of modern laptops still overheat, and there are many examples of video cards
in desktop machines becoming damaged by excessive heat.

Intel has tried to deal with this problem with their SpeedStep technology: if the sensors
say the CPU is getting too hot, the clock frequency is scaled down to a lower value. Other solutions for overly hot laptops include cleaning out the fan vents and buying a cooling pad, of which there are active and passive types.
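
On a Linux laptop you can watch the thermal and clock-scaling behaviour yourself. The exact
sysfs paths depend on the kernel and the hardware, so treat these as typical examples rather
than universal ones:

# current CPU temperature in millidegrees Celsius (path varies by machine)
cat /sys/class/thermal/thermal_zone0/temp
# current and maximum clock frequency (in kHz) for the first core
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq
cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq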

Generally a desktop with good air flow should not overheat.

Note: I've never had a Panasonic Toughbook CF-48 overheat.

COMPONENT FAILURE

Unfortunately just about all computer components fail at some point, so it's a good idea
to watch for signs of impending failure. One can make component failure less likely by
using high quality parts such as power supplies, motherboards and hard drives. Some
external hard drives are rugged and have anti-shock capability.

The DC adapters used with a lot of routers and cable modems are cheaply made and are a
common point of failure. A cable modem can also fail partially rather than completely;
the usual symptom is increased packet loss. The easiest solution for that problem is to
swap out the cable modem entirely for another one. In the case of routers, merely swapping
out the DC adapter (or wall wart) will usually fix the problem. Be sure to match the
physical connector, polarity, voltage and amperage to the original unit.

LACK OF KNOWLEDGE OF DEVICE PROTOCOLS

It's one of the oldest tricks in the book for manufacturers of printers and scanners:
keep the protocol the device speaks a secret, thus making life difficult for software
programmers (especially FOSS ones). Often there is little or no technical documentation
to support development of free and open-source device drivers for their products.

REPAIRABILITY

As computer motherboards become more and more integrated they become less and less
repairable. In the days of the IBM PC the motherboard had no disk controllers or I/O
expansion on board, just the memory chips, the CPU, and external ports for the keyboard
and cassette deck, so it was a relatively simple matter to replace defective components.
Not so with a modern motherboard that has sound, networking, hard drive controllers, USB,
and who knows what else.

One wonders what level of repair is possible with tablets and other tiny devices which
use surface mount technology. On the other side of the coin, it is often quite easy to
repair a modern LCD HDTV: the large case ensures a reasonable distance between
components, so one can at least do replacement at the component level.

The road to unrepairable computers started with Very Large Scale Integration (VLSI)
technology in the 1970s. Before the microcomputer appeared the CPU was a collection of
separate cards; the DEC PDP-11/20, for example, was built from integrated circuit
flip-chip modules, and individual flip-chips could be repaired.

SOFTWARE CREEP

The final issue is software creep, by which I mean the continual replacement of older
software with newer software. It seems unavoidable but it's usually unnecessary. How
often does one really need to update a word processor? I'm using AbiWord on Linux and it
seems adequate for my purposes. KDE versions after 3.5.10 do not interest me; in fact
I've switched to the lighter IceWM and I'm quite happy with it.

CONCLUSIONS

In conclusion we can clearly see why the computer experience is often poor: mostly it
comes down to poor design and a lack of knowledge of computer maintenance. It's not the
user's fault, as the manufacturers want unknowledgeable users and unrepairable devices so
they can keep selling you a new computer or television or $ELECTRONIC_DEVICE every 4 years
or so. It's bad for your pocketbook and it's bad for the environment. All that ewaste has
to go somewhere, and somewhere is usually a landfill site.

The solutions to these problems are somewhat unclear, but I favour using older, more
repairable computers as part of the answer. I've seen Panasonic Toughbooks last over ten
years, and there's no reason why one couldn't keep them working for at least another decade.
