Sunday, November 8, 2015
Fedora Core 1 Computer Reaches 1 Year Uptime
Well it finally happened, my old Fedora Core 1 server has reached 1 year of uptime and counting.
The server was built in 1998 and Fedora Core 1 was installed on May 12th 2004. I wish I could say that I always ran Linux or BSD on this box but the truth is it was originally a Windows 95 box and later on a Win2K box. One reason the earlier uptimes weren't longer was utility power failures. Currently the server has a decent APC ES 725 UPS connected via USB cable, but this will be upgraded in the near future.
This computer has turned into somewhat of an experiment in longevity. The question is how long can I keep it going? Certainly 20 years does not seem out of reach. The ATX power supply has been replaced twice. I can't remember exactly how many hard drives there have been, but the drive was replaced at least once. I have placed an insulating mat under the tower and I think this has helped improve the computer's reliability.
This particular install of FC1 has seen many an upgrade, including a KDE upgrade to version 3.4.2. It has been without a doubt the longest lasting work computer (17 years) and longest continuously used distro (11 years). Also I can say that there have been no problems at all using the old SysV Init.
The specifications of the computer are now quite old but still useful: it has a Tyan Tiger 100 S1832DL motherboard with dual P3 550 MHz CPUs. On the other hand, every P4 computer I've owned has developed some problem, I suspect due to excess heat.
I'm sure Vance Packard and Ralph Nader would approve of this computer. I wrote previously about my old Dell reaching 1 year uptime running OpenBSD back in 2013.
Saturday, October 24, 2015
The Web is Gummed Up
This is a sad story to write, but it's been percolating in the back of my mind for months if not years: The World Wide Web is gummed up with crap. This realization came into sharp focus today when I visited some media sites like cbc.ca and my CPU utilization went up to 100% and stayed there. Exactly why firefox was using so much CPU was a bit of a mystery. I had autoplay in firefox turned off, and there didn't appear to be any reason why the CPU should be maxed out.
Looking at my processes I could see that firefox was using about 65% of the cpu, and X was using the other 35%. There was a banner ad at the top of the screen, and a few other ads were also present. All the images appeared static so there was no apparent reason why the CPU needed to be running at 100%. Once I exited firefox my CPU use went down to 1.5% to 3%, so there was no doubt that firefox was to blame.
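For anyone who wants to reproduce this kind of check, the procps version of ps on Linux can sort processes by CPU use directly (the column selection here is just my preference):

```shell
# Show the ten processes using the most CPU, highest first.
# Note: --sort requires procps ps; busybox ps does not support it.
ps -eo pid,pcpu,comm --sort=-pcpu | head -n 10
```

Running top interactively and pressing P gives the same ordering.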
It's not unusual for firefox to hit 100% CPU usage for brief periods of time even on web pages I've designed myself but the CPU use always goes back down to around 3% after a brief period of time. This seems normal to me. Using the CBC web site made the CPU hit 100% and stay there indefinitely, or at least for as long as I felt like waiting. As an experiment I tried the dillo web browser on CBC's site, and the CPU level was about 3% although things were not rendered as nicely (it is very doubtful there are many web designers who test their sites using dillo).
Media web sites seem to be the most problematic when it comes to firefox's use of one's CPU. In fact I've seen web sites that not only use 100% of the CPU, but also become completely unresponsive. Clearly there are design practices which are making the situation worse. When I view a web page's source sometimes it looks like an indecipherable snarl of code. To say that the code is overly elaborate would be a huge understatement. One might even say there's an amount of deliberate obfuscation going on.
On the other hand some sites don't use a lot of CPU, blogger sites and gmail are two examples. Fortunately Google has offered a "Basic HTML" mode for older computers. It would certainly be nice if media sites also had a Basic HTML mode to fall back on. No doubt SSL is partly to blame for the increasing slowness of the web, but what can one do about these other sites? Surely running certain computers at 100% CPU for long periods will make them overheat or, in a worst case scenario, damage themselves.
Possible solutions include using Dillo or a text based browser. While this is not ideal it seems more palatable to me now that certain media sites are so slow. It doesn't seem to be a problem on my chromebook, so that is another possible solution. Tumblr and Facebook seem ridiculously slow, although I am probably aggravating things by using older P3 systems. I have a kill script at the ready to clobber any sites which paralyze firefox.
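The kill script itself isn't reproduced in this post; a minimal sketch of the idea (the five-second grace period and the default process name are my own choices) might look like this:

```shell
#!/bin/sh
# Politely ask firefox to exit, then force-kill anything that survives.
NAME="${1:-firefox}"
if pkill -TERM "$NAME"; then
    sleep 5
    # SIGKILL cannot be caught or ignored; last resort only
    pkill -KILL "$NAME" 2>/dev/null
    echo "clobbered $NAME"
else
    echo "no $NAME process running"
fi
```

Sending SIGTERM first gives the browser a chance to save its session before the hard kill.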
In any case my suggestion to web site designers is to have a Basic HTML mode for your sites. It's only fair to your users with older computers. For the rest of us we can at least turn off auto-play for videos and consider using alternative web browsers. The problem doesn't seem limited to older computers as I've seen firefox hit 100% CPU on faster systems as well. Videos should not be auto-played. One can imagine how frustrated an older computer user would be if an HD video automatically started up (paypal I am looking at you!).
Saturday, October 17, 2015
Internet Bias Against Older Computers
In the march towards greater security there is a downside that affects older computers and older software. Older web browsers that support only older versions of SSL are often locked out of certain web sites. Naturally web browsers that don't support SSL at all won't work either.
Recently I tried to access forums.freebsd.org and osdisc.com and always got the message "The connection was interrupted" in firefox 16.0.2, the newest version which would run on an older version of Vector Linux. At first I tried to disable IPV6 within firefox but that made no difference. Then I wondered 'could the version of SSL supported in firefox be too old?' so I tried again using Q4OS with iceweasel 38.2.0 and it worked.
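To check a guess like this, openssl's s_client can probe a server one protocol version at a time. The host below is just an example, and some openssl builds omit the older protocol flags, so treat this as a sketch:

```shell
#!/bin/sh
# Try successively newer TLS versions against a server.
# firefox 16 only speaks up to TLS 1.0, so a site that rejects the
# tls1 probe below would lock that browser out.
host="forums.freebsd.org"
results=""
for proto in tls1 tls1_1 tls1_2; do
    if echo | openssl s_client -connect "$host:443" "-$proto" >/dev/null 2>&1; then
        results="$results $proto:accepted"
    else
        results="$results $proto:rejected"
    fi
done
echo "Results:$results"
```

A "rejected" result for every legacy protocol is effectively the "connection was interrupted" message seen in the old browser.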
To my thinking the extra security for web sites is rather nullified when it locks out so many systems. Even my online banking worked on firefox 16.0.2, and surely the freebsd forums are not more important than that. Web site developers need to be aware that by locking out older computers they are reducing the utility of their sites.
Older computer users face another, more serious problem which can't be fixed by newer versions of software: as encryption becomes more elaborate it requires more and more computing power to use. To put it another way, an older computer like a MicroVAX or an Amiga running NetBSD would run the security layer so slowly as to make it unusable.
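openssl's built-in benchmark gives a rough feel for this cost; the same command run on a modern machine and on a vintage one makes the gap obvious (aes-128-cbc is just one representative cipher):

```shell
# Measure AES-128-CBC throughput for one second per block size.
# The final table reports bytes encrypted per second; on very old
# hardware these figures drop by orders of magnitude.
openssl speed -seconds 1 aes-128-cbc
```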
Informational web sites or forums should not lock out older computers. I don't see the necessity for using new versions of SSL on such sites. With online banking obviously there is no argument, the strongest security should be implemented, but for sites like wikipedia or forums I see no need for https.
Wednesday, October 14, 2015
CBC Changes Radio Streaming Link Yet Again
I still like to listen to CBC radio on the internet on occasion and once again I see they have changed the link for CBC Toronto Radio.
Currently I am using this script for CBC Toronto Radio:
One wonders why CBC changes their links so often.
Friday, August 14, 2015
A Brief Review of Q4OS Linux
I'm ready to make some provisional comments about Q4OS and Trinity.
Q4OS Linux is based on Debian Wheezy so it is similar to AntiX in this regard. If you like KDE 3.5.10 then you'll like Trinity, the desktop environment Q4OS uses.
Basically package names for q4os trinity are of the form:
kooka-trinity, ksnapshot-trinity, kolourpaint-trinity etc.
So in q4os you would basically do 'apt-get install kdestuff-trinity'.
Q4OS Linux Desktop
It fixes a lot of the old KDE 3.5.10 bugs. On the down side, the GUI eats about 8% of the CPU when idling and 30% while kdm_greet is running, because Q4OS's version of kdm_greet draws an analog clock with a second hand. Q4OS also includes something called libsystemd-login0, which I assume comes from Debian Wheezy and which isn't removable without also removing Trinity.
Both AntiX and q4os use init and I don't see any processes that use systemd.
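A quick way to verify this on any of these systems (standard procps tools, nothing distro-specific):

```shell
# Show what is actually running as PID 1 (init on AntiX and Q4OS)
ps -p 1 -o comm=
# List any processes with "systemd" in their name, if present
ps -e -o comm= | grep -i systemd || echo "no systemd processes found"
```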
A basic install uses about 2 gigs of hard drive space. Q4OS doesn't load many programs from the initial install. The Q4OS install disk is one CD with about 300 megs worth of files.
Early conclusion: if you want to have an up-to-date system running a desktop similar to KDE 3.5.10 then Q4OS Linux with Trinity isn't a bad choice.
Saturday, July 25, 2015
Not Learning Unix is a Mistake
Back in 1986 my first exposure to Unix was Venix, a Unix variant made by VenturCom, running on an IBM XT.
It wasn't a great experience but I learned about awk (we called it awkward at the time) and most of the basic Unix commands. Venix for the IBM XT was based on Unix version 7 with some programs from BSD such as vi. I still run Venix 2.0 on xhomer on Linux via the magic of emulation. In this case xhomer is emulating a DECPro 350. There was almost zero source code for Venix.
I realized that closed source software was very undesirable from a programmer's point of view after I discovered Unix version 7 (much much later) on the PDP-11/45 also via the magic of emulation. Unfortunately my initial exposure to Unix consisted of various closed source variants of Unix such as Venix and (in the SCO era) Xenix. Xenix was actually the second version of Unix I encountered and although I learned about Unix System V there was still mostly no source code available.
There was little enjoyment to be had with these closed source Unix versions, but learning elements of Unix System V ended up being important as Linux largely modelled itself on SysV. For cost reasons, personal computers with an MMU capable of fully running Unix didn't appear until later. MicroVAXes and Sun workstations were well over $10K in price. It wasn't until the appearance of the Intel 80386 that we started to see affordable Unix machines. By the late 1980s most folks could afford such a machine.
It has occurred to me that not learning Unix is a grave mistake. My relatively early exposure to Unix was important. I may not have appreciated Linux as much or even at all if I hadn't had that ability to experiment at home with Xenix. Learning about Unix develops new mental muscles like playing a musical instrument or learning a new language. But learning these new processes becomes more difficult with age. To me the exact technical details are less important. It does not really matter if you are a Linux user or if you use one of the BSDs or even something more exotic like Plan 9. The important thing is you can learn new concepts from what I will broadly refer to as the Unix/Internet Community.
One way to metaphorically dip your big toe into the Unix Pool is to set up Linux (or some other FOSS operating system) on a second computer. The secondary computer can be an old Pentium 3 machine with 256 megs of ram, a new machine is not necessary. The major advantage to setting up a Linux computer is that you have access to tons of source code and tons of application software. The process of learning Unix is similar to physical exercise only in this case your brain is getting the exercise. Although this process will require considerable effort on your part it will be worth it.
My first exposure to Unix wasn't ideal, in fact I could even say that I disliked Unix initially. Part of that was due to the lack of source code. The big realization of the benefits of Unix came later: Unix is the thinking person's operating system and you will find many scientific utilities available. If your interests include physics, mathematics, chemistry, or astronomy then you will find something of interest.
After using Xenix on my computer at home in the late 1980s I made several attempts to use other operating systems. In 1994 I tried Darkstar Linux but I didn't make it my main operating system. Around the same time I tried Minix 1.5 but it seemed quite difficult to use. It wasn't until I had broadband internet access that I finally made the transition to Linux in 2003 by using Red Hat Linux version 8 as my main operating system.
Monday, March 9, 2015
Easy Way to Get Coreboot
Replacing the proprietary BIOS firmware on most computers is a process that can often be frustrating. It's possible that your computer could be rendered unusable in the process. Back in 2010 I managed to get coreboot working on the Gigabyte GA-6BCX motherboard and although the process went fairly smoothly it did consume a fair bit of time. Fortunately we now have an inexpensive way of obtaining a ready-to-go coreboot computer.
There are sites that now offer Thinkpad X60, T60 and X200 series laptops with coreboot installed, although it seems that they are often out of stock. The simplest alternative to obtain coreboot is to buy a chromebook or a chromebox which are readily available from most computer stores.
If you wish to replace ChromeOS with a Linux distribution you should read the info here.
I had an occasion to use a Chromebook and I thought the keyboard was rather cheap so I recommend getting a Chromebox and supplying it with a good quality keyboard. The ASUS CHROMEBOX-M115U has an Intel Celeron 2955U Processor, 2GB DDR3 memory, 16G SSD for storage and Intel HD Graphics and sells for as low as $200 Canadian.
There is also a Coreboot on Chromebooks community here.
Saturday, February 21, 2015
AntiX Linux: A Brief Review
Certain factors like systemd are polarizing the Linux community. It seems that either you like it or you hate it. Some of the Debian developers are getting nervous and so a fork of Debian called Devuan has been announced.
I'm always looking at other distros that emphasize compactness and the ability to run on old hardware. I was also intrigued by the Debian controversy with systemd so when I saw AntiX 13.2 was based on Debian Wheezy I had to give it a try. AntiX comes on a single CD so installing it was easy enough.
Now the hardware used to test AntiX was an old IBM R30 Thinkpad with 1 gigabyte of memory. It ran fairly well on this hardware, good enough to run fbreader and iceweasel. It had a lot of good attributes like fast boot speed, and it used icewm which I was already familiar with (fluxbox and JWM were also available). It also comes with libreoffice and some multimedia software. It has a fairly modern kernel: 3.7.10.
As for the systemd issue, yes Debian Wheezy has systemd on it, so this version of AntiX has it as well. All you need to do to remove it is:
apt-get remove systemd
What could be simpler? Things get a bit murkier with future releases like Jessie which will have systemd on it by default.
I don't want to get into empty histrionics about systemd. I don't like it because I feel it's too big a departure from Unix principles. People are worried that systemd is "taking over" but so far that's not really the case. If you don't like systemd try AntiX, Slackware or one of its derivatives, or if you like a more traditional Unix try one of the BSDs.
Tuesday, January 20, 2015
The time has come to talk about a different kind of pollution. I'm not talking about the pollution that is created by manufacturing computers. This time I'm talking about pollution of the mind.
Mind pollution has permeated the internet. Computer articles are mostly written from the point of view of a person who uses Microsoft products. The assumption is that everyone uses Microsoft Windows and Windows software. To be fair there are a few OSX and Linux articles. Heaven help you if you use something really unusual like Plan 9 or HaikuOS. I can only assume those folks have a lot of patience and a strong will.
When I was in high school Microsoft Windows did not exist and we just used what was available at the time, which was a Commodore PET. For the folks too young to remember, a Commodore PET was an 8-bit computer with a 6502 CPU and 32 kilobytes of memory. This was in the early 1980s. So my perspective is different. Moving from one computer ecosystem to another is relatively easy since I've done it so often. I naturally think in more abstract terms such as "software presentation program" rather than thinking of "PowerPoint".
Back then no one would have thought of using a specific software package for word-processing. You either used a dedicated machine that only did word processing or you went to the computer store to find some software that would run on the computer you owned. Since everything was so new no one had any preconceptions about which software package was the "best". One usually went along with what their schools used, e.g. if your school happened to use a Commodore computer you probably would have bought a Commodore computer.
Even in the mid-90s when people were running to the mall to buy Windows 95 the computer articles on the net didn't have the overbearing assumption that you owned a Windows box and now you type "format c: /s" or "ipconfig" (it was ifconfig in the Unix world since 1982). Or my personal favourite "Get your friends off XP" which initially sounds good until you realize the article is about installing Windows 8.1 on your computer.
That's not to say there aren't articles about Linux and BSD, there's lots of them. And even with articles about Firefox there's enough similarities between the Windows and Linux versions that one can often extract useful information from them. I suppose one could still complain about the assumption that everyone uses Firefox.
It's really all about mind share. You might think "Microsoft doesn't control me" but what about the computing masses? What about the schools and the governments and the banks? We should have platform independence and people should be able to do their online activities with the platform of their choice. The media should NOT assume that all their viewers use Microsoft software because that is certainly not the case.
Mostly I can use my Linux and BSD machines to do everything I want to do. There are a few edge cases where it takes quite a lot of effort to do certain things, but for the most part there are no serious problems. Walking into a computer store reaffirms that you're in the Windows World. There are Microsoft stores now but I haven't bothered checking them out as there's just nothing there for me.
I'd like to think that with the internet we have bypassed rigid and inflexible thinking as people are free to tap into any part of the web. Still, it behooves journalists and teachers to write about computer topics in a platform independent way to the greatest possible degree. If one must write about things which are platform dependent they should at least make it immediately clear that their article is platform dependent in the article's title. It is the general concepts and the broad interchange of information that is the most important thing.
Friday, January 2, 2015
BSD Community is Too Insular
First of all let me say I really like BSD. I enjoy studying its history, which extends back to 1978 when it was a mere add-on to Bell Labs Unix version 6. The longest uptime I've ever had on a computer was with OpenBSD. It's a fine piece of work.
On the other hand when I look at the BSD community I see a less than friendly environment. It is rather like a gated community where you need to be invited in. Often when one goes to BSD forums one gets some mysterious error message and no access. IRC channels related to BSD are also invite only.
When I re-entered the Linux community in 2003 there was a strong feeling I was less than welcome. As an old Xenix user I thought I would be treated well enough; instead I was barely tolerated. Eventually I felt I was accepted, at least to a certain degree (I also developed a tougher skin). Some of the rudeness seemed to be part and parcel of being online. Manners on the internet have never been very good, but lately things have really deteriorated.
In the case of the BSD community things really do need some improvement. While in the Linux world it's true that people are treated with faint rudeness and a certain amount of condescension, it's still preferable to being utterly locked out of the BSD community. One wonders what rites of initiation are necessary to be accepted into the BSD world and if it's worth the trouble.
I consider the various BSDs to be closer to the original Bell Labs Unix. In some ways I prefer the BSD way of doing things. It's good that we have a varied ecosystem of software. Nevertheless the BSD community really needs to try to become more welcoming to newcomers. This is also true of the Linux community to a lesser degree.
The original programmers of Bell Labs Unix were undoubtedly writing an operating system for other programmers. Programmers are not the friendliest group of people in the world and sometimes this can be very discouraging for young people who are getting used to Unix type operating systems for the first time. An extra effort needs to be made for the BSD and Linux communities to be more welcoming and less insular.