[Mon Aug 30 21:23:29 CDT 2004]

If earlier today I wrote about the announcement of a delay in the release of Longhorn, now I must comment on a couple of articles by Frank Hayes I came across last week while reading Computerworld. In Lite a Fire and The Power of No, Hayes tells us about the bright side of having introduced some level of competition into the technology market thanks to the acceptance of Linux. While some still dismiss Linux as hype, the reality is that at the very least it is helping us free ourselves, if only partially and temporarily, from the monopolistic practices that Redmond shoved down our throats during the 1990s. It is now in our hands to make sure that things don't revert to the way they were just a few years ago for, believe me, as soon as Gates and Ballmer feel they no longer have to fear the penguin, they will make us pay. It's not a Microsoft-free world I'm calling for, just a world where at least we have a choice. {link to this story}

[Mon Aug 30 14:16:12 CDT 2004]

Yesterday, we read that Bill Gates announced a delayed release date for Longhorn (oh, surprise!), which is now expected sometime in 2006. Even worse, it was also announced that the much-hyped WinFS would not be present in the final product either (again, oh, surprise!). If there is something that doesn't change much over the years, it is Microsoft's ability to keep a straight face while marketing vaporware, using it as a key argument for why customers should not switch to another platform... and then carefully admitting in public that, well, the product is not ready for release and will be delayed. Today, Bill Gates explains in an interview that:

WinFS, I'd be the first to say, is very ambitious. Nobody has ever brought together the world of documents, media and structured information in giving you one simple set of verbs that lets you richly find, move around and replicate those things.
Agreed. It's a challenge. Sure, it's not easy to implement. However, that didn't stop him from hyping it for the last couple of years to keep anyone from migrating to Linux, right? I'll dare to make another guess too: whenever it's finally released, WinFS will be buggy as hell, will have a few serious security holes, and will take a while before it truly stabilizes and can be used in production. No matter how many times this happens, we never learn. Oh, well. {link to this story}

[Fri Aug 27 12:00:37 CDT 2004]

If you listen to radio on the Internet and use Linux on your main desktop system, make sure you download Real Player 10 for Linux. The previous version of Real Player worked just fine, but this one integrates into the desktop much better (or at least it does in the case of GNOME, which is my preferred desktop environment). It definitely provides a better user experience, and not the half-baked product we had to use before. {link to this story}

[Thu Aug 26 21:10:30 CDT 2004]

Red Hat is so central to Linux that, so many months (or has it been longer?) after they announced they'd discontinue their low-end distribution, people are still writing about it. Brian Jones writes SysAdmin to SysAdmin: The Red Hat end-of-life debacle. Nothing big, except for the fact that Jones tells us about some of the RHEL-like alternatives sysadmins have: mainly CentOS and White Box Linux. I had never heard of CentOS, but it does appear to be at least as nicely put together as White Box Linux. Nevertheless, we should all bear in mind that, no matter what, these solutions are not backed by a corporate vendor and there is no formal commitment to releasing updates on time. One of the readers who posted a comment on Jones's article explained that White Box's updates tend to be delayed by about a month, and CentOS's by about two weeks. You must weigh whether this is an acceptable price to pay, especially taking into account that there are no fees involved and the whole thing is put together by volunteers. Of course, one could always argue that neither Slackware nor Debian have those delays, and they also depend on contributions by volunteers. Once again, it's your choice, which is the nice thing about open source.

On a related note, Desktop Linux publishes an article on how to build enterprise Linux starting with regular Red Hat 7.3. I see a few mistakes in the article, such as the assertion that RHEL 3 is based on Red Hat Linux 7.3, which is not the case: RHAS 2.1 was indeed based on Red Hat Linux 7.3, but RHEL 3 is based on Red Hat Linux 9. And when Red Hat finally releases RHEL 4 sometime in 2005, it will be based on Fedora. Nevertheless, the article contains a few good pointers on how to build your own "enterprise solution". It is interesting as a curiosity, but I simply cannot see why I should bother investing so much time in it when there are other alternative Linux distributions out there. {link to this story}

[Mon Aug 23 21:52:55 CDT 2004]

I recently came across Joel on Software, one of those blogs that is really well worth the time to read. The particular article I read was How Microsoft Lost the API War, which I found quite insightful, original and free of the clichés one tends to find in most articles about Microsoft and its business and technology strategies. Bear with me, because Joel's article is quite long, but I will try to come up with a quick overview that stresses its main points.

Joel starts by stating that "Microsoft's crown jewel, the Windows API, is lost". He doesn't consider the OS itself (not even MS Office, as many people do), Microsoft's most important product, but rather their Windows APIs. How come? It all goes back to that mantra that Steve Ballmer repeats at every major Microsoft event: "developers, developers, developers!". But what for? Why are the developers so important to Microsoft? Well, who writes the applications? If Bill Gates understood something really well back at the dawn of the PC era, it was that he could benefit from building a nice level of standardization (his own APIs) on top of an already standardized hardware platform (IBM took care of that). So, instead of jealously protecting their APIs from the outside world like others did (mainly Apple, both on the hardware and the software front), Gates chose to release them to the public. Even better, he made the developers the centerpiece of his company's line of products. End-user applications were certainly nice, but his interest was in convincing the programmers to move all their development to his MS-DOS/Windows platform, and boy was he right! As Joel points out,

It's so important for Microsoft that the only reason they don't outright give away development tools for Windows is because they don't want to inadvertently cut off the oxygen to competitive development tools vendors (well, those that are left) because having a variety of development tools available for their platform makes it that much more attractive to developers. But they really want to give away the development tools.
Unlike Steve Jobs, whose greed and anal obsession with complete control led him down the path of proprietary everything, Bill Gates realized pretty soon that in order to win big he had to let go a little bit. The direct consequence of this approach to computing was what Joel labels the Raymond Chen camp within Microsoft, whose main belief is that no matter what Microsoft does, no matter how much innovation it wishes to introduce in new products, it always has to make sure they are backwards compatible with the previous APIs. Why? Well, it should be clear by now: it's the only way to keep developers happy and guarantee that they won't have many headaches implementing and deploying their new solutions. And did that change? You bet it did! It changed in the last couple of years or so. As Joel explains,
The old Microsoft, the Microsoft of Raymond Chen, might have implemented things like Avalon, the new graphics system, as a series of DLLs that can run on any version of Windows and which could be bundled with applications that need them. There's no technical reason not to do this. But Microsoft needs to give you a reason to buy Longhorn, and what they're trying to pull off is a sea change, similar to the sea change that occurred when Windows replaced DOS. The trouble is that Longhorn is not a very big advance over Windows XP; not nearly as big as Windows was over DOS. It probably won't be compelling enough to get people to buy all new computers and applications like they did for Windows. Well, maybe it will, Microsoft certainly needs it to be, but what I've seen so far is not very convincing.
Yes, .Net is pretty good. As Joel admits,
A lot of us thought in the 1990s that the big battle would be between procedural and object oriented programming, and we thought that object oriented programming would provide a big boost in programmer productivity. I thought that, too. Some people still think that. It turns out we were wrong. Object oriented programming is handy dandy, but it's not really the productivity booster that was promised. The real significant productivity advance we've had in programming has been from languages which manage memory for you automatically. It can be with reference counting or garbage collection; it can be Java, Lisp, Visual Basic (even 1.0), Smalltalk, or any of a number of scripting languages. If your programming language allows you to grab a chunk of memory without thinking about how it's going to be released when you're done with it, you're using a managed-memory language, and you are going to be much more efficient than someone using a language in which you have to explicitly manage memory.
However, the fact that Microsoft is choosing to break all previous APIs could be their downfall in the end, no matter how great the new platform is. It is, after all, the same reason that brought about Microsoft's success (the ease of development and, above all, deployment) that could cause them trouble now that what Joel labels the MSDN Magazine camp appears to have won the internal war within the company.
... if you're developing a Windows GUI app today using Microsoft's "official" latest-and-greatest Windows programming environment, WinForms, you're going to have to start over again in two years to support Longhorn and Avalon. Which explains why WinForms is completely stillborn. Hope you haven't invested too much in it. (...) So you've got the Windows API, you've got VB, and now you've got .NET, in several language flavors, and don't get too attached to any of that, because we're making Avalon, you see, which will only run on the newest Microsoft operating system, which nobody will have for a loooong time. (...) No developer with a day job has time to keep up with all the new development tools coming out of Redmond, if only because there are too many dang employees at Microsoft making development tools!
And it is precisely in this context that more and more companies are starting to rely on the web to deploy their applications. Needless to say, the arrival of web services, to which Microsoft itself has contributed so much, is one of these developments that could mean trouble to Bill Gates.
The new API is HTML, and the new winners in the application development marketplace will be the people who can make HTML sing.
In other words, the new war is not so much Windows vs. Linux as Microsoft APIs vs. the Web, and believe it or not, for the first time in many years, the playing field may be more or less levelled. It is far from clear that Microsoft and .Net will end up imposing their standards over everything else, mainly because nobody (especially the governments) seems to be happy with a future where our complete technology infrastructure depends on one sole vendor. {link to this story}

[Sat Aug 21 22:42:26 CDT 2004]

I know this may sound strange, but at this pace software patents and the struggle to protect intellectual property may simply destroy innovation altogether. Mind you, I've never been one of those FSF dogmatists, but way too many corporations are taking this too far, and the system is simply broken. Just a few days ago, we read that Bill Gates promised an "increased, intense focus" on protecting intellectual property in the software business, calling for a dramatic rise in the number of patents filed by his company. Tonight, Justin Mason (of SpamAssassin) publishes in his blog a link to US Patent 6,775,781, where Microsoft is attempting to patent something akin to good old sudo. Needless to say, sudo has been around on Unix systems since about 1980, although I suppose Mr. Gates couldn't care less about that. Now, some people keep predicting Armageddon if we don't protect intellectual property, but to me this sounds like corporate piracy of the first magnitude, albeit in a very clever legal disguise. Is this the return of the robber barons? {link to this story}
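For anyone who has never used it, the functionality at stake is this simple and this old (the commands below are merely illustrative; the exact behavior depends on the local sudoers configuration):

```shell
# The decades-old idiom the patent appears to cover: run a single command
# with another user's privileges, without ever logging in as that user.
sudo vi /etc/hosts        # authenticate as yourself, edit the file as root
sudo -u postgres psql     # or act as any other user sudoers allows
```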

[Sat Aug 21 21:16:09 CDT 2004]

It sure shouldn't surprise us much by now to learn that new technologies take a while to sink in, and that, at least during a transitional period, all those who benefitted from the old status quo prefer to fight back rather than adapt to the new situation. The San Jose Mercury News reported a few days ago that the International Olympic Committee is barring athletes from writing first-hand accounts for news and other websites, mainly in an attempt to defend its own lucrative broadcast contracts. Times are a-changin', and in this case the IOC prefers to fight the new technological advances even if it could mean missing the train. Like it or not, whether you find it enlightening or an absolute waste of time, blogging is here to stay and we'd better get used to its presence. It wouldn't surprise me if, in the Olympic Games of the near future, many athletes report their experience first-hand, without any sort of media conglomerate mediating between them and us. We will have to wait for that day, though. {link to this story}

[Sat Aug 21 15:47:32 CDT 2004]

Lest anybody fool himself, Darl McBride clarifies in Information Week what he is after. He is quoted as saying:

If we lose in court, then Linux is at that point a runaway train, and we never will chase it down.
Nice to know that Mr. McBride worries so much about innovation and fair competition. I have no doubt he feels confident his company can compete on a level playing field if given a chance, hence his efforts to resolve this in court once and for all. I find it amazing that he is still able to say these things with a straight face. I suppose it's part of the job, after all. {link to this story}

[Sat Aug 21 15:13:59 CDT 2004]

Oh, the irony! Last night, I was reading a piece published by Linux Weekly News on the value of Linux distributors as a sort of middleman that adds value not only by collecting software from the open source community, bundling it all together in a nice CD image and writing idiot-proof installers, but also by patching certain applications to add functionality that users request but that the maintainer is not willing to include or support (the author of the article uses cdrtools as an example). Today, while reading the latest issue of Debian Weekly News, I came across an entry in the weblog of José Carlos García Sogo noting that Jörg Schilling, the maintainer of cdrecord, has added a non-modification clause to a file, which renders the package non-free. Here is the scoop: Schilling is unwilling to make a couple of modifications to cdrecord that would make it easier to address devices without using the old SCSI nomenclature (in other words, using something like dev=/dev/cdwriter instead of dev=0,2,0) and to support DVD writing. However, there are quite a few users out there demanding these features (or so we are told; I honestly have no bullet-proof way to find out). So SuSE, among other Linux distributors, has included such functionality in the form of patches to the original source code, and now Schilling complains that too many users are contacting him to fix problems that he didn't introduce in the code in the first place. While the LWN editor congratulated himself on the fact that the GPL allows for exactly that flexibility, Schilling was already hard at work figuring out how to change the license to fight back. In the meantime we, the users, have little choice but to sit back and see what happens next. Well, at least there is always the choice to simply fork the application and be done with it.
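For what it's worth, the difference between the two addressing schemes is easy to show (the device triple is the one from the example above; speed and image name are placeholders):

```shell
# Classic cdrecord: first discover the drive's SCSI bus,target,lun triple...
cdrecord -scanbus
# ...then burn using that triple:
cdrecord dev=0,2,0 speed=8 image.iso

# With the distributors' patches, the device node can be named directly:
cdrecord dev=/dev/cdwriter speed=8 image.iso
```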
By the way, this is one of those instances where that characteristic of the Debian community that so many people find annoying (their obsession with the legal aspects of software licensing) pays off, since it's people like them who usually pay attention to these developments and tell the rest of the community about them. {link to this story}

[Sat Aug 21 15:01:16 CDT 2004]

Brian D. Foy asks in his weblog on the O'Reilly Network what Perl developers could learn from PHP and, as one could imagine, had to deal with a good number of rants. Some of the responses are well thought out, though. For instance, Peter Hickman writes:

PHP's great advantage is that it does so much out of the box. You want sessions, you've got it. It's built right in. With Perl instead you have many and varied solutions and that can be the problem. For starters, you will have to install extra software no matter what, not with PHP, and that assumes that you can make a sensible choice from the various options. Then you have to configure it. Yes, the various Perl solutions to Apache sessions give you more flexibility and scale better but dammit I want sessions and I want them now.

Yep, been there myself. Not to mention the problems of installing this or that Perl module in order to get an application going. Yes, it is flexible. Yes, there are thousands of modules. Still, it is a pain to deal with, especially when one runs into dependency problems or needs to install directly from the CPAN repository, bypassing one's own distro's package management system. Call me a whiner, but it does make the system more difficult to administer. As somebody else points out, it is also "easier to code simple stuff" with PHP. I still like Perl better for some reason, but I deeply dislike those who write off PHP as child's play. It is not. It's a serious tool, at least for one particular niche: the generation of dynamic HTML. {link to this story}
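The administrative pain I am referring to looks roughly like this (Apache::Session is just one of the several modules people use for the sessions Hickman mentions):

```shell
# Installing a Perl module straight from CPAN, bypassing apt/rpm entirely:
perl -MCPAN -e 'install Apache::Session'
# CPAN may recursively build half a dozen dependencies, none of which will
# be visible to the distribution's package manager afterwards -- which is
# exactly what makes the system harder to administer.
```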

[Sat Aug 21 14:04:29 CDT 2004]

Of all the free trade magazines out there, SD Times is one of the few I find useful for project managers (Software Development is also quite useful for developers, and a few others such as eWeek or Information Week are good for keeping in touch with all that is going on in this field from a business perspective). In any case, today I came across 21 Rules of Thumb for Delivering Great Products on Time, by Jim McCarthy, and even though I found most of it way too abstract to be applied directly (please don't misinterpret what I'm saying: McCarthy does discuss a few general rules of thumb that are indeed useful), his reference to "the triangle" caught my attention:

There are fundamentally three things that you are working with as a member of a development team: resources (people, money and things that cost money), product features (including the quality of their implementation) and time (primarily as expressed in a schedule). Changing one has an impact on at least one other axis, usually two.
Sure, it's nothing earth-shattering, but I have still seen lots of projects go south because management overlooked precisely this triangle. Even better, it doesn't apply only to development projects. {link to this story}

[Sat Aug 21 13:46:39 CDT 2004]

Let's be honest: even the most rabid Emacs users have to use Vim every now and then. Like it or not, it is still the default editing tool one can count on whenever the Linux/Unix system one administers is down in single-user mode and some configuration file needs to be edited. Aside from that, I must acknowledge it is without any doubt my favorite editor. I know, I know, it is full of cryptic key combinations, and its modal approach to text editing feels less than natural to many people. However, once you get used to at least the most basic functionality (I definitely don't claim to be an advanced user by any means), it also has lots of advantages: it is simple and lightweight, it is installed by default on most Linux/Unix systems, there are versions for MacOS and Windows too, it does syntax highlighting, its search-and-replace must be about the fastest and most flexible around, one can run it remotely without any problem... do I need to continue? In conclusion, I'm not going to be naive enough to advocate the use of vim for regular end users, but I'd recommend it without any doubt to system administrators and developers. I write all this because yesterday I came across a great article on certain advanced functions of vim that can help you get the most out of your favorite editor, including things such as mapping shortcut keys, customizing the preferences for each file type and using shell commands from within the editor. Finally, if you like what you read, head for the official Vim User Manual on their website. {link to this story}

[Tue Aug 17 12:24:56 CDT 2004]

Ever had a system crash in Linux and needed to interpret a kernel oops? I bet your first thought was that the oops-tracing.txt file included with the kernel source is no help. Now there is a better document telling you how to interpret those cryptic messages. Denis Vlasenko has written an excellent How To Locate An Oops document, sent to the lkml list, explaining step by step how to read the oops, track it down to the proper source file, disassemble the code and match the assembly back up with the precise line of C code. Granted, it is only a worked example. Hopefully, someone some day will write a more comprehensive article on how to troubleshoot Linux kernel problems, but for the time being Vlasenko's text is just great. {link to this story}
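The disassemble-and-match step usually boils down to something like the following (the function name, offset and driver path are made up for illustration; it assumes a kernel built with debugging symbols):

```shell
# Suppose the oops reports:  EIP is at some_driver_fn+0x42/0x1a0
# With a vmlinux built with CONFIG_DEBUG_INFO, gdb maps the offset
# straight to a source line:
#   gdb -q vmlinux
#   (gdb) list *(some_driver_fn+0x42)
# Without debug info, disassemble the object file with the C source
# interleaved and walk down to the offending offset by hand:
objdump -dS drivers/char/some_driver.o | less
```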

[Thu Aug 12 22:05:48 CDT 2004]

It is amazing how fast times change! I just came across an old article published by Wired the day after Netscape released its browser's source code back in March 1998, and there is a little comment about Microsoft's reaction that certainly caught my attention:

Netscape's primary competitor, Microsoft, professed indifference. Microsoft maintains that it gives developers similar access to Internet Explorer functionality, only via self-contained, finished browser components rather than raw source code. This approach, says Microsoft, is preferred by most developers.
Just in case you do not know what I am talking about: this is the very same company that has more recently been trying to sell us the idea of shared source as the greatest invention since sliced bread. Yep, times are definitely a-changin'. {link to this story}

[Thu Aug 12 15:17:54 CDT 2004]

Arvind Narayanan publishes a thoughtful critique of port knocking. At least some of the people who commented on the article (see the postings at the bottom) do not appear to get it. The point is not that port knocking is useless. Rather, I read Arvind's article as a warning to all those who see it as the be-all and end-all of security, and who may just relax and sit back after implementing it on a public server, only to see their systems compromised soon afterwards. {link to this story}
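For context, a bare-bones Linux take on the idea can be sketched with iptables' recent match (port numbers are arbitrary), and the sketch also makes Arvind's objection obvious: the knock travels in the clear, observable by any sniffer, much like a plaintext password:

```shell
# A SYN to port 7000 silently tags the source address...
iptables -A INPUT -p tcp --dport 7000 -m recent --name KNOCKED --set -j DROP
# ...and only recently tagged addresses may then open an SSH connection:
iptables -A INPUT -p tcp --dport 22 -m recent --name KNOCKED \
         --rcheck --seconds 30 -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j DROP
```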

[Thu Aug 12 13:13:46 CDT 2004]

James Gosling has published an old paper about what he would do if he were to design a window system today, which makes for a quick and curious read. His comments about the state of the art in this field 15 or 20 years ago give a good sense of how much computing has advanced since then:

Back then, there were no shared libraries. This seems odd, looked back at from today, but back then no version of Unix had the ability to have a library like libc or OpenGL that was shared between processes. All applications had to be "statically linked". There was a primitive segment sharing facility that allowed one segment per process to be shared, that was at the beginning of the address space; but it wasn't powerful enough for this purpose.
So, what does he propose? Summing up, this is what Gosling would do:

I would make the "window system" so minimal that it is almost non-existent. Each graphical application gets direct access to the hardware, and a window is nothing more than a clipping list and an (x,y) translation. I would build a "device driver" that did nothing more than manage the clipping lists and hand out graphic device ports. This might actually be best done at user level, rather than a device driver, using shared memory and semaphores.

There are a variety of "hairy bits" that make this more complicated:

It doesn't just maintain clipping lists. It maintains the "true shape" of each window, and a stacking order. The windows clip is derived from these by subtracting from the clip for a window the shapes of all those above it. Whenever the shape or stacking order of a window is changed I would notify it via a message on the mouse/keyboard event queue. Until the application acknowledges the clip change, the old window shape has to be considered as being continuously damaged by the application.

It has to handle resource allocation within the accelerator, including texture space and rendering ports.

Keeping an eye on his blog may pay off every now and then. {link to this story}

[Thu Aug 12 12:40:56 CDT 2004]

Tom Adelstein publishes an article on How to Build a Low Cost Linux Desktop Computer that is well worth the small amount of time it takes to read. It provides some good information on cooling and quieting the processor as well as a link to Build EasyPC, an interesting site with some basic information on PC hardware. {link to this story}

[Thu Aug 12 11:49:50 CDT 2004]

Reading an interview with Subversion hacker Ben Collins I came across some pretty sensible comments comparing FreeBSD and Linux:

Well, I used to use FreeBSD as my main desktop OS, but a couple years ago I switched back to Linux. I've decided that FreeBSD is the best server OS out there. But it's not as instantly easy to get going as a desktop workstation —Linux is better at APM, cardbus support, thread support, USB HID support, WIFI GUI applets, instant CUPS support, and has tools like valgrind. All that stuff "just works" when you install Linux on a notebook; on FreeBSD, that stuff either doesn't work, or is tricky to get going (granted, this is all FreeBSD 4; I have not tried FreeBSD 5 yet).

In my former life as a Unix sysadmin —I can say that FreeBSD, as an OS, is the "tightest" distribution out there. Linux distros feel like a bunch of pieces shoved together: a kernel, a toolchain, some user space apps, and so on. FreeBSD is one coherent system, everything compilable from source in a single make world. It makes the system much easier to manage and administer... and the networking is incredibly solid. It's my first choice for a server OS, no doubt.
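The "single make world" Ben mentions is, roughly, the following ritual (this is the FreeBSD 4.x-era procedure from memory; check the FreeBSD Handbook before running any of it):

```shell
cd /usr/src
make buildworld       # rebuild the entire userland from source
make buildkernel      # rebuild the kernel to match
make installkernel
# reboot into single-user mode, then:
make installworld
mergemaster           # reconcile the configuration files under /etc
```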

Overall, I would have to agree with Ben's comments, although distributions such as Debian do not give you that bunch-of-pieces feeling he is referring to. Sure, one cannot do a make world and recompile the whole thing, but Debian still is much tighter than any other Linux distribution out there. And yes, he is right: that helps a lot when it comes to system management and administration. Nevertheless, I must also say that I find it much easier to install security patches on Debian than on FreeBSD, for example, and that is definitely a very important part of system administration. By the way, if you are interested in the world of the BSDs, make sure you read Differentiating Among BSD Distros, published just today by ServerWatch. {link to this story}

[Tue Aug 10 19:26:51 CDT 2004]

Ever since Red Hat decided to change its overall business strategy and discontinue its low-end Linux distribution, I have been running Debian on my home server. It is something I had been mulling over for years, mainly because I had the feeling that sooner or later Red Hat might do what it finally did. I still remember some discussions I had with a good friend of mine over this issue, and how he tried to persuade me that Red Hat had always been a good open source citizen. However, that was never the problem. I believed back then (and still believe now) that of all the Linux companies out there, Red Hat must rank very high among those who are friendly and honest to the open source community, and this in spite of all the accusations of behaving like the Microsoft of the Linux world that one hears all the time. They continue to contribute a lot of technology to the community (most recently, Sistina's Global File System), and Fedora is arguably one of the most user-friendly and full-featured desktops out there. Still, the reality is that Red Hat is now a public company and making money is its top priority. Even worse, as a public company, simply making a small profit is not good enough anymore: they need to show growth, and no matter how enthusiastic we all are about Linux and open source, the ugly fact is that people still have to figure out how to make money out of the software (yes, I am aware that IBM and HP are profiting from Linux, but they sell hardware and that is where they make their money; free software is, in their case, a simple add-on or feature, if you will).

In any case, as I was saying, I have been running Debian for a while now and I really love it, especially on the server. It is stable, as solid as any other Linux distribution and, above all, really easy to administer. What do I mean by that? Am I saying that Red Hat is not easy to administer, perhaps? Well, yes and no. Like so many other people in the US, I learnt Linux via Red Hat, so I am still getting acquainted with the Debian way of doing things. However, how many times does one fiddle with this or that setting on a server? It is something that, by definition, one should edit and then leave alone, and that is precisely where Debian shines. While in the case of Red Hat one had to muck around fixing problems that upgrading this or that package caused, in the case of Debian everything is quite smooth. I am not referring only to their famous apt package management system, but also to their overall design. As in the case of FreeBSD, it gives you the feeling that it has been carefully laid out and developed. With Red Hat, on the other hand, I always have the feeling they improvise constantly: things are broken here and there, up2date is a total catastrophe, everybody has plenty of stories about dependency hell, one performs an upgrade and multiple packages break, not so much because of the new capabilities in the software as a direct consequence of the carelessness with which they were put together... A security advisory is out and I have to install an update? A simple apt-get update; apt-get upgrade does the trick. I need a new package to do this or that on the server? Again, apt-get to the rescue. "But, but, but..." (I hear the Red Hat fans) "you can also do that with apt-rpm or yum." Well, again, yes and no. I have not tried yum, but I can tell you apt-rpm is as buggy as the rest of the applications that Red Hat touches. Hopefully I will find the time to write a few lines about that topic one of these days.
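For readers who have never run Debian, the routine I am praising is literally this (run as root; the package name is just an example):

```shell
# A security advisory is out? Refresh the package lists and apply updates:
apt-get update && apt-get upgrade
# Need new software? One command fetches it with all of its dependencies:
apt-get install postfix
```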

So, what is stopping me from recommending Debian for desktop use? I do use it on a development workstation, and am also very happy with it there. I do not need to worry much about the dependencies I may need for this or that particular application, and installing new software is always easy. Of course, I run the unstable branch, because otherwise I would be hopelessly lagging behind the times and would not be able to accomplish much. Also, it still is not nearly as user-friendly as Fedora or SuSE. Finally, the installation process is OK for people with technical knowledge but still quite complex for the average Joe out there. Still, a release candidate of the new Debian installer has been released, and it does show promise. It finally includes hardware auto-detection (yes, it is about time), RAID and LVM support, and many other goodies. Apparently, it still does not work very well at setting up X, which is a pity, since I have long considered this one of Debian's worst weaknesses (yes, even more so than the text-based installer, which does not bother me in the least, since that is what I use even for Fedora). By the way, something I have always loved about the Debian installer is that it connects to the online software repositories during the installation process to automatically upgrade all the packages that have security vulnerabilities. That is a nice detail, and something I would love to see in other distributions. Heck, it is something I would like to see even in commercial Unices. Spend some time browsing through the screenshots of the installer to get an idea of what I am talking about and, of course, test it if you have a chance. {link to this story}

[Tue Aug 10 15:42:13 CDT 2004]

Rippleweb publishes an interview with Eugenia Loli-Queru, from OSNews. For whatever reason, there are some people out there who treat Eugenia with a condescending attitude that must be rooted in some semi-hidden chauvinistic streak. Sure, her views are her views, and anybody can consider them wrong and argue against them. The problem I see is that all too often people react to her well-thought-out articles with knee-jerk ad hominem attacks. In any case, in this interview she definitely proves she knows far more about operating systems than many of us. The conversation revolves around desktop operating systems, GUI design, hobbyist OSes... By the way, I found it interesting that her favorite operating system is Windows 2003 Server. {link to this story}

[Tue Aug 10 14:54:10 CDT 2004]

NewsForge published today a review of UnixWare 7.1.4 that raises some interesting questions. The product itself appears to be solid and stable, but then that has never been the issue. It is UNIX, after all. What I find fascinating is that UnixWare includes so much GPL'ed software that it should make anyone suspicious of SCO's arguments in its ongoing legal battle against IBM and Linux. This is a company that accuses the GPL of being "anti-American" and "unconstitutional", and yet it does not appear to have any qualms about shipping GPL'ed software in its own products. Nice. What was that talk about "principles" again, Mr. McBride? {link to this story}

[Mon Aug 9 12:24:31 CDT 2004]

Just came across a few interesting articles on window managers and desktop projects. First of all, David Uhlman tells us about 10 GNOME tweaks that are well worth reading if you use that desktop environment. They range from information about some basic GNOME tools to the use of emblems, disabling Nautilus and even some Easter eggs. Meanwhile, the Enlightenment guys keep working on their next release (version 0.17), which promises to be quite revolutionary. It is taking a long time, but the Enlightenment Foundation Libraries are starting to look pretty good. For whatever reason, this window manager still has a reputation for being a resource hog. That may have been true several years ago, but I have used it a few times in the last couple of years whenever my system was too old to run GNOME, and I found it quite responsive and snazzy. Say what you may about Enlightenment, but at least we all have to acknowledge it is not a simple reimplementation of the Windows paradigm on Linux, like so many other desktop environments and window managers. It is not only the looks that are striking, but also a few original and innovative approaches its developers take. Give it a try if you have a chance. Finally, I must say I am impressed by the latest fvwm screenshots. Its default configuration is quite lame and boring, but it is nice to see that it is still possible to run a very light window manager that can still look cool. I may install it on my old laptop running SuSE, since both KDE and GNOME take forever to load there, and I only use it to browse the Web, do some quick programming and take notes at company meetings. {link to this story}

[Mon Aug 9 08:03:43 CDT 2004]

It was not long ago that I wrote about the Fedora Legacy project and how I doubted it would be able to rise to the challenge it set for itself. Well, I have now read the confirmation. Linux Weekly News reports that Fedora Legacy announced the end of support for Red Hat 7.2 and 8 due to a lack of community participation. One does not need to be a genius to realize that it is simply not possible to maintain so many different versions of a given distribution, especially when Fedora continues its relentless advance with a new major release every few months. As a matter of fact, it is reaching the point where Fedora Core releases become obsolete in less than two years (the Fedora Steering Committee has just announced that it will transfer Fedora Core 1 to the Fedora Legacy Project as soon as FC 3 Test 2 is out). I am sorry, but this is no way to run a production server. I am now glad I made the decision to switch my servers to Debian as soon as Red Hat Linux was discontinued. It did not take me long to get used to the way Debian does things, and I can avoid all these constant headaches about maintenance policies and end-of-support dates. I would go even further and question whether it makes sense to bother with Fedora on the desktop side either, especially with alternatives such as Xandros or Linspire out there. They appear to be just as user-friendly as Fedora (actually more so, since they are not burdened with the MP3 and other multimedia problems), and they offer far more stability and predictability than a distribution that is being changed all the time, always feels like a beta, and does not have a serious release and maintenance plan. I still run Fedora on some machines, but mainly because Red Hat and Red Hat-like distributions are considered, like it or not, the standard in the Linux world, and anybody who works in this field ignores them at his own peril. Other than that, I could not care less if it just goes away, to be honest. {link to this story}