[Fri Oct 29 10:55:05 CDT 2004]

This is sort of humorous. You know how people sometimes jokingly tell you to run format c:? Everybody also knows the one about running rm -rf / on a UNIX or Linux system. Well, it turns out that someone took the time to test the damage caused by deleting every file on a running Windows system and on a running Linux system. The results are quite interesting, I think. Here are the conclusions:

So what did I learn? Ubuntu's default file permissions and user accounts are much more mature than Windows XP's, NTFS is much slower than EXT3 (or whatever the default Ubuntu FS is) when it comes to unlinking files (alternatively, it could be the fault of the del command), and Windows file locking, while usually annoying, allowed the system to be shut down normally even after the file system was mostly destroyed.

The default install of Windows does not prompt for a password to be created for the primary user. It is also set to auto-logon, even after Service Pack 2 is installed. This means that, on a default install of Windows, anyone can walk up, type "del /F /S /Q *", and your system will be hosed. Ubuntu, like most Linux distributions, sets up a password for the primary user right away during installation. The primary user's password is required to do anything beyond the single user's environment, so while a user may screw up their own files, they won't destroy the system (which may contain tools which allow them to recover their files).

Deleting files in Windows was a painfully slow process. I sat and watched as every file from every folder was confirmed as deleted, or an error message was printed informing me that a file couldn't be deleted. After the thousands of files were removed, it would have been quite a chore to go back through that output and figure out which files could not be deleted. In Linux, however, deleting files was a snappy process, and the only output I received concerned the few files that could not be removed.
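
That last point is worth underlining: since rm only prints errors, capturing the short list of survivors is trivial. A minimal sketch, purely for illustration (needless to say, do NOT actually run this on a machine you care about):

# DESTRUCTIVE EXAMPLE -- do NOT run this.
# rm prints nothing for files it removes, so everything written to
# stderr is precisely the short list of files it could not delete.
rm -rf / 2> /tmp/rm-failures.txt

Getting the equivalent list out of the thousands of lines printed by del would take considerably more work.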

{link to this story}

[Fri Oct 29 10:37:03 CDT 2004]

Now, I acknowledge that Linux users all too often exaggerate how secure our operating system is. It certainly is safer than Windows, but one still has to keep an eye on the script kiddies out there. In any case, the attitude some Windows users are taking lately about the announcement of Red Hat malware is just plain silly. For starters, as someone points out in the comments section, it does bother me when Windows people keep referring to their own systems as "PCs" in order to distinguish them from Linux and other operating systems. Excuse me, but PC is the architecture, and it is completely independent of the operating system running on it, which could be Windows, Linux, BSD or many other products. Most important, though, the fake Red Hat security advisory they are referring to is just one more example of phishing. We have been exposed to this type of scam for quite some time now, and it should not surprise anyone. It certainly does not point to any problem whatsoever with how secure an OS is; rather, it takes advantage of good old social engineering to convince users to install some piece of software that will introduce a trojan horse or a virus onto the system. {link to this story}

[Thu Oct 28 17:35:43 CDT 2004]

The last couple of days have been great for SGI (disclaimer: I work for SGI). On Tuesday, it was announced that SGI's Altix supercomputer at NASA had pushed the company to a leadership position in the supercomputer race with 42.7 trillion calculations per second (i.e., 42.7 teraflops), in spite of the fact that it was not even using all of its 10,240 processors yet. In other words, SGI had overtaken IBM's Blue Gene and earned a top spot on the Top 500 list. But the best was still to come, since on the very same day the same machine managed to perform 51.9 trillion calculations per second. Do not feel bad if these numbers mean little to you out of sheer magnitude; they are indeed large. In any case, this news goes to prove that Silicon Graphics has not only been able to switch smoothly to Linux, but has also done so while building the most efficient supercomputers in the world. Suffice it to say that before the Altix came on stage, SGI had never held the top spot among the 500 most powerful supercomputers on earth. Not only is SGI proving itself, but it is also proving to the naysayers that Linux is a very powerful, scalable and flexible operating system that can give other Unices a run for their money. And now, to top it all off, the company is about to release an upgrade to its high-end Linux servers. I think there is good reason to be happy here if you are an open source fan. {link to this story}

[Thu Oct 28 16:19:54 CDT 2004]

Linux Times publishes an interview with Linus Torvalds where, for a change, he is not asked simple questions about how he felt when he wrote the Linux kernel for the first time or why he does not appear to care about being the richest man on earth. Among other useful advice:

Nobody should start to undertake a large project. You start with a small _trivial_ project, and you should never expect it to get large. If you do, you'll just overdesign and generally think it is more important than it likely is at that stage. Or worse, you might be scared away by the sheer size of the work you envision.

So start small, and think about the details. Don't think about some big picture and fancy design. If it doesn't solve some fairly immediate need, it's almost certainly over-designed. And don't expect people to jump in and help you. That's not how these things work. You need to get something half-way _useful_ first, and then others will say "hey, that _almost_ works for me", and they'll get involved in the project.

And if there is anything I've learnt from Linux, it's that projects have a life of their own, and you should _not_ try to enforce your "vision" too strongly on them. Most often you're wrong anyway, and if you're not flexible and willing to take input from others (and willing to change direction when it turned out your vision was flawed), you'll never get anything good done

{link to this story}

[Wed Oct 20 17:47:52 CDT 2004]

Inspired by an article published on Brett McLaughlin's blog, I had a chance today to read an exchange of views on The State of Java published by the O'Reilly Network. McLaughlin expresses his worries about the current state of Java and its community:

... there's just something sort of boring about the Java space right now. Don't get me wrong —it's still my language of choice. It's still what I go for when I need something done, and I still have something like 3 or 4 Java books for every one of my other programming language books. But... it's not sexy anymore. It's like that hot girl in high school you knew, and then ran into the other day at Kroger, and thought... "Wow. What happened? I mean, she's still good-looking, but not like I remember!" Probably a sort of poor analogy, but it's the best one I've got.

So what's the deal? I mean, there is still juice around Java. I think of things like Groovy, which are cool (although I'm not the biggest fan of another scripting language)--but that's not really Java. It's sort of ancillary, right? And portlets look sort of cool... but nothing like servlets. (I'm sure I'm going to get all sorts of comments about how this project or that project is indeed super-cool.)

But, it's just not exciting. I've written two books this year, and only one about Java. And I'll probably write two or three next year, and I bet only one will be about Java. It's just not as cool--just not as fun. Someone, please, for God's sake, bring back Java Joy ;-) Give me the equivalent of a Head First book (and what it did for publishing) in the technology space. I need something mind-blowing...

(...)

Java a fad? Nope. But it sort of feels like it right now. And —really— who wants to be the last kid on the block still playing hula hoop?

It does indeed feel that way, right? There was plenty of excitement about Java technologies back in the mid-1990s, a lot of talk about the write once, run everywhere idea, and people liked to drop the term revolution here and there. In contrast, today most people think of Java as a boring server-side technology to be used in the backend, mainly for business transactions and the like. Come on, let's be real! Who is doing anything cool and exciting with Java these days? As a few people point out, there is a risk that C# and .Net end up winning the war at the end of the day. They sure are coming up with far more interesting and exciting tools and projects these days. Heck, at least there is life over there! Sun had better do something quickly to revive Java, unless they are OK with seeing their cherished technology die a slow death. Perhaps it's time to let go. {link to this story}

[Tue Oct 19 13:18:53 CDT 2004]

Apparently, Apple is speaking from both sides of its mouth when it comes to its allegiance to open source. It now appears as if new employees are "strongly discouraged" from working on free software projects even during their own personal time, or at least that has been the experience of Ruby-GNOME2 maintainer Laurent Sansonetti. Some people are already calling this FUD, but the fact remains that it is absolutely true. According to Apple's legal department, free software in general is considered a "competitive product", and therefore employees are prevented from working on any of it. I can sort of see their point, but they should realize by now how much they are benefiting from the open source community. On the other hand, I suppose all those Red Hat haters out there deserve to see more news like this, just to understand how open source friendly the guys from Raleigh are after all. {link to this story}

[Mon Oct 18 09:57:35 CDT 2004]

I read with some sadness that AOL may have pulled the plug on Netscape's DevEdge. I suppose many people out there have never heard of this website, but it was an excellent resource for web developers that Netscape started back in its days of glory. DevEdge provided lots of technical information, mainly on HTML and JavaScript. Let's see what happens. Perhaps AOL or somebody else will decide to publish the contents somewhere else online. According to one of the comments on MozillaNews, though, an archived copy of the website is available. {link to this story}

[Sat Oct 16 09:39:35 CDT 2004]

Apparently, Bill Gates has blamed the security holes affecting his operating system on third-party products during an interview with USA Today. Interesting that he says that when not so long ago we were briefed about a JPEG virus that only affected his own operating systems. Perhaps I was wrong when I thought I was viewing JPEG files on my Linux box. They must have been some other "third-party software". Of course viruses come from "third-party software", Mr. Gates. That's by definition. One would hope that Microsoft doesn't release any product with viruses to the market. The issue is whether or not your operating system is easy prey for the viruses and attackers who will always be out there. Incidentally, this brings up another issue: Gates has repeatedly recited the mantra that what really matters is applications, applications, applications. In other words, installing an operating system such as Linux or FreeBSD that cannot run so many mainstream applications is, in his view, useless. On the other hand, though, he doesn't seem to think twice before blaming "third-party software" for viruses. So, which is it, then, Mr. Gates? One cannot have it both ways. I find this behavior especially hypocritical coming from the guys who in the past repeatedly used Debian as the reference Linux distribution when comparing the number of security advisories between Linux and Windows, knowing full well that Debian has the largest repository of free software out there, which let them count not only holes in the operating system and key components of the solution but also in every single game out there. {link to this story}

[Thu Oct 14 12:46:45 CDT 2004]

Good old Netscape turns 10 today. The original browser, based on Mosaic, is nearly dead by now, but its direct offspring (Mozilla, Firefox, Galeon and many others) are not only alive and kicking but also giving Microsoft Internet Explorer a run for its money. By the way, apparently quite a few Netscapees ended up working for a company called LiveOps with an innovative product to manage distributed call centers. {link to this story}

[Thu Oct 14 12:42:02 CDT 2004]

Google has released the Google Desktop Search today. For the time being, it only runs on Windows XP or Windows 2000, so I couldn't give it a try. It sounds quite interesting though. Apparently, it allows you to search not only the Web, but also local files on your hard drive, and even email and conversations from AOL Instant Messenger. I don't see why this would be especially difficult to implement in Linux (see the quick sketch below), so sooner or later we should also see it on my favorite platform.
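
After all, a crude, index-free version of the local-files part is a one-liner with standard Unix tools. A sketch (the directories are hypothetical; adjust to taste):

# Case-insensitive full-text search over local files and stored mail,
# printing just the names of the files that match.
grep -ril "search term" ~/Documents ~/Mail 2>/dev/null

Of course, a real desktop search engine keeps an index so queries come back instantly, which is presumably what Google is doing; the hard part is the indexing, not the searching. {link to this story}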

[Tue Oct 12 12:59:26 CDT 2004]

I just came across a very interesting course while perusing the entries published on Planet Debian: apparently, Göteborg University in Sweden offers a free course on Open Source/Free Software: Philosophy & Theory as part of its Programme in System Analysis/Information Systems. All it takes is for anybody interested to send his or her CV before October 29th. It is sort of interesting that more and more institutions are taking this approach. If you are interested in other free offerings out there, check out I Can Program and MIT's Open Courseware. {link to this story}

[Tue Oct 12 12:49:48 CDT 2004]

Benjamin Drieu wrote a humorous but quite interesting piece on his blog classifying the different species of bug submitters for us. He includes categories such as the enthusiast, the anonymous clueless submitter, the rigorous, the clever, the battler, the badger ("this one won't let you breathe until you fix his bug, even if there is NO bug"), the Taiwanese (the one who gets all pissed off when the programmer lists Taiwan as a province of China, even though that is precisely what the ISO 3166 standard establishes), and the patcher. It's a good read. It will make you smile. {link to this story}

[Sat Oct 9 15:02:57 CDT 2004]

eWeek publishes an article about the latest efforts by the tape industry to fight off the disk challenge. I'm sorry, but it all sounds quite lame: larger storage capacities and backwards compatibility with older tape formats. Just in case it wasn't clear enough that tape is going the way of the dodo, the industry is also using good old vaporware in an attempt to stop customers from switching to disk-based solutions en masse:

Over the next decade, the DLT-S systems are expected to enable more than 10TB per cartridge and match the transfer speed of NAS (network-attached storage). The DLT-V line is expected to deliver multiterabyte devices for less than $1,000.

As if disk-based solutions are going to stand still for "the next decade". I must acknowledge that I never used tape much, and am therefore not very familiar with it. However, I find it hard to believe this technology has a bright future at all, especially in a world where the network is so ubiquitous. {link to this story}

[Sat Oct 9 14:40:17 CDT 2004]

Hackers also have a sense of humor. The Linux kernel has this peculiar feature that every now and then turns around and happens to bite its owner. It is called the OOM killer, and it is directly linked to another feature, namely the fact that the kernel can overcommit memory. Now, why write a kernel that can overcommit memory? Isn't it a little strange, the idea of committing more memory than is actually available on the system? Well, yes and no. The reality is that most applications claim more memory at launch time than they will actually use afterwards. Call it defensive programming, if you want, but the fact is that most programmers do it. They claim X amount of memory just in case, even though their application will only use X-n in the end. As a result, Linux truly optimizes the utilization of memory, which is never wasted on a process that will not use it, and other operating systems take a similar approach (IRIX, for example, uses the concept of virtual swap). Of course, the problem arises when an application does indeed make use of all the memory it claimed. At that point, the kernel has to make a hard decision about which process should simply be killed in order to free up some memory. Enter the OOM killer, whose name stands for Out Of Memory killer. Lots of threads on the Linux kernel mailing list have been dedicated to discussing the best algorithm for deciding which process should be killed in those situations, and recently Thomas Habets submitted a patch called the oom_pardon patch to allow a system administrator to exempt certain processes from the list of processes that can be killed by the OOM killer. Habets sat down to write the patch after he noticed that in one such situation his screen locking program had been killed to free up some memory, creating a very insecure situation where his session was left totally unprotected (by the way, SuSE apparently wrote a similar patch, though I wonder whether it was ever contributed to Linus's kernel). In any case, the thing is that Andries Brouwer entered the discussion with a somewhat humorous story that synthesizes the problem pretty well:

An aircraft company discovered that it was cheaper to fly its planes with less fuel on board. The planes would be lighter and use less fuel and money was saved. On rare occasions however the amount of fuel was insufficient, and the plane would crash. This problem was solved by the engineers of the company by the development of a special OOF (out-of-fuel) mechanism. In emergency cases a passenger was selected and thrown out of the plane. (When necessary, the procedure was repeated.) A large body of theory was developed and many publications were devoted to the problem of properly selecting the victim to be ejected. Should the victim be chosen at random? Or should one choose the heaviest person? Or the oldest? Should passengers pay in order not to be ejected, so that the victim would be the poorest on board? And if for example the heaviest person was chosen, should there be a special exception in case that was the pilot? Should first class passengers be exempted? Now that the OOF mechanism existed, it would be activated every now and then, and eject passengers even when there was no fuel shortage. The engineers are still studying precisely how this malfunction is caused.
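
For the curious, the overcommit policy itself is tunable at run time through /proc. A minimal sketch, using the sysctl names documented for stock 2.6 kernels in Documentation/vm/overcommit-accounting (run as root):

# Show the current overcommit policy:
#   0 = heuristic overcommit (the default)
#   1 = always overcommit, never refuse an allocation
#   2 = strict accounting, i.e. no overcommit at all
cat /proc/sys/vm/overcommit_memory

# Switch to strict accounting: with this setting, allocations fail
# up front instead of succeeding on paper and inviting the OOM
# killer to pick a victim later.
echo 2 > /proc/sys/vm/overcommit_memory

# Under strict accounting, the commit limit is swap plus this
# percentage of physical RAM.
cat /proc/sys/vm/overcommit_ratio
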
{link to this story}

[Sat Oct 9 13:23:29 CDT 2004]

Ping Wales publishes an article with Alan Cox's suggestions for writing better software. Cox is about to finish his MBA at Swansea University, and has apparently spent some time lately thinking about what the software world could learn from the business world when it comes to quality assurance. The source of the problem, according to him, appears to be that:

When software doesn't work the way it should, it's easy and cheap to ship an upgrade or a patch to the users, who are then inclined to accept buggy software as the normal state of affairs, Cox said.

Even though there has been a movement for some time to introduce traditional engineering concepts such as quality assurance to software development, Cox sees today's software engineering as "the art of writing large bad programs rather than small bad programs".

Boy, does this remind me of the origins of Netscape, perhaps the first company to release extremely buggy software and convince its users to do all the beta testing for it. Mind you, this approach also has its strengths: since the software is out on the street sooner, at least one can already use it; also, users and customers can provide feedback before the product is completely finished and locked. In any case, Cox talks about a few practices that should be common sense by now: firewalling by default, execute-only and read-only memory, garbage collection, type safety, validation tools, tainting... (see the examples below).
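
Tools of the sort Cox has in mind are just a command away on a typical Linux box. A hedged sketch (splint and valgrind are my picks, not necessarily his, and myprog is a hypothetical program):

# Static analysis of C source, lint-style: catches type abuse,
# null dereferences and the like before the program ever runs.
splint myprog.c

# Runtime validation: valgrind's default memcheck tool flags
# invalid reads/writes and leaked memory as the program executes.
valgrind ./myprog

{link to this story}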

[Thu Oct 7 13:28:25 CDT 2004]

Lars Wirzenius tells us how to configure Apache so that images cannot be viewed when they are linked directly from an external site, thereby preventing image theft. The complete instructions can be found at Ken Coar's Apache Server Resources website, but it basically boils down to adding the following to your server's configuration files (comments mine):

# Mark requests whose Referer header is either one of our own pages
# or empty (many proxies and some browsers send no Referer at all)
SetEnvIfNoCase Referer "^http://liw\.iki\.fi/" local_ref=1
SetEnvIfNoCase Referer "^$" local_ref=1

# Only serve image files to requests carrying that mark; note the $
# anchor, so only actual .gif/.jpg/.png filenames match
<FilesMatch "\.(gif|jpg|png)$">
    Order Allow,Deny
    Allow from env=local_ref
</FilesMatch>
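
To verify that the rule behaves as expected, something along these lines should do (the image path is hypothetical; curl's -e flag sets the Referer header):

# A foreign Referer should now be rejected with a 403...
curl -s -o /dev/null -w '%{http_code}\n' -e 'http://some.other.site/' http://liw.iki.fi/pics/example.png

# ...while a request with no Referer at all still gets a 200.
curl -s -o /dev/null -w '%{http_code}\n' http://liw.iki.fi/pics/example.png
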
{link to this story}

[Sun Oct 3 16:29:44 CDT 2004]

I just came across the Agile Manifesto, which has been the talk of the town for quite a while now. It is, nevertheless, a topic I had never considered in this blog as far as I can remember. I believe the first time I read about it was in the pages of Software Development magazine a couple of years ago or so, and my first impression was that most of the tenets of agile development were just common sense, although unfortunately it is often necessary to emphasize precisely that, common sense, in this crazy, crazy world of ours. I also remember that my second thought on this manifesto was that quite a few of its principles apply equally well to management. So, let us pause for a while and take a look at some of the main principles of agile development:

Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.

(...)

Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.

(...)

Working software is the primary measure of progress.

Open source, anyone? Actually, I find it hard to believe agile development would ever have been proposed without the previous success of open source, whose influence seems obvious, at least in the principles I just listed above. However, as I explained, it can also be applied to management:

Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.

(...)

Business people and developers must work together daily throughout the project.

(...)

Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.

The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.

(...)

The best architectures, requirements, and designs emerge from self-organizing teams.

In summary, agile development appears to offer a set of principles that synthesize what is needed to manage not only a software project these days but, as a matter of fact, any team whatsoever. In this world of hyperindividualistic consumers who inevitably take their habits to the workplace, the old heavy-handed leadership model is not good anymore. It just does not work. Nobody wants to be treated like a member of the herd. Besides, things change way too fast for anyone to pretend that he or she has the vision needed to succeed. Flexibility is the name of the game these days, and we had better get used to it sooner rather than later. {link to this story}