[Mon Jan 29 15:40:29 CET 2007]

Some big news was made public over the weekend, at least as far as the technology world is concerned. Intel and IBM revealed a huge leap forward in transistor design that could change the world of computing within a few years.

In dueling announcements, Intel Corp. and International Business Machines Corp. separately said they have solved a puzzle that has been perplexing the semiconductor industry: how to reduce energy loss in microchip transistors as the technology shrinks to the atomic scale.

Each company said it has devised a way to replace problematic but vital materials in the transistors of computer chips that have begun leaking too much electric current as the circuitry on those chips gets smaller.

Technology experts said it's the most dramatic overhaul of transistor technology for computer chips since the 1960s and is crucial in allowing semiconductor companies to continue making ever-smaller devices that are also energy-efficient.

The reason this is so exciting is that it may dispel the idea that Moore's Law is about to run out of steam. {link to this story}

[Mon Jan 29 09:07:09 CET 2007]

A quick roundup of technology news this morning. First of all, Linux Howtos published an article on Linux DVD authoring tools, covering some pretty decent pieces of software, like ManDVD. It looks to me as if the author is clearly biased in favor of KDE, but the article is good nonetheless.

Mark Russinovich, a well-known specialist in Windows internals, has started a series on the new Windows Vista kernel. The first part centers on scheduling and I/O operations. By the way, I find it quite humorous that Vista now includes "file-based symbolic links". Wow! Yet another Earth-shattering innovation coming out of Redmond.

And, talking about innovation, ITWire has gathered rumors about the features to be included in the next version of MacOS X: full integration with the recently announced iPhone; iChat capabilities to record video and to act as a video answering machine; a revamped terminal; and improved handwriting recognition (leading to speculation that Apple may release its own tablet). Incidentally, there is another piece on the same website about migrating to Windows Vista that is well worth a quick read.

A couple of months ago, I listened enviously as a colleague described his experience of upgrading from his five-year-old Mac to a new Intel-based Mac.

"I simply connected the two Macs together and up popped a box on the screen which asked me if I wanted to upgrade to the new machine", he said. "I answered yes and then lay down on my couch. A couple of hours later, all my applications and data that was on my old Mac was now on my new Mac. Everything was the same except that on the new machine it ran about five times as fast".

How's that for innovation, both Microsoft and Linux? {link to this story}

[Fri Jan 26 11:09:38 CET 2007]

Here's the problem I have with all those paeans to the 3-D desktop one reads lately. This morning, I came across a document touting Mandriva's new 3-D capabilities that includes a short video displaying some of the features. I find the one they call the two-fold desktop particularly interesting, since it doesn't appear to offer anything different from the already existing ability in any window manager to copy text and then paste it with a single click of your mouse, provided you configure things (as I always do) so that the focus follows the mouse without the need for yet another click to select the window. Sure, the folding effect looks spiffy and even elegant. I will concede that much. Still, as a feature, it offers nothing that doesn't exist already.

While I'm convinced that, sooner or later, some form of 3-D interface (or perhaps even an immersive reality) will triumph, it's also patently obvious to me that it will need to offer much more than just a spiffy version of an already existing feature. It looks to me as if 3-D desktops are just waiting for someone to think outside the box and come up with something truly revolutionary. Until that happens, I will continue using the good old 2-D desktop. {link to this story}

[Fri Jan 26 11:01:33 CET 2007]

In the course of an article about Windows Vista and Linux on the desktop, John Dvorak gives us a list of applications that, supposedly, customers told Novell they missed on the Linux desktop:

  1. Photoshop
  2. AutoCAD
  3. Dreamweaver
  4. iTunes
  5. Macromedia Studio
  6. Flash
  7. Quicken
  8. Visio
  9. QuickBooks
  10. Lotus Notes

No big surprises there, with the perhaps notable exception of MS Office, which Dvorak mentions in his notes. Interestingly enough, many of the apps belong to Adobe, not exactly a big fan of open source. {link to this story}

[Fri Jan 26 11:00:10 CET 2007]

Some light-hearted humor. Apparently, Ubuntu Christian Edition wasn't enough for the Ubuntu folks, so someone in the community has now released Ubuntu Satanic Edition, codenamed Evil Edgy. {link to this story}

[Thu Jan 25 11:35:50 CET 2007]

Yet another interesting project I just found out about: OSSwin gives you access to all sorts of open source software for Windows. {link to this story}

[Wed Jan 24 18:09:06 CET 2007]

Wow! It turns out Sun returned to profitability after all!

Sun's net income for the quarter, the second of its fiscal year, was $126 million, or 3 cents a share, in contrast to a net loss of $223 million, or 7 cents a share, for the same period last year. The latest result beat analysts' forecasts.

Revenue was $3.6 billion, an increase of 7 percent over the $3.3 billion reported for the second quarter last year. Sun attributed the growth to server sales, as well as strong acceptance of Solaris, Sun's computer operating system. Revenue in Sun's computer systems products unit increased 14 percent, the fourth consecutive quarter of year-over-year revenue growth, the company said.

{link to this story}

[Wed Jan 24 13:10:53 CET 2007]

Just came across a short piece where Ray Kurzweil gives some tips to inventors:

  • Watch for "false pretenders": an upstart threatens to eclipse an older technology, but the new technology misses key elements, so its failure reinforces the belief among technology conservatives that the old technology will live forever. Later versions of the upstart technology, however, can still disrupt the old one. Case in point: electronic books, which have failed for now but will succeed: "books will be obsolete before the decade is out".
  • Fantasize that you're giving a speech years from now and you're explaining how you solved the problem.
  • Create devoted passionate teams and encourage open and wide communication.
  • Organize project milestones around demos.
  • Write the Advertising Brochure first (forces you to articulate features and benefits) and recruit the beneficiaries (users) to help invent the technology.
  • Invent while you sleep, using lucid dreaming.

Apparently, by lucid dreaming he means:

Dreaming while one is conscious of the fact that one is dreaming, thus allowing some measure of rational thought into the random landscape of involuntary dream images. A state in which one may make discoveries or find solutions to problems.

ThinkQuest has more information about the topic here.

In any case, I agree with Kurzweil on the future of electronic books. I don't think the e-book will completely kill the traditional book form, at least during a long transitional period, but there's no doubt in my mind that within a few years someone will figure out how to make the technology flexible and comfortable enough to replace many paper publications. It'd just make things like sharing information, exchanging books, updating them, taking notes, uploading them to another device, or expanding the information with links to other (perhaps online) sources way easier. The future belongs to small devices that can interact with each other in order to expand our personal range, even though I'm also convinced we'll still keep some form of larger computer at home and at the office to plug or dock into. {link to this story}

[Wed Jan 24 11:04:28 CET 2007]

I read in KernelTrap that Robert Day has proposed to tag code as deprecated or obsolete in the main Linux kernel tree. It was about time to clarify things in this respect. The patch includes the definitions:

Code that is tagged as "deprecated" is officially still available for use but will typically have already been scheduled for removal at some point, so it's in your best interest to start looking for an alternative.

Code that is tagged as "obsolete" is officially no longer supported and shouldn't play a part in any normal build but those features might still be available if you absolutely need access to them. You are *strongly* discouraged from continuing to depend on obsolete code on an ongoing, long-term basis.

The kernel couldn't go on being littered with code that is barely used anymore, especially considering how much it has grown in size over the last few years. The proposed changes are just business as usual at most software companies. {link to this story}

[Wed Jan 24 08:46:58 CET 2007]

Some geek humor. Linux kernel developer Ted Ts'o sent an email to other programmers announcing the 2007 Kernel Summit, for which he also had to set up a discussion mailing list. So, in good geek fashion, and in order to keep spammers away, instead of publishing the actual email address to send messages to, he wrote a quick Perl script that, when run, spits out the real address:

#!/usr/bin/perl
# Assemble the list address at run time so that address harvesters
# scraping the archived mail never see it verbatim.
$at = "@";
$AD = (gmtime(time))[5] + 1900;   # gmtime's year field counts from 1900
print "ksummit-" . $AD . "-discuss" . $at . "thunk.org\n";
print "http://thunk.org/mailman/listinfo/ksummit-" . $AD . "-discuss\n";
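For readers who don't speak Perl, the same trick ports readily. Here's a hypothetical Python equivalent (mine, not part of the original mail): keep the "@" and the year out of the literal text, and build the address only when the script runs.

```python
# Build the year-stamped list address at run time, so it never appears
# verbatim in the source for spammers to harvest.
import time

at = "@"
year = time.gmtime().tm_year  # current year, e.g. 2007
print("ksummit-%d-discuss%sthunk.org" % (year, at))
print("http://thunk.org/mailman/listinfo/ksummit-%d-discuss" % year)
```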

Gotta love it! {link to this story}

[Tue Jan 23 10:16:43 CET 2007]

Playing With Wire published a review of Gentoo titled Why Gentoo Shouldn't be on Your Server that, to me, applies just as well to other distributions, such as Fedora, although to a lesser extent.

If all you're concerned with is keeping your web server up, what you usually want to do is set up a stable system and then forget about it. You install security updates as needed but that's it. With Gentoo, this isn't really feasible because there is no "stable" Gentoo release.

What's worse, there will on occasion be a sort of "system update". This is called a new "profile". The Gentoo documentation and the handbook will at this time encourage you to update to this new profile. A profile update will try to replace your basic system. If you are a system administrator, rather than a desktop user, this should be enough to scare the living daylights out of you!

I have to agree. I've always seen Gentoo as a hobbyist's distribution for those people who want to tinker with Linux on the desktop, and nothing more. Those who just want an OS "that runs" are much better served by other distros, including Fedora, which I'd never use on a server either. {link to this story}

[Mon Jan 22 15:36:12 CET 2007]

Just a couple of new developments I came across while checking Slashdot (yeah, in spite of all the noise, one can still read interesting news on the website... it's just like the Net itself!): Ubuntu Studio, a multimedia derivative of Ubuntu aimed at audio, video and graphics enthusiasts; and Rosetta Code, a tool that "translates" from one programming language into another. {link to this story}

[Mon Jan 22 11:09:40 CET 2007]

Samba's Jeremy Allison writes about a recent visit to the Google campus and, along the way, tells us what he thinks about Microsoft's dirty (as usual) tricks to unseat the king of the search engines:

I once asked some Google people what they would do when Microsoft Vista, the next version of Windows, was released upon which people would strangely find it difficult to get to google.com anymore. In my naïveté I thought Microsoft would have to use some technical obfuscation to prevent people from getting to google.com. It turns out all they had to do was to change the default search page to point at their own search engine, MSN Search, and people wouldn't know how to change the default. Microsoft never lost money underestimating the intelligence of their users. I didn't get a good answer from the Googlers, and now I know what they had decided to do about it. Google recently complained to the USA Department of Justice about Microsoft's actions in Windows Vista, and their complaints were summarily dismissed. This is the same Department of Justice which snatched defeat from the jaws of victory in the Microsoft vs. the USA anti-trust case, so it wasn't a surprise. I did think Google might have something a little smarter up their sleeve though. Maybe they only hire technical, not legal PhDs.

But it's very hard to compete with a series of warring tribes, who consider violations of anti-trust law just one of the trivial costs of doing business, who have a war chest large enough to go a year without revenue, and who donate far more to the US Republican party than Google does.

I really hope Google makes it though, and they don't turn out to be a one-trick Internet Advertising pony. The people there are so nice they deserve better than that. And where will I get a decent free lunch without them?

It's back to square one with the anti-trust case. Once more, Microsoft is playing dirty and the Government doesn't seem to care. I wouldn't go as far as Allison and blame it on "the intelligence of the users", but it certainly is a trick Microsoft can pull off only because the overall level of computer literacy is extremely low, even in developed countries. As an example, I recently had to help a relative of mine set up a DSL network connection and, to my surprise, she was still at the level where "email" was the same as "Outlook" and "Internet" was no different from "Internet Explorer", of course. Let's not even talk about integrated office suites. I just wonder how many people there are like that. Microsoft won, and we are all to blame for that. {link to this story}

[Fri Jan 19 13:05:01 CET 2007]

Now, this is an interesting feature! Nadia Derbey recently posted a set of patches to the lkml titled Automatic Kernel Tunables or AKT in order to introduce "a feature that makes the kernel automatically change the tunable values as it sees resources running out". According to the rest of the description on KernelTrap:

The default automatic adjustment routine provided by the patches simply allows a tunable to be configured with minimum and maximum values, as well as thresholds. If a monitored value grows beyond the defined threshold, the tunable is increased. If the monitored value shrinks below the defined threshold, the tunable is decreased. The patches also allow more complicated adjustment routines to be defined. The effort is part of the larger libtune project, aiming "at providing a standard API to unify the various ways Linux developers have to access kernel tunables, system information, resource consumptions".
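The adjustment loop described above is easy to picture in code. Here is a minimal sketch in Python of one plausible reading (the class, parameter names and the usage-ratio thresholds are mine, purely illustrative; the real patches are C code inside the kernel):

```python
# Hypothetical AKT-style auto-tuning: a tunable bounded by min/max is
# nudged up when resource usage crosses a high-water threshold, and
# nudged back down when it falls below a low-water threshold.

class AutoTunable:
    def __init__(self, value, minimum, maximum, low, high, step):
        self.value = value        # current tunable value
        self.minimum = minimum    # hard lower bound
        self.maximum = maximum    # hard upper bound
        self.low = low            # low-water threshold (fraction in use)
        self.high = high          # high-water threshold (fraction in use)
        self.step = step          # adjustment granularity

    def adjust(self, in_use):
        """Grow or shrink the tunable based on current resource usage."""
        usage = in_use / self.value
        if usage > self.high:
            self.value = min(self.value + self.step, self.maximum)
        elif usage < self.low:
            self.value = max(self.value - self.step, self.minimum)
        return self.value

# Example: a file-handle limit that grows under pressure and shrinks
# again once the pressure goes away.
limit = AutoTunable(value=1024, minimum=256, maximum=8192,
                    low=0.25, high=0.85, step=256)
limit.adjust(1000)   # usage well above 85% -> tunable grows
limit.adjust(100)    # usage well below 25% -> tunable shrinks
```

The patches also allow arbitrary adjustment routines, so the simple threshold rule above is just the default behavior.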

{link to this story}

[Thu Jan 18 11:48:43 CET 2007]

I just came across a short review of Dreamlinux 2.2, a LiveCD Linux distribution based on Debian and sporting an XFCE desktop clearly reminiscent of the MacOS X GUI. The looks are definitely good. {link to this story}

[Thu Jan 18 11:28:16 CET 2007]

Matt Hartley writes a piece in OS Weekly about the WebOS concept, and I have to admit that his criticism sort of makes sense. He acknowledges that products such as EyeOS have improved a lot in the last few years, but still sees some problems that are inherent to the concept itself:

The biggest problem with online OSes is also their biggest attraction — it's exclusively available online. Outside of needing to access your data on the go from remote, public computers, I simply fail to see the advantage or reasoning behind putting a lot of stock into this concept. Cool to use. But to me, it just seems more logical to offer the applications themselves much like Linspire and Google have been doing over the past couple of years.

Let's face it: most users truly care about the applications more than the OS their systems are running on, just the same way most car drivers care more about the particular characteristics of a model than about the type of gas it uses. Sure, there are exceptions to the rule, but they are rarely numerous enough to make much of a difference, at least when it comes to regular home use (I don't doubt for a moment that things are a bit different when it comes to workstations and servers, of course). However, Hartley himself admits at the end of his article that there is always some room for improvement, and things may change soon:

Things progress and developers are jumping onboard by the boat load. At the end of the day, what improvements have been made to the productivity or even the level of communication quested after by the target audience? Some perhaps. Maybe the web based operating system will act as yet another form of social media, working to bring collaboration and community to those who have yet to be bit by the Web 2.0 bug.

As wild as it sounds, I believe that the collaboration and community concepts for web based operating systems are going to be the best sales pitch these online outfits have. To some this may sound a little pessimistic, but to users, including myself, it's a tough sell no matter how you cut it.

{link to this story}

[Tue Jan 16 16:29:52 CET 2007]

Checking the Linux-related news today, I came across an entry on Ian Murdock's blog about the importance of backwards compatibility. In order to illustrate his position, he quotes an old entry from Joel Spolsky's blog (from 2004) referring to a Microsoft engineer (Raymond Chen) who tries to justify the importance of backwards compatibility in Windows development:

The most impressive things to read on Raymond's weblog are the stories of the incredible efforts the Windows team has made over the years to support backwards compatibility: "Look at the scenario from the customer's standpoint. You bought programs X, Y and Z. You then upgraded to Windows XP. Your computer now crashes randomly, and program Z doesn't work at all. You're going to tell your friends, 'Don't upgrade to Windows XP. It crashes randomly, and it's not compatible with program Z'. Are you going to debug your system to determine that program X is causing the crashes, and that program Z doesn't work because it is using undocumented window messages? Of course not. You're going to return the Windows XP box for a refund. (You bought programs X, Y, and Z some months ago. The 30-day return policy no longer applies to them. The only thing you can return is Windows XP)".

I first heard about this from one of the developers of the hit game SimCity, who told me that there was a critical bug in his application: it used memory right after freeing it, a major no-no that happened to work OK on DOS but would not work under Windows where memory that is freed is likely to be snatched up by another running application right away. The testers on the Windows team were going through various popular applications, testing them to make sure they worked OK, but SimCity kept crashing. They reported this to the Windows developers, who disassembled SimCity, stepped through it in a debugger, found the bug, and added special code that checked if SimCity was running, and if so, ran the memory allocator in a special mode in which you could still use memory after freeing it.

This was not an unusual case. The Windows testing team is huge and one of their most important responsibilities is guaranteeing that everyone can safely upgrade their operating system, no matter what applications they have installed, and those applications will continue to run, even if those applications do bad things or use undocumented functions or rely on buggy behavior that happens to be buggy in Windows n but is no longer buggy in Windows n+1...
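To picture what such an allocator "special mode" might look like, here is a toy sketch in Python (entirely hypothetical; the real shim lives deep inside the Windows heap manager, in C): in compat mode, freed blocks are quarantined with their contents intact instead of being recycled immediately, so a buggy caller that reads memory right after freeing it still sees its old data.

```python
# Toy model of a heap with an app-compat "quarantine" mode for
# use-after-free bugs. Names and design are illustrative only.

class Allocator:
    def __init__(self, compat_mode=False):
        self.compat_mode = compat_mode
        self.heap = {}          # handle -> block contents (a list)
        self.free_list = []     # handles eligible for immediate reuse
        self.quarantine = []    # freed but deliberately kept alive
        self.next_handle = 0

    def alloc(self, size):
        if self.free_list:
            handle = self.free_list.pop(0)   # recycle a freed block
        else:
            handle = self.next_handle
            self.next_handle += 1
        self.heap[handle] = [0] * size
        return handle

    def free(self, handle):
        if self.compat_mode:
            # Keep the block's contents intact and delay reuse, so a
            # buggy read-after-free still finds the old data.
            self.quarantine.append(handle)
        else:
            self.heap[handle] = None         # contents gone immediately
            self.free_list.append(handle)

    def read(self, handle, index):
        return self.heap[handle][index]      # fails if block was wiped
```

In strict mode a read after free blows up at once (the block has been wiped); in compat mode the same broken access quietly returns the old value, which is roughly why SimCity "happened to work" on DOS and had to be special-cased on Windows.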

Murdock goes on to rant against the "revulsion" he can already feel among his readers, and reminds us that the shelves at the nearest Best Buy or CompUSA are stacked with Windows software, not Linux or Mac OS X boxes. Fair enough. However, Murdock is jumping to conclusions here, assuming that the main reason Windows software is more popular is that Microsoft bends over backwards to make sure its operating systems are always backwards compatible. Incidentally, he also (conveniently) ignores that plenty of applications still fail on newer versions of Windows as soon as one upgrades, and Windows XP (which he refers to in his examples) is well known for this too. But he misses the most important point: an operating system that hacks special-case code into the memory allocator itself to cater to a single broken application (and a game at that!) should indeed be distrusted by any software engineer (and, more importantly, by any half-intelligent user out there). I'm sorry, but in a world where software security is quickly becoming a vital issue, this sort of business-driven behavior cannot be accepted. Is it any wonder, then, that Windows has so many million lines of code? How many of those lines are completely useless, like the ones Murdock is talking about? How long do they keep this "backwards compatibility" in place to cater to a market that is more worried about running a game than about making sure their systems are not chock-full of trojan horses spamming half the world? {link to this story}

[Tue Jan 16 12:10:18 CET 2007]

Much has been written over the past few months about the 100 dollar laptop, brainchild of the One Laptop Per Child (OLPC) foundation and MIT's Nicholas Negroponte. The idea was to build a cheap laptop that could be given to kids in developing nations so that they could jump onto the bandwagon of the information highway. It was also stressed from the very beginning that these laptops wouldn't be sold commercially in already developed countries. However, the BBC recently reported that the so-called Children's Machine may be sold to the public in wealthy nations after all. There's a small gotcha that I sort of liked: people who want to purchase these laptops will have to buy them in pairs, with one of the two automatically sent to a child in a developing country. The idea sounds pretty good to me. {link to this story}

[Tue Jan 16 12:05:58 CET 2007]

I just had to modify a search form I wrote for internal use at my company, since someone requested that I make the necessary changes so that the input focus goes directly to the search box in the browser. While looking around for documentation on web forms, I found a good tutorial published by HTMLSource on Forms Accessibility that is well worth noting here. {link to this story}

[Tue Jan 16 11:18:49 CET 2007]

I'm subscribed to the Os-book-list (linked to Silberschatz and Galvin's Operating System Concepts book), where one of the subscribers recently asked for help about LiveCD Linux distributions he could use to learn about operating systems and programming in general. Someone replied with a link to The LiveCD List. It's a great resource for anyone interested in this particular way to test distributions. {link to this story}

[Mon Jan 15 10:36:32 CET 2007]

I'm so glad I switched to Debian a few years ago, back when Red Hat decided to discontinue its low-end distribution. Linux Weekly News recently reproduced the official announcement that Debian GNU/Linux 3.0 (woody) is finally being archived, a whopping four and a half years after it was released and a year after its successor (sarge) came out. While some people criticize Debian's slow release process (and, to be honest, I can see how it has a serious impact on desktop users), I'm just thrilled that I found a distribution that doesn't force me to upgrade my home server roughly once a year just to keep up. Debian allows me to set up the server and pretty much forget about it for close to five years. All I have to do is concentrate on the actual sysadmin tasks and, of course, keep up with the security fixes, which is easily accomplished with apt-get.

Back when Red Hat made the controversial decision to discontinue its low-end distribution (which I had run, and dutifully paid for, since the days of Red Hat 4.2), Fedora was the only Red Hat sanctioned alternative they left. I tried using it for a while, at least on my desktop systems. For my servers, I knew from the very beginning that I didn't want to bother with yearly upgrades. Nevertheless, my main worries about Fedora were of a different nature: first of all, I had no guarantee whatsoever that Red Hat wouldn't pull more rabbits out of its hat whenever it suited them; second, it was clear to me that Fedora was not a fully fledged community distro, in the sense that Red Hat still called the shots, which made me naturally distrustful; and, finally, I couldn't trust the Fedora Legacy Project and its pie-in-the-sky plans to support every single old release in sight. Well, the first two worries have been widely borne out by the facts, and the third was confirmed just before the end of the year, as explained by InternetNews in a piece titled Fedora's Legacy Wanes:

The Fedora Legacy project is in "transition" and is closing its doors. Effective this week the project is no longer supporting Fedora Core 4 and earlier distributions.

"The current model for supporting maintenance distributions is being re-examined", the Fedora Legacy wiki states. "In the meantime, we are unable to extend support to older Fedora Core releases as we had planned".

(...)

Across more than several dozen mailing lists and a few blog entries, members of the Fedora Legacy community have been discussing their future (or lack thereof) for several weeks, and it looks like the decision is now final.

Fedora Legacy has essentially been dead for nearly a month already. In response to an inquiry about security updates for Fedora Core 4 not being available via Fedora Legacy, Red Hat developer Florian La Roche wrote on Nov. 17 that interest in Fedora Legacy has slowed down.

Turns out La Roche's comment was a dramatic understatement.

(...)

Currently a typical Fedora Core release will receive approximately nine months of support from the main Fedora Project and up to two years (in the case of Fedora Core 3) on Fedora Legacy. Red Hat developers have been discussing extending core support for Fedora releases to 13 months and merging the Legacy efforts into the core project.

(...)

The rise of CentOS, which is a clone of Red Hat Enterprise Linux, is partially to blame, as is the lack of participation in the project.

If I were heavily invested in Red Hat (which I was indeed at the time they made the decision, until I realized how important diversity is), I might prefer to run CentOS. However, even in that case one should carefully consider how long it will be before Red Hat launches some form of legal action against CentOS. Yes, I know, as long as they don't infringe upon Red Hat's copyright and public image, open source allows for this type of cloning. Still, lawyers are pretty good at coming up with intricate arguments in front of a court of law. That's what they get paid for. Also, there could be other ways to make life difficult for the CentOS folks without even taking legal action. I sleep much better since I switched to Debian for my servers and Ubuntu for my desktop systems. {link to this story}

[Fri Jan 12 17:09:07 CET 2007]

Today, I read a review of Elive that truly got me excited about the idea of trying a different window manager again. I have been running GNOME on Ubuntu for quite some time now, and I'm quite happy with it. I know it doesn't have nearly as many configurable options as KDE, but that's precisely why I like it. To tell the truth, I find KDE way too convoluted, overkill even, especially when it comes to displaying icons all over the place. Every now and then I have to run a KDE-based application, and I always find the icons and menus excessive and confusing. Not only are there way too many of them but, for the most part, they are indistinguishable (by which I mean that the actual symbols don't clearly reflect what they accomplish). In general, I find KDE a good example of what not to do when it comes to GUI design, in spite of the fact that most people in the Linux world (apparently, according to what I read) would almost lynch me for saying so.

Anyway, the fact is that the last time I gave Enlightenment a try, I was quite impressed. I tested E-16 and liked the fact that it was cool, fast, different and... surprisingly lightweight. I know, I still remember the days when Enlightenment was one of the heaviest window managers out there due to its use of cool eye candy. Well, the days of being a heavy window manager are over, but not the days of cool eye candy. Even more so in the case of E-17. Pity it's still in development, where it has remained for years. However, it does show a lot of promise and is stable enough for a try. As a matter of fact, that's what Elive is based on. Just as with mutt (where I knew full well I'd end up migrating from Pine sooner or later, and was just waiting to find the time to spend on the process itself), I also know that I'll end up running my system on E-17 some day. I just have to find the time to play with it and adapt to its quirks, but I'm sick of the old, tired Windows paradigm. I feel like a new approach is needed. {link to this story}

[Tue Jan 9 12:36:24 CET 2007]

I like to listen to streaming radio while working, and for quite some time XMMS or Rhythmbox was all I needed. However, it reached the point where I found it quite annoying to search around for individual streams on my own. Sure, there are directories such as SHOUTcast or Icecast available, but one still needs to enter the URLs manually into the audio player, which is a real pain. A couple of months ago, though, I discovered Streamtuner, a little application that consolidates all the radio streams in one simple interface (Linux.com published an article today about Streamtuner). The application itself doesn't handle the audio stream; it limits itself to managing the collection of streams and calling an external application to play them. In other words, a simple but really useful idea. The only complaint I have is that, being a simple little app to manage audio streams, it could do a better job of allowing more personalization in arranging one's favorite streams, among other things. Still, I'd definitely recommend it to anyone who listens to audio streams on Linux. {link to this story}

[Tue Jan 9 10:46:36 CET 2007]

If there is something that surprises me about the so-called analysts out there (whether they write about politics, economics, the arts or technology), it is how utterly out of touch they can be. Sure, we are all human beings, and subjectivity is a very human trait. Still, one would expect a higher level of objectivity (or, at the very least, a better understanding of the overall nature of one's field of expertise) from people who, after all, get paid to do nothing more than analyze trends. Yet we find real gems like this, from a recent piece written by Paul Murphy for ZDNet:

... IBM's later success in deputizing most of the merry men made it difficult for even the most dedicated fantasist to continue the charade —and the vacuum left when the hot breath of media support disappeared meant that Linux failed its tipping point sometime in 2002/03 and is now in apparent decline relative to Windows, Solaris, and the BSDs.

That doesn't mean it will go away anytime soon, of course —IBM could keep it commercially alive for decades, the momentum built up in north American and European open source and academic usage will not disappear quickly, and the political advantages to the use of Linux across Asia will probably continue to drive acceptance there. But what's scary for Linux is this: the applications and GNU components we think of as Linux don't depend on the Linux kernel —meaning that the only barriers to mass migration by Linux developers to Solaris or the BSDs are psychological.

While Murphy may be right that Linux failed its tipping point a few years ago (not that the statement isn't highly arguable anyway), one wonders where he got the idea that Solaris and the BSDs are on the rise. When it comes to Solaris, it's patently obvious that Sun is already in damage-control mode, and their decision to release the code of their crown jewel cannot be interpreted as anything other than a desperate move to avoid being crushed like most other commercial Unices. As for the BSDs, they truly are dependable and stable, but they are all way behind the curve when it comes to features and drivers for new devices, not to mention scalability. Finally, neither poses any threat whatsoever on the desktop front. So, as I said, I'm just not sure what Murphy is talking about. He must live in a parallel world somewhere. {link to this story}