[Thu Aug 28 14:29:49 CDT 2008]

Speaking of what's to come, Information Week recently published an interview with Ray Kurzweil where the internationally known futurist had this to say about videoconferencing:

I think the next decade, the teen years, we'll be routinely visiting with each other like you and I are doing now in real reality, even if we're hundreds of miles apart and not just as a grainy postage stamp sized video conferencing image on your screen but as a full immersion experience where we really seem to be with the other person. It will be a full immersion visual and auditory virtual reality. We'll have images beamed into our retinas from our eyeglasses. We'll be online all the time. The electronics will be woven into our clothing and in our belt buckles and we'll routinely be visiting with other people in these full immersion environments.

We'll have augmented reality so we'll see real reality but there will be an overlay of virtual reality on top of that helping guide us through the real world. It will direct us inside and outside, not with a navigation system on a small screen that we carry in our palm, but one that's actually built into our field of view. We'll look at a person and it will remind us who they are and give us background about them. That will be very helpful. I mean how many times are you at a cocktail party where you see someone and think you know who they are but you're not quite sure. It would be great to get that confirmed and we will have technology like that. It'll help the elderly but it will help all of us. You don't have to be 80 years old to have a senior moment.

Yes, it sounds pretty far out there, but then... who would have thought 20 years ago that by now most Americans would no longer be getting their news from TV, preferring instead to connect to a thing called the Internet, which is nothing but a huge worldwide network of interconnected computers? And who, around that same time, wouldn't have dismissed the idea of routinely videoconferencing with our relatives as science fiction? Kurzweil may be closer to "getting it" than we think. I, for one, have little doubt that small devices will take over in just a few years, making it far easier for us to use data "on the go" and bringing us something similar to the intermingling of the virtual with the real that Kurzweil is talking about. {link to this story}

[Thu Aug 28 12:26:08 CDT 2008]

The Mozilla folks have been working on some cool stuff for a while now. Their latest nice idea was to set up the Mozilla Labs website, a sandbox where developers can publicize whatever tools they are building on top of the Mozilla development framework. By now, there are a few really interesting extensions available on the site: Ubiquity, which allows regular people to put together mashups; Prism, a tool that makes it possible to run web applications outside of the browser itself; Snowl, a plugin for Thunderbird that consolidates different messaging protocols in a single app; etc. Finally, we have their recent call for participation in a Concept Series, where they ask people to let their imaginations run wild and come up with prototypes of what the future browser may look like. Some of the videos included on the page are quite intriguing. Here is an example:


Aurora (Part 1) from Adaptive Path on Vimeo.

{link to this story}

[Thu Aug 28 12:06:17 CDT 2008]

I spent some time perusing Ray Kurzweil's The Age of Spiritual Machines last weekend, and came across the following musings about the concept of technology:

Technology is often defined as the creation of tools to gain control over the environment. However, this definition is not entirely sufficient. Humans are not alone in their use or even creation of tools. Orangutans in Sumatra's Suaq Balimbing swamp make tools out of long sticks to break open termite nests. Crows fashion tools from sticks and leaves. The leaf-cutter ant mixes dry leaves with its saliva to create a paste. Crocodiles use tree roots to anchor dead prey.

What is uniquely human is the application of knowledge —recorded knowledge— to the fashioning of tools. The knowledge base represents the genetic code for the evolving technology. And as technology has evolved, the means for recording this knowledge base has also evolved, from the oral traditions of antiquity to the written design logs of nineteenth-century craftsmen to the computer-assisted design databases of the 1990s.

I couldn't help but see a direct relationship between this sharing of knowledge that Kurzweil is referring to and the open source philosophy. How else could the technological singularity (and all sorts of other things predicted by Kurzweil in his book) become a reality without this precondition? Yes, competition brings about innovation and change, but only when we have a common ground —a very well laid out foundation— to work from. And that foundation tends to be a given, free infrastructure. That's what open source can provide: the very low entry point, the free foundation upon which to build other (great) things. Things like the operating system, the browser or the email client were quite revolutionary at the beginning of our computer age, but by now they should be little more than plumbing, infrastructure, a given. The real cool stuff is what's built on top of that. {link to this story}

[Tue Aug 26 12:05:32 CDT 2008]

I just came across a very interesting article published by Computer World. Preston Gralla writes an opinion piece titled Why Google has lost its mojo —and why you should care that is definitely worth reading. Yes, I know, at first I also thought the same: "yet one more journalist trying to shock readers with some wacky title and little substance". Far from it, though: Gralla does spell out some good reasons to explain why he thinks Google may be losing its mojo. First of all, Gralla tells us how it is becoming increasingly clear that Google's upper management sees childcare and other employee benefits as a luxury that employees should enjoy without much complaint, even if the company decides to increase the fees... but more on this later. Second, the stock price has plummeted at a higher rate than the Nasdaq or the Dow —Google stock has lost about 34% of its value since November 2007. However, the most important reason to think that something may be failing at Google after all is the argument that its own engineers seem to care more about the coolness factor than about how effective or stable the services and features are. Sure, this attitude has always been ingrained in dot-com businesses, at least ever since Netscape released its browser. The mantra has always been: release, release, release, even if it's too soon and the product is not finished yet. It's part of the new economy and the Internet age, right? Well, yes and no. True, this cool engineering has always been a hallmark of startups in the software business. However, Google is not a startup anymore, and they are no longer selling just a cool new product but rather hosted services and the like. That is a completely different world.
People are willing to put up with bugs and sloppy development in new products that are supposed to be on the cutting edge anyway, but definitely not in enterprise products that are part of the infrastructure. Incidentally, it's also this overall sloppiness that has prevented Linux from spreading in the desktop market so far, with the exception of precisely those areas where people are more willing to put up with a small problem here and there: geeks, technically adept people, programmers, students, people who care about the coolness factor, etc.

In any case, what I liked the most about Gralla's article was not so much the article itself, but the link it included to Sergey Solyanik's blog post explaining why he decided to leave Google and return to Microsoft. The post is full of sensible comments about what works and what doesn't work at Google. He starts by mentioning what he liked about Google:

Among them is the peer-based review model where one's performance is determined largely based on peer comments, and much less so based on the observations of the manager. The idea that a manager is far easier to fool than the co-workers are is sound and largely works. A very important side-effect that this model produces is an increased amount of cooperation between the people, and generally better relationships within the team.

The wide employee participation in corporate governance through a concept called "Intergrouplets" is a good one and merits emulation. Unlike most other companies where internal life is regulated largely by management, a lot of aspects of Google are ruled by committees of employees who are passionate about an issue, and are willing to allocate some of their time to have this issue resolved. Many things, such as quality of code base, testing practices, internal engineering documentation, and even food service are decided by intergrouplets. Of course, this is where 20% time (a practice where any Googler can spend one day a week working on whatever he or she wants) plugs in well, for without available time there would have been nothing to allocate.

Doing many things by committee. Hiring and resource allocations at Google are done by consensus of many players. If you are to achieve anything at Google, you must learn how to build this consensus, or at least how to not obstruct it. This skill comes in very handy for every other aspect of work.

Free food. More than just a benefit, it is a tool for increasing communications within the team, because it's so much easier to have team lunches. I don't think making Redmond cafeterias suddenly free would work (maybe I am wrong), but giving out free lunch coupons for teams of more than 3 people from more than one discipline to have lunch together —and at the same time have an opportunity to communicate— I think, has a fair chance of success.

So, if there were so many things that Solyanik loved about the company, why did he leave? Here, the post is very interesting too:

First, I love multiple aspects of the software development process. I like engineering, but I love the business aspects no less. I can't write code for the sake of the technology alone —I need to know that the code is useful for others, and the only way to measure the usefulness is by the amount of money that the people are willing to part with to have access to my work.

Sorry open source fanatics, your world is not for me!

Google software business is divided between producing the "eye candy" —web properties that are designed to amuse and attract people— and the infrastructure required to support them.

Some of the web properties are useful (some extremely useful —search), but most of them primarily help people waste time online (blogger, youtube, orkut, etc.).

All of them are free, and it's anyone's guess how many people would actually pay, say $5 per month to use Gmail. For me, this really does make the project less interesting if people are not willing to pay for it.

The orientation towards cool, but not necessarily useful or essential software really affects the way the software engineering is done. Everything is pretty much run by the engineering —PMs and testers are conspicuously absent from the process. While they do exist in theory, there are too few of them to matter.

On one hand, there are beneficial effects —it is easy to ship software quickly. I've shipped 3 major features (a lot of spell checker and other stuff in the latest GMail release, multi-user chat in GMail, and road traffic incidents in Google Maps), and was busy at work on my fourth project in just a year. You can turn really quickly when you don't have to build consensus between 3 disciplines as you do at Microsoft!

On the other hand, I was using Google software —a lot of it — in the last year, and slick as it is, there's just too much of it that is regularly broken. It seems like every week 10% of all the features are broken in one or the other browser. And it's a different 10% every week —the old bugs are getting fixed, the new ones introduced. This across Blogger, GMail, Google Docs, Maps, and more.

This is probably fine for free software, but I always laugh when people tell me that Google Docs is viable competition to Microsoft Office. If it is, that is only true for the occasional users who would not buy Office anyway. Google as an organization is not geared —culturally— to delivering enterprise class reliability to its user applications.

While a few of Solyanik's comments can definitely be argued against —why should money be the only way to show appreciation for a piece of software, for instance? Or, to what extent is it fair to compare web applications, which as we all know have a much faster development time and turnaround than your regular business applications, to something like Microsoft Office?— the fact remains that he does make a few valid points worth considering. {link to this story}

[Fri Aug 22 16:39:14 CDT 2008]

Matt Hartley publishes an interesting article about the Linux desktop as a service on IT Management. It's mainly about the Zonbu mini, a cool device that combines a web-centric service with a small form factor PC running Linux. But what truly interested me in Hartley's article was the following paragraph:

Clearly defined list of compatible peripherals. This is an issue that plagues Linux in general. Not so much because of a lack of support, because in reality, peripheral support is pretty good. Rather, the fact that a user has to "research" what works and what does not. Clearly, this can be fixed by simply providing these peripherals from a simple online store. Let the company wanting to sell Linux as a service do the research, not some poor user simply looking to get a document printed.

After reading this, it occurred to me that there is nothing stopping Canonical —the company behind Ubuntu— from doing something like this. They could set up an e-commerce section on their own website after reaching an agreement with some major retail store, providing easy access to peripherals (printers, PDAs, iPods, digital cameras, etc.) that are guaranteed to be fully compatible with their OS. I think this would be a winner, a classic win-win situation: users win because they know where to go to find devices that are sure to work with their Ubuntu systems, instead of wasting time researching on their own; as for Canonical, I'm sure they would manage to increase Ubuntu's sales —or its market penetration— by doing this, and they could take a cut of these sales, of course. Finally, Linux itself could make some serious inroads on the desktop by removing a clear barrier to entry that worries so many new users. {link to this story}

[Fri Aug 22 14:53:01 CDT 2008]

OK, time to share another nice roster of nifty Linux applications I recently came across. First of all, I loved 10+ essential programs for the terminal junkie, published on the Binary Codes blog (a really nice blog, incidentally). I already knew of some of the applications (screen, vim, elinks, LaTeX, irssi...) but had not heard of alpine, ncmpc, mpd, or rTorrent. I'll have to give them a try. But that's not all. There's more. System administrators will love MultiTail, a little tool that allows you to view one or multiple files the way the traditional tail command does. Especially good, obviously, for system administrators who need to keep a close eye on logs here and there. Check out the screenshots for some good ideas. Then, I've been using clive quite a bit lately. It allows you to quickly (and easily) download videos from YouTube and have them automatically converted to the MP4 format, so you can easily upload them to your own iPod. Finally, we have gcalcli, a nifty utility written in Python that allows you to read and write to your own Google Calendar. {link to this story}

[Thu Aug 21 15:31:31 CDT 2008]

I'm sure I cannot be the only person receiving invitations from friends to join this or that social networking website. Yes, I joined Facebook some time ago, and find it quite interesting and even useful. Not only does it provide an easy way to keep track of what my friends are up to, but it also helped me connect with old friends, make some new ones, and have some very interesting discussions online about several topics. Heck, I even used it to exchange opinions in the run-up to the national convention of a political party I am a member of. We discussed different amendments to the documents that were to be debated and even coordinated some proposals before the convention took place. I find that to be quite useful. However, I continue to receive invitations to join services like Orkut and Hi5, which makes me wonder at what point the whole thing stops making any sense. I mean, nobody (I'd hope) has time to keep up with so many different social networking websites. Once we accept that, we also have to realize that joining multiple networks simply defeats the purpose. The main idea, after all, is to have everyone (or as many people as possible) on the same page, so to speak. The moment people spread across multiple websites, social networking loses a good part of its appeal. I don't see how this can be fixed unless these companies agree on some sort of standard that binds them together: a protocol that allows users to move between one network and another seamlessly. Surely there must be a way to achieve this in this Web 2.0 era. {link to this story}

[Wed Aug 20 10:54:48 CDT 2008]

The GoboLinux folks recently had an interesting discussion about the Filesystem Hierarchy Standard (FHS) that is worth mentioning here. But first, some background information: GoboLinux is a Linux distribution that attempts to replace the traditional Filesystem Hierarchy Standard with a different filesystem structure that is meant to be more reasonable and easier to use. The way this is accomplished is by leaving the underlying standard FHS in place and building on top of it a new structure of symlinks to the regular filesystem. Someone recently suggested changing the approach and going at it the other way around: make the regular filesystem a set of symlinks to the GoboLinux filesystem structure. In any case, since I don't use GoboLinux (most of us don't), the actual discussion doesn't have any direct effect on me. What I did find interesting, though, were some of the arguments used against the FHS:

There is no real consistency. Why do some programs have directories under /etc and others do not? It makes no sense. Why did the FHS make an exception for X11R6 in /usr? Why does any /usr vs. /usr/local debate even exist WHILE keeping the /opt distinction? Why am I forced to keep the distinction of /bin vs. /sbin? Do I need to use the FHS suggested way for /boot? What if I choose to use only one partition anyway, and if I am the sole user of my system, where in an extreme case I would not even need ANY new user at all?

It's a layer of ugliness upon ugliness.

But that's not all. The piece continues:

Shevegen notes several other problems with the FHS, such as the inability to run different versions of the same program side-by-side. "One huge problem of the FHS is that it does not easily allow one to have multiple versions of a program installed. This is what has led to the whole .so.1.2.3 mess as well," he explains. "It is a reason why on a typical Debian system one finds a NEW symlink called ruby under /usr/bin which points to a ruby1.8 symlink (or vice versa). If one compiles ruby from source, he does not get any such arbitrary symlinks."
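The versioned-binary-plus-symlink pattern Shevegen describes is easy to picture with a small sketch. This is a hypothetical illustration in a temporary directory (the names `ruby1.8` and `ruby1.9` are made up for the example); it is not how Debian's packaging actually manages /usr/bin/ruby:

```python
import os
import tempfile

# Sketch of the FHS-style workaround for multiple installed versions:
# each version lives under a version-suffixed name, and an unversioned
# symlink picks the current default.
prefix = tempfile.mkdtemp()
bindir = os.path.join(prefix, "bin")
os.makedirs(bindir)

# "Install" two versions side by side under version-suffixed names.
for version in ("1.8", "1.9"):
    with open(os.path.join(bindir, "ruby" + version), "w") as f:
        f.write("#!/bin/sh\necho ruby " + version + "\n")

# The unversioned name is just a symlink to whichever version is the default.
default = os.path.join(bindir, "ruby")
os.symlink("ruby1.8", default)
print(os.readlink(default))  # ruby1.8

# Switching the default means repointing one symlink; both versions
# remain installed and runnable under their full, versioned names.
os.remove(default)
os.symlink("ruby1.9", default)
print(os.readlink(default))  # ruby1.9
```

This is exactly the "arbitrary symlink" the quote complains about: the version choice lives in a single mutable link rather than in the directory structure itself, which is the part GoboLinux's per-version directories set out to fix.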

While I can see all these problems and shortcomings, the reality is that I still find the FHS on Unix-like systems to be a step ahead of Windows, for instance, which tends to scatter files all over the place, making it quite difficult to perform simple operations such as backups. So, yes, the FHS is outdated (and yes, it could be improved, and there are people working on it), but overall it does things pretty well, I think. {link to this story}

[Tue Aug 19 14:28:00 CDT 2008]

IT Management carries an interesting piece celebrating the 15th anniversary of the Debian project from which I'd stress the following comment:

The popularity of Ubuntu, Murdock suggests (as well as, he might have added, the popularity of specialized Debian-derived distributions such as Knoppix and Damn Small Linux) may very well mean that Debian's role is changing. Instead of being the distribution of choice for many users, the project may be evolving into an upstream supplier for other, more user-focused distributions. The reliability of its packages, as well as the fact that its package format has not fragmented to anything like the extent that the .RPM format has, could make Debian well-suited to this role.

It certainly does look that way. Debian is so stable and provides such a wonderful technology to build upon that many of the most popular Linux distributions choose it as their underlying infrastructure. It makes sense. Incidentally, for the very same reasons I also like Debian for my servers. It doesn't make me feel as if somebody is constantly pulling the rug from under my feet, as tends to happen with other distributions that pay far more attention to the coolness factor and are therefore subject to major upgrades and changes all the time. {link to this story}

[Mon Aug 11 14:52:07 CDT 2008]

A few interesting tidbits of information I came across lately. First of all, a short note about the pv (pipe viewer) command, which allows you to, for example, see the rate at which a shell pipe is transferring data. I'm not sure how often this is useful, but who knows? There have been instances in the past when I truly needed to find out something as obscure as that in order to troubleshoot a hard problem I had at hand.
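The core idea behind pv is simple: sit in the middle of a stream, count the bytes passing through, and report a rate. Here is a toy Python sketch of that idea (the `copy_with_rate` helper is hypothetical, not pv's actual implementation, and it skips pv's progress bar and ETA display):

```python
import io
import time

def copy_with_rate(src, dst, chunk_size=64 * 1024):
    """Copy src to dst in chunks; return (bytes_copied, bytes_per_second).

    A toy version of what pv does between two ends of a pipe.
    """
    start = time.monotonic()
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    # Guard against a zero elapsed time on very fast copies.
    elapsed = max(time.monotonic() - start, 1e-9)
    return total, total / elapsed

# Example: push 1 MiB through the "meter".
src = io.BytesIO(b"x" * (1024 * 1024))
dst = io.BytesIO()
copied, rate = copy_with_rate(src, dst)
print(copied)  # 1048576
```

In real use, pv does this between stdin and stdout, so it can be dropped into the middle of any existing pipeline without changing the data flowing through it.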

Second, DesktopLinux recently published an article on a particular piece of WiFi software that has been ported to Linux. I still haven't had time to install it and give it a try, but it does look nice. I must say that, while in general Ubuntu hasn't given me many problems with WiFi, it's not as if there are many nice apps out there to manage and configure it. It still feels like a clear work in progress. Yes, it works. But no, it doesn't feel very integrated into the overall desktop experience from a regular user's perspective.

Finally, a piece from Enterprise Storage about NFS 4.1:

pNFS is a key feature of NFS 4.1. The p in pNFS stands for parallel, and pNFS will provide parallel I/O to file systems accessible over NFS. It enables the storage administrator to do things like stripe a single file across multiple NFS servers. This is equivalent to RAID 0, which boosts performance by allowing multiple disk drives to serve up data in parallel. pNFS takes the concept and extends it to multiple storage devices connected to the NFS client over a network.
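The RAID 0 analogy in the quote boils down to simple arithmetic: stripe unit i of a file lands on server i mod N, so N servers can serve the file in parallel. Here is a toy sketch of just that layout arithmetic; the real pNFS layout protocol (layouts, devices, commit semantics) is far more involved:

```python
def stripe(data: bytes, num_servers: int, stripe_unit: int):
    """Split data into stripe units and deal them round-robin to servers."""
    servers = [bytearray() for _ in range(num_servers)]
    for offset in range(0, len(data), stripe_unit):
        unit_index = offset // stripe_unit
        servers[unit_index % num_servers] += data[offset:offset + stripe_unit]
    return servers

def unstripe(servers, stripe_unit: int) -> bytes:
    """Reassemble the original byte stream from the per-server pieces."""
    # Cut each server's blob back into its stripe units...
    chunks = [[s[i:i + stripe_unit] for i in range(0, len(s), stripe_unit)]
              for s in servers]
    # ...then interleave them round-robin, the reverse of stripe().
    out = bytearray()
    for row in range(max(len(c) for c in chunks)):
        for c in chunks:
            if row < len(c):
                out += c[row]
    return bytes(out)

data = bytes(range(256)) * 40  # 10,240 bytes of sample "file" data
pieces = stripe(data, num_servers=4, stripe_unit=1024)
# 10 stripe units dealt across 4 servers: two servers get 3, two get 2.
print([len(p) for p in pieces])  # [3072, 3072, 2048, 2048]
```

The performance win is the same as RAID 0's: a client reading the file can fetch from all four servers concurrently instead of funneling everything through one.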

We'll see how it plays out. New NFS features tend to be quite finicky. {link to this story}

[Fri Aug 8 18:18:54 CDT 2008]

Difficult as it may be to believe, some companies out there still behave as if the open source disruption had never happened. Take, for instance, Hasbro's reaction to the Scrabulous craze that spread throughout Facebook. Lots of Facebook users played a digital version of the old Scrabble game, with some perhaps even discovering the joys of the board game for the first time, and what does Hasbro do? They decide to take the creators to court. End result? Lots of people angry at the company and, in the end, the people who created Scrabulous sidestepped the legal trouble by making a few cosmetic changes and re-releasing it under the name Wordscraper. Is this a win for the company? Hardly. Did they piss off many fans? Lots. Is there any chance the company would lose money over this issue because people would just stop buying the traditional board game? I find that difficult to believe. Actually, most of the people who played Scrabulous also own one or more copies of the board game. Not only that, but they only played the digital version because it let them play with distant friends and relatives, something that's hardly practical with the real life version. All in all, I think Hasbro ended up mishandling the issue. {link to this story}