[Tue Mar 31 13:48:07 CEST 2009]

Well, today is March 31st: Mozilla's birthday! Back on 31 March 1998, Netscape released the source code of their browser to the public. I still remember the excitement. Back then I was working the third shift for a company that provided technical support for some very well known American hardware and software manufacturers and ISPs. As a matter of fact, just a few months earlier I had finished a stint of a couple of years supporting precisely the Netscape browser with them. We were all a bunch of lowly-paid losers, for the most part, but we felt we were part of a digital revolution, smack in the middle of the whole thing. Those were the days! The night the code was released to the public, I downloaded it almost immediately to give it a go and discussed it with lots of other people over IRC (little did we know that it would take years before a release of the browser was truly usable). To celebrate, here is Code Rush, a documentary shot around that time that follows the lives of a few Netscape developers before and shortly after the code was released.

{link to this story}

[Tue Mar 31 13:36:32 CEST 2009]

I have already written in these pages about e-books. I am convinced that, sooner or later, a good part of our reading will be done by electronic means. Actually, we already do it to some extent. If anything, what's stopping us from migrating to an electronic format these days is the fact that we're still tied to our desktop computers. Still, with the widespread use of laptops and wifi, more and more people rely on digital media, and the paperless life is becoming increasingly feasible. For the record, no, I don't think the traditional book will die, just as radio and theater didn't disappear when new forms of communication became widely used. Yet, just as in those other cases, I do think that traditional newspapers, magazines and books may lose their central role in our lives, to the benefit of the e-book. Simply put, it's far more convenient to carry a single device that contains lots of different printed information than to go around with a suitcase full of printed materials. Not only that, but it's also way easier to add notes, retrieve them, search for information, feed them to a different application, upload them to a website, etc. It's far more flexible.

In any case, all this occurred to me after reading a review of the Sony Reader PRS-700 compared to Amazon's Kindle 2. They're still a bit pricey, I think, but we're quickly getting there. Wait a few more years, and they will be all over the place. {link to this story}

[Mon Mar 30 15:25:14 CEST 2009]

Everybody knows we are running out of IP addresses because the old IPv4 system was quite limited. So, a few years ago, the powers that be created IPv6 to solve that problem. Given the sheer number of devices that can now connect to the Internet, we will need the extra addresses. Yet, few people have migrated to IPv6. Why? Internet News explains why IPv6 is like broccoli:

The Internet is at a crossroads. The current IPv4 address space is nearing exhaustion, while the next-generation IPv6 addressing system dramatically expands the available address space. Yet, to date, it hasn't been widely deployed.

And despite the impending IPv4 exhaustion, the Internet Society (ISOC) has published a study in which it reported that there are no concrete business drivers for IPv6.

With all the technology has to offer, is there actually no business case for IPv6?

(...)

"IPv6 is not the question —it's the answer," Leslie Daigle, chief Internet technology officer for the ISOC, said during an IETF panel discussion on IPv6. "The question is do we want to continue to have an Internet that continues to be expanded by innovations from everywhere? In which case, we need to deploy IPv6 to continue to have global addressing."

"It's something of a broccoli technology, in that regard: It's better for you if you eat it but it's not necesarily appealing in its own right."

Nice way to illustrate the issue. But the question remains: if IPv6 is so great and will solve so many problems, how come nobody seems to care? The same article contains a clue:

Richard Jimmerson, CIO of the American Registry for Internet Numbers (ARIN) —the organization that assigns IP address space in North and South America— told the panel that he expects the free pool of IPv4 address space to run out within the next two years.

But just because there will be no more available IPv4 address space for carriers doesn't mean that IPv4 itself will stop working in two years.

"After IPv4 runs out, it's not the same as running out of oil, where there would be no cars running the next morning," said Alain Durand, director of IPv6 architecture and Internet governance in Comcast's Office of the CTO. "Everything that has been deployed will still work, so don't panic."

Precisely! The fact that the IPv4 networks will continue to run just fine after we run out of IP addresses for the carriers explains why there is no pressing need to migrate to IPv6, especially since migrating would take some serious effort. Why bother if there is no itch to scratch? Sure, it would help if we all did it. Sure, it would make things easier for everybody else, but *I* have no pressing need to do it. Who cares? The way I see it, there are only two possible solutions to the conundrum: either we make the move financially attractive or the governments impose it by decree, sort of like the switch from analog to digital TV. I don't see any other way out.
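
Incidentally, the coexistence Durand talks about is easy to see at the socket level. Here is a minimal Python sketch (my own, not from the article; the port number is arbitrary) of a dual-stack server that binds a single IPv6 socket and still accepts legacy IPv4 clients, which is precisely why nothing stops working the morning after the free pool runs out:

    # Minimal dual-stack TCP server: one IPv6 socket serves both protocols.
    import socket

    server = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    # Accept IPv4 clients too; they show up as IPv4-mapped addresses
    # such as ::ffff:192.0.2.1 (not supported on every platform).
    server.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
    server.bind(("::", 8080))  # "::" is the IPv6 wildcard address
    server.listen(5)
    conn, addr = server.accept()
    print("client connected from %s" % (addr,))

{link to this story}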

[Mon Mar 30 09:10:31 CEST 2009]

Ah, the ironies of life! Just a few years ago, there were plenty of people wondering how anybody could make money out of Linux, and now people wonder whether somebody will buy Red Hat, because it's too tempting not to buy a company that makes money in a time of recession. The thing is that Red Hat reported profits ahead of projections: revenue rose 18% to US $166 million, which led to a profit of 22 cents a share. The fact is that nobody sees them as a gamble anymore. By now, Red Hat is without a doubt a well-established company, which should make the naysayers reconsider certain things. {link to this story}

[Mon Mar 30 08:52:42 CEST 2009]

Somebody posted on Slashdot a story about the folks from The Pirate Bay using Facebook to post links to their torrent files. It sounds like an interesting legal maneuver to make their point, although there is a difference between Facebook's purpose (to create a social network online) and that of The Pirate Bay (to post links to torrent files available on the Internet, both legal and illegal). I don't think it would be so difficult to argue in court that while Facebook users can definitely post any sort of content to their pages (and the website can certainly make some efforts to remove anything that's illegal, although it's the individual who should ultimately be held responsible), things are different in the case of TPB. Simply put, its whole existence revolves around sharing links to torrent files, regardless of whether they are legal or illegal. Sure, there are arguments they can use to defend what they do, but I'm not sure a comparison with Facebook is one of them. {link to this story}

[Sun Mar 29 15:00:30 CEST 2009]

This video made me laugh out loud. Sure, you have to know about both The Matrix and Windows for it to make sense, but it's funny. It's titled The Matrix Runs on Windows, a masterpiece from the CollegeHumor folks.

{link to this story}

[Sun Mar 29 13:43:56 CEST 2009]

Science fiction writer Bruce Sterling published an interesting piece on his blog Beyond the Beyond about compression rot. You basically take a JPEG image, save it as a new JPEG with slightly more compression, repeat several hundred times, and you get the effect. It's a nice illustration of Shannon's source coding theorem.
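
The recipe is trivially easy to script, by the way. Here is a quick Python sketch using the PIL imaging library that should reproduce the effect (my own recipe, with made-up filenames; I'm assuming the starting image is original.jpg):

    # Decode and re-encode a JPEG a few hundred times to accumulate
    # compression artifacts (generation loss).
    from PIL import Image

    src = "original.jpg"
    for generation in range(300):
        img = Image.open(src)
        dst = "gen_%03d.jpg" % generation
        # Nudge the compression up a little as the generations go by.
        quality = max(20, 90 - generation // 10)
        img.save(dst, "JPEG", quality=quality)
        src = dst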


Generation Loss from hadto on Vimeo

{link to this story}

[Sat Mar 28 16:54:32 CET 2009]

Ever notice those weird and annoying characters when reading man pages on Ubuntu? The solution is quite straightforward: simply alias man to man -E ascii. As usual, you can put the alias in your own ~/.bashrc file if you want it to take effect whenever you log in.
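
In other words, the line to drop into your ~/.bashrc would be something like this:

    alias man='man -E ascii'

{link to this story}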

[Thu Mar 26 10:41:08 CET 2009]

InfoWorld published a good article on the challenges posed by multicore chips:

Adding more processing cores has emerged as the primary way of boosting performance of server and PC chips, but the benefits will be greatly diminished if the industry can't overcome certain hardware and programming challenges, participants at the Multicore Expo in Santa Clara, California, said this week.

Most software today is still written for single-core chips and will need to be rewritten or updated to take advantage of the increasing number of cores that Intel, Sun and other chip makers are adding to their products, said Linley Gwennap, president and principal analyst at The Linley Group.

Off-the-shelf applications will often run faster on CPUs with up to four processor cores, but beyond that performance levels off and may even deteriorate as more cores are added, he said. A recent report from Gartner also highlighted the problem.
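
That leveling-off is, to a large extent, Amdahl's law at work: the fraction of a program that remains serial caps the overall speedup no matter how many cores you throw at it. Here is a back-of-the-envelope sketch in Python (my own numbers, purely illustrative):

    # Amdahl's law: overall speedup given the fraction of strictly serial code.
    def speedup(serial_fraction, cores):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

    # Assume a modest 10% of the program cannot be parallelized.
    for cores in (1, 2, 4, 8, 16):
        print("%2d cores -> %.2fx" % (cores, speedup(0.10, cores)))

With just 10 percent of serial code, four cores already drop you to roughly a 3x speedup, and sixteen cores buy you barely 6.4x, which is consistent with Gwennap's observation that off-the-shelf applications stop scaling somewhere around four cores.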

The bottom line: just as when multiprocessor systems were first released to the general public, the software industry will have to adapt to these new products. It's nothing new, of course, but something to keep in mind when listening to a sales pitch. {link to this story}

[Thu Mar 26 10:16:51 CET 2009]

Now, this is interesting. Spencer Reiss tells us in Wired Science about the concept of negawatts:

Problem: It's high noon in July. At 90-plus degrees outside, the masses are jonesing for AC. But it's seriously expensive to keep the juice flowing when demand crests. Firing up turbines that sit idle 360 days a year can multiply electricity costs by a factor of 10. How to keep cool without stressing the grid?

Solution: Pay big users to cut consumption when the need arises. Many utilities already do an ad-hoc version of this, an emergency practice known as demand response that has lately been promoted by John Wellinghoff, acting chair of the Federal Energy Regulatory Commission. Now there's an alternative: Call EnerNOC, a Boston-based company that gangs commercial users who are willing, for a quarterly payment, to trim back operations on 30 minutes' notice. EnerNOC micromanages consumption at 3,400-plus locations from Maine to California. Between dimming lights, adjusting thermostats, and suspending industrial activities, the potential cuts top the output of a large nuclear reactor. And the savings can be huge. EnerNOC's cofounder, Tim Healy, points out that 10 percent of all US generating capacity exists to meet the last 1 percent of demand. Utilities paid EnerNOC $100 million last year simply to stand at the ready —insurance, in effect, against the inevitable days when every AC unit is humming.

{link to this story}

[Tue Mar 24 13:15:01 CET 2009]

Now, this is both interesting and scary. A couple of Argentinian researchers have found a way to compromise systems at the lowest level, the BIOS:

Apply all of the browser, application and OS patches you want, your machine still can be completely and silently compromised at the lowest level —without the use of any vulnerability.

That was the rather sobering message delivered by a pair of security researchers from Core Security Technologies in a talk at the CanSecWest conference on methods for infecting the BIOS with persistent code that will survive reboots and reflashing attempts. Anibal Sacco and Alfredo Ortega demonstrated a method for patching the BIOS with a small bit of code that gave them complete control of the machine. And the best part is, the method worked on a Windows machine, a PC running OpenBSD and another running VMware Player.

(...)

Sacco and Ortega stressed that in order to execute the attacks, you need either root privileges or physical access to the machine in question, which limits the scope. But the methods are deadly effective and the pair are currently working on a BIOS rootkit to implement the attack.

Sure, needing root privileges may somehow limit the scope of the problem, but not by much. After all, there are plenty of exploits out there that allow an attacker to gain those privileges. As a matter of fact, at any given time —right now, in fact!— there are thousands of machines connected to the Net that have already been compromised. Imagine if just a portion of those were infected with the type of BIOS malware described in the article. {link to this story}

[Tue Mar 24 13:02:44 CET 2009]

The review of The Age of Speed published on Slashdot is well worth checking out. The author makes some good points:

The beginning of the book deals with shedding the guilt most people associate with getting things done quickly. We are led to believe at an early age that shortcuts diminish the reward or the experience of a task. While there are some tasks where this holds true, overall it is a common myth one needs to overcome in the age of speed.

My favorite anecdote was a fresh look at the Tortoise and the Hare. The common moral one associates with this fable is "Slow and steady wins the race". But the story isn't a condemnation of speed, rather of stupidity. The Hare lost simply because he was dumb enough to take a nap in the middle of the race; in no way did his speed work against him.

(...)

Sometimes I find myself falling into a black hole of needless distractions, constantly switching between email, Twitter, Slashdot and any other diversion I reward myself with throughout the day. If I have too many distractions in a short amount of time, I'll fall into a pseudo trance of cycling through them endlessly. Afterward I'm at square one with getting back on task. Directly after reading the chapter An Exercise in Consciousness, I turned off my email auto checker. This simple change transformed my work environment from an interruptive process to one I'm in control of. By removing the interruption I don't have the temptation to succumb to distractions and I've felt much more productive.

Of course, turning off the email auto checker is something you can only do if your work doesn't rely precisely on keeping a close eye on incoming email. Besides, I find it just as useful to learn to ignore email when I'm busy with something else. Sure, the temptation is still there, but we can overcome that, right? {link to this story}

[Thu Mar 19 08:50:48 CET 2009]

I hadn't checked it in a while but, while reading about hackers recently, I came across the Wikipedia entry dedicated to Lions' Commentary on UNIX 6th Edition, a classic of computer science and a legend of the open source movement. I must have the book somewhere on my shelves, but I still have to reorganize them all since we moved to Spain a couple of years ago. In any case, the whole book is now available in PDF and other formats here. And, if you need a good introductory article about the book (other than the Wikipedia piece linked above), here is a good one published by Salon some time ago. {link to this story}

[Fri Mar 13 13:36:16 CET 2009]

Remember the Bionic Man TV show? Well, apparently, a Finnish programmer has decided to wear a USB finger after his real finger was severed in an accident. Check out the link for some pictures. It's a removable prosthetic with a USB memory stick inside. Cool, huh? {link to this story}

[Thu Mar 12 13:29:14 CET 2009]

I read in WorksWithU that Canonical is requesting help to build the Ubuntu Directory Services so that future releases of Ubuntu Server make it easier to set up and configure OpenLDAP. I'm currently involved in a project to set up a network of old desktops running Ubuntu for a non-governmental organization, and I can attest that setting up the LDAP database is definitely the worst part of the work. There is no reason why setting up directory services should be so difficult in Linux. Perhaps Canonical's efforts will help on that front.
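
Just to give a flavor of the worst part, this is the sort of minimal LDIF you end up hand-crafting before the directory is of any use. It's a sketch with placeholder values (dc=example,dc=org, the user jdoe, and the numeric IDs are all made up):

    # Base of the tree
    dn: dc=example,dc=org
    objectClass: dcObject
    objectClass: organization
    o: Example NGO
    dc: example

    # A container for user accounts
    dn: ou=people,dc=example,dc=org
    objectClass: organizationalUnit
    ou: people

    # One POSIX account, so that a Linux box can authenticate against it
    dn: uid=jdoe,ou=people,dc=example,dc=org
    objectClass: inetOrgPerson
    objectClass: posixAccount
    cn: John Doe
    sn: Doe
    uid: jdoe
    uidNumber: 10001
    gidNumber: 10001
    homeDirectory: /home/jdoe

You'd then feed it to the server with something like ldapadd -x -D cn=admin,dc=example,dc=org -W -f base.ldif, and that's before even touching the client-side configuration. {link to this story}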

[Thu Mar 12 13:05:27 CET 2009]

A recent piece published on The O'Reilly Radar website ponders whether or not iPods are changing our perception of music. To sum things up, Jonathan Berger, a professor of music at Stanford, has been running some interesting tests for a number of years now which suggest that our perception of audio quality may be changing as a consequence of our listening habits:

Berger then said that he tests his incoming students each year in a similar way. He has them listen to a variety of recordings which use different formats from MP3 to ones of a much higher quality. He described the results with some disappointment and frustration, as a music lover might, that each year the preference for music in MP3 format rises. In other words, students prefer the quality of that kind of sound over the sound of music of much higher quality. He said they seemed to prefer "sizzle sounds" that MP3s bring to music. It is a sound they are familiar with.

Our perception changes and we become attuned to what we like —some like the sizzle and others like the crackle [of old vinyl records]. I wonder if this isn't also something akin to thinking that hot dogs taste better at the ball park. The hot dog is identical to what you'd buy at a grocery store and there aren't many restaurants that serve hot dogs. A hot dog is not that special, except in the right setting. The context changes our perception, particularly when it's so obviously and immediately shared by others. Listening to music on your iPod is not about the sound quality of the music, and it's more than the convenience of listening to music on the move. It's that so many people are doing it, and you are in the middle of all this, and all of that colors your perception. All that sizzle is a cultural artifact and a tie that binds us. It's mostly invisible to us but it is something future generations looking back might find curious because these preferences won't be obvious to them.

Is it nature or nurture? As usual, chances are that it's a convenient mixture of both. My guess is that most of us cannot distinguish very good sound quality from mediocre sound quality any better than we can distinguish the quality of two different works of art. All we can say is whether we prefer one or the other, which obviously speaks to our own very personal preferences but nothing else. In other words, perhaps Berger is making the wrong assumptions altogether when he assumes that the subjects in his tests are actually capable of telling good sound from bad on some objective scale. We human beings are far more complex than that. {link to this story}

[Mon Mar 9 20:13:52 CET 2009]

Somebody posted a great story on Slashdot a few days ago on why the TV lost to the computer (the original piece can be found here). Here is a quick outline of the main arguments:

About twenty years ago people noticed computers and TV were on a collision course and started to speculate about what they'd produce when they converged. We now know the answer: computers. It's clear now that even by using the word "convergence" we were giving TV too much credit. This won't be convergence so much as a replacement. People may still watch things they call "TV shows", but they'll watch them mainly on computers.

What decided the contest for computers? Four forces, three of which one could have predicted, and one that would have been harder to.

One predictable cause of victory is that the Internet is an open platform. Anyone can build whatever they want on it, and the market picks the winners. So innovation happens at hacker speeds instead of big company speeds.

The second is Moore's Law, which has worked its usual magic on Internet bandwidth.

The third reason computers won is piracy. Users prefer it not just because it's free, but because it's more convenient. Bittorrent and YouTube have already trained a new generation of viewers that the place to watch shows is on a computer screen.

The somewhat more surprising force was one specific type of innovation: social applications. The average teenage kid has a pretty much infinite capacity for talking to their friends. But they can't physically be with them all the time. When I was in high school the solution was the telephone. Now it's social networks, multiplayer games, and various messaging applications. The way you reach them all is through a computer. Which means every teenage kid (a) wants a computer with an Internet connection, (b) has an incentive to figure out how to use it, and (c) spends countless hours in front of it.

This was the most powerful force of all. This was what made everyone want computers. Nerds got computers because they liked them. Then gamers got them to play games on. But it was connecting to other people that got everyone else: that's what made even grandmas and 14 year old girls want computers.

I don't think anyone doubts Graham's portrayal of the situation anymore. Yes, a few years ago it was already clear that convergence was going to happen, but we didn't yet know which type of media would win in the end. While I totally agree with him that the TV lost, I'd only say that the computer won with some caveats. Specifically, I'd argue that the computer itself is seeing some major changes lately. What is a computer, after all? It's anything from a major supercomputer at NASA to the iPhone, with everything in between —including netbooks, of course. In other words, the TV didn't lose so much to the computer as to digital media or, if you prefer, to the computer understood in broad terms, going far beyond the old desktop sitting on a desk with a large monitor attached. It seems clear to me that mobility is the key these days: the fact that we can watch whatever we want whenever it suits us. That's what's killing TV as we knew it. As Graham states:

Whether they like it or not, big changes are coming, because the Internet dissolves the two cornerstones of broadcast media: synchronicity and locality. On the Internet, you don't have to send everyone the same signal, and you don't have to send it to them from a local source. People will watch what they want when they want it, and group themselves according to whatever shared interest they feel most strongly. Maybe their strongest shared interest will be their physical location, but I'm guessing not. Which means local TV is probably dead. It was an artifact of limitations imposed by old technology. If someone were creating an Internet-based TV company from scratch now, they might have some plan for shows aimed at specific regions, but it wouldn't be a top priority.

In other words, the Internet-centered visual industry is perfectly suited to the new era of globalization and ubiquity we already live in. That's what killed the TV. There is no way back. We barely watch any TV at home, at least not the old-style broadcasts. On the other hand, we do download movies, newscasts and documentaries from the iTunes Store or via RSS feeds, then watch them either directly on the computer screen or by connecting the laptop to the large TV set. As for the antenna, we could pretty much do without it, to be honest. {link to this story}

[Fri Mar 6 10:24:33 CET 2009]

Here is an interesting story I just read on Slashdot. Apparently, prior to Mosaic there was another, little-known graphical browser created by a few Finnish engineers. The story is actually taken from a website called Xconomy:

The four Finns developed a graphical, point-and-click Internet browser a year before the pioneering Mosaic browser on which Netscape Communications was based: the historic Netscape IPO in August 1995 is widely credited with starting the Internet boom.

"Our 1991 X Window browser, Erwise, showed that a net browser was possible. We were ahead of the times. The next step, to commercialize it, did not happen," Kim Nyberg says.

(...)

In the US, commercialization of the browser, now so much a part of our everyday lives, began in 1994, after Marc Andreessen left the National Center for Supercomputing Applications at the University of Illinois, where he and Eric Bina had developed the Mosaic browser the previous year. Andreessen had moved to California following his December 1993 graduation and teamed up with Silicon Graphics founder Jim Clark, backed by venture capital powerhouse Kleiner Perkins Caufield & Byers, to form Mosaic Communications, later renamed Netscape Communications. Europe was quickly left out in the cold.

By the way, from the Wikipedia entry on this Finnish browser I also learned that the second graphical browser was ViolaWWW. In any case, while all this is certainly curious, it means little in the grand scheme of things. The same question we apply to inventors and scientists also applies here: what is more important, being the first to discover or invent something, or being the one who launched it to fame and made it popular? Sure, let's give credit where credit is due, but the reality is that it was Mosaic and, above all, Netscape that changed the world as we knew it. {link to this story}

[Tue Mar 3 21:10:07 CET 2009]

Now, this is an interesting piece of research. I read on the Fast Company website that a team with an odd number of members may end up being more effective:

In a study with significant implications for everyday life, two management researchers found that small groups with an odd number of members tend to work better than groups with an even number of members. The conventional wisdom, as confirmed in an initial survey, is that even numbers and even-numbered groups are better. However, in an experiment with discussion groups and in an analysis of dormitory groupings at Harvard, the researchers found that even-numbered groups resulted in either stalemate (e.g., two against two) or domination (e.g., three against one). In odd-numbered groups, disagreement often implies a swing vote (e.g., two against one), which encourages the majority to tread more carefully. One caveat is that the odd-number effect is less powerful in groups that are more diverse, because those groups are less cohesive in the first place.

If the findings are correct, conventional wisdom is proven wrong once more. I've always found it fascinating that certain types (especially the conservative ones) tend to emphasize "common sense" above all. In reality, what we call "common sense" is little more than conventional wisdom, which has been proven wrong time after time by research. The reality is that, all too often, our deep-seated beliefs are actually based on prejudice and assumptions rather than empirical evidence. In the case of this particular study, it turns out that the American Founding Fathers were right all along: it's a good thing that the Vice-President can cast a deciding vote in the US Senate to avoid a stalemate. {link to this story}

[Tue Mar 3 20:48:31 CET 2009]

I had no idea whatsoever that IBM had released Lotus Symphony as freeware until I read about it in an InfoWorld story comparing office suites. It may be worth a try as a possible alternative to OpenOffice, which feels too heavy at times (especially on older hardware). {link to this story}