[Fri Dec 31 10:13:45 CET 2010]

We got an Acer Aspire One netbook for the kids and I spent a couple of hours getting it ready before the big Christmas day. I must say that installing Ubuntu Netbook Remix 10.10 was a breeze. I didn't run into a single snafu, even though I was installing it on a netbook that came preinstalled with both Windows 7 and Android and, needless to say, the installer had to shrink the Windows partition to make some room for Ubuntu. True, I didn't do a thorough test of the install, but as far as I could see (and the kids have been using it for a few days now) everything worked: wireless connection, webcam, audio... I'm very pleased, actually. Here is a screenshot:

Yeah, I know, it's the Unity desktop environment, the same one Mark Shuttleworth would like to see as the default desktop on all Ubuntu systems. It looks nice, right? Well, yes, it does. However, I have a few issues with Unity. For starters, I honestly don't see how it is any better than GNOME on a small screen. As a matter of fact, I think it's even worse: the launcher icons on the left-hand side take up more screen real estate, not less. Second, it's not nearly as customizable. For example, I tend to use multiple profiles with Firefox, so I normally keep more than one icon on my desktop or menu bar, each set up with the right options to launch a different profile. Unity, however, doesn't let me right-click an icon and set custom launch options; there simply is no properties menu. Finally, I find it quite annoying to navigate. It's difficult to explain exactly what bothers me, but it just doesn't feel right to click an icon on the left and have a full-screen window open just so I can launch an application. Thank goodness, switching back to the old GNOME is quite easy: just log out and, right after entering your username, change the session entry from the default (Ubuntu Netbook) to Ubuntu Desktop. That's all.
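Speaking of those Firefox profiles, the trick is nothing fancy: under the regular GNOME session I simply point each launcher at a command line like the ones below. This is just a minimal sketch; the profile name "work" is only an example and has to be created first through the profile manager.

    # Create the extra profile once (opens an interactive dialog):
    firefox -ProfileManager

    # Run the second profile alongside the default one; -no-remote
    # keeps it from attaching to the instance that is already running:
    firefox -P work -no-remote &

Under Unity, as I said, there is simply no obvious place to hang those extra options on an icon.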

Now that I think about it, I did run into some issues with the USB image for Ubuntu Netbook Remix 10.10. For whatever reason, it would not boot correctly. After much searching, I found the solution here. Simply follow these steps:

  1. Once the image has been transferred to the USB thumb drive, browse to the root directory, click on the syslinux folder and open the file syslinux.cfg.
  2. Using your favorite text editor, find the following line:
      ui gfxboot bootlogo
      
  3. Change that line to the following:
      gfxboot bootlogo
      
  4. Save the file and reboot to test it. It should work fine now. (The same edit can also be scripted; see the sketch right after these steps.)
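
For the command-line inclined, here's a minimal sketch of that scripted version, assuming the stick happens to be mounted at /media/usb (adjust the path to wherever your system mounts it):

    # Keep a backup of the original config, then drop the "ui" keyword
    cd /media/usb/syslinux
    cp syslinux.cfg syslinux.cfg.bak
    sed -i 's/^ui gfxboot/gfxboot/' syslinux.cfg

Presumably the problem is that the syslinux binary written to the stick by the USB creation tool is too old to understand the newer ui keyword, so dropping it lets the boot menu come up the old way.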

Incidentally, I have a couple of comments about the other OSes that came preinstalled on the netbook. First of all, the Android implementation they bundled with it... sucks! It doesn't appear to be a complete official build, but rather some half-baked thing they quickly put together to avoid any accusations of favoritism towards Windows. As for Windows 7... what can I say? My oldest son tried changing the default language from Spanish to English and... it hosed the MBR where I had previously written GRUB. Yeah, my reaction was also: "What the...? Why would changing the language have to involve touching the MBR?" Also, at one point, when I was in a frenzy trying to get things configured properly after the install, late at night after a long day at work, I logged out and it proceeded to inform me that it was going to install security updates... without asking for confirmation. Why does this matter? Because there were so many security updates that I had to sit there for over an hour waiting for the damn thing to finish, with a clear message on the screen letting me know that I should not shut down the system until it was done. As I said, what pissed me off was that it didn't even ask for my permission to install them in the first place. This could very well be something related to the first-run configuration Acer put together. I don't know. Regardless, it's moronic. {link to this story}

[Thu Dec 30 18:38:19 CET 2010]

While listening to an HPC Wire podcast from November 19 about the highlights of the SC10 supercomputing conference, we hear that China has taken the lead in the supercomputing business too, beating out the US for the number one spot on the Top 500 list. As this other piece from the day before summarizes:

There are now seven petaflop supercomputers in the world. China's new Tianhe-1A system, housed at the National Supercomputing Center in Tianjin, took top honors with a Linpack mark of 2.56 petaflops, pushing the 1.76 petaflop Jaguar supercomputer at ORNL into the number two spot. At number three was China again, with the Nebulae machine, at 1.27 petaflops. Japan's TSUBAME 2.0 supercomputer is the 4th most powerful at 1.19 petaflops. And at number five is the 1.05 petaflop Hopper supercomputer installed at NERSC/Berkeley Lab. The last two petaflop entrants are the recently-announced Tera 100 system deployed at France's Commissariat a l'Energie Atomique (CEA) and the older Roadrunner machine at Los Alamos.

The assumption the people who put together the podcast make is that this is perhaps the beginning of the end of American dominance of the supercomputer market. To be fair, I'm not so sure that interpretation of the facts is correct. While it seems obvious that China is taking its new role as an emerging superpower quite seriously (and I don't think that will change in the near future, let's be clear about that too), it does not follow that the US is somehow quickly going downhill. In other words, the fact that other countries (especially China, Japan and the EU) are also getting a nice chunk of the supercomputing pie, and that the US will have to share it from now on, does not mean that US technology is lagging behind. There are two main reasons for this: first, these other countries are quite often installing technology designed by American firms; and second, all it really shows is that the US can no longer expect to monopolize the top ten positions in this particular market, which is pretty normal. The advance of other nations does not mean that the US is falling behind; it only means that the other nations are getting better. That's all. Let's not blow things out of proportion, please. {link to this story}

[Tue Dec 28 16:18:33 CET 2010]

Aza Raskin, a Mozilla developer, is proposing a new policy that uses privacy icons to show whether or not a particular website respects the privacy of its users' data. The icons cover pretty much everything, as far as I can see: whether the data is used for purposes not intended by the end user, whether it is sold to other parties, handed over to advertisers, archived somewhere indefinitely, etc. As Raskin explains it:

Privacy policies and Terms of Services are complex documents that encapsulate a lot of situation-specific detail. The Creative Commons approach is to reduce the complexity of sharing to a small number of licenses from which you choose. That simply doesn’t work here: there are too many edge-cases and specifics that each company has to put into their privacy policy. There can be no catch-all boiler-plate.

Here's the solution: Have Privacy Icons "bolt on to" an existing privacy policy. When you add a Privacy Icon to your privacy policy it says the equivalent of "No matter what the rest of this privacy policy says, the following is true and preempts anything else in this documen...". The Privacy Icon makes an iron-clad guarantee about some portion of how a company treats your data. This method means that without ever having to delve into the details, everyday people can glance at the simple icons atop a privacy policy to know if and how their data is being used. At the same time, it gives companies the flexibility required to create comprehensive and meaningful policies.

He also goes on to explain why websites would have a reason to use the icons: basically, they can use them to differentiate themselves from the competition. However, what I truly don't understand about the whole scheme is what prevents any particular corporation from lying about it and displaying the icon on their website while turning around and selling the data to someone else anyway. I mean, is this self-policed? Who decides whether or not a website truly deserves the icon? Am I really to believe that large corporations are going to agree to be audited to make sure they truly protect the privacy of their users? I just don't see it happening. Yet, it seems to me, the proposal relies precisely on that: it asks me (i.e., the end user) to trust the corporations. What changes then? {link to this story}

[Thu Dec 16 16:34:01 CET 2010]

Now, this is sort of interesting. Theo de Raadt has sent out an email warning of a possible backdoor planted in OpenBSD's IPsec stack by none other than the US Government. According to his explanation, he received an email from one Gregory Perry, who was the CTO of a company that did some work on that code a while back. The claims are quite amazing, but perfectly believable these days, of course:

My NDA with the FBI has recently expired, and I wanted to make you aware of the fact that the FBI implemented a number of backdoors and side channel key leaking mechanisms into the OCF, for the express purpose of monitoring the site to site VPN encryption system implemented by EOUSA, the parent organization to the FBI. Jason Wright and several other developers were responsible for those backdoors, and you would be well advised to review any and all code commits by Wright as well as the other developers he worked with originating from NETSEC.

This is also probably the reason why you lost your DARPA funding, they more than likely caught wind of the fact that those backdoors were present and didn't want to create any derivative products based upon the same.

This is also why several inside FBI folks have been recently advocating the use of OpenBSD for VPN and firewalling implementations in virtualized environments, for example Scott Lowe is a well respected author in virtualized circles who also happens top [sic] be on the FBI payroll, and who has also recently published several tutorials for the use of OpenBSD VMs in enterprise VMWare vSphere deployments.

Yes, it could all be a false report. It could just be that Gregory Perry is even more paranoid than Theo de Raadt (I know, I know, hard to believe, isn't it?). But then, it could also be true. While some people think this is clear evidence that open source software is just as easily the target of this type of attack as any other software, I prefer to see it as a clear example of precisely the opposite: sure, open source software also has bugs, issues and even backdoors, but, while there is nothing we can do about that in the case of commercial software, at least with open source everything is done in the open and anybody with the proper knowledge can audit the code. Or, to put it a different way: if you were the Chinese Government, wouldn't you prefer to run your infrastructure on open source software that you can audit, rather than on commercial software that may have just as many backdoors but that you cannot even read? Yeah, that's what I thought too. {link to this story}

[Tue Dec 7 16:48:31 CET 2010]

Google has unveiled a preview of its much awaited netbook product based on Chrome OS. Yes, it seems to be fast, safe and easy to use. Yes, it boots in just a few seconds. It may be worth a try. And yet, I cannot shake some nagging doubts about a product like this. I mean, how many times have I found myself in a situation where my netbook couldn't connect to the wifi network? Or, better yet, how many times have I found that there was no wifi network at all? Since I run a more classic flavor of OS on my netbook, that truly wasn't a problem: I could still do plenty of things with it (well, at least I could do what I needed at that moment, such as taking notes in a meeting or reviewing documents). Like I said, though, it may be worth a try. I don't know whether Chrome OS has a stable way of working offline yet. {link to this story}

[Sat Dec 4 14:14:13 CET 2010]

It definitely looks as if the browser market has been picking up speed lately. I recently got an invitation to test RockMelt, a so-called social-media browser or, in other words, a browser specifically designed for people who spend a considerable amount of time on online social networks (see Wired's review of the product here). The browser, based on Google's Chromium open source project, is pretty nice. It combines Chrome's speed and reliability with a bunch of new features that do make it easier for the end user to handle the different social networks (Facebook, Twitter, RSS feeds, etc.). Mind you, it's perhaps nothing to write home about, but it's pretty decent. If your life is centered around the social networks, you should definitely give it a try. However, those people who have already invested plenty of hours customizing other browsers (Firefox, for instance, as is my case) may not care to switch. The truth is that, although RockMelt drew a lot of attention because it is Marc Andreessen's latest pet project (for those who don't know who he is, Andreessen was a co-author of Mosaic, the first widely used web browser, and co-founder of Netscape Communications Corp.), it doesn't provide much that is not already available in Flock, for instance. Besides, if social networking is not your key activity online, it may not be the browser for you. In any case, give it a try if you have a chance. I don't know whether there is a Linux version, since I got my invitation and happened to download the installer on my Apple laptop.

In the end, no matter what you think of RockMelt, it is sort of nice to see this much competition coming to the browser market yet again. We should not forget that it wasn't so long ago that it seemed as if Microsoft's Internet Explorer was going to become as dominant as the Windows OS itself. In case you don't remember anymore, it was Mozilla that changed all that. We should be grateful. While it lasts, at least. One never knows when the Microsoft behemoth may take over again. {link to this story}