[Mon Jun 30 13:10:08 CDT 2008]

Jim Rapoza publishes an article on eWeek about the different apps available in Google Labs. I must say I admire Google's capacity to continue innovating in spite of being a large and successful company. How do they do it? By unleashing the power of creativity inside their employees' heads. Sure, Google doesn't hire your regular mediocre engineer out there. Most of their employees are the brightest of the bright. However, there is also something to be said in favor of a management strategy that allows those employees to thrive and create, instead of suffocating them in paperwork and processes. True, this management style cannot guarantee success (anybody remember Netscape?) but then... what management style can? At the very least, they get the most bang for their buck.

I also read elsewhere that Google plans to add social networking features to its iGoogle homepage service. The new apps will be based on the OpenSocial API. Interesting stuff. {link to this story}

[Sat Jun 28 13:57:13 CDT 2008]

I really enjoyed watching Jill Bolte Taylor's talk about her own first-hand experience of going through a stroke, available on the TED Talks website (an excellent site, by the way). It is highly recommended.

Incidentally, while listening to her description of what a stroke is like, I couldn't help but wonder how many similar cases may have been mistaken for mystical experiences in the past. After all, what she went through is eerily reminiscent of an out-of-body experience. I imagine it's one more reason for an agnostic like me to feel skeptical about certain things. {link to this story}

[Fri Jun 27 16:14:07 CDT 2008]

Wired magazine publishes excellent articles every now and then. By this I mean truly good pieces full of insights and ideas. Chris Anderson published an article titled The Long Tail back in the October 2004 issue that changed lots of things in the industry. Well, he now publishes The End of Theory: The Data Deluge Makes the Scientific Method Obsolete, another very original and thought-provoking piece of work.

"All models are wrong, but some are useful".

So proclaimed statistician George Box 30 years ago, and he was right. But what choice did we have? Only models, from cosmological equations to theories of human behavior, seemed to be able to consistently, if imperfectly, explain the world around us. Until now. Today companies like Google, which have grown up in an era of massively abundant data, don't have to settle for wrong models. Indeed, they don't have to settle for models at all.

Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition. They are the children of the Petabyte Age.

Does it sound far-fetched? Well, yes, Wired magazine is known for loving this sort of shocking prediction. Sometimes they pan out, and other times they don't. However, my experience has been that, for the most part, even when they are wrong, they do have at least a point. In other words, even if things don't work out the way they predict, they at least get pretty darn close to signaling the trends of what's coming. The way things are, I'd say that's quite good.

Here's how Chris Anderson defines the Petabyte Age:

The Petabyte Age is different because more is different. Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored on disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to —well, at petabytes we ran out of organizational analogies.

At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics. It calls for an entirely different approach, one that requires us to lose the tether of data as something that can be visualized in its totality. It forces us to view data mathematically first and establish a context for it later. For instance, Google conquered the advertising world with nothing more than applied mathematics. It didn't pretend to know anything about the culture and conventions of advertising —it just assumed that better data, with better analytical tools, would win the day. And Google was right.

(...)

This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.

Thought-provoking indeed! But, according to Anderson, the biggest changes should take place in the world of scientific research:

The big target here isn't advertising, though. It's science. The scientific method is built around testable hypotheses. These models, for the most part, are systems visualized in the minds of scientists. The models are then tested, and experiments confirm or falsify theoretical models of how the world works. This is the way science has worked for hundreds of years.

Scientists are trained to recognize that correlation is not causation, that no conclusions should be drawn simply on the basis of correlation between X and Y (it could just be a coincidence). Instead, you must understand the underlying mechanisms that connect the two. Once you have a model, you can connect the data sets with confidence. Data without a model is just noise.

But faced with massive data, this approach to science —hypothesize, model, test— is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture, but it is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses —the energies are too high, the accelerators too expensive, and so on.

(...)

There is now a better way. Petabytes allow us to say: "Correlation is enough". We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.

If Anderson is right, we'd be entering a new era, clearly dominated by mathematics and algorithms. {link to this story}

[Fri Jun 27 15:48:51 CDT 2008]

While reading up on Quentin Zervaas' book Practical Web 2.0 Applications with PHP, I came across a reference to script.aculo.us. I had never heard of this project before, but it looks pretty cool: a JavaScript library released under the MIT license that provides dynamic visual effects and user interface elements. Nice.
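To give an idea of what the library looks like in practice, here is a minimal, hypothetical sketch (the file paths and the element id are made up for the example, and it assumes the Prototype and script.aculo.us files have already been copied into the site):

    <script type="text/javascript" src="js/prototype.js"></script>
    <script type="text/javascript" src="js/scriptaculous.js"></script>

    <div id="notice">Your changes have been saved.</div>

    <script type="text/javascript">
      // briefly highlight the notice, then fade it out over two seconds
      new Effect.Highlight('notice');
      new Effect.Fade('notice', { duration: 2.0 });
    </script>

The library also ships with drag-and-drop helpers (Draggable, Sortable) and in-place editing controls, all built on top of Prototype. {link to this story}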

[Fri Jun 27 13:48:26 CDT 2008]

Browsing around here and there I came across an old article published by Wired about SGI fans that, as an SGI employee, makes me feel good. As they say:

In California there's a computer manufacturer that makes powerful machines beloved by a tiny niche of creative users, featuring a media-oriented Unix operating system and stunning industrial design. But it's not Apple Computer.

The company is Silicon Graphics Incorporated, or SGI, which once was famous for its high-powered graphics and 3-D workstations but has fallen on hard times of late.

(...)

The distinctive bright-blue computer was intended as the answer to designers' prayers —all-powerful, Internet-oriented and with a wide selection of creative software running on Unix.

Like the iMac, the Indy was used for video editing, graphics and web design. The machine included support for video input, and SGI supplied a high-quality webcam suitable for video conferencing.

Networking was simple, and the machine came with a built-in adapter for then-popular ISDN communications.

The operating system was a friendly version of Unix —one of the first to feature a graphical user interface rather than a command line.

(...)

Unlike Apple, SGI never intended its machines to be sold to the public. Instead, SGI machines were sold at the corporate level.

At one point, SGI attempted to force hobbyist websites into running a legal disclaimer that they were in no way connected with the company. Some closed down rather than agree to the lawyers' conditions.

Now, the company at best tolerates the hobby community, turning a blind eye to sales of secondhand software, which is forbidden by user agreements.

But the numbers are small. If the Mac community is dwarfed by the Microsoft horde, the number of SGI users amounts to a rounding error. Mac users may form a cult, but the SGI community is the tech equivalent of the Pre-Reformation Moravian Church —unknown, tiny and years ahead of its time.

Well, we don't do workstations anymore (at least, it's not our main market), but we are still there, at the forefront of the technology world. SGI has plenty of extremely bright people, and I still enjoy working here every day. {link to this story}

[Thu Jun 26 16:29:08 EDT 2008]

While reading Wikipedia's article on supercomputers I came across an interesting graph showing the operating systems running on the top 500 supercomputers. Starting sometime in January or February 2004, Linux took off and surpassed UNIX. Ever since, the distance has widened. {link to this story}

[Thu Jun 26 14:54:34 CDT 2008]

Since I am working from Minnesota this Summer and the house where I am staying has an Apple AirPort, I had little choice but to use WPA on my Ubuntu Dapper Drake laptop. While doing this on Hardy Heron is pretty easy, the same cannot be said of Dapper. One only has to search around the Web to find discussions like this. In the end, after trying quite a few different solutions, the only one that truly did the trick was the one described in the article published by DebianAdmin under the title Enable WPA Wireless access point in Ubuntu Linux. The most important thing, it seems, is to make sure you remove any reference to other network interfaces from the /etc/network/interfaces file. At least, when I followed all the other steps but skipped that one, it didn't work for me. By the way, the GNOME Network Manager is pretty nice. I had never used it before.
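For what it's worth, here is a rough sketch of what such a stripped-down /etc/network/interfaces can look like (a sketch only: the interface name, SSID and passphrase are placeholders, and the exact wpa-* options available depend on the version of wpasupplicant installed):

    # only the loopback and the wireless interface are declared;
    # any leftover stanza for other interfaces (e.g. eth0) has been removed
    auto lo
    iface lo inet loopback

    auto eth1
    iface eth1 inet dhcp
            wpa-driver wext
            wpa-ssid MyHomeNetwork
            wpa-psk MySecretPassphrase

After editing the file, restarting networking (sudo /etc/init.d/networking restart) or simply rebooting picks up the changes. {link to this story}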

[Wed Jun 25 15:17:51 CDT 2008]

Sun's Chief Open Source Officer, Simon Phipps, has finally come out and acknowledged what pretty much everybody already knew: that Sun has screwed up in its relationship with the open source community. According to his statements to Builder AU:

Open source developers have been much more sceptical of Sun, a lot of open source developers don't remember the fact that Sun was pretty much the first open source start-up in 1982. All they can remember is what happened in 2001/2002 when, to be quite frank with you, we screwed up. We alienated a large group of open source developers by the attitudes we had of the community back then.

He is absolutely right. However, anybody who paid close attention to business trends could already tell this back in 2001. So, what happened? Yes, the company contributed a lot to open standards (mind you, this is not the same as open source, although Sun executives always try to blur the distinction in people's minds), which makes it even more puzzling that they behaved the way they did in 2001/2002. By contrast, Netscape was always much more open to the idea of cooperating with the open source community. {link to this story}

[Tue Jun 24 14:18:45 CDT 2008]

I certainly know quite a few people who will not like Microsoft's announcement denying Windows XP a last-minute reprieve. As Computer World puts it:

Microsoft yesterday laid to rest rumors that it might reconsider pulling Windows XP from retail shelves and from most PC makers next Monday.

In a letter to customers, Bill Veghte, the senior vice president who leads Microsoft's online and Windows business groups, reiterated that June 30 would be the deadline when Microsoft halts shipments of boxed copies to retailers and stops licensing the operating system directly to major computer manufacturers, or OEMs (original equipment manufacturers).

I have met way too many people on both sides of the Atlantic who dislike Vista, and they will now be left with no other choice if they want to stick with Windows. The few times I had a chance to use the OS on somebody's computer, I truly hated it. It got in my way when performing the most basic operations. As a matter of fact, in at least one case I had to walk away without being able to use a thumb drive because the OS stubbornly insisted on treating an executable that came with the flash drive by default (as tends to happen with most of them) as a serious security threat, and there was no easy way to override it as far as I could see. {link to this story}

[Mon Jun 16 14:59:17 EDT 2008]

Last night we arrived back in Minnesota, where we will be spending the rest of the Summer visiting family and friends. I seriously doubt I will be able to update this blog as often as I usually do. We'll see how things go. {link to this story}

[Wed Jun 11 16:18:44 CEST 2008]

Checking out Wired News, I came across an interesting discussion on Slashdot about setting up a home lab/shop for kids:

When I was growing up, my Dad let my brother and I have the run of his wood shop, and kept up a steady stream of Lego kits, Estes model rockets, chemistry sets, Heathkit projects, and other fun science stuff from the Edmund Scientific catalog, and the rest was history. I'd like to give my kids that kind of experience. If your kids were interested in science, computers, robots, and building stuff, how would you build and outfit a lab/shop for them (and you) to play in?

As usual, there are plenty of people complaining that things are not the same as they used to be and recounting run-ins with government authorities whenever they tried to obtain certain chemicals online. However, aside from all that whining (yes, sorry, but times have indeed changed, what can we do?), there are some good posts too, of course. {link to this story}

[Fri Jun 6 16:54:22 CEST 2008]

Just in case Google Earth wasn't cool enough, CNet News tells us now that Google is about to release a new web application called Google Ocean to allow public access to maps of the oceans. According to the piece, the name could still change, but the idea is to create a 3D oceanographic map that could also be consulted for information about currents, water temperature, and so on. {link to this story}

[Wed Jun 4 10:09:45 CEST 2008]

I read in Computer World that Bill Gates gave his last publicly scheduled speech as Chairman of Microsoft yesterday, and he delivered it precisely to the crowd that helped him succeed: the software engineers. The piece reminds us that Gates started as a software developer about 33 years ago, which is absolutely true. However, as I have pointed out on other occasions, what made Gates famous was not his programming skills but rather his business acumen. While I have serious doubts that any of his (Microsoft's) programs should ever be used as an example in a programming class, there is little doubt in my mind that his career as a businessman is definitely worth something to all those interested in the world of the computer and technology business.

Not only did he realize at the dawn of time (so to speak) that the big money was to be made in software rather than in hardware, as many other people thought at the time, but he also realized that the best way to succeed in that field was to sign agreements with hardware vendors, make sure that his software would run on cheap off-the-shelf machines (PCs), aim his products at the broader consumer market instead of the hobbyists and, above all, build the tools that would allow said hobbyists to write the applications that would make his platform the most popular one in the world. On top of all that, Gates the businessman had something that most other people lack: the capacity to see certain trends coming and to adapt to them. Will his successor be able to live up to all that? I doubt it. {link to this story}