Google Chrome was all
the rage a few months ago, but I didn't have the time to give it a try back
then. Besides, Google
had officially released a Windows XP version only, and in my house we only run Linux and MacOS. So it wasn't until a couple of days ago that I decided to search around and see whether there was a beta version for either of those OSes. And, lo and behold,
I found the Chromium
snapshots, still in alpha. To be sure, there is some functionality still
missing from these snapshots. For instance, there is no integration with the
Adobe Flash
Player, which means that sites like YouTube are pretty much beyond reach. Still, if one could forget
about other marketing and business considerations and judge software products
solely on their merits, I'd bet that Google Chrome will become a major
contender in a new round of the browser wars. The little multi-process application, built on the WebKit framework (like Safari), is specifically geared towards web applications, a field where Google itself excels and has a significant stake, of course. Yes, it lacks many of the extra features that other browsers have, but it does the basics very well, it's stable and, above all, it doesn't suck memory the way Firefox does.
I don't know whether I'm the only one going through this (I doubt it), but lately I have to kill the Firefox process at least twice a day, especially since I started using web applications quite often. It's become a true pain in the neck that swallows system resources and slows everything down. Here is
a short video from the development team of the Chrome browser explaining
the philosophy behind it:
Oh, and by the way, here is another video about Android. I cannot wait until cell phones running that operating system go on sale here in Spain.
In my constant search for simple command-line applications I recently came across fbcmd, a little tool that allows you to update your Facebook account with new status messages, links, pictures, notes, etc. I installed it on a system running Ubuntu 8.04 (Hardy),
and it works fine. Highly recommended for all those who love the command line.
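Just to give an idea of what it looks like, posting a new status message from a terminal boils down to something along these lines (the exact command names may vary between fbcmd releases, so treat this as an illustration rather than a reference):

    fbcmd STATUS "Testing fbcmd from the Ubuntu command line"

The first time around you also need to authorize the tool against your Facebook account; the project's documentation explains how to do that.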
{link to this story}
Researchers synthesized the basic ingredients of RNA, a molecule from which the simplest self-replicating
structures are made. Until now, they couldn't explain how these ingredients
might have formed.
(...)
RNA is now found in living cells, where it carries information between genes
and protein-manufacturing cellular components. Scientists think RNA existed
early in Earth's history, providing a necessary intermediate platform
between pre-biotic chemicals and DNA, its double-stranded, more-stable descendant.
However, though researchers have been able to show how RNA's component
molecules, called ribonucleotides, could assemble into RNA, their many
attempts to synthesize these ribonucleotides have failed. No matter how
they combined the ingredients —a sugar, a phosphate, and one of four
different nitrogenous molecules, or nucleobases— ribonucleotides just
wouldn't form.
(...)
Like other would-be nucleotide synthesizers, Sutherland's team included a
phosphate in their mix, but rather than adding it to sugars and nucleobases,
they started with an array of even simpler molecules that were probably also
in Earth's primordial ooze.
They mixed the molecules in water, heated the solution, then allowed it to
evaporate, leaving behind a residue of hybrid, half-sugar, half-nucleobase
molecules. To this residue they again added water, heated it, allowed it to evaporate, and then irradiated it.
At each stage of the cycle, the resulting molecules were more complex. At
the final stage, Sutherland's team added phosphate. "Remarkably, it
transformed into the ribonucleotide!" said Sutherland.
All this immediately reminded me of something I had read many years ago in
Carl Sagan's Cosmos:
In my laboratory at Cornell University we work on, among other things, prebiological organic
chemistry, making some notes of the music of life. We mix together and spark
the gases of the primitive Earth: hydrogen, water, ammonia, methane, hydrogen
sulfide —all present, incidentally, on the planet Jupiter today and throughout the Cosmos. The
sparks correspond to lightning —also present on the ancient Earth and
on modern Jupiter. The reaction vessel is initially transparent: the
precursor gases are entirely invisible. But after ten minutes of sparking, we
see a strange brown pigment slowly streaking the sides of the vessel. The
interior gradually becomes opaque, covered with a thick brown tar. If we had
used ultraviolet light —simulating the early Sun— the results would have been more or less the
same. The tar is an extremely rich collection of complex organic molecules,
including the constituent parts of proteins and nucleic acids. The stuff of
life, it turns out, can be very easily made.
It shouldn't surprise us much that Sagan was an acknowledged atheist.
{link to this story}
What in 1991 was a novel physics solution now comes packaged in a virtual
world for you to intuitively explore. A new simulation in OpenSim, an open-source version of the
popular virtual world Second Life, shows how a handful of objects floating in space react to each
other's gravity.
In physics, this is known as the N-body problem. It's simple if you have only two objects: they
orbit their common center of mass in a circle or an ellipse. But three or
more objects send the system into chaos. Physicists and mathematicians banged their heads against
it for centuries, with a general solution emerging less than 20 years ago.
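For anyone curious about what such a simulation actually computes, here is a minimal sketch of a brute-force N-body step. This is my own toy Python code with made-up units and a simple Euler integrator, not anything taken from the OpenSim world itself:

    # Brute-force N-body sketch (illustrative only): every body attracts
    # every other body, and positions/velocities are advanced with a plain
    # Euler step, which is enough to watch the chaos appear.
    from dataclasses import dataclass

    G = 1.0           # gravitational constant, made-up units
    DT = 0.001        # time step
    SOFTENING = 1e-3  # avoids division by zero in close encounters

    @dataclass
    class Body:
        mass: float
        x: float
        y: float
        vx: float
        vy: float

    def step(bodies):
        # Accumulate the acceleration exerted on each body by all the others.
        acc = []
        for bi in bodies:
            ax = ay = 0.0
            for bj in bodies:
                if bj is bi:
                    continue
                dx, dy = bj.x - bi.x, bj.y - bi.y
                r2 = dx * dx + dy * dy + SOFTENING
                inv_r3 = r2 ** -1.5
                ax += G * bj.mass * dx * inv_r3
                ay += G * bj.mass * dy * inv_r3
            acc.append((ax, ay))
        # Euler update: velocities first, then positions.
        for (ax, ay), b in zip(acc, bodies):
            b.vx += ax * DT
            b.vy += ay * DT
            b.x += b.vx * DT
            b.y += b.vy * DT

    # Three bodies are already enough for the orbits to turn chaotic.
    bodies = [Body(1.0, 0.0, 0.0, 0.0, 0.0),
              Body(0.1, 1.0, 0.0, 0.0, 1.0),
              Body(0.1, -1.0, 0.0, 0.0, -1.0)]
    for _ in range(10000):
        step(bodies)
    print([(round(b.x, 3), round(b.y, 3)) for b in bodies])

Even this crude version makes the point: with two bodies the orbits stay nice and regular, but add a third and tiny changes in the starting conditions quickly produce wildly different trajectories.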
1801 - Joseph
Marie Jacquard uses punch cards to instruct a loom to weave "hello,
world" into a tapestry. Redditers of the time are not impressed due to the
lack of tail call recursion, concurrency, or proper capitalization.
1842 - Ada Lovelace
writes the first program. She is hampered in her efforts by the minor
inconvenience that she doesn't have any actual computers to run her code.
Enterprise architects will later relearn her techniques in order to program
in UML.
(...)
1964 - John Kemeny and
Thomas Kurtz create
BASIC, an unstructured
programming language for non-computer scientists.
1965 - Kemeny and Kurtz go to 1964.
(...)
1972 - Dennis Ritchie
invents a powerful gun that shoots both forward and backward simultaneously.
Not satisfied with the number of deaths and permanent maimings from that
invention he invents C and Unix.
(...)
1980 - Alan Kay creates
Smalltalk and invents
the term "object oriented". When asked what objects are made of he replies, "objects."
When asked again he says "look, it's all objects all the way down. Until you
reach turtles."
(...)
1987 - Larry Wall falls
asleep and hits Larry Wall's forehead on the keyboard. Upon waking Larry Wall
decides that the string of characters on Larry Wall's monitor isn't random
but an example program in a programming language that God wants His prophet,
Larry Wall, to design. Perl
is born.
(...)
1991 - Dutch programmer Guido van Rossum travels to Argentina for a mysterious operation. He
returns with a large cranial scar, invents Python, is declared Dictator for Life by
legions of followers, and announces to the world that "There Is Only One Way
to Do It." Poland becomes nervous.
1995 - Yukihiro
"MadMatz" Matsumoto creates Ruby to avert some vaguely unspecified apocalypse that
will leave Australia a desert run by mohawked warriors and Tina Turner. The
language is later renamed Ruby on Rails by its real inventor, David Heinemeier Hansson. [The bit about Matsumoto inventing a language called Ruby never happened and better be removed in the next revision of this article - DHH].
1995 - Brendan Eich
reads up on every mistake ever made in designing a programming language,
invents a few more, and creates LiveScript. Later, in an effort to cash in on the popularity of
Java
the language is renamed JavaScript. Later still in an effort to cash in on the popularity of skin
diseases the language is renamed ECMAScript.
1996 - James Gosling
invents Java. Java is a relatively verbose, garbage collected, class based, statically
typed, single dispatch, object oriented language with single implementation
inheritance and multiple interface inheritance. Sun loudly heralds Java's novelty.
Despite Larry Ellison's claim that Oracle's
acquisition of Sun
Microsystems will give his company the ability to construct unmatched
business systems that are integrated from "applications to disk," Oracle
originally sought to acquire only Sun's crown-jewel software assets while
leaving its declining hardware unit to the vultures, according to a regulatory
filing.
Oracle on March 12 "sent a letter to our board proposing the acquisition by
Oracle of certain of our software assets, a minority equity investment by
Oracle in our common stock, and entering into certain strategic
relationships," Sun said in a filing Monday with the Securities and Exchange
Commission.
The ultimate outcome of the negotiations differed greatly from Oracle's
original offer. Oracle on April 19 struck a deal to acquire all of Sun
—including its aging line of Solaris-powered Sparc servers—for $7.4 billion, or $9.50 per share.
So why did Oracle agree to a big-bucks deal for a vendor that derives most
of its sales from a declining box business? The SEC filing shows that
Oracle's hand may have been forced by IBM, which was engaged in its own talks
with Sun earlier this year, and by yet another vendor —possibly
Hewlett-Packard.
A friend just sent me a link to a YouTube video showing a TV news segment from 1981 discussing how early
home computer users could read their morning newspapers online and the
implications it could have for the near future. Both the technology
displayed in the video and the comments about the new media are priceless.
[Windows and Linux] both play an important role but fundamentally, the
free software ecosystem needs to thrive on its own rules. It is
different to the proprietary software universe. We need to make a
success of our own platform on our own terms. If Linux is just another way
to run Windows apps, we
can't win. OS/2 tried
that...
It's the old search for the killer app. Either that or, perhaps more likely now that so many applications are moving to the cloud, a new and innovative way of using the computer to access those apps. Perhaps the future lies not so much in the apps as in the frameworks
and interfaces. The computer is quickly becoming a commodity, after all.
Ideas like KDE's social desktop concept are quite original and may help
differentiate Linux from competing OSes. Here is a nice presentation
demoing the concept:
InfoWorld has published a funny story titled True believers: The biggest cults in tech that makes for an interesting
read. It covers the usual (and unusual, in the sense of unknown to most non-techies) groups: Apple, Commodore, IBM mainframes, Lisp, Newton, Palm,
Ruby, Ubuntu, etc.
{link to this story}
[Wed May 6 11:15:08 CEST 2009]
According to Computer World, Apple is said to be making a US $700 million buyout offer for Twitter.
What surprises me about the whole thing is that analysts appear to like
the idea. The article quotes Dan Olds, an analyst with the Gabriel
Consulting Group Inc., as saying that it could make for an interesting
combination. If by "interesting" he means out of the ordinary, sure. I just
don't see how it makes any business sense at all. That's all. The same applies to the rumors that Microsoft is after the company too. On the other hand, either Google or Facebook buying Twitter would make far more sense to me. Facebook is already a big player in the social networking market, and Google could integrate Twitter into its well-honed ad-revenue business model. But Apple? What could the company do
with it?
{link to this story}