[Sat May 29 13:42:28 CDT 2004]

One has to wonder about the future of a company whose top managers live off criticizing the competition instead of making an effort to explain what their company is all about and how they plan to bring it back to profitability. Sun Microsystems is such a company. We thought we only had one major clown in town with Scott McNealy's constant attacks on Microsoft, but it turns out that young Jonathan Schwartz is joining his elder in the showbiz. Not only did he maintain with a straight face a month ago that Red Hat was a proprietary company, but, not content with that, he is now coming back to the very same idea. This time around, though, Steven Vaughan-Nichols ran Schwartz's arguments by Eric Raymond to show us all how ridiculous they are. Now, to be fair, Schwartz makes some good points, but none of them backs up his main claim (that Red Hat is a proprietary company). As usual, Sun's managers like to confuse the public with their distinction between open standards and open source, implying that the latter does not provide true freedom to the customers while the former does. It is only fair to acknowledge Sun's tradition of backing open standards such as NFS, NIS or Java, but that does not change the fact that their products truly are proprietary and Red Hat's are not. Schwartz mentions how Red Hat provides the source code but not the binaries for free, and goes on to remark that most customers simply do not have the know-how to build from the sources. Fair enough. However, the fact that several vendors and projects out there do grab those source RPMs and build their own solutions with them tells me very clearly that Red Hat does not lock anyone in. Red Hat is promoting binary incompatibility, I hear? How so?
Yes, ISVs do certify their applications against a given Linux distribution, and these days there is a good chance the distribution of choice will be Red Hat because they dominate the market, but how does that represent binary incompatibility of any sort? ISVs certify their applications on Red Hat because that is the distribution they test against, not because it is literally impossible to install them on any other distribution. As a matter of fact, people do install Oracle, to use one example, on Debian or Slackware. And, as far as I know, the very same ISVs also certify their applications on Solaris, right?

In any case, what I find quite surprising is that Schwartz and company are wasting their time accusing this or that competitor instead of working on a serious business plan that takes their company out of the hole where it is right now. A negative message does not amount to a solution. We already know what Sun's management does not like (pretty much everything that is not Sun), but we have yet to hear what it is that they do like. For the time being, customers are only hearing a contradictory mish-mash of statements: Java is an open standard, but we do not want to lose control over it just in case we find a way to make big bucks from the product; Solaris is the best OS ever invented, but we also do Linux if the market likes it; the x86 architecture sucks, but we will sell you some AMD boxes if you want us to; opensource is not as good as people think, but we do release OpenOffice under an opensource license... These guys definitely need to retreat into a mountain cabin to see if they can clarify their ideas a little bit. {link to this story}

[Sat May 29 12:06:55 CDT 2004]

William Schaff writes in Information Week about how analysts are underestimating HP. I beg to differ. He does make some good points indeed, but overall I think Carly Fiorina still has plenty of work to do in order to shape up the company and, most importantly, come up with a clear and consistent business plan. I disagree with Schaff when he argues that "many industry insiders said merging Hewlett-Packard and Compaq would create a supersize hippopotamus: slow, clumsy, and easy prey". That is not the point. The main problem was (and is) that HP's and Compaq's lines of products overlapped more than they complemented each other, and now the post-merger management team is left in an unenviable position where they have no choice but to chop off redundant parts of the company and design a new vision for a company twice as large as the old one. Let us be honest: both HP and Compaq were already running into trouble when they decided to merge, and I really do not think their situation has changed that much in the meantime. Could someone please step up to the plate and let us all know what HP's vision is? What is their business strategy? What is their main line of business? What is their differentiator? Sure, as Schaff states, their imaging and printing division is quite profitable, but it is also true that it accounted for 75% of their profits in the fiscal year ended in October, while the other areas of the company were barely breaking even. Does anyone believe imaging and printing can sustain this behemoth in the long run? I seriously doubt it. So, what else is left? High-end servers? IBM appears to be far more competitive and offers a more compelling story in that market, and Sun can also provide some good products. Low-end servers? Dell is a tough competitor to beat there. Workstations? Dell shows up on the horizon again. Even worse, HP has no differentiation whatsoever in either of these markets right now. What would lead anyone to choose HP over Dell? How does one justify it? Nobody quite knows if HP is a Windows, Linux or UNIX company either. Do they do services, consulting, integration? What is it that they do? So, other than imaging and printing, what exactly does HP want to sell? Until their management can answer that question in very clear terms, I am afraid we cannot declare the post-merger transition over yet. {link to this story}

[Sun May 23 14:44:24 CDT 2004]

It is just amazing how fast things change in this field! Not so long ago, Linux was the upstart, and one could hear lots of sarcastic comments coming from the commercial UNIX camp about its lack of stability and maturity. To be fair, they were right in pointing out Linux's deficiencies. The problem was that most of those people failed to see Linux for what it was: an extremely dynamic movement capable of catching up and adapting within a few years. Sure, Solaris, AIX, HP-UX or IRIX were far more stable, scalable and mature than Linux, but who said Linux was going to stay put? What if this new technology could advance much faster than its big UNIX cousins? Well, that is precisely what happened, of course. While the decision-makers at Sun & co. were resting on their laurels, feeling safe in their technical superiority, the Linux upstart kept on improving, adding features and drivers, and maturing. Now, one knows things have changed when yesterday's arrogant critics start drawing up defensive strategies to make sure their beloved UNIX flavors do not disappear in the face of the Linux onslaught, or when one reads how sysadmins consider Linux "the standard implementation" by now. Let us face it: for the last few years, most UNIX vendors have made huge efforts to at least make sure that GNU software compiles on their platforms, for they knew full well that without that they did not have any hope of surviving. They kept talking about the supposed superiority of their own tools, but at the end of the day they all ended up doing the same thing: porting GNU tools to their own UNIX flavors and making sure they did not upset the vast community of users that had adopted these tools as the cornerstone of their daily work. In other words, the large UNIX vendors have been living in denial for nearly a decade and have been slow to react to a new disruptive technology, which certainly explains their problems now. Is it too late for the likes of Sun or SGI to survive? Who knows, but there is little doubt in my mind that if they finally do, it will be at the price of adopting Linux and opensource as the standard, and not by imposing their own "superior" technologies. {link to this story}

[Sun May 23 14:26:39 CDT 2004]

Jonathan Goodyear makes a good argument in Visual Studio Magazine about the perils of relying too much on .Net components. As a matter of fact, his warnings apply to components in general, regardless of the language, but they apply especially to frameworks such as .Net, where new developments happen at breakneck speed. Much has been said and written in favor of programming environments that increase the developer's productivity by making it easier and faster to whip out applications (this has always been one of the main arguments in favor of scripting languages, actually), but not enough attention has been paid to the problem Goodyear points out in this article: in a world where we use more and more pre-packaged components, where re-use is a must, this increase in productivity also brings its own set of problems that we ought to consider. {link to this story}

[Sun May 23 14:12:24 CDT 2004]

There are still quite a few people out there who maintain that it is simply not possible to make any money out of opensource software. Microsoft fans are definitely happy spreading this FUD, and so are most top managers from the commercial UNIX camp. The only problem is that it is just not true. Not only is Red Hat, poster child of the Linux revolution, happily making more money than many companies in the proprietary world, but the critics are forgetting a nice slew of smaller businesses that survive just fine by developing, selling and supporting opensource software: MySQL, Progeny, JBoss... In other words, in our enthusiasm for large corporations that go public to make some quick bucks (undoubtedly a remnant of the heyday of Internet startups not so long ago), we tend to ignore those successful companies that are making money selling services in the opensource world. I realized this while reading Andrew Binstock's article about JBoss quickly becoming the third most important J2EE implementation out there in the market. Sure, to be fair, the opensource model (give the product away for free, make money with value-add and services) will not make you billions, but it can help you live a pretty decent life without the need to resort to dishonest lock-in strategies or monopolistic practices. I can see how the top managers at Microsoft and Sun may be worried about it, but it seems to me that the end result of this new landscape will only be more freedom for most people, consumers and vendors alike. {link to this story}

[Sat May 8 16:29:21 CDT 2004]

Peter Coffee writes about why web services make sense today. It is not only their interoperability that matters, but mainly the increased productivity during development. Truth be told, there are fewer and fewer reasons to use languages like C and C++, although they still have a place in systems programming, of course. However, most mere mortals like you and me are better served by using the much-derided scripting languages and components such as the ones provided by web services. As the author explains,

On past occasions when I've been paid to use a language that involves explicit coding of low-level functions, such as managing memory, I felt as if I were taking the client's money under almost false pretenses —like a taxi driver who keeps the meter running while stopping to wash the car and put gas in the tank. The client is paying but is not getting any closer to the desired destination.
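To make the productivity argument concrete, here is a minimal sketch (my own illustration, not Coffee's) of a task that takes a few lines in a high-level language but would require manual memory allocation, hashing and cleanup in C:

```python
from collections import Counter

def top_words(text, n=3):
    """Return the n most frequent words in a text.

    In C this would mean allocating a hash table, managing string
    buffers and freeing everything afterwards; here the runtime
    handles all of that for us.
    """
    return Counter(text.lower().split()).most_common(n)

print(top_words("the cat sat on the mat and the dog sat too"))
```

The point is not that C could not do this, but that the developer's time goes into the problem rather than into the plumbing.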
Yes, there are cases where using a low-level language still makes sense. It definitely is the right choice (at least for the time being) when you are writing an operating system, drivers or high-performance applications. However, let us be honest: we spend most of our time using applications that could perfectly well be written in some other, less efficient language. It does not make a big difference to me if my IRC client takes 5 or 10 more seconds to launch, especially when the system is hardly ever rebooted because the OS is stable enough. On top of that, the processor is not the bottleneck anymore. Most home computers have more processing power than their users ever need for their daily work. {link to this story}

[Sat May 8 12:02:15 CDT 2004]

Back in February I wrote about Sun's Looking Glass desktop interface and how skeptical I was about its future. Well, apparently I was not alone. Jim Rapoza writes an article in eWeek explaining why he cannot see the 3-D desktop as a winner either. Sure, the whole concept is cool, but at the end of the day that is not what matters in the real market out there. There will always be a small audience for cool tricks, and they definitely like to think of themselves as the avant-garde of what is to come tomorrow, but I beg to differ: I tend to see them as highly experimental people who are very prone to accept any novelty for its own sake. Yes, experimental technologies can become mainstream after some time, but only when it makes sense and we find the right application. Sorry, but I am afraid 3-D desktop interfaces have little room in today's desktop and workstation systems. {link to this story}

[Sat May 8 10:27:46 CDT 2004]

There was a time, not so long ago, when Extreme Programming (XP) was all the rage and one could barely read any industry publication without coming across an article praising its strengths. Alas, as with anything else in life, it turned out that XP also has its weaknesses. Andrew Binstock writes a good article in Software Development Times reviewing where XP worked and did not work, and he does so in a tolerant, rational, non-dogmatic way that should be praised.

First of all, on an adoption basis, XP could hardly be termed a success. IT sites today, as a rule, do not use XP. The closest they come is acceptance of automated testing and limited test-driven development (TDD). Both of these practices are tenets of XP, even though they antedate XP. However, the full XP is based on 12 practices. The other practice IT sites use extensively today is coding standards. Beyond testing and coding standards, XP is not widely deployed.

However, Binstock does not only tell us about the limited acceptance of XP in our IT shops. He also does what any good analyst should do, and attempts to find the reasons why this happened.

The foundation of XP, in my view, is part of the problem: It is a radical embrace of an approach that goes from the particular to the universal. It is the purest form of bottom-up development: You never design more than what is immediately needed, you write the least amount of code that will fulfill your next test, and you design the test to provide the least amount of incremental change. After you've written lots of tests (frequently thousands), you clean up your code by using one of 72 refactorings —which are specifically analyzed techniques for cleaning code without changing its functionality.

The fundamental problem with this approach is that software today is complex and large, so it cannot be designed properly by using the least-increment approach and hoping that a sound product will eventuate through the organic accretion of lots of small design decisions (followed up by code cleaning). Large, complex projects have to be designed top-down and the code must be developed to that design —regardless of its complexity.
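The least-increment cycle Binstock describes, write the smallest failing test and then the least amount of code that passes it, can be sketched with a hypothetical example (the `fizz` function and test names are my own illustration):

```python
# TDD in its purest bottom-up form: the test is written first, and
# it fails until the minimal implementation below exists.
def test_fizz():
    assert fizz(3) == "Fizz"   # the behavior we want next
    assert fizz(4) == "4"      # the smallest incremental change

# The least amount of code that will fulfill the test above; any
# further behavior waits for the next test in the cycle.
def fizz(n):
    return "Fizz" if n % 3 == 0 else str(n)

test_fizz()
print("tests pass")
```

Binstock's objection, in these terms, is that a large system designed purely as the accretion of thousands of such micro-steps has no guarantee of a sound overall architecture.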

In the end, it all boils down to something we should have learnt a long time ago: there is no magic bullet, and our best bet is to be flexible and tolerant enough to apply different development models to different projects. In this sense, the XP dogma is just as bad as the C, Java, C++ or Perl dogma. In software engineering, what matters is not the tools we use but rather the quality of the final product. XP does have its place in our toolbox, but let us not try to force it into every single situation simply because it is what we are used to or it is the latest trend. {link to this story}

[Sat May 1 16:18:01 CDT 2004]

A little bit more on the campaign to request that Java be opensourced. Jason Brooks writes in eWeek in favor of the idea. He acknowledges Sun is still "smarting from its Java battles with Microsoft" and is "afraid that a powerful vendor like IBM might likewise splinter Java for its own purposes", but argues that:

... Sun's fears of forking are overblown, and these fears underestimate how important Java compatibility is to customers and, by extension, to other Java vendors. (...) Jonathan Schwartz, executive vice president of Sun's software group, points to Linux as an example of forking in the open-source world, but the Linux kernel has actually proved fork-resistant. While it's true that there are many different Linux distributions, each runs a kernel from the same project.
As a possible solution to this dilemma, Brooks thinks that Sun should take the same path already blazed by MySQL and release at least the Java Runtime Environment and J2SE under a dual license. For his part, Andrew Binstock offers some more technical arguments in favor of opensourcing Java in an article published by SDTimes. In his view, the language sorely needs some innovation that can only come from developers outside of Sun itself, precisely because Sun does not want to rock the boat. {link to this story}

[Sat May 1 15:58:54 CDT 2004]

One has to admire the turnaround that Novell's management appears to be leading. Just a couple of years ago, the company was almost unanimously considered to be on its way out of the business. Few people could even remember what NetWare was (hint: it's a network operating system), and pretty much everyone was unaware that the company was even still around. Today though, after purchasing SuSE and Ximian, the company is quickly taking shape as the leading vendor in the Linux market, and one that may give the likes of Red Hat, Sun and perhaps even Microsoft a run for their money. HP and Novell have recently announced a partnership to bring SuSE to select HP desktop and laptop systems; Novell has decided to make both iFolder and YaST open source; new partnerships are signed all over the place; SuSE continues to get excellent reviews for putting together the best-integrated and best-engineered Linux distribution; and open source continues its unstoppable march to world domination. As a matter of fact, while it is true that Red Hat is also doing a great job at increasing its revenue, there is little doubt in my mind that it is Novell that is taking center stage now, with a very sound strategy that combines excellent products and solutions across a line that goes from the server to the desktop. In just a matter of months, they have built a plan that can compete not only with Red Hat but even with the IBM behemoth. The only thing perhaps still missing from their story is the high-end server and high-performance markets, which is why I think that purchasing a company like SGI next would make a lot of sense. Yes, SGI has also been struggling for a long time now, and it is far from clear whether they will manage to turn the company around, but their Linux on Itanium line is a success and fits perfectly with the rest of Novell's strategy, while the overlap is nearly non-existent. {link to this story}

[Sat May 1 15:42:32 CDT 2004]

Every now and then, especially when there is a recession, issues like social benefits, unemployment and, of course, outsourcing become central topics of conversation. When I first arrived in the USA back in 1995, lots of people were worried about the outsourcing of manufacturing jobs, perhaps due to Pat Buchanan's loud criticisms. Just two or three years later, though, the issue was mercilessly thrown into the dustbin. Back in those days, it was the turn of cocky IT people to explain that the logic of capitalism could not be stopped and those jobs needed to be sent abroad in the name of economic efficiency, or else American companies, unable to compete in a tough global market, would have to close their doors and leave everyone without a job. Well, the story feels somehow different when the phantom of outsourcing and globalization is knocking on one's own door, and not the neighbor's. All of a sudden, rational theories of economic efficiency go out the window and one realizes that a job is far more than a number in our nation's economic statistics. It is also an individual's (perhaps even a family's) source of income. In any case, the issue is quite complex and I will not try to cover it here in a few lines (I may write more extensively on the topic somewhere else one of these days), but I cannot help but notice that the complaints against outsourcing that we hear coming from IT departments these days ring hollow when we know what we know about our recent past. As Rob Preston clarifies in Network Computing:

... before you lash out at those who would expose your profession to scrutiny, ask yourself this: Which occupation has been the most relentless over the past decade in eliminating other people's jobs?

Answer: yours.

You and your colleagues have automated call-center operators, data-entry clerks, records managers, engineers, sales assistants, customer-service reps, support technicians, application developers, benefits administrators, project coordinators and countless other "nonessential personnel" out of jobs through your IT-driven reinvention of various business processes. So why is it when you make other workers redundant, that's progress (or business as usual), but when change threatens your job, it's treason or terrorism?

{link to this story}