[Thu Aug 25 16:07:03 CDT 2005]

Yet another interesting open source project. Kadischi allows you to generate your own Fedora-based Live CD. Be careful: it is still at a very early stage of development and it may not work very well. {link to this story}

[Thu Aug 25 15:52:56 CDT 2005]

Granted, it felt a little weird at the beginning, but one of the best things SGI has done with its Altix servers is to set up a serial connection to the system console by default. Coming, as I did, from PC land, the idea of using a serial port to access the console was strange to me, but it is definitely a time saver when one runs into the unavoidable system problems, since it makes it possible to run a debugger like KDB to troubleshoot whatever kernel problems one may be hitting. In any case, Phil Pokorny, from Penguin Computing, published an article in Enterprise Linux telling us how to set up a serial console. In summary, after you connect the cable between the two systems, you edit menu.lst or lilo.conf (depending on whether you use GRUB or LILO as your boot loader; on the Itanium 2 architecture the file is called elilo.conf) in order to add the following option to your boot line:

console=ttyS0,115200n8

You then edit the /etc/inittab file to add the following line to it:

# Serial console login
S0:2345:respawn:/sbin/agetty -L ttyS0 115200

Finally, run telinit q to make init re-read the configuration file, and everything should be ready. At this point, you should be able to log into the console over the serial connection using something like the venerable minicom. That is all there is to it. {link to this story}
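For reference, here is what the full boot stanza might look like on a GRUB system. This is purely an illustration (the kernel version, disk, and partition names are hypothetical); the only addition to a stock entry is the console= option at the end of the kernel line:

```
title Linux (serial console)
        root (hd0,0)
        kernel /boot/vmlinuz ro root=/dev/hda1 console=tty0 console=ttyS0,115200n8
```

Listing console=tty0 as well keeps boot messages on the attached monitor; the last console= entry on the line is the one that becomes /dev/console.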

[Thu Aug 25 15:46:50 CDT 2005]

Kyle Rankin writes a great article in O'Reilly's Linux DevCenter on how to use mplayer to edit a movie. Basically, you run the following command:

$ mplayer -edlout test.edl file.avi

Now, as the program plays the video, you can hit i whenever you come across a scene that you would like to edit out or skip. This writes the start time to the test.edl file and skips ahead two seconds. You can then open the file in a regular text editor and fine-tune it a little. Finally, use the following command to play the movie again:

$ mplayer -edl test.edl file.avi

It definitely is a great way to remove unwanted scenes from a movie, and since the configuration is done through a simple text file it is just as easy to share with friends. Ah, the power of open source! No wonder so many corporations out there dislike it. After all, it returns the power to the consumer. {link to this story}
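For what it is worth, the EDL file itself is just plain text, one block per line in the form "start end action", where the times are in seconds and the action is 0 to skip the block or 1 to mute it. Here is a minimal sketch (the sample times are made up) of how one could parse such a file and total up the time being cut:

```python
# Minimal sketch: parse an mplayer EDL file and total the skipped time.
# Each line holds "start_seconds end_seconds action", where action 0
# means skip the block and action 1 means mute it.

def parse_edl(text):
    """Return a list of (start, end, action) tuples from EDL text."""
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        start, end, action = line.split()
        entries.append((float(start), float(end), int(action)))
    return entries

def total_skipped(entries):
    """Sum the duration of all skip (action 0) blocks."""
    return sum(end - start for start, end, action in entries if action == 0)

sample = """\
120.0 125.5 0
300.2 301.2 1
410.0 425.0 0
"""
entries = parse_edl(sample)
print(total_skipped(entries))  # → 20.5
```

Handy for checking, before sharing the file with friends, how much footage a given EDL actually removes.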

[Thu Aug 25 15:17:28 CDT 2005]

For the past week and a half or so I have been enjoying my sabbatical (yeah, it is pretty nice that SGI still offers that), so I have stayed away from technology news and articles for the most part. That is not to say I stayed away from computers (I have still used them daily to communicate and read) or that I completely avoided technology issues, but mostly I have been taking advantage of the free time to be with the family and sunbathe by the lakes. Today, though, I decided to check LinuxToday and found an interesting piece about a young fellow who wrote his dissertation on the open source movement for the Anthropology department at the University of Chicago. Among other things, he explains:

This dissertation, based on fieldwork conducted on the Debian free software project and among hackers living in the Bay area between January 2001 and May 2003, is an ethnography of the ethics and politics of free and open source hackers. What was once a fringe and esoteric hobbyist technical practice —the production of free and open source software— has veritably exploded since 1998. (...) My aim in this dissertation is to evaluate the rise of expressive rights among hackers as a historically and culturally specific practice of liberal freedom that can only be made sensible through the lens of a hacker technical way of life —in which their pragmatics and poetics are given serious consideration. (...) I argue that hacker values for expressive freedom are a particular instantiation of a wider liberal tradition. Instead of an emphasis of self-determination and individuality based on the acquisition of property, hackers have placed emphasis on individuality as a form of critical self-determination that requires unrestricted access to knowledge in order to constantly develop technical skills and to progress the state of their technical art. Important for the purposes of this dissertation is that hackers challenge one sacred realm of liberal jurisprudence —intellectual property— by drawing on and reformulating ideals from another one —free speech.

All in all, it is an interesting piece. No wonder, by the way, that certain ultraconservative commentators here in the US have called the open source movement "communistic". I suppose it is way too difficult for them to grasp. {link to this story}

[Thu Aug 18 09:28:03 CDT 2005]

Just for fun and giggles I took the nerdtest last night and scored 89, or "Computer High-Geek", which I suppose is about right. Check it out.


{link to this story}

[Thu Aug 18 09:19:54 CDT 2005]

Peter Salus reports in Unix Review that the Bell Labs team that created UNIX back in 1969 has been officially disbanded. As he indicates:

There was no malice, so far as I can tell —just an administrative reorg forced by recent cutbacks and layoffs and departures that left the whole research area with too many managers and too few researchers.

It is understandable: nobody has the right to demand that a company behave like a museum. In any case, here is where its main members have gone:

Ken Thompson retired to California.
Brian Kernighan is a Professor at Princeton.
Doug McIlroy is a Professor at Dartmouth.
Rob Pike, Dave Presotto, and Sean Dorward are at Google.
Tom Duff is at Pixar.
Phil Winterbottom is CTO at Entrisphere.
Gerard Holzmann is at the NASA/JPL Lab for Reliable Software.
Bob Flandrena is at Morgan Stanley.

To the best of my knowledge, Dennis Ritchie and Howard Trickey remain, enisled.

{link to this story}

[Mon Aug 15 18:19:25 CDT 2005]

Well, we knew it would happen sooner or later, right? In a repeat of the hacking feat that managed to get Linux up and running on Microsoft's Xbox, we read now in Wired News that a team of hackers has already managed to run MacOS X on standard PCs. The tweaked operating system (nicknamed OSx86 for the time being) can already be downloaded using BitTorrent. As we are told:

The hacked version of OSx86 is based on pirated software, which came from copies of the operating system sent to participants in the Apple Developer Connection. The ADC participants also received MacIntel computers for testing and development.

Now the hacked version of OSx86 is running on Dell laptops and other PCs with Intel and AMD microprocessors.

I suppose it is "so long, MacIntel". Once Steve Jobs announced that Apple would be releasing a version of MacOS X for a specific PC-based computer that only Apple and perhaps authorized vendors would manufacture, it was pretty clear that the genie was out of the bottle and that it would just be a matter of time before someone hacked it to run on any PC. Needless to say, this also goes to prove that any excuses about how the OS was "specifically designed" to run on the MacIntel boxes in order to achieve "better performance" are nothing but baloney. To make matters worse, the article also explains:

No one knows exactly why OSx86 appears to be running faster on the PCs than the Mac OS does on today's Macs.

"To be honest, we're not sure", said a hacker nicknamed cmoski, who said he works for a large software company. "Some in the Pentium camp want to say, 'Because a Pentium is faster, of course', some want to say (Intel chip architectures are better than Apple's) and some in the PowerPC camp just want to say that it isn't full OS X (running on the beta systems)".

The hacked OSx86 bypasses a chip, the Trusted Platform Module, or TPM, that is intended to prevent the system from running on ordinary PCs.

Other than the TPM though, the OS appears to be complete, no matter what those people from the PowerPC camp may say. {link to this story}

[Mon Aug 15 17:56:10 CDT 2005]

While reading a review of the recently released Red Hat Directory Server published by Network Computing (yes, this is the old Netscape Directory Server that Red Hat bought from AOL and decided to release as open source), I read about a little tool called pGina that allows Windows 2000 and Windows XP systems to authenticate using the LDAP protocol. Rutgers University has a good document outlining how to use it and describing some of its main features. {link to this story}

[Tue Aug 9 16:21:38 CDT 2005]

Now, this is an interesting tool. I just came across the Professional Hacker's Linux Assault Kit (PHLAK), a toolkit specifically designed for the security professional. It includes lots of mainstream security tools, such as nmap, nessus, snort, and ethereal, as well as lesser-known tools such as hping2, ettercap, kismet, or brutus, all of them on a nice live CD that ships with both the XFCE4 and Fluxbox window managers to make things more user-friendly. {link to this story}

[Mon Aug 8 11:47:44 CDT 2005]

I bet we could all see this coming. eWeek publishes a piece under the title Will Your PC Run Windows Vista? where we are warned of the obvious: the newly announced Windows Vista may or may not run on your current system.

Microsoft Corp. has yet to finalize the minimum requirements for a PC to run its forthcoming operating system. But numerous PC industry watchers predict a dichotomy for the OS, which is due in late 2006.

Although it will be able to run on all but the most ancient machines, the OS will favor newer and relatively powerful machines when it comes to showing its true colors, analysts say.

Based on details provided by the software maker —a Microsoft representative this week suggested PC buyers who want to gain full Windows Vista user interface experience pick up a PC with a discrete graphics card that supports its DirectX 9 graphics specification— analysts say that not all of today's hardware has the graphics chops necessary to display Windows Vista's most visually compelling feature, its new Aero Glass 3D user interface.

Thus, even for PC owners who have purchased new machines in the last year, hardware upgrades of one type or another —either a new graphics card or, if a machine's graphics cannot be upgraded, possibly a new system— may be necessary to run Windows Vista's Aero Glass effects.

Oh, my! It sure does not come as a surprise to anybody out there. You mean I will have to buy the very latest hardware just to run Microsoft's new operating system? To make things worse, the beefier hardware is, according to what I read, not even needed for the operating system per se, but just for its new graphical interface. Way to go! Who said that Microsoft does not continuously improve and innovate on the user experience? {link to this story}

[Thu Aug 4 17:27:53 CDT 2005]

I recently read a short but positive review of 3B, the Broad Band Browser, a freely downloadable browser that renders the web as a 3D city with many windows, each showing a different website. The idea is to group related websites together in "themed districts" that resemble the shopping districts of any large city. So, I headed for their download page to see if I could give it a try and... well, as could be expected, it only runs on Microsoft Windows, of course. Since I run Linux and MacOS X, I am completely out of luck. Oh, well. I just wonder what happens once we select a given website in that "shopping district". Does it display information in 3D too, or does it just switch to "normal mode"? And how about the actual sites? Does the user get to pick which ones are present, or are they pre-selected by the software vendor? In other words, does the user get to "plan" her own city or not? I must say I am still highly suspicious of all attempts to build 3D environments on our PCs. Let me be clear: it is not that I am necessarily a traditionalist. I do think 3D interfaces will sooner or later make it to the mainstream; I just do not think it will be on the PC screen. It seems to me that we will have to wait until the information superhighway spreads to the TV set and other appliances. {link to this story}

[Tue Aug 2 16:27:30 CDT 2005]

I just ran into one of those situations that make an otherwise unremarkable installer truly show off its strengths. I have spent a few days lately setting up a backup server for my house running FreeBSD. Everybody knows that its installer is quite outdated by today's standards, at least in the sense that it is not flashy at all. Heck, it does not even sport a nice graphical front-end; as in the case of Debian, the FreeBSD installer still uses a curses-based interface. However, as I said, it definitely showed its bright spots today. I installed all the packages yesterday and left for home precisely at the point where I had to proceed with the actual configuration of the system. Well, as it turned out, our building experienced a power outage earlier this morning that brought down the system. With one of those flashier installers, I would have had little choice but to start the whole installation process again. With FreeBSD, on the other hand, all I had to do was boot up the system, run sysinstall as root, and pick up where I had left off the day before. Sure, it is not as if something like this happens every other day, but it still made things easier for me today, which is what the whole thing is about. As usual, the concept of user-friendly is quite relative. I am so used to both the Debian and FreeBSD installers that I could not care less about the flashier alternatives. After all, I am one of those users who stuck to Red Hat's old curses-based installer instead of using the GTK front-end. {link to this story}

[Mon Aug 1 08:43:50 CDT 2005]

Paul Murphy publishes an article on ZDNet that tries to destroy once and for all the myth of a deeply divided UNIX. Yes, there are several different flavors out there. Yes, the UNIX vendors compete against each other as if it were the end of the world. And yes, a great knowledge of Solaris does not necessarily translate into a great knowledge of FreeBSD, for example. Yet, the UNIX flavors share far more than most people think.

One of the areas in which this has consequences is systems hiring. It's quite true that someone's hands-on experience with one of the dead or dying Unix variants won't apply directly to Linux, BSD, or Solaris. However, someone who knows how to use HP's ISL configs or how to make raw devices under AIX usually also knows when and why to do these things —and that's what's important. Any specifics needed are available in the on-line manuals.

That doesn't mean that Red Hat certification qualifies an applicant to debug Oracle on a 72-processor Solaris machine; there are differences both in the details and in the tools available. What it does mean is that the Red Hat guy's ramp-up to Solaris competence is very small compared to the hurdles faced by a competitor whose experience is encapsulated by an MCSE designation.

People who categorize the Unix market as splintered or fractured are generally trying to compare it unfavorably to Microsoft's Windows. That's simply wrong: Windows is a brand, Unix a set of ideas. The Windows brand has been consistently handled, but there's essentially no continuity of ideas between the 3.0, 95, NT, and Longhorn Windows generations. The Unix hardware makers, in contrast, have tried hard to differentiate their products through branding when, in reality, all of their products have been part of the same family.

Oddly enough, therefore, both beliefs —that Microsoft has been consistent and that Unix hasn't— are consequences of marketing fictions.

(...)

Unix doesn't have Microsoft's surface consistency, but theory drives change to build a record of continuity as ideas are tested, accepted, and implemented. As a result, the examples in Kernighan and Ritchie's 1978 The C Programming Language work today, Kernighan and Pike's 1984 The Unix Programming Environment applies about equally well to Linux, NetBSD, and Solaris, and binaries made for the first 64-bit UltraSPARCs 10 years ago will run, unchanged, on Sun's next-generation Niagara hardware.

It definitely has been my personal experience that, although there truly are differences between the flavors of UNIX, for the most part they all work within the same parameters. Or, as Murphy puts it, they all share the same ideas, the same philosophy. This makes it quite easy for people who are "fluent" in one flavor to switch to another one. I have seen this all the time during the ten years I have been working in this field, most recently when my company, SGI, started to actively support Linux. Although the package management system may differ slightly between one flavor of Unix and another (or even between one Linux distribution and another), and there are also some differences in this or that command, or this or that option to a particular command, the core of it is always the same. Now, could the same be said about Windows? I seriously doubt it. Not even the GUI is the same between Windows 3.1 and Windows NT, to use a comparison I consider fair, since UNIX itself has been around for longer than that. Yet in the UNIX world it is still possible to run fvwm on just about any UNIX out there. Not only that, but the Windows equivalent of command options (i.e., how one navigates the GUI to get to this or that tool) has changed so much that at times I find myself clicking like crazy all around the user-friendly desktop to find the tool I need. And let us not even talk about the changes at the command prompt or, horror of horrors, the Visual Basic programming environment. As Murphy points out, this is essentially an issue of good versus bad (or failed) marketing. {link to this story}