Saturday, April 29, 2006

Wither the desktop?

It's a personal thing. My desktop is a virtual space in which I work, play, and socialize. As such, I want it to work my way.

This covers several aspects: the way it looks (colour scheme, imagery); the way it behaves (in response to my requests); what it does (which programs are available).

In unix-like environments, the desktop was originally a number of independent windows looked after by a window manager. This has evolved into the (more or less) tightly integrated desktop environments such as GNOME and KDE that we have today.

In years gone by...

Some of the early window managers were fairly basic. I remember wm and uwm, but I really liked the old Ardent window manager. It was friendly and very customizable.

Time went on and the original generation of window managers bit the dust. It seemed that twm was the new standard. I never used twm itself much, as I was starting to run into problems caused by having too many windows open. So I ended up using tvtwm - which adds a virtual desktop - instead.

Customizing tvtwm was fairly easy (assuming you were happy to edit configuration files), and the degree of customization available was quite extensive. You had complete control over all the mouse button events, for example. I defined all my own menus. And you could define your colour scheme extremely precisely - I had the window decorations for each type of window in different colours, so I could more easily spot a particular application on screen.
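
From memory, a fragment of a .tvtwmrc (the syntax is twm's) looked something like this - the colours, bindings, and menu entries here are invented for illustration:

Color
{
    BorderColor "slategrey"
    TitleForeground "white"
    TitleBackground "steelblue"
    {
        "XTerm"  "darkgreen"
        "Emacs"  "firebrick"
    }
}

Button1 = : root : f.menu "apps"
Button3 = : title : f.raiselower

Menu "apps"
{
    "Applications"  f.title
    "xterm"         f.exec "exec xterm &"
    "emacs"         f.exec "exec emacs &"
}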

Then along came desktop environments - a window manager with an associated set of libraries, and a (more or less) complete toolset. Open Look and CDE were the well-known ones, although I'm sure there were others.

However, I found one thing in common with both Open Look and CDE: I hated them both. Utterly and completely. They both forced me into unnatural and counterintuitive ways of working, and I couldn't stand either of them for more than five minutes.

We're now into the brave new world of Gnome and KDE. Again, the aim is a complete, all-encompassing desktop environment, with a set of libraries and a full set of applications.

Unlike Open Look and CDE, I've found I can tolerate both KDE and Gnome. Sure, they're slow, and both have irritating features, but both are good enough that I don't start swearing at the screen after a couple of minutes.

Have we really made progress in the last decade? I'm not sure we have. Some of the applications we have now certainly have more functionality than was available 10 years ago, but I don't see that they are necessarily better suited to today's problems than the applications of 10 years ago were to the problems of the time.

To put it another way: ten years ago, the applications and environments I had available met my needs of the time more than adequately. That's no longer true. And it's not as if my requirements have actually changed all that much - certainly less than the overall computing landscape has.

I'm not optimistic about the future. I was less than impressed with the Gnome-based JDS that will shortly go into OpenSolaris. I see very little of interest, and a lot of regressions.

What I also see is a lack of variety, a lack of excitement, no spark. Couple that with an increasing inability to do the basics, and the desktop is withering away.

When it comes down to it, I'm not actually asking for very much. I want to be able to set the focus policy, define the actions to be taken on mouse clicks, define what menus appear, and under what circumstances, and with what contents, and define the shape, colour, and imagery of decorations. And then use the applications that I want.

The desktop environments appear to have become less customizable, not more. Consider the available themes. There are thousands for WindowMaker (and some of them are quite decent). How many Gnome themes are there? Now, OK, a WindowMaker theme doesn't really do very much, but it does what you want.

It's not as if the desktop frameworks have provided a solid foundation on which to build better applications, either. Most of the standard applications that ship are pretty poor. I would much rather have a dedicated window manager allied with best-of-breed applications than a bunch of applications that happen to be built using the same toolkit.

In summary, I feel that desktop development has headed off down a cul-de-sac, and we need to get back on the main road to make real progress.

Thursday, April 27, 2006

More distractions...

As if I didn't have enough distractions, I've been playing with Google SketchUp.

(And yes, this means on the Windows box...)

I've used various CAD and 3-D programs over the years (and the majority of consumer ones felt like they were just some third-rate CAD lookalike), and they were pretty hard work. SketchUp is brilliant - so easy to build a model.

Wednesday, April 26, 2006

Die, configure, die!!!

So I'm building a new version of Eterm and, as usual, the ./configure script is getting in the way.

First it couldn't find libast. That was legitimate - so I install libast and expect all to be well.

So now it constructs a compiler command line that gcc chokes on.

That's fine; I tell it to use cc instead, and now the link can't find libast. The reason is simple - it's putting -last ahead of the -L switch that tells it where the library is.
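
The ordering matters because -L only applies to -l options that come after it on the command line. Schematically (object and path names invented for illustration):

cc -o Eterm *.o -L/usr/local/lib -last    # works: search path seen before -last
cc -o Eterm *.o -last -L/usr/local/lib    # fails: -last resolved before -L is seen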

So I fix that and the compile just blows up. Looking more closely:

cc -DHAVE_CONFIG_H -I. -I. -I.. -I/usr/local/i clude
-I/usr/local/i clude -I/usr/ope wi /i clude -O -c
actions.c -KPIC -DPIC -o .libs/actions.lo

Look closely: it's whacked all the n's out of the include directories.

So I randomly shuffle my PATH (presumably it's found a sed it doesn't like, but isn't that the sort of thing that autoconf is supposed to solve?) and eventually get it to work.
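
The shuffle boils down to pushing a sed you trust to the front of PATH before re-running configure; /usr/xpg4/bin is just my guess at a saner sed on Solaris:

PATH=/usr/xpg4/bin:$PATH ./configure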

It won't compile - the Sun compiler simply doesn't like the code.

Ho hum. Back to gcc - let's try a different version. OK, so I get it to configure and make, but it still fails, this time with an Undefined symbol error.

The upshot of all this is that autoconf has just wasted 10 minutes of my time and completely failed to produce a successful result at the end of it all.

Sadly, this is becoming the norm.

Tuesday, April 25, 2006

Retro gaming

There aren't enough hours in the day.

I'm trying to wrap up a bunch of contributions to OpenSolaris, and have managed to get some of them submitted, but I find myself getting distracted.

I just got Pro Pinball: Timeshock for the PSOne off eBay (and that has to be the best pinball simulation ever), so I'm spending a little time on that.

I just got the PS2 Atari Anthology and it's a blast. I remember most of the games. The great thing about having it on the PS2 is that it doesn't cost a fortune to play! The downside is that the controls don't always translate to a console gamepad all that well so some of the games can be quite tricky to play.

And then I've been playing some of the old ZX Spectrum games on World of Spectrum, using the Java emulator.

Gotta go, there's a hi-score to beat...

Sunday, April 16, 2006

Graphical DTrace

So the Chime Visualization Tool for DTrace is now available.

I'm itching to try this out, and to see how you can build bridges with kstat (by means of jkstat or something different).

Unfortunately there appears to be a network problem with my test machine. Or, more specifically, there's a problem somewhere in the path between where I'm sat and where it's at. I've managed to determine that the machine itself is fine, but doing any work on it is impossible. It being a holiday, I don't expect it to be fixed for a day or two.

Wednesday, April 12, 2006

Extra Software

No operating system - no matter how good - comes with a complete set of every piece of software you're likely to want. There are always cases where an end-user needs to add additional software to meet specific needs. Corporate servers need business software; developers may want to live on the bleeding edge.

I used to maintain a lot of stuff myself. But this has become harder and harder over time, as build systems become more complex and less intelligent, and dependencies become more entwined and harder to resolve.

One place I now use to keep stuff up to date - and to try software out easily without having to invest the effort in building the whole dependency tree from scratch myself - is Blastwave.

It's very easy to use: install pkg-get and away you go. It grabs the software you want and installs it, along with any dependencies you need. All simple and painless.
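
If memory serves, installing something is a one-liner - dia here just as an example package:

pkg-get -i dia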

Of course, what it also shows you is how bad this dependency tracking gets. I wanted to try out a couple of pieces of software this afternoon - dia and enlightenment as it happens - and off it went, installing package after package. It's a lot easier than doing it myself, but complexity is on the rise and I'm not sure how close we are to total meltdown.

Install investigations

There's been a reasonable amount of discussion recently regarding Solaris installation speed. Indeed, there's even some idea of what the problem is.

However, closer investigation reveals that rewriting the contents file isn't the limiting factor in a Solaris install. It's not even in the top 3, which are:
  • The time taken to uncompress the data (the packages are bzip2 compressed)
  • The time taken to write Solaris to the disk
  • The SMF manifest import on first boot

Currently, the contents file rewrite is racing the pkg tool overhead for fourth place.

Why this disconnect between the obvious problem - and it is a problem - of rewriting this large file many times, and its relatively minor importance to Solaris install times? After all, for a slightly trimmed full install, Solaris itself accounts for 3.5G of writes, while the contents file rewrites come to twice that.

The point is, though, that rewriting the contents file involves a small number of large writes, which are quick. Solaris itself is many small files, so it generates something like 100 times the number of I/O operations.
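
To put rough numbers on it (all of these are my own illustrative guesses, not measurements):

contents file: 7G in ~1M sequential chunks           ->  ~7,000 writes
Solaris files: ~150,000 small files, 3-4 I/Os each   ->  ~500,000 operations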

Not only that, but it's possible to tweak the package installation order to minimize the amount of rewritten data. Simply install all the small packages early and leave the big packages that bloat the contents file until last. Doing so could - in principle - reduce the contents file rewrites by an order of magnitude.
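
A hypothetical sketch of how you might compute such an order from an install image, smallest pkgmap first (the Product directory is the standard Solaris media layout; a real install would still have to respect dependencies):

cd Solaris_10/Product
for p in */pkgmap
do
  echo "$(wc -l < $p) ${p%/pkgmap}"
done | sort -n | awk '{print $2}'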

This affects zone installs as well. For a zone install, the uncompress cost doesn't exist at all. And for a sparse root zone, there's no Solaris to write to disk - it's loopback mounted from the global zone. So the contents file is a much more important limiting factor for zone creation performance. However, I've managed to halve the contents file rewrites by tweaking the package installation order. I haven't got that much control over the order, as it seems to depend on both dependency calculations and the order in which opendir() walks /var/sadm/pkg, but even so a gain of a factor of 2 was fairly easy.

This isn't to say that the management of the contents file isn't an interesting and important subject that can lead to some benefits, but the relative importance of it in install performance can easily be substantially overstated. There's other low-hanging fruit to have a go at!

Monday, April 10, 2006

./configure is evil

For years I've used emacs as my editor. I don't want to get into religious wars here - if you want to use vi, then that's fine too. I happen to like emacs because I started out with EDT and TPU under VMS, and when I moved off the VAX I had to find an alternative - and I was able to make emacs play ball pretty easily.

I don't use an IDE, or indeed any custom authoring tools for that matter. I write web pages in emacs, as plain HTML.

So I was glad to find out that I'm not entirely alone: Bill Rushmore just wrote about Emacs as a Java IDE. Spurred on by this, I thought about upgrading the version of emacs that I use.

I've been stuck on GNU emacs, version 19, for a very long time. The reason I haven't upgraded to a later version is that they're too darn slow. (The same goes for XEmacs.) Which is one reason for not using an IDE - an editor has to start faster than I can start typing.

I now have a dual Opteron desktop, so it should be possible to get the latest version to start up fast enough, right?

I wouldn't know - the darn thing won't build.

So I download emacs, unpack it on my Solaris box, see what configure options I have, and type:

./configure --prefix=/usr/local/versions/emacs-21.4 --without-gcc \
    --with-xpm --with-jpeg --with-tiff --with-gif --with-png

Why --without-gcc? I need this to go real fast, so I want to use the latest Studio compiler and crank it up.
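
"Cranking it up" would look something like this - the choice of flags is mine, purely illustrative, not anything emacs suggests:

CC=cc CFLAGS="-fast" ./configure --prefix=/usr/local/versions/emacs-21.4 \
    --without-gcc --with-xpm --with-jpeg --with-tiff --with-gif --with-png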

It fails. No xpm, no png - in fact, no nothing.

configure:2694: /usr/ccs/lib/cpp conftest.c >/dev/null 2>conftest.out
"/usr/include/sys/isa_defs.h", line 500: undefined control

Well, there's a fundamental problem. The configure script is trying to be too clever by half, and is calling cpp directly. Don't do that. Run it the same way the compiler does, and things will work much better.
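
With an autoconf-generated script, the standard way to say that is through the CPP variable:

CPP="cc -E" ./configure --prefix=/usr/local/versions/emacs-21.4 \
    --without-gcc --with-xpm --with-jpeg --with-tiff --with-gif --with-png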

So I tell it to use "cc -E". This gets it past some of the cpp stuff, but there are two new problems that crop up:

configure:5430: cc -o conftest -I/usr/openwin/include - g - O
-I/usr/openwin/include -L/usr/openwin/lib conftest.c -lpng -lz -lm
- lX11 - lkvm - lelf - lsocket - lnsl - lkstat 1>&5
ld: fatal: file g: open failed: No such file or directory
ld: fatal: file O: open failed: No such file or directory
ld: fatal: file lX11: open failed: No such file or directory
ld: fatal: file lkvm: open failed: No such file or directory
ld: fatal: file lelf: open failed: No such file or directory
ld: fatal: file lsocket: open failed: No such file or directory
ld: fatal: file lnsl: open failed: No such file or directory
ld: fatal: file lkstat: open failed: No such file or directory

There's a major thing wrong here. Why has it randomly decided to put spaces in? (And why does it think it needs libelf, libkvm, and libkstat just to see if it can find a png function? I'll let it off some of the other libraries: although it doesn't need to specify them all, there were times in the past when you had to.)

That's not all:

"junk.c", line 586: invalid input token: 8.elc
...

So it looks like it's trying to use the C preprocessor in different ways.

Foiled again.

Really, I can't see why anyone would think that getting a configure script to make a bunch of random guesses (and often get them wrong) is anything other than a stupid idea. There was a time when unix variants differed dramatically and you needed a way to tell them apart, but that time has long gone. As it is, we've now got to the point where the amount of software that compiles and works is heading towards zero. And the worst thing is that - unlike in the old days, when fixing the Makefile to make it work would have taken a couple of seconds - actually correcting the error when configure goofs up like this is almost impossible.