mahiwaga

I'm not really all that mysterious

backward compatibility

I don’t know if it’s because I have just a touch of risk-seeking behavior, but backward compatibility never struck me as a compelling enough reason for anyone to deliberately sabotage innovation. And yet, witness the gutted shell that is Vista, which lacks interesting features like WinFS and Monad/PowerShell (although the latter is eventually going to be released), features that would actually make me want to explore this brave new OS. (And those are only the most infamous of the scrapped features, to boot.)

A glance back at personal computing history shows that backward compatibility was never an excuse to forego new features. I can even remember the 8-bit days of yore, when I was a Commodore fanboy. While there was theoretically a compatibility continuum between the PET, the VIC-20, and the C64, each iteration of personal computer had more and more features, generally at the expense of other features. (I mean, it must’ve been tough, working with only like, what, 32k of RAM? A thousand times less RAM than my cell phone?) Their floppy drives, which were in fact independent computers, went through even more iterations of CBM-DOS, each barely compatible with the one before. The most popular version, CBM-DOS 2.6, had completely abandoned the notion of dual drives (which sadly ended up causing bizarre bugs).

While the Plus/4 and the C16 may seem like a testament to the dangers of completely abandoning backward compatibility, the C128 offers the opposite lesson entirely. The Commodore 128 was built with backward compatibility in mind, in order to leverage the huge amount of software available for the Commodore 64. Unfortunately, this discouraged developers from exploiting the C128’s unique features, and it never came close to reaching the popularity of the C64. Even more distressing, though, is that the C128 was the last in the line of PET-like machines. From then on, Commodore went with the Amiga, which was actually quite a successful product (many are still in use to this day) but which never truly caught on, eclipsed by the Macintosh. Finally, Commodore started making x86 clones, which eventually led to the complete demise of the company.

The other reason I think backward compatibility is bunk is that no one buys a gaming system for backward compatibility. One of the first consoles to make it a real selling point was Sony’s PlayStation 2, and probably the only reason it didn’t suffer a C128-like fate was that the original PlayStation’s architecture was incredibly ancient by then. (Still good for a few games, though.)

Think back to the jump from MS-DOS 6.22 to Windows 3.11. Clearly you couldn’t continue to write console-based, multitasking-unfriendly applications if you wanted to be taken seriously. (Oh, I know, there are plenty of console-based applications still around, but most of them were first built for UNIX, meaning they can’t monopolize the CPU the way badly written Win16 and Mac Classic apps could.) Sure, the transition took a while, but I don’t see anyone crying about not being able to run their MS-DOS-based apps from 1992. You either get with modern times, or you don’t upgrade; simple as that. (I mean, there are still people running version 2.2 of the Linux kernel, after all. If it ain’t broke, don’t fix it!)

With CPUs as powerful as they are now, and with RAM and hard drive space cheaper than dirt, I don’t see any reason to hobble your OS with backward-compatibility handicaps. You just virtualize: run your old apps in a sandbox that can’t take the rest of the OS down with it. That’s how the transition from MS-DOS to Windows 95 worked (in theory), and it’s how Apple managed the transition from the Classic OS to OS X.

Maybe I’m just thinking too much about the standard development cycle of popular Open Source projects. When GNOME retooled their architecture during the transition from version 1 to version 2, with little-to-no heed given to backward compatibility, the developers took it in stride. In a matter of months, there were GTK2/GNOME2 versions of various apps like Evolution, Galeon, Nautilus, and the GIMP, and while you could still run GTK1/GNOME1 versions with the proper libraries installed, it was kind of ugly, and there really wasn’t that big of a reason to. Even the monolith that was Mozilla was up and ready in a timely fashion.

But maybe that’s just the open source community.

When you’re doing this for money, the timeframes apparently change quite drastically.


From a software consumer’s standpoint, it seems like backward compatibility is just an excuse for developers to be lazy and not work to implement new features. It gives them an alibi for why their app doesn’t play nice with the latest-and-greatest OS. But software is not supposed to be stagnant. You don’t just write a program and expect to collect money from it for the rest of your life. Even the RIAA is learning that sad lesson.

posted by mahiwaga

baffling (how i learned to stop worrying and love the gui and high-level languages)

Now, don’t get me wrong. I’m not a developer. The extent of my hacking history lies in the good old 8-bit days, when I was hand-coding machine language programs into BASIC DATA statements. I learned, of all things, Pascal (which happened to be the language used on the AP Computer Science exam) and tried to muck around with C and C++, but eventually gave up on those and ended up learning Perl instead.

I once wrote an extraordinarily kludgy medical record-keeping application for my dad in Turbo Pascal 5.0 (he still kept hard copies of everything anyway, thankfully), and when I used to work as a medical biller, I wrote a little script in Visual Basic for Microsoft Office(!) to help me keep track of things, which rapidly turned into a nightmarish, molasses-like application experience.

So I’ve never really written a program that anyone has (exclusively) depended on, and I’ve done very little bug-fixing, much less optimization.

I can only imagine what it means to be part of a massive corporation, working on a little subroutine or object class that happens to be crucial to the project, and hoping that it’s implemented well enough and documented well enough that it won’t bring the entire project crashing down.


On the flip side, I gave up on Windows in 1999, and most of the apps I use are Open Source, or at least based on Open Source. From 1999 to 2002, I was running Red Hat Linux as my main OS, until I decided to get a laptop. (At the time, Linux support for laptops kind of sucked. It would’ve been a major pain in the ass, I think.) I couldn’t stomach the idea of switching back to Windows, so I bought a Mac.

The irony is that as an undergrad, I was a proponent of x86-based hardware, and therefore staunchly opposed to the very proprietary nature of Macintosh hardware. Now, I don’t think I ever thought Windows was “cool”, but my roommate and I would have Mac vs. PC arguments all the time.

The other issue I had with the Mac was its GUI-only nature. I had grown up on the command-line, and I couldn’t (and actually still can’t) fathom the idea of interacting with your computer only through a mouse. (Even as I type this, I have a few terminal windows open in Mac OS X. It’s still easier for me to hit Cmd-Tab, type cd ~, and then open some-file—using command completion in zsh—than it is for me to click on the Finder icon, navigate to my home folder, scan the thousands of unsorted files sitting there, and finally double-click on it.)

But bizarrely, it was Linux (along with X and GNOME) that taught me to love the GUI. You could do all sorts of weird stuff with the various window managers available, like have windows resize to set dimensions (useful for previewing HTML at small screen sizes) or have all your windows automatically roll up and stack themselves in a line on the right side of the screen. It was a matter of a few mouse-clicks to get a window to stay at the top of the pile, no matter how many other windows you opened up. I also grew dependent on multiple desktops (the fact that this is missing in Mac OS X is only made bearable because of Exposé). And at the time (in the summer of 2000), GNOME had the only open-source web browser with tabs (Galeon), a feature that all modern browsers now have, and one I can’t use a browser without. (IE 6?! Bleh. Bleh. Bleh.)

The only thing that made switching to Mac OS X palatable was the fact that it had an entire UNIX subsystem running underneath the pretty GUI. Meaning, unlike Mac OS Classic, you could drop out into a terminal and navigate and manipulate your filesystem with little cryptic commands like cd, ls, find, xargs, and grep. And meaning that you could (1) run an X server and (2) compile all the apps I had been using on Linux and use them on the Mac. For a while, Galeon remained my browser of choice (until I found Camino, née Chimera).

But X is a hungry beast, and after a while, I got tired of hard drive thrashing, and I couldn’t afford to buy more RAM. So eventually I settled into Mac OS X proper. Right now I have seven applications up, not counting the Finder or the Dashboard: (1) Mail.app (2) Vienna (3) Safari (4) iTerm (5) Chicken of the VNC (6) VLC (7) Preview.app. Five of these seven are open source projects. All of them have equivalents on Linux and Windows (with Cygwin installed.) As you can see, I’m not a big fan of proprietary software. (Hell, I grew up in a time when people would type in huge programs from the back of a magazine, sometimes entirely in raw machine language—just strings of unadorned hexadecimal numbers. My word processor of choice on the Commodore 64 was Speedscript, which came out of the back of Compute!’s Gazette.)


What inspired this rehashing of my own personal computing history was this post on The Old New Thing discussing a very poorly thought-out function called IsBadXxxPtr, which supposedly tells you whether or not a pointer is valid.

And frankly, I find it astounding. Not so much that these functions exist, because they are probably useful to somebody somewhere trying to debug something. But that these functions would be allowed to escape into production code, resulting in a situation where you can’t just get rid of these functions without breaking apps.
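
To make the concern concrete, here’s a minimal C sketch of the pattern in question. This is my own illustration rather than anything from the post; process_record is a hypothetical caller, and the point is that the “validation” is unreliable by design:

```c
#include <windows.h>
#include <stdio.h>

/* Hypothetical caller that "validates" a pointer with IsBadReadPtr before
 * dereferencing it. The check can lie: the pages could be unmapped (or
 * remapped) between the check and the use, and probing a thread's stack
 * guard page this way can break automatic stack growth. And once code
 * like this ships, the function can never be removed without breaking apps. */
void process_record(const int *record)
{
    if (IsBadReadPtr(record, sizeof *record)) {
        /* Papers over a bug that should have crashed loudly at its source. */
        fprintf(stderr, "bad pointer, skipping record\n");
        return;
    }
    printf("record value: %d\n", *record);
}

int main(void)
{
    int value = 42;
    process_record(&value);      /* a valid pointer: prints 42 */
    process_record(NULL);        /* happens to be caught... */
    process_record((int *)16);   /* ...but "caught" is not the same as "correct" */
    return 0;
}
```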

I try to figure out why this should be so, and why such practices never make it into production UNIX code (other than the fact that the standard APIs don’t have functions like these.)

Is it because there are so many Windows developers, in contrast to the number of *nix/Linux/Mac OS X developers? That maybe these kinds of kludgy functions exist in core APIs, but they can be easily deprecated because not that many apps actually rely on their bad behavior? That the apps that would break wouldn’t cause a massive uproar?


I don’t know. I feel like backward compatibility is overrated. If you want to stay compatible, then you shouldn’t upgrade your system. Problem solved. I mean, people still run CP/M and VAX/VMS, don’t they? There are probably even a few MS-DOS boxes floating around somewhere, maybe even connected to a network!

Meanwhile, the bleeding edge can throw out the baby with the bath water and actually innovate. And if you, as a developer, feel that you just can’t live without these new features, then you’ve got to rewrite your app so that you can compile it against the new API.

Isn’t this a massive duplication of effort? Not necessarily. If you have clean code that is well documented, you might get away with just rewriting a few functions here and there.


This certainly reminds me of the general debate about Windows Vista. Microsoft could’ve pulled an Apple and just scrapped all the legacy code and started fresh. And to keep the break from being overly traumatic, they could’ve done what Apple did with Mac OS Classic: turn the old Windows stuff into a stand-alone virtualized system. After all, they already own Virtual PC, so why not? Most people have dual-core machines these days, and the other core just sits idly by; why not give it something to do? And maybe they could also have offered a compatibility layer (akin to Apple’s Carbon API) so that all those Windows developers wouldn’t have to start over from scratch, and could continue to develop new apps that would run natively on both XP and Vista, without having to run the virtualized Windows Classic.

All sorts of doom and gloom were predicted when Apple abandoned the classic Mac OS entirely and adopted Jobs’ baby, NeXTSTEP, which much of OS X is based on. But that’s now ancient history, with Apple making yet another jump (albeit this time on the hardware side). And how many people still write Carbon apps these days?


But a lot of these problems stem from the fact that, despite being nearly 40 years old, C is still the dominant language for developers. Sure, there’s C++, but is new and delete really all that different from malloc and free? Manual memory management seems like such a Sisyphean task. That’s why I quickly gravitated to scripting languages which generally use garbage collection. After all, I learned to program in BASIC, of all things. Commodore’s BASIC 2.0 was infamous for the incredibly long pauses when garbage collection was triggered, but I could live with that. I mean, it wasn’t bad for a computer running at 1 MHz.
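
As an illustration of the kind of bug that makes manual memory management feel Sisyphean (my own minimal example, not tied to any particular codebase), here’s the classic use-after-free in C, the sort of thing a garbage-collected language simply won’t let you write:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *name = malloc(32);
    if (name == NULL)
        return 1;
    strcpy(name, "mahiwaga");
    printf("before free: %s\n", name);

    free(name);   /* the block goes back to the allocator, but the pointer remains */

    /* Dangling pointer: dereferencing freed memory is undefined behavior.
     * It might print the old string, print garbage, or crash outright,
     * depending on what the allocator has done with the block since. */
    printf("after free:  %s\n", name);
    return 0;
}
```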

In time, we’ll have a generation of coders who have cut their teeth on Java and C# and Python and Ruby, and who won’t know or care about dangling pointers. Until then, we’ll have to keep dealing with the kludgy workarounds developers resort to in an effort to avoid shooting themselves in the foot, and with their crash-provoking consequences.

posted by mahiwaga

trolling the board

A few small gems, found while looking for potential admits on tonight’s emergency department board, that made me laugh out loud:

In the disposition column next to a patient’s chief complaint, where they usually note where the patient is going to go next (e.g., radiology, discharge home, admit medicine, admit ICU, etc.):

Wants to get his ass kicked!

For another patient complaining of penile discharge and testicular swelling, in the part of the chart that usually notes pre-hospital care (e.g., IV started, oxygen given, sublingual nitroglycerin given, intubated in the field, defibrillated in the field, coded for 55 minutes in the field, etc.):

Has been smoking marijuana to relieve pain

posted by mahiwaga

addicted

Quizzes. Not from J™. Unfortunately I don’t remember the source.

[I am 71% Addicted to Coffee][1] (Mingle2 - Free Online Dating)

I am 78% Addicted to Blogging (“How Addicted to Blogging Are You?”, Mingle2 - Online Dating)

OK this one is from J™:

[Are you addicted to MySpace?][5]

And in the same vein:

You’re Strung Out on MySpace!!!
Take Are you addicted to MySpace? today!
Created with Rum and Monkey’s Personality Test Generator.

You’re a full-blown addict. Please admit this to yourself, if you haven’t already. MySpace is your drug, your world, your all. You eat and breathe MySpace. You walk and talk MySpace. You just can’t get enough. If you’re not checking your Profile Hits, you’re attempting to add more “friends” to your list. If you don’t have a new message, you quickly send one out and await a reply (oh, that red “New Messages!” alert!). Perhaps there is a MySpace Addicts Anonymous group you can join …



**You Are 44% Addicted to Myspace**


Your Myspace addiction factor is: Moderate

You’re slowly building a very strong addiction to Myspace. Get out while you still can!


[Are You Addicted to Myspace?][9]

Curse you, Tom.

And Rupert Murdoch needs to hire some people who actually know how to code. Myspace looks like Web 1.0 circa 1996. I’m surprised that they actually use CSS. And that the blink tag isn’t all over the place inducing epilepsy.

posted by mahiwaga