Friday, August 31, 2001

Dan Libby and XML-RPC Dan's a developer friend of mine who's done some interesting work with XML-RPC. As this protocol has been in the news for a while, I asked Dan to tell us about it. Here's his response:

In 1999, when I joined Epinions.com, we were using a proprietary database/app server, and a custom apache module to talk to it. The apache module parsed some very simple html templates and found "entities" that looked like standard html entities but actually contained function calls to the backend server. We had a proprietary protocol, RAD (Random Ass Data), written by Lou Montulli of Netscape fame, that transferred these requests back and forth. Typically, the returned data would simply be blobs of html that would be inserted directly into the document. This meant that the backend server, which was C-based, had to understand a great deal about html, which was bad.

When we decided to integrate php into our system, one of the main goals was to separate the display-oriented processing from the logic/data-oriented processing. This required a way of sending back much richer result sets corresponding to native php types: ints, doubles, lists, hashed arrays, etc. To this end, I devised a simple xml vocabulary that was almost a 1-to-1 representation of php's native data types. I then created an API in the app server code for generating this vocab, and a small php extension which used expat to parse the xml and then decoded it. In the final days of that project, I moved on to introspection and some other fun stuff. For example, I created a C API for describing methods and their arguments, and then a php function for formatting the returned data prettily. Thus, php coders finally had some decent documentation as to what they could expect from a given method call in the server.

The day came when we had to re-architect the site, circa May 2000. We liked the current xml-based response mechanism, but the new architecture called for more complicated queries and a more robust request/response solution. By this time, I had read about XML-RPC and had noted its great similarity to what I'd been doing. I proposed that we switch to standard XML-RPC for the new project, and it was agreed to. The only problem was that I could not find any great C implementations. I found one, expat-ensor, that generally worked, but we had performance issues and found the API non-intuitive. Worse, it was no longer supported. One day, after battling with ensor for a long time, I got to looking at my old code and decided that the API was actually more sensible and could work just fine for XML-RPC. I sat down, and within 2 days had a working prototype that was many times faster, architecturally cleaner, and could be plugged into either the php extension or the backend server. A few days later, it could read/write either the xml-rpc vocabulary or the simpler vocab I had previously devised.

In the early days, while working out the bugs, it was quite neat that we could use the php native xml-rpc implementation (by Edd Dumbill) interchangeably with my C implementation. I wrote a pair of simple xmlrpc_encode/decode functions for Edd's code that mirrored my own APIs, and thus switching between the implementations became a one-line change. I think this was when people within our organization really started to see the benefit of using a standard protocol.
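For a flavor of what that paired API looks like from php, here is a minimal sketch using the function names the xmlrpc-epi-php extension exposes; the method name and parameter are invented, and the transport is left out:

    <?php
    // Native php values in, an XML-RPC <methodCall> document out.
    // ('rating.get' and the parameter are made up for illustration.)
    $xml = xmlrpc_encode_request('rating.get', array(42));
    echo $xml;

    // And back again: XML in, native php types out.
    $params = xmlrpc_decode_request($xml, $method);
    var_dump($method, $params);
    ?>

Because both implementations exposed the same encode/decode shape, swapping one for the other really was a matter of changing which pair of functions got called.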

(on WSDL) Nevertheless, it strikes me as a bit strange that a protocol capable of describing data/objects of any type (SOAP) requires yet another xml vocabulary to provide introspection capabilities. That said, I'm sure it is great, and I will probably support it at some point. I'd like to see xml-rpc actually mentioned in the wsdl spec(s), given that it is not supposed to be soap-specific.

My philosophy with xml-rpc has always been that the vocabulary should be as simple as possible, providing a small set of building blocks on which more complex things can be built. So when I needed introspection capabilities, I simply defined them in terms of xml-rpc's native methods and data structures. Since the early days I have provided some form of introspection in my code, and a few months ago I posted a detailed yet still small spec [1] that describes how this works, so that other implementors may also choose to support it.

My introspection support is meant to work in a fashion similar to javadoc, robodoc, and other auto-documentation systems. Basically, the developer leaves markup in the code and the documentation system makes sense of it. The developer may leave as much or as little information as desired. The cool thing about this is that it can be queried at run time by anyone with access to the system, and formatted by them in whatever manner they desire. Further, both the parameters and return values of methods may be nested arbitrarily deep and have names/descriptions associated with them. I have used this data to provide server help page(s) and interactive web interfaces to the xml-rpc server. Given that the introspection data provides method names, parameter types, and descriptions, it becomes possible for the client to present the user with a form wherein s/he may fill in the parameters by hand and execute the method call.
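To make the run-time query concrete, here is a minimal php sketch of asking a server to describe itself. The method name comes from the introspection spec linked as [1] below; the endpoint URL and the transport details are assumptions for illustration:

    <?php
    // Ask the server to describe its methods; empty params asks for everything.
    $request = xmlrpc_encode_request('system.describeMethods', array());

    // POST it to a hypothetical endpoint; any HTTP client would do.
    $context = stream_context_create(array('http' => array(
        'method'  => 'POST',
        'header'  => 'Content-Type: text/xml',
        'content' => $request,
    )));
    $response = file_get_contents('http://example.com/RPC2', false, $context);

    // The reply decodes to plain nested arrays: method names, typed and
    // named parameters and return values, and free-text descriptions.
    print_r(xmlrpc_decode($response));
    ?>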

A few Introspection examples:

Terraseek's GPS Web Service uses introspection to provide a browser interface.
http://216.101.160.38/xmlrpc.html

A generic introspection client that can be pointed at any xmlrpc-epi server
http://xmlrpc-epi.sourceforge.net/xmlrpc_php/introspection_client.php

Pretty formatting of introspection data via php function
http://xmlrpc-epi.sourceforge.net/xmlrpc_php/introspection.php

(Having built the system at ePinions, what was the story behind open sourcing it?)

I wanted to open source it from the start, and told my manager immediately, who was amenable. I got it to the point where it passed the XML-RPC.com validation test suite just prior to Thanksgiving of 2000, and requested official approval. Long story short, it took until March 2001 before the legal department was satisfied and I was finally able to open source it. In the meantime, Eric Kidd came out with his C/C++ library, thus making mine a bit redundant. The positive news is that this effort sort of cleared the barriers, and Epinions.com has now open-sourced two more pieces of software: mvserver [2], which uses xml-rpc, and yats [3], which is a fast template engine that I wrote a while back. I have received a lot of positive feedback -- enough to keep me working on the library from time to time, though not as much as some would like. The big news is that I've just received access to the php CVS repository, and am planning to make xmlrpc-epi-php a standard php extension. woo hoo!

[1] http://xmlrpc-epi.sourceforge.net/specs/rfc.system.describeMethods.php
[2] http://mvserver.sourceforge.net/
[3] http://yats.sourceforge.net/


(Where did your XML library come from?)

I wrote it in order to facilitate integrating php into our system. Specifically, it was two pieces of code: an API on the backend server that would spit out xml representing data structures, and a separate piece on the php side that used expat to parse the xml and convert it into native php data structures. The XML vocabulary was similar to that of XML-RPC, but more human-readable, and with support for mixed arrays, which php supports but XML-RPC (and some languages) do not (mixed array = some values have keys, some do not; see the sketch below). This vocabulary was something that I just came up with on the whiteboard one day in about 20 minutes; I asked Lou if he liked it, and sat down to crank out the code.
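To see the distinction in code, here is a mixed array in php (this much is plain php semantics):

    <?php
    // A mixed array: some values have keys, some do not. php allows this,
    // but XML-RPC forces a choice between <array> (no keys) and <struct>
    // (all keys), so the vocabulary needed a way to express both at once.
    $mixed = array('red', 'green', 'label' => 'colors', 'blue');
    print_r($mixed);  // indexes 0, 1, and 2 coexist with the key 'label'
    ?>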

It really only has two type elements: "scalar" and "vector". Vectors may be arbitrarily nested within each other, as with XML-RPC. Scalars [now] support the same types as XML-RPC, although originally the vocabulary did not include base64 or datetime. I also "borrowed" methodName, methodCall, and methodResponse from XML-RPC when working on the 2nd generation library; the first generation vocab had not required them because it was only ever used for responses.
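To give a feel for the shape being described, a document in this style might look something like the following. This is a guess at the syntax from the description above, not an actual sample; the real ones are linked just below:

    <!-- hypothetical simpleRPC-style document, shape only -->
    <vector type="struct">
      <scalar type="string" name="user">dan</scalar>
      <vector type="array" name="scores">
        <scalar type="int">10</scalar>
        <scalar type="int">20</scalar>
      </vector>
    </vector>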

More examples of this vocab are available on the SourceForge Page at:
http://xmlrpc-epi.sourceforge.net/main.php?t=samples

My current xmlrpc-epi distribution still supports this original vocabulary, called "simpleRPC", and it can actually read/write either it or XML-RPC. This is possible because I wrote the library in a modular fashion: there is a parsing layer (expat), a DOM layer (custom), serialization layer(s), and finally the data structure and API layer. I am currently toying with the idea of plugging in a serialization layer for SOAP (or a subset thereof), which would yield a single C library that can read/write XML-RPC or SOAP interchangeably through a single application-level API.

(Why did you use XML-RPC?)

I think it was just a matter of a common need and similar solutions. I had needed a way to represent arbitrarily nested, typed data. XML seemed the easiest way to do it, because the parser was already written for me. UserLand had needed something similar and thus had come up with a solution that was very close to mine, and so it matched up well with my existing API.

(Tell me about what it took to get folks to use the standard protocol)

I think there was quite a bit of the "Not Invented Here" syndrome. Heck, we were even using our own database! Also, remember that this code was originally written as a means of connecting our web server and the backend system(s). It was not really important that it be able to interoperate with other applications. It was important that it be very fast. XML is quite verbose, which translates into a lot of network traffic and additional parsing overhead, and so there was originally some concern that perhaps we should not use XML at all. Consider that at the time we were serving over a million page views a day, and that some of those pages contained > 20 requests to the backend servers. When you start dealing with those kinds of numbers, performance becomes very important, and the existing XML-RPC implementations, most of them for scripting languages, were simply not up to the task. Thus, we could see the advantages of using an open protocol, but it soon became clear to me that I'd have to roll my own implementation. Ultimately, I felt this was one of the more valuable things I did for my career while employed at Epinions, because it got me involved with technology and people outside the company, and that type of knowledge and experience is much more transferable than the arcana of how a particular web site operates.

(How did you decide where the complexity would sit, in the wrapper or a process that sits on the end of a pipe? i.e. would you embed the protocol in a more complicated protocol, or would you ask programs getting the data over the pipes to do complicated things with it?)

Either/both. I think that as applications are developed to use XML-RPC, people realize they are doing common things over and over, and begin to form higher-level protocols to address them. This is what I did with Introspection, and also with standardized error codes, which are not part of the XML-RPC spec (http://xmlrpc-epi.sourceforge.net/specs/rfc.fault_codes.php).
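In php terms a fault is nothing exotic; it decodes to an ordinary struct, which is what makes layering standardized codes on top of it easy. A sketch, with a placeholder code and message rather than values quoted from the spec:

    <?php
    // A decoded fault is just a struct with two well-known members.
    // (The code and message below are placeholders, not from the spec.)
    $result = array('faultCode' => -32601, 'faultString' => 'method not found');

    if (xmlrpc_is_fault($result)) {
        // With standardized codes, every client maps the number to the
        // same meaning no matter which server implementation produced it.
        printf("fault %d: %s\n", $result['faultCode'], $result['faultString']);
    }
    ?>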

I think that, in general, this is the way the internet and the web have developed. Relatively simple protocols have been piled, one upon another, to ultimately enable something that is very complex. TCP/IP is composed of many layers; HTTP sits on top of them. Now XML-RPC and SOAP are sitting on top of HTTP, and eventually other things will sit on top of them. In SOAP's case, there are already WSDL and UDDI, for instance.

(Tell me about management pushback when open sourcing the code)

The pushback had to do with the license and with liability. Surprisingly, the GPL was deemed "too restrictive", so we ultimately settled on a BSD-like license. Then we had to prove that there was no code in the library owned by anyone else. The lawyer(s), of course, had much more important things to be doing, and so all of this took much longer than expected.

I wanted to open source it because I love open source. I can't stand the thought of having to write something that I know someone else has already written. It annoys me. It feels wrong. So anytime I can spare someone else that pain, it makes me happy.

I think the company agreed to do it because it was non-essential software. It was not something that really gave us a competitive edge or was otherwise deemed "strategic". Further, since it was written around an open protocol, it just made common sense that it would be useful for others, and that it might even be useful for our partners, etc.

Thanks for writing in, Dan!
Link to this article

Thursday, August 30, 2001

Hackers should go to Washington At the LinuxWorld conference, Stanford Law professor Lawrence Lessig called on hackers to put away their technical arguments and fight to keep the Internet open against vested interests (big business, government, Microsoft, and the publishing industry). He also argued that only open source has a vested interest in stopping patent abuse and protecting innovation.

Wednesday, August 29, 2001

Silicon Desert Dubai is a city in the United Arab Emirates, a collection of seven desert kingdoms near Saudi Arabia. It's also where I grew up. Every time I return the place has doubled in size, with more fancy hotels, shopping malls, and parks. It's really quite something.

Dubai now has its own Internet City -- basically a tax-free clustered campus designed to lure technology firms to the region. Dreamt up by Sheikh Mohammed (the crown prince of Dubai) in the heady days of 1999 and completed in less than one year, it stands ready just as the technology biz continues to collapse.

I chatted with Hussain Al Mahmoudi, Dubai Internet City's Marketing Communication Manager. Here's what I learned:

Although Dubai has no financial market, no technical academic roots, little manufacturing, and little installed tech industry, it does offer a great standard of living, committed government subsidies, and a strong import/export base. Many companies working in the Gulf like Dubai because their employees are happy and the infrastructure works. Given the lack of regional technology hubs (barring Israel), Dubai seems well positioned to set up a redistribution cluster to serve the Arab world. And Dubai is much closer to Bangalore than to Palo Alto. Setting up clusters is hard to do, as the struggling Silicon Glens of the world can attest, but let's wish them the best of luck.

The government is the major technology investor and buyer here. Their e-government initiatives are the best I have seen anywhere. My father uses the Internet to pay utilities, settle fines, and renew services. I wonder if they've cosied up to monopoly service providers the way the UK government did with Microsoft, or if they're resisting single-vendor lock-in like all companies should (but few do). If local buyers aren't savvy about this stuff, the region will become a feeding frenzy while Gates and Ellison lock in as many cash cows as possible. Ugly for Dubai, good for Microsoft and Oracle.

Dubai is also a pretty conservative place (but much more liberal than, say, Saudi). The government phone monopoly, Etisalat, provides all telecommunications infrastructure, and (like China) the entire country is hosted behind a proxy server that censors unsuitable websites. "Unsuitable" usually means porn, but occasionally includes The Register and Internet-telephony providers that threaten Etisalat's long distance revenue. The phone monopoly is too lucrative for the government to give up, but Dubai might get a second government entity that competes to provide the same service. That'll be interesting.

I think the Internet will shake up the culture here, albeit slowly. "Freedom of expression, freedom to create," (the motto of Dubai Internet City) sounds peculiar in a place with rigid labor policies, a two-tier legal system, and strict limitations on business ownership. But I think they recognize the need to give folks more liberty if they want to attract the better class of person they seem to desire. So far, they have 200 companies signed up, which is pretty good in these depressed times. If anyone out there is thinking about setting up shop in the Gulf region, I'd check out Dubai Internet City.

Link to this article

Tuesday, August 28, 2001

Clueless business folk I am continually shocked by how poorly people understand the "network effect." The "network effect" occurs when the value of a network rises as more users join it: each new member makes the network more valuable to every existing member. Email is an example, but hotmail is not. Phones are an example, but cell phones are not. Napster is an example, but streaming audio from a central server is not. Another way to put this is "positive demand side externalities." But shockingly, no one gets it. Andrew Odlyzko writes a good piece on the myth of Internet time, but even he doesn't really separate the network effect from people's slow uptake of new technology.

Linux goes to Wall Street I'm thrilled to see IBM targeting niche sectors with complete Linux-based products. This is definitely the way to build up critical mass in particular industry niches, and build up positive demand side externalities to monopolize those segments. This has already started to happen with small devices as well.

Thursday, August 23, 2001

Angry reactions to Gnome/KDE essay There was an angry discussion about my recent essay on Gnome and KDE, criticizing me for 1) assuming Unix users were superior to other people, 2) slagging Microsoft unfairly, 3) slagging printing stuff out on the computer, 4) slagging people who don't know how to write shell scripts, 5) ignoring the fact that Unix tools have been ported to Windows (and many other platforms). There were other complaints, but this is a good start.

This conversation was interesting because the article criticized Gnome and KDE, which are both Unix desktops, while using a particular Macintosh platform as a frame of reference. And while I don't claim the Office suite is dead, I would argue that most people mainly use their computers for Internet-related work (Web, email, IM, etc.) and that the Office suite obviously comes from a printer-based world. So while the Office apps are necessary for some tasks, they are niche uses compared to Internet stuff.

My argument was that the way Unix uses small programs passing plaintext between each other is a particularly good way to handle information over a network. I was not celebrating the OS itself. The Good Easy on my Mac has the Unix philosophy built into its GUI, but is not Unix. I can use it to automate away repetitive tasks without knowing shell scripts. For example, a simple feature like text expansion does not require scripting, cutting and pasting often replaces pipes, and quick-keys can speed up task-switching without anything resembling programming.

I was criticizing the desktop focus on Office-style productivity applications, of which MS Office makes up almost all the market, because it ignores basic Internet functions (like text editing) that Unix users know all about. Microsoft's poor support for basic plaintext management tools is not ameliorated by the availability of Windows ports of Unix tools. The regular user should not be expected to download emacs just to get a text editor less awful than notepad (and I don't think emacs suits the text editing needs of the non-programming home or office user either). But I don't anticipate good support for plaintext from Microsoft anytime soon; it goes against their lock-in strategy.

So, I'm not advocating that everyone should switch to Unix. I don't think Unix users are better human beings. And I don't think software is going to alter human nature. But I am arguing that when it comes to Internet related work (which I think is the most important use for a computer), folks should learn from the Unix philosophy of simple programs passing plaintext between each other and develop an appropriate GUI (like the Good Easy) that combines power with flexibility. I've been there, it can be done, and without a command line in sight.
Link to this column

How business misunderstands technology This article in Fortune talks about how companies are having trouble handling all their email. Large volumes of email can challenge individuals in any organization, but the proposed solution takes precisely the wrong approach and illustrates why businesses keep failing to enhance productivity with technology.

The challenges of email are cultural (what exactly does this message mean in a social context?) and practical (how do I handle 400 messages a day?). Making email more complicated by adding information and other baggage (like Outlook, EcoCap, etc. do) will just make the problem worse. This approach pretends that enough technology can solve any business problem and that more technology is the appropriate response to too much information. Some clueless purchasing agent will no doubt inflict this hateful system on workers, who will then suffer one more complicated, malfunctioning piece of electronic garbage while basic business processes remain unexamined. The software provider will lock the company into their system and extort money out of them for many years. As a shareholder, I would be furious.

I would recommend that companies explore protocols based on plaintext, and cultural training that enables employees to handle the increasing volume of email in their work lives. Mark Hurst has some well-developed ideas about handling email which sadly aren't written up anywhere online, but the crux of the idea is to ruthlessly delete bits, and to manage the bits that remain appropriately using simple tools. I experienced this system at Creative Good, and it works a treat.

Tuesday, August 21, 2001

Lock-em-in, Shake-em-down The particular economics of the software industry reward companies that can lock their customers in and then shake them down for all they're worth. Microsoft has perfected this tactic, and is extorting the City of Austin as well as corporate IT departments the world over. Of course, neither of these entities has a clue about how to use computers. This trick can even work for Unix.

Some sanity On a happier note, ICANN has been roundly criticized in a well-researched paper, and even the Washington Post has denounced the DMCA. I honestly don't believe this law will survive a constitutional challenge.

IBM and Linux Big Blue has a very clear motivation for supporting GNU/Linux -- it commoditizes away the operating system layer. GNU/Linux (and other open source software) also supports interoperability better and more reliably than any closed system can. In a fractured market, open source standards absolutely win. In a monopolized market (the desktop PC), it is much less clear.

Hardware support Here's a very thoughtful article on one person's experience with hardware support for Linux. Indeed, I noticed the petulance among Linux supporters the article mentions in my own recent post on this. Until Linux is more widespread on the desktop, it might make economic sense for hardware manufacturers not to support it. Once it spreads, however, there will be no reason for them not to commoditize away the complementary operating system. On the server side it's another story...

Monday, August 20, 2001

Why Gnome and KDE are misguided Back in 1987, John M. Carroll and Mary Beth Rosson published "Paradox of the Active User". The Apple Lisa had just been introduced, and researchers were interested in observing how normal people used computers. Carroll and Rosson studied users and observed that 1) people don't read manuals and 2) once they figure out how to achieve an effect, they will not change their protocol even if doing things a different way would save them time and effort. This behavior is paradoxical because automating repetitive tasks and using pre-built shortcuts would save users time overall.

This paradox is killing computer productivity now. In the pre-network days, the value of a PC rose when you bought a printer for it. Before, all you could really do was play games, but now you could type letters, create spreadsheets, and basically use the computer for useful tasks because you could share the results. This has all changed in the networked world: the value of a PC is now determined by the speed of its Internet connection. I care about checking my email, not printing things out.

Unfortunately, all the basic PC software comes from the pre-network age, when the computer was a machine that converted bits (digital data) into atoms (paper) through a printer. Word, Excel, Powerpoint, and much of Access are all built around the idea that bits are most useful when converted to atoms. Outlook, even though it was built for email, betrays its paper-centric heritage through its obsession with formatting, its inability to pipe plaintext in (or out), and its general monolithic architecture. In the PC world it was OK for a program to be aggressive in what it tried to do and conservative in what input it accepted. In the networked world, exactly the opposite holds true. But because of the active user paradox, most people have no idea how unproductive they are on their computers. They're like frogs in boiling water.

Who knows how to organize workflow and use a computer in the networked world? Unix users, of course, whose design philosophy, toolset, and culture grew out of the Internet itself, instead of having connectivity features bolted on. Small, stable programs passing plaintext between each other work well over a network, creating flexible, powerful, and simple systems. Unix has great text editors (better than desktop publishing packages), email clients (better than personal information management applications), search tools (grep vs. Windows search), and file management (the standard Unix hierarchy vs. Windows Explorer). In short, email, list-servs, bulletin boards, and a simple (plaintext) file hierarchy searchable with grep are better tools in a networked environment than Microsoft's paper-era suite. Moreover, the Unix environment gives users many tools to automate away repetitive tasks and capture productivity (and competitive advantage) over those who don't. Windows has yet to offer a decent text editor.

The great irony is that just as Microsoft is bolting more and more network features onto its paper-centric PC system, the Unix world, which has already figured out how to operate in a networked environment, has forgotten its heritage and is struggling to recreate the tired old desktop suite on Linux. While Linux may need the equivalent of Word to grow in today's desktop market, it's ludicrous to forget all the tools needed to operate in a networked environment. Unix users have already done all the intellectual heavy lifting in this area, and should port that thinking to the GUI instead of creating shadows of paper-era applications.

Mark Hurst developed a system he called the "Good Easy" that basically took the Unix design philosophy and ported it to a Mac OS 9 GUI (he calls the idea behind this "bit literacy"). Basically, the Good Easy consists of five key applications (email, browser, calendar, text editor, and file manager) that swap plaintext between each other (using cut and paste). This Unix pipe-style feature is created by tying each application to a function key and using those keys to switch between them. When using the system, I literally do not notice what program I am in at any particular moment; I just use the system to get my work done (does any of this sound familiar to Unix folk?). There is also a universal spell checker, a text expander (that expands character combinations into longer text strings, e.g. turning "za" into "zimran ahmed", "dt" into today's date, etc.), a quick-key creator (to automate away repetitive tasks), and search (Sherlock in OS 9 is better than search in Windows Professional, and "find" in the BBEdit text editor is grep).

This system is tied together by a culture that understands how to use the programs as a whole; the technology itself is simple. Everything is kept in plaintext (email is piped through the text editor to strip out the line breaks and then saved), folder hierarchies are kept flat, and there is a careful naming convention. User-created files are kept separate from application executables, which reduces backing up to dragging a single folder from the desktop to a networked drive. This should all sound familiar to Unix folk, but I noticed it was difficult for Windows people to embrace. They did not understand why plaintext was important, could not shuffle text between applications well, and happily let email pile up in their client instead of moving it to their hard drives (where it could be searched). And forget about automating away repetitive tasks using the text expander and quick-key creator -- way too difficult, no matter how much tedium it saves you.

Unix people have already figured out how to manage workflow in a networked, digital environment. Businesses that learn these lessons will have a competitive advantage over others. The fact that this seems to be forgotten in Gnome and KDE is a crying shame. You don't have to operate at the command line or be a programmer to enjoy the productivity benefits Unix offers. A pity this isn't reflected in the GUIs being built.
Link to this column
Configure the Good Easy on your Mac OS 9 (Mark did some great work here)

Friday, August 17, 2001

Great interview on Slashdot It's worth reading this Bradley Kuhn (VP of the Free Software Foundation) interview where he talks about software libre.

Two thoughts I had today on Free software:
1) The GPL is essentially the poor man's patent. Just as companies gather patents mainly to trade or pool them with other companies (and so ensure they have access to other patented works), the GPL maintains open access to a network of related software works. Individual developers can't stockpile patents to ensure they get a seat at the table, but they can license code under the GPL.

2) Hardware vendors that don't release their drivers for GNU/Linux are probably being paid off by Microsoft or other proprietary OS providers who want to fight the open source operating system. Hardware vendors have every incentive to commoditize complementary products (i.e. products that use the hardware), and having many open drivers is the obvious way to do this. The only reason they would act against their sound economic interest is if they were being paid. If anyone has any inside info on this, please write to me at zimran@winterspeak.com

Movies available online A cartel of movie studios is on the verge of offering (NY Times subscription needed) a video-on-demand service over the Internet. It's a step in the right direction, but it's not clear if the studios are hoping to control their content or maximize its value. Customers can download movies onto their hard disks, where they will hang out for 30 days before expiring, or 24 hours after the first viewing. The studios are also planning to release films through this service when they enter the pay-per-view section of their lifecycle. So it's definitely metered, crippled media carefully positioned not to cannibalize existing distribution channels. I'm not sure what the pricing is, but I doubt customers are interested. It'll be interesting to see how the industry reacts when crackers break their cryptography system.

More patent absurdity But not on software -- looks like stem cell research (NY Times subscription needed) is owned by one group of people.


How to tame the Internet Back in the day, by insisting that information wants to be free, hackers sat out of meetings between lawyers, the publishing industry, and government bodies. As a result, monstrosities like the Hague protocol and the DMCA came into being. I think this is less true now, certainly after Dmitry, Felten, etc. The challenge for hackers is how to ally with regular folk, who are still not part of the meetings where their freedoms are restricted.

They may find some unlikely allies in the financial industry, where companies are incensed that others are keeping their public pronouncements, well, public. The narrow mental outlook of the legal set who support the idea that a public conference call is somehow the "property" of the company that organized it should raise the issue of what exactly "ownership" should mean when applied to information.

Wednesday, August 15, 2001

Web Applications There is much excitement over "Web applications" these days, essentially Web servers with particular functionalities built into them.

How do I feel about this? Well, right now the discussion is just at a technology level, which means engineers are quibbling about implementation details and know-nothing MBAs are spouting hype.

Just to keep some perspective, let's revisit what Web services are: functions distributed over the network in such a way that programs can talk to each other. Is this useful? Sure. You can create all kinds of neat applications with it, but the big wins (for consumers or producers) center around lock-in. If Web services increase openness, interoperability, and ease of switching, then innovation (and customers) will win. If Web services just boost lock-in (via Microsoft), then all technology users suffer.
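To keep things grounded: stripped of the hype, a Web service can be as small as one function registered under a name. Here is a sketch in the style of the xmlrpc-epi-php extension, with the method name and payload invented:

    <?php
    // One method, registered under a dotted name.
    function greet($method, $params, $user_data)
    {
        return 'hello, ' . $params[0];
    }

    $server = xmlrpc_server_create();
    xmlrpc_server_register_method($server, 'example.greet', 'greet');

    // In production the request arrives as an HTTP POST body; here we
    // fabricate one so the sketch is self-contained.
    $request  = xmlrpc_encode_request('example.greet', array('world'));
    $response = xmlrpc_server_call_method($server, $request, null);
    echo $response;  // an XML <methodResponse> wrapping the string

    xmlrpc_server_destroy($server);
    ?>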

On Blogger I use two web services to run this column: Blogger, and SPAM by Philip Greenspun. Both are excellent, both are free, and both represent the future of Web services (except maybe for the free part). Scripting.com has a great rundown of how XML-RPC is building distributed computing services.

On the DMCA One of the absurdities of the DMCA is that it effectively outlaws fair use by making it criminal to circumvent ANY copy protection on protected media. As folks so often forget, fair use came FIRST, and copyright was a sop to give content producers an incentive to enrich the public domain. It seems someone in Congress has noticed this. Unfortunately, it's being dealt with in a granting-exceptions-to-special-interest-groups approach instead of scrapping the bad legislation and replacing it with something reasonable.

Monday, August 13, 2001

Microsoft continues to extend lock-in Microsoft's lock-in strategy continues despite appearances to the contrary. Here's the clearest account I've found explaining how Microsoft is leveraging its desktop monopoly (through Passport) to lock all users and vendors into its identification system.

Friday, August 10, 2001

Delta holds its customers in disdain Just had an absolutely terrible experience with Delta Airlines. I was trying to attend an important event in Omaha when New York got two inches of rain and cancelled all flights into and out of La Guardia. The next eight hours of my life became a hellish nightmare of standing in line waiting for a single Delta agent who could only process one person every 45 minutes. Eventually, I was told there was no way they could get me to Omaha on time.

I spoke to two customer care representatives on the phone, Rod Tidwell and "Miss" Brown. "Miss" Brown refused to give me her first name, just as she refused to give me quality service. Delta refunded me the money for my ticket, but refused to refund me my hotel bill and airport cab fare. Their excuse? "Flight was delayed by weather, so we're not liable." They seemed incapable of understanding that this semantic trick did not get them off any sort of hook as far as I (the customer) was concerned: Delta's failure to come through on a contractual agreement makes them liable for all consequent financial harm. Weasel words will not change this.

I wasn't trying to shake them down for emotional harm or inconvenience. I accepted that hours had been lost from my life that I was never getting back. I accepted that I was not going to attend the important event. I just wanted them to cover transport and hotel costs. Weather happens, taxis happen, and hotel rooms happen. They should be prepared for this stuff. But no: the company that spends over $2M a year on advertising decided it was not worth $200 to turn a loudly and publicly irate customer into a loudly and publicly satisfied customer. Hence this unfortunate missive. Any company that treats its customers with a disdain that borders on contempt deserves what it gets. It's doubly unfortunate because I was ready to write a very positive review had they treated me right. Oh well.

Open Source's success Here's a great piece by Tim O'Reilly arguing that open source advocates overlook some of their greatest successes and the powerful businesses (such as ISPs) they've created.

More on plaintext and lock in Vikas Kamat comments on my plaintext article and notes how this simple format helps documents outlive the email client. As he puts it: "plaintext rules for the same reason the paper rules -- because it is not dependent on anybody for its dissimilation." Well said. Lock-in is the single greatest inhibitor of productivity and innovation for technology providers and consumers alike (very large essay on this soon).

Thursday, August 09, 2001

Internet Copyright Jessica Litman describes how copyright laws are actually created: publishers claim they already have all control rights, and particular interest groups who are hurt claim exemptions for themselves. Congress then passes a long bill that "affirms" copyright statute (while in reality extending it) and grants 100 special interest groups 100 complicated exemptions. Consumers are not present at the table, and consequently have no say in the negotiations. It's all just lawyers talking to lawyers (Lessig gives a good interview on why legislation of this sort kills innovation). Kind of makes you wish for more Chicago School style Law & Economics that might get lawyers to look beyond their narrow fields of specialization.

I also recently read Next by Michael Lewis, in which he discusses how Insiders (for whom the system works) are terrified of Outsiders (who try to change the system so it will work for them), but the disruption generated by Outsiders is good for capitalism because it creates new wealth. This is why VC funds are still growing; it pays to be edgy. Unpredictable (real) change also scares a lot of people, including Bill Joy, the Unabomber, and Caleb Carr.

Me? I'm for more wealth. And children should always frighten their parents.

Hot off the presses A good friend of mine just published an article on Star Wars tourism in Matmata, Tunisia. After the Ragnarok that was Episode One, I have no interest in the ludicrously named sequel.

Wednesday, August 08, 2001

The Joy of Plaintext Microsoft hates the fact that email is in plaintext. My Outlook Express client is buggy when it comes to handling the simplest of all tasks: receiving and responding to a text email. I've fiddled with all the internal settings, trying to get it to convert HTML mail to text, respond in text, and follow all the simple plaintext conventions, like adding ">" to the quoted parts of an email I'm responding to. But my Outlook still insists on having things pop up in tiny, colored fonts that are impossible to read, and then not tagging quoted text. In this environment, emails quickly bloat and become incoherent.
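The convention Outlook fumbles is genuinely trivial. The whole transformation is a sketch like this (the function name is mine):

    <?php
    // Quote a plaintext message the traditional way: prefix every line
    // with "> " so replies stay readable with no markup at all.
    function quote_plaintext($body)
    {
        return '> ' . str_replace("\n", "\n> ", $body);
    }

    echo quote_plaintext("Plaintext can be created by anything\nand read by anything.");
    ?>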

This hatred of plaintext is also evident in Hotmail, Microsoft's Trojan horse for Passport. The web-based text editor actually allows you to format your mail using bold, italic, underline, etc. Insanity, even when it doesn't crash your browser. I'm glad AOL is still holding out against this sort of nonsense. I hope it continues to do so.

Why does Microsoft hate plaintext? One possible reason is that the company comes from the PC world, where having a printer was all-important, and doesn't understand that desktop publishing functions like bold, italic, and underline make no sense in the networked world, where data is rarely printed out. But I think they're smarter than that. The real reason Microsoft hates plaintext is that it makes lock-in impossible. Plaintext can be created by anything and read by anything. It is the cleanest, simplest, least proprietary way of passing information from A to B. The Unix culture, where interoperability is God, understands this, and has raised simple programs passing plaintext to a high Art. By contrast, Microsoft thinks interoperability is Satan and focuses on locking in customers and locking out competitors, using proprietary file formats like .doc to extend its monopoly. Plaintext is the enemy of proprietary standards. It is also the enemy of monolithic programs that are liberal in what they try to do and conservative in what input they accept. Microsoft understands this well, and so is trying to kill the format.

The average computer user does not understand the power of plaintext. They don't know how to work in the networked world and see no problem with storing notes that will never be printed in Word documents. In time, businesses that understand how to operate in a networked environment will realize what Unix users have known all along -- keeping information in plaintext allows for faster searching, delivery, and manipulation: the bold, italic, and underline of the networked world. And if businesses reinvent themselves along these principles, they will gain a competitive advantage over their rivals.

In the meantime, don't let Microsoft turn email into just another of their proprietary standards. Stick to plaintext.
Link to this column

Tuesday, August 07, 2001

What is core to Open Source? As the open source community is tested by Microsoft and its "shared source" FUD on one end, and by the market on the other, it's important that it figures out what is absolutely core to its mission -- what it should never change -- so that it's flexible enough to change everything else (thanks to Jim Collins for this observation). Richard Stallman lays this out in his four freedoms.

Freedom 0, the freedom to run the program for any purpose, is critical to continuing the open, end-to-end architecture that supports innovation. This freedom is being threatened by Microsoft's licensing terms, authentication measures, .NET, and the DMCA.

Freedom 1, the freedom to study how the program works and adapt it to your needs, is critical to ending lock-in, the economic strategy that has done the most to limit software innovation and productivity. The DMCA challenges this, as does proprietary code. The open source and business communities need to get much more sophisticated about lock-in, and should take an active stand against it at an infrastructure level.

Freedom 2 (the freedom to redistribute copies) does not seem critical to me, but part of freedom 3 (the freedom to improve the program) is critical, because again it fights lock-in.

This notion of lock-in is so important that I feel it deserves its own essay, which I'll write soon.
Link to this column

Dmitry is out on bail You can read the story here, and some details on his cryptography work here.

Layoffs at Monitor Monitor announced layoffs today -- they're cutting 15% of their North American consultants. Management said it was doing this to preserve bonuses for junior people at a time when business is slow. I think this is the right attitude: a bottom-up bonus structure acknowledges that junior folk (who do much of the work) must get paid what they're worth or they'll leave, and it holds management accountable for its poor forecasting. This is much wiser than some tech firms I know of that ended up with 30 know-nothing MBAs and no coders. Having been through this myself, I sympathise with those who were laid off, but I also know that great things lie in store for them in the future. Good luck.

Monday, August 06, 2001

Good day for rationality Today seemed to be a good day for rationality in the high-tech world. There's a bill trying to keep online music competitive (good luck), UCITA has gone down (thank God), and WAP now follows HTML standards (a step in the right direction, about 3000 more needed).

Sunday, August 05, 2001

Site outages Apologies to all. Winterspeak was picked up on Tomalak, Scripting, and the front page of Kuro5hin all on pretty much the same day. The server went down. Things are back up now.

Wednesday, August 01, 2001

The Microsoft mind It struck me the other day how hated Microsoft has become. They were always disliked by some, but now they can't make a single move without every public commentator denouncing them and their evil ways. Microsoft itself does not understand this, as we see in this discussion of smart tags. Also, here's an insightful look at the economics of bug fixing, Microsoft-style. Note: they only calculate the cost of bugs from the producer's perspective; the value of the consumer's work or time lost to bugs is invisible. To further underscore how universally hated and distrusted Microsoft has become, check out this theory on how Microsoft plans to use its abysmal virus record to take over TCP/IP. And the Scottish Police have just rejected Microsoft too. I wonder how MSFT employees feel about where they work.