Retirement journal: Part 3

It took about five and a half months of retirement, but this morning when I woke up I realized I had nothing pressing to do.

I guess that’s good. For much of these last months the pressing things were related to our pending relocation, and mostly they involved fixing up our house. That work is mostly done. We got something of a Good Housekeeping Seal of Approval last week when our house stager came by for a tour. It’s her job to make the house attractive enough to draw a buyer willing to pay top dollar. I was worried she’d want to bring in rented Ethan Allen furniture and make us move much of ours into storage, but there was none of that. She approved of, or at least could work with, the furniture we have.

Her suggestions were for the most part easy to deal with: silver knobs and handles for the kitchen cabinets and lots of fluffy white towels for our bathrooms, which either she or our realtor will supply from their inventory. Our beds will need skirts. Perhaps the most onerous task is to get rid of the green trim in the living room, dining room and hallway; it will become bright white, and that includes two doors painted green. Mostly she was positive. Our months of work have paid off. We’ll find out how well it worked around March 1, when our house goes on the market. If we get and accept an offer, a whole other process will start.

Already our home is becoming a house. Most of the personal items hanging from the wall have been put away. Possessions are moving into boxes that are getting stuffed into closets, probably not to be seen again until they are reopened in our new home. Furniture is getting moved around. Open space is what buyers want. So off came the valances that obscured the view of our deck, making our main floor appear much larger than it is. Clutter like our coat tree is bad, and we were instructed to hide it. Buyers must get the illusion of large and uncluttered open spaces, including kitchen countertops. Our many upgrades over the years are marketable. These include hardwood floors on the main level and granite countertops in the kitchen. The stager complimented us on our curb appeal and smiled when she saw our large backyard. The house should appeal to someone probably much like us, just twenty or so years younger: someone with the time and money to tackle the endless tasks of keeping a house in good repair while actually living in it. I assume it will be a family with small children, but for some reason I imagine a gay or lesbian couple with lots of stuff buying the house instead.

Meanwhile our new home awaits construction. Nearly a month has passed since our last visit to Northampton, Massachusetts, where we will move, but there has not been much progress on getting a house actually constructed. Both the builder and the architect inconveniently took two-week vacations during the holidays. The ground froze over while they went to warmer climates. The foundation is the first part of our house to go in. It doesn’t sound like frozen ground will keep the foundation from going in, but completing a P&S (purchase and sale) agreement has. We had to find a lawyer up there to represent us, and the owner of the plot is supposed to forward an agreement to our lawyer. It’s no big deal and it hasn’t happened yet, but maybe it doesn’t matter, since we need to go up there again to meet with the architect (now back in the snowbelt), and our amenities will certainly affect the price. In any event, we will need to produce 5% of the assumed price when they start digging the basement, and any old check won’t do. It has to come from our credit union directly, because Massachusetts’s privacy laws prohibit the builder from seeing our account number.

There is a high probability that we will settle on the sale of our existing house long before the new house will be ready. This means we’ll have to live somewhere, so we’ll probably have to find temporary digs. We’ll likely move to some apartment or house near our new home, leaving much of our stuff in storage up there but unpacking quite a bit of it while we wait. The other possibility is that our house won’t sell for whatever reason. We will take all steps to prevent this of course, but it really has to sell if we are to pay for the bulk of the new house. Renting out the old house while buying the new is possible, but we’d need some sort of bridge loan. And it would raise the complexity of the whole relocation thing another notch.

All these things are in motion, but at the moment not much of it requires our immediate attention. So today is something like a slack day, and it’s not the first. Last week we took in a Wednesday matinee. Apparently some theaters try to attract us people of leisure with discount Wednesday tickets. That’s how we got to see The Imitation Game for $5.75 a ticket. It’s amazing how much less complicated living in Northern Virginia is when you can routinely get around outside of rush hour. It makes living around here almost pleasant.

I put out new versions of two open source programs that I have written. My consulting business continues to do well, but at the moment there is not much in my work queue requiring immediate attention. When the weather cooperates I can get my daily walks in rather easily. I’m hitting the gym more often because most days are below freezing, but some days I take long walks in the cold air anyhow, bundled in my warmest coat, hat, scarf and gloves and with a podcast in my ears. I am contemplating porting my two open source programs to a new platform, and finding the time to write my first app is still on the back burner, but it’s something I want to do. It’s how I have fun, apparently. The idea is to sell an app or two; most apps languish, but hopefully mine will generate income worth the time invested.

In general, I am finding that retirement is good. I am still somewhat skeptical I can actually afford it, but a year or two of experience will prove it one way or the other. It’s not bad to bring in some income, but I do it mostly because I enjoy it, not because I have to do it. I want to stay busy and do stuff I enjoy but without feeling the pressure to make another mortgage or tuition payment. To find out if I succeed, keep reading these occasional retirement journal posts.

Thanks are not enough, Steve

It’s not news that one of the founders of Apple Computer and the visionary behind a plethora of Apple (and other) products died yesterday. Even people who don’t usually tune in to the news tuned in to the news of Steve Jobs’ passing. There’s a good chance that when they got the news, it appeared first on their iPad, iPhone or iMac, just a few of Steve’s many inventions. (I got the news on my iMac.)

The more attached you are to Apple’s products, the more the news affected you. Part of what you felt was anguish that the iPad or iPhone in your hand might be the last cool product you would ever own. Unquestionably, Steve Jobs was an extraordinary inventor and creator. It will take a couple of decades to sink in just how greatly his life impacted humanity.

To call him the Thomas Edison of his generation is not enough. In reality he was some combination of Edison and Sir Isaac Newton. Edison’s genius was that he could figure out how to make inventions that seemed beyond our technical grasp. (He also patented many inventions that never took off.) The need for a more reliable and cleaner source of light was understood in his time; someone just needed to figure out how to do it. That was Edison’s genius. When the Apple computer was unleashed on the world, it filled a void we never knew existed. Like Sir Isaac Newton, who discerned order behind the forces of nature, Jobs could model a usable version of an information-centric age the rest of us simply could not imagine. Jobs could do that with almost any product he invented. His genius was pulling our inchoate needs right from our id, figuring out a way technology could fulfill them, then designing irresistible products that realized them. But he could also turn an invention that had been done before into something everyone suddenly wanted. The iPad by all rights should not have succeeded, because it had been done before. Until then, though, no one had built a tablet computer so easy to use and so sexy that we would be pulled to it like moths to a flame.

I have a particular reason to mourn Steve Jobs’ passing and to be thankful to the guy. It wasn’t because I thought Apple products were particularly cool. I am typing this on an iMac, which is pretty neat, but not all that much neater than my Windows 7 computer at work. Rather, I am grateful to Steve for the Apple II Plus computer that he helped create. It literally changed my life. It did not make me a wealthy man, but it did make me a well-moneyed man. In part because of Steve’s Apple II Plus, I changed to a career I found I loved, and which paid much, much better than my old one. I became an information technology geek.

Sometime around 1983, the management where I worked purchased an Apple II Plus computer and put it on a table near my desk. No one really knew what to do with it, but there it was, all shiny and new. It was mostly ignored, but when my work was done I’d sit down and start playing with it. I was not entirely new to computers, but I had never experienced a personal computer before. I had experienced a mainframe computer, which in 1975 meant tediously constructing Fortran source code using a keypunch machine, delivering a stack of cards to an operator behind a glass wall, and waiting a couple of hours until your job was run. Invariably you made some syntactical error in your code, so you’d redo the miscoded cards, being careful not to disturb the order of the deck. And you’d go through the cycle many times until, with luck, your program ran correctly. You would get a printout from a wide dot-matrix printer with sprocket holes on both sides of the paper. In short, programming computers could not have been more difficult, tedious and time consuming. I got through the class, but if I had any idea of doing computer programming for a living, it went away. Programming was for masochists.

The Apple II Plus changed that. It had a keyboard and a monitor, and it ran a computer language called BASIC that was simple enough for even a novice to pick up. More importantly, it was personal. I could use it in real time and get immediate feedback. At the time I was using pink copies of handwritten forms to track the movement of “service requests” through the printing plant where I worked, keeping them in a binder. With the Apple II Plus, I figured out a way to track these service requests on the computer instead. I stored the data on five-and-a-quarter-inch floppy disks. I impressed my bosses, and I recall getting an award. The award included lunch with the other award winners and our director in the director’s conference room. I was onto something good.

The details of what happened since then are not important, except to say I wiggled my way into a journeyman computer programmer slot. Since 1986, I have made my living first through computer programming and later through more advanced information technology work. The Apple II Plus totally changed my life. It made computer programming fun and profitable at a time when anyone with modest computing skills could get a job. My income soared, my sense of self-worth and job satisfaction went through the stratosphere, and eventually I had the income for the larger and more comfortable life I craved.

That was what the genius of Steve Jobs’ mind did for me. He gave me wealth, and he gave me work that was both creative and mattered in the real world. He did it by making a computer truly personal. It was a long time between that Apple II Plus and my 2008 purchase of an iMac, twenty-five years in fact, during which I lived mostly in a Windows world. I followed the market, which meant machines running MS-DOS and later Windows. However, it was inventors like Steve Jobs who made computers relevant to the masses; they became must-have items, which stimulated demand, drove us to email, then the web, and more lately social networks. Steve was not only a creator and inventor; he cemented us into the information age. He personally connected us with technology and each other in ways that had never been done before.

He died way too young at age 56, but he could not have died without knowing the huge impact of his life. He deserves monuments and museums, cities renamed for him and, if we ever build an American Pantheon, perhaps the biggest statue in the room. I am quite certain I will not live long enough to see the rise of another man or woman of his caliber. Quite frankly, I believe that Steve Jobs may ultimately prove to be one of the most influential Americans who ever lived, ranked right next to Lincoln. Through intelligence, foresight and boundless energy, he invented a broader and more connected future for all of us.

AJAX is not just a scouring powder

(Note to readers: I leave on vacation tomorrow, so I will be posting sporadically next week if at all. I expect to be internet inaccessible for a few days. My family will be in Phoenix and Las Vegas.)

It used to be that Ajax was a scouring powder. Actually, it still is, but to us geeks there is also AJAX (all upper case), which stands for “Asynchronous JavaScript and XML”. When web developers program in AJAX they can do cool things that markedly improve the usability of a web interface. Perhaps the most prominent example of AJAX in use was one of its pioneering applications, Google Maps. Somehow, wholly within a browser and without refreshing the page, Google was able to do things that hitherto seemed impossible, such as moving a map by dragging on the image. The new map images are fetched in the background; the web page itself is not redrawn.

This magic is possible in large part because we are finally using newer browsers. Hidden inside your browser is something called the XMLHttpRequest object. This little sucker makes it possible for your browser to contact the server without redrawing your web page. This seems like such obvious functionality that you have to wonder why it was not designed into the browser ten years ago. Part of the reason was that HTML (the language used to describe web pages) was not standardized at that time. Both Netscape and Microsoft were unwisely adding nonstandard tags to differentiate their browsers. This had the effect of making life hell for us web developers. In addition, the other part of AJAX, the XML part, did not become an official standard until 1998. It took many years for the browser makers to agree they all needed to work from common references. The common references were HTML, JavaScript (a programming language that lets your browser do dynamic things) and something weird called the HTML Document Object Model. The World Wide Web Consortium is the standards organization that created these standards.
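The classic round trip through this object looks something like the following minimal browser-side sketch. The endpoint, query parameter and element id are made-up examples, not from any real service:

```javascript
// Minimal sketch of an asynchronous browser request (runs in a browser).
// The endpoint "/price.php" and the element id are hypothetical.
function fetchValue(query, targetId) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    // readyState 4 means the request finished; status 200 means success
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Update one element in place -- no page refresh
      document.getElementById(targetId).innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", "/price.php?q=" + encodeURIComponent(query), true); // true = asynchronous
  xhr.send();
}
```

The page stays put; only the element named by `targetId` changes when the response arrives.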

Now that browser manufacturers are testing to a common set of standards, web developers can breathe easier. There are still browser quirks that need to be programmed around, but they are largely minor annoyances now. More importantly, since these three components can be taken for granted at least 98% of the time, it becomes possible for web browsers to do amazing things that used to require installing separate programs on your desktop computer. The logic to make it work in your browser gets a bit complex at times, but it is possible to emulate most of the rich functionality of a desktop application inside the browser. Most web developers, however, have yet to dip their toes into this AJAX stuff. Until about a week ago, I was one of them.

I dallied because being a web developer is a very part-time gig for me. My full-time job is in Information Technology management, not in programming. You can make a programmer a manager, but some part of the manager remains a programmer. On nights and weekends, I have been dipping my toes into the AJAX world. It is not entirely a waste of my time. I have a few practice domains where I can try this stuff out without embarrassing myself. What I learn I can often apply to my job. For example, we have a need to collect names and email addresses to run a customer satisfaction survey for the system that I manage. If we can avoid the overhead of a page refresh, we can drop the form onto many of our web pages without disrupting their flow of business. This allows us to target particular kinds of system users so we can get a better response rate to particular kinds of questions.

I run two other domains. I use the Oak Hill Virginia site as my test bed. It is designed to be a community web site. While it does not get much traffic, judging from the ads Google serves on the site it is of interest to the real estate community. When I purchased the domain in 2001 there was little in the way of tools I could use to integrate real estate content into my site. Fortunately, that has changed. Working with the husband of a local realtor, I became aware of two sites: one offers a home valuation service, and the other offers a real estate search engine that can plop houses for sale on top of Google Maps. Both offer APIs, or Application Programming Interfaces. This means a developer like me can register to use their APIs and embed their services in my web sites.

My first AJAX project was to integrate the home valuation service into my site. I did not want my page to refresh. I simply wanted someone to enter the address of a house and have its price unobtrusively appear on my web page. This could only be done using AJAX. My original hope was that I would not have to write a proxy script for my website to get the home price from the valuation service. Unfortunately, browsers have security mechanisms built in. Unless you specifically configure your browser otherwise, it cannot use the XMLHttpRequest object to call any server other than the one that served the web page. (Not only that, but the browser treats even a different subdomain of the same site as a different server.) Therefore, I had to create a proxy script on my server, which I wrote in PHP. It takes the address and simply passes it on to the valuation service, waits for the response, then echoes back the value of the home. The valuation API, like most, is a REST-based XML service, which avoids the overhead of SOAP. However, the lack of standardization in the REST world meant that I had to fine-tune my script to accommodate the API’s eccentricities. Still, this was easier than dealing with the overhead of consuming the data as a SOAP service.
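That same-origin rule is stricter than it first appears. A rough sketch of the comparison browsers make (simplified; real browsers compare scheme, host and port, and the hostnames below are placeholders):

```javascript
// Simplified same-origin check: two URLs share an origin only when the
// scheme, hostname and port all match exactly. Even "www.example.com"
// and "example.com" count as different servers.
function sameOrigin(urlA, urlB) {
  const a = new URL(urlA);
  const b = new URL(urlB);
  return a.protocol === b.protocol &&
         a.hostname === b.hostname &&
         a.port === b.port;
}
```

Because the proxy script lives on the same host as the page, calls to it pass this check, while direct calls to the valuation service would not.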

It took about a day of development, but I was able to make it work. You can see the service on my website. I found the W3Schools web site invaluable. It has a simple AJAX tutorial; I copied, pasted and changed a few things from their example to get it to work. To make the form usable I also had to dig into their HTML DOM reference and research a few things I had forgotten from their JavaScript reference.

Now, could I write my own AJAX web service instead of relying on someone else’s? I created a contact form where people interested in buying or selling property in my community could simply type in their information and have it sent to a local realtor. This was not that difficult a service to create either. For both services, I found I wanted to record the information received in a MySQL database table on my server, as well as have the service send me an email whenever someone invoked it. Since I know MySQL and its programming API rather well, this was not terribly challenging. In addition, PHP comes with a handy mail function, so it was also straightforward to have the services send me email. Eventually, if the services attract sufficient contacts, the contact information will go automatically to the realtor. Right now, I am reviewing content for spam.

AJAX applications like Google Maps are of course an order of magnitude more complex than my little ones. For most of us in the IT business, though, we earn our bread and butter through standard business applications like these. Eventually, as time permits, I hope to create a real estate page for the site, integrating the search service that lets me show local home listings on Google Maps and serving related ads (mostly from realtors) around the content.

Whether I will succeed in my ultimate goal of making this site profitable remains to be seen. I have learned a few things along the way: AJAX programming is not too hard if you have a decent understanding of HTML, JavaScript, the HTML DOM and a server scripting language. My applications so far are simple, returning a single value. If I need to return multiple values in a single call, I will have to delve into the details of using the XML parser built into browsers, plucking out and displaying the content that way.


My Movable Type Recent Visitors Application

I have never been enthusiastic about using SiteMeter to monitor my blog. It provides some real time statistics, but not very accurate ones. I will keep metering with it nonetheless because it is free and monitoring my blog is one of my favorite ways to waste time.

Every hit to this blog is recorded in the web server log. I thought it might be interesting to expose relevant page requests in my web server log in real time as a “Recent Visitors” application. The content of the actual web server log is, unfortunately, nothing but a lot of plain text, and ugly text at that. It is not easy to turn it into anything meaningful to an end user. Fortunately, there are many programs out there that will slice and dice web server logs. One of them (Awstats) comes free from my web host. If you have a web site, you probably have access to a similar program. Unfortunately, the public cannot generally access the statistics these programs create. Moreover, these programs are typically run only once a day. This makes it hard to see recent page requests in real time.

I noticed that SiteMeter can discern with some reasonable accuracy the geographical location of most people hitting my blog. It uses geo-location technology owned by MaxMind to translate your Internet Protocol (IP) address to a geographical location. To me this software is magical stuff. With some time on my hands this weekend, I put together a little geeky “Recent Visitors” application for this blog. It provides some real time visibility into what visitors are reading, where they are reading it from, and when they read it.

While I cannot afford the commercial version of MaxMind’s geo-location technology, there is also a “good enough” version called GeoLite City that is free for the download. MaxMind also publishes a variety of Application Programming Interfaces (APIs) that let programmers query their city database using the programming language of their choice. Since I need to put their database on my web server, I needed a programming language that operates on my web server. Since PHP is the easiest for me to program on the server side, I chose their PHP API.

My blog gets thousands of hits a day, most of which are not meaningful. I want to know which recent blog entries were read successfully. To do this I had to translate the URL requested into the corresponding blog entry title, which is not an easy thing to do. Translating something like “/2004/06/life_in_the_cou.html” into a human-readable blog entry name took some work. It required parsing out the relevant portion of the resource name (“life_in_the_cou”), then querying the MySQL database hosting my blog. I searched for this file name in the mt_entries table and returned the entry title, which was “Life in the Courtyard”. From this I could both show what was being read and link to it.
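The parsing step can be sketched like this. The regular expression is my guess at Movable Type's default URL scheme, based on the example above:

```javascript
// Pull the entry basename (e.g. "life_in_the_cou") out of a logged
// request path like "/2004/06/life_in_the_cou.html". The basename is
// then what gets looked up in the mt_entries table.
function entryBasename(path) {
  const m = path.match(/^\/\d{4}\/\d{2}\/([a-z0-9_]+)\.html$/i);
  return m ? m[1] : null; // null for anything that is not an entry page
}
```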

To show only relevant content, I had to filter out the obvious noise in the web server log such as robots, crawlers, requests for images and dynamic pages, “file not found” errors and related web server error codes. After a lot of playing around, it worked!
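The filtering amounts to a predicate applied to each parsed log line. A sketch, with patterns that are illustrative rather than the exact ones I used:

```javascript
// Keep only "relevant" hits: successful requests for entry pages from
// what looks like a real browser. Fields come from a parsed log line.
function isRelevantHit(hit) {
  if (hit.status !== 200) return false;                              // errors, redirects
  if (/\.(gif|jpe?g|png|css|js|ico)$/i.test(hit.path)) return false; // images and assets
  if (/bot|crawler|spider|slurp/i.test(hit.userAgent)) return false; // robots and crawlers
  if (/\.cgi|\?/.test(hit.path)) return false;                       // dynamic pages
  return true;
}
```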

Any geeks out there can look at my PHP source code (dead link removed). It is currently ugly code and will be cleaned up in time. It assumes your Apache web log format is readable and matches mine. It also assumes you have a Movable Type weblog that stores content in a MySQL database. However, this application demonstrates that even a Movable Type weblog can expose its visitor information in real time, albeit in a somewhat jury-rigged manner.

Right now, I am exposing the Recent Visitors Log (link removed) on a separate page. Eventually I intend to integrate the information into my Main Index page. To do this I will have to embed a window inside the Main Index web page. It appears that Movable Type does not allow embedded CGI applications inside a page. Perhaps a future version will support this.

No, I am not offering support for this code if you choose to borrow it. If you use it, you will likely have to do quite a bit of tweaking, so you had best know PHP pretty darn well. However, as long as I am maintaining it I will publish the latest source code at this URL (link removed) for those who are interested.

The Coding Vacation

I must be a nerd. I am spending a significant part of my two weeks off programming.

While theoretically I can program at work, as a manager I usually do not have the leisure. Therefore, I do it on my vacation as something of a hobby. The rest of the year, I cannot seem to work up the energy. For me to find joy in programming, I really need plenty of time without major interruptions or distractions. I also need a project that might actually turn into something meaningful, both for myself in the form of knowledge, and for a community of people.

My first project involves another phpBB modification. phpBB is forum software written in the (ta da!) PHP programming language. Two years ago, I created a popular E-Mail Digests modification for phpBB. It sends out customized digests of posts on a phpBB forum. It became so popular that it turned into something like a second job for which I was not compensated. Fortunately, a small community of phpBB open source zealots eventually took over the project. What a job they have done! Their latest version, 1.3, is amazing. For me it was a lesson in the value of open source collaboration. If you have a good enough idea, others will often collaborate with you, or even take over the project when you get sick of it. If it is not interesting enough, it will simply join the voluminous list of dead open source projects.

This new project is a variant of my E-Mail Digests modification, though considerably smaller in scope. As readers know, I am fascinated by syndication technology. The way I learn a new technology best is to build something with it from the ground up. There are a variety of competing news syndication formats out there, including RSS 1.0, RSS 2.0 and Atom. Each format has its evangelists, but in practice all are being used. Most newsreaders can read any of them.

A half dozen or so phpBB modifications have already been created to allow phpBB forums to be syndicated using these XML protocols. phpBB sites allow forums to be restricted to special user groups. You probably do not want the Google search engine scanning these protected forums. However, with the emergence of news feeds, many people would like to get access to these restricted forums using a standards-compliant newsreader. As best I can tell, none of the phpBB syndication modifications keeps users out of protected forums. In that sense, they are violating the architecture of phpBB. I made it my mission to figure out a secure way to allow authenticated members of a phpBB site to get content in protected forums as a news feed.

The trick is to authenticate the user. How do you do this when news feeds are accessed as a simple URL? The solution was more challenging than I thought. Somehow you have to put both your username and password in the URL. That is not secure: you do not want your password bookmarked as part of the URL. phpBB helps by encoding the password before storing it in its database. However, it is easily decoded by studying the phpBB source code and writing a simple PHP script. Consequently, the password has to be further encrypted somehow.

My solution was not elegant, but it works. I mixed the password in the database with the database password using a hash algorithm. The database password is about the only thing guaranteed to be unique on a phpBB site, so it makes a good encryption key. While it does not meet NIST encryption requirements, it is reasonably secure. I even figured out a way to roll the user’s IP address into the authentication parameter. I made this optional, but if selected, the news feed will only be served to the originating IP address.

I also wanted to dig into the mechanics of the various syndication protocols. I even bought a book to help. I came away from the experience thinking that each protocol has its virtues and that people should pick the one they need. RSS 1.0 appears to be the most extensible, yet most complex. RSS 2.0 seems to have the most built-in features. Atom seems the most thought through. I try not to reinvent the wheel if possible, so I hunted for existing PHP classes that create news feeds. The best I found was a class extended by many authors: you simply pass the type of feed you want as a parameter. As good as it was, I discovered a few minor problems. For example, the Atom 1.0 news feed did not properly validate HTML content. That was easily solved by modifying one line of the class. In addition, the class ignored some limitations of RSS 0.91, including the 15-item limit and length limits for the contents of certain tags. I coded around these too, and used a feed validator to make sure my feeds were valid.
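Coding around the RSS 0.91 limits is mostly a matter of clamping before serializing. A sketch, noting that the 15-item and 100-character-title limits are as I recall them from the spec, so check it before relying on them:

```javascript
// Clamp a list of feed items to RSS 0.91's limits: at most 15 items
// per channel, with item titles capped at 100 characters.
function clampRss091(items) {
  return items.slice(0, 15).map(function (item) {
    return { title: String(item.title).slice(0, 100), link: item.link };
  });
}
```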

I will likely keep tweaking the modification in the weeks ahead, but eventually it will be provided to phpBB as a modification. Whether it will be embraced like the E-Mail Digests modification remains to be seen. My payoff is that I now truly understand news feeds. What I have learned is that creating news feeds is pretty darn easy. Therefore, there is little reason not to keep promoting them where I work. I get regular requests from the public for our data in a news feed format. Since it is easy to do, I will try to allocate some of my team’s time for the project. (This is always a challenge since they are asked to do too much.)

Similarly, I am fascinated by web services. Web services allow data encapsulated in XML to be requested like an Application Programming Interface (API) over the Internet. Arguably, news feeds are a low-tech web service. This observation alone has been one of my biggest discoveries. While there are times when the orchestration of web services is needed, and consequently you need SOAP protocols, 95% of the time low-tech XML over HTTP is fine. Rather than reinvent the wheel, serve your content where possible by riding on top of an existing XML technology. The RSS and Atom news feed formats are obvious examples, since there are so many consuming applications out there.

I also spent about a day trying, and succeeding, at using server-side XML technologies. Could I get data out of a database as XML? Could I then transform the XML data using XML style sheets and serve it as HTML? In short, could I replace the traditional plumbing in server-side scripting languages with these new methods?

Yes, I could, but it was neither elegant nor efficient. Since I am comfortable with PHP, I did not venture into the Perl, Python or Java worlds. The first problem was that MySQL has no inherent way to render SQL query results as XML. I am sure that will come in time. In the meantime, I went hunting for a class library to do just that. I found a number of solutions, all very primitive. The XML that came out was rather inelegant, but at least it was XML. The real problem turned out to be transforming the XML into HTML using a style sheet engine. The one currently built into PHP is primitive. It did the job, but it required both the XML and the style sheet to be in files; it would be much more efficient to do it all in memory. So while viable, it hardly looks like a way to save CPU and bandwidth. There are already elegant solutions like Hibernate in the Java world, but those of us in a LAMP environment are still quite constrained. I am sure solutions will appear in time. As I remarked a couple of months ago, ActiveGrid looks like one such solution.
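The "primitive" rendering those class libraries did amounts to something like this (a sketch; a real library must also handle column names that are not legal XML element names):

```javascript
// Naive rendering of query result rows as XML: one element per row,
// one child element per column, with the text content escaped.
function rowsToXml(rows, rootName, rowName) {
  const esc = (s) => String(s)
    .replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
  let xml = "<" + rootName + ">";
  for (const row of rows) {
    xml += "<" + rowName + ">";
    for (const col of Object.keys(row)) {
      xml += "<" + col + ">" + esc(row[col]) + "</" + col + ">";
    }
    xml += "</" + rowName + ">";
  }
  return xml + "</" + rootName + ">";
}
```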

For an ordinary web site, I do not see much value in doing all this work on the server side. It strikes me as inefficient. I would let scripting template engines, or the application server for those who can afford one, do the dirty work using much more efficient methods. However, when transforming database information into something more than HTML, these tools should be given heightened consideration. If, for example, a business were to implement a true SOAP-compliant web service, then these technologies should be used. For an ordinary dynamically driven web site serving HTML, the overhead of transforming data from a relational database into XML, then transforming it again into HTML, is simply not worth the hassle.

With six days left in my vacation, I hope to keep experimenting with web services, and to carry my practical knowledge forward into my work during 2006.

Intrigued by ActiveGrid

As frequent readers know, I was on the west coast this week. I was in Cupertino, California (in Silicon Valley) to attend a MySQL Customer Advisory Board meeting. MySQL is an open source and very fast database, used mostly on the web. Perhaps because their representative happened to pay us a call a few weeks earlier, I was invited to attend. As the only customer representing the federal government, I was one of the dozen or so customers invited to throw in my two cents about the features that MySQL AB (the parent company) should put into the product.

Alas, I cannot give any hints on where the company will be taking the product in the years ahead. My participation was predicated on abiding by their nondisclosure agreement. I can say that if I were Larry Ellison (CEO of the mammoth Oracle Corporation) I would be worried. MySQL is a small but very agile company that has the lion’s share of the open source database market. It was cool to be in the same room with mega internet companies like Google and CNet and to learn how they are creatively using the product.

A year ago last April, I attended the MySQL User Conference in Orlando, Florida. It was there that I got the open source religion. Nothing since then has changed my mind. I think proprietary software is going to diminish. I am sure Oracle will still be around in a decade; however, it might well be a shadow of its current self. Every software company these days should at least be pondering an open source strategy. At least one other database vendor has the religion: Ingres recently went open source.

MySQL won its market share the hard way: by creating a great and (in many cases) free product. Corporate MySQL licenses do cost money, but they cost a tiny fraction of what you will spend on products like Oracle. Most of the company’s money is made in technical support contracts, consulting and training. However, their product is reliable enough that many corporations can do without these extras. I was interested to learn one tidbit at the meeting that demonstrates the difference between MySQL and Oracle. At MySQL, its founder Monty Widenius is one of only three people allowed to put new code into the software baseline. Somehow, I cannot see Larry Ellison doing this. Heck, I am not even sure he could write a line of code. He is too busy flying places in his corporate jet or hanging out with his foxy wife. Having met Monty, I can say he certainly does not act like a billionaire. He is a geek with no pretensions of grandeur.

MySQL is one of the best of many, many open source products out there. Open source MovableType software used to run this blog. It is licensed a lot like MySQL: it requires modest costs from those who want more than a personal blog. However, much open source software is wholly free. Apache, the web server used to serve this web page, is one example. The Linux operating system is also free, although most people prefer to purchase packaged distributions like Red Hat. A number of programming languages for the web, like Perl, PHP and Python, are also robust and completely free.

Linux, Apache, MySQL and Perl/Python/PHP form a set of core open source products that offer amazing quality and features for little or no cost. They also work and play very well together. This synergistic combination of products is sometimes referred to under its acronym: LAMP. Mastering any of these products is not rocket science. This makes them affordable and accessible to the masses. Any reasonably smart person who has fiddled with a programming language can write LAMP applications by reading a couple books. (Writing professional LAMP applications is another matter.)

In addition to open source software, a new standard for data communication has emerged. Specifically, in 1998 Extensible Markup Language (XML) became a recommendation of the World Wide Web Consortium. XML may be a standard, but in some ways it feels like open source software. Vendors are busy writing software (much of it open source) that reads and writes XML. In addition to describing data in a standard way, XML specifications exist that allow data to be validated (XML Schema), rendered for various kinds of devices (XSL stylesheets), organized into workflows (BPEL) and captured as input (XForms). With XML as an industry standard, the cost of doing business electronically over the internet is going much lower.

It was probably only a matter of time before someone looked at the largely free LAMP stack, looked at the uses of XML and said, “Wouldn’t it be great if there were an open source LAMP software solution that could help you develop web applications quickly and also worked transparently with XML?” That day has arrived, and ActiveGrid is the emerging product.

The system I manage is a perfect example of the hassles of using open source technologies the old way. Our system is a pure LAMP application, but it is a pain to maintain. It was developed in the late 90s, when this stack was just coming together. The result is a wonderful system that is amazingly flexible but hard to modify. Perl, our primary programming language, was not designed to be object oriented. Object oriented languages are important for modern systems that need to morph over time. As the system manager, I am looking to the future and imagining how the open source system of the future should be engineered. ActiveGrid, if it proves viable, looks to do for applications what MySQL has done for databases.

All the software ActiveGrid generates is open source. You may have to write some of it by hand. If you do, then you pick Perl, PHP or Python, whichever floats your boat. (PHP support is coming.) MySQL can be your database, but you can also use other open source or commercial databases, including SQL Server and Oracle. ActiveGrid assumes that your web application will be built on top of Linux and Apache. All these open source LAMP technologies run on cheap commodity hardware. Therefore, it is not difficult to stand up enterprise-class web, application and database servers, each on separate machines, for $20,000 or less out of pocket. The economics of this LAMP model are compelling.

ActiveGrid allows you to build LAMP applications quicker. It provides an abstraction interface for many of the things you would otherwise have to code. For example, you can design screens using its drag-and-drop tool. This is a lot faster than creating the code in an editor! You can design your workflow logic graphically using a tool that renders open source BPEL. What is particularly cool is that bundled in the ActiveGrid toolset are a number of XML engines. Therefore, your users might see web pages rendered as HTML, yet under the ActiveGrid hood, it has used XML stylesheets to render data described in XML as HTML. Slap other XML stylesheet templates on these data and they can become PDF documents, be rendered for a cell phone or be sent in a format designed to be heard (VoiceXML). On the input side, the user might see an HTML form, but it is translated into XML using XForms technology. Supposedly, the tool is sophisticated enough to render AJAX-compliant code in the browser. This potentially gives it a very robust web interface, such as you can see in Google Maps. In addition, much of the business logic is handled by an engine built into ActiveGrid that reads BPEL. In the past, you had to write a lot of Perl, PHP or Python code to implement business logic.
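Python’s standard library has no XSLT engine, so this sketch fakes the stylesheet step with two plain functions, but it illustrates the payoff ActiveGrid is after: the same XML record rendered one way for a browser and another way for, say, a voice gateway, without touching the data itself. The aircraft record is invented.

```python
import xml.etree.ElementTree as ET

record = ET.fromstring(
    "<aircraft><type>F-16</type><base>Langley</base></aircraft>")

# Two "stylesheets" over the same data. A real engine would use
# declarative XSLT templates; these hand-rolled functions only
# illustrate the separation of data from presentation.
def to_html(el):
    """Render the record as an HTML table for a browser."""
    rows = "".join(f"<tr><td>{c.tag}</td><td>{c.text}</td></tr>" for c in el)
    return f"<table>{rows}</table>"

def to_speech(el):
    """Render the record as a sentence a voice gateway might read."""
    return f"The {el.findtext('type')} is based at {el.findtext('base')}."

html = to_html(record)
speech = to_speech(record)
```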

As I have mentioned before, systems basically take input, apply business rules to it, and render it as output. ActiveGrid does this with XForms, BPEL and XSL. The result is one tool that leverages the low costs of the LAMP stack, commodity hardware and open source XML toolkits. ActiveGrid has the potential to create impressive web based systems that are quick to create and deploy yet are entirely open source. Technologies like Java 2 Enterprise Edition (J2EE) will continue to have their place. However, with ActiveGrid these open source technologies can be orchestrated to render complex applications that are likely to be just as reliable and efficient for a fraction of their cost.

How will ActiveGrid make money? An open source version is already available, though it is still being tweaked. In December, ActiveGrid will introduce a commercial version with fancier features. It is clear that the standard product is ample for most needs. However, if you want features like integration with Enterprise JavaBeans (EJB) or the Lightweight Directory Access Protocol (LDAP), then you need to pay for a commercial license. The proposed price ($3K per server per year) does not sound very burdensome. They will also offer paid support for both their standard and commercial products.

As I am a manager, I do not have the time to do much programming. However, I am hoping to find the spare time to play with this product. I want to put it through its paces. Moreover, I want to find out whether it can also easily create genuine W3C-compliant web services. If so, this might well be a great product to use to create the next version of our system.

If you have experimented with ActiveGrid or deployed systems with it, please leave a comment and let me know your experiences with the product.

Slip Sliding into the Past

For nine years, I worked in the bowels of the Pentagon. Okay, maybe “bowels” is not the right word. I rarely went into the basement, that deep, dark and mysterious place. In the Pentagon basement, rats were not too difficult to find and all sense of direction was lost. It was a dark and horrid place. I worked on the third floor near the A (innermost) ring, which was a challenging enough place to work. Among other things, it was very noisy and constantly about eighty-five degrees. The Pentagon was designed before air conditioning and personal computers. With hundreds of PCs on all the time, it felt like an oven.

I still find it hard to believe that I spent nine years there. If there is one building in the world where I wanted to work least, it was the Pentagon. I had been there before. It was a confusing maze of dilapidated halls chock full of military guys wearing lots of stripes, stars and medals. They had short tempers, short hair and seemed to specialize in rushing frantically from meeting to meeting. While defending the nation was important work, at its core their mission was finding more efficient ways to kill other people. It was not an easy place for this liberal to work.

I ended up in the Pentagon because I wanted the security of the civil service again. I started my career with six years at the Defense Mapping Agency. Eventually I got restless and decided to try the private sector. I worked for the Democrats, but in 1988, during one of their periodic budget woes, I was rather abruptly laid off. To make ends meet, I scrambled and took a contract job. For three months, I worked as a subcontractor at the Department of Labor. However, with a new house I could not afford unemployment or even underemployment for very long. The civil service at least had the virtue of a steady paycheck. I found the Air Force at a job fair in Tysons Corner. The Air Force in the Pentagon was hiring. It took them less than three months to reinstate me as a civil servant.

Perhaps I should have suspected something. No doubt, my still active security clearance weighed on their decision to hire me. Still, it felt too fast and too easy. By government standards, they filled my position at something approaching breakneck speed. Thus, January 1989 found me boarding the 5N bus from Reston to the Pentagon every day.

For nine years, I worked in the Pentagon. I shall not name the organization. We directly supported the Air Force staff in the Pentagon with software systems. Our work was mostly classified. My particular niche was to support a decision support system written in a programming language called PL/I. It helped the Air Force figure out where they were going to place all their aircraft over the next five years.

For all the difficulty and hassle of working there, it was quite a learning experience. I cut my programming teeth in the Pentagon, working up from journeyman programmer to lead programmer to technical leader. For a civil service job, it could be very stressful at times. Taxpayers have this image of civil servants sitting at their desks tossing paper airplanes around. In this job, at times I was running a system that kept me on call in the middle of the night. I reported to colonels who did not take any excuses and had very short fuses. I learned a lot about my ability to deal with stress (not very well). I came to both admire the officers running around the place and loathe them. I admired their confidence and ability to get things done. I did not like the way they moved from job to job every couple of years. They rarely understood the culture of our organization. To get good performance appraisals, they had to look like they were changing things big time. Therefore, it seemed we were always in constant reorganization mode. Some years it amazed me that we got anything done at all.

Nevertheless, the Air Force in the 1990s was well funded. I got lots of training. Whether I wanted to or not, I learned all about software engineering. Moreover, because I was talented, I was eventually assigned to do some cool stuff. In the mid 90s, client/server architectures were all the rage. I was running a hip project written in PowerBuilder, a tool that now seems as antiquated as COBOL.

The military came and went every couple of years, but the civilians hung around, like lampreys on the hull of a ship. The civilian workforce there ran the gamut from every taxpayer’s worst nightmare of a civil servant to mediocre to talented to incredibly brilliant. In general, there were those who did and those who did not. Moreover, there were those with talent and those who could only write spaghetti code. Mostly we maintained legacy classified systems that ran on Multics (and eventually IBM) mainframes.

I left seven and a half years ago. Since that time, I have not given the old organization much thought. I have been busy moving on, working next for the Department of Health and Human Services and, for the last seventeen months or so, for the U.S. Geological Survey. However, I did find from time to time that I missed certain people with whom I had worked closely. In particular, I missed my boss John, plus Steve, Ray and Diane. In the early 90s, we formed a very effective team and worked very well together. Moreover, we knew how to kick back together. For example, on Fridays we would escape to a Shakey’s Pizza place in Annandale for lunch. There you could get all the pizza you could eat for less than $5. What a deal.

The golden years were few. We moved on and largely lost touch with each other. Ray retired. Diane took another job. Steve and John took jobs elsewhere in the Pentagon. Except for one retirement luncheon six years ago, I had not seen any of them until today.

I was one of the last people to get training in the obscure art of programming Multics computers. Through that community, I found a guy I used to work with. He had kept in touch with others from the Pentagon (he had moved on to the private sector), and he passed my email address on. When a former boss of mine announced his retirement, I got an invitation to attend the luncheon.

For about a week, I pondered whether I wanted to open up that part of my life again. I worked with a great team for a few years. I also spent the last few years of my time there working in a different branch. There I was the squeaky wheel. In that new branch, I was not well liked. Eventually the project manager I worked for threw a temper tantrum. I was thrown off her team and sent back to do mainframe programming, which I loathed.

To say the least I was upset and hurt. Not surprisingly, soon thereafter I shopped my résumé around. By 1998, I was out of the Pentagon and working for the Department of Health and Human Services. I knew if I went to this luncheon that I might encounter some of this bad karma again. Did I want to blow them off and lock out that part of my past? Or did I want to venture back after seven and a half years and maybe say hello again to some people I had grown to like?

Nothing ventured, nothing gained. Therefore, it was with some trepidation that I attended my old boss’s retirement luncheon today. I was a bit nervous. Seven and a half years is a long time. I would remember faces, but could I remember names? My worry was unfounded. I was hardly the only person returning after many years. One man attended who had retired in 1988. For the most part, I remembered the names of the people who were there, too.

My old boss John was there, two grades higher than when I last knew him. That alone justified coming. He now manages hundreds of people in a very demanding job. (Since he ran on adrenaline, I figured he was right where he belonged.) I was amazed that in sixteen years he had not aged a day. Ray was also there. He had retired more than five years earlier, but it was as if not a day had passed. We greeted each other warmly. Alas, neither Steve nor Diane was there. Diane had hoped to come but apparently did not make it. I do not know if anyone had even bothered to track down Steve. Yet conversation resumed naturally, as if I had not spent more than seven years of my life elsewhere. It seemed a bit odd.

And my nemesis L. was there too, as I expected. If my stomach was tightening, it was because of her. My most enduring memory of her was her screaming at me when she threw me off her team. Today we greeted each other cordially. In seven years, she had moved from project manager to director of the whole office. This is an amazing accomplishment. (When I left, she had only a high school education.) Her screaming fit at me aside, L. filled the mother hen role in the organization. Her specialty was people. While she obviously failed at establishing a healthy working relationship with me, she had worked her social charms (and hopefully competence) into the director’s job. I complimented her on her promotions and she politely inquired about my current employment.

As for the retiring guest of honor, I was glad to see my old boss Bill again too. Bill is a plainspoken man, and he took the time to take me aside. “Mark,” he said. “You were screwed by this organization.” He told me the story of how the nascent system I had led floundered after I left. To this day, it remains an expensive mess that does not meet the customer’s requirements. He said that because I was not available, a contractor had to be hired to write a functional description of the system. “You could have written it in a week.” Yes indeed. It was good to hear these words from Bill. I felt validated at last.

I did not hear similar words from my former nemesis L. However, I found her behavior a lot different. Maybe it came from having much more responsibility. She seemed more deferential toward me than I remembered. She talked about the vacancies in the office and encouraged me to stop by the office sometime and chat. With no malice in my voice, I told her I did not think that was likely to happen. Yet I could see her wheels turning. Perhaps she was thinking, “If I could get Mark to come back, he could fill a key role.”

On the drive home, I contemplated the idea of returning to that organization. I must confess that after so many years it felt comfortable jumping back into that culture. The nine years I spent there remain the longest I have spent at any one job in my career. It felt a little like going home to Mom and Dad’s and sleeping in your old bedroom again. Knowing L., I suspect I will hear from her in the coming weeks. If I do, I suspect she will be sounding me out on whether I might want to return to working for the Air Force.

I cannot see myself trading in my current job for the hassle of a security clearance and commuting into Arlington every day. Although I am a fairly new employee at USGS, I already realize that I am at last where I should be. Every job has its stresses, including my latest one. Nevertheless, USGS feels like the place where I should have begun my federal career. It is at USGS that I want to pour out my talent until I retire. I do hope that I hear from L. anyhow. I think she has regrets about her past behavior and wants to tell me directly. Perhaps then this old wound will fully heal.

The Transformation of the Information System

Like many of us in the information technology field, my career has been about creating and maintaining information systems. The techniques and technologies used have varied, but until now the process has stayed essentially the same. It goes something like this: get people to put data into a computer. Store it somewhere. Apply business rules to it by writing a lot of customized code. Then spit it out in the forms wanted by various data consumers.

Really, that’s it. It’s doing with a computer what people used to do in their brains. Computers just have the ability to do these things much more quickly and reliably. But of course you have to tell computers precisely what to do and the order in which it must be done. This logic is what we call code or software. While it has not made me rich it has kept me gainfully employed and enjoying a comfortable lifestyle.

There were classically a couple of ways to get information into a system. The method most often used at the start of my career in the early 80s was to stick someone in front of a terminal and have them enter data into forms on a screen. They then pressed a key and off the data went, through the ether and into a database somewhere. But there are other ways to bring data into a system. In the old data processing days, one popular way was to load big reels of tape from somewhere else and read them into mainframe computers. Since then we have found more efficient ways of recording some information in a computer. Bar code scanning is probably the best-known way.

Once the information is in the system it is scrubbed, processed, compared with other information and placed somewhere else. In other words, it is sort of assembled. An information system is a lot like a factory. Raw material (data) is dumped in at one end. Out the other end comes data on steroids: information. You know much more about something from the output of the system than you know from the relative garbage of facts that fed it. And this information is typically used to add value, such as to gain a strategic or competitive advantage.

That stuff in the middle, between keyboard and printer, was a lot of usually hand-crafted code. At the dawn of my career it was often written in Fortran or COBOL. During the mid to late 1980s it was more likely to be in languages like C, Pascal or PL/I. During the 1990s, object oriented programming languages gained ascendance. Instead of C, it was C++. Businesses that ground out client/server object oriented applications used development environments like Delphi or PowerBuilder. Data and the software used to manage those data began to merge into something called objects. Logically the stuff was still stored separately, but conceptually an object let us programmers get our brains around larger and larger programming problems. As a result we learned the value of abstraction. Our programming languages became more ethereal. It became a rare programmer who could actually write a binary search routine. Instead we called APIs or invoked methods on software objects to do these things.
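For old times’ sake, here is that binary search routine in Python: the kind of thing we once wrote by hand and now get from a library call.

```python
def binary_search(items, target):
    """Classic binary search over a sorted list. Returns the
    index of target, or -1 if it is not present."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1   # target lies in the upper half
        else:
            hi = mid - 1   # target lies in the lower half
    return -1
```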

Toward the late 90s, critical mass developed around the idea that data should be easy to move around. Businesses needed simpler ways to exchange data with other businesses. This was one of those “no duh” ideas that someone should have successfully run with twenty years earlier. Okay, there were ideas like EDI, but they were expensive to implement. Instead, with the Internet finally ubiquitous enough to use as a common data transmission medium, a data standard for the web emerged: Extensible Markup Language, or XML. In the process, data became liberated. Whether a field started in column 8 no longer mattered. Tags describing the data were wrapped around each element of data. An externally referenced XML Schema told you whether the data were valid or not. Instead of writing yet another unique application to process the data, a generic SAX or DOM parser could easily slice and dice its way through the XML. Using objects and modules built into the programming language of your choice, it became fairly simple to parse and process XML data. As a result, at least a bit less coding was needed to put a system together than in the past.
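The contrast is easy to show. Below, the same (made-up) personnel record is pulled apart twice in Python: first from a fixed-column layout, where the meaning lives in the column positions, and then from XML, where the meaning travels with the data.

```python
import xml.etree.ElementTree as ET

# The old way: the meaning lives in the column positions.
fixed = "JONES     ACCOUNTING0425"
name = fixed[0:10].strip()    # columns 1-10: employee name
dept = fixed[10:20].strip()   # columns 11-20: department
emp_id = fixed[20:24]         # columns 21-24: employee id

# The XML way: the meaning travels with the data, so a
# generic DOM-style parser can find each field by its tag.
record = ET.fromstring(
    "<employee><name>JONES</name>"
    "<dept>ACCOUNTING</dept><id>0425</id></employee>")
xml_name = record.findtext("name")
```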

The newest wave in data processing is called web services. Just as XML became a generic way to carry data with its meaning over the Internet, web services provide protocols for the automated discovery, transmission and retrieval of XML-formatted data. The best way to do this is still being hammered out. Protocols like SOAP are losing favor to simpler URL-based methods like XML-RPC and REST. We will figure out what works best in time. But equally as interesting as these web services technologies are the XML transformation engines now widely available. The XSLT (Extensible Stylesheet Language Transformations) specification, for example, allows XML data going into or coming out of a system to be transformed in an infinite variety of ways. It can be something simple, like converting XML data into a web page with the XML data embedded inside it. Or XML can be rendered into something more complex, like a PDF file or an MS Word document.
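Python even ships an XML-RPC module in its standard library, which shows how low the barrier has become. This sketch marshals a call to a hypothetical getQuote method into the XML that would travel over HTTP, then unmarshals it again; no server is involved.

```python
import xmlrpc.client

# Marshal a call to a (hypothetical) getQuote method into the
# XML request body an XML-RPC client would POST over HTTP.
request = xmlrpc.client.dumps(("MYSQL",), methodname="getQuote")

# A server-side library would unmarshal it the same way.
params, method = xmlrpc.client.loads(request)
```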

But what does all this mean? The light bulb finally went off yesterday. I was explaining to a colleague at work why I wanted web services in a system I manage. My team understood the value. Data and presentation could be wholly separated. With data in XML, it could fairly easily be transformed with the XSLT engine of our choice into whatever format we chose. The effect of this is to markedly diminish the actual logic needed to set up and maintain an information system. The big payoff? In theory, fewer programmers are needed, and it should be faster and easier to manage information. In addition, the system should behave more reliably, since less code is needed to run it.

For example, the system I manage is what we in the computer business call tightly coupled. It works great, but it is just a pain to maintain. The data, of course, is stored in a classical relational database. To get it out and present it to the user, we have to turn it into HTML. Right now we do this with a lot of code written in Perl. Naturally we get lots of requests to add this, delete that and show data rendered like this. And so once again, as programmers have done for a generation, we perform major surgery on our code and go through extensive testing until we get the results requested. But since we are a government system in a non-sexy agency, we are grossly underfunded. So most of these requests go into a programming queue. Many of these great ideas will be abandoned due to our tightly coupled system and our limited resources.

So what’s really interesting to me about these XML technologies is that we should be able to put together systems much quicker once we have the architecture in place. In addition, we should be able to make changes to our systems much quicker too. We could end up with systems that in the classical sense require little programming. The XSLT example on the W3Schools site shows how incredibly simple it can be to take data from an XML data store and render it as HTML. Once the XML schema is defined and the template is written in XSLT, rendering can be accomplished in just a few lines of code. Of course, this is a very simple example. But when I think about what sort of effort and time would have been required to render this same result in those pre-XML and web services days, I am a little awestruck. The productivity potential is sky high.
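For the skeptical, here is roughly that exercise in Python. The standard library lacks an XSLT engine, so the short transform below stands in for what the stylesheet template does declaratively: match each cd element and emit a table row from its children. The catalog data is invented.

```python
import xml.etree.ElementTree as ET

catalog = ET.fromstring("""
<catalog>
  <cd><title>Empire Burlesque</title><artist>Bob Dylan</artist></cd>
  <cd><title>Hide Your Heart</title><artist>Bonnie Tyler</artist></cd>
</catalog>""")

# What the XSLT template expresses declaratively, done by hand:
# for each <cd>, emit a <tr> built from its child elements.
rows = "".join(
    f"<tr><td>{cd.findtext('title')}</td>"
    f"<td>{cd.findtext('artist')}</td></tr>"
    for cd in catalog.iter("cd"))
html = f"<table>{rows}</table>"
```

With a real XSLT engine, the Python above collapses to a single transform() call against the stylesheet file.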

So I’m starting to wonder: do XML technologies mean that information systems will no longer require any crafting by programmers at all, but will instead be easily assembled? If so, this is revolutionary. And the pieces seem to be there. On the output side of the system, XSLT and an XML database work fine together at spitting out information in a useful format. There is little or no coding needed here to make that happen. But what about the input side? There is revolutionary news here too. Initiatives like the W3C XForms project are finding standards-based ways to gather form data intelligently. We programmers should not have to struggle much longer with HTML forms embedded with JavaScript client logic and server-based scripting logic. XForms will handle the job in an XML way that will minimize coding and markedly reduce maintenance costs.

And so there you have it: all the components needed to construct basic information systems in a generic fashion are nearly in place. Simple data collection and retrieval systems — what I have been doing my whole career — could potentially be built using open standards and without writing a line of code. With an XForms editor, we will draw our forms and push them out to browsers and other web-aware devices. Input interface: done. Web services can be utilized for the automated data interchanges needed between businesses. To realize this vision may require putting a SOA (service oriented architecture) in place first. A good application server will be able to get the data and persistently store it without much coding. And an XML-aware transformation engine embedded in or talking to the application server will take our template of choice and render it in the format and media wanted.

Will programmers no longer be needed to construct information systems? Not quite, at least not yet. Few applications are as simple as the one I suggested. And there are hosts of other variables to be thought through, including quality of service requirements that often require programmers. But I suspect that over time information systems will require fewer programmers. Instead, the emphasis will move toward the system administration side. Database administrators will still capture and manage data, but they will also tune the database for rendering content in XML. Business rules will move into the database or into rules engines attached to the application server. The result should be fewer programmers steeped in the mechanics of languages like Perl. Instead, we can expect more time spent tuning databases and maintaining business rules. Form data will be designed in an XForms editor. We will use similar tools to render output using XSLT.

Time will determine whether I am seeing the future clearly. But plainly I am not alone, since this is the whole larger point of XML technology. Companies like Microsoft have created packages like BizTalk just for this purpose. (Their Visio product is used to diagram the business rules.) It should get easier and less costly to create and maintain information systems. And over time we can expect that systems will be able to exchange and process data with each other much more simply.

The joy of coding

I’m a software engineer and a project manager, so I don’t do much in the way of coding software anymore. In truth most code writing and testing isn’t that much fun. I was kind of glad to be led out of the programming hole I was stuck in some ten years back. I realized I was writing the same code over and over again. It was getting boring. How many times can you code variations on the same do/while loop without pulling your hair out? It was better to give the work to some programmer grunts and work at a higher level of abstraction. Project management pays better anyhow, and college tuitions will be coming due in a few years.

Programmers may dispute this assessment, but they are the blue-collar people of the information age. We coders are software mechanics, really. At some point I was led out of the software garage and into the manager’s office because others thought I had bigger fish to fry. I try to keep a toe or two back in the garage, though. It feels more real than project management. Programming feels tangible, something I can take to the bank. Being a project manager feels ephemeral. I’m not sure I will have enough work to keep me busy a year from now. But I can always hang out my sign “Will code for food” if need be. I doubt “Will manage projects for food” would have the same marketing appeal. So I try, but don’t always succeed, to keep up my programming skills. This is a market that moves very quickly. I’ve done some programming in the Java language, for example, but need to do a lot more. I won’t be asked to code Java servlets in my job, but I may need to assign people to do that work for me.

I took up teaching web page design partly to force myself to keep up with new technology. It worked: I can now write valid XHTML, create cascading style sheets usually without consulting a reference manual, code cross-browser JavaScript, and I have a good working knowledge of some hot server-side scripting languages like PHP and ASP.

This blog is one place I practice. The underlying software is Movable Type, which is written in a programming language called Perl. If necessary I can go in and tweak the code, but it hasn’t been necessary. Setting up this place was pretty straightforward. Fortunately I also get to play with the PHP server scripting language on my forum, The Potomac Tavern.

My forum is based on open source bulletin board software written in PHP called phpBB. About the time I installed it I also ordered some manuals so I could learn to write PHP. phpBB also requires a database. A database called MySQL comes free from my web host, so I used that and ordered a book on MySQL too. The combination of the server operating system (Linux), PHP and MySQL is a zero-cost option for creating extremely robust and reliable web-based systems. And it turns out you don’t have to be a programming guru to do serious stuff in this environment. Much like those at the start of the PC revolution who put together Heathkit personal computers in their garages, hobbyists with a decent understanding of programming languages can do it themselves and have some fun. No need to work on a car in your garage anymore for amusement. Program some scripts for the web instead!
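To give a flavor of how little it takes, here is a minimal sketch of the PHP-plus-MySQL combination at work. The database, table, and column names are hypothetical, not phpBB's actual schema, and it uses the standard PHP 4-era mysql_* functions:

```php
<?php
// A minimal sketch of the zero-cost stack: PHP talking to MySQL.
// Connection details and the "topics" table are made up for illustration.
$link = mysql_connect('localhost', 'username', 'password')
    or die('Could not connect: ' . mysql_error());
mysql_select_db('forum_db', $link);

// Pull the five most recent topic titles and print them as HTML.
$result = mysql_query(
    'SELECT topic_title FROM topics ORDER BY topic_time DESC LIMIT 5');
while ($row = mysql_fetch_assoc($result)) {
    echo htmlspecialchars($row['topic_title']) . "<br />\n";
}
mysql_close($link);
?>
```

Drop a script like this into any directory Apache serves and you have a dynamic, database-backed page. Every piece of the stack (Linux, Apache, MySQL, PHP) costs nothing.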

A lot of programming is boring for me because it doesn’t mean that much. I’ve done a lot of patching and upgrading of systems written by others in my career, and it’s definitely not that interesting. It’s necessary work, just like the mechanic who has to replace your muffler, but it is boring. Most programmers would like to write something original and all their own. It gives them a feeling of ownership, a sense that they have created something meaningful. Unfortunately, unless you do it for your own amusement, such experiences tend to be few and far between. Sadly, much of this work can be outsourced to India instead of keeping Americans gainfully employed as programmers.

So it was a joy to find a coding project recently that was both creative for me and actually useful to a large number of people. Back in May I was looking at the phpBB forum software and thinking, “Why can’t it have digests? It works for Yahoo! Groups!” I frankly expected someone to have done it before, but no one had. So I began work on a “mod”, or “modification”, to the officially blessed phpBB software. With my modification you don’t get sent every message posted to your group, as happens with Yahoo! Groups. Rather, the software lets you fine-tune the digest you get: you pick the particular forums of interest and set a fairly wide variety of options. It is customized for you. It was a great mod that I installed on my own forum, and I learned a lot about the phpBB architecture and how to write good PHP code in the process. Eventually I packaged up the whole thing in a ZIP file and posted it on the phpBB web site. I figured it would get people excited.
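At its heart, per-user digest logic boils down to one query per subscriber. Here is a much-simplified PHP sketch of the idea; the table, column, and helper names are invented for illustration and are not the mod's actual code or the real phpBB schema:

```php
<?php
// Simplified sketch of a per-user digest: gather only the new posts
// from the forums each subscriber chose. All names here are hypothetical.
function build_digest($user_id, $last_digest_time)
{
    // The subscriber's chosen forums, e.g. from a subscriptions table.
    $forums = get_subscribed_forums($user_id);  // hypothetical helper
    $forum_list = implode(',', array_map('intval', $forums));

    $sql = "SELECT topic_title, post_text, post_time
            FROM posts
            WHERE forum_id IN ($forum_list)
              AND post_time > " . intval($last_digest_time) . "
            ORDER BY post_time";
    $result = mysql_query($sql);

    // Concatenate the new posts into one email body.
    $body = '';
    while ($row = mysql_fetch_assoc($result)) {
        $body .= $row['topic_title'] . "\n" . $row['post_text'] . "\n\n";
    }
    return $body;
}
?>
```

The real work in the mod is in the options layered on top of this: which forums, how often, what format, and so on, all stored per user.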

But it didn’t. It just sat there and got ignored. I didn’t understand it, because it was a great idea. But I guess its time hadn’t come yet. A week or two back I started getting inquiries about my modification. Is it going to be finished? Will it be submitted as an official phpBB modification?

Its time has come. The mod has now garnered a lot of interest, and it has kept my spare time increasingly busy making further modifications and fielding feedback from the developer community. Shortly it will be submitted as an official modification, and when it shows up on the list of approved phpBB software modifications, as I hope it will, I suspect it will be pretty popular.

No, there is no money in this work. When building on top of an open source platform, you just give it away. But there is a vicarious thrill and a pride of ownership, not only in writing some very cool and efficient code optimized for the phpBB software, but in garnering some fleeting, low-level fame among this community of people. These people are appreciative of my work. It is not only a needed enhancement to phpBB; from the feedback I am getting, it is also very well designed and thought out.

And that makes me happy and gives me a tangible feeling of accomplishment. Some people are jumping the gun and won’t wait for the final release. One guy from Brazil has been writing me with questions, and I’ve been helping him out. When I took a look at his site, though, I realized that I was really helping out … a low-level pornographer!

Well, why am I not surprised? Who were the pioneers on the internet? Not Bill Gates, that’s for sure. No, it was the smut merchants who figured out how to turn a profit on the internet first. If a pornographer or two finds a way to use my software modification to push adult content down to some horny end users looking for cheap thrills, that’s part of the deal. I’m sure it will find more legitimate uses in time.

It’s still a damn fine set of code. And I’m glad to know I’ve still got the right stuff.