Archive for the ‘Technology’ Category

The Thinker

Your blog deserves a great Content Delivery Network

While I do a lot of blogging, I suck at marketing my blog. Oh, I do look at who’s viewing my blog and check my statistics daily, and often more than once a day. Google Analytics provides a wealth of data on my web hits, and StatCounter is useful to see what was recently read. Aside from dressing up my blog’s sidebars with marketing stuff and making sure my content is easily accessible as a newsfeed, I can’t seem to be bothered to do much else.

Part of the problem is that my blog serves principally to keep me amused and to stave off boredom. If readers find an occasional post worthy of a Facebook Like or a Share, that’s nice, but I don’t lose sleep when they don’t. You would think that as a software engineer and someone who spent ten years directing the management of the largest web site in the U.S. Department of the Interior, I might find this web marketing business pretty easy. But one thing I learned early on is if you have great content, the marketing kind of takes care of itself.

In that job I simply worked to make the content more readily accessible and to make sure that the data was easily consumed. I spent much of my ten years there leading an effort to make the site’s data accessible as a set of web services. In this sense I do know marketing. When I left these new web services constituted the third most accessed site for my agency, in spite of not having existed just a few years earlier.

On this blog though my traffic is pretty anemic, particularly during the summer. There are things I could do to get more hits: shorter posts, more topical posts, turn it into more of a stream of consciousness blog and link ruthlessly to posts in other blogs, which seems to be the way blog aggregators like Tumblr work. Doing this though would ruin blogging for me. It might be successful, but I wouldn’t care. I’d be bored with my own blog.

During one of the recent Net Neutrality debates I mentioned that the Internet was already not net neutral. If you can afford little, you may (shudder) use an Earthlink dial-up account and watch web pages slowly draw themselves like they did in 1995. If you can afford $100 a month or more for Internet, or live in a place like Kansas City where you can get Google Fiber, you can cruise the Internet at 100 megabits per second or more. Some people have 1 Gbps connections.

If you have your own web site you also have some factors that limit the speed of your website. That’s the case with this blog. I host the site on Hostgator, which is a really good shared web host. What’s not optimal about Hostgator is that while it can reliably serve most content at $5 or so a month, getting the data between its servers and your computer can be like going through every traffic light in town to get home from work, as opposed to taking the expressway. It typically took eight or more “hops” to get my blog posts to my computer. A “hop” in this case means a router, which is effectively a traffic light as it routes parts of web pages from one place to another. According to Google Analytics, it took about ten seconds to load one of my web pages. Most of that was due to all those routers that had to be traversed.

So it finally dawned on me that this was probably a significant reason my traffic is declining. Google is looking at the hassle factor of getting content from my site, and is probably lowering my search rankings because of it. Aware of the problem for several years, I have used CloudFlare to try to speed up the serving of my content. CloudFlare is a content delivery network, or CDN. It specializes in reducing the number of traffic lights and making sure that my content travels over crazily fast connections, usually from a server physically close to where you are. Hostgator (and a lot of web hosts) offer CloudFlare for free to their customers. CloudFlare, like every CDN, sells a more expansive service for those with deeper pockets.

I had outsourced my CDN to CloudFlare, but I never really went back to see if it was doing a good job. There are things I could do to cache more of my content on CloudFlare’s servers (probably for money), but mostly I stuck with its defaults and ignored it. However, when I looked at Google Analytics, my average page load time was still stuck at around ten seconds.

Ten seconds is a long time to wait for content these days. So I figured I was probably losing a lot of readers because they lose patience and go elsewhere, particularly mobile users. We want every web page to load like a Google web page: fully dress itself for our eyes in a couple of seconds or less.

But not my blog. It was like a horse-drawn milk wagon compared with a racing car. Actually, this describes a lot of sites on the web, particularly Mom and Pop affairs where the owners know little or nothing about web architecture.

I decided to put on my software engineering hat and started researching CDNs some more. There’s a lot of competition in the market, mostly aimed at well-moneyed corporations. I’m just a little blog, however. And this blog runs on WordPress. What options do I have for a swift CDN that won’t cost me an arm and a leg? CloudFlare was free but it clearly wasn’t doing the job.

After some research I settled on MaxCDN. For about $9 a month it will serve my pages quickly. Of course if traffic increases a whole lot it could get a lot more expensive. But if I am content to use principally their servers in Europe and the USA (where most of my readers are) and I expect a terabyte or less of bandwidth a month, then $9 a month should be fine. I can afford that. My pages seem to load in about 3 seconds now. A lot of the sidebar stuff comes from elsewhere, so that slows things down a bit. But the main content, if it is cached, takes about a second to load. That’s pretty impressive for $9 a month. And this faster speed might draw in new readers.

So far it’s looking good. Today’s traffic is roughly double what it was two days ago. Over time Google may take notice and rank my posts higher in their search engine. Here’s hoping.

Does your blog or website need a CDN too? It can’t hurt if you can afford it, and it can’t hurt to do your research and see which CDN is best optimized for your kind of content. MaxCDN has a plugin that works with WordPress to facilitate the integration. It was a little tedious to get it configured but the instructions were clear enough. Some of it is kind of wonky (how many people know what minifying is, anyhow?) but the more technical you are the more you can fine-tune things.
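Since I mentioned minifying: it just means stripping out the bytes a browser doesn’t need (comments, extra whitespace) from CSS or JavaScript before serving it, so pages load faster. Here’s a toy illustration in Python; real minifiers, like the one behind the plugin’s minify option, are far more careful than this sketch:

```python
import re

def minify_css(css: str) -> str:
    """Crude CSS minifier: strips comments and collapses whitespace.
    Real minifiers do much more, and do it safely."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                        # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)          # no spaces around punctuation
    return css.strip()

before = """
/* sidebar styles */
.sidebar {
    color : #333 ;
    margin : 0 auto ;
}
"""
after = minify_css(before)
print(after)  # .sidebar{color:#333;margin:0 auto;}
```

Every byte saved is one less byte the CDN has to push to your readers.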

Please note you don’t need a CDN if you are using a hosted blogging platform like Tumblr or BlogSpot. They are already effectively CDN platforms as well as blogging sites. But if you host your own site and you want to increase traffic, integrating your site with the right CDN may be the most cost-effective way to go.

I’ll be watching my metrics and perhaps reporting success or failure in the months ahead. So far the signs look good.

The Thinker

Ashley Madison stupidly lets itself get pwned

So I have been streaming Mad Men on Netflix. It’s a strangely compelling series about the world of Madison Avenue in the 1960s. It’s a world of constant drinking, endless cigarettes and infidelity. The principal character is Don Draper (played by Jon Hamm), the creative director for the advertising firm Sterling Cooper. As we quickly learn, Don was previously Dick, he is a deeply messed up man, and he also happens to be one hunk of a guy. Don’s a liberal drinker, a liberal smoker and a liberal bed hopper as well. He does this while somehow staying married to his ultra-pretty and slinky wife Betty (January Jones).

It takes a few seasons but Betty eventually figures out Don’s infidelities. They divorce but Don keeps bedding the women, often inappropriately, including his secretary. Yet Don is hardly the only character in the series with his pants down. Most of the characters are involved in an illicit relationship or two. I have no idea how close any of this is to real life on Madison Avenue, but from what I’ve read it was not too far off the mark. Most of the men are caught between who they really are and the roles they are supposed to play. How they manage all this screwing around in these pre-Ashley Madison days is kind of mysterious, but likely all that booze helped reduce inhibitions.

Yesterday of course the infidelity website Ashley Madison quickly went dark after hackers posted a dump of its database on a number of websites. While bad for the cheaters out there, what it said about Ashley Madison was even worse. First, its security was laughably bad. Second, even after the hack they could have taken down the site and saved their forty million members embarrassment, but they didn’t. They kept collecting fees right up until they went dark. In short, they gave the online infidelity business a moral stink in an unexpected way: they were so busy chasing short-term profits that they were willing to throw their forty million customers at the mercy of their spouses. Doubtless the hackers provided samples to prove they had hacked the good stuff, including apparently seven years of credit card transactions. AM was hoping they would blink.

Doubtless, too, marital counselors and divorce lawyers are going to see a sharp increase in business. It would not surprise me if their phones were ringing off the hook. As for AM, I wouldn’t blame its customers if they arrived en masse to torch its offices. Cheaters of the world, unite! Anyhow, fifty years after Mad Men, there are still plenty of Don Drapers out there, and they are mostly hooking up online. Until a couple of days ago Ashley Madison apparently had the lion’s share and then some of this market.

What interests me is not that AM brokered infidelity. As disgusted as most people at least claim to be by infidelity and those who facilitate it, there are far worse things on the Internet, with ISIS beheading videos coming immediately to mind. Some entities like AM are to be expected in our electronic age. What’s interesting, and more than a little appalling, is how bad a job they did keeping their clients’ information confidential. As a software engineer, but also as a guy who is currently getting paid to ghostwrite articles about data security, I give AM an F.

Yes, AM kept a record of all its credit card transactions for the last seven years! It’s such a mind-boggling, stupid and reckless thing to do, particularly given the profitability of the site. It would have made much more sense to give in to the hackers’ demands and quietly establish a new site under a new name, oh, and fix those security problems too. Doubtless they had the money to do it. Forty million customers, figure 30 million of them men, figure each putting out at least $50: that’s at least $1.5 billion in revenue. Since they’ve been in business fifteen years, it’s likely a lot more than that. Not that we’ll know for sure: they aren’t publicly traded, although maybe their successor or whoever buys the brand (Vivid Entertainment?) will be, and will doubtless do a better job at security.
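Checking that back-of-envelope arithmetic, using the figures in the text:

```python
# Back-of-envelope revenue estimate: forty million customers,
# figure 30 million of them men, each spending at least $50.
men = 30_000_000       # rough guess: three quarters of the accounts
spend_each = 50        # dollars, a lowball lifetime figure
revenue = men * spend_each
print(f"${revenue:,}")  # $1,500,000,000, i.e. at least $1.5 billion
```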

If I had fewer scruples and more money I might create the next AM site, one that its dubious clients could actually trust. Of course there are always risks in anything done over the Internet. AM’s clients now understand that. The next AM is bound to arise from its ashes, and probably sooner rather than later. Here are some action items for whatever entrepreneur wants to sail these turbulent waters in the future:

  • Do not keep records of credit card transactions. Just don’t. Purge these daily, if not more often, from any internal databases. Don’t journal them on backup somewhere.
  • Do not collect any private information from your customers: real names, addresses, phone numbers. Instead, let some third party act as your payment broker. Your client gives the broker some money and the broker hands back an electronic token that confirms payment without actually identifying the client to your company. The future AM should never collect anything that could identify its clients.
  • Accept more discreet ways of payment. There are lower-tech and anonymous ways to pay fees confidentially: wire deposits and money orders, for example. I’d say accept Bitcoin, but Bitcoin is hardly anonymous.
  • Don’t use cloud hosting. Use your own data centers that only you can access and control.
  • One person can’t do this in his basement. So find employees who have a history of being trustworthy, very talented, and discreet and pay them very well. Give them incentives to be discreet. Make their bonuses contingent upon their contributions to improving the business’s security.
  • Retain security experts. To get AM’s entire database required a whole lot of bandwidth. This can be monitored. The tools exist to cut off suspicious behavior already.
  • Do regular vulnerability testing of your website and applications. The tools are out there. Of course fix any vulnerabilities found quickly.
  • Hire a CISO, a Chief Information Security Officer with of course the right credentials.
  • Don’t store obviously sensitive information, like a client’s IP address. Store passwords only as salted hashes computed with a deliberately slow algorithm like bcrypt or scrypt, never a fast hash like MD5.
  • Tell your customers what your security plan is. Get an annual (or more often) security audit from a trusted security auditor and publicize the results for your customers.
  • Provide your customers security tips, like clearing their browser history. I can think of another one: figure out a way for clients to share pictures anonymously. I’m pretty sure it could be done with Instagram.
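On the password point in particular, here’s a minimal sketch of what proper password storage looks like, using only Python’s standard library (the parameter choices are illustrative; in production you’d reach for a maintained library like bcrypt or argon2):

```python
import hashlib
import hmac
import os

# Illustrative scrypt parameters; n=2**14, r=8, p=1 is a common baseline.
SCRYPT_PARAMS = dict(n=2**14, r=8, p=1)

def hash_password(password, salt=None):
    """Return (salt, digest) using scrypt, a deliberately slow,
    memory-hard hash: the opposite of a fast hash like MD5."""
    salt = salt or os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
    return salt, digest

def verify_password(password, salt, expected):
    """Constant-time comparison so attackers can't exploit timing."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
    return hmac.compare_digest(candidate, expected)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("password123", salt, stored))                   # False
```

The per-password random salt means two users with the same password get different hashes, and the slow hash makes bulk cracking of a stolen database vastly more expensive.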

As for AM’s clients, those who are not on their way to marital counseling or divorce court, you might consider picking up strangers at bars again or just plastering them with lots of alcohol in the privacy of your office. It sounds cheaper and faster. It worked for Don Draper.

The Thinker

Ghostwriter (or the art of tricking Google)

All my life I wanted to be a paid writer. Being a writer sounded quite glamorous. You are paid to create and if you were good enough or wrote for just the right mass audience you could be wealthy like Stephen King.

Life didn’t work out that way for me. It’s probably for the best because most writers are starving writers, which means they do it as a hobby and not for much real income. They have other jobs that pay the rent. Moreover most writing is not glamorous, even when it pays well. Most writers dream of writing popular fiction. What most writers actually do is write articles for magazines or trade journals, or the local newspaper. They adhere to editorial guidelines. Their writing is not very creative. It’s about putting a number of facts and quotations on paper or online in a way that may be interesting enough for the reader to make it to the end of the article. These days even publishers don’t care if readers read the entire article or not. They are looking to serve ads. They care about whether your article attracts a lot of ad views. Whether it gets read is not that important, unless they are going for some sort of award.

So if you can find a writing job it is likely to pay poorly and be demoralizing to you and your self-esteem. And if you do manage to get a book published, it’s likely to sell a hundred to a thousand copies, with extras ending up in a discount book bin or just shredded for pulp for the next book. For the vast majority of creative writers, writing does not provide close to a living wage. Most editors will refuse to acknowledge your brilliance.

Recently though I did get paid to write. I was paid to ghostwrite. So in a sense I have become a published writer, although I think the content is going strictly online. Essentially, I’m being paid to influence Google’s search engine. Yes, I am writing for a set of algorithms! I’ve become something of a slave to the computer!

Google of course is the king of search engines. Getting high or higher on its search index is important. For many businesses it’s the difference between life and death. The only question is: how to get ranked higher than your competitors? Google is not telling, although it does give some hints. Needless to say there are plenty of companies out there that claim they can get your company ranked higher.

Most of these outfits are selling snake oil. There are lots of obvious things that can be done which don’t hurt, such as having URLs with meaningful information about your article, providing a sitemap.xml file and removing bad links. In the trade this is called “search engine optimization” or SEO. Everyone with the means to do so is already doing SEO. What you really want is for your company to appear on the first page of Google’s results, ideally at or near the top for a given search phrase. Those are the links that people will click on.
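A sitemap.xml, by the way, is nothing exotic: just a small XML file listing your pages so the crawler doesn’t miss any. A minimal sketch of generating one (the URLs here are made up for illustration):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical blog URLs for illustration
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/archive/technology/",
])
print(sitemap)
```

Drop the result at the root of your site and reference it from robots.txt, and you’ve done one of the few SEO chores that demonstrably can’t hurt.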

One of my clients has made a business of SEO. I’ll call him Dick (not his real name). He’s hired me for odd jobs maintaining his forum, generally because he’s too busy making real money to mess with it. Dick has a reputation in the SEO world of getting results. That’s why Dick sought me out to be a ghostwriter.

Dick’s success has come through building a company’s online reputation. He figured out that Google ranks sites that publish honest articles higher. I have no idea how Google assigns an honesty rating to an article, but somehow it’s got a built-in bullshit detector in its algorithms. If it doesn’t look like bullshit, it’s ranked higher. If it looks authoritative, it’s ranked higher still. If you publish lots of articles that look honest and impartial, over time it will raise the ranking of your company in Google’s search index. This is a long-term strategy and it’s a costly one as well.

So I was hired to write some technical articles in this client’s particular domain. It turns out I have pretty good credentials. First, I do information technology for a living, so I have practical and current experience along with a master’s degree in software systems engineering. Second, I write fairly well. Third, I am mostly retired. And fourth, I can write an impartial article. My years in government have actually helped. Government employees develop finely honed bullshit detectors, because we are constantly dealing with vendors trying to get their products and services into our enterprise.

Dick was also kind enough to provide a few sample articles for my topic. I use these, as well as my thirty years in the business, to crank out the articles. Generally they are no more than 800 words and follow a format. I charge by the hour. Since most of these are survey articles, I don’t really have to do any research. I just start writing. It takes me about three hours to write an article. I bill at $30/hour (my retiree rate). So far I’ve done two articles and earned $180 ghostwriting. There will probably be more, as the client is satisfied with my work.

I have no idea where these articles will be placed, but Dick tries to get them in online publications of authoritative sites. I could probably find them online if I looked. Dick does edit what I send him, so it may appear somewhat altered. But at least I am a published writer. Some people may find my articles interesting, but the only “person” of real interest is Google’s search engine. We are basically trying to fake it out. Dick’s client is essentially renting my experience for potential future customers and an improved reputation.

I’ll probably never know how this will all pan out. Some part of me thinks I am being dishonest. I am writing honest articles, but I am doing it on behalf of a company that doesn’t have the in-house skills or the time to do it. They are essentially renting my reputation, such as it is, to add to their reputation.

But hey, at least I am a published writer now! My pseudonym? Call me Anonymous.

The Thinker

Hillary’s emails: what the critics are missing

The current kerfuffle over Hillary Clinton’s use of a private Blackberry and private email server for her official business while she was Secretary of State is mostly about making a mountain out of a molehill. Nonetheless the molehill makes for a pretty interesting discussion and analysis. I have some thoughts about this coming from my time as a civil servant as well as some technical perspectives from my career in information technology that I haven’t heard in the media. Hence I’m taking some time to blog about it.

There are two main perspectives on the issue. You can look at it from either one and feel completely justified that your side is right. Let me advocate for both of them and you tell me which is right.

First, I’ll take the critical perspective. Records should be kept of official government business. The Secretary of State does a lot of official business and it impacts national and international policy. Moreover, the email threads of these historical events may provide useful lessons for the future. The Secretary of State is essentially a civil servant. She works for the taxpayers. So her email should be archived, not necessarily for instant critique, but for history and for congressional and criminal inquiries when they are needed.

However, she was not just anyone. She was the Secretary of State. I can think of few positions in the government, including the Director of the CIA, that are more sensitive. If I were trying to have a confidential back-channel communication with the Prime Minister of Israel, would I really want him to communicate with me through a state.gov address, even if the email were highly encrypted? Or through any email address at all? Would any leader outside our country want anything less than innocuous content to go through such a system? There is always the telephone, of course, and the Blackberry includes a telephone. However, a telephone is synchronous. It’s a relatively inefficient way to work. It’s much better to reply with thought and nuance when you have the opportunity to do so, i.e. use email.

The reality is that the Secretary of State (and most high-level government executives) has multiple channels of communication to do their business. Email is an important tool. Staff communication happens at another level and is also vital. In general, all sorts of lower-level communications have likely happened before the Secretary picks up the phone or sends an email. If there are times when a confidential email is the best choice for the Secretary, an off-the-record email system makes a lot of pragmatic and business sense. It’s hard for me to think of myself as Secretary of State, but if I were, and it was lawful and I had the money, I’d probably have done mostly what Clinton did, except I’d have kept a separate email account for strictly personal use. A private email address, though, was pragmatic and necessary. We should implicitly trust anyone we pick for Secretary of State. If we didn’t trust her, the Senate should not have confirmed her.

Using the same email account for both personal and official business, however convenient, is stupid. Personal systems are likely to be less secure than government systems, although government email systems are hardly perfectly secure themselves. One could make the business case that overall her public emails would have been more secure hidden on a private server inside the government’s technical enclave. Ideally she would have used a hidden, government-managed email server that was patched and highly secured.

However, those who think that she should have done all of her email using a state.gov email address clearly don’t have much of an understanding of how impractical that is. If this were her only government email address, it would be inundated with thousands of emails every day, even after the spam filter removed the obvious garbage. She would depend on staff to sift through it and flag the ones that she should read. Staff are not perfect though, and might miss the important ones. In addition, there are times when you really don’t want staff reading certain emails but you still need to communicate asynchronously, so you need a channel for that. And the open nature of email means anyone can send email to anyone. In short, this approach is not the least bit practical for someone in her position. She needed an email system that only let in those she needed to let in, and this could not be done with the technology of the time.

What she did was not unlawful at the time, but certainly gave out a bad odor. It feeds into conspiracy theories that the Clintons always attract. It suggests a need for rigorous control and confidentiality; something I argue is not unreasonable for someone in her position. Mostly though I think the problem here is that the technology did not exist that allowed her to do her work pragmatically. It still doesn’t exist. Email is not quite the right medium for what she needed, but it was a tool everyone had. A private email address and mail server was a pragmatic solution to a difficult problem.

It may well be that Hillary Clinton is as paranoid as Republicans believe she is, and that all their theories about her are true. If so she has plenty of company among Republicans. I strongly suspect that she is guilty of being pragmatic and efficient, and using these somewhat unorthodox means allowed her to be the highly productive Secretary of State that most historians agree that she was. And given the unique sensitivity and nature of her work, I think the ends largely justified the means here. I also believe that if there were a technical solution available that would have met her requirements, she would have used it.

The Thinker

The joys, glee and occasional giddiness of virtualization

We are living increasingly virtualized lives. From posting about our lives in Facebook, to wearing Google Glass, to playing games online where we don alter egos, most of us mask ourselves behind walls of electronic processors, networks and software. I’m typical. As for me, aside from Facebook there is this blog, which is not quite a true representation of me. Rather it is a projection of some part of me, perhaps my ego that I choose to share with the world. Most of you don’t know who I really am, which is by design. I go so far as to hide my domain details behind a proxy.

But there are other meanings to virtualization. In the computer world, virtualization is running a computer inside a computer. In my case, I am running Windows inside my iMac. It used to be I had separate machines, and the Windows box was a laptop provided courtesy of my employer. When I retired, I turned in the laptop, leaving me Windows-less. This normally would not be a problem, unless you need to teach a class where students will be doing work on Windows. That’s when I decided that rather than buy a new computer just to run Windows I would cheat. I would run Windows virtually inside my iMac instead.

There are a couple of ways to do this. The cheapest (free) way is to use Boot Camp, which lets you boot into a Windows partition when you start your iMac. This works fine but is inconvenient: you must shut down Windows and go back through Boot Camp to boot your computer as a Mac. What I really wanted was to run Windows and the Mac operating system at the same time and be able to share content between them. In short, I needed virtualization software.

Ah, but how to do it on the cheap? Windows computers aren’t that expensive, after all, so a virtualization solution would have to be cheap. It helps to be teaching a class, because I qualify for software at an academic price, roughly half the retail cost. That’s how I acquired Parallels, neat virtualization software that allows me to run Windows (and lots of other operating systems) virtually on my iMac. With discount it cost me just under $40. The only downside was I had to wait a couple of days for the USB drive with the software to arrive in the mail.

There was also the question of how to get a cheap license for the Windows operating system, as I didn’t have one lying around. Fortunately, the college where I teach has an agreement with Microsoft, which it uses on its desktop computers, wherein we teachers could install a free Windows license at home so we could do our work without having to come to campus. If you already have a license for Windows, you could install it in a virtual instance too. And after installing Windows, that’s where I stopped. I did not feel the need for a license for the Microsoft Office suite for Windows. I already have one for the Mac. Frankly, Microsoft Office is becoming obsolete. My needs are modest. I can generally do what I need using Google’s free tools, and they have the bonus of being easily sharable not to mention accessible in the cloud pretty much anywhere.

I did wonder if virtualization technology really would be reliable. I thought there must be some Windows software that simply cannot work in a virtualized environment. And there may be, but I haven’t encountered it yet. It all works perfectly fine. On my spiffy new iMac it runs at least as fast as it would if I had a dedicated Windows box.

Why bother in the first place? It’s not like I like Windows. For the class I teach, the students install a “lite” version of Oracle, Oracle 11g Express Edition, and it’s available for Windows and Linux, but students will install it on Windows. Even in a “lite” version, Oracle is a CPU and memory hog, so I was skeptical it could be run virtually, but it worked fine.

There was some puzzling over installing Windows in a virtualized environment. I needed a Windows image file, which I got from the college. Since Parallels is a virtual wrapper around Windows, it is software too; specifically, it is a hypervisor: software that oversees virtual operating systems. In principle, a virtual operating system instance runs in its own little sandbox and cannot harm my iMac’s operating system. In practice though, the Windows instance may be virtual but it is still susceptible to viruses like any non-virtualized operating system, so I loaded the free Avast antivirus to keep it safe.

Aside from my need for Windows for teaching, I have found it’s useful for other purposes. In retirement I earn income from consulting. Since I am doing web work, it helps to have web environments to do work in. Over the weekend I was involved in a hairy upgrade of a very large forum (about 670,000 posts) from phpBB 2 to phpBB 3. I tried it a number of times on my client’s shared hosting, and it kept failing. I ended up downloading both the programs and the database to my machine. I placed them in my virtual machine and converted the forum there. While it’s possible to install web server environments on the Mac, it’s more of a hassle. There are turnkey solutions for Windows web server environments, like the XAMPP package I am using, that are so much faster to install and maintain. On my machine of course I did not have the limitations of a shared server, and I was able to convert the forum.

There are some things that are just done better or more elegantly on Windows. Obviously, if you do any work for a business, they are likely using Windows, so having a Windows environment may be necessary. But there are some programs for Windows that are so nifty that there really is no equivalent for the Mac. Winmerge comes to mind and it is also free. I do have DeltaWalker for the iMac, but it is much harder to use. I have a version of Quicken for the Mac. The Windows version is much better. At some point I may move my Quicken files into my Windows virtual machine and take advantage of all these new features.

Beyond Windows, all sorts of virtual machines can be created using Parallels. I don’t have much need to install versions of Linux like Debian, but I can install it rather easily if needed and still have my iMac purring away. Parallels is also smart enough to allow copying and pasting rather easily between operating systems. In the Windows world, copy and paste is done with CTRL-C and CTRL-V. In Parallels, it will recognize the Mac’s CMD-C and CMD-V, which is more intuitive. Sharing files is more problematic. There are ways to do it, but so far I’ve been moving files on flash drives.

Overall, I am impressed by the ease of the Parallels virtualization technology. I effectively got a Windows machine without having to buy any hardware. With a free Windows license, my only real cost was the cost of the Parallels software. I have the benefits of both a Windows and a Mac without a lot of Windows hassles. I can do eighty percent or more of my work in the Mac operating system, but when I need Windows it is there reliably and transparently. Should I choose to get rid of Windows, it could not be simpler. I simply tell Parallels to remove my Windows virtual instance. Bing! Done!

So my computing life is good. I feel like I won the lottery. Among all the other benefits, virtualization technology is also environmentally friendly because I am running one physical computer instead of two. In short, it’s a slick and easy to use solution. Don’t be afraid to virtualize this aspect of your life.

The Thinker

Changes to subscription services

Sorry if this is a somewhat geeky post.

I am using the Feedburner feed service. It allows many of you to acquire this blog through various mechanisms that don't actually require you to come to the site, a great way to read the blog if you are busy and/or lazy. It either emails my posts to you or, by caching them on the Feedburner site, makes them highly available in your feed reader.

Feedburner was the first to succeed in this market. It hadn't been in operation too many years before it was acquired by Google and stuffed into its vast holdings. There it has been languishing, still working, but ignored. I can tell it is not being maintained because Google turned off the Feedburner API. Google can't even be bothered to maintain the documentation on the site; it still references Google Reader and iGoogle, which Google retired a year or so back. This means that Feedburner is becoming untrustworthy. Google will probably get rid of it at some point.

Syndication is an important way for me to distribute my blog posts. Feedburner says I had 118 subscribers on average over the last week. This includes 22 active email subscribers. Given Feedburner’s problematic and untrustworthy status, I need to take some actions.

Those of you who subscribe via email will start receiving posts from my blog instead. Mail will come from It’s possible your email program will move this into spam or trash. You may need to create a rule or filter to put these in your inbox. Each email should contain a link allowing you to unsubscribe.

Those of you that subscribe via news aggregators like may need to change the feed URL. Rather than get it from Feedburner, you need to get it directly from my site. This generic feed URL should work fine:

You can also choose feeds for a specific feed protocol:

Thank you and thanks for reading the blog.

The Thinker

IBM: The Dilbert of companies

We all BM

HARLIE (the computer), from When HARLIE was One, by David Gerrold

I grew up in an IBM town. IBM pretty much owned Endicott, New York when I lived in the area. The exception was the Endicott-Johnson shoe factories, which were in serious decline in the 1960s. In fact, IBM was founded in Endicott, New York in 1911.

Big, boxy, IBM-white concrete buildings line McKinley Avenue and other Endicott streets. If you didn't work for IBM, you prospered from mooching off of IBM. IBM guys were cool, if white guys in white shirts, black pants, narrow ties and short hair could be cool in the 1960s. In any event they lived well, worked hard and gave their all to the company Thomas J. Watson built. It sure looked like a cool company to me back then. Not only did it rake in billions in revenue, but its employees were happy, with terrific pensions, great salaries (because IBM hired only top talent) and pretty much a guarantee of lifetime employment. Management actually listened to employees and encouraged them to be creative and innovative. The guys (and they were almost all guys, except in the clerical or punch card pool) wore THINK buttons on their suits and shirts. THINK was so embedded in the company's identity that it was hard not to associate IBM with THINK (in capital letters).

That was then, but it bears no resemblance to the IBM of today. At least that's my conclusion having finished Robert X. Cringely's eBook on IBM, The Decline and Fall of IBM: End of an American Icon? Cringely has been a tech journalist since the 1980s, and made a name for himself (under a pseudonym, I am pretty sure) writing for InfoWorld, the tech publication that focuses on information technology in the enterprise. I credit InfoWorld for much of my career success, since it was always topical and ahead of current trends, plus it told me stuff I needed to know to succeed in the workplace of the moment.

InfoWorld is still around, but its print publication is long gone. So, in fact, is Robert X. Cringely. Well, not quite. You see, there are two Robert X. Cringelys. There's the guy who wrote the original columns over many years, and then there's the trademark "Robert X. Cringely", which InfoWorld claims to own. So a reputed tech spy named Cringely still writes there, but he is not the real Cringely, the tech guy who amused us with likely fictitious anecdotes about his relationship with "Pammy", a curvy younger woman who ran hot and cold. Reading his column was half neat behind-the-scenes tech news, and half soap opera. It was fun and addictive. Anyhow, the first and legitimate Cringely, now 60+, is still one of the few people doing honest information technology journalism, and can be read on his website. And I assume the model in the picture is "Pammy".

Cringely has been studying IBM for a long time, having grown up in an IBM town like me. He believes the company is ready to implode. This is because, very sadly, the company has morphed into the Dilbert of companies. It is overrun by pointy-haired bosses who are busy working their employees into early graves, when those employees are not being summarily fired so the company can hire deeply discounted and frequently incompetent replacements from India, who largely have no idea what they are doing and have not mastered the idioms of American English.

From the perspective of Wall Street, IBM is doing great. Its managers are doing a great job of increasing earnings per share quarter after quarter. It's a metric they are focused on like a laser beam. You know the problem with that kind of focus: it distracts you from the rest of the world. As Cringely's analysis points out, the things that should matter about IBM are simply being ignored. It's crazy what its managers are doing to its core assets, not to mention its employees. They are eating the seed corn, to use an analogy from the Civil War. For many years they have been relentlessly firing their best employees, mainly because they cost too much. They cut pensions and eventually did away with them altogether. They outsourced a lot of their work overseas, adding huge communication barriers and dispensable employees, who were often just cheap contractors, to handle technical interactions with their global services customers. These are very profitable customers that need a long-term relationship with a tech firm to manage their complex systems. Doing this right requires a deep understanding of their technical needs and their business, and a rigorous, engineered approach to managing their complex technical infrastructure. Done right, these are hugely profitable customers for life. IBM used to do this right; now it's hard to find a company that does it worse, or charges more for the privilege.

Sadly, the more you read of this relatively short eBook, the more appalling the whole thing becomes. (It's a quick read and, at $3.99, this self-published book that no publisher would otherwise touch is also a bargain. About half of it is an appendix of comments he has received over the years.) It doesn't take much reading, though, to discover the real problem: managers come exclusively from the sales ranks, not the technical ranks. Consequently they have little clue what their customers want, lack the creativity to direct their employees to deliver it, and can't even be bothered to ask. Moreover, IBM has more bureaucracy than the federal government, with incredible layers of hierarchical management, despite implementing a flawed version of the Lean efficiency program.

Managers and employees are often widely separated geographically, causing stilted communication that adds cost and delay. Not that employees have the time to give feedback. They are kept working like slaves: sixty or more hours a week, for wages now below industry par, and they are massively overcommitted, with the grim reaper of outsourcing always at their heels. Their customers are being pickpocketed too: they pay highly inflated prices for crappy services, made worse by contracts based on billable hours that are often inflated. The smarter customers have moved on, which is fine with IBM. It then lays off more employees, which helps increase earnings per share, and Wall Street applauds because it equates this with good management.

Cringely has solutions, but IBM's leadership has proven both tone deaf and hostile to anything that would create growth again in the company. As for listening to their employees, they simply can't be bothered. Which means that IBM is a shadow of its former self, and this has been going on for a decade or so. I know people who have been laid off from IBM. As I read Cringely, I wondered why they didn't bail long ago. In many cases, it's because they are in their late 40s and 50s, and it's hard to find a job that pays as well, or at all.

IBM is also buying back tons of its own stock, often with borrowed money, simply to prop up its earnings per share. No one seems to be looking at its sales and how they have been dropping, and how many of their largest customers have gone elsewhere. No one, least of all its management, is looking at the quality, innovativeness, or value of its product lines. Management simply isn’t interested.

What is IBM management good at? It's good at creating Potemkin villages: shells that look good to outsiders, but with hollow or non-existent insides. Its major advantage is a huge legacy of accumulated cash from its glory years, which lets it hide its inefficiencies and which it apparently won't invest in innovative products and services. Touring Endicott, New York, where only a couple hundred remain of the thousands IBM employed in its glory days, easily demonstrates its hollowness as a corporation.

Cringley’s analysis, and it’s voluminous as well as filled with insider dope, is unfortunately right. I don’t invest in individual stocks, but if the price of increasing earnings per share is to piss off its customers and stop creating products that lead the market or offer greatest value, then it’s only a matter of time before its house of cards collapses. From the looks of things, it shouldn’t be too much longer. It won’t matter to its managers. Much of their pay is based on IBM’s earnings per share so their prosperity is already assured, so in some sense they are betting on failure. By tying pay to earnings per share, IBM embraced a false Wall Street value. Real growth and real value comes from companies that innovate, like Apple Computers. IBM is proving to be the stodgiest and most tone deaf of companies. The Davids of the corporate world have already hit this Goliath with a rock on the forehead. Goliath simply hasn’t figured out that he is falling to the ground.

At the start of the book, Cringely relates a real story. As a child in the 1950s he had a great idea that he took to IBM. Thomas J. Watson himself read and forwarded his letter, and the boy actually got an interview with a group of IBM engineers. To say the least, those days are long gone. Watson must be rolling in his grave. Most likely, though, IBM executives will remain clueless until Wall Street finally notices and the company collapses into a bunch of sub-prime parts that get sold off by ticked-off stockholders. Pretty much any company out there could do a better job of managing these parts than IBM.

I hope you will read Cringely's book. It doesn't take long, and it should make you cry, particularly if you knew the IBM that used to be. It should also make you very angry.

The Thinker

The Internet is already not net neutral

Upset by proposals by the Federal Communications Commission to create “express lanes” on the Internet? If the current proposal now out for public comment becomes a rule, it would allow Internet Service Providers (ISPs) like Verizon and Comcast to charge a fee to those web sites that want faster content delivery.

This is the opposite of net neutrality, which is the principle that all web content should be delivered by an ISP at the same speed. (Strictly, at the same bandwidth; the signals themselves all travel at essentially the same speed.) The argument goes that without net neutrality, companies with deeper pockets, particularly those that are already established, such as Netflix, have an unfair competitive advantage over services or startups without such deep pockets. It's a concern I certainly share, so much so that I first blogged about it in 2006. Bottom line: I am still concerned, and I think this proposal must be fought.

What I didn’t write about back in 2006 was that there was no net neutrality back then either. Effectively, bandwidth is already discriminatory because it is based on ability to pay. It’s just based on your ability to pay, not the content provider’s. For example, Verizon has basically four tiers of Internet service from it’s “high speed” service (actually it’s lowest speed service) where content delivery does not exceed 1MB per second to its “high speed Internet enhanced” service where you can download at up to 15MB per second. It’s hard to quantify what the cost of the 1MB/sec plan is compared to the 15MB/sec plan, because it depends on many factors including what bundle you may or may not choose. Suffice to say if you want a 15MB/sec service, you will pay more than a 1MB/sec service. So if streaming Netflix is critical to you, consider their 15MB/sec service. (Of course, this assumes that the port between Verizon and Netflix can handle 15MB/sec. If it can’t then there is no point in paying Verizon the premium.)

You can think of the Internet connection from your ISP as a water pipe: if the pipe is big (and the water pressure is high enough), more water flows per second. What the FCC is proposing is to take this pipe and put two pipes inside it. One is a fat pipe that will serve certain content very quickly, the "fast lane". The other, smaller pipe is for those who can't afford to pay ISPs these premiums, i.e. the "slow lane". Since I live in traffic-congested Washington D.C., I think of the "fast lane" as the pricey HOT (High Occupancy Toll) lanes on the beltway, and the "slow lane" as the toll-free and usually congested other lanes. It's not hard to imagine the Internet feeling a lot like it did in 1995, when the hourglass was principally what you saw in your web browser and pages took forever to load, if they ever did. For those of us who remember those days, revisiting them sounds quite frightful. ISPs would have every incentive to throttle the slow lanes, because it would drive web content providers to come and negotiate to use the fast lanes. They would also have little incentive to increase bandwidth for their customers overall, but plenty of profit from fast-lane fees to funnel back to stockholders. It is the antithesis of what the Internet is about.

So already there is no net neutrality of content delivery, unless you have an ISP that provides a “one speed for all customers” plan. The issue is not content delivery; it is the speed of particular content distribution within the ISP’s network. Which brings up another less noticed way that the Internet is not equal. It has to do with Content Delivery Networks (CDNs).

If you access my blog with a browser you will notice it takes a while to render a page. Why is that? It's because I don't pay for a content delivery network. I ran a traceroute from home to my web site. My request passed through 13 routers (the devices that forward traffic between networks on the Internet) between my home computer and my web host:

1 wireless_broadband_router ( 0.525 ms 0.244 ms 0.216 ms
2 ( 7.083 ms 7.095 ms 8.161 ms
3 ( 9.435 ms 12.101 ms 12.305 ms
4 ( 9.731 ms ( 27.151 ms ( 8.855 ms
5 ( 10.166 ms ( 9.396 ms 10.254 ms
6 ( 9.610 ms ( 9.693 ms ( 10.872 ms
7 ( 8.733 ms 10.023 ms 9.717 ms
8 ( 10.252 ms ( 14.819 ms ( 12.388 ms
9 ( 39.468 ms 42.618 ms 37.101 ms
10 ( 42.852 ms 45.176 ms 44.283 ms
11 ( 50.270 ms 49.438 ms 50.270 ms
12 ( 49.692 ms 85.009 ms 50.379 ms
13 ( 49.597 ms

The signals still travel at nearly the speed of light, but there are thirteen stoplights between my computer and my web server, at least for me. You can see the round-trip time from my computer to each stop. The figure for hop 13, about 49.6 milliseconds, approximates the total latency to my site, since each hop's time is a full round trip to that point. If you do the same thing, the number of hops will probably vary, along with the access time. In short, it's relatively slow to get to, which alone may explain why my traffic is down. People are impatient when they click on a link to my site from a search index. So they go elsewhere, or get an effective CDN by using a subscription service to read content like Feedburner or
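Each hop's figure in output like this is a round trip from my computer to that point, so the last hop approximates the total latency. A small Python sketch that pulls the per-hop averages out of traceroute-style lines (the sample hop names and times below are made up, since the real hostnames and addresses are omitted):

```python
import re

# Hypothetical traceroute output; hop names and times are placeholders.
sample = """\
 1  router.local   0.525 ms   0.244 ms   0.216 ms
 9  isp-backbone  39.468 ms  42.618 ms  37.101 ms
13  webhost       49.597 ms  50.123 ms  49.880 ms
"""

def hop_averages(traceroute_text: str) -> dict[int, float]:
    """Map hop number -> average round-trip time in milliseconds."""
    hops = {}
    for line in traceroute_text.splitlines():
        times = [float(t) for t in re.findall(r"([\d.]+) ms", line)]
        if times:
            hops[int(line.split()[0])] = sum(times) / len(times)
    return hops

averages = hop_averages(sample)
last_hop = max(averages)
# Each figure is a full round trip, so the last hop's average
# approximates the total latency to the destination.
print(f"hop {last_hop} average: {averages[last_hop]:.1f} ms")
```

Fewer hops and shorter round trips are exactly what a CDN buys you.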

This is not much of a problem if I go to Here is the route:

1 wireless_broadband_router ( 0.557 ms 0.229 ms 0.202 ms
2 ( 6.919 ms 8.588 ms 7.432 ms
3 ( 12.248 ms 12.530 ms 9.252 ms

So basically Google has figured out a way for its servers to be “close” to me, usually geographically, so I get their content more quickly, or at least with fewer stoplights between their servers and my computer. This magic is done through a content delivery network. I’m pretty sure Google rolled their own, and that takes a lot of money, which Google helpfully has.

You can imagine that if a company wanted to create an amazing new search index, it would be at a significant disadvantage without a content delivery network. It probably wouldn't roll its own like Google, but would use one of the companies that do this for profit, like Akamai or Level 3. The technology behind this is interesting, but I won't detail it here; the linked Wikipedia article explores it if you are interested. Suffice to say it does not come free, but there are times when it is justified. The U.S. Geological Survey, where I work, uses a commercial content delivery network. Whenever there is a major earthquake it pushes the content out to the CDN; otherwise its servers would be overloaded, much as in a massive denial-of-service attack. It also gets the data out to the public more quickly, as the typical customer probably has to traverse only three hops instead of thirteen to get the information.
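The core trick of a CDN is to resolve the same hostname to whichever edge server is nearest the requester. Here's a toy sketch of that selection logic; the points of presence and hop counts are entirely made up, and real CDNs use DNS plus live network measurements rather than a static table:

```python
# Toy model of CDN edge selection: pick the point of presence (POP)
# "closest" to the client. Distance here is a made-up hop count.

EDGE_POPS = {
    "ashburn-va":   {"us-east": 3,  "us-west": 12, "europe": 10},
    "san-jose-ca":  {"us-east": 12, "us-west": 3,  "europe": 14},
    "frankfurt-de": {"us-east": 10, "us-west": 14, "europe": 3},
}

def nearest_pop(client_region: str) -> str:
    """Return the POP with the fewest hops to the client's region."""
    return min(EDGE_POPS, key=lambda pop: EDGE_POPS[pop][client_region])

print(nearest_pop("us-east"))   # ashburn-va
print(nearest_pop("europe"))    # frankfurt-de
```

Cached copies of the content sit at every POP, so each client gets served from its nearest one.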

We like to think that the Internet is free, but of course it isn't. We all pay for access to it. Even if we don't pay directly, we pay indirectly, perhaps for the cup of coffee at Starbucks while we surf on their wireless network, or through taxes if we use Internet kiosks at our local library. Doing away with net neutrality is just another means by which ISPs hope to make gobs of money from having a monopoly on the last mile between the content you want and your computer. This may be due, in part, to our refusal to pay for their pricier tiers of service. The only difference is that this time you are not directly paying for it; other content providers will be. (You would think ISPs might cut you in on the deal and discount your rate, but that assumes they are benevolent, and not the profit-obsessed weasels they actually are.) As we all know, nothing is free, so these costs will certainly be passed on to you if you are a subscriber, and the profit will go to the ISP.

Given that bandwidth to the home is a limited commodity, giving discriminatory access to web content providers that can afford to pay must by necessity mean that others will get less access. In that sense, the latest FCC proposal is smoke and mirrors, and it is in everyone’s interest to get off our lazy asses and oppose it.

You can leave a short comment to the FCC here or a long comment here.

The Thinker

There may be a Chromebook in my future

In principle, I am against getting in bed with any computer company. And yet it is hard to avoid.

Since 2008, I have been principally using Apple computers. I have an iMac where I do most of my work, and an iPad for when I want to read more than interact with the web. I also have, courtesy of my employer, a Windows 7 laptop. I need it for work, but there are also times when I just need Windows. Unfortunately, I'll have to turn in that machine when I retire August 1. I don't like Windows enough to want to buy a Windows computer, or even pay for a license to run it virtually on my iMac, particularly now that Windows 8 is the user interface. In any event, upon retirement this will leave me with an Android-based smartphone as my remaining computing device.

So you basically have to pick your platform. It’s almost always Windows or Mac for the desktop, and Android or iOS for mobile devices. None of them are ideal, even Apple with its shiny computers and snappy user interfaces. There is also no one-size-fits-all device, which is probably good because what you need often depends on your intended use.

For example, I don’t need to run Quicken on my Smartphone. I don’t need to edit Microsoft Office documents on my smartphone either, although seeing them on my smartphone is occasionally useful. When I am doing financial stuff, writing or banging out code, that’s when I really need a desktop or laptop computer. This kind of work is either mostly a lot of entering numbers or text. The work is primarily assertive computer use.

By the way, this is a term I just made up. It means I need to assert lots of real world facts to a computer, basically translating my thoughts into something that a computer can use. Assertive computer use often involves repetition but it also means expressing structured content and thought. Creating this post, for example, is assertive use. It requires not just a brain dump, but structuring my words carefully so exact meaning is communicated. In theory I can do this with voice recognition software. In practice it is much more efficient to do it with a keyboard.

During my last vacation I brought along just my iPad and a wireless keyboard, basically to see how realistic it was to do assertive computer work on a device that is really optimized for browsing. What I discovered was that it was possible to do assertive work, but it was a hassle. The Microsoft Office suite has now arrived for the iPad, but it doesn't make assertive work much less challenging. It's a hassle because an iPad is not a desktop computer; a tablet is built for browsing and for simple interactions that can be done by pointing. For assertive work, it's like expecting a subcompact to haul a trailer. It is technically possible, perhaps, but not close to ideal. Moreover, by its size and nature, it never will be ideal for this work.

So there is no one-size-fits-all device. We like to think that it can be done, but it can’t all be done elegantly on one device. But even when a device can do something elegantly, it cannot always do it optimally. That’s what I’m learning about my iMac. Mostly what I am learning is that after six years with the machine, I need to replace it. It’s not because there is something wrong with my machine, it’s that software has evolved a lot in six years. It’s gotten bigger and fatter and is causing my iMac to go into conniptions.

My 2008 iMac has 4GB of memory. That's no longer close to enough, particularly when I am using Google Chrome as my browser, but also when I am running Dreamweaver or any Microsoft Office product. Chrome is fast, provided you have the memory; I now need 16GB to get good performance and keep all the programs I use regularly handy. Unfortunately, I can't add more. Once physical memory is exhausted, starting new programs means waiting, and waiting: the operating system has to create a whole lot of virtual memory on my disk drive, which is far slower to read and write than RAM. It can take a couple of minutes to open Excel for the Mac, particularly if I have Chrome running.

Apple would like me to buy a new Mac, and I may have to. Six years is a long time to use any computer. However, the computer still looks like new. There is no reason to replace it other than the general slowness caused by the newer, more bloated programs I am running. I can't replace the drive with a solid state drive to improve performance, and I can't reengineer Chrome, Microsoft Office or any of these memory hogs. I could choose less memory-intensive programs, perhaps by using Firefox instead of Chrome, but I moved to Chrome from Firefox because of Firefox's instability.

The general problem is there is no way to really know how efficiently a program will run until you use it a while with other memory resident programs. Software developers, being lazy, assume you have the latest machines with plenty of memory and super-fast processors. Coding for minimal memory use generally does not occur to them. What I can do is use my iMac just for assertive tasks, like writing documents, coding and email and stop using it for web browsing, in favor of devices which are better optimized for that, like my iPad. Or I can get a new computer and go through the same cycle again in a few years.

Or I could get a Chromebook. A Chromebook is Google's version of a laptop computer, optimized exclusively for Google services. It runs Google's own Chrome OS operating system, which basically requires you to do all your work inside the Chrome browser. To use it effectively you generally need to be on a high speed wireless network. Of course, you have access to all the features of Google Drive, so you have word processing, spreadsheets and presentations. Google is also working hard to let it work easily when disconnected from the network, via Chrome Apps.

Why does this help? Well, for one thing, I don't need to wait a couple of minutes for Excel to load my spreadsheet. The functionality is already there in a Google spreadsheet. It's true that Google's spreadsheets are not quite the same as Excel, but they are now close enough. In addition, all the stuff on your Google Drive is readily sharable. Google spreadsheets even have capabilities that Excel does not, perhaps the most useful of which is that they live in the cloud, instead of sitting on your hard disk when you are a thousand miles away. And since my use is minimal, it is essentially free. There is no need to worry about installing the latest version of Google spreadsheets, and no requirement to pay a Microsoft ransom periodically to keep writing or maintaining a spreadsheet. I also don't need to spend more than a grand to upgrade my iMac; it's all done in a web browser. If it works as advertised, these hassles of doing a lot of my assertive work largely go away.

Moreover, I don’t need to spend a lot of money to buy a Chromebook. A decent Mac laptop is going to cost well over $1000. Chromebooks start around $200. Even if it only lasts you a few years, your data is in the cloud, hence always backed up. In addition, the device is cheap enough to easily replace. It can be used for most assertive tasks, as well as for browsing. Perhaps most cool of all, there is almost no “boot” time. Your Chromebook is available when you need it in seconds.

Its downside is limited use. If it can't be done in a browser or one of the Chrome Apps, you can't do it at all. But I don't see a Chromebook as my only computer; rather, it would be my primary computer, used except when I need the power of a desktop.

In short, it’s a pretty compelling solution as long as you don’t mind getting in bed with Google. If I’m going to have to get into bed with any company however, I might as well save money and time.

The Thinker

Mankind is not going to the stars

I’m something of a space geek. It helps to have grown up during the space race. I still find the exploration of outer space fascinating. Our knowledge of the universe is growing by leaps and bounds. Hundreds of planets have been discovered, so many that it no longer makes news to announce new ones. Many of these planets look like they might support human life, although it’s hard to say from such a fantastic distance away. Some of us are waiting expectantly for someone to discover the warp drive, so we can go visit and colonize these distant worlds. Maybe it will be just like Star Trek!

It fires my imagination, and it excites the popular press as well. It all sounds so exotic, fascinating and Buck Rogers-ish. The only problem is that almost certainly none of this will happen. We might be able to put men on Mars, but colonizing the planet looks quite iffy. Even colonizing the moon, as I suggested when Newt Gingrich was promoting the idea during his last presidential campaign, is probably cost prohibitive. Which means we need to keep our dreams of visiting strange new worlds in check. It won't be us, it won't be our children or grandchildren, and it probably won't happen at all. To the extent we visit these worlds, it will be virtually, using space probes.

I don’t like to be the bearer of bad news but hear me out. We probably won’t colonize the moon permanently because it will be too costly to sustain it. It’s one thing to land a man on the moon, which America did rather successfully. It’s much more costly to stay there. For a sizeable group of humans, say ten thousand or so, to live self sufficiently on the moon is probably impossible. If it can be done it will take a capital investment in the trillions of dollars. My guess is that it would take tens of trillions of dollars, if not hundreds of trillions of dollars. It’s unlikely the moon has enough water that we could mine but if it does it’s likely very inefficient to process as it is wrapped up in moon dust. Otherwise water would have to be imported from earth at ruinous costs. In fact, colonists would have to import pretty much everything. Even if citizens of the moon could grow their own food and recycle their own water, manufacturing is likely to be limited. We might have a small scientific colony on the moon, like we do at the International Space Station. It probably won’t amount to more than a dozen men or women, and it’s likely to cost at least ten times as much as the international space station cost, since you have to move equipment and men a much further distance.

What about colonizing Mars? It doesn't seem out of reach. When the orbits line up right, a spacecraft can transit each way in about six months. The cost of getting a pound of matter to Mars, though, is likely ten to a hundred times the cost of getting it to the moon, which is probably cost prohibitive in itself. The journey there and back also looks chancy. It's not just the possibility that some critical system will fail on the long voyage; it's the cosmic rays. Our astronauts would absorb a heap of radiation, the equivalent of 10,000 chest X-rays. It looks, though, like this is probably manageable. What of living on Mars itself?
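That X-ray comparison is easy to sanity-check. Assuming roughly 0.1 millisievert per chest X-ray, a commonly cited typical dose:

```python
# Back-of-the-envelope check on the "10,000 chest X-rays" figure.
# Assumes ~0.1 millisievert (mSv) per chest X-ray; the true dose
# varies by machine and technique.

CHEST_XRAY_MSV = 0.1
xrays = 10_000

total_sv = xrays * CHEST_XRAY_MSV / 1000  # 1000 mSv per sievert
print(f"{xrays} chest X-rays ~ {total_sv:.1f} Sv")
# About 1 Sv: the same order of magnitude as published estimates for
# a Mars round trip, and enough to measurably raise cancer risk.
```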

The good news is that humans could live on Mars, provided they don't mind living underground. The atmosphere is much thinner than Earth's, it is much colder than Earth in general, you can't breathe the air and live, and, with no global magnetic field and little atmosphere, the surface is bathed in radiation. It's true that by essentially burying our houses in Martian soil, humans could be shielded from much of that radiation; slapping on some SPF-50 sunscreen won't do the job, and anyone on the surface will have to wear a spacesuit. So far we haven't found a reliable source of water on Mars either. Colonizing Mars is within the realm of possibility, but the probability is fairly low. Frankly, it's a very remote, cold and arid place with nothing compelling about it other than a lot of empty mountains and valleys and swirling Martian dust, almost always in a pink or orange haze.

Colonizing distant moons and asteroids presents similar problems: no conditions suitable for sustaining life as we know it, insufficient gravity, toxic radiation, frequently toxic chemistry, and cold of a sort most of us simply cannot comprehend. Both Venus and Mercury are simply too hot to inhabit, and Venus is probably as close to hell as you will find on a planet in our solar system.

What about colonizing planets around other stars? Here’s where we need to fire up the warp drive of our imaginations, because as much as physicists try to find exceptions to Einstein’s theories of relativity, they can’t. The closer you get to the speed of light, the more energy it takes, like Sisyphus pushing his rock up the mountain. To get a spacecraft with people in it to even 10% of the speed of light looks impossible with any technology we have or can reasonably infer. The closest star, Proxima Centauri, is a bit more than four light years away, so even if that speed could be achieved the trip would take over forty years. But it can’t be achieved. In fact, we’d be lucky to reach 1% of the speed of light, which would make a journey to Proxima Centauri a voyage of more than four centuries. Moreover, even if some multigenerational crew could survive the journey, it is likely that nothing habitable awaits at the other end.
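The arithmetic here is simple enough to sketch. Using the measured distance to Proxima Centauri, about 4.24 light years, and ignoring time spent accelerating and decelerating (a generous simplification), the one-way coast time is just distance divided by speed:

```python
# Back-of-the-envelope one-way travel times to Proxima Centauri,
# ignoring acceleration and deceleration phases (a simplification).
DISTANCE_LY = 4.24  # distance to Proxima Centauri in light years

def travel_years(fraction_of_c: float) -> float:
    """Coast time in years at a constant fraction of light speed."""
    return DISTANCE_LY / fraction_of_c

print(f"At 10% of c: {travel_years(0.10):.0f} years")  # ~42 years
print(f"At  1% of c: {travel_years(0.01):.0f} years")  # ~424 years
```

At 1% of light speed, roughly four centuries pass before arrival, which is why any such voyage would span many human generations.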

Perhaps we could freeze ourselves and wake up centuries later at our destination. Maybe that would work, but obviously we don’t have the technology to do anything like it now. And given the laws of entropy, it’s hard to imagine any spacecraft surviving a voyage of that duration intact.

What we need is a warp drive and a starship. But what we really need is an escape clause from the theories of relativity, and the technology to exploit it: a spacecraft that can slip through a wormhole or something. It’s not that this is provably impossible, but with what we know it looks forever out of our reach. In any event, there don’t appear to be any wormholes conveniently near Earth.

In short, we are stuck on planet Earth. We’d best enjoy what we have and stop squandering the planet on which all human life depends. So far we are doing a terrible job of it.
