Automated Speech must be Free Speech

As aggregation and news feeds become the primary way we digest information, the definition of electronic speech will be a critical test of whether we maintain the free flow of information we have grown accustomed to, or whether the free press will simply wither on the vine.

The New York Times is attempting to muddy the issue by making fun of “free speech for computers,” but it is not the computer that has free speech rights; it is the company or owner of the machine whose freedom of expression needs protecting.  The 2003 case that Mr. Wu mocks was Search King’s lawsuit: a search-engine spamming company had been manipulating Google’s algorithm and had its rankings dropped.  Search King argued that Google was responsible for its business loss; Google was vindicated on the grounds that its “search ranking results” constitute the company’s free speech, and you cannot sue over them.  Had the case gone the other way, organic search as we know it might be dead, for any time a site dropped in the rankings, it could sue.

Nobody would argue that my blog post isn’t free speech, or that a large company doesn’t have some basic free speech rights for a corporate blog (commercial speech, not political speech, but still speech).  So what is “Computer Speech?”

Any site that automatically aggregates information, crawls the Internet looking for information and organizes it, or even operates a search engine is being treated as producing “computer speech,” despite the fact that the computer is executing clearly designed behavior to express what it is intended to express.  Take away that free speech, and AltaVista, Google, and every other automated search engine never come into existence, the early news aggregators never appear, and sites like Reddit may never exist either.

“Computer speech” may be commercial speech, or it may be someone’s rambling opinion, and whether I express my views algorithmically as a programmer or verbally as a writer, I should be entitled to the same protections.

Few Monopolies Bridge two Eras, Google’s Cockiness Unwarranted

The only constant of the computer industry is the utter failure of any one company to dominate two generations of computer technology.  IBM’s dominance in the mainframe era was replaced by Digital’s minicomputer dominance.  The PC/workstation era was characterized by a variety of Unix vendors on the workstation side while Microsoft dominated the PC side.  That era of multiple poles, with Apple as a significant player, seemed to end as Windows 95 brought Microsoft to monopoly status, and Office 95’s integration with Windows 95 displaced WordPerfect and Lotus 1-2-3 as the dominant desktop applications through a combination of bundling, technical malfeasance, and marketing muscle.

Despite the drama of the Netscape vs. Microsoft “browser wars,” Microsoft was never able to extend its dominance of the desktop onto the Web.  The free Linux operating system with the free Apache web server simply out-muscled Microsoft in the server space (in part because FreeBSD provided the high-performance server platform that Apache grew up on, while Microsoft tried to maneuver a server designed for fighting NetWare over file and print into a web server), Adobe dominated the development tools, and free standards, despite attempts at manipulation by Microsoft, largely own the Internet space.  The period in which people were willing to develop an IE-only web was relatively short-lived, and the Netscape plugin vs. IE ActiveX fight seems like a blip compared to the modern era of dynamic, standards-compliant (or relatively open Flash) environments.

The post-Web Internet, where the application replaced the web site as the area of interest, has been dominated by Google in a way not seen since Microsoft’s early monopoly.  Just as the DOJ complaint against IBM left an opening for Microsoft to monopolize the desktop, the investigation and suit against Microsoft created enough breathing room to open the market to new players.  In the last years of the past decade, Google’s industry dominance has resulted in every website honoring its search guidelines and applications supporting its APIs, and its embrace of the AJAX tool set legitimized the technology despite its having effectively been created by a Microsoft extension years earlier.

With that new dominance, we’re seeing a newly humbled Microsoft battling an increasingly arrogant Google, creating a new dominant “evil empire” for companies to compete with.  Email marketers trying to work within Microsoft’s guidelines can get a detailed report on their email system, while Google’s Gmail offers a handful of vague help pages.  Microsoft’s street address of “One Microsoft Way” was often mocked as a mindset rather than an address, but increasingly Microsoft is willing to work and cooperate with other companies, while Google makes changes in secret that affect the livelihoods of millions.

Dominant players of one era happily live on as profitable organizations in the next, if their management makes the right changes and can carry the customer base to a new environment.  IBM remade itself as a server and services company, and Microsoft offers solutions in a multi-vendor world.  On the other hand, AOL is a shell of the company it was when it dominated both the dial-up and instant-messaging environments (see my article on how AIM should be where Twitter is, but somehow failed to extend its dominance to the communication landscape), though it may still find a way to bring its existing users and customers to a new market position.  WordPerfect and Digital were swallowed up by other companies, and other formerly major players are no more.

Facebook currently controls a rich application environment with tremendous reach, and Apple’s iPhone, iPod Touch, and iPad ecosystem is an interesting niche, but whether either can challenge Google’s dominant position over the Internet remains to be seen.

Demographics of Twitter — Teens Catching Up

The latest demographics show Twitter usage among teenagers catching up with older demographics.  The service still dominates in the 35-54 year-old segment, which makes sense given that mid-career professionals with nobody looking over their shoulder at the office all day (literally: they are more likely to have an office, or at least a large cubicle) are making more use of a tool that requires constant connectivity.  However, teenagers are slowly taking more of an interest in Twitter, which seemed odd to those who assume that technology is most often adopted by the young and moves up the age brackets.  In practice, Twitter captured the BlackBerry-addict demographic, not the texting-on-a-phone demographic it aimed for.  The comScore blog entry shows this with some lovely charts.

Twitter started by assuming that you’d want to update your close circle of friends on your goings-on.  When Twitter hit the scene, my friends in urban areas on the coasts jumped on it to tell everyone what they were doing socially.  The teenage demographic doesn’t WANT to publish everything publicly, at least not where their parents and/or teachers can find it.  Myspace offered teenagers tremendous room for self-expression, while Facebook focused on the college (and later high school and young professional) markets of dating and social connection: college students keeping in touch with high school friends, and so on.

Interestingly, Twitter is now much better integrated with other parts of the web, making it a more useful tool for this demographic.  One of my high school classmates posted on Facebook that we should follow her on Twitter, as she isn’t on Facebook much anymore; whether this is inevitable or a function of Facebook chasing Twitter and de-emphasizing what made it originally popular remains to be seen.  The old core of Facebook, finding old friends and reconnecting or sharing college experiences with friends across the country, seems to have been supplanted in a barrage of data.  Facebook knocked Myspace off as leader by offering a clean and easy-to-use interface, but when it started fighting Twitter for buzz, the news feed stopped being about sharing photographs and became more about comments on statuses and wall posts, making it more and more a poor impersonation of Twitter.  If you want statuses and comments, Twitter’s world of feeds and mentions is a far cleaner interface than Facebook’s increasingly cluttered system.

Teenagers either have a close social circle that they keep in touch with, or are looking for ways to break out of the social world they inhabit during the day.  A service that wants to reach them needs to offer one (or both) of those options.  Twitter offers teenagers the ability to aggregate the information flow that interests them, and its increasing integration with other aspects of the web makes it more interesting.  When I was in high school, BBSes were the online way to communicate; by college, ICQ and later AIM became the online social center.  As Twitter takes over that portion of the mindspace, its relevance in that group increases.  The idea that my instant messages would be published on a website (even with the distinction between direct messages and public ones) seems odd to me, but AIM seemed odd to the email/USENET users before us.

Death of Search, Long Live Search

The growth of social media has Internet marketers wondering whether these new areas of interest mean the end of search as the heart of an Internet marketing campaign.  I have always resented the SEO tag for my ideas on the Internet, because the concept of gaming the search engines has been dead for over five years now.  The growth of link-based engines, starting with Google, made gaming the engines less useful than a simple, coherent strategy.  By building content with the user’s needs in mind, you were naturally doing SEO, with good links, clear text, and simple, content-rich sites.

The emergence of social media as a new avenue for traffic and links only adds more aspects to your traffic strategy.  It is no longer “Google or bust” when you can generate traffic from Twitter or Facebook.

Good content, useful materials, clean HTML, and publishing your information into social media can all help you gain links to your website, or visitors who may leave comments and enhance your site.  Anyone on the Internet for more than eight years remembers “surfing,” where you would click around from site to site exploring.  Pre dot-com, websites linked to each other; Google’s wars on spam may have discouraged linking for a number of years, but with the growth of social media, people are out exploring the Internet again, and that helps publishers with good content find more traffic.

Bad Idea, But Predictable

Short URLs were created to serve a valuable purpose: as URLs get long (think long query strings, or SEO-friendly long text strings), emailing a link is problematic for those using text mail clients, as the text wraps around.
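For illustration, the core of such a shortening service is tiny: map an incrementing database id to a short base-62 code.  This is a minimal sketch, not any particular service’s implementation; the “” domain and the in-memory dict are stand-ins for a real domain and datastore:

```python
import itertools
import string

ALPHABET = string.digits + string.ascii_letters  # base-62: 0-9, a-z, A-Z

_db = {}                   # short code -> long URL (stand-in for a datastore)
_ids = itertools.count(1)  # stand-in for an auto-increment primary key

def encode(n: int) -> str:
    """Encode a numeric row id as a short base-62 string."""
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n:
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars))

def shorten(long_url: str) -> str:
    """Store the URL under a fresh code and return the short form."""
    code = encode(next(_ids))
    _db[code] = long_url
    return "" + code

def expand(short_url: str) -> str:
    """Reverse the mapping, as the service's redirect endpoint would."""
    return _db[short_url.rsplit("/", 1)[-1]]
```

Because the code grows one character per factor of 62, even billions of links fit in six characters, which is why the shortened form stays shorter than almost any real URL.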

Twitter’s use of “shortened” URLs for the 140-character limit is totally arbitrary.  If you are sending a tweet via SMS, the protocol supports a URL being passed along as data, not text.  Further, one could always shorten the URLs for SMS purposes and not on the web.  And on the website, you could use anchor text, the words that you click on, instead of the URL itself.

Nonetheless, Twitter decided not to support URLs as special items, and the shrunken URL became a part of Twitter culture; it is here to stay in any area where posting a link doesn’t show anchor text.

Now this is a horrid idea.  Creating a special URL isn’t a horrible thing, switching seems pretty harmless, and offering a shrunken URL format seems fine.  The “permalink” of /year/month/day/URL-friendly-title works for the pre-2000 Internet days that the search engines still live in, making it SEO-friendly, but it is less friendly for today’s world of social media and quick URL sharing.

However, that doesn’t appear to be what they are doing.  They appear to be pushing it as a shortening service, so your site keeps its own address, but your links are shrunken if you choose to use short URLs.  I suppose this serves a purpose for Twitter posts worried about link rot, but it may also trap you: if you outgrow their limited blog feature set, how do you make certain that your links don’t rot out?  A shortener that survives long-term keeps your links safe, but one tied to a single host may not.

Given that they share a VC relationship with Twitter, I think they are pretty safe, because if they can’t figure out a business model, the VCs can usually force a merger of their two investments.

Twitter as the Next Web Browser?

When the first web browsers came out, most resources weren’t on the web.  HTML files, hypertext sets of links, would point to resources on Gopher servers or, more often, FTP servers.  The existing network of FTP indexing and search tools got HTML front ends, and the web evolved.  Critics pointed out that there was nothing new: HTTP instead of FTP just made the transfer stateless, without logins.

Now a shopping service has created a Twitter service where you send them a message, and they find what you are looking for and tell you.  Reuters considers this an innovative business.  You can obviously use the website, but there is nothing magical about the Twitter connection: there is no reason you can’t SMS them, instant message them, or email them instead.  In fact, over a decade ago, those of us without direct Internet access could FTP via email.  There were gateways where you could email a request and get a response, and using UUCP email gateways, those of us with daily email connections could get files; it just took a few days, with one directory listing every two days (on day 1 you requested a directory, the email went out that night, and the response came back after the next connection, so on night 2 you got the response).

The Twitter API makes it easy for you to integrate, both sending and receiving messages via Twitter.  There isn’t any reason that it’s easier for users to Twitter you, but for people on Twitter all the time, I guess opening a web browser is now an inconvenience.
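The point is that the transport is interchangeable.  Here is a minimal sketch of a reply handler that could sit equally well behind Twitter mentions, SMS, IM, or an email gateway; the catalog, the product names, and the prices are all invented for illustration:

```python
def find_product(query: str) -> str:
    """Stand-in for the shopping service's real lookup; the catalog is fake."""
    catalog = {
        "red scarf": "$12 at Example Store",
        "usb cable": "$4 at Example Store",
    }
    return catalog.get(query.lower(), "no match found")

def handle_mentions(mentions):
    """Turn each incoming (user, text) message into an @-reply.

    Nothing here is Twitter-specific: the same handler works for any
    channel that delivers a sender and a line of text.
    """
    return [f"@{user} {find_product(text)}" for user, text in mentions]
```

Wiring this to a real channel is just a polling or delivery loop around `handle_mentions`, which is why the Twitter version of the service is no more magical than the email version.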

Twitter isn’t a technologically interesting system, but it sure is a clever social phenomenon.  If you have a web business and you aren’t harnessing Twitter, you’re missing a source of traffic.

Media Submarkets on the Web

I love reading what marketing-focused online marketers have to say, because coming from a technology background, I like understanding what my colleagues without a background in tech think are the market-moving forces.  I’ve been quoting a bunch of articles from Media Post, because the daily emails often prompt a good opportunity to think.  Mr. Allen inadvertently suggests that the social media of today traces its roots to the early days of the web, and while he is correct that the desire for interactivity showed itself in the early web, the underlying technology has only recently supported it.

Though today’s websites share no common code with the BBS world of the 1970s-1990s, they share a cultural desire to exchange information, files, and resources online.  Some of the early Unix BBS systems were designed to support information sharing like a dial-up BBS between local users at a university, albeit over the TCP/IP network and Telnet instead of a modem and a terminal emulation/dialer program.

However, the “social media” world of today required a certain technological shift.  The “Web 2.0” shift and the AJAX acronym didn’t require new technology, but they did require a changing software landscape.  In 2001, when I started in this business, trackable links that didn’t break search engines required custom coding and careful management of HTTP protocol responses; in 2009, off-the-shelf services do it for you.  In 2001, building a website meant building an article repository to manage content; in 2009, many CMS systems are available off the shelf.  In 2001, SEO was emerging from the hacker days of AltaVista and riding the PageRank mathematics of Google’s rise and Yahoo’s use of Google PageRank for sorting.
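What “careful management of the HTTP protocol responses” meant in practice: a tracking link should record the click but answer with a permanent (301) redirect, so search engines credit the destination page rather than the tracker.  A minimal sketch, with the link table and click log standing in for a real datastore:

```python
import time

click_log = []  # stand-in for a real analytics store

def tracked_redirect(link_id: str, destinations: dict) -> tuple:
    """Record the click, then answer with a 301 redirect.

    A 301 (Moved Permanently) tells crawlers the destination is the
    canonical location, so link credit flows through; a 302 would not.
    Returns (status line, headers dict), as a web framework handler might.
    """
    target = destinations.get(link_id)
    if target is None:
        return ("404 Not Found", {})
    click_log.append((time.time(), link_id))
    return ("301 Moved Permanently", {"Location": target})
```

Getting this wrong, for example answering 200 with a meta refresh, or 302, was exactly the kind of subtle mistake that broke search engines and required the custom coding described above.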

Why does this matter?  In 2001, building a website required technology skills.  In 2009, a hosted blogging service has you up and running in 15 minutes, and you can start working on your site.  The early promise of the Web was two-way communication.  Netscape shipped with an HTML editor, because the whole concept of hypertext was easily shareable and editable documents.  The HTTP spec had concepts of file movement that were never adopted until the DAV group realized you could do collaborative editing with them.  HTML editing turned out to be too complicated, but Web 2.0 featured the concept of mini code.  IFrames let websites include content from elsewhere, but you were at the source’s mercy for how it displayed.  Instead, we have interactive forms that pull information from anywhere.

The social media of today traces its social roots to the first acoustic modem on a computer, but the technology is new.  When AJAX emerged as a popular acronym, it became socially acceptable to put critical content behind a Javascript layer, previously a no-no of web design: Javascript for convenience was accepted, but never required.  The underlying technology was there, but easy libraries brought it to junior programmers.

Designing a high-end website still requires technology and database skills.  But prototype-grade environments like Ruby on Rails and CakePHP brought RAD concepts from the desktop Visual Basic world to web programming for everyone.  And while they certainly produced many applications that don’t scale, they made these rapid-fire AJAX/JSON mini web services easy to write, and that made the social media world possible.
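To make the “rapid-fire AJAX/JSON mini web services” concrete, here is a minimal sketch of the kind of tiny endpoint an AJAX page polls for fresh content; the data shape and field names are invented, and frameworks like Rails or CakePHP reduce writing one of these to a few lines:

```python
import json

def latest_comments(store, since=0):
    """A tiny JSON endpoint: return comments newer than the id the page last saw.

    `store` stands in for a database table; a real handler would read
    `since` from the query string and write this JSON as the response body.
    """
    fresh = [c for c in store if c["id"] > since]
    return json.dumps({"comments": fresh})
```

The page calls this endpoint in the background and splices the new comments into the DOM, which is the whole mechanical basis of a “live” social feed.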

So while marketers may see this evolution of mini-markets, they miss the underlying technology shift.  Once a medium is cheap to create, the advertising on it becomes affordable.  The wire service made real reporting cost-effective, the web made mail order effective, and the underlying language libraries let companies without a technology team build interactive websites, creating these markets.