Mobile Web Like the Web of the ’90s (Usability)

Usability is generally ignored on the web today, not because it isn’t a big deal, but because the “common” design patterns are all reasonably usable.  Users are comfortable with the interfaces, and nobody does anything remarkably stupid.  In the late ’90s and early 2000s, that wasn’t the case.

Today, the mobile web is all the talk, and apparently we have the same usability problems we had ten years ago.  While users have an 80% success rate when attempting a task on the web from a computer, that rate drops to 59% on a phone.

“Observing users suffer during our  … sessions reminded us of the very first usability studies we did with traditional websites in 1994,” writes Jakob Nielsen (free plug: I found this article on his website, useit.com).  Indeed, the Web 2.0 “design strategy” of two columns instead of three, the most common operation front and center, and large fonts shows that the Web 2.0 “revolution” largely amounted to Flash being replaced with sensible JavaScript and designers finally following usability guidelines, whether intentionally or accidentally.

The oddest thing about the computer/IT industry is that it doesn’t maintain institutional knowledge or learn from the past.  When basic web forms were decried as a throwback to the 3270 mainframe model, you would think the old mainframe hands would be treated as experts, but in an industry where 18-to-25-year-olds can be productive, there is little interest in expertise.  As the mobile web becomes more and more important, usability may make the difference between success and failure.  The idea that I should go to my computer to check a map seems as ludicrous as the idea that I should use the phone book!

CMS and CPU Usage

I have normally been averse to Content Management Systems (CMS), because they are generally coded poorly in order to support a “plugin” format.  Each page routinely makes dozens of database calls, which can put a big strain on the CPU.  On the other hand, they let an individual programmer quickly add LOTS of functionality that would otherwise have required a team of programmers months to develop.  I used to find them particularly heinous because they destroyed SEO attempts, but all the modern systems let you have a reasonable URL structure.  As a result, if you are successful and decide to build the $100,000 website, you can always point the old URLs to the new location and not break links.
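
With Apache’s mod_alias, for instance, a couple of permanent redirects is all it takes to keep the old links alive.  A minimal sketch; the old and new paths here are hypothetical:

    # Map hypothetical old CMS URLs to their new locations with 301s,
    # so bookmarks and inbound links keep working.
    Redirect permanent /old-cms/about.php http://www.example.com/about/
    Redirect permanent /old-cms/contact.php http://www.example.com/contact/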

However, I got a reminder today that a spike in traffic can destroy your server if you aren’t optimized.  If you are getting promoted on television, being interviewed, or otherwise getting mentioned on a popular program that might send a few thousand people to your site at one time, be careful.  Even if bandwidth isn’t a problem, CPU and memory might be.  If you are expecting a spike, make your home page static.  Most of your visitors will land there, and if you make it a static page (mod_rewrite them to the dynamic script if they have a logged-in cookie), you’ll drastically cut your database load.
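
Here’s a minimal sketch of that mod_rewrite trick, assuming Apache, a pre-generated copy of the home page at /index.html, and a made-up “logged_in” cookie that your CMS sets at login:

    RewriteEngine On
    # Visitors with the (hypothetical) login cookie still get the dynamic script
    RewriteCond %{HTTP_COOKIE} logged_in=1
    RewriteRule ^/?$ /index.php [L]
    # Everyone else is served the pre-generated static home page
    RewriteRule ^/?$ /index.html [L]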

In fact, I think most sites would do well to always make the home page static.  Something we did “back in the day” was program the whole site dynamically, then “mirror” the home page to a file with wget or something similar.  You could have most of your site mirrored to static files and serve up dynamic pages only to logged-in users.  That’s what Slashdot used to do.
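
Something like this cron entry would do it; the paths, the script name, and the five-minute interval are all arbitrary here, and writing to a temporary file first makes the swap atomic, so nobody catches a half-written page:

    # Re-snapshot the dynamic home page every five minutes
    */5 * * * * wget -q -O /var/www/index.html.tmp http://localhost/index.php && mv /var/www/index.html.tmp /var/www/index.html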

Either way, you should remember to optimize your main queries and create the appropriate indexes as part of bringing a site live.  Bandwidth isn’t the only constraint; sites without dozens of servers need to worry about CPU and memory usage as well.  A popular television show can send WAY more simultaneous traffic than social media or search engines.
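
As a sketch, with a made-up articles table standing in for whatever your hottest query actually hits: run the query through EXPLAIN, and if it’s scanning the whole table, add an index that covers the WHERE and the ORDER BY:

    -- Check the plan for the (hypothetical) home page listing query
    EXPLAIN SELECT id, title
      FROM articles
     WHERE status = 'published'
     ORDER BY published_at DESC
     LIMIT 10;

    -- A composite index answers the WHERE and the ORDER BY
    -- without a full table scan
    CREATE INDEX idx_articles_status_date ON articles (status, published_at);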