2018 Social Media Strategy Overview


There is no one-size-fits-all strategy for social media, but ignore it at your own risk. There are several major platforms and a plethora of minor ones, but for small business brands and political brands, there are only a few worth focusing on. Thank you to Pew Research for putting these facts together.


YouTube

YouTube's reputation is that of a video streaming service, but the YouTube.com site is a much broader social platform. Comments, discussions, sharing, and thumbs up/thumbs down scores all contribute to the YouTube experience. Sure, you can embed a YouTube video on your website without using those functions, but YouTube.com is now the second largest search engine (after Google.com), and Google video searches rely heavily on YouTube. A strong YouTube presence, including the social components of commenting on related videos and replying to comments, is very important. As of January 2018, 73% of adults use YouTube.


Facebook

The largest of the pure social platforms: 68% of Americans have Facebook accounts. Only Facebook knows what percentage of them are regular and active users, but Pew pegs it at 74% of American Facebook users using it at least once a day. That's self-reported, so take it with a grain of salt, but Facebook may be the easiest and most direct way to reach people. It is not sufficient to have a Page that you share content to. Links to your website, with proper boosts, and audience-building campaigns are critical to giving brands the ability to engage with the people who interest them.


Instagram

Instagram, the Facebook property, is in third place, with 35% of Americans having accounts, 60% of whom use it at least daily. That makes Instagram an important, but not as critical, part of your social media strategy. A simple Instagram account, with a regular stream of pictures, captions, and good hashtags, can go a long way toward building your brand. A serious effort to build followers and engage can have more serious results, but it depends on your brand. If your goal is to build deep relationships with users (premium luxury brands), Instagram should be front and center. If you need a more casual relationship with the bulk of the population (think politicians), Instagram can be more perfunctory.

Niche Platforms: Pinterest, Snapchat, LinkedIn, Twitter

These platforms are all relatively popular, each with a decent population and dominance in its area, but none has a universal presence. The usage gap between these platforms and Instagram is only a few percentage points, but Instagram is growing rapidly while these niche platforms are relatively static in their user bases. For completeness: Pinterest reaches 29% of Americans, Snapchat 27%, LinkedIn 25%, and Twitter 24%. These aren't small audiences, and each holds a large segment of the population in its niche.

Pinterest is very popular but demographically specific: its user base is 81% female, with a median age of 40. Active pinners are younger and even more heavily female, and among Millennials it equals Instagram. It has a strong advertising platform, and is very strong around lifestyle, hobbies, and brands. If you market your brands toward women under 50, Pinterest is a great addition to your platform. (OmniCore's Pinterest Statistics)

Snapchat is popular, but niche. The advertising platform is immature, and it is challenging for non-celebrities to build a following there. Unless you are in fashion, music, or other youth-targeted segments, it is probably more worthwhile to focus on other platforms. But if you are targeting college students and younger, Snapchat is essential. (OmniCore's Snapchat Statistics)

LinkedIn is valuable, but expensive to market on. If your target audience is business professionals, it's a critical platform. Sales professionals live and die by LinkedIn, while the gainfully employed may only look at LinkedIn when job hunting. If you are selling into corporate markets, LinkedIn needs to fit into your platform. If you are marketing to consumers, LinkedIn is probably not going to generate an ROI.

Twitter is a strange platform. It's relatively small, but sometimes has an outsized influence. The media popularly credit it with dominating the 2016 election, yet comparatively few people are on it. It is more popular overseas, where its low data needs and free-flowing conversations avoid censors. Twitter is extremely popular with journalists, public relations firms, and celebrities, and the ability to run your messaging from a cell phone makes it much more flexible for those in the business of communicating with those industries. While Twitter shows videos and images, pure text messaging still works. If your business is looking to reach journalists, generate publicity, or communicate with customers in a free-wheeling fashion, Twitter should be part of your communications strategy. Twitter's advertising tools are shockingly primitive, but it's very powerful if you are trying to reach the demographics active there.

High Performance Writes with Triggers

A high performance system I was working on hit a technical problem: heavy, repeated load from a single data source caused the Apache threads to collide and time out, leaving the load balancer returning a 503 error.  After much frustration, it was finally tracked down to this process.  When a data record enters the system, it is recorded and counters are incremented, and as it moves to the monetization systems, each transmission is logged.  Each transmission also has counters updated via trigger, all of which presents a tremendous amount of data as simple lookups or small aggregates.  Knowing how many records came in from a company on 5 tracking codes over 3 months means an aggregate query that sums 15 rows (the monthly total for each tracking code), but it also means a tremendous number of UPDATE ... SET count = count + 1 triggers.  When a second event arrives before the first one completes, a deadlock can occur as multiple transactions attempt to update the same row.
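The deadlock-prone pattern looked roughly like this; a sketch, with all table and column names being illustrative rather than the actual schema:

```sql
-- Illustrative counter table: one row per company/tracking code/month.
CREATE TABLE transmission_counts (
    company_id    int,
    tracking_code text,
    month         date,
    count         bigint DEFAULT 0,
    PRIMARY KEY (company_id, tracking_code, month)
);

-- Per-row trigger that bumps the matching counter.  Two concurrent
-- transmissions for the same company and tracking code contend for the
-- same counter row, which is where the deadlocks come from.
CREATE OR REPLACE FUNCTION bump_transmission_count() RETURNS trigger AS $$
BEGIN
    UPDATE transmission_counts
       SET count = count + 1
     WHERE company_id = NEW.company_id
       AND tracking_code = NEW.tracking_code
       AND month = date_trunc('month', NEW.sent_at)::date;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER transmission_count_trigger
    AFTER INSERT ON transmissions
    FOR EACH ROW EXECUTE PROCEDURE bump_transmission_count();
```

The win is on the read side: the monthly totals become trivial index lookups instead of large aggregate scans, at the cost of serializing writers on the counter rows.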

The solution?  A holding table without triggers.

Another table for transmissions was created, lacking any of the counter triggers that cause the deadlock.  When a transmission is logged, it is immediately recorded in this simple table, which returns control back to the system.  A background daemon then calls a single stored procedure to migrate the data:

  1. Record the ids of all transmission records currently in this table (the database should handle this so changes in other transactions don’t interfere, but doing it explicitly makes the procedure more portable and has a negligible impact on the process).
  2. As a single insert, bring all these records from the holding table to the permanent table.
  3. Delete these records from the holding table.
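Assuming a holding table named transmissions_holding that mirrors the permanent transmissions table (both names illustrative), the three steps might be sketched as a single stored procedure:

```sql
-- Sketch of the migration procedure; names are hypothetical.
CREATE OR REPLACE FUNCTION flush_transmission_holding() RETURNS void AS $$
BEGIN
    -- 1. Explicitly record which ids we are migrating, so rows inserted
    --    by other transactions mid-flight are left for the next run.
    CREATE TEMPORARY TABLE migrating ON COMMIT DROP AS
        SELECT id FROM transmissions_holding;

    -- 2. A single insert into the permanent (triggered) table.
    INSERT INTO transmissions
        SELECT h.*
          FROM transmissions_holding h
          JOIN migrating m ON m.id = h.id;

    -- 3. Remove the migrated rows from the holding table.
    DELETE FROM transmissions_holding
     WHERE id IN (SELECT id FROM migrating);
END;
$$ LANGUAGE plpgsql;
```

On recent PostgreSQL versions, a DELETE ... RETURNING inside a writable CTE feeding the INSERT can collapse steps 2 and 3 into one statement, but the explicit three-step form is easier to port.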

As a result of this process, the deadlock condition is resolved, and this procedure takes a fraction of the time because all the updates can be processed in the transaction without conflict.

The only drawback is that the system’s counters and transmission log are only accurate as of the last time the procedure ran.  To keep the intrusion minimal, a daemon was written in bash (see this tutorial for a starting point) to simply run the stored procedure, sleep, and run again.  To protect yourself from potential race conditions, which result in deadlocking, do three things to avoid trouble:

  1. Use the same sequence for the primary key field in the holding table as the real one.  That way, if you attempt to copy a record a second time, the insert will fail because the ids are already in use (this also lets me do bulk adds to the permanent table from a data pull without wasting time in the holding table).
  2. Run the function as a dedicated user.  Have the function run with definer permissions, and create a dedicated user to run the script, capped at a single login.
  3. Run a daemon, not a cron job.  If something delays the script and you cron it every minute, you could start a second copy before the first completes.  With a daemon, the second run doesn’t start until the first is done.
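A minimal sketch of such a daemon, assuming the stored procedure is named flush_transmission_holding and a dedicated flush_user role exists (both names hypothetical):

```shell
#!/usr/bin/env bash
# Hypothetical flush daemon: run the migration, sleep, repeat.
# Because this is a single foreground loop, a second flush can never
# start before the first one finishes -- unlike an every-minute cron job.
while true; do
    psql -U flush_user -d mydb -c "SELECT flush_transmission_holding();"
    sleep 5   # tune the interval to how stale the counters may be
done
```

Run it under your init system or a process supervisor so it restarts if it dies; the single-login cap on the dedicated user provides a second line of defense against overlapping runs.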

When using a holding table, make certain that your only goal is to let the triggers run after control has returned to the client.  In a less performance-critical system, a simple LISTEN/NOTIFY structure instead of counter triggers will get the job done with less complexity.
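A rough sketch of that NOTIFY side, again with illustrative names:

```sql
-- Instead of counter triggers, notify a listening worker that new
-- transmissions exist; the worker updates the counters at its leisure.
CREATE OR REPLACE FUNCTION notify_transmission() RETURNS trigger AS $$
BEGIN
    NOTIFY transmission_logged;
    RETURN NULL;  -- AFTER STATEMENT triggers ignore the return value
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER transmission_notify
    AFTER INSERT ON transmissions
    FOR EACH STATEMENT EXECUTE PROCEDURE notify_transmission();

-- The worker's connection simply runs:
LISTEN transmission_logged;
-- ...and recomputes or increments the counters whenever a
-- notification arrives, outside the hot insert path.
```

Since a single worker holds the counter updates, the row contention that caused the deadlocks disappears.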

Caveat: referential integrity can be compromised here if you are not careful.  If other things depend on this table, this approach will not work, as records will be recorded in the system but not yet present in the table that foreign keys reference.  Engineering around that limitation may create tremendous complexity.

An alternative approach to explore: partial replication to a second server.  In that scenario, all the control code (metadata) would replicate from the repository to the secondary server(s), and each of those would hold temporary insert tables.  As you add servers, make the sequences on each server count by multiples (with two servers, one uses even ids, the other odd).  Then you can replicate those tables back to the main database with the LISTEN/NOTIFY system, without worrying about multiple calls in short order.
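The interleaved-sequence trick looks like this for two servers (sequence name illustrative):

```sql
-- Server 1 hands out odd ids: 1, 3, 5, ...
CREATE SEQUENCE holding_id_seq INCREMENT BY 2 START WITH 1;

-- Server 2 hands out even ids: 2, 4, 6, ...
CREATE SEQUENCE holding_id_seq INCREMENT BY 2 START WITH 2;
```

With N servers, use INCREMENT BY N and start values 1 through N; ids generated on different servers can then never collide when the rows are merged back into the main database.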

Performance Tuning Websites

Download speed used to be one of the ways you could tell a real web pro from a graphic designer who knew how to make things pretty.  One of our exercises used to be making pages in a single table (this predated CSS), carefully spanning rows and columns to move your content around: a pain to develop, but fast to render, since Netscape’s browser used to be terrible at “embedded tables.”  Obviously, this is archaic (along with worrying about 28.8k modem download speeds), but the concept of a page that is fast to download and fast to render never went away.  When Google officially announced that it would take page load time into account, people finally started to pay attention.  One of the best starting points is Yahoo’s Performance Guide.

However, the process of making a website fast is pretty straightforward:

  1. Home page and other entrances: VERY fast and simple
  2. Limit third party items that might cause delays via DNS or download
  3. Prevent things that can get VERY slow from being on these pages

#1 and #2 get the most attention, but #3 has potentially the biggest impact and is the most ignored.  Adding gzip to your server cuts file transmission size, which is nice and can turn 80 ms loads into 20 ms loads, but #3 is where page loads can balloon from 100 ms to 10-15 seconds.  For example, if you query the database to build your navigation, your navigation is easier to manage, but one hiccup at the database level (one that locks that table) and your site hangs while loading.  A solution like Memcached moves your “read only” data out of the database and into RAM.  You can still manage it in the database, and the site will update relatively quickly, but there is no reason to consult the database repeatedly for information that changes infrequently.
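For instance, gzip compression can be enabled in Apache with a few lines of mod_deflate configuration; a sketch, so adjust the MIME types to your content:

```apache
# Compress text responses on the fly (requires mod_deflate to be loaded).
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css \
        application/javascript application/json
</IfModule>
```

It is a one-time server change that shrinks every text response the site serves, which is exactly the kind of cheap, across-the-board win #1 and #2 deliver.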

Third party servers often get ignored, but you have no control over when they have problems.  Serving Javascript from a third party means you don’t have to maintain it, but puts you at risk of the user’s experience degrading massively.  Consider removing as many third party elements as possible.  Google’s Javascript-based tracking for Google Analytics has near-zero impact on page load (beyond the transmission of the text and the download of the Javascript library): unlike images, it tends not to cause performance problems, and because the library loads at the bottom of your page, the site will render even if the browser is having trouble obtaining it.

Getting the 50% – 200% improvements is great, but a real focus on the few things that can explode out of control will serve you better in the long run.

Open Graph Brings SEO Opportunities to Facebook

Facebook’s push for Open Graph integration, where the “Like” button replaces the direct link, creates new opportunities for businesses to focus on Facebook’s search mechanism.  Some initial tests indicate that it is now possible to optimize for Facebook search, i.e. bringing SEO to Facebook.

Facebook Open Graph allows one to connect a site to Facebook without full integration, simply using the new Facebook meta tags and the Like button (a snippet of Javascript code).  Facebook tracks these likes to build a “graph” of the Internet based upon the recommendations of your social network, and now includes relevant results when you search Facebook for something.

This creates an opportunity for companies that are bringing their brand to Facebook to get additional exposure through Open Graph, which creates an incentive to use the technology.  This is exciting, as the move to the “walled garden” of social media threatened to disrupt the open world of search.  In the past, users could recommend a page on their blog, creating a “link graph” for the search engines to use, but now it’s easier to just click “share” and send the link out to your friends that way.  Without this part of the link graph, the search engines are missing out on the recommendations that they build their systems upon.

Open Graph brings out the ability to restore this, even if Facebook is the only company taking advantage of it for now.

Social Networking Across the World

Social networking is amongst the hottest topics of the past few years, but while our US-centric media has focused on Twitter (popular in urban areas) and Facebook (popular throughout the western world), the growth of social media is universal.  Here is a country-by-country map of the dominant social network in each country.  Interestingly, Google’s Orkut has a small presence in Latin America, and it is likely that Facebook will overtake it in 2010.  This has the added benefit of dispelling the notion that Google’s web dominance in any way approaches Microsoft’s desktop dominance in the late 1990s.

Most interesting to me, the map looks increasingly like a Cold War map, with NATO/Facebook squaring off against Warsaw Pact/VKontakte, although the division is less political and more language-based.  Proper Cyrillic support is necessary in Russia and areas with high numbers of Russian speakers.  As communication morphs, email is increasingly the dominant business medium and social media the dominant personal medium, which makes sense: personal communications follow personal friendships, while business relies on the real-time communication the academic system was originally developed for.

As communication increases, the world becomes increasingly similar, and a US-centric view of your potential user base may look myopic as more and more of the world starts to look similar.  If you can sell in the US via Facebook, you can sell across Europe via Facebook, as long as you adapt to the social and language conventions of your target country.

Beyond Relational Databases: Is Your Data Relational?

One of the strangest things about technology is how it moves in circles.  The relational database isn’t new technology, and while the storage model and the performance of these systems have changed greatly, the underlying concept is the same.  The leading databases, except for Oracle, all bear SQL in the name, giving the impression that SQL is critical to the concept of the relational database, rather than merely a front-end language for describing access to relational data.

Web sites fit nicely into a relational model.  They have categories, articles, products, etc.: sets of data.  The idea of applying set theory to data is at the core of the relational database.  I can quickly and easily get all Articles in the Category of SEO, because those fields are tagged, and I simply pull the appropriate subset.  You can always get intersections (with JOINs), unions, set differences (EXCEPT), and other set operations… if you are using sets of data.
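In SQL terms, with an illustrative articles/categories schema:

```sql
-- Subset: all articles in the SEO category, via a join table.
SELECT a.*
  FROM articles a
  JOIN article_categories ac ON ac.article_id = a.id
  JOIN categories c          ON c.id = ac.category_id
 WHERE c.name = 'SEO';

-- Set difference: articles in category 1 but not category 2.
SELECT article_id FROM article_categories WHERE category_id = 1
EXCEPT
SELECT article_id FROM article_categories WHERE category_id = 2;
```

Each query is just a set operation over tagged rows, which is exactly the workload a relational engine is optimized for.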

Martin Kleppmann asks, on Carsonified, should you go beyond relational databases?  That’s the wrong question to ask.  The question is, “Is your data relational?”  If you have groupings of like data, then you need a relational database.  If you are building an application with non-relational data, then storing it in a database just to get a quick id lookup is foolish, and you should be looking for persistent data storage that is optimized for that sort of data.

For temporary storage, a system like memcached is perfect: it gives you lightning-fast references to data that may only exist temporarily.  For long-term storage, maybe a database is your answer, or maybe you need something more tied to your data structure.  We wouldn’t suggest Microsoft switch from its DOC format (and the DOCX XML version) to relational databases, but I wouldn’t put relational data into something more object-oriented.  You might use objects to represent it in memory for easier programming, but if the data is essentially relational, keep it in relations.

Data structures are at the core of computer science.  With all the free information out there, there is no excuse to be building a large scale system without knowing the basics.  The fact that Twitter built their operation without knowing what they were doing doesn’t mean that everyone can… Bill Gates dropped out of Harvard and made a fortune, not every Harvard drop-out is so successful.

PostgreSQL Cascading: Updates and Deletes

Something nice about a real database is cascading values, most commonly used for deletion, but you can use them for updates as well.  Let me give you a scenario: I track groups of data from my clients by a sub_id field.  As they add groups, their ids aren’t in a contiguous range.  If I wanted to consolidate them, I could update the sub_id (to somewhere that won’t be stomped on by the sequence), but what about historical data?  Barring a cascade rule, PostgreSQL will stop the update.  A non-relational database like MySQL will just leave orphaned foreign keys.  If you set the constraints to cascade, they come along for the ride.

Consider the following example:

CREATE TABLE A (
    a_id serial PRIMARY KEY,
    letter char);

CREATE TABLE B (
    b_id serial PRIMARY KEY,
    a_id int REFERENCES A (a_id) ON UPDATE CASCADE ON DELETE CASCADE,
    letter char);

INSERT INTO A (letter) VALUES ('a'), ('b'), ('c'), ('d'), ('e');

INSERT INTO B(a_id, letter) VALUES (1, 'a'), (2, 'b'), (3, 'c'), (4, 'd'), (5, 'e');

a_id | letter
1 | a
2 | b
3 | c
4 | d
5 | e

b_id | a_id | letter
1 | 1 | a
2 | 2 | b
3 | 3 | c
4 | 4 | d
5 | 5 | e

DELETE FROM A WHERE a_id > 3;
-- We delete a few rows from the parent table A, and PostgreSQL magically cascades the deletion to B

a_id | letter
1 | a
2 | b
3 | c

b_id | a_id | letter
1 | 1 | a
2 | 2 | b
3 | 3 | c

UPDATE A set a_id = a_id + 3;
-- This is the scenario described above: renumbering the ids

a_id | letter
4 | a
5 | b
6 | c

b_id | a_id | letter
1 | 4 | a
2 | 5 | b
3 | 6 | c

So we were able to renumber A and have the values propagate to B without ever touching B.  Pretty cool, huh?

Tweetdeck Enhances Facebook, Adds Myspace

If you are playing with social media, you’re aware of TweetDeck, the Twitter-centric client that helps organize the mass chaos Twitter can devolve into.  If you are just updating your friends on your comings and goings, particularly via SMS, ignore TweetDeck; but if you are monitoring and participating in far-ranging online conversations, TweetDeck forms the center of it.

Custom searches let you monitor stories and discussions, and with the new version, the directory makes it easy to add discussions and other topics.  TweetDeck already supported Facebook status updates, the original system that Twitter appeared to copy and enhance, but now TweetDeck integrates with Facebook for tracking all sorts of information.  TweetDeck is also adding MySpace support, the popular service that seems buzz-free but still has many active users.

Mashable also seems to be big fans of this TweetDeck upgrade.

Mobile Web Like Web in 90s (Usability)

Usability is generally ignored on the web today, not because it isn’t a big deal, but because the “common” design patterns are all reasonably usable.  Users are comfortable with the interface, and nobody does anything remarkably stupid.  In the late 90s and early 2000s, that wasn’t the case.

Today, the mobile web is the talk of the industry, and apparently we have the same usability problems that we had 10 years ago…  While users have an 80% success rate attempting a task on the web from their computer, it drops to 59% on their phone.

“Observing users suffer during our  … sessions reminded us of the very first usability studies we did with traditional websites in 1994,” writes Jakob Nielsen (free plug: I found this article on his website, Use It).  Indeed, the Web 2.0 “design strategy” of two columns instead of three, the most common operation front and center, and large fonts shows that the Web 2.0 “revolution” largely involved Flash being replaced with sensible Javascript and designers finally listening to usability guidelines, whether intentionally or accidentally.

The oddest thing about the computer/IT industry is that it doesn’t maintain institutional knowledge or learn from the past.  When basic web forms were decried as a throwback to the 3270 mainframe model, you would think the old mainframe hands would be considered experts, but in an industry where 18-25 year olds can be productive, there is little interest in expertise.  As the mobile web becomes more and more important, usability may make the difference between success and failure.  The idea that I should go to my computer to check a map seems as ludicrous as the idea that I should use the phone book!

Demographics of Twitter — Teens Catching Up

The latest demographics show Twitter usage amongst teenagers catching up with older demographics.  The service still dominates in the 35 – 54 year old segment, which makes sense: mid-career professionals with nobody looking over their shoulder at the office all day (literally, they are more likely to have an office or at least a large cubicle) make more use of a tool that requires constant connectivity.  However, teenagers are slowly taking more interest in Twitter, which seems odd to those who assume that technology is most often adopted by the young and moves up.  In Twitter’s case, it captured the Blackberry-addict demographic, not the texting-on-a-phone demographic it aimed for.  The comScore blog entry shows this with some lovely charts.

Twitter started by assuming that you’d want to update your close circle of friends on your goings-on.  When Twitter hit the scene, my friends in urban areas on the coasts jumped on it to tell everyone what they were doing socially.  The teenage demographic doesn’t WANT to publish everything publicly, at least not where their parents and/or teachers can find it.  Myspace offered teenagers tremendous room for self-expression, while Facebook focused on the college (and later high school and young professional) markets of dating and social connection: college students keeping in touch with high school friends, and so on.

Interestingly, Twitter is now much better integrated with other parts of the web, making it a more useful tool for this demographic.  One of my high school classmates posted on Facebook that we should follow her on Twitter, as she isn’t on Facebook much anymore; whether this is inevitable or a function of Facebook chasing Twitter and de-emphasizing what made it originally popular remains to be seen.  The old core of Facebook, finding old friends and reconnecting, or sharing college experiences with friends across the country, seems to have been supplanted in a barrage of data.  Facebook knocked Myspace off as leader by offering a clean and easy to use interface, but when it started fighting Twitter for buzz, the news feed stopped being about sharing photographs and became more about comments on statuses and wall posts, making it more and more a poor impersonation of Twitter.  If you want statuses and comments, Twitter’s world of feeds and mentions is a far cleaner interface than Facebook’s increasingly cluttered system.

Teenagers either have a close social circle they stay in touch with, or are looking for ways to break out of the social world they inhabit during the day.  A service that wants to reach them needs to offer one (or both) of those options.  Twitter offers teenagers the ability to aggregate the information flow that interests them, and its increasing integration with other aspects of the web makes it more interesting.  When I was in high school, BBSes were the online way to communicate; by college, ICQ and later AIM became the online social center.  As Twitter takes over that portion of the mindspace, its relevance to that group increases.  The idea that my instant messages would be published on a website (even with the distinction between direct messages and public ones) seems odd to me, but AIM seemed odd to email/USENET users before us.