Author Archives: Derek Slater

Journalists’ 3 worst SEO misconceptions

Photo: Brandon Grassley (Creative Commons)

Dear Colleagues:

I’m terribly sorry that sleazy quick-buck artists shanghaied the term SEO during the last decade.

And I’m sorry that search algorithms are taking so long to mature. And that Demand Media extruded crap based on search data, then bottled it and called it “content”. And that your competitor’s two-bit glossary definition outranks your six-part magnum opus on the same topic, and that nitwits running Scrapebox and Xrumer are flooding your personal blog with fake comments and spammy links.


Okay. Apology accepted?

Because I’d like to move you along to a healthy, productive relationship with web searchers.

Sadly, the dysfunction persists in the form of these three SEO myths that journalists (STILL!) love to repeat.

1. SEO is “writing for computers”.

C’mon, man. You’re writing for people, and some of them use search engines to find information.

Stop with the knee-jerk resistance and think a little. This isn’t complicated.

2. It’s all about the headline.

Paradox: In spite of these lingering misconceptions, an informal survey of B2B journalists finds that we nearly all think we’re SEO experts. Almost every damn one of us. (Or rather, we *were* all SEO experts – you know, before we became social media experts instead….)

Dig a little further, though. Ask journalists what tasks they actually perform for SEO purposes. Let’s keep score at home:

0 points: “I write SEO-friendly headlines and repeat the same keywords in the deck.”

3 points: “I use the keywords creatively and appropriately in subheads, image captions, image file names, sidebars, pullquotes, bold text, numbered sequences and steps, Amazon affiliate boxes, and more. I use these devices to make the story more reader-friendly as well as more searchable.”

5 points: “I use Google Trends [or WordTracker or similar] to research common search phrases and trends before I start reporting.”

Many of our SEO expert journalists are going to score zero points.

Maybe they’re better at social media.

3. Search traffic is crap traffic.

Really? Think Google brings you a bunch of drive-bys?

Have you looked at your analytics to confirm this? Wonderful place, the web – lotsa measurement going on.

On almost every media site I’ve seen, search visitors are MORE engaged on average than the site’s visitors overall.

Lower bounce rate, higher pages per visit, higher time-on-site.

And that’s been true even on sites where the editorial team isn’t paying attention to the powerful, wonderful advantage of knowing precisely which topic interests a given visitor.

Imagine how well you could serve the information needs of these visitors – if you’d just stop ignoring search and SEO!

Online Content Marketing in 30 Minutes has lots of suggestions about integrating search data into editorial planning. Buy it! You’ll love it!

3 content marketing analytics blunders

When you want to know how your content marketing program is working, of course you turn to your trusty web analytics software. Omniture, Google Analytics and their ilk can tell you a lot about your content’s effectiveness and progress. Or lack thereof.

But you have to ask your software the right questions.

As a web content creator for 12 years, I have personally made a wide and creative variety of analytics mistakes. And my friends and associates on other sites have made the rest.

Here are three simple web analytics mistakes you can stop making today.

1. Focusing on the wrong view.

On a content-rich website, the Google Analytics view that you’ll use most consistently is

Content > Site Content > All Pages

That’s the RIGHT view.

Managing B2B media websites, I look at this view every day. That’s how I find out which content is resonating with the audience. Watching this page-by-page view shows you the story topics and types that work for your readers.

Other, higher-level metrics of website performance are important – of course they are! Bounce rate matters. Time on site matters. Pages per visit matters.

But as my colleague Art Jahnke and I used to say to each other (with mock exasperation), “Staring at the numbers doesn’t make them go up.”

You have to drill down to find actionable information, just as looking at the score of a football game doesn’t tell you what trades the Cowboys need to make.

So don’t burn too much time staring at the Google Analytics landing page for your site. That’s the WRONG view. Live in the content view, where the “actionable” is.

2. Never changing the timeframe.

True story: In the early days of a B2B security website I managed, I started a blog called Movers and Shakers. As you would guess, Movers covered career advancement and job changes by security department leaders.

This being the pre-LinkedIn era, it took a surprising amount of time and a LOT of web searching to feed this blog.

We were also in the Pleistocene era of analytics: we got just one report each month, showing the top twenty articles on the site. Movers always hung around the bottom of this list.

After a couple of years, with a site redesign impending, and the blog experiencing this persistently mediocre level of traffic, I gave it up. Left the blog to rot. Bigger fish to fry, had I.

Some time later I was given my own analytics account. I expanded the date range and looked at all site traffic over the preceding three years.

Guess what blog was the #1 driver of web readership on the site?

Moral: Once in a while take the long-term view. It may change your understanding of the value of different pages and topics on your site.
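The effect in that story can be reproduced with a toy example (the monthly pageview figures below are hypothetical, not from the site in question): a steady, never-spectacular blog can out-draw a one-time hit once you widen the date range.

```python
# Toy illustration with made-up numbers: steady traffic vs. a one-time spike,
# summed over a three-year (36-month) window.
movers = [950] * 36                 # modest every month, never a monthly top story
big_feature = [5000] + [100] * 35   # one huge month, then a long tail of almost nothing

print(sum(movers))       # 34200 -- the "mediocre" blog wins over the long haul
print(sum(big_feature))  # 8500
```

In any given month, the spiky story looks like the winner; cumulatively, the quiet workhorse dominates.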

3. Comparing last month against the ‘preceding six-month rolling average’ (and other Stupid Math Tricks).

Most sites generate traffic reports to show that they are headed in the right direction. Then most of those same sites compare this month’s traffic against last month’s traffic.

Most of the time, this comparison borders on pointless. On most B2B sites, December stats will be worse than November’s, which in turn will be worse than October’s – a longer month with fewer holiday disruptions. Conversely, lots of B2C sites will spike upward in November as the gift shopping begins in earnest.

July is usually worse than June because summer vacations have started. And so on.

The this-month-versus-last-month comparison has never interested me. And I’ve seen reports that attempt to “normalize” for these seasonal variations by comparing one month to the average of the preceding six months.

These are stupid math tricks.

What I compare December 2013 against is the only timeframe that takes all those factors into account: December 2012. Year-over-year tells me whether I am growing my site or not – and I damn well better be, because I’ve spent a year’s worth of learning, sharpening, and creating additional content in between.

For a brand-new content operation, obviously you can’t make this comparison yet. But starting in your 13th month, I highly recommend adding it to your analytics and reporting mix. It’s the best indicator of whether you are making real progress on the traffic front.
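As a quick sketch with made-up pageview totals, the two comparisons can point in opposite directions for exactly the seasonal reasons described above:

```python
# Hypothetical monthly pageview totals, for illustration only.
monthly_pageviews = {
    ("2012", "Dec"): 80_000,
    ("2013", "Nov"): 110_000,
    ("2013", "Dec"): 96_000,
}

def pct_change(new, old):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Month-over-month: December vs. November 2013 (distorted by the holiday dip).
mom = pct_change(monthly_pageviews[("2013", "Dec")], monthly_pageviews[("2013", "Nov")])

# Year-over-year: December 2013 vs. December 2012 (same seasonal conditions).
yoy = pct_change(monthly_pageviews[("2013", "Dec")], monthly_pageviews[("2012", "Dec")])

print(f"Month-over-month: {mom:+.1f}%")  # -12.7%: looks like decline
print(f"Year-over-year:   {yoy:+.1f}%")  # +20.0%: the site is actually growing
```

Same December, two very different stories – which is why the year-over-year number is the one worth reporting.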

Every good metric deserves a counterbalancing metric

An Acme Co. call center once compensated each sales agent based solely on the number of units that agent shipped.

“Sam” gradually emerged as the star agent. Sam closed a deal with nearly every caller the system routed his way. And so Sam reeled in a nice bonus check every quarter. Meanwhile, Acme’s quality assurance folks looked at the sales numbers and monitored the calls of their less effective agents in an effort to improve (or fire) the underperformers.

This is a pretty clear-cut win for Acme, right? They incented and retained their best performer.

Well, no. As you might guess from the title of this post, the truth is that the company was actually losing money on Sam’s outstanding performance.

Acme didn’t realize this until they started monitoring *Sam’s* calls to see what effective techniques they could pass along to other agents. It turned out Sam’s main technique was a very customer-friendly reliance on the company’s ‘satisfaction guaranteed – return the product if you’re not happy’ guarantee. Sam’s go-to sales closer was “Go ahead and try it with no risk – you can return it at no cost!” And as a result, Sam’s customers had a very, very high return rate.

You could blame the quality assurance operation for initially failing to check Sam’s work. But the root cause was an incentive system based only on one metric – units shipped. A better system would have balanced that metric with others: returns, customer satisfaction ratings, etc.

How does this apply to content marketing?

You might measure the success of your content marketing effort based on the bottom line – new leads or new customers. Unfortunately, while that number may tell you whether “it’s working” or not, it won’t give you enough information to tweak your processes for improvement.

Similarly, you might measure success based on a standard web metric like page views or unique visits – many online media companies do this. But how many of those visits or visitors are converting into customers?

Focusing on a single metric tends to skew the behavior of the people who are tasked with hitting the goals.

Instead, for a healthy and sustainable content marketing initiative, choose an overall goal and then identify at least two or three metrics that are pertinent to achievement of that goal.

Look for metrics that balance each other.

For example, if one of your metrics is page views, balance it with an equal focus on bounce rate – the share of visitors who view only a single page and then leave the site. This creates an incentive to produce content that grows a long-term audience with affinity for your site (and thus puts you at the top of their list when they’re ready to purchase).
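A minimal sketch of that pairing, using hypothetical figures: read page-view growth only alongside what bounce rate did over the same period, so a Sam-style “win” on one metric gets flagged by the other.

```python
# Sketch: judging traffic growth alongside a counterbalancing metric.
# All numbers below are hypothetical, for illustration.
def traffic_health(page_views, bounce_rate, prev_page_views, prev_bounce_rate):
    """Flag growth that comes at the cost of engagement."""
    views_up = page_views > prev_page_views
    bounce_worse = bounce_rate > prev_bounce_rate
    if views_up and not bounce_worse:
        return "healthy growth"
    if views_up and bounce_worse:
        return "growth, but engagement is slipping -- investigate"
    return "no growth"

print(traffic_health(120_000, 0.55, 100_000, 0.60))  # healthy growth
print(traffic_health(120_000, 0.70, 100_000, 0.60))  # growth, but engagement is slipping -- investigate
```

The point isn’t this particular rule; it’s that no single number gets to declare victory on its own.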

There’s much more discussion of goals and metrics in chapters 1 and 8 of our book Online Content Marketing in 30 Minutes.