8 Ways Content Marketers Can Hack Facebook Multi-Product Ads

Posted by Alan_Coleman

The trick most content marketers are missing

Creating great content is the first half of success in content marketing. Getting quality content read by, and amplified to, a relevant audience is the oft overlooked second half of success. Facebook can be a content marketer’s best friend for this challenge. For reach, relevance and amplification potential, Facebook is unrivaled.

  1. Reach: 1 in 6 mobile minutes on planet earth is somebody reading something on Facebook.
  2. Relevance: Facebook is a lean, mean interest- and demo-targeting machine. No other online or offline medium owns as much juicy interest and demographic information on its audience, and certainly none has allowed advertisers to utilise this information as effectively as Facebook has.
  3. Amplification: Facebook is literally built to encourage sharing. Here are the first 10 words of its mission statement: “Facebook’s mission is to give people the power to share…” Enough said!

Because of these three digital marketing truths, if a content marketer gets their paid promotion* right on Facebook, the battle for eyeballs and amplification is already won.

For this reason it’s crucial that content marketers keep a close eye on Facebook advertising innovations and seek out ways to use them in new and creative ways.

In this post I will share with you eight ways we’ve hacked a new Facebook ad format to deliver content marketing success.

Multi-Product Ads (MPAs)

In 2014, Facebook unveiled multi-product ads (MPAs) for US advertisers; they arrived in Europe earlier this year. They allow retailers to show multiple products in a carousel-type ad unit.

They look like this:

If the user clicks on the featured product, they are guided directly to the landing page for that specific product, from where they can make a purchase.

You could say MPAs are Facebook’s answer to Google Shopping.

Facebook’s mistake is a content marketer’s gain

I believe Facebook has misunderstood how people want to use its social network, and the transaction-focused format is OK at best for selling products. People aren’t really on Facebook to hit the “buy now” button. I’m a daily Facebook user and I can’t recall a time this year when I have gone directly from Facebook to an e-commerce website and transacted. Can you remember a recent time when you did?

So, this isn’t an innovation that removes a layer of friction from something that we are all doing online already (as the most effective innovations do). Instead, it’s a bit of a “hit and hope” that, by providing this functionality, Facebook would encourage people to try to buy online in a way they never have before.

The Wolfgang crew felt the MPA format would be much more useful to marketers and users if they were leveraging Facebook for the behaviour we all demonstrate on the platform every day, guiding users to relevant content. We attempted to see if Facebook Ads Manager would accept MPAs promoting content rather than products. We plugged in the images, copy and landing pages, hit “place order”, and lo and behold the ads became active. We’re happy to say that the engagement rates, and more importantly the amplification rates, are fantastic!

Multi-Content Ads

We’ve re-invented the MPA format for multi-advertisers in multi-ways; eight ways, to be exact! Here are eight MPA hacks that have worked well for us. All eight use the MPA format to promote content rather than products.

Hack #1: Multi-Package Ads

Our first variation wasn’t a million miles away from multi-product ads; we were promoting the various packages offered by a travel operator.

By looking at the number of likes, comments, and shares (in blue below the ads) you can see the ads were a hit with Facebook users and they earned lots of free engagement and amplification.

NB: If you have selected “clicks to website” as your advertising objective, all those likes, comments and shares are free!

Independent Travel Multi Product Ad

The ad sparked plenty of conversation amongst Facebook friends in the comments section.

Comments on a Facebook MPA

Hack #2: Multi-Offer Ads

Everybody knows the Internet loves a bargain. So we decided to try another variation moving away from specific packages, focusing instead on deals for a different travel operator.

Here’s how the ads looked:

These ads got valuable amplification beyond the share. In the comments section, you can see people tagging specific friends. This led to the MPAs receiving further amplification, and a very targeted and personalised form of amplification to boot.

Abbey Travel Facebook Ad Comments

Word of mouth referrals have been a trader’s best friend since the stone age. These “personalised” word of mouth referrals en masse are a powerful marketing proposition. It’s worth mentioning again that those engagements are free!

Hack #3: Multi-Locations Ads

Putting the Lo in SOLOMO.

This multi-product feed ad was hacked to promote numerous locations of a waterpark. “Where to go?” is among the first questions somebody asks when researching a holiday. In creating this top of funnel content, we can communicate with our target audience at the very beginning of their research process. A simple truth of digital marketing is: the more interactions you have with your target market on their journey to purchase, the more likely they are to seal the deal with you when it comes time to hit the “buy now” button. Starting your relationship early gives you an advantage over those competitors who are hanging around the bottom of the purchase funnel hoping to make a quick and easy conversion.

Abbey Travel SplashWorld Facebook MPA

What was surprising here was that, because we were reaching people at the very beginning of their research journey, we expected the booking enquiries to be some time away. What actually happened was that these ads sparked an enquiry frenzy, as Facebook users could see other people enquiring and the holidays selling out in real time.

Abbey Travel comments and replies

In fact nearly all of the 35 comments on this ad were booking enquiries. This means what we were measuring as an “engagement” was actually a cold hard “conversion”! You don’t need me to tell you a booking enquiry is far closer to the money than a Facebook like.

The three examples outlined so far are for travel companies. Travel is a great fit for Facebook as it sits naturally in the Facebook feed; my feed is full of envy-inducing friends’ holiday pictures right now. Another reason travel works so well in Facebook ads is that there are typically multiple parties to a travel purchase. What happened here is that the comments section became a very visible and measurable forum for discussion between friends and family before becoming a stampede-inducing medium of enquiry.

So, stepping outside of the travel industry, how do other industries fare with hacked MPAs?

Hack #3a: Multi-Location Ads (combined with location targeting)

Location, location, location. For a property listings website, we applied location targeting and repeated our Multi-Location Ad format to advertise properties for sale to people in and around that location.

Hack #4: Multi-Big Content Ad

“The future of big content is multi platform”

– Cyrus Shepard

The same property website had produced a report and an accompanying infographic to provide their audience with unique and up-to-the-minute market information via their blog. We used the MPA format to promote the report, the infographic and the search rentals page of the website. This brought their big content piece to a larger audience via a new platform.

Rental Report Multi Product Ad

Hack #5: Multi-Episode Ad

This MPA hack was for an online TV player. As you can see we advertised the most recent episodes of a TV show set in a fictional Dublin police station, Red Rock.

Engagement was high, opinion was divided.

TV3s Red Rock viewer feedback

LOL.

Hack #6: Multi-People Ads

In the cosmetic surgery world, past patients’ stories are valuable marketing material. Particularly when the past patients are celebrities. We recycled some previously published stories from celebrity patients using multi-people ads and targeted them to a very specific audience.

Avoca Clinic Multi People Ads

Hack #7: Multi-UGC Ads

Have you witnessed the power of user-generated content (UGC) in your marketing yet? We’ve found interaction rates with authentic UGC images can be up to 10 times those of the usual stylised images. In order to encourage further UGC, we posted a number of customers’ images in our Multi-UGC Ads.

The CTR on the above ads was 6% (2% is the average CTR for Facebook News feed ads according to our study). Strong CTRs earn you more traffic for your budget. Facebook’s relevancy score lowers your CPC as your CTR increases.

When it comes to the conversion, UGC is a power player; we’ve learned that “customers attracting new customers” is a powerful acquisition tool.

Hack #8: Target past customers for amplification

“Who will support and amplify this content and why?”

– Rand Fishkin

Your happy customers, Rand; that’s the who and the why! Check out these Multi-Package Ads targeted to past customers via custom audiences. The Camino walkers have already told all their friends about their great trip; now allow them to share their great experiences on Facebook and connect the tour operator with their Facebook friends via a valuable word of mouth referral. Just look at the ratios of shares:likes and shares:comments. Astonishingly sharable ads!

Camino Ways Multi Product Ads

Targeting past converters in an intelligent manner is a super smart way to find an audience ready to share your content.

How will hacking Multi-Product Ads work for you?

People don’t share ads, but they do share great content. So why not hack MPAs to promote your content and reap the rewards of the world’s greatest content sharing machine: Facebook.

MPAs allow you to tell a richer story by allowing you to promote multiple pieces of content simultaneously. So consider which pieces of content you have that will work well as “content bundles” and who the relevant audience for each “content bundle” is.

As Hack #8 above illustrates, the big wins come when you match a smart use of the format with the clever and relevant targeting Facebook allows. We’re massive fans of custom audiences so if you aren’t sure where to start, I’d suggest starting there.

So ponder your upcoming content pieces, consider your older content you’d like to breathe some new life into and perhaps you could become a Facebook Ads Hacker.

I’d love to hear about your ideas for turning Multi-Product Ads into Multi-Content Ads in the comments section below.

We could even take the conversation offline at Mozcon!

Happy hacking.


*Yes, I did say paid promotion; it’s no secret that Facebook’s organic reach continues to dwindle. The cold commercial reality is you need to pay to play on Facebook. The good news is that if you select ‘website clicks’ as your objective you only pay for website traffic and engagement, while amplification by likes, comments, and shares is free! Those website clicks you pay for are typically substantially cheaper than AdWords, Taboola, Outbrain, Twitter or LinkedIn. How does it compare to display? It doesn’t. Paying for clicks is always preferable to paying for impressions. If you are spending money on display advertising I’d urge you to fling a few spondoolas towards Facebook ads and compare results. You will be pleasantly surprised.


How to Rid Your Website of Six Common Google Analytics Headaches

Posted by amandaecking

I’ve been in and out of Google Analytics (GA) for the past five or so years agency-side. I’ve seen three different code libraries, dozens of new different features and reports roll out, IP addresses stop being reported, and keywords not-so-subtly phased out of the free platform.

Analytics has been a focus of mine for the past year or so—mainly, making sure clients get their data right. Right now, our new focus is closed loop tracking, but that’s a topic for another day. If you’re using Google Analytics, and only Google Analytics for the majority of your website stats, or it’s your primary vehicle for analysis, you need to make sure it’s accurate.

Not having data pulling in or reporting properly is like building a house on a shaky foundation: It doesn’t end well. Usually there are tears.

For some reason, a lot of people, including many of my clients, assume everything is tracking properly in Google Analytics… because Google. But it’s not Google who sets up your analytics. People do that. And people are prone to make mistakes.

I’m going to go through six scenarios where issues are commonly encountered with Google Analytics.

I’ll outline the remedy for each issue, and in the process, show you how to move forward with a diagnosis or resolution.

1. Self-referrals

This is probably one of the areas we’re all familiar with. If you’re seeing a lot of traffic from your own domain, there’s likely a problem somewhere—or you need to extend the default session length in Google Analytics. (For example, if you have a lot of long videos or music clips and don’t use event tracking; a website like TEDx or SoundCloud would be a good equivalent.)

Typically one of the first things I’ll do to help diagnose the problem is include an advanced filter to show the full referrer string. You do this by creating a filter, as shown below:

Filter Type: Custom filter > Advanced
Field A: Hostname
Extract A: (.*)
Field B: Request URI
Extract B: (.*)
Output To: Request URI
Constructor: $A1$B1

You’ll then start seeing the subdomains pulling in. Experience has shown me that if you have a separate subdomain hosted in another location (say, if you work with a separate company and they host and run your mobile site or your shopping cart), it gets treated by Google Analytics as a separate domain. Thus, you’ll need to implement cross-domain tracking. This way, you can narrow down whether or not it’s one particular subdomain that’s creating the self-referrals.

In this example below, we can see all the revenue is being reported to the booking engine (which ended up being cross domain issues) and their own site is the fourth largest traffic source:

It’s also a good idea to check the browser and device reports to start narrowing down whether the issue is specific to a particular element. If it’s not, keep digging. Look at the pages pulling in the self-referrals and go through the code with a fine-tooth comb, drilling down as much as you can.
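If you prefer to spot-check this outside the GA interface, here is a minimal sketch that scans a referral export for sources that are really your own hostnames. The file name, column names, and hostnames below are placeholders for illustration; adjust them to match your own export.

```python
# Minimal sketch: scan a GA "Referrals" CSV export for sources that are
# actually your own hostnames (i.e., self-referrals).
# Column names and hostnames are placeholders; adjust to your export.
import csv

OWN_HOSTNAMES = {"example.com", "m.example.com", "cart.example.com"}

def find_self_referrals(csv_path):
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            source = row.get("Source", "").strip().lower()
            if any(source == h or source.endswith("." + h) for h in OWN_HOSTNAMES):
                print(f"Self-referral from {source} ({row.get('Sessions', '?')} sessions)")

find_self_referrals("referrals.csv")
```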

2. Unusually low bounce rate

If you have a crazy-low bounce rate, it could be too good to be true. Unfortunately. An unusually low bounce rate could (and probably does) mean that at least some pages of your website have the same Google Analytics tracking code installed twice.

Take a look at your source code, or use Google Tag Assistant (though it does have known bugs) to see if you’ve got GA tracking code installed twice.

While I tell clients that having Google Analytics installed twice on the same page can lead to double the pageviews, I’ve not actually encountered that—I usually just say it to scare them into removing the duplicate implementation more quickly. Don’t tell on me.
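If you’d rather script the spot check than eyeball the source, the rough sketch below fetches a page and counts the GA property IDs and pageview calls it finds; more than one of either usually points to a duplicate installation. The URL is a placeholder, and a Google Tag Manager deployment won’t necessarily expose the ID inline, so treat this as a complement to Tag Assistant, not a replacement.

```python
# Rough sketch: count GA property IDs and pageview calls in a page's HTML.
# More than one of either usually means the tracking code is installed twice.
import re
import urllib.request

def check_duplicate_ga(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    property_ids = set(re.findall(r"UA-\d{4,10}-\d{1,4}", html))
    pageview_calls = re.findall(
        r"ga\(\s*['\"]send['\"]\s*,\s*['\"]pageview['\"]", html)
    print(f"{url}: {len(property_ids)} property ID(s), "
          f"{len(pageview_calls)} ga('send', 'pageview') call(s)")
    print("IDs found:", sorted(property_ids))

check_duplicate_ga("https://www.example.com/")  # placeholder URL
```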

3. Iframes anywhere

I’ve heard directly from Google engineers and Google Analytics evangelists that Google Analytics does not play well with iframes, and that it will never play nice with this dinosaur technology.

If you track the iframe, you inflate your pageviews, plus you still aren’t tracking everything with 100% clarity.

If you don’t track across iframes, you lose the source/medium attribution and everything becomes a self-referral.

Damned if you do; damned if you don’t.

My advice: Stop using iframes. They’re Netscape-era technology anyway, with rainbow marquees and Comic Sans on top. Interestingly, and unfortunately, a number of booking engines (for hotels) and third-party carts (for ecommerce) still use iframes.

If you have any clients in those verticals, or if you’re in the vertical yourself, check with your provider to see if they use iframes. Or you can check for yourself, by right-clicking as close as you can to the actual booking element:

iframe-booking.png

There is no neat and tidy way to address iframes with Google Analytics, and usually iframes are not the only complicated element of setup you’ll encounter. I spent eight months dealing with a website on a subfolder, which used iframes and had a cross domain booking system, and the best visibility I was able to get was about 80% on a good day.

Typically, I’d approach diagnosing iframes (if, for some reason, I had absolutely no access to viewing a website or talking to the techs) similarly to diagnosing self-referrals, as self-referrals are one of the biggest symptoms of iframe use.

4. Massive traffic jumps

Massive jumps in traffic don’t typically just happen. (Unless, maybe, you’re Geraldine.) There’s always an explanation—a new campaign launched, you just turned on paid ads for the first time, you’re using content amplification platforms, you’re getting a ton of referrals from that recent press in The New York Times. And if you think it just happened, it’s probably a technical glitch.

I’ve seen everything from inflated pageviews result from including tracking on iframes and unnecessary implementation of virtual pageviews, to not realizing the tracking code was installed on other microsites for the same property. Oops.

Usually I’ve seen this happen when the tracking code was somewhere it shouldn’t be, so if you’re investigating a situation of this nature, first confirm the Google Analytics code is only in the places it needs to be. Tools like Google Tag Assistant and Screaming Frog can be your BFFs in helping you figure this out.

Also, I suggest bribing the IT department with sugar (or booze) to see if they’ve changed anything lately.

5. Cross-domain tracking

I wish cross-domain tracking in Google Analytics worked out of the box, with no additional setup required. But it doesn’t.

If you don’t have it set up properly, things break down quickly, and can be quite difficult to untangle.

The older the GA library you’re using, the harder it is. The easiest setup, by far, is Google Tag Manager with Universal Analytics. Hard-coded universal analytics is a bit more difficult because you have to implement autoLink manually and decorate forms, if you’re using them (and you probably are). Beyond that, rather than try and deal with it, I say update your Google Analytics code. Then we can talk.

Where I’ve seen the most murkiness with tracking is when parts of cross domain tracking are implemented, but not all. For some reason, if allowLinker isn’t included, or you forget to decorate all the forms, the cookies aren’t passed between domains.

The absolute first place I would start with this would be confirming the cookies are all passing properly at all the right points, forms, links, and smoke signals. I’ll usually use a combination of the Real Time report in Google Analytics, Google Tag Assistant, and GA debug to start testing this. Any debug tool you use will mean you’re playing in the console, so get friendly with it.
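One additional sanity check I find useful before digging into cookies: confirm that each domain’s analytics.js snippet actually includes the linker settings. The sketch below uses placeholder domains; if the tracking is deployed through Google Tag Manager, these strings won’t appear inline and you’ll need to check the container configuration instead.

```python
# Sketch: check that each domain's inline Universal Analytics snippet includes
# the settings cross-domain tracking needs (allowLinker and linker:autoLink).
import urllib.request

PAGES = [
    "https://www.example.com/",             # main site (placeholder)
    "https://booking.example-engine.com/",  # third-party booking engine (placeholder)
]

for url in PAGES:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    print(url)
    print("  allowLinker present:    ", "allowLinker" in html)
    print("  linker:autoLink present:", "linker:autoLink" in html)
```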

6. Internal use of UTM strings

I’ve saved the best for last. Internal use of campaign tagging. We may think: I use UTM strings to tag my campaigns externally, and we’ve got this new promotion on site which we’re using a banner ad for. That’s a campaign. Why don’t I tag it with a UTM string?

Step away from the keyboard now. Please.

When you tag internal links with UTM strings, you override the original source/medium. So that visitor who came in through your paid ad and then who clicks on the campaign banner has now been manually tagged. You lose the ability to track that they came through on the ad the moment they click on the tagged internal link. Their source and medium is now your internal campaign, not that paid ad you’re spending gobs of money on and have to justify to your manager. See the problem?

I’ve seen at least three pretty spectacular instances of this in the past year, and a number of smaller instances of it. Annie Cushing also talks about the evils of internal UTM tags and the odd prevalence of it. (Oh, and if you haven’t explored her blog, and the amazing spreadsheets she shares, please do.)

One clothing company I worked with tagged all of their homepage offers with UTM strings, which resulted in the loss of visibility for one-third of their audience: One million visits over the course of a year, and $2.1 million in lost revenue.

Let me say that again. One million visits, and $2.1 million. That couldn’t be attributed to an external source/campaign/spend.

Another client I audited included campaign tagging on nearly every navigational element on their website. It still gives me nightmares.

If you want to see if you have any internal UTM strings, head straight to the Campaigns report in Acquisition in Google Analytics, and look for anything like “home” or “navigation” or any language you may use internally to refer to your website structure.

And if you want to see how users are moving through your website, go to the Flow reports. Or if you really, really, really want to know how many people click on that sidebar link, use event tracking. But please, for the love of all things holy (and to keep us analytics lovers from throwing our computers across the room), stop using UTM tagging on your internal links.
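One way to catch tagged internal links before they ever pollute your reports is to crawl your own pages and flag any internal link carrying a campaign parameter. A small sketch, with a placeholder URL:

```python
# Sketch: flag internal links that carry utm_ parameters.
# Internal links should never be campaign-tagged.
import re
import urllib.request
from urllib.parse import urljoin, urlparse, parse_qs

def find_tagged_internal_links(page_url):
    site_host = urlparse(page_url).netloc
    html = urllib.request.urlopen(page_url).read().decode("utf-8", errors="ignore")
    for href in re.findall(r'href=["\'](.+?)["\']', html):
        absolute = urljoin(page_url, href)
        parsed = urlparse(absolute)
        if parsed.netloc == site_host and any(
                key.startswith("utm_") for key in parse_qs(parsed.query)):
            print("Campaign-tagged internal link:", absolute)

find_tagged_internal_links("https://www.example.com/")  # placeholder URL
```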

Now breathe and smile

Odds are, your Google Analytics setup is fine. If you are seeing any of these issues, though, you have somewhere to start in diagnosing and addressing the data.

We’ve looked at six of the most common points of friction I’ve encountered with Google Analytics and how to start investigating them: self-referrals, bounce rate, iframes, traffic jumps, cross domain tracking and internal campaign tagging.

What common data integrity issues have you encountered with Google Analytics? What are your favorite tools to investigate?


Has Google Gone Too Far with the Bias Toward Its Own Content?

Posted by ajfried

Since the beginning of SEO time, practitioners have been trying to crack the Google algorithm. Every once in a while, the industry gets a glimpse into how the search giant works and we have opportunity to deconstruct it. We don’t get many of these opportunities, but when we do—assuming we spot them in time—we try to take advantage of them so we can “fix the Internet.”

On Feb. 16, 2015, news started to circulate that NBC would start removing images and references of Brian Williams from its website.

This was it!

A golden opportunity.

This was our chance to learn more about the Knowledge Graph.

Expectation vs. reality

Often it’s difficult to predict what Google is truly going to do. We expect something to happen, but in reality it’s nothing like we imagined.

Expectation

What we expected to see was that Google would change the source of the image. Typically, if you hover over the image in the Knowledge Graph, it reveals the location of the image.

Keanu-Reeves-Image-Location.gif

This would mean that if the image disappeared from its original source, then the image displayed in the Knowledge Graph would likely change or even disappear entirely.

Reality (February 2015)

The only problem was, there was no official source (this changed, as you will soon see) and identifying where the image was coming from proved extremely challenging. In fact, when you clicked on the image, it took you to an image search result that didn’t even include the image.

Could it be? Had Google started its own database of owned or licensed images and was giving it priority over any other sources?

In order to find the source, we tried taking the image from the Knowledge Graph and “search by image” in images.google.com to find others like it. For the NBC Nightly News image, Google failed to even locate a match to the image it was actually using anywhere on the Internet. For other television programs, it was successful. Here is an example of what happened for Morning Joe:

Morning_Joe_image_search.png

So we found the potential source. In fact, we found three potential sources. Seemed kind of strange, but this seemed to be the discovery we were looking for.

This looks like Google is using someone else’s content and not referencing it. These images have a source, but Google is choosing not to show it.

Then Google pulled the ol’ switcheroo.

New reality (March 2015)

Now things changed and Google decided to put a source to their images. Unfortunately, I mistakenly assumed that hovering over an image showed the same thing as the file path at the bottom, but I was wrong. The URL you see when you hover over an image in the Knowledge Graph is actually nothing more than the title. The source is different.

Morning_Joe_Source.png

Luckily, I still had two screenshots I took when I first saw this saved on my desktop. Success. One screen capture was from NBC Nightly News, and the other from the news show Morning Joe (see above) showing that the source was changed.

NBC-nightly-news-crop.png

(NBC Nightly News screenshot.)

The source is a Google-owned property: gstatic.com. You can clearly see the difference in the source change. What started as a hypothesis is now a fact. Google is certainly creating a database of images.

If this is the direction Google is moving, then it is creating all kinds of potential risks for brands and individuals. The implications are a loss of control for any brand that is looking to optimize its Knowledge Graph results. As well, it seems this poses a conflict of interest to Google, whose mission is to organize the world’s information, not license and prioritize it.

How do we think Google is supposed to work?

Google is an information-retrieval system tasked with sourcing information from across the web and supplying the most relevant results to users’ searches. In recent months, the search giant has taken a more direct approach by answering questions and assumed questions in the Answer Box, some of which come from un-credited sources. Google has clearly demonstrated that it is building a knowledge base of facts that it uses as the basis for its Answer Boxes. When it sources information from that knowledge base, it doesn’t necessarily reference or credit any source.

However, I would argue there is a difference between an un-credited Answer Box and an un-credited image. An un-credited Answer Box provides a fact that is indisputable, part of the public domain, unlikely to change (e.g., what year was Abraham Lincoln shot? How long is the George Washington Bridge?) Answer Boxes that offer more than just a basic fact (or an opinion, instructions, etc.) always credit their sources.

There are four possibilities when it comes to Google referencing content:

  • Option 1: It credits the content because someone else owns the rights to it
  • Option 2: It doesn’t credit the content because it’s part of the public domain, as seen in some Answer Box results
  • Option 3: It doesn’t reference it because it owns or has licensed the content. If you search for “Chicken Pox” or other diseases, Google appears to be using images from licensed medical illustrators. The same goes for song lyrics, which Eric Enge discusses here: Google providing credit for content. This adds to the speculation that Google is giving preference to its own content by displaying it over everything else.
  • Option 4: It doesn’t credit the content, but neither does it necessarily own the rights to the content. This is a very gray area, and is where Google seemed to be back in February. If this were the case, it would imply that Google is “stealing” content—which I find hard to believe, but felt was necessary to include in this post for the sake of completeness.

Is this an isolated incident?

At Five Blocks, whenever we see these anomalies in search results, we try to compare the term in question against others like it. This is a categorization concept we use to bucket individuals or companies into similar groups. When we do this, we uncover some incredible trends that help us determine what a search result “should” look like for a given group. For example, when looking at searches for a group of people or companies in an industry, this grouping gives us a sense of how much social media presence the group has on average or how much media coverage it typically gets.

Upon further investigation of terms similar to NBC Nightly News (other news shows), we noticed the un-credited image scenario appeared to be a trend in February, but now all of the images are being hosted on gstatic.com. When we broadened the categories further to TV shows and movies, the trend persisted. Rather than show an image in the Knowledge Graph sourced from the actual site, Google tends to show an image and reference the source from its own database of stored images.

And just to ensure this wasn’t a case of tunnel vision, we researched other categories, including sports teams, actors and video games, in addition to spot-checking other genres.

Unlike terms for specific TV shows and movies, terms in each of these other groups all link to the actual source in the Knowledge Graph.

Immediate implications

It’s easy to ignore this and say “Well, it’s Google. They are always doing something.” However, there are some serious implications to these actions:

  1. The TV shows/movies aren’t receiving their due credit because, from within the Knowledge Graph, there is no actual reference to the show’s official site
  2. The more Google moves toward licensing and then retrieving their own information, the more biased they become, preferring their own content over the equivalent—or possibly even superior—content from another source
  3. It feels wrong and misleading to get a Google Image Search result rather than an actual site because:
    • The search doesn’t include the original image
    • Considering how poor Image Search results are normally, it feels like a poor experience
  4. If Google is moving toward licensing as much content as possible, then it could make the Knowledge Graph infinitely more complicated when there is a “mistake” or something unflattering. How could one go about changing what Google shows about them?

Google is objectively becoming subjective

It is clear that Google is attempting to create databases of information, including lyrics stored in Google Play, photos, and, previously, facts in Freebase (which is now Wikidata and not owned by Google).

I am not normally one to point my finger and accuse Google of wrongdoing. But this really strikes me as an odd move, one bordering on a clear bias to direct users to stay within the search engine. The fact is, we trust Google with a heck of a lot of information with our searches. In return, I believe we should expect Google to return an array of relevant information for searchers to decide what they like best. The example cited above seems harmless, but what about determining which is the right religion? Or even who the prettiest girl in the world is?

Religion-and-beauty-queries.png

Questions such as these, which Google is returning credited answers for, could return results that are perceived as facts.

Should we next expect Google to decide who is objectively the best service provider (e.g., pizza chain, painter, or accountant), then feature them in an un-credited answer box? Given the direction Google is moving right now, it feels like we should be calling its objectivity into question.

But that’s only my (subjective) opinion.



Technical Site Audit Checklist: 2015 Edition

Posted by GeoffKenyon

Back in 2011, I wrote a technical site audit checklist, and while it was thorough, there have been a lot of additions to what is encompassed in a site audit. I have gone through and updated that old checklist for 2015. Some of the biggest changes were the addition of sections for mobile, international, and site speed.

This checklist should help you put together a thorough site audit and determine what is holding back the organic performance of your site. At the end of your audit, don’t write a document that says what’s wrong with the website. Instead, create a document that says what needs to be done. Then explain why these actions need to be taken and why they are important. What I’ve found to be really helpful is to provide, along with your document, a prioritized list of all the actions that you would like them to implement. This list can be handed off to a dev or content team to be implemented easily. These teams can refer to your more thorough document as needed.


Quick overview

Check indexed pages  
  • Do a site: search.
  • How many pages are returned? (This can be way off so don’t put too much stock in this).
  • Is the homepage showing up as the first result? 
  • If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern as Google’s John Mueller recently said that your homepage doesn’t need to be listed first.

Review the number of organic landing pages in Google Analytics

  • Does this match with the number of results in a site: search?
  • This is often the best view of how many pages are in a search engine’s index that search engines find valuable.

Search for the brand and branded terms

  • Is the homepage showing up at the top, or are correct pages showing up?
  • If the proper pages aren’t showing up as the first result, there could be issues, like a penalty, in play.
Check Google’s cache for key pages
  • Is the content showing up?
  • Are navigation links present?
  • Are there links that aren’t visible on the site?
PRO Tip: Don’t forget to check the text-only version of the cached page. Here is a bookmarklet to help you do that.

Do a mobile search for your brand and key landing pages

  • Does your listing have the “mobile friendly” label?
  • Are your landing pages mobile friendly?
  • If the answer is no to either of these, it may be costing you organic visits.

On-page optimization

Title tags are optimized
  • Title tags should be optimized and unique.
  • Your brand name should be included in your title tag to improve click-through rates.
  • Title tags should be kept to about 55-60 characters (512 pixels) to be fully displayed. You can test here or review title pixel widths in Screaming Frog.
Important pages have click-through rate optimized titles and meta descriptions
  • This will help improve your organic traffic independent of your rankings.
  • You can use SERP Turkey for this.

Check for pages missing page titles and meta descriptions
  
The on-page content includes the primary keyword phrase multiple times as well as variations and alternate keyword phrases
  
There is a significant amount of optimized, unique content on key pages
 
The primary keyword phrase is contained in the H1 tag
  

Images’ file names and alt text are optimized to include the primary keyword phrase associated with the page.
 
URLs are descriptive and optimized
  • While it is beneficial to include your keyword phrase in URLs, changing your URLs can negatively impact traffic when you do a 301. As such, I typically recommend optimizing URLs when the current ones are really bad or when you don’t have to change URLs with existing external links.
Clean URLs
  • No excessive parameters or session IDs.
  • URLs exposed to search engines should be static.
Short URLs
  • 115 characters or shorter – this character limit isn’t set in stone, but shorter URLs are better for usability.
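Several of the checks above (title length, duplicate titles, URL length) lend themselves to a quick script. Here is a rough sketch; the URL list is hypothetical, and character counts are only a proxy for the pixel widths a tool like Screaming Frog will report.

```python
# Rough sketch: flag missing/over-long/duplicate title tags and over-long URLs.
import re
import urllib.request
from collections import defaultdict

URLS = [  # placeholder list; a crawler export works just as well
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
]

titles = defaultdict(list)
for url in URLS:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    title = match.group(1).strip() if match else ""
    if not title:
        print("MISSING TITLE:", url)
    elif len(title) > 60:
        print(f"LONG TITLE ({len(title)} chars):", url)
    if len(url) > 115:
        print(f"LONG URL ({len(url)} chars):", url)
    titles[title].append(url)

for title, pages in titles.items():
    if title and len(pages) > 1:
        print(f"DUPLICATE TITLE used on {len(pages)} pages:", title)
```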

Content

Homepage content is optimized
  • Does the homepage have at least one paragraph?
  • There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.
Landing pages are optimized
  • Do these pages have at least a few paragraphs of content? Is it enough to give search engines an understanding of what the page is about?
  • Is it template text or is it completely unique?
Site contains real and substantial content
  • Is there real content on the site or is the “content” simply a list of links?
Proper keyword targeting
  • Does the intent behind the keyword match the intent of the landing page?
  • Are there pages targeting head terms, mid-tail, and long-tail keywords?
Keyword cannibalization
  • Do a site: search in Google for important keyword phrases.
  • Check for duplicate content/page titles using the Moz Pro Crawl Test.
Content to help users convert exists and is easily accessible to users
  • In addition to search engine driven content, there should be content to help educate users about the product or service.
Content formatting
  • Is the content formatted well and easy to read quickly?
  • Are H tags used?
  • Are images used?
  • Is the text broken down into easy to read paragraphs?
Good headlines on blog posts
  • Good headlines go a long way. Make sure the headlines are well written and draw users in.
Amount of content versus ads
  • Since the implementation of Panda, the amount of ad-space on a page has become important to evaluate.
  • Make sure there is significant unique content above the fold.
  • If you have more ads than unique content, you are probably going to have a problem.

Duplicate content

There should be one URL for each piece of content
  • Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
  • Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.
Pro Tip: Exclude common parameters, such as those used to designate tracking code, in Google Webmaster Tools. Read more at Search Engine Land.
Do a search to check for duplicate content
  • Take a content snippet, put it in quotes and search for it.
  • Does the content show up elsewhere on the domain?
  • Has it been scraped? If the content has been scraped, you should file a content removal request with Google.
Sub-domain duplicate content
  • Does the same content exist on different sub-domains?
Check for a secure version of the site
  • Does the content exist on a secure version of the site?
Check other sites owned by the company
  • Is the content replicated on other domains owned by the company?
Check for “print” pages
  • If there are “printer friendly” versions of pages, they may be causing duplicate content.

Accessibility & Indexation

Check the robots.txt

  • Has the entire site, or important content been blocked? Is link equity being orphaned due to pages being blocked via the robots.txt?

Turn off JavaScript, cookies, and CSS

Now change your user agent to Googlebot

PRO Tip: Use SEO Browser to do a quick spot check.

Check the SEOmoz PRO Campaign

  • Check for 4xx errors and 5xx errors.

XML sitemaps are listed in the robots.txt file

XML sitemaps are submitted to Google/Bing Webmaster Tools

Check pages for meta robots noindex tag

  • Are pages accidentally being tagged with the meta robots noindex command?
  • Are there pages that should have the noindex command applied?
  • You can check the site quickly via a crawl tool such as Moz or Screaming Frog.

Do goal pages have the noindex command applied?

  • This is important to prevent direct organic visits from showing up as goals in analytics
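For the robots.txt and noindex checks above, a handful of key URLs can be spot-checked with a short script before (or alongside) a full crawl. The site and URLs below are placeholders.

```python
# Spot check: is each key URL allowed by robots.txt, and does it carry a
# meta robots noindex tag?
import re
import urllib.request
import urllib.robotparser

SITE = "https://www.example.com"                      # placeholder
KEY_URLS = [SITE + "/", SITE + "/category/widgets/"]  # placeholders

robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()

for url in KEY_URLS:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I))
    print(url)
    print("  allowed by robots.txt:", robots.can_fetch("Googlebot", url))
    print("  meta robots noindex:  ", noindex)
```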

Site architecture and internal linking

Number of links on a page
Vertical linking structures are in place
  • Homepage links to category pages.
  • Category pages link to sub-category and product pages as appropriate.
  • Product pages link to relevant category pages.
Horizontal linking structures are in place
  • Category pages link to other relevant category pages.
  • Product pages link to other relevant product pages.
Links are in content
  • Does not utilize massive blocks of links stuck in the content to do internal linking.
Footer links
  • Does not use a block of footer links instead of proper navigation.
  • Does not link to landing pages with optimized anchors.
Good internal anchor text
 
Check for broken links
  • Link Checker and Xenu are good tools for this.

Technical issues

Proper use of 301s
  • Are 301s being used for all redirects?
  • If the root is being directed to a landing page, are they using a 301 instead of a 302?
  • Use Live HTTP Headers Firefox plugin to check 301s.
“Bad” redirects are avoided
  • These include 302s, 307s, meta refresh, and JavaScript redirects as they pass little to no value.
  • These redirects can easily be identified with a tool like Screaming Frog.
Redirects point directly to the final URL and do not leverage redirect chains
  • Redirect chains significantly diminish the amount of link equity associated with the final URL.
  • Google has said that they will stop following a redirect chain after several redirects.
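To see chains (and sneaky 302s) at a glance without a browser plugin, a small sketch like this follows a URL hop by hop and prints each status code. The start URL is a placeholder.

```python
# Sketch: follow redirects hop by hop and print the chain.
import urllib.error
import urllib.request
from urllib.parse import urljoin

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow automatically; we want to see each hop

opener = urllib.request.build_opener(NoRedirect)

def trace_redirects(url, max_hops=10):
    for _ in range(max_hops):
        try:
            response = opener.open(url)
            print(response.getcode(), url, "(final destination)")
            return
        except urllib.error.HTTPError as e:
            if e.code in (301, 302, 303, 307, 308):
                print(e.code, url, "->")
                url = urljoin(url, e.headers.get("Location", ""))
            else:
                print(e.code, url)
                return
    print(f"Stopped after {max_hops} hops; that is one serious redirect chain.")

trace_redirects("http://example.com/old-page")  # placeholder URL
```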
Use of JavaScript
  • Is content being served in JavaScript?
  • Are links being served in JavaScript? Is this to do PR sculpting or is it accidental?
Use of iFrames
  • Is content being pulled in via iFrames?
Use of Flash
  • Is the entire site done in Flash, or is Flash used sparingly in a way that doesn’t hinder crawling?
Check for errors in Google Webmaster Tools
  • Google WMT will give you a good list of technical problems that they are encountering on your site (such as: 4xx and 5xx errors, inaccessible pages in the XML sitemap, and soft 404s)
XML Sitemaps  
  • Are XML sitemaps in place?
  • Are XML sitemaps covering for poor site architecture?
  • Are XML sitemaps structured to show indexation problems?
  • Do the sitemaps follow proper XML protocols
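A quick sketch for the sitemap checks: pull the sitemap, then confirm each listed URL actually returns a 200. The sitemap URL is a placeholder, and for a large sitemap you’d want to sample or throttle rather than hit every URL.

```python
# Sketch: parse an XML sitemap and report any listed URL that doesn't return 200.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(urllib.request.urlopen(SITEMAP_URL).read())

for loc in root.findall(".//sm:url/sm:loc", NS):
    url = loc.text.strip()
    try:
        status = urllib.request.urlopen(url).getcode()
    except urllib.error.HTTPError as e:
        status = e.code
    if status != 200:
        print(status, url)
```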
Canonical version of the site established through 301s
 
Canonical version of site is specified in Google Webmaster Tools
 
Rel canonical link tag is properly implemented across the site
Uses absolute URLs instead of relative URLs
  • Relative URLs can cause a lot of problems if you have a root domain with secure sections.

Site speed


Review page load time for key pages 

Make sure compression is enabled


Enable caching


Optimize your images for the web


Minify your CSS/JS/HTML

Use a good, fast host
  • Consider using a CDN for your images.


Mobile

Review the mobile experience
  • Is there a mobile site set up?
  • If there is, is it a mobile site, responsive design, or dynamic serving?


Make sure analytics are set up if separate mobile content exists


If dynamic serving is being used, make sure the Vary HTTP header is being used

Review how the mobile experience matches up with the intent of mobile visitors
  • Do your mobile visitors have a different intent than desktop based visitors?
Ensure faulty mobile redirects do not exist
  • If your site redirects mobile visitors away from their intended URL (typically to the homepage), you’re likely going to run into issues impacting your mobile organic performance.
Ensure that the relationship between the mobile site and desktop site is established with proper markup
  • If a mobile site (m.) exists, does the desktop equivalent URL point to the mobile version with rel=”alternate”?
  • Does the mobile version canonical to the desktop version?
  • Official documentation.
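For an m-dot setup, the annotation pair above is easy to verify programmatically. Here is a hedged sketch with a hypothetical desktop/mobile URL pair; it assumes the rel attribute appears before href, so adjust the patterns to your markup.

```python
# Sketch: verify the desktop page declares rel="alternate" pointing to the
# mobile URL, and the mobile page canonicals back to the desktop URL.
import re
import urllib.request

DESKTOP_URL = "https://www.example.com/page/"  # placeholder
MOBILE_URL = "https://m.example.com/page/"     # placeholder

def fetch(url):
    return urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

desktop_html = fetch(DESKTOP_URL)
mobile_html = fetch(MOBILE_URL)

alternate_ok = bool(re.search(
    r'rel=["\']alternate["\'][^>]*href=["\']' + re.escape(MOBILE_URL),
    desktop_html, re.I))
canonical_ok = bool(re.search(
    r'rel=["\']canonical["\'][^>]*href=["\']' + re.escape(DESKTOP_URL),
    mobile_html, re.I))

print("desktop rel=alternate points to mobile:", alternate_ok)
print("mobile rel=canonical points to desktop:", canonical_ok)
```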

International

Review international versions indicated in the URL
  • ex: site.com/uk/ or uk.site.com
Enable country based targeting in webmaster tools
  • If the site is targeted to one specific country, is this specified in webmaster tools? 
  • If the site has international sections, are they targeted in webmaster tools?
Implement hreflang / rel alternate if relevant
If there are multiple versions of a site in the same language (such as /us/ and /uk/, both in English), update the copy so that each version is unique
 

Make sure the currency reflects the country targeted
 
Ensure the URL structure is in the native language 
  • Try to avoid having all URLs in the default language

Analytics

Analytics tracking code is on every page
  • You can check this using the “custom” filter in a Screaming Frog Crawl or by looking for self referrals.
  • Are there pages that should be blocked?
There is only one instance of a GA property on a page
  • Having the same Google Analytics property installed twice will create problems with pageview-related metrics, such as inflating page views and pages per visit and reducing the bounce rate.
  • It is OK to have multiple different GA properties listed; this won’t cause a problem.
Analytics is properly tracking and capturing internal searches
 

Demographics tracking is set up

Adwords and Adsense are properly linked if you are using these platforms
Internal IP addresses are excluded
UTM Campaign Parameters are used for other marketing efforts
Meta refresh and JavaScript redirects are avoided
  • These can artificially lower bounce rates.
Event tracking is set up for key user interactions

This audit covers the main technical elements of a site and should help you uncover any issues that are holding a site back. As with any project, the deliverable is critical. I’ve found focusing on the solution and impact (business case) is the best approach for site audit reports. While it is important to outline the problems, too much detail here can take away from the recommendations. If you’re looking for more resources on site audits, I recommend the following:

Helpful tools for doing a site audit:

Annie Cushing’s Site Audit
Web Developer Toolbar
User Agent Add-on
Firebug
Link Checker
SEObook Toolbar
MozBar (Moz’s SEO toolbar)
Xenu
Screaming Frog
Your own scraper
Inflow’s technical mobile best practices



How to Have a Successful Local SEO Campaign in 2015

Posted by Casey_Meraz

Another year in search has passed. It’s now 2015 and we have seen some major changes in local ranking factors since 2014, which I also expect to change greatly throughout 2015. For some a new year means a fresh starting point and yet for others it’s a time of reflection to analyze how successful your campaign has been. Whatever boat you’re in, make sure to sit down and read this guide. 

In this guide we will cover how you can have a successful local SEO campaign in 2015 starting with the basics and getting down to five action items you should focus on now. This is not limited to Google My Business and also includes localized organic results. 

Now the question is where do you start?

Since Pigeon has now rolled out to the US, UK, Australia, and Canada, it’s important to make sure your strategies are in line with this no matter what part of the world you’re in. A local SEO campaign in 2015 will be much more successful if you put more work into it. Don’t be fooled though. More work by itself isn’t going to get you where you need to be. You need to work smarter towards the goals which are going to fuel your conversions.

For some industries that might mean more localized content, for others it may mean more social interaction in your local area. Whatever it ends up being, the root of it should be the same for most. You need to get more conversions for your website or your client’s website. So with this in mind let’s make sure we’re on the same page as far as our goals are concerned.

Things you need to know first

Focus on the right goals

Recently I had a conversation with a client who really wanted to drive home the point that he was not interested in traffic. He was interested in the conversions he could track. He was also interested to see how all of these content resource pieces I recommended would help. He was tired of the silly graphs from other agencies that showed great rankings on a variety of keywords when he was more interested to see which efforts brought him the most value. Instead, he wanted to see how his campaign was bringing him conversions or meaningful traffic. I really appreciated this statement and I felt like he really got it.

Still, however, far too often I have to talk to potential clients and explain to them why their sexy looking traffic reports aren’t actually helping them. You can have all of the traffic in the world but if it doesn’t meet one of your goals of conversions or education then it’s probably not helping. Even if you make the client happy with your snazzy reports for a few months, eventually they’re going to want to know their return on investment (ROI).

It’s 2015. If your clients aren’t tracking conversions properly, give them the help they need. Record their contacts in a CRM and track the source of each of these contacts. Track them all the way through the sales funnel. 

That’s a simple and basic marketing example, but as SEOs your role has transformed. If you can show this type of actual value and develop a plan accordingly, you will be unstoppable.

Second, don’t get tunnel vision

You may wonder why I started a little more basic than normal in this post. The fact is that in this industry there is not a full formal training program that covers all aspects of what we do. 

We all come from different walks of life and experience which makes it easy for us to get tunnel vision. You probably opened this article with the idea of “How Can I Dominate My Google Local Rankings?” While we cover some actionable tips you should be using, you need to think outside of the box as well. Your website is not the only online property you need to be concerned about.

Mike Ramsey from Nifty Marketing put out a great study on measuring the click-through rates from the new local stack. In this study he measured click-through rates of users conducting several different searches like “Salt Lake City Hotel” in the example below. With so many different options, look where the users are clicking:

They’re really clicking all over the place! While it’s cool to be number one, it’s much better if you get clicks from your paid ad, organic result, local result, and barnacle SEO efforts (which we’ll talk about a little later). 

If you combine your conversion marketing data with your biggest priorities, you can put together a plan to tackle the most important areas for your industry. Don’t assume it’s a one-size-fits-all approach. 

Third, some spam still works. Don’t do it and rise above it.

There’s no doubt that some spammy tactics are still working. Google gets better every day, but you still see crap like this example below show up in the SERPs.

While it sucks to see that kind of stuff, remember that in time it disappears (just as it did before this article was published). If you take shortcuts, you’re going to get caught, and it’s not worth it for the client or the heartache on your side. Maintain the course and do things the right way.

Now let’s get tactical and prepare for 2015

Now it’s time for some practical and tactical takeaways you can use to dominate your local search campaign in 2015.

Practical tip 1: start with an audit

Over the years, one of the best lessons I have learned is it’s OK to say “I don’t know” when you don’t have the answer. Consulting with industry experts or people with more experience than you is not a bad thing and will likely only lead you to enhance your knowledge and get a different perspective. It can be humbling but the experience is amazing. It can open your mind.

Last year, I had the opportunity to work with over ten of the industry’s best minds and retained them for site audits on different matters. 

The perspective this gives is absolutely incredible and I believe it’s a great way to learn. Everyone in this industry has come from a different background and seen different things over the years. Combining that knowledge is invaluable to the success of your clients’ projects. Don’t be afraid to do it and learn from it. This is also a good idea if you feel like your project has reached a stalemate. Getting more advice, identifying potential problems, and having a fresh perspective will do wonders for your success.

As many of the experts have confirmed, ever since the Pigeon update, organic and local ranking factors have been more tied together than ever. Since they started going this direction in a big way, I would not expect it to stop. 

This means that you really do need to worry about things like site speed, content, penalties, mobile compatibility, site structure, and more. On a side note, guess what will happen to your organic results if you keep this as a top priority? They will flourish and you will thank me.

If you don’t have the budget or resources to get a third party opinion, you can also conduct an independent audit. 

Do it yourself local SEO audit resources:

Do it yourself organic SEO audit resources:

Alternatively, if you’re more in the organic boat you should also check out this guide by Steve Webb on How To Perform The World’s Greatest SEO Audit.

Whatever your situation is, it’s worth the time to have this perspective yearly or even a couple times a year if possible.

Practical tip 2: consider behavioral signals and optimize accordingly

I remember having a conversation with Darren Shaw, the founder of Whitespark, at MozCon 2013 about his thoughts on user behavior affecting local results. At the time I didn’t do too much testing around it. However, just this year Darren had a mind-blowing presentation at the Dallas State of Search where he threw in the behavioral signals curve ball. Phil Rozek also spoke about behavioral signals and provided a great slide deck with actionable items (included below). 

We have always speculated on behavioral signals but between his tests and some of Rand’s IMEC Lab tests, I became more of a believer last year. Now, before we go too deep on this remember that your local campaign is NOT only focused on just your local pack results. If user behavior can have an impact on search results, we should definitely be optimizing for our users.


You can view Phil Rozek’s presentation below: 

Don’t just optimize for the engines, optimize for the humans. One day when Skynet is around this may not be an issue, but for now you need to do it.

So how can you optimize for behavioral signals?

There is a dark side and a light side path to this question. If you ask me I will always say follow the light side as it will be effective and you don’t have to worry about being penalized. That’s a serious issue and it’s unethical for you to put your clients in that position.

Local SEO: how to optimize for behavioral signals

Do you remember the click-through study we looked at a bit earlier from Nifty Marketing? Do you remember where the users clicked? If you look again or just analyze user and shopper behavior, you might notice that many of the results with the most reviews got clicks. We know that reviews are hard to get so here are two quick ways that I use and recommend to my clients:


1. Solicit your Gmail clients for reviews

If you have a list of happy Gmail clients you can simply send them an email with a direct link to your Google My Business Page. Just get the URL of your local page by pulling up your URL and copying and pasting it. A URL will look like the one below:

Once you have this URL, simply remove the /posts and replace it with: 

 /?hl=en&review=1


It will look like this:

If your clients click on this link via their logged-in Gmail, they will automatically be taken to the review page which will open up the box to leave a review which looks like the example below. It doesn’t get much more simple than that. 
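If you’re sending this link to a long list of clients, the swap is trivial to script. Here is a tiny sketch of the substitution described above; the page URL is a placeholder.

```python
# Tiny sketch: turn a Google My Business "/posts" URL into a direct review link
# by swapping /posts for /?hl=en&review=1, as described above.
def review_link(gmb_posts_url):
    return gmb_posts_url.replace("/posts", "/?hl=en&review=1")

print(review_link("https://plus.google.com/+YourBusinessName/posts"))  # placeholder URL
```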

2. Check out a service like Mike Blumenthal’s Get Five Stars for reviews

I recently used this with a client and got a lot of great feedback and several reviews.

Remember that these reviews will also help on third-party sites and can help your Google My Business ranking positions as well as click-through rates. You can check out Get Five Stars here.

Outside of getting reviews, another way is to optimize the appearance of your Google My Business page.


3. Optimize your local photos

Your Google My Business page includes photos. Don’t use generic photos; use high-quality photos so that when users hover over your listing they get an accurate representation of what they’re looking for. Doing this will increase your click-through rate.

Organic SEO: Optimize for Behavioral Signals

Optimizing for click-through rates on organic results typically focuses on three areas. You’re likely very familiar with the first two, but you should not ignore them.


1. Title tags: optimize them for the user and engine

Optimize your meta title tags to increase click-through rates. Each page should have a unique title tag that helps the viewer with their query. The example below (although fabricated) is a good example of what NOT to do.


2. Meta descriptions: optimize them for the user

Optimize your meta description to get the user to click on the search result. If you’re not doing this just because Google may or may not pull it, you’re missing opportunities and clicks. 


3. Review Schema markup: add this to appropriate pages

Review Schema markup is still a very overlooked opportunity. As we talked about above in the local section, if you don’t have reviews coded in Schema, you could be missing out on getting the orange stars in organic results.
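To make the idea concrete, here is a minimal sketch of what review markup can look like, written as a Python script that prints a JSON-LD block. The business name, rating value, and review count are placeholder values, and the properties you need vary by business type, so treat this as an illustration rather than a copy-paste template.

```python
import json

# Hypothetical values; replace with your own business and review data.
review_markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Law Firm",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "27",
    },
}

# Prints the JSON-LD you would place inside a <script type="application/ld+json"> tag.
print(json.dumps(review_markup, indent=2))
```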

Practical tip 3: don’t ignore barnacle SEO

I firmly believe that most people are still not taking advantage of barnacle SEO, and I’m a big fan. When I first heard Will Scott introduce this term at Pubcon I thought it was spot on. According to Will Scott’s website, Search Influence, barnacle SEO is “attaching oneself to a large fixed object and waiting for the customers to float by in the current.” In a nutshell, if you look at who is already ranking on page one of Google for your terms, you will often find third-party sites you can attach yourself to. If Yelp results come up for a lot of your search terms, you might identify that as an opportunity. There are three main ways you can take advantage of this.


1. You can try to have the most visible profile on that third party page

If Yelp is ranking for “LA Personal Injury Attorneys,” it would suit you to figure out how the top profiles are showing up there. Maybe your customers are heading there, doing some shopping, and making a selection. Or maybe they’re using it as a research platform and will then visit your website. If your profile looks great and shows up high on the list, you just gave yourself a better chance at getting a conversion.


2. You can try to get your page to rank

Hey, just because you don’t own Yelp.com (or whatever similar site you’ve found) doesn’t mean you shouldn’t put in the effort to have your page there rank. If Google is already showing that it trusts a third-party site by ranking it, you can use the same organic ranking techniques you would use on your own site to make your profile page stronger. Over time you might link to this profile from your bio in interviews or on other websites to earn links. If you increase the visibility of your profile in the search engines, and searchers see your website on the same page, you might increase conversions.


3. You can help your Google My Business rankings

If the site you’re using passes link juice and you earn links to your third-party profile page, you will start to see some strong results. Links have been a big factor in local search since Pigeon rolled out this year, and it’s an opportunity that should not be missed.


So how can you use this advice?

Start by finding a list of potential barnacle SEO partners for your industry. As an example, I did a search for “Personal Injury Attorneys” in Los Angeles. In addition to the law firms that showed up in the results on the first page, I also identified four additional places I may be able to show up on.

  1. Yelp
  2. Thumbtack
  3. Avvo
  4. Wikipedia

If you were an attorney, it would be worth your while to explore these and see if any make sense for you to contribute to.

Practical tip 4: earn some good links

Most people get too carried away with link building. I know because I used to do it. The key with link building is to change your approach and understand that it’s always better to earn a few high-quality links than hundreds or thousands of low-quality links.

For example, a link like this one that one of our clients earned is what I’m talking about. 

If you want to increase your local rankings you can do so by earning these links to your associated Google My Business landing page.

Do you know the URL you entered in your Google My Business page when you set it up? That’s the one I’m talking about. In most cases this will be linked to either a local landing page for that location or the home page. It’s essential to your success that you earn solid links to this page.


Simple resources for link building

Practical tip 5: have consistent citations and remove duplicates

Identifying and correcting incorrect or duplicate citations has been getting easier and easier over the years. Even if you don’t want to pay someone to do it, you can sign up for some great do-it-yourself tools. Your goal with any citation cleanup program is this:

  1. Ensure there are no duplicate citations
  2. Ensure there are no incorrect citations with wrong phone numbers, old addresses, etc. 

You can ignore small differences and inconsistencies like St vs. Street. I believe the importance of citations has been greatly reduced over the past year. At the same time, you still want to be the least imperfect and provide your customers with accurate information if they’re looking on third party websites.  

Let’s do good things in 2015

2014 was a tough year in search altogether. We had ups, like Penguin refreshes, and we had downs, like the removal of authorship. I’m guessing 2015 will be no different. Staying on the roller coaster and keeping with the idea of having the “least imperfect” site is the best way to ring in the new year and march forward. If you had a tough year in local search, keep your head up high, fix any existing issues, and sprint through this year by making positive changes to your site.


Illustrated Guide to Advanced On-Page Topic Targeting for SEO

Posted by Cyrus-Shepard

Topic n. A subject or theme of a webpage, section, or site.

Several SEOs have recently written about topic modeling and advanced on-page optimization. A few of note:

The concepts themselves are dizzying: LDA, co-occurrence, and entity salience, to name only a few. The question is: “How can I easily incorporate these techniques into my content for higher rankings?”

In fact, you can create optimized pages without understanding complex algorithms. Sites like Wikipedia, IMDB, and Amazon create highly optimized, topic-focused pages almost by default. Utilizing these best practices works exactly the same when you’re creating your own content.

The purpose of this post is to provide a simple framework for on-page topic targeting in a way that makes optimizing easy and scalable while producing richer content for your audience.

1. Keywords and relationships

No matter which topic modeling technique you choose, they all rely on discovering relationships between words and phrases. As content creators, how we organize words on a page greatly influences how search engines determine the on-page topics.

When we use keyword phrases, search engines hunt for other phrases and concepts that relate to one another. So our first job is to expand our keyword research to incorporate these related phrases and concepts. Contextually rich content includes:

  • Close variants and synonyms: Includes abbreviations, plurals, and phrases that mean the same thing.
  • Primary related keywords: Words and phrases that relate to the main keyword phrase.
  • Secondary related keywords: Words and phrases that relate to the primary related keywords.
  • Entity relationships: Concepts that describe the properties and relationships between people, places, and things. 

A good keyword phrase or entity is one that predicts the presence of other phrases and entities on the page. For example, a page about “The White House” predicts other phrases like “president,” “Washington,” and “Secret Service.” Incorporating these related phrases may help strengthen the topicality of “White House.”
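One way to see this “prediction” idea in action is to count which words tend to co-occur with a seed phrase across a set of documents. The sketch below is a toy example, not any particular search engine’s algorithm; the sample documents, the seed phrase, and the stopword list are all made up for illustration.

```python
from collections import Counter

# Toy corpus; in practice you might look at top-ranking pages for your keyword.
documents = [
    "the white house is the residence of the president in washington",
    "the secret service protects the president at the white house",
    "washington is home to the white house and the secret service",
    "this document is about gardening and has nothing to do with politics",
]

STOPWORDS = {"the", "is", "of", "in", "at", "to", "and", "about", "has", "a", "do"}

def co_occurring_terms(docs, seed, top_n=5):
    """Count words that appear in the same documents as the seed phrase."""
    seed_words = set(seed.split())
    counts = Counter()
    for doc in docs:
        if seed in doc:
            counts.update(
                w for w in doc.split() if w not in STOPWORDS and w not in seed_words
            )
    return counts.most_common(top_n)

print(co_occurring_terms(documents, "white house"))
# Expect terms like "president", "washington", and "secret" near the top.
```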

2. Position, frequency, and distance

How a page is organized can greatly influence how concepts relate to each other.

Once search engines find your keywords on a page, they need to determine which ones are most important, and which ones actually have the strongest relationships to one another.

Three primary techniques for communicating this include:

  • Position: Keywords placed in important areas like titles, headlines, and higher up in the main body text may carry the most weight.
  • Frequency: Using techniques like TF-IDF, search engines determine important phrases by calculating how often they appear in a document relative to how often they appear across a larger set of documents (a minimal sketch follows this list).
  • Distance: Words and phrases that relate to each other are often found close together, or grouped by HTML elements. This means leveraging semantic distance to place related concepts close to one another using paragraphs, lists, and content sectioning.
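Here is a minimal TF-IDF sketch, assuming a tiny hand-made corpus; real search engines work at a very different scale, so this only illustrates the intuition that a term matters when it is frequent in one document but rare across the rest.

```python
import math

# Tiny hypothetical corpus standing in for a set of web pages.
docs = [
    "white house president washington secret service president",
    "gardening tips for spring flowers and vegetables",
    "president signs bill at the white house",
]

def tf_idf(term, doc, corpus):
    """Term frequency in one document times inverse document frequency across the corpus."""
    words = doc.split()
    tf = words.count(term) / len(words)
    containing = sum(1 for d in corpus if term in d.split())
    idf = math.log(len(corpus) / (1 + containing)) + 1  # smoothed IDF
    return tf * idf

for term in ("president", "gardening"):
    scores = [round(tf_idf(term, d, docs), 3) for d in docs]
    print(term, scores)
```

The word “president” scores highest in the documents that actually discuss it, while “gardening” only scores in the off-topic document, which is roughly how frequency helps separate important phrases from background noise.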

A great way to organize your on-page content is to employ your primary and secondary related keywords in support of your focus keyword. Each primary related phrase becomes its own subsection, with the secondary related phrases supporting the primary, as illustrated here.

Keyword Position, Frequency and Distance

As an example, the primary keyword phrase of this page is “On-page Topic Targeting.” Supporting topics include: keywords and relationships, on-page optimization, links, entities, and keyword tools. Each related phrase supports the primary topic, and each becomes its own subsection.

3. Links and supplemental content

Many webmasters overlook the importance of linking as a topic signal.

Several well-known Google search patents and early research papers describe analyzing a page’s links as a way to determine topic relevancy. These include both internal links to your own pages and external links to other sites, often with relevant anchor text.

Google’s own Quality Rater Guidelines cite the value of external references to other sites. They also describe a page’s supplemental content, which can include internal links to other sections of your site, as a valuable resource.

Links and Supplemental Content

If you need an example of how relevant linking can help your SEO, The New York Times famously saw success, and an increase in traffic, when it started linking out to other sites from its topic pages.

Although this guide discusses on-page topic optimization, topical external links with relevant anchor text can greatly influence how search engines determine what a page is about. These external signals often carry more weight than on-page cues, but things almost always work best when on-page and off-page signals are in alignment.

4. Entities and semantic markup

Google extracts entities from your webpage automatically, without any effort on your part. These are people, places, and things that have distinct properties and relationships with each other.

• Christopher Nolan (entity, person) stands 5’4″ (property, height) and directed Interstellar (entity, movie)

Even though entity extraction happens automatically, it’s often essential to mark up your content with Schema for specific supported entities such as business information, reviews, and products. While the ranking benefit of adding Schema isn’t 100% clear, structured data has the advantage of enhanced search results.

Entities and Schema

For a solid guide to implementing schema.org markup, see Builtvisible’s excellent guide to rich snippets.

5. Crafting the on-page framework

You don’t need to be a search genius or spend hours on complex research to produce high quality, topic optimized content. The beauty of this framework is that it can be used by anyone, from librarians to hobby bloggers to small business owners; even when they aren’t search engine experts.

A good webpage has much in common with a high quality university paper. This includes:

  1. A strong title that communicates the topic
  2. Introductory opening that lays out what the page is about
  3. Content organized into thematic subsections
  4. Exploration of multiple aspects of the topic, answering related questions
  5. Provision of additional resources and external citations

Your webpage doesn’t need to be academic, stuffy, or boring. Some of the most interesting pages on the Internet employ these same techniques while remaining dynamic and entertaining.

Keep in mind that ‘best practices’ don’t apply to every situation, and as Rand Fishkin says, “There’s no such thing as ‘perfectly optimized’ or ‘perfect on-page SEO.’” Pulling everything together looks something like this:

On-page Topic Targeting for SEO

This graphic is highly inspired by Rand Fishkin’s great Visual Guide to Keyword Targeting and On-Page SEO. This guide doesn’t replace that canonical resource. Instead, it should be considered a supplement to it.

5 alternative tools for related keyword and entity research

For the search professional, there are dozens of tools available for thematic keyword and entity research. This list is not exhaustive by any means, but contains many useful favorites.

1. Alchemy API

One of the few tools on the market that delivers entity extraction, concept targeting and linked data analysis. This is a great platform for understanding how a modern search engine views your webpage.

2. SEO Review Tools

The SEO Keyword Suggestion Tool was actually designed to return both primary and secondary related keywords, as well as options for synonyms and country targeting.

3. LSIKeywords.com

The LSIKeyword tool performs Latent Semantic Indexing (LSI) on the top pages returned by Google for any given keyword phrase. The tool can go down from time to time, but it’s a great one to bookmark.

4. Social Mention

Quick and easy: enter any keyword phrase, then check “Top Keywords” to see which words appear most often with your primary phrase across the platforms that Social Mention monitors.

5. Google Trends

Google Trends is a powerful related-keyword research tool, if you know how to use it. The secret is downloading your results to a CSV (under settings) to get a list of up to 50 related keywords per search term.
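If you pull several of these exports, a short script can merge them into one deduplicated keyword list. This is only a sketch: it assumes each export has been trimmed down to simple "keyword,score" rows, which is not necessarily the exact layout Trends gives you, so adjust the parsing to match your files.

```python
import csv
from pathlib import Path

# Hypothetical folder of cleaned-up exports, one "keyword,score" pair per row.
EXPORT_DIR = Path("trends_exports")

keywords = set()
for csv_file in EXPORT_DIR.glob("*.csv"):
    with csv_file.open(newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row:  # skip blank lines
                keywords.add(row[0].strip().lower())

# One combined, deduplicated list of related keywords to feed back into your research.
for kw in sorted(keywords):
    print(kw)
```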
