5 Spreadsheet Tips for Manual Link Audits

Posted by MarieHaynes

Link auditing is the part of my job that I love the most. I have audited a LOT of links over the last few years. While there are some programs out there that can be quite helpful to the avid link auditor, I still prefer to create a spreadsheet of my links in Excel and then audit those links one by one from within Google Spreadsheets. Over the years I have learned a few tricks and formulas that have helped me in this process. In this article, I will share several of these with you.

Please know that while I am quite comfortable being labelled a link auditing expert, I am not an Excel wizard. I am betting that some of the things that I am doing could be improved upon if you’re an advanced user. As such, if you have any suggestions or tips of your own I’d love to hear them in the comments section!

1. Extract the domain or subdomain from a URL

OK. You’ve downloaded links from as many sources as possible and now you want to manually visit and evaluate one link from every domain. But, holy moly, some of these domains can have THOUSANDS of links pointing to the site. So, let’s break these down so that you are just seeing one link from each domain. The first step is to extract the domain or subdomain from each url.

I am going to show you examples from a Google spreadsheet as I find that these display nicer for demonstration purposes. However, if you’ve got a fairly large site, you’ll find that the spreadsheets are easier to create in Excel. If you’re confused about any of these steps, check out the animated gif at the end of each step to see the process in action.

Here is how you extract a domain or subdomain from a url:

  • Create a new column to the left of your url column.
  • Use this formula:

    =LEFT(B1,FIND("/",B1,9)-1)

    What this does is remove the first slash after the domain name and everything that follows it. http://www.example.com/article.html becomes http://www.example.com, and http://www.subdomain.example.com/article.html becomes http://www.subdomain.example.com.

  • Copy our new column A and paste it right back where it was using the “paste as values” function. If you don’t do this, you won’t be able to use the Find and Replace feature.
  • Use Find and Replace to replace each of the following with a blank (i.e. nothing):
    http://
    https://
    www.

And BOOM! We are left with a column that contains just domain names and subdomain names. This animated gif shows each of the steps we just outlined:
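
As an aside, if you work in Google Sheets you can collapse this whole step into a single formula using regular expressions. This is just a hedged sketch of an alternative (assuming your urls are in column B); the steps above work fine on their own:

    =REGEXEXTRACT(B1,"^https?://(?:www\.)?([^/]+)")

This pulls out the domain or subdomain in one pass and drops the protocol and the www. prefix, so the paste-as-values and Find and Replace steps aren't needed.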

2. Just show one link from each domain

The next step is to filter this list so that we are just seeing one link from each domain. If you are manually reviewing links, there’s usually no point in reviewing every single link from every domain. I will throw in a word of caution here, though. Sometimes a domain can have both a good link and a bad link pointing to you. Or in some cases, you may find that links from one page are followed and from another page on the same site they are nofollowed. You can miss some of these by looking at just one link from each domain. Personally, I have some checks built into my process where I use Scrapebox and some internal tools that I have created to make sure that I’m not missing the odd link this way. For most link audits, however, you are not going to miss very much by assessing one link from each domain.

Here’s how we do it:

  • Highlight our domains column and sort the column in alphabetical order.
  • Create a column to the left of our domains, so that the domains are in column B.
  • Use this formula:

    =IF(B1=B2,"duplicate","unique")

  • Copy that formula down the column.
  • Use the filter function so that you are just seeing the duplicates.
  • Delete those rows. Note: If you have tens of thousands of rows to delete, the spreadsheet may crash. A workaround here is to use “Clear Rows” instead of “Delete Rows” and then sort your domains column from A-Z once you are finished.

We’ve now got a list of one link from every domain linking to us.

Here’s the gif that shows each of these steps:

You may wonder why I didn’t use Excel’s dedupe function to simply deduplicate these entries. I have found that it doesn’t take much deduplication to crash Excel, which is why I do this step manually.
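
Another option, offered only as a sketch: a running COUNTIF can flag duplicates without requiring the alphabetical sort first (again assuming your domains are in column B):

    =IF(COUNTIF(B$1:B1,B1)>1,"duplicate","unique")

Because the counting range grows as you copy the formula down, every occurrence of a domain after the first is marked “duplicate,” so the link you keep is always the first one found for that domain.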

3. Finding patterns FTW!

Sometimes when you are auditing links, you’ll find that unnatural links have patterns. I LOVE when I see these, because sometimes I can quickly go through hundreds of links without having to check each one manually. Here is an example. Let’s say that your website has a bunch of spammy directory links. As you’re auditing you notice patterns such as one of these:

  • All of these directory links come from a url that contains …/computers/internet/item40682/
  • A whole bunch of spammy links that all come from a particular free subdomain like blogspot, wordpress, weebly, etc.
  • A lot of links that all contain a particular keyword for anchor text (this is assuming you’ve included anchor text in your spreadsheet when making it.)

You can quickly find all of these links and mark them as “disavow” or “keep” by doing the following:

  • Create a new column. In my example, I am going to create a new column in Column C and look for patterns in urls that are in Column B.
  • Use this formula:

    =FIND("/item40682",B1)
    (You would replace “item40682” with the phrase that you are looking for.)

  • Copy this formula down the column.
  • Filter your new column so that you are seeing only the rows that have a number in this column. If the phrase doesn’t exist in that url, the formula returns an error value instead of a number, and we can ignore those rows.
  • Now you can mark all of these as “disavow.”
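
If you’d rather have the spreadsheet do the labelling for you, here is a hedged variation on the same idea (again assuming your urls are in column B, with /item40682 standing in for your pattern):

    =IF(ISNUMBER(FIND("/item40682",B1)),"disavow","")

ISNUMBER is TRUE only when FIND actually locates the phrase, so matching rows are labelled “disavow” directly and everything else is left blank, which saves the filtering step.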

4. Check your disavow file

This next tip is one that you can use to check your disavow file across your list of domains that you want to audit. The goal here is to see which links you have disavowed so that you don’t waste time reassessing them. This particular tip only works for checking links that you have disavowed on the domain level.

The first thing you’ll want to do is download your current disavow file from Google. For some strange reason, Google gives you the disavow file in CSV format. I have never understood this because they want you to upload the file in .txt. Still, I guess this is what works best for Google. All of your entries will be in column A of the CSV:

What we are going to do now is add these to a new sheet on our current spreadsheet and use a VLOOKUP function to mark which of our domains we have disavowed.

Here are the steps:

  • Create a new sheet on your current spreadsheet workbook.
  • Copy and paste column A from your disavow spreadsheet onto this new sheet. Or, alternatively, use the import function to import the entire CSV onto this sheet.
  • In B1, write “previously disavowed” and copy this down the entire column.
  • Remove the “domain:” from each of the entries by doing a Find and Replace to replace domain: with a blank.
  • Now go back to your link audit spreadsheet. If your domains are in column A and if you had, say, 1500 domains in your disavow file, your formula would look like this:

    =VLOOKUP(A1,Sheet2!$A$1:$B$1500,2,FALSE)

When you copy this formula down the spreadsheet, it will check each of your domains, and if it finds the domain in Sheet 2, it will write “previously disavowed” on our link audit spreadsheet.
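
One small refinement, offered as a sketch rather than as part of the original process: wrapping the lookup in IFERROR suppresses the #N/A errors for domains that are not in your disavow file:

    =IFERROR(VLOOKUP(A1,Sheet2!$A$1:$B$1500,2,FALSE),"")

Domains found on Sheet2 get marked “previously disavowed”; everything else stays blank instead of showing an error.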

Here is a gif that shows the process:

5. Make monthly or quarterly disavow work easier

That same formula described above is a great one to use if you are doing regular repeated link audits. In this case, the second sheet of your spreadsheet would contain domains that you have previously audited, and column B of that sheet would say “previously audited” rather than “previously disavowed.”
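
If you don’t want to maintain a column B on the audit-history sheet at all, a hedged alternative for this recurring check is a COUNTIF membership test (assuming your previously audited domains sit in column A of Sheet2):

    =IF(COUNTIF(Sheet2!$A$1:$A$1500,A1)>0,"previously audited","")

This simply asks whether the domain appears anywhere in the audited list, which is all the repeated-audit case needs.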

Your tips?

These are just a few of the formulas that you can use to help make link auditing work easier. But there are lots of other things you can do with Excel or Google Sheets to help speed up the process as well. If you have some tips to add, leave a comment below. Also, if you need clarification on any of these tips, I’m happy to answer questions in the comments section.


Everything You Need to Know About Mobile App Search

Posted by Justin_Briggs

Mobile isn’t the future. It’s the present. Mobile apps are not only changing how we interact with devices and websites, they’re changing the way we search. Companies are creating meaningful experiences on mobile-friendly websites and apps, which in turn create new opportunities to get in front of users.

I’d like to explore the growth of mobile app search and its current opportunities to gain visibility and drive engagement.

Rise of mobile app search

The growth of mobile device usage has driven a significant lift in app-related searches. This is giving rise to mobile app search as a vertical within traditional universal search.

While it has been clear for some time that mobile search is important, Google has emphasized that importance recently, as they continue to push mobile-friendly labels in SERPs and are likely increasing the weight of mobile-friendliness as a ranking factor.

The future of search marketing involves mobile, and it will not be limited to optimizing HTML webpages, creating responsive designs, and optimizing UX. Mobile SEO is a world where apps, knowledge graph, and conversational search are front and center.

For the top 10 leading properties online, 34% of visitors are mobile-only (comScore data), and, anecdotally, we’re seeing similar numbers with our clients, if not more.

Mobile device and app growth

It’s also worth noting that 72% of mobile engagement relies on apps vs. browsers. Looking at teen usage, apps are increasingly dominant. Additionally, 55% of teens use voice search more than once per day.

If you haven’t read it, grab some coffee and read A Teenagers View on Social Media, which is written by a 19-year-old who gives his perspective on online behavior. Reading between the lines shows a number of subtle shifts in behavior. I noticed that every time I expected him to say website, he said application. In fact, he referenced application 15 times, and it is the primary way he describes social networks.

This means that one of the fastest-growing segments of mobile users cannot be marketed to by optimizing HTML webpages alone, requiring search marketers to expand their skills into app optimization.

The mobile app pack

This shift is giving rise to the mobile app pack and app search results, which are triggered on searches from mobile devices in instances of high mobile app intent. Think of these as being similar to local search results. Considering mobile searcher behavior, these listings dominate user attention.

Mobile app search results and mobile app pack

As with local search, mobile app search can reorder traditional results, completely push them down, or integrate app listings with traditional web results.

You can test on your desktop using a user-agent switcher, or by searching on your iOS or Android device.

There are slight differences between iPhone and Android mobile app results:

iOS and Android mobile search result listing

From what I’ve seen, mobile app listings trigger more frequently, and with more results, on Android search results when compared to iOS. Additionally, iOS mobile app listings are represented as a traditional website result listing, while mobile app listings on Android are more integrated.

Some of the differences also come from the differences in app submission guidelines on the two major stores, the Apple App Store and Google Play.

Overview of differences in mobile app results

  1. Title – Google uses the app listing page’s HTML title (which is the app’s title). iOS app titles can exceed the 55-62 characters displayed in search results, which causes wrapping and truncation like a traditional result. Android app title requirements are shorter, so titles are typically shorter on Android mobile app listings.
  2. URL – iOS mobile app listings display the iTunes URL to the App Store as part of the search result.
  3. Icon – iOS icons are square and Android icons have rounded corners.
  4. Design – Android results stand out more, with an “Apps” headline above the pack and a link to Google Play at the end.
  5. App store content – The other differences show up in the copy, ratings, and reviews on each app store.

Ranking in mobile app search results

Ranking in mobile app search results is a combination of App Store Optimization (ASO) and traditional SEO. The on-page factors are dependent upon your app listing, so optimization starts with having solid ASO. If you’re not familiar with ASO, it’s the process of optimizing your app listing for internal app store search.

Basics of ASO

Ranking in the Apple App Store and in Google Play is driven by two primary factors: keyword alignment and app performance. Text fields in the app store listing, such as title, description, and keyword list, align the app with a particular set of keywords. Performance metrics including download velocity, app ratings, and reviews determine how well the app will rank for each of those keywords. (Additionally, the Google Play algorithm may include external, web-based performance metrics like citations and links as ranking factors.)

App store ranking factors

Mobile app listing optimization

While I won’t explore ASO in-depth here, as it’s very similar to traditional SEO, optimizing app listings is primarily a function of keyword targeting.

Tools like Sensor Tower, MobileDevHQ, and App Annie can help you with mobile app keyword research. However, keep in mind that mobile app search listings show up in universal search, so it’s important to leverage traditional keyword research tools like the AdWords Tool or Google Trends.

While there are similarities with ASO, optimizing for these mobile app search listings on the web has some slight differences.

Differences between ASO & mobile app SEO targeting

  1. Titles – While the Apple App Store allows relatively long titles, they are limited to the preview length in organic search. Titles should be optimized with Google search in mind, in addition to optimizing for the app store. Additionally, several apps aggressively target keywords in their app title, but caution should be used as spamming keywords could influence app performance in Google.
  2. Description – The app description on the App Store may not be a factor in internal search, but it will impact external app search results. Leverage keyword targeting best practices when writing your iOS app description, as well as your Android app description.
  3. Device and platform keywords – When targeting for app store search, it is not as important to target terms related to the OS or device. However, these terms can help visibility in external search. Include device and OS terms, such as Android, Samsung Note, iOS, iPad, and iPhone.

App performance optimization

Outside of content optimization, Google looks at the performance of the app. On the Android side, they have access to the data, but for iOS they have to rely on publicly available information.

App performance factors

  • Number of ratings
  • Average rating score
  • Content and sentiment analysis of reviews
  • Downloads / installs
  • Engagement and retention
  • Internal links on app store

For iOS, the primary public metrics are ratings and reviews. However, app performance can be inferred using the App Store’s ranking charts and search results, which can be leveraged as proxies of these performance metrics.


The following objectives will have the greatest influence on your mobile app search ranking:

  1. Increase your average rating number
  2. Increase your number of ratings
  3. Increase downloads

For app ratings and reviews, leverage platforms like Apptentive to improve your ratings. They are very effective at driving positive ratings. Additionally, paid tactics are a great way to drive install volume and are one area where paid budget capacity could directly influence organic results in Google. Anecdotally, both app stores use rating numbers (typically above or below 4 stars) to make decisions around promoting an app, either through merchandising spots or co-branded campaigns. I suspect this is being used as a general cut-off for what is displayed in universal results. Increasing your rating above 4 stars should improve the likelihood you’ll appear in mobile app search results.

Lastly, think of merchandising and rankings in terms of internal linking structures. The more visible you are inside of the app store, the more visibility you have in external search.

App web performance optimization

Lastly, we’re talking Google rankings, so factors like links, citations, and social shares matter. You should be conducting content marketing, PR, and outreach for your app. Focus on merchandising your app on your own site, as well as increasing coverage of your app (linking to the app store page). The basics of link optimization apply here.

App indexation – drive app engagement

Application search is not limited to driving installs via app search results. With app indexing, you can leverage your desktop/mobile website visibility in organic search to drive engagement with those who have your app installed. Google can discover and expose content deep inside your app directly in search results. This means that when a user clicks on your website in organic search, it can open your app directly, taking them to that exact piece of content in your app, instead of opening your website.

App indexation fundamentally changes technical SEO, extending SEO from server and webpage setup to the setup and optimization of applications.

App indexation on Google

This also fundamentally changes search. Your most avid and engaged user may choose to no longer visit your website. For example, on my Note 4, when I click a link to a site of a brand that I have an app installed for, Google gives me the option not only to open in the app, but to set opening the app as a default behavior.

If a user chooses to open your site in your app, they may never visit your site from organic search again.

App indexation is currently limited to Android devices, but there is evidence to suggest that it’s already in the works and is soon to be released on iOS devices. There have been hints for some time, but markup is showing up in the wild suggesting that Google is actively working with Apple and select brands to develop iOS app indexing.

URI optimization for apps

The first step in creating an indexable app is to set up your app to support deep links. Deep links are URIs that are understood by your app and will open up a specific piece of content. They are effectively URLs for applications.
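
To make that concrete, here is a hypothetical example of a deep link URI in the android-app:// format Google uses for app indexing (the package name and path are placeholders that mirror the code samples later in this post):

    android-app://com.example.android/http/example.com/gizmos

The package name (com.example.android), the scheme (http), and the host path (example.com/gizmos) together identify one specific piece of content inside the app.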

Once this URI is supported, a user can be sent to deep content in the app. These can be discovered as alternates to your desktop site’s URLs, similar to how separate-site mobile sites are defined as alternate URLs for the desktop site. In instances of proper context (on an Android device with the app installed), Google can direct a user to the app instead of the website.

Setting this up requires working with your app developer to implement changes inside the app as well as working with your website developers to add references on your desktop site.

Adding intent filters

Android has documented the technical setup of deep links in detail, but it starts with setting up intent filters in an app’s Android manifest file. This is done with the following code.

<activity android:name="com.example.android.GizmosActivity"
          android:label="@string/title_gizmos" >
    <intent-filter android:label="@string/filter_title_viewgizmos">
        <action android:name="android.intent.action.VIEW" />
        <data android:scheme="http"
              android:host="example.com"
              android:pathPrefix="/gizmos" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
    </intent-filter>
</activity>

This dictates the technical optimization of your app URIs for app indexation and defines the elements used in the URI example above.

  • The <intent-filter> element should be added for activities that should be launchable from search results.
  • The <action> element specifies the ACTION_VIEW intent action so that the intent filter can be reached from Google Search.
  • The <data> tag represents a URI format that resolves to the activity. At minimum, the <data> tag must include the android:scheme attribute.
  • Include the BROWSABLE category. The BROWSABLE category is required in order for the intent filter to be accessible from a web browser. Without it, clicking a link in a browser cannot resolve to your app. The DEFAULT category is optional, but recommended. Without this category, the activity can be started only with an explicit intent, using your app component name.

Testing deep links

Google has created tools to help test your deep link setup. You can use Google’s Deep Link Test Tool to test your app behavior with deep links on your phone. Additionally, you can create an HTML page with an intent:// link in it.

For example:

<a href="intent://example.com/page-1#Intent;scheme=http;package=com.example.android;end;">http://example.com/page-1</a>

This link would open up deep content inside the app from the HTML page.

App URI crawl and discovery

Once an app has deep link functionality, the next step is to ensure that Google can discover these URIs as part of its traditional desktop crawling.

Ways to get apps crawled

  1. Rel=”alternate” in HTML head
  2. ViewAction with Schema.org
  3. Rel=”alternate” in XML Sitemap

Implementing all three will create clear signals, but at minimum you should add the rel=”alternate” tag to the HTML head of your webpages.

Effectively, think of the app URI as being similar to a mobile site URL when setting up a separate-site mobile site for SEO. The mobile deep link is an alternative way to view a webpage on your site. You map a piece of content on your site to a corresponding piece of content inside the app.

Before you get started, be sure to verify your website and app following the guidelines here. This will verify your app in Google Play Developer Console and Google Webmaster Tools.

#1: Rel=”alternate” in HTML head

On an example page, such as example.com/page-1, you would add the following code to the head of the document. Again, very similar to separate-site mobile optimization.

<html>
<head>
    ...
    <link rel="alternate" href="android-app://com.example.android/http/example.com/page-1" />
    ...
</head>
<body>
</body>
</html>

#2: ViewAction with Schema.org

Additionally, you can reference the deep link using Schema.org and JSON-LD by using a ViewAction.

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebPage",
  "@id": "http://example.com/gizmos",
  "potentialAction": {
    "@type": "ViewAction",
    "target": "android-app://com.example.android/http/example.com/gizmos"
  }
}
</script>

#3: Rel=”alternate” in XML sitemap

Lastly, you can reference the alternate URL in your XML Sitemaps, similar to using the rel=”alternate” for mobile sites.

<?xml version="1.0" encoding="UTF-8" ?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">
    <url>
        <loc>http://example.com/page-1</loc>
        <xhtml:link rel="alternate" href="android-app://com.example.android/http/example.com/page-1" />
    </url>
    ...
</urlset>

Once these are in place, Google can discover the app URI and provide your app as an alternative way to view content found in search.

Bot control and robots noindex for apps

There may be instances where there is content within your app that you do not want indexed in Google. A good example of this might be content or functionality that is built out on your site, but has not yet been developed in your app. This would create an inferior experience for users. The good news is that we can block indexation with a few updates to the app.

First, add the following to your app resource directory (res/xml/noindex.xml).

<?xml version="1.0" encoding="utf-8"?>
<search-engine xmlns:android="http://schemas.android.com/apk/res/android">
    <noindex uri="http://example.com/gizmos/hidden_uri"/>
    <noindex uriPrefix="http://example.com/gizmos/hidden_prefix"/>
    <noindex uri="gizmos://hidden_path"/>
    <noindex uriPrefix="gizmos://hidden_prefix"/>
</search-engine>

As you can see above, you can block an individual URI or define a URI prefix to block entire folders.

Once this has been added, you need to update the AndroidManifest.xml file to denote that you’re using noindex.xml to block indexation.

<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.example.android.Gizmos">
    <application>
        <activity android:name="com.example.android.GizmosActivity" android:label="@string/title_gizmos">
            <intent-filter android:label="@string/filter_title_viewgizmos">
                <action android:name="android.intent.action.VIEW"/>
                ...
            </intent-filter>
        </activity>
        <meta-data android:name="search-engine" android:resource="@xml/noindex"/>
    </application>
    <uses-permission android:name="android.permission.INTERNET"/>
</manifest>

App indexing API to drive re-engagement

In addition to URI discovery via desktop crawl, your mobile app can integrate Google’s App Indexing API, which communicates with Google when users take actions inside your app. This sends information to Google about what users are viewing in the app. This is an additional method for deep link discovery and has some benefits.

The primary benefit is the ability to appear in autocomplete. This can drive re-engagement through Google Search query autocompletions, providing access to inner pages in apps.

App auto suggest

Again, be sure to verify your website and app following the guidelines here. This will verify your app in Google Play Developer Console and Google Webmaster Tools.

App actions with knowledge graph

The next, and most exciting, evolution of search is leveraging actions. These will be powerful when combined with voice search, allowing search engines to take action on behalf of users, turning spoken language into executed actions.

App indexing allows you to take advantage of actions by allowing Google to not only launch an app, but execute actions inside of the app. Order me a pizza? Schedule my meeting? Drive my car? Ok, Google.

App actions work via entity detection and the application of the knowledge graph, allowing search engines to understand actions, words, ideas and objects. With that understanding, they can build an action graph that allows them to define common actions by entity type.

Here is a list of actions currently supported by Schema.org.

For example, a ListenAction could be used to play a song in a music app. This can be achieved with the following markup.

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "MusicGroup",
  "name": "Weezer",
  "potentialAction": {
    "@type": "ListenAction",
    "target": "android-app://com.spotify.music/http/we.../listen"
  }
}
</script>

Once this is implemented, these app actions can begin to appear in search results and knowledge graph.

Deep links in app search results

Overview of mobile app search opportunities

In summary, there are five primary ways to increase visibility and engagement for your mobile app in traditional organic search efforts.

Mobile apps in search results

The growth of mobile search is transforming how we define technical SEO, moving beyond front-end and back-end optimization of websites into the realm of structured data and application development. As app indexing expands to include iOS, I suspect the possibilities and opportunities associated with indexing applications, and their corresponding actions, will grow extensively.

For those with Android apps, app indexing is a potential leapfrog-style opportunity to get ahead of competitors who are dominant in traditional desktop search. Those with iOS apps should start by optimizing their app listings, while preparing to implement indexation, as I suspect it’ll be released for iOS this year.

Have you been leveraging traditional organic search to drive visibility and engagement for apps? Share your experiences in the comments below.


For Writers Only: Secrets to Improving Engagement on Your Content Using Word Pictures (and I Don’t Mean Wordle)

Posted by Isla_McKetta

“Picture it.”

If you’re of a certain generation, those two words can only conjure images of tiny, white-haired Sophia from the Golden Girls about to tell one of her engaging (if somewhat long and irrelevant) stories as she holds her elderly roommates hostage in the kitchen or living room of their pastel-hued Miami home.

Even if you have no idea what I’m talking about, those words should become your writing mantra, because what readers do with your words is take all those letters and turn them into mind pictures. And as the writer, you have control over what those pictures look like and how long your readers mull them over.

According to Reading in the Brain by Stanislas Dehaene, reading involves a rich back and forth between the language areas and visual areas of our brains. Although the full extent of that connectivity is not yet known, it’s easy to imagine that the more sensory (interesting) information we can include in our writing, the more fully we can engage our readers.

So if you’re a writer or content marketer you should be harnessing the illustrative power of words to occupy your readers’ minds and keep them interested until they’re ready to convert. Here’s how to make your words work for you.

Kill clichés

I could have titled this piece “Painting a Picture with Words” but you’ve heard it. Over and over and over. And I’m going to propose that every time you use a cliché, a puppy dies. 

While that’s a bit extreme (at least I hope so, because that’s a lot of dead puppies, and Rocky’s having second thoughts about his choice of parents), I hope it will remind you to read over what you’ve written and see where you get bored. (I almost wrote “where your attention starts to wander,” but wandering attention = cliché = one more tragic, senseless death.) Chances are it’s right in the middle of a tired bit of language that used to be a wonderful word picture but has been used and abused to the point that we readers can’t even summon the image anymore.

Make up metaphors (and similes)

Did you know that most clichés used to be metaphors? And that we overused them because metaphors are possibly the most powerful tool we have at our disposal for creating word pictures (and, thus, engaging content)? You do now.

By making unexpected comparisons, metaphors and similes force words to perform like a stage mom on a reality show. These comparisons shake our brains awake and force us to pay attention. So apply a whip to your language. Make it dance like a ballerina in a little pink tutu. Give our brains something interesting to sink our teeth into (poor Rocky!), gnaw on, and share with our friends.

Engage the senses

If the goal of all this attention to language is to turn reading into a full brain experience, why not make it a little easier by including sensory information in whatever you’re writing? Here are a few examples:

  • These tickets are selling so fast we can smell the burning rubber.
  • Next to a crumbling cement pillar, our interview subject sits typing on his pristine MacBook Air.
  • In a never-ending horde of black and gray umbrellas (we almost said “a sea of”; yelp!), this red cowboy hat will show the world you own your look.
  • Black hat tactics left your SERPs stinking as bad as a garbage strike in late August? Let us help you clear the air by cleaning up those results.

See how those images and experiences continue to unfold and develop in your mind? You have the power to affect your readers the same way—to create an image so powerful it stays with them throughout their busy days. One note of caution, though: sensory information is so strong that you want to be careful when creating potentially negative associations (like that garbage strike stench in the final example).

Leverage superlatives (wisely) and ditch hyperbole

SUPERLATIVES ARE THE MOST EFFECTIVEST TOOL YOU CAN USE EVER (until you wear your reader out or lose their trust). Superlatives (think “best,” “worst,” “hairiest” – any form of the adjective or adverb that is the most exaggerated form of the word) are one of the main problems with clickbait headlines (the other being the failure to deliver on those huge promises).

Speaking of exaggeration, be careful with it in all of its forms. You don’t actually have to stop using it, but think of your reader’s credence in your copy as a grasshopper handed over by a child. They think it’s super special and they want you to as well. If you mistreat that grasshopper by piling exaggerated fact after exaggerated fact on top of it, the grasshopper will be crushed and your reader will not easily forgive you.

So how do you stand out in a crowded field of over-used superlatives and hyperbolic claims? Find the places your products honestly excel and tout those. At Moz we don’t have the largest link index in the world. Instead, we have a really high-quality link index. I could have obfuscated there and said we have “the best” link index, but by being specific about what we’re actually awesome at, we end up attracting customers who want better results instead of more results (and they’re happier for it).

Unearth the mystery

One of the keys to piquing your audience’s interest is to create or find the mystery in what you’re writing (I nearly said “tap into”; poor puppy!). I’m not saying your product description will suddenly feature PIs in fedoras (I can dream, though), but figure out what’s intriguing or new about what you’re talking about. Here are some examples:

  • Remember when shortcuts meant a few extra minutes to yourself after school? How will you spend the 15-30 minutes our email management system will save you? We won’t tell…
  • You don’t need to understand how this toilet saves water while flushing so quietly it won’t wake the baby; just enjoy a restful night’s sleep (and lower water bills).
  • Check out this interactive to see what makes our work boots more comfortable than all the rest.

Secrets, surprises, and inside information make readers hunger for more knowledge. Use that power to get your audience excited about the story you’re about to tell them.

Don’t forget the words around your imagery

Notice how some of these suggestions aren’t about the word picture itself but about the frame around the picture? I firmly believe that a reader comes to a post with a certain amount of energy. You can waste that energy by soothing them to sleep with boring imagery and clichés while they try to find something to be interested in. Or you can give them energy by giving them word pictures they can get excited about.

So picture it. You’ve captured your reader’s attention with imagery so engaging they’ll remember you after they put down their phone, read their social streams (again), and check their email. They’ll come back to your site to read your content again or to share that story they just can’t shake.

Good writing isn’t easy or fast, but it’s worth the time and effort.

Let me help you make word pictures

Editing writing to make it better is actually one of my great pleasures in life, so I’m going to make you an offer here. Leave a sentence or two in the comments that you’re having trouble activating, and I’ll see what I can do to offer you some suggestions. Pick a cliché you can’t get out of your head or a metaphor that needs a little refresh. Give me a little context for the best possible results.

I’ll do my best to help the first 50 questions or so (I have to stop somewhere or I’ll never write the next blog post in this series), so ask away. I promise no puppies will get hurt in the process. In fact, Rocky’s quite happy to be the poster boy for this post—it’s the first time we’ve let him have beach day, ferry day, and all the other spoilings all at once.


Google Webmaster Tools Just Got a Lot More Important for Link Discovery and Cleanup

Posted by RobertFisher

What if you owned a paid directory site and every day you received emails upon emails stating that someone wants links removed? As they stacked up in your inbox, whether they were pleasant or sternly demanding you cease and desist, would you just want to give up? What would you do to stop the barrage of emails if you thought the requests were just too overwhelming? How could you make it all go away, or at least the majority of it?

First, a bit of background

We had a new, important client come aboard on April 1, 2013 with a lot of work needed going forward. They had been losing rankings for some time and wanted help. With new clients, we want as much baseline data as possible so that we can measure progress going forward, so we do a lot of monitoring. On April 17th, one of our team members noticed something quite interesting. Using Ahrefs for link tracking, we saw there was a big spike in the number of external links coming to our new client’s site. 

When the client came on board two weeks prior, the site had about 5,500 links coming in, and many of those were less than quality. Likely half or more were comment links from sites with no relevance to the client, using the domain as the anchor text. Now, overnight, they were at 6,100 links, and the next day even more. Each day the links kept increasing. We saw they were coming from a paid directory called Netwerker.com. Within a month to six weeks, they were at over 30,000 new links from that site.

We sent a couple of emails asking that they please stop the linking, and we watched Google Webmaster Tools (GWT) every day like hawks waiting for the first link from Netwerker to show. The emails got no response, but in late May we saw the first links from there show up in GWT and we submitted a domain disavow immediately.

We launched their new site in late June and watched as they climbed in the rankings; that is a great feeling. Because the site was rising in the rankings rather well, we assumed the disavow tool had worked on Netwerker. Unfortunately, there was a cloud on the horizon concerning all of the link building that had been done for the client prior to our engagement. October arrived with a Penguin attack (Penguin 2.1, Oct. 4, 2013) and they fell considerably in the SERPs. I mean, they disappeared for many of the best terms they had again begun to rank for. They had fallen to page five or deeper for key terms. (NOTE: This was all algorithmic; they had no manual penalty.)

While telling the client that their new drop was a Penguin issue related to the October Penguin update (and the large ratio of really bad links), we also looked for anything else that would cause the issue or might be affecting the results. We are constantly monitoring and changing things with our clients. As a result, there are times we do not make a good change and we have to move things back. (We always tell the client if we have caused a negative impact on their rankings, etc. This is one of the most important things we ever do in building trust over time and we have never lost a client because we made a mistake.) We went through everything thoroughly and eliminated any other potential causative factors. At every turn there was a Penguin staring back at us!

When we launched the new site in late June 2013, we had seen them rise back to page one for key terms in a competitive vertical. Now, they were missing for the majority of their most important terms. In mid-March of 2014, nearly a year after engagement, they agreed to do a severe link cleanup and we began immediately. There would be roughly 45,000 – 50,000 links to clean up, but with 30,000 from the one domain already appropriately disavowed, it was a bit less daunting. I have to say here that I believe their reluctance to do the link cleanup was due to really bad SEO in the past. Over time they had used several SEO people/firms, and at every turn they were given poor advice. I believe they were misinformed into believing that high rankings were easy to get and that there were “tricks” that would fool Google so you could pull it off. So, it really isn’t a client’s fault when they believe things are easy in the world of SEO.

Finally, it begins to be fun

About two weeks in, we saw them start to pop up randomly in the rankings. We were regularly getting responses back from linking sites. Some responses were positive and some were requests for money to remove the links; the majority gave us the famous “no reply.” But, we were making progress and beginning to see a result. Around the first or second week of April their most precious term, geo location + product/service, was ranked number one and their rich snippets were beautiful. It came and went over the next week or two, staying longer each time.

To track links we use MajesticSEO, Ahrefs, Open Site Explorer, and Google Webmaster Tools. As the project progressed, our Director of Content and Media, who was overseeing the project, could not understand why so many links were falling off so quickly. Frankly, not that many sites were agreeing to remove them.

Here is a screenshot of the lost links from Ahrefs.

Here are the lost links in MajesticSEO.

MajesticSEO Lost Links March to May

We were seeing links fall off as if the wording we had used in our emails to the sites was magical. This caused a bit of skepticism on our team’s part so they began to dig deeper. It took little time to realize the majority of the links that were falling off were from Netwerker! (Remember, a disavow does not keep the links from showing in the link research tools.) Were they suddenly good guys and willing to clear it all up? Had our changed wording caused a change of heart? No, the links from Netwerker still showed in GWT; Webmaster Tools had never shown all from Netwerker, only about 13,000, and it was still showing 13,000. But, was that just because Google was slower at showing the change? To check we did a couple of things. First, we just tried out the links that were “lost” and we saw they still resolved to the site, so we dug some more.

Using a bit of magic in the form of a User-Agent Switcher extension and eSolutions’ What’s my info? (to verify the correct user-agent was being presented), our head of development ran the user-agent string for Ahrefs and MajesticSEO. What he found was that Netwerker was now starting to block MajesticSEO and Ahrefs via a 406 response. We were unable to check Removeem, but the site was not yet blocking OSE. Here are some screenshots to show the results we are seeing. Notice in the first screenshot, all is well with Googlebot.


But A Different Story for Ahrefs


And a Different Story for MajesticSEO

We alerted both Ahrefs and MajesticSEO; neither responded beyond a canned “we will look into it” reply. We thought it important to let those dealing with link removal know to look even more carefully. Now, in August, three months on, both still maintain that original response.

User-agents and how to run these tests

The user-agent or user-agent string is sent to the server along with any request. This allows the server to determine the best response to deliver based on conditions set up by its developers. It appears in the case of Netwerker’s servers that the response is to deny access to certain user-agents.

  1. We used the User-Agent Switcher extension for Chrome.
  2. Next, determine the user-agent string you would like to check. (These can be found on various sites; one set of examples is at http://www.useragentstring.com/. In most cases, the owner of the crawler or browser will have a webpage associated with it, for example the Ahrefs bot.)
  3. Within the User-Agent Switcher extension, open the options panel and add the new user-agent string.
  4. Browse to the site you would like to check.
  5. Using the User-Agent Switcher, select the agent you would like to view the site as; it will reload the page and you will be viewing it as the new user-agent string.
  6. We used eSolutions’ What’s my info? to verify that the User-Agent Switcher was presenting the correct data to us.
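
If you prefer to script this check rather than click through a browser extension, the same test can be sketched in a few lines of Python. This is a hedged illustration rather than part of our workflow; example.com stands in for the site you are testing, and the user-agent strings are illustrative, so copy current ones from each bot’s documentation:

import requests

# Illustrative user-agent strings; look up the real, current ones.
AGENTS = {
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "AhrefsBot": "Mozilla/5.0 (compatible; AhrefsBot/5.0; +http://ahrefs.com/robot/)",
    "MJ12bot": "Mozilla/5.0 (compatible; MJ12bot/v1.4.5; http://www.majestic12.co.uk/bot.php?+)",
}

for name, ua in AGENTS.items():
    # A site that blocks a crawler may answer 406 here while Googlebot gets 200.
    resp = requests.get("http://example.com/", headers={"User-Agent": ua}, allow_redirects=False)
    print(name, resp.status_code)

A 200 for Googlebot alongside a 406 for the link tools’ bots would reproduce exactly the blocking behavior described above.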

A final summary

If you talk with anyone who is known for link removal (think people like Ryan Kent of Vitopian, an expert in link cleanup), they will tell you to use every link report you can get your hands on to ensure you miss nothing, and they always include Google Webmaster Tools as an important tool. Personally, while we always use GWT, early on I did not think it was important for much beyond checking whether we had missed anything, since it consistently shows fewer links than the other tools, and the links showing in GWT usually show in the other tools as well. My opinion has changed with this revelation.

Given we gather data on clients early on, we had something to refer back to with the link clean-up; today if someone comes in and we have no history of their links, we must assume they will have links from sites blocking major link discovery tools and we have a heightened sense of caution. We will not believe we have cleaned everything ever again; we can believe we cleaned everything in GWT.

If various directories and other sites with a lot of outbound links start blocking link discovery tools because they “just don’t want to hear any more removal requests,” GWT just became your most important tool for catching the ones that block the tools. They would not want to block Google or Bing, for obvious reasons.

So, as you go forward and you look at links with your own site and/or with clients, I suggest that you go to GWT to make sure there is not something showing there which fails to show in the well-known link discovery tools.


Announcing the All-New Beginner’s Guide to Link Building

Posted by Trevor-Klein

It is my great pleasure to announce the release of Moz’s third guide for marketers, written by the inimitable Paddy Moogan of Distilled:

We could tell you all about how high-quality, authoritative links pointing to your site benefit your standing in the SERPs, but instead we’ll just copy the words straight from the proverbial horse’s mouth:

“Backlinks, even though there’s some noise and certainly a lot of spam, for the most part are still a really, really big win in terms of quality for search results.”
— Matt Cutts, head of the webspam team at Google, 2/19/14

Link building is one area of SEO that has changed significantly over the last several years; some tactics that were once effective are now easily identifiable and penalized by Google. At the same time, earning links remains vital to success in search marketing: Link authority features showed the strongest correlation with higher rankings in our 2013 ranking factors survey. For that reason, it has never been more important for marketers to truly earn their links, and this guide will have you building effective campaigns in no time.


What you’ll learn


1. What is Link Building, and Why Is It Important?


This is where it all begins. If you’re brand new to link building and aren’t sure whether or not it’s a good tactic to include in your marketing repertoire, give this chapter a look. Even the more seasoned link earners among us could use a refresher from time to time, and here we cover everything from what links mean to search engines to the various ways they can help your business’s bottom line.


2. Types of Links (Both Good and Bad)

Before you dive into building links of your own, it’s important to understand the three main types of links and why you should really only be thinking about two of them. That’s what this short and sweet chapter is all about.


3. How to Start a Link Building Campaign

Okay, enough with the theory; it’s time for the nitty-gritty. This chapter takes a deep dive into every step of a link building campaign, offering examples and templates you can use to build your own foundation. 


4. Link Building Tactics

Whether through ego bait or guest blogging (yes, that’s still a viable tactic!), there are several approaches you can take to building a strong link profile. This chapter takes a detailed run through the tactics you’re most likely to employ.


5. Link Building Metrics

Now that the links are rolling in, how do you prove to yourself and your clients that the work is paying off? The metrics outlined in this chapter, along with the tools recommended to measure them, offer a number of options for your reports.


6. The Good, the Bad, and the Ugly of Link Building

If we’re preaching to the choir with this chapter, then we’re thrilled, because spammy links can lead to severe penalties. Google has gotten incredibly good at picking out and penalizing spammy link building techniques, and if this chapter isn’t enough to make you put your white hat on, nothing is.


7. Advanced Link Building Tips and Tricks

Mastered the rest of what the guide has to offer? Earning links faster than John Paulson earns cash? Here are a few tips to take your link building to the next level. Caution: You may or may not find yourself throwing fireballs after mastering these techniques.


The PDF

When we released the Beginner’s Guide to Social Media, there was an instant demand for a downloadable PDF version. This time, it’s ready from the get-go (big thanks to David O’Hara!).

Click here to download the PDF.

Thanks

We simply can’t thank Paddy Moogan enough for writing this guide. His expertise and wisdom made the project possible. Thanks as well to Ashley Tate for wrangling the early stages of the project, Cyrus Shepard for his expert review and a few key additions, Derric Wise and David O’Hara for bringing it to life with their art, and Andrew Palmer for seamlessly translating everything onto the web.

Now, go forth and earn those links!


Experiment: We Removed a Major Website from Google Search, for Science!

Posted by Cyrus-Shepard

The folks at Groupon surprised us earlier this summer when they reported the results of an experiment that showed that up to 60% of direct traffic is organic.

In order to accomplish this, Groupon de-indexed their site, effectively removing themselves from Google search results. That’s crazy talk!

Of course, we knew we had to try this ourselves.

We rolled up our sleeves and chose to de-index Followerwonk, both for its consistent Google traffic and its good analytics setup—that way we could properly measure everything. We were also confident we could quickly bring the site back into Google’s results, which minimized the business risks.

(We discussed de-indexing our main site moz.com, but… no soup for you!)

We wanted to measure and test several things:

  1. How quickly will Google remove a site from its index?
  2. How much of our organic traffic is actually attributed as direct traffic?
  3. How quickly can you bring a site back into search results using the URL removal tool?

Here’s what happened.

How to completely remove a site from Google

The fastest, simplest, and most direct method to completely remove an entire site from Google search results is by using the URL removal tool.

We also understood, via statements from Google engineers, that using this method gave us the biggest chance of bringing the site back, with little risk. Other methods of de-indexing, such as using meta robots NOINDEX, might have taken weeks and caused recovery to take months.
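
For reference, the meta robots approach mentioned above is the standard noindex tag placed in a page’s head (shown here only for contrast; it is not the method we used):

<meta name="robots" content="noindex">

Because it only takes effect as Googlebot recrawls each page, de-indexing a whole site this way is slow, which is exactly why the URL removal tool was the better fit for this experiment.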

CAUTION: Removing any URLs from a search index is potentially very dangerous, and should be taken very seriously. Do not try this at home; you will not pass go, and will not collect $200!

After submitting the request, Followerwonk URLs started disappearing from Google search results in 2-3 hours.

The information needs to propagate across different data centers around the globe, so the effect can be delayed in some areas. In fact, for the entire duration of the test, organic Google traffic continued to trickle in and never dropped to zero.

The effect on direct vs. organic traffic

In the Groupon experiment, they found that when they lost organic traffic, they actually lost a bunch of direct traffic as well. The Groupon conclusion was that a large amount of their direct traffic was actually organic—up to 60% on “long URLs”.

At first glance, the overall amount of direct traffic to Followerwonk didn’t change significantly, even when organic traffic dropped.

In fact, we could find no discrepancy in direct traffic outside the expected range.

I ran this by our contacts at Groupon, who said this wasn’t totally unexpected. You see, in their experiment they saw the biggest drop in direct traffic on long URLs, defined as a URL that is at least long enough to be in a subfolder, like https://followerwonk.com/bio/?q=content+marketer.

For Followerwonk, the vast majority of traffic goes to the homepage and a handful of other URLs. This means we didn’t have a statistically significant sample size of long URLs to judge the effect. For the long URLs we were able to measure, the results were nebulous. 

Conclusion: While we can’t confirm the Groupon results with our outcome, we can’t discount them either.

It’s quite likely that a portion of your organic traffic is attributed as direct. This is because different browsers, operating systems, and user privacy settings can potentially block referral information from reaching your website.

Bringing your site back from death

After waiting 2 hours, we deleted the request. Within a few hours all traffic returned to normal. Whew!

Does Google need to recrawl the pages?

If the time period is short enough, and you used the URL removal tool, apparently not.

In the case of Followerwonk, Google removed over 300,000 URLs from its search results, and made them all reappear in mere hours. This suggests that the domain wasn’t completely removed from Google’s index, but only “masked” from appearing for a short period of time.

What about longer periods of de-indexation?

In both the Groupon and Followerwonk experiments, the sites were only de-indexed for a short period of time, and bounced back quickly.

We wanted to find out what would happen if you de-indexed a site for a longer period, like two and a half days.

I couldn’t convince the team to remove any of our sites from Google search results for a few days, so I chose a smaller personal site that I often subject to merciless SEO experiments.

In this case, I de-indexed the site and didn’t remove the request until three days later. Even with this longer period, all URLs returned within just a few hours of cancelling the URL removal request.

In the chart below, we revoked the URL removal request on Friday the 25th. The next two days were Saturday and Sunday, both lower traffic days.

Test #2: De-index a personal site for 3 days

Likely, the URLs were still in Google’s index, so we didn’t have to wait for them to be recrawled. 

Here’s another shot of organic traffic before and after the second experiment.

For longer removal periods, a few weeks for example, I speculate Google might drop these semi-permanently from the index, and re-inclusion would take a much longer time.

What we learned

  1. While a portion of your organic traffic may be attributed as direct (due to browsers, privacy settings, etc) in our case the effect on direct traffic was negligible.
  2. If you accidentally de-index your site using Google Webmaster Tools, in most cases you can quickly bring it back to life by deleting the request.
  3. Reinclusion happened quickly even after we removed a site for over two days. Longer than this, the result is unknown, and you could have problems getting all the pages of your site indexed again.

Further reading

Moz community member Adina Toma wrote an excellent YouMoz post on the re-inclusion process using the same technique, with some excellent tips for other, more extreme situations.

Big thanks to Peter Bray for volunteering Followerwonk for testing. You are a brave man!
