Everything You Need to Know About Mobile App Search

Posted by Justin_Briggs

Mobile isn’t the future. It’s the present. Mobile apps are not only changing how we interact with devices and websites, they’re changing the way we search. Companies are creating meaningful experiences on mobile-friendly websites and apps, which in turn create new opportunities to get in front of users.

I’d like to explore the growth of mobile app search and its current opportunities to gain visibility and drive engagement.

Rise of mobile app search

The growth of mobile device usage has driven a significant lift in app-related searches. This is giving rise to mobile app search as a vertical within traditional universal search.

While it has been clear for some time that mobile search is important, that importance has been more heavily emphasized by Google recently, as they continue to push mobile-friendly labels in SERPs, and are likely increasing mobile-friendliness’s weight as a ranking factor.

The future of search marketing involves mobile, and it will not be limited to optimizing HTML webpages, creating responsive designs, and optimizing UX. Mobile SEO is a world where apps, knowledge graph, and conversational search are front and center.

For the top 10 leading properties online, 34% of visitors are mobile-only (comScore data), and, anecdotally, we’re seeing similar numbers with our clients, if not more.

Mobile device and app growth

It’s also worth noting that 72% of mobile engagement relies on apps rather than browsers. Looking at teen usage, apps are increasingly dominant. Additionally, 55% of teens use voice search more than once per day.

If you haven’t read it, grab some coffee and read A Teenager’s View on Social Media, written by a 19-year-old who gives his perspective on online behavior. Reading between the lines shows a number of subtle shifts in behavior. I noticed that every time I expected him to say “website,” he said “application.” In fact, he used “application” 15 times, and it is the primary way he describes social networks.

This means that one of the fastest-growing segments of mobile users cannot be marketed to by optimizing HTML webpages alone, requiring search marketers to expand their skills into app optimization.

The mobile app pack

This shift is giving rise to the mobile app pack and app search results, which are triggered on searches from mobile devices in instances of high mobile app intent. Think of these as being similar to local search results. Considering mobile searcher behavior, these listings dominate user attention.

Mobile app search results and mobile app pack

As with local search, mobile app search can reorder traditional results, completely push them down, or integrate app listings with traditional web results.

You can test on your desktop using a user-agent switcher, or by searching on your iOS or Android device.

There are slight differences between iPhone and Android mobile app results:

iOS and Android mobile search result listing

From what I’ve seen, mobile app listings trigger more frequently, and with more results, on Android search results when compared to iOS. Additionally, iOS mobile app listings are represented as a traditional website result listing, while mobile app listings on Android are more integrated.

Some of the differences also come from the differences in app submission guidelines on the two major stores, the Apple App Store and Google Play.

Overview of differences in mobile app results

  1. Title – Google uses the app listing page’s HTML title (which is the app’s title). iOS app titles can exceed 55-62 characters, which causes wrapping and title truncation like a traditional result. Android app title requirements are shorter, so titles are typically shorter on Android mobile app listings.
  2. URL – iOS mobile app listings display the iTunes URL to the App Store as part of the search result.
  3. Icon – iOS icons are square and Android icons have rounded corners.
  4. Design – Android results stand out more, with an “Apps” headline above the pack and a link to Google Play at the end.
  5. App store content – The other differences show up in the copy, ratings, and reviews on each app store.

Ranking in mobile app search results

Ranking in mobile app search results is a combination of App Store Optimization (ASO) and traditional SEO. The on-page factors are dependent upon your app listing, so optimization starts with having solid ASO. If you’re not familiar with ASO, it’s the process of optimizing your app listing for internal app store search.

Basics of ASO

Ranking in the Apple App Store and in Google Play is driven by two primary factors: keyword alignment and app performance. Text fields in the app store listing, such as title, description, and keyword list, align the app with a particular set of keywords. Performance metrics including download velocity, app ratings, and reviews determine how well the app will rank for each of those keywords. (Additionally, the Google Play algorithm may include external, web-based performance metrics like citations and links as ranking factors.)
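
To make that interplay concrete, here is a purely illustrative toy model in JavaScript; it is not either store’s actual algorithm, and every weight and signal in it is invented:

// Toy model: keyword alignment gates the score, performance scales it.
// All weights and signals here are invented for illustration only.
function toyRankScore(app, keyword) {
  const listingText = (app.title + " " + app.description).toLowerCase();
  const alignment = listingText.includes(keyword.toLowerCase()) ? 1 : 0;
  const performance =
    0.5 * Math.log1p(app.downloadVelocity) + // recent installs
    0.3 * app.avgRating +                    // 0-5 stars
    0.2 * Math.log1p(app.ratingCount);       // volume of ratings
  return alignment * performance;
}

console.log(toyRankScore(
  { title: "Gizmo Tracker", description: "Track all your gizmos",
    downloadVelocity: 1200, avgRating: 4.4, ratingCount: 5300 },
  "gizmo"
)); // higher is better for the keyword "gizmo"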

App store ranking factors

Mobile app listing optimization

While I won’t explore ASO in-depth here, as it’s very similar to traditional SEO, optimizing app listings is primarily a function of keyword targeting.

Tools like Sensor Tower, MobileDevHQ, and App Annie can help you with mobile app keyword research. However, keep in mind that mobile app search listings show up in universal search, so it’s important to leverage traditional keyword research tools like the AdWords Tool or Google Trends.

While there are similarities with ASO, optimizing for these mobile app search listings on the web has some slight differences.

Differences between ASO & mobile app SEO targeting

  1. Titles – While the Apple App Store allows relatively long titles, they are truncated to the preview length in organic search. Titles should be optimized with Google search in mind, in addition to optimizing for the app store. Additionally, many apps aggressively target keywords in their app title, but caution should be used, as keyword spamming could hurt the app’s performance in Google.
  2. Description – The app description on the App Store may not be a factor in internal search, but it will impact external app search results. Leverage keyword targeting best practices when writing your iOS app description, as well as your Android app description.
  3. Device and platform keywords – When targeting app store search, it is not as important to target terms related to the OS or device. However, these terms can help visibility in external search. Include device and OS terms, such as Android, Samsung Note, iOS, iPad, and iPhone.

App performance optimization

Outside of content optimization, Google looks at the performance of the app. On the Android side, they have access to the data, but for iOS they have to rely on publicly available information.

App performance factors

  • Number of ratings
  • Average rating score
  • Content and sentiment analysis of reviews
  • Downloads / installs
  • Engagement and retention
  • Internal links on app store

For iOS, the primary public metrics are ratings and reviews. However, app performance can be inferred using the App Store’s ranking charts and search results, which can be leveraged as proxies of these performance metrics.
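
If you want to monitor those public iOS metrics programmatically, Apple’s iTunes Lookup API returns an app’s rating data as JSON. A minimal sketch (the app ID below is a placeholder, and field availability can vary by app and country):

// Pull public rating metrics for an iOS app from the iTunes Lookup API.
// Replace APP_ID with the numeric ID from your app's App Store URL.
const APP_ID = 123456789; // placeholder

fetch(`https://itunes.apple.com/lookup?id=${APP_ID}&country=us`)
  .then((res) => res.json())
  .then((data) => {
    const app = data.results[0];
    console.log(app.trackName);
    console.log("Average rating:", app.averageUserRating);
    console.log("Rating count:", app.userRatingCount);
  })
  .catch((err) => console.error("Lookup failed:", err));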


The following objectives will have the greatest influence on your mobile app search ranking:

  1. Increase your average rating number
  2. Increase your number of ratings
  3. Increase downloads

For app ratings and reviews, leverage platforms like Apptentive to improve your ratings. They are very effective at driving positive ratings. Additionally, paid tactics are a great way to drive install volume and are one area where paid budget capacity could directly influence organic results in Google. Anecdotally, both app stores use rating numbers (typically above or below 4 stars) to make decisions around promoting an app, either through merchandising spots or co-branded campaigns. I suspect this is being used as a general cut-off for what is displayed in universal results. Increasing your rating above 4 stars should improve the likelihood you’ll appear in mobile app search results.

Lastly, think of merchandising and rankings in terms of internal linking structures. The more visible you are inside of the app store, the more visibility you have in external search.

App web performance optimization

Finally, we’re talking about Google rankings, so factors like links, citations, and social shares matter. You should be conducting content marketing, PR, and outreach for your app. Focus on merchandising your app on your own site, as well as increasing coverage of your app (linking to the app store page). The basics of link optimization apply here.

App indexation – drive app engagement

Application search is not limited to driving installs via app search results. With app indexing, you can leverage your desktop/mobile website visibility in organic search to drive engagement with those who have your app installed. Google can discover and expose content deep inside your app directly in search results. This means that when a user clicks on your website in organic search, it can open your app directly, taking them to that exact piece of content in your app, instead of opening your website.

App indexation fundamentally changes technical SEO, extending SEO from server and webpage setup to the setup and optimization of applications.

App indexation on Google

This also fundamentally changes search. Your most avid and engaged user may choose to no longer visit your website. For example, on my Note 4, when I click a link to a site of a brand that I have an app installed for, Google gives me the option not only to open in the app, but to set opening the app as a default behavior.

If a user chooses to open your site in your app, they may never visit your site from organic search again.

App indexation is currently limited to Android devices, but there is evidence to suggest that it’s already in the works and soon to be released on iOS devices. There have been hints for some time, but markup is showing up in the wild suggesting that Google is actively working with Apple and select brands to develop iOS app indexing.

URI optimization for apps

The first step in creating an indexable app is to set up your app to support deep links. Deep links are URIs that are understood by your app and will open up a specific piece of content. They are effectively URLs for applications.

Once this URI is supported, a user can be sent to deep content in the app. These can be discovered as alternates to your desktop site’s URLs, similar to how separate-site mobile sites are defined as alternate URLs for the desktop site. In instances of proper context (on an Android device with the app installed), Google can direct a user to the app instead of the website.
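
Because the android-app:// URI format simply recombines the package name with the page URL’s scheme, host, and path, the mapping can be generated rather than maintained by hand. A small JavaScript sketch, assuming (as in the examples below) that your app’s deep links mirror your site’s URL structure exactly:

// Map a website URL to its android-app:// deep link URI.
// Assumes the app's deep links mirror the site's URL paths.
function toAppUri(pageUrl, packageName) {
  const u = new URL(pageUrl);
  const scheme = u.protocol.replace(":", ""); // "http:" -> "http"
  // Format: android-app://{package}/{scheme}/{host}{path}
  return `android-app://${packageName}/${scheme}/${u.host}${u.pathname}`;
}

console.log(toAppUri("http://example.com/page-1", "com.example.android"));
// -> android-app://com.example.android/http/example.com/page-1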

Setting this up requires working with your app developer to implement changes inside the app as well as working with your website developers to add references on your desktop site.

Adding intent filters

Android has documented the technical setup of deep links in detail, but it starts with setting up intent filters in an app’s Android manifest file. This is done with the following code.

<activity android:name="com.example.android.GizmosActivity"
          android:label="@string/title_gizmos" >
    <intent-filter android:label="@string/filter_title_viewgizmos">
        <action android:name="android.intent.action.VIEW" />
        <data android:scheme="http"
              android:host="example.com"
              android:pathPrefix="/gizmos" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
    </intent-filter>
</activity>

This configuration dictates the technical optimization of your app URIs for app indexation; the elements used in the example above are described below.

  • The <intent-filter> element should be added for activities that should be launchable from search results.
  • The <action> element specifies the ACTION_VIEW intent action so that the intent filter can be reached from Google Search.
  • The <data> tag represents a URI format that resolves to the activity. At minimum, the <data> tag must include the android:scheme attribute.
  • Include the BROWSABLE category. The BROWSABLE category is required in order for the intent filter to be accessible from a web browser. Without it, clicking a link in a browser cannot resolve to your app. The DEFAULT category is optional, but recommended. Without this category, the activity can be started only with an explicit intent, using your app component name.

Testing deep links

Google has created tools to help test your deep link setup. You can use Google’s Deep Link Test Tool to test your app behavior with deep links on your phone. Additionally, you can create an HTML page with an intent:// link in it.

For example:

<a href="intent://example.com/page-1#Intent;scheme=http;package=com.example.android;end;"> <a href="http://example.com/page-1">http://example.com/page-1></a>

This link would open up deep content inside the app from the HTML page.

App URI crawl and discovery

Once an app has deep link functionality, the next step is to ensure that Google can discover these URIs as part of its traditional desktop crawling.

Ways to get apps crawled

  1. Rel=”alternate” in HTML head
  2. ViewAction with Schema.org
  3. Rel=”alternate” in XML Sitemap

Implementing all three will create clear signals, but at minimum you should add the rel=”alternate” tag to the HTML head of your webpages.

Effectively, think of the app URI as being similar to a mobile site URL when setting up a separate-site mobile site for SEO. The mobile deep link is an alternative way to view a webpage on your site. You map a piece of content on your site to a corresponding piece of content inside the app.

Before you get started, be sure to verify your website and app following the guidelines here. This will verify your app in Google Play Developer Console and Google Webmaster Tools.

#1: Rel=”alternate” in HTML head

On an example page, such as example.com/page-1, you would add the following code to the head of the document. Again, very similar to separate-site mobile optimization.

<html>
<head>
  ...
  <link rel="alternate" href="android-app://com.example.android/http/example.com/page-1" />
  ...
</head>
<body>
</body>
</html>

#2: ViewAction with Schema.org

Additionally, you can reference the deep link using Schema.org and JSON-LD by using a ViewAction.

<script type="application/ld+json"> 
{ 
"@context": "http://schema.org", 
"@type": "WebPage", 
"@id": "http://example.com/gizmos", 
"potentialAction": { 
"@type": "ViewAction", 
"target": "android-app://com.example.android/http/example.com/gizmos" 
} 
} 
</script>
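
If you have many pages to annotate, this markup can be generated from a URL list rather than hand-edited. A minimal JavaScript sketch reusing the hypothetical package name from the examples above:

// Generate ViewAction JSON-LD for a page and its app deep link.
const PACKAGE = "com.example.android";

function viewActionJsonLd(pageUrl) {
  const u = new URL(pageUrl);
  const markup = {
    "@context": "http://schema.org",
    "@type": "WebPage",
    "@id": pageUrl,
    "potentialAction": {
      "@type": "ViewAction",
      "target": `android-app://${PACKAGE}/${u.protocol.replace(":", "")}/${u.host}${u.pathname}`
    }
  };
  return '<script type="application/ld+json">\n' +
         JSON.stringify(markup, null, 2) +
         '\n</scr' + 'ipt>'; // split so the string is safe to inline in HTML
}

console.log(viewActionJsonLd("http://example.com/gizmos"));
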
#3: Rel=”alternate” in XML sitemap

Lastly, you can reference the alternate URL in your XML Sitemaps, similar to using the rel=”alternate” for mobile sites.

<?xml version="1.0" encoding="UTF-8" ?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"> 
<url> 
<loc>http://example.com/page-1</loc> 
<xhtml:link rel="alternate" href="android-app://com.example.android/http/example.com/page-1" /> 
</url> 
... 
</urlset>
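
As with the ViewAction markup, these sitemap entries can be generated from your URL list. A short sketch using the same hypothetical package name:

// Emit sitemap <url> entries with android-app:// alternate links.
const PKG = "com.example.android";
const pages = ["http://example.com/page-1", "http://example.com/page-2"];

const entries = pages.map((page) => {
  const u = new URL(page);
  const appUri = `android-app://${PKG}/${u.protocol.replace(":", "")}/${u.host}${u.pathname}`;
  return `  <url>\n    <loc>${page}</loc>\n    <xhtml:link rel="alternate" href="${appUri}" />\n  </url>`;
});

console.log(
  '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n' +
  '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n' +
  entries.join("\n") + "\n</urlset>"
);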

Once these are in place, Google can discover the app URI and provide your app as an alternative way to view content found in search.

Bot control and robots noindex for apps

There may be instances where there is content within your app that you do not want indexed in Google. A good example of this might be content or functionality that is built out on your site, but has not yet been developed in your app. This would create an inferior experience for users. The good news is that we can block indexation with a few updates to the app.

First, add the following to your app resource directory (res/xml/noindex.xml).

<?xml version="1.0" encoding="utf-8"?> 
<search-engine xmlns:android="http://schemas.android.com/apk/res/android"> 
<noindex uri="http://example.com/gizmos/hidden_uri"/> 
<noindex uriPrefix="http://example.com/gizmos/hidden_prefix"/> 
<noindex uri="gizmos://hidden_path"/> 
<noindex uriPrefix="gizmos://hidden_prefix"/> 
</search-engine>

As you can see above, you can block an individual URI or define a URI prefix to block entire folders.

Once this has been added, you need to update the AndroidManifest.xml file to denote that you’re using noindex.xml to block indexation.

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.android.Gizmos">
  <application>
    <activity android:name="com.example.android.GizmosActivity"
              android:label="@string/title_gizmos" >
      <intent-filter android:label="@string/filter_title_viewgizmos">
        <action android:name="android.intent.action.VIEW"/>
        ...
      </intent-filter>
    </activity>
    <meta-data android:name="search-engine" android:resource="@xml/noindex"/>
  </application>
  <uses-permission android:name="android.permission.INTERNET"/>
</manifest>

App indexing API to drive re-engagement

In addition to URI discovery via desktop crawl, your mobile app can integrate Google’s App Indexing API, which communicates with Google when users take actions inside your app. This sends information to Google about what users are viewing in the app. This is an additional method for deep link discovery and has some benefits.

The primary benefit is the ability to appear in autocomplete. This can drive re-engagement through Google Search query autocompletions, providing access to inner pages in apps.

App auto suggest

Again, be sure to verify your website and app following the guidelines here. This will verify your app in Google Play Developer Console and Google Webmaster Tools.

App actions with knowledge graph

The next, and most exciting, evolution of search is leveraging actions. These will be powerful when combined with voice search, allowing search engines to take action on behalf of users, turning spoken language into executed actions.

App indexing allows you to take advantage of actions by allowing Google to not only launch an app, but execute actions inside of the app. Order me a pizza? Schedule my meeting? Drive my car? Ok, Google.

App actions work via entity detection and the application of the knowledge graph, allowing search engines to understand actions, words, ideas and objects. With that understanding, they can build an action graph that allows them to define common actions by entity type.

Here is a list of actions currently supported by Schema.org.

For example, a ListenAction could be used to play a song in a music app. This can be achieved with the following markup.

<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "MusicGroup",
"name": "Weezer", "potentialAction": {
"@type": "ListenAction",
"target": "android-app://com.spotify.music/http/we.../listen"
}
}
</script>
Once this is implemented, these app actions can begin to appear in search results and knowledge graph.

Deep links in app search results

Overview of mobile app search opportunities

In summary, there are five primary ways to increase visibility and engagement for your mobile app in traditional organic search efforts.

Mobile apps in search results

The growth of mobile search is transforming how we define technical SEO, moving beyond front-end and back-end optimization of websites into the realm of structured data and application development. As app indexing expands to include iOS, I suspect the possibilities and opportunities associated with indexing applications, and their corresponding actions, will grow extensively.

For those with Android apps, app indexing is a potential leapfrog-style opportunity to get ahead of competitors who are dominant in traditional desktop search. Those with iOS apps should start by optimizing their app listings while preparing to implement indexation, as I suspect it’ll be released for iOS this year.

Have you been leveraging traditional organic search to drive visibility and engagement for apps? Share your experiences in the comments below.


A Universal SEO Strategy Audit in 5 Steps – Whiteboard Friday

Posted by randfish

When it comes to building an SEO strategy, many marketers (especially those who don’t spend a significant amount of time with SEO) start off by asking a few key questions. That’s a good start, but only if you’re asking the right questions. In today’s Whiteboard Friday, Rand puts the usual suspects on the chopping block, showing us the five things we should really be looking into when formulating our SEO strategy.

For reference, here’s a still of this week’s whiteboard!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about building an SEO strategy and having a universal set of five questions that can get you there.

So number one: What keywords do you want to rank for?

Number two: How do we get links?

Number three: Site speed. Mobile? Doesn’t even seem like a question.

Number four: What about Penguin and Panda?

Number five: When do I get money?

This is bologna. That’s not a strategy. Some of those point to tactics you might invest in as an SEO, but this is not an SEO strategy. Unfortunately, this is how a lot of conversations about SEO start with teams, with CMOs, with managers, with CEOs, with clients or potential clients, and it’s very frustrating because you can’t truly do a great job with SEO at just the tactical level. If you don’t start with a compelling strategy, doing all of these things will only produce a small amount of potential return compared to what you can achieve if you ask the right questions and get your strategy set before you begin an SEO process and nail down your tactics.

So that’s what I want to go through. I spend a lot of time thinking through these things and analyzing a lot of posts that other people have put up and questions that folks have put in our Q&A system and others, on Quora and other places. I think actually every great SEO strategy that I have ever seen can be distilled down to answers that come from these five questions.

So number one: What does our organization create that helps solve searchers’ questions or problems? That could be, “Or what will we create in the future?” It might be that you haven’t yet created the thing or things that’s going to help solve searchers’ questions or problems. But that thing that you make, that product or service or content that you are making, that expertise that you hold, something about your organization is creating value that if only searchers could access it, they would be immensely thankful.

It is possible, and I have seen plenty of examples of companies that are so new or so much on the cutting edge that they’re producing things that aren’t solving questions people are asking yet. The problem that you’re solving then is not a question. It’s not something that’s being searched for directly. It usually is very indirect. If you’re creating a machine that, let’s say, turns children’s laughter into energy, as they do in the film “Monsters, Inc.”, that is something very new. No one is searching for a machine that turns kids’ laughter into energy. However, many people are searching for alternative energy. They’re searching for broader types of things and concepts. By the way, if you do invent that machine, I think you’ll probably nail a lot of that interest level stuff.

If you have a great answer to this, you can then move on to, “What is the unique value we provide that no one else does?” We talked about unique value previously on Whiteboard Friday. There’s a whole episode you can watch about that. Basically, if everyone else out there is producing X and X+1 and X+2, you’ve either got to be producing X times 10, or you’ve got to be producing Y, something that is highly unique or is unique because it is of such better quality, such greater quality. It does the job so much better than anything else out there. It’s not, “Hey, we’re better than the top ten search results.” It’s, “Why are you ten times better than anything on this list?”

The third question is, “Who’s going to help amplify our message, and why will they do it?” This is essential because SEO has turned from an exercise, where we essentially take content that already exists or create some content that will solve a searcher problem and then try and acquire links to it, or point links to it, or point ranking signals at it, and instead it’s ones where we have to go out and earn those ranking signals. Because we’ve shifted from link building or ranking signal building to ranking signal earning, we better have people who will help amplify our message, the content that we create, the value that we provide, the service or the product, the message about our brand.

If we don’t have those people who, for some reason, care enough about what we’re doing to help share it with others, we’re going to be shouting into a void. We’re going to get no return on the investment of broadcasting our message or reaching out one to one, or sharing on social media, or distributing. It’s not going to work. We need that amplification. There must be some of it, and because we need amplification in order to earn these ranking signals, we need an answer to who.

That who is going to depend highly on your target audience, your target customers, and who influences your target customers, which may be a very different group than other customers just like them. There are plenty of businesses in industries where your customers will be your worst amplifiers because they love you and they don’t want to share you with anyone else. They love whatever product or service you’re providing, and they want to keep you all to themselves. By the way, they’re not on social media, and they don’t do sharing. So you need another level above them. You need press or bloggers or social media sharers, somebody who influences your target audience.

Number four: What is our process for turning visitors from search into customers? If you have no answer to this, you can’t expect to earn search visits and have a positive return on your investment. You’ve got to be building out that funnel that says, “Aha, people have come to us through channel X, search, social media, e-mail, directly visited, referred from some other website, through business development, through conference or trade show, whatever it is. Then they come back to our website. Then they sign up for an e-mail. Then they make a conversion. How does that work? What does our web-marketing funnel look like? How do we take people that visited our site for the first time from search, from a problem or a question that they had that we answered, and now how do they become a customer?” If you don’t have that process yet, you must build it. That’s part of a great SEO strategy. Then optimization of this is often called conversion rate optimization.

The last question, number five: How do we expose what we do that provides value here in a way that engines can easily crawl, index, understand, and show off? This is getting to much more classic SEO stuff. For many companies they have something wonderful that they’ve built, but it’s just a mobile app or a web app that has no physical URL structure that anyone can crawl and be exposed to, or it’s a service based business.

Let’s say it’s a legal services firm. How are we going to turn the expertise of our legal team into something that engines can perceive? Maybe we have the answers to these questions, but we need to find some way to show it off, and that’s where content creation comes into play. So we don’t just need good quality content that can be crawled and indexed. It also must be understood, and this ties a little bit to things we’ve talked about in the past around Hummingbird, where it’s clear that the content is on the topic and that it really answers the searchers’ underlying question, not just uses the keywords the searcher is using. Although, using the keywords is still important from a classic SEO perspective.

Then showing off that content means asking, “How do we do a great job of applying rich snippets, of applying schema, of having a very compelling title and description and URL, of getting that ranked highly, and of learning what our competitors are doing so that we can uniquely differentiate ourselves from them in the search results and improve our click-through rates?” All of those kinds of things.

If you answer these five questions, or if your customer, your client, your team, your boss already has great answers to these five questions, then you can start getting pretty tactical and be very successful. If you don’t have answers to these yet, go get them. Make them explicit, not just implicit. Don’t just assume you know what they are. Have them list them. Make sure everyone on the team, everyone in the SEO process has bought into, “Yes, these are the answers to those five questions that we have. Now, let’s go do our tactics.” I think you’ll find you’re far more successful with any type of SEO project or investment.

All right gang, thanks so much for joining us on Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com


12 Common Reasons Reconsideration Requests Fail

Posted by Modestos

There are several reasons a reconsideration request might fail. But some of the most common mistakes site owners and inexperienced SEOs make when trying to lift a link-related Google penalty are entirely avoidable. 

Here’s a list of the top 12 most common mistakes made when submitting reconsideration requests, and how you can prevent them.

1. Insufficient link data

This is one of the most common reasons why reconsideration requests fail. This mistake is readily evident each time a reconsideration request gets rejected and the example URLs provided by Google are unknown to the webmaster. Relying only on Webmaster Tools data isn’t enough, as Google has repeatedly said. You need to combine data from as many different sources as possible. 

A good starting point is to collate backlink data, at the very least:

  • Google Webmaster Tools (both latest and sample links)
  • Bing Webmaster Tools
  • Majestic SEO (Fresh Index)
  • Ahrefs
  • Open Site Explorer

If you use any toxic link-detection services (e.g., Linkrisk and Link Detox), then you need to take a few precautions to ensure the following:

  • They are 100% transparent about their backlink data sources
  • They have imported all backlink data
  • You can upload your own backlink data (e.g., Webmaster Tools) without any limitations

If you work on large websites that have tons of backlinks, most of these automated services will likely process just a fraction of the links, unless you pay for one of their premium packages. If you have direct access to the above data sources, it’s worthwhile to download all backlink data, then manually upload it into your tool of choice for processing. This is the only way to have full visibility over the backlink data that has to be analyzed and reviewed later. Starting with an incomplete data set at this early (yet crucial) stage could seriously hinder the outcome of your reconsideration request.
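
Mechanically, the merge itself is simple; the point is deduplicating by URL so that each source contributes the links the others missed. A rough Node.js sketch (the file names are hypothetical, and it assumes each export has been reduced to one linking URL per line):

// Merge link exports from several tools into one deduplicated list.
const fs = require("fs");

const exportFiles = ["gwt.txt", "bing.txt", "majestic.txt", "ahrefs.txt", "ose.txt"];
const seen = new Set();

for (const file of exportFiles) {
  for (const line of fs.readFileSync(file, "utf8").split(/\r?\n/)) {
    const url = line.trim().toLowerCase();
    if (url) seen.add(url); // the Set silently drops duplicates
  }
}

fs.writeFileSync("all-backlinks.txt", [...seen].sort().join("\n"));
console.log(`Merged ${exportFiles.length} exports into ${seen.size} unique URLs`);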

2. Missing vital legacy information

The more you know about a site’s history and past activities, the better. You need to find out (a) which pages were targeted in the past as part of link building campaigns, (b) which keywords were the primary focus and (c) the link building tactics that were scaled (or abused) most frequently. Knowing enough about a site’s past activities, before it was penalized, can help you home in on the actual causes of the penalty. Also, collect as much information as possible from the site owners.

3. Misjudgement

Misreading your current situation can lead to wrong decisions. One common mistake is to treat the example URLs provided by Google as gospel and try to identify only links with the same patterns. Google provides a very small number of examples of unnatural links. Often, these examples are the most obvious and straightforward ones. However, you should look beyond these examples to fully address the issues and take the necessary actions against all types of unnatural links. 

Google is very clear on the matter: “Please correct or remove all inorganic links, not limited to the samples provided above.”

Another common area of bad judgement is the inability to correctly identify unnatural links. This is a skill that requires years of experience in link auditing, as well as link building. Removing the wrong links won’t lift the penalty, and may also result in further ranking drops and loss of traffic. You must remove the right links.


4. Blind reliance on tools

There are numerous unnatural link-detection tools available on the market, and over the years I’ve had the chance to try out most (if not all) of them. Because (and without any exception) I’ve found them all very ineffective and inaccurate, I do not rely on any such tools for my day-to-day work. In some cases, a lot of the reported “high risk” links were 100% natural links, and in others, numerous toxic links were completely missed. If you have to manually review all the links to discover the unnatural ones, ensuring you don’t accidentally remove any natural ones, it makes no sense to pay for tools. 

If you solely rely on automated tools to identify the unnatural links, you will need a miracle for your reconsideration request to be successful. The only tool you really need is a powerful backlink crawler that can accurately report the current link status of each URL you have collected. You should then manually review all currently active links and decide which ones to remove. 

I could write an entire book on the numerous flaws and bugs I have come across each time I’ve tried some of the most popular link auditing tools. A lot of these issues can be detrimental to the outcome of the reconsideration request. I have seen many reconsideration requests fail because of this. If Google cannot algorithmically identify all unnatural links and must operate entire teams of humans to review the sites (and their links), you shouldn’t trust a $99/month service to identify the unnatural links.

If you have an in-depth understanding of Google’s link schemes, you can build your own process to prioritize which links are more likely to be unnatural, as I described in this post (see sections 7 & 8). In an ideal world, you should manually review every single link pointing to your site. Where this isn’t possible (e.g., when dealing with an enormous number of links, or when resources are unavailable), you should at least focus on the links that show the most “unnatural” signals and manually review them.

5. Not looking beyond direct links

When trying to lift a link-related penalty, you need to look into all the links that may be pointing to your site directly or indirectly. Such checks include reviewing all links pointing to other sites that have been redirected to your site, legacy URLs with external inbound links that have been internally redirected, and third-party sites that include cross-domain canonicals to your site. For sites that used to buy and redirect domains in order to increase their rankings, the quickest solution is to get rid of the redirects. Both Majestic SEO and Ahrefs report redirects, but some manual digging usually reveals a lot more.


6. Not looking beyond the first link

All major link intelligence tools, including Majestic SEO, Ahrefs and Open Site Explorer, report only the first link pointing to a given site when crawling a page. This means that, if you overly rely on automated tools to identify links with commercial keywords, the vast majority of them will only take into consideration the first link they discover on a page. If a page on the web links to your site just once, this is not a big deal. But if there are multiple links, the tools will miss all but the first one.

For example, if a page has five different links pointing to your site, and the first one includes branded anchor text, these tools will just report the first link. Most of the link-auditing tools will in turn evaluate the link as “natural” and completely miss the other four links, some of which may contain manipulative anchor text. The more links that get missed this way, the more likely it is that your reconsideration request will fail.

7. Going too thin

Many SEOs and webmasters (still) feel uncomfortable with the idea of losing links. They cannot accept that links which once helped their rankings are now being devalued and must be removed. There is no point trying to save “authoritative”, unnatural links out of fear of losing rankings. If the main objective is to lift the penalty, then all unnatural links need to be removed.

Often, in the first reconsideration request, SEOs and site owners tend to go too thin, and in the subsequent attempts start cutting deeper. If you are already aware of the unnatural links pointing to your site, try to get rid of them from the very beginning. I have seen examples of unnatural links provided by Google on PR 9/DA 98 sites. Metrics do not matter when it comes to lifting a penalty. If a link is manipulative, it has to go.

In any case, Google’s decision won’t be based only on the number of links that have been removed. Most important in the search giant’s eyes is the quality of the links still pointing to your site. If the remaining links are largely of low quality, the reconsideration request will almost certainly fail.

8. Insufficient effort to remove links

Google wants to see a “good faith” effort to get as many links removed as possible. The higher the percentage of unnatural links removed, the better. Some agencies and SEO consultants tend to rely too much on the use of the disavow tool. However, this isn’t a panacea; it should be used as a last resort, for those links that are impossible to remove, after exhausting all possibilities of physically removing them via the time-consuming (yet necessary) outreach route.

Google is very clear on this.


Even if you’re unable to remove all of the links that need to be removed, you must be able to demonstrate that you’ve made several attempts to have them removed, which can have a favorable impact on the outcome of the reconsideration request. Yes, in some cases it might be possible to have a penalty lifted simply by disavowing instead of removing the links, but these cases are rare and this strategy may backfire in the future. When I reached out to ex-Googler Fili Wiese for some advice on the value of removing the toxic links (instead of just disavowing them), his response was very straightforward: remove them wherever you can.


9. Ineffective outreach

Simply identifying the unnatural links won’t get the penalty lifted unless a decent percentage of the links have been successfully removed. The more communication channels you try, the more likely it is that you reach the webmaster and get the links removed. Sending the same email hundreds or thousands of times is highly unlikely to result in a decent response rate. Trying to remove a link from a directory is very different from trying to get rid of a link appearing in a press release, so you should take a more targeted approach with a well-crafted, personalized email. Link removal request emails must be honest and to the point, or else they’ll be ignored.

Tracking the emails will also help you figure out which messages have been read, which webmasters might be worth contacting again, and when you need to try an alternative means of contacting webmasters.

Creativity, too, can play a big part in the link removal process. For example, it might be necessary to use social media to reach the right contact. Again, don’t trust automated emails or contact form harvesters. In some cases, these applications will pull in any email address they find on the crawled page (without any guarantee of who the information belongs to). In others, they will completely miss masked email addresses or those appearing in images. If you really want to see that the links are removed, outreach should be carried out by experienced outreach specialists. Unfortunately, there aren’t any shortcuts to effective outreach.

10. Quality issues and human errors

All sorts of human errors can occur when filing a reconsideration request. The most common errors include submitting files that do not exist, files that do not open, files that contain incomplete data, and files that take too long to load. You need to triple-check that the files you are including in your reconsideration request are read-only, and that anyone with the URL can fully access them. 

Poor grammar and language is also bad practice, as it may be interpreted as “poor effort.” You should definitely get the reconsideration request proofread by a couple of people to be sure it is flawless. A poorly written reconsideration request can significantly hinder your overall efforts.

Quality issues can also occur with the disavow file submission. Disavowing at the URL level isn’t recommended because the link(s) you want to get rid of are often accessible to search engines via several URLs you may be unaware of. Therefore, it is strongly recommended that you disavow at the domain or sub-domain level.
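
Google’s disavow file format supports domain: lines for exactly this reason. A small JavaScript sketch that rolls a list of unnatural link URLs up to domain-level entries (the input URLs are placeholders, and deciding which domains belong in the file remains a manual judgment):

// Convert a list of unnatural link URLs into domain-level disavow entries.
const urlsToDisavow = [
  "http://spammy-directory.example/listing?id=42",
  "http://spammy-directory.example/listing?id=43",
  "http://article-farm.example/post/buy-links",
];

const domains = new Set(urlsToDisavow.map((u) => new URL(u).hostname));

const disavowFile = [
  "# Domain-level disavow entries", // comment lines are allowed in disavow files
  ...[...domains].map((d) => `domain:${d}`),
].join("\n");

console.log(disavowFile);
// domain:spammy-directory.example
// domain:article-farm.example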

11. Insufficient evidence

How does Google know you have done everything you claim in your reconsideration request? Because you have to prove each claim is valid, you need to document every single action you take, from sent emails and submitted forms, to social media nudges and phone calls. The more information you share with Google in your reconsideration request, the better. This is the exact wording from Google:

“ …we will also need to see good-faith efforts to remove a large portion of inorganic links from the web wherever possible.”

12. Bad communication

How you communicate your link cleanup efforts is as essential as the work you are expected to carry out. Not only do you need to explain the steps you’ve taken to address the issues, but you also need to share supportive information and detailed evidence. The reconsideration request is the only chance you have to communicate to Google which issues you have identified, and what you’ve done to address them. Being honest and transparent is vital for the success of the reconsideration request.

There is absolutely no point using the space in a reconsideration request to argue with Google. Some of the unnatural link examples they share may not always be useful (e.g., URLs that include nofollow links, removed links, or even no links at all). But taking the argumentative approach virtually guarantees your request will be denied.


Conclusion

Getting a Google penalty lifted requires a good understanding of why you have been penalized, a flawless process and a great deal of hands-on work. Performing link audits for the purpose of lifting a penalty can be very challenging, and should only be carried out by experienced consultants. If you are not 100% sure you can take all the required actions, seek out expert help rather than looking for inexpensive (and ineffective) automated solutions. Otherwise, you will almost certainly end up wasting weeks or months of your precious time, and in the end, see your request denied.


Illustrated Guide to Advanced On-Page Topic Targeting for SEO

Posted by Cyrus-Shepard

Topic n. A subject or theme of a webpage, section, or site.

Several SEOs have recently written about topic modeling and advanced on-page optimization. The concepts themselves are dizzying: LDA, co-occurrence, and entity salience, to name only a few. The question is: “How can I easily incorporate these techniques into my content for higher rankings?”

In fact, you can create optimized pages without understanding complex algorithms. Sites like Wikipedia, IMDB, and Amazon create highly optimized, topic-focused pages almost by default. Utilizing these best practices works exactly the same when you’re creating your own content.

The purpose of this post is to provide a simple framework for on-page topic targeting in a way that makes optimizing easy and scalable while producing richer content for your audience.

1. Keywords and relationships

No matter what topic modeling technique you choose, all rely on discovering relationships between words and phrases. As content creators, how we organize words on a page greatly influences how search engines determine the on-page topics.

When we use keyword phrases, search engines hunt for other phrases and concepts that relate to one another. So our first job is to expand our keyword research to incorporate these related phrases and concepts. Contextually rich content includes:

  • Close variants and synonyms: Includes abbreviations, plurals, and phrases that mean the same thing.
  • Primary related keywords: Words and phrases that relate to the main keyword phrase.
  • Secondary related keywords: Words and phrases that relate to the primary related keywords.
  • Entity relationships: Concepts that describe the properties and relationships between people, places, and things.

A good keyword phrase or entity is one that predicts the presence of other phrases and entities on the page. For example, a page about “The White House” predicts other phrases like “president,” “Washington,” and “Secret Service.” Incorporating these related phrases may help strengthen the topicality of “White House.”
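
You can approximate this idea of “prediction” with simple co-occurrence counts: of the documents that mention your main phrase, how many also mention each candidate phrase? A toy JavaScript sketch with made-up documents:

// Toy co-occurrence: how often candidate terms appear in documents
// that also mention the main phrase. Documents are invented examples.
const docs = [
  "the white house said the president will travel to washington",
  "secret service agents guard the president at the white house",
  "the white house press briefing covered the budget",
  "a guide to painting your house white on a budget",
];

function cooccurrence(mainPhrase, candidates, documents) {
  const withMain = documents.filter((d) => d.includes(mainPhrase));
  const scores = {};
  for (const term of candidates) {
    scores[term] = withMain.filter((d) => d.includes(term)).length / withMain.length;
  }
  return scores;
}

console.log(cooccurrence("white house", ["president", "secret service", "budget"], docs));
// "president" scores highest: it best predicts the main phrase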

2. Position, frequency, and distance

How a page is organized can greatly influence how concepts relate to each other.

Once search engines find your keywords on a page, they need to determine which ones are most important, and which ones actually have the strongest relationships to one another.

Three primary techniques for communicating this include:

  • Position: Keywords placed in important areas like titles, headlines, and higher up in the main body text may carry the most weight.
  • Frequency: Using techniques like TF-IDF, search engines determine important phrases by calculating how often they appear in a document compared to a normal distribution; a toy version is sketched just after this list.
  • Distance: Words and phrases that relate to each other are often found close together, or grouped by HTML elements. This means leveraging semantic distance to place related concepts close to one another using paragraphs, lists, and content sectioning.
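
Here is the toy TF-IDF sketch referenced in the list above: a term scores highly when it is frequent within one document but rare across the rest of the corpus. Real search engines use far more sophisticated weighting, so treat this purely as an illustration:

// Toy TF-IDF: term frequency in one document, discounted by how
// common the term is across the whole corpus.
const corpus = [
  "the white house announced a new budget",
  "the president spoke at the white house",
  "our house has a white fence and a garden",
];

function tfidf(term, doc, documents) {
  const words = doc.split(/\s+/);
  const tf = words.filter((w) => w === term).length / words.length;
  const docsWithTerm = documents.filter((d) => d.split(/\s+/).includes(term)).length;
  const idf = Math.log(documents.length / (1 + docsWithTerm));
  return tf * idf;
}

console.log(tfidf("president", corpus[1], corpus)); // rare term: positive score
console.log(tfidf("the", corpus[1], corpus));       // ubiquitous term: negative, i.e. uninformative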

A great way to organize your on-page content is to employ your primary and secondary related keywords in support of your focus keyword. Each primary related phrase becomes its own subsection, with the secondary related phrases supporting the primary, as illustrated here.

Keyword Position, Frequency and Distance

As an example, the primary keyword phrase of this page is “On-page Topic Targeting.” Supporting topics include: keywords and relationships, on-page optimization, links, entities, and keyword tools. Each related phrase supports the primary topic, and each becomes its own subsection.

3. Links and supplemental content

Many webmasters overlook the importance of linking as a topic signal.

Several well-known Google search patents and early research papers describe analyzing a page’s links as a way to determine topic relevancy. These include both internal links to your own pages and external links to other sites, often with relevant anchor text.

Google’s own Quality Rater Guidelines cites the value of external references to other sites. It also describes a page’s supplemental content, which can include internal links to other sections of your site, as a valuable resource.

Links and Supplemental Content

If you need an example of how relevant linking can help your SEO, The New York Times famously saw success, and an increase in traffic, when it started linking out to other sites from its topic pages.

Although this guide discusses on-page topic optimization, topical external links with relevant anchor text can greatly influence how search engines determine what a page is about. These external signals often carry more weight than on-page cues, but it almost always works best when on-page and off-page signals are in alignment.

4. Entities and semantic markup

Google extracts entities from your webpage automatically, without any effort on your part. These are people, places, and things that have distinct properties and relationships with each other.

• Christopher Nolan (entity, person) stands 5’4″ (property, height) and directed Interstellar (entity, movie)

Even though entity extraction happens automatically, it’s often essential to mark up your content with Schema for specific supported entities such as business information, reviews, and products. While the ranking benefit of adding Schema isn’t 100% clear, structured data has the advantage of enhanced search results.

Entities and Schema

For a solid guide to implementing schema.org markup, see Builtvisible’s excellent guide to rich snippets.

5. Crafting the on-page framework

You don’t need to be a search genius or spend hours on complex research to produce high quality, topic optimized content. The beauty of this framework is that it can be used by anyone, from librarians to hobby bloggers to small business owners; even when they aren’t search engine experts.

A good webpage has much in common with a high quality university paper. This includes:

  1. A strong title that communicates the topic
  2. Introductory opening that lays out what the page is about
  3. Content organized into thematic subsections
  4. Exploration of multiple aspects of the topic, answering related questions
  5. Provision of additional resources and external citations

Your webpage doesn’t need to be academic, stuffy, or boring. Some of the most interesting pages on the Internet employ these same techniques while remaining dynamic and entertaining.

Keep in mind that ‘best practices’ don’t apply to every situation, and as Rand Fishkin says, “There’s no such thing as ‘perfectly optimized’ or ‘perfect on-page SEO.’” Pulling everything together looks something like this:

On-page Topic Targeting for SEO

This graphic is highly inspired by Rand Fishkin’s great Visual Guide to Keyword Targeting and On-Page SEO. This guide doesn’t replace that canonical resource. Instead, it should be considered a supplement to it.

5 alternative tools for related keyword and entity research

For the search professional, there are dozens of tools available for thematic keyword and entity research. This list is not exhaustive by any means, but contains many useful favorites.

1. Alchemy API

One of the few tools on the market that delivers entity extraction, concept targeting and linked data analysis. This is a great platform for understanding how a modern search engine views your webpage.

2. SEO Review Tools

The SEO Keyword Suggestion Tool was designed to return both primary and secondary related keywords, as well as options for synonyms and country targeting.

3. LSIKeywords.com

The LSIKeyword tool performs Latent Semantic Indexing (LSI) on the top pages returned by Google for any given keyword phrase. The tool can go down from time to time, but it’s a great one to bookmark.

4. Social Mention

Quick and easy: enter any keyword phrase, then check “Top Keywords” to see what words appear most with your primary phrase across the platforms that Social Mention monitors.

5. Google Trends

Google Trends is a powerful related-keyword research tool, if you know how to use it. The secret is downloading your results to a CSV (under settings) to get a list of up to 50 related keywords per search term.


The Ultimate Guide to Broken Link Building

 

Have you heard of “Broken link building”?

It’s a highly effective and 100% legitimate link building tactic SEOs now use widely to build links after Penguin.

And you can start using it for your own site (with SEO PowerSuite and the easy 6-step guide we’ve prepared today).

The concept behind broken link building is very simple. You just:

1) Find broken links on other niche-relevant sites
2) Contact webmasters and suggest your site as an alternative
3) Get your links placed instead of the broken ones.

Yet the little-known hack you’ll learn today (and a few non-standard SEO PowerSuite tips you haven’t heard of) will help you get even more out of it, letting you run “Broken link building” at scale to earn hundreds of backlinks with minimal effort!

 

Learn “Broken link building” in 6 steps to easily build hundreds of quality links pointing to any site!


6 Things I Wish I Knew Before Using Optimizely

Posted by tallen1985

Diving into Conversion Rate Optimization (CRO) for the first time can be a challenge. You are faced with a whole armoury of new tools, each containing a huge variety of features. Optimizely is one of those tools you will quickly encounter, and through this post I’m going to cover 6 features I wish I had known from day one that have helped improve test performance, debugging, and the ability to track results accurately.

1. You don’t have to use the editor

The editor within Optimizely is a useful tool if you don’t have much experience working with code. The editor should be used for making simple visual changes, such as changing an image, adjusting copy or making minor layout changes.

If you are looking to make changes that alter the behaviour of the page rather than just straightforward visual changes, then the editor can become troublesome. In this case you should use the “Edit Code” feature at the foot of the editor.

For any large-scale changes to the site, such as completely redesigning the page, Optimizely should be used for traffic allocation and not editing pages. To do this:

1. Build a new version of the page outside of Optimizely

2. Upload the variation page to your site.
Important: Ensure that the variation page is noindexed.

We now have two variations of our page:

www.myhomepage.com & www.myhomepage.com/variation1

3. Select the variation drop down menu and click Redirect to a new page

4. Enter the variation URL, apply the settings and save the experiment. You can now use Optimizely as an A/B test management tool to allocate traffic, exclude traffic/device types, and gather further test data.

If you do use the editor, be aware of excess code

One problem to be aware of here is that each time you move or change an element, Optimizely adds a new line of code. In the example sketched below, the variation code repositions the h2 title four times.

Instead, when using the editor, we should make sure to strip out any excess code. If you move and save a page element multiple times, open the <edit code> tab at the foot of the page and delete the redundant lines. The cleaned-up version below positions the h2 title in exactly the same place with three fewer lines of code. Over the course of multiple changes, this excess code can increase Optimizely's load time.
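
The screenshots that originally accompanied this example are gone, so here is a hedged reconstruction of what the editor’s generated jQuery tends to look like; the selector and pixel offsets are illustrative, not from the original test:

// What the editor generates after repeated drag-and-save: the h2 is
// repositioned four times, but only the last call decides where it ends up.
$("h2.title").css({position: "relative", left: "10px", top: "5px"});
$("h2.title").css({position: "relative", left: "24px", top: "12px"});
$("h2.title").css({position: "relative", left: "31px", top: "18px"});
$("h2.title").css({position: "relative", left: "40px", top: "20px"});

// Cleaned-up equivalent: one call placing the element in its final
// position, with three fewer lines for Optimizely to ship and execute.
$("h2.title").css({position: "relative", left: "40px", top: "20px"});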


2. Enabling analytics tracking

Turning on analytics tracking seems obvious, right? In fact, why would we even need to turn it on in the first place, surely it would be defaulted to on?

Optimizely currently sets analytics tracking to the default option of off. As a result, if you don’t manually change the setting, nothing will be reported into your analytics platform of choice.

To turn on analytics tracking, simply open the settings in the top right corner from within the editor mode and select Analytics Integration.

Turn on the relevant analytics tracking. If you are using Google Analytics, then at this point you should assign a vacant custom variable slot (for Classic Analytics) or a vacant custom dimension (Universal Analytics) to the experiment.
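Note that, at the time of writing, Optimizely’s Universal Analytics integration also required a one-line activation call in your page, placed between creating the tracker and sending the pageview. A minimal sketch, assuming the standard analytics.js snippet is already on the page (the property ID is a placeholder):

    ga('create', 'UA-XXXXXXX-1', 'auto'); // your existing tracker

    // Tell Optimizely to write its experiment/variation data into the
    // custom dimension you assigned in the integration settings
    window.optimizely = window.optimizely || [];
    window.optimizely.push(['activateUniversalAnalytics']);

    ga('send', 'pageview');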

Once the test is live, wait for a while (up to 24 hours), then check to be sure the data is reporting correctly within the custom segments.


3. Test your variations in a live environment

Before you set your test live, it’s important that you test the new variation to ensure everything works as expected. To do this we need to see the test in a live environment while ensuring no customers see the test versions yet. I’ve suggested a couple of ways to do this below:

Query parameter targeting

Query parameter targeting is available on all accounts and is our preferred method for sharing live versions with clients, mainly because, once set up, it is as simple as sharing a URL.

1. Click the audiences icon at the top of the page 

2. Select create a new audience

3. Drag Query Parameters from the possible conditions and enter parameters of your choice.

4. Click Apply and save the experiment.

5. To view the experiment, visit the test URL with the query parameter added. With a parameter of test=variation, the URL would be:
http://www.distilled.net/?test=variation

Cookie targeting

1. Open the browser and create a bookmark on any page

2. Edit the bookmark and change both properties to:

a) Name: Set A Test Cookie

b) URL: the following JavaScript code:

javascript:(function(){ var hostname = window.location.hostname; var parts = hostname.split("."); var publicSuffix = hostname; var last = parts[parts.length - 1]; var expireDate = new Date(); expireDate.setDate(expireDate.getDate() + 7); var TOP_LEVEL_DOMAINS = ["com", "local", "net", "org", "xxx", "edu", "es", "gov", "biz", "info", "fr", "gr", "nl", "ca", "de", "kr", "it", "me", "ly", "tv", "mx", "cn", "jp", "il", "in", "iq"]; var SPECIAL_DOMAINS = ["jp", "uk", "au"]; if(parts.length > 2 && SPECIAL_DOMAINS.indexOf(last) != -1){ publicSuffix = parts[parts.length - 3] + "."+ parts[parts.length - 2] + "."+ last} else if(parts.length > 1 && TOP_LEVEL_DOMAINS.indexOf(last) != -1) {publicSuffix = parts[parts.length - 2] + "."+ last} document.cookie = "optly_"+publicSuffix.split(".")[0]+"_test=true; domain=."+publicSuffix+"; path=/; expires="+expireDate.toGMTString()+";"; })();

You should end up with a bookmark whose URL field contains the JavaScript above.

3. Open the page where you want to place the cookie and click the bookmark

4. The cookie will now be set on the domain you are browsing and will look something like: ‘optly_YOURDOMAINNAME_test=true’
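You can confirm the cookie was set with a quick check in the browser console (the domain name in the example output is illustrative):

    // List any Optimizely test cookies set for the current site
    document.cookie.split("; ").filter(function (c) {
      return c.indexOf("optly_") === 0;
    });
    // e.g. ["optly_yourdomainname_test=true"]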

Next we need to target our experiment to only allow visitors who have the cookie set to see test variations.

5. Click the audiences icon at the top of the page

6. Select create a new audience

7. Drag Cookie into the Conditions and change the name to optly_YOURDOMAINNAME_test=true

8. Click Apply and save the experiment.

Source: https://help.optimizely.com/hc/en-us/articles/200293784-Setting-a-test-cookie-for-your-site

IP address targeting (only available on Enterprise accounts)

Using IP address targeting is useful when you are looking to test variations in house and on a variety of different devices and browsers.

1. Click the audiences icon at the top of the page

2. Select create a new audience

3. Drag IP Address from the possible conditions and enter the IP address being used. (Not sure of your IP address? Head to http://whatismyipaddress.com/.)

4. Click Apply and save the experiment.


4. Force variations using parameters when debugging pages

There will be times, particularly when testing new variations, when you need to view a specific variation. Obviously this can be an issue if your browser has already been bucketed into an alternative variation. Optimizely overcomes this by allowing you to force the variation you wish to view, simply by using query parameters.

The query parameter is structured in the following way: optimizely_xEXPERIMENTID=VARIATIONINDEX

1. The EXPERIMENTID can be found in the browser URL

2. VARIATIONINDEX is the variation you want to run: 0 is the original, 1 is variation #1, 2 is variation #2, and so on

3. Using this structure to force a variation, we would use the following URL to display variation 1 of our experiment:
http://www.yourwebsite.com/?optimizely_x1845540742=1

Source: https://help.optimizely.com/hc/en-us/articles/200107480-Forcing-a-specific-variation-to-run-and-other-advanced-URL-parameters


5. Don’t change the traffic allocation sliders

Once a test is live, it is important not to change the amount of traffic allocated to each variation. Doing so can massively affect test results, as one version would potentially begin to receive more return visitors, who in turn have a much higher chance of converting.

My colleague Tom Capper discussed the do’s and don’ts of statistical significance earlier this year, where he explained:

“At the start of your test, you decide to play it safe and set your traffic allocation to 90/10. After a time, it seems the variation is non-disastrous, and you decide to move the slider to 50/50. But return visitors are still always assigned their original group, so now you have a situation where the original version has a larger proportion of return visitors, who are far more likely to convert.”
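To make that concrete, here is some illustrative arithmetic (all numbers hypothetical):

    // Week 1 at 90/10: 9,000 visitors bucketed into Original, 1,000 into Variation.
    // Week 2 at 50/50: 10,000 new visitors split evenly, plus 2,000 returners
    // who always rejoin their original bucket (90% of them came from Original).
    var originalVisitors  = 5000 + 2000 * 0.9; // 6,800 visitors (1,800 returners)
    var variationVisitors = 5000 + 2000 * 0.1; // 5,200 visitors (200 returners)

    // Suppose returners convert at 5% and new visitors at 2%, identically on
    // both pages. The measured rates still diverge purely from the visitor mix:
    var originalCR  = (1800 * 0.05 + 5000 * 0.02) / originalVisitors;  // ≈ 2.79%
    var variationCR = (200 * 0.05 + 5000 * 0.02) / variationVisitors; // ≈ 2.12%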

To summarize, if you do need to adjust the amount of traffic allocated to each test variation, restart the test so you can have complete confidence that the data you receive is accurate.


6. Use segmentation to generate better analysis

Okay, I understand this one isn’t strictly about Optimizely, but it is certainly worth keeping in mind, particularly early in the CRO process when producing hypotheses around device type.

Conversion rates can vary greatly when we segment data by location, browser, medium, or return vs. new visits, to name just a few. By using segmentation we can unearth opportunities that we may have previously overlooked, allowing us to generate new hypotheses for future experiments.


Example

You have been running a test for a month and unfortunately the results are inconclusive. The test version of the page didn’t perform any better or worse than the original. Overall the test results look like the following:


Page Version    Visitors    Transactions    Conversion Rate
Original        41781       1196            2.86%
Variation       42355       1225            2.89%

In this case, the test variation overall performed only 1% better than the original, with a significance of 60%. With these results, this test variation certainly wouldn’t be getting rolled out any time soon.
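As a sanity check on numbers like these, a simple one-tailed two-proportion z-test reproduces that significance figure (a sketch; this mirrors, but is not necessarily identical to, how Optimizely computes its “chance to beat baseline”):

    // Chance that the variation beats the original, via a two-proportion z-test
    function chanceToBeatBaseline(visA, convA, visB, convB) {
      var pA = convA / visA;
      var pB = convB / visB;
      var pooled = (convA + convB) / (visA + visB);
      var se = Math.sqrt(pooled * (1 - pooled) * (1 / visA + 1 / visB));
      return normalCdf((pB - pA) / se);
    }

    // Standard normal CDF via the Abramowitz & Stegun erf approximation (7.1.26)
    function normalCdf(z) {
      var x = Math.abs(z) / Math.SQRT2;
      var t = 1 / (1 + 0.3275911 * x);
      var erf = 1 - ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
          - 0.284496736) * t + 0.254829592) * t * Math.exp(-x * x);
      return z < 0 ? 0.5 * (1 - erf) : 0.5 * (1 + erf);
    }

    chanceToBeatBaseline(41781, 1196, 42355, 1225); // ≈ 0.60, i.e. the 60% above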

However, when these results are segmented by device, they tell a very different story:

Drilling into the desktop results, we actually find that the test variation saw a 10% increase in conversions over the original, with 97% significance. Yet tablet users were converting well below the original, dragging down the overall conversion rate we saw in the first table.

Ultimately, with this data we would be able to generate a new hypothesis: “We believe the variation will increase conversion rate for users on a desktop.” We would then re-run the test for desktop-only users to verify the previous data and the new hypothesis.

Using segmented data here could also potentially help the experiment reach significance at a much faster rate, as explained in this video from Opticon 2014.

Should the new test be successful and achieve significance, we would serve desktop users the new variation, whilst those on mobile and tablet continue to see the original site.

Key takeaways

  • Always turn on Google Analytics tracking (and then double check it is turned on).
  • If you plan to make behavioural changes to a page, use the “Edit Code” feature rather than the drag-and-drop editor.
  • Use IP address targeting for device testing and query parameters to share a live test with clients.
  • If you need to change the traffic allocation to test variations you should restart the test.
  • Be aware that test performance can vary greatly based on device.

What problems and solutions have you come across when creating CRO experiments with Optimizely? What pieces of information do you wish you had known 6 months ago?

