The Long Click and the Quality of Search Success

Posted by billslawski

“On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the “Long Click” — this occurred when someone went to a search result, ideally the top one, and did not return. That meant Google had successfully fulfilled the query.”

~ Steven Levy. In the Plex: How Google Thinks, Works, and Shapes our Lives

I often explore and read patents and papers from the search engines to try to get a sense of how they may approach different issues, and learn about the assumptions they make about search, searchers, and the Web. Lately, I’ve been keeping an eye open for papers and patents from the search engines where they talk about a metric known as the “long click.”

A recently granted Google patent uses the metric of a “Long Click” as the center of a process Google may use to track results for queries that were selected by searchers for long visits in a set of search results.

This concept isn’t new. In 2011, I wrote about a Yahoo patent in How a Search Engine May Measure the Quality of Its Search Results, where they discussed a metric that they refer to as a “target page success metric.” It included “dwell time” upon a result as a sign of search success (Yes, search engines have goals, too).


Another Google patent described assigning web pages “reachability scores” based upon the quality of pages linked to from those initially visited pages. In the post Does Google Use Reachability Scores in Ranking Resources?, I described how that patent might view a long click metric as a sign that visitors to a page are engaged by the content its links point to, including links to videos. Google tells us in that patent that it might consider a “long click” to have been made on a video if someone watches at least half the video or at least 30 seconds of it. The patent suggests that a high reachability score may mean a page could be boosted in Google search results.
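The video threshold described in that patent lends itself to a one-line check. This is just an illustrative sketch of the stated rule; the function name and signature are my own, not from the patent:

```python
def is_video_long_click(watch_seconds: float, video_length_seconds: float) -> bool:
    """A video view counts as a 'long click' if the viewer watches
    at least half the video or at least 30 seconds of it (per the
    patent's description; this helper itself is hypothetical)."""
    return watch_seconds >= 30 or watch_seconds >= video_length_seconds / 2

print(is_video_long_click(20, 30))   # two-thirds of a short video -> True
print(is_video_long_click(25, 120))  # under 30s and under half -> False
```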


But the patent I’m writing about today is focused primarily upon looking at and tracking a search success metric like a long click or long dwell time. Here’s the abstract:

Modifying ranking data based on document changes

Invented by Henele I. Adams, and Hyung-Jin Kim

Assigned to Google

US Patent 9,002,867

Granted April 7, 2015

Filed: December 30, 2010

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media for determining a weighted overall quality of result statistic for a document.

One method includes receiving quality of result data for a query and a plurality of versions of a document, determining a weighted overall quality of result statistic for the document with respect to the query including weighting each version specific quality of result statistic and combining the weighted version-specific quality of result statistics, wherein each quality of result statistic is weighted by a weight determined from at least a difference between content of a reference version of the document and content of the version of the document corresponding to the version specific quality of result statistic, and storing the weighted overall quality of result statistic and data associating the query and the document with the weighted overall quality of result statistic.
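The combining step the abstract describes can be sketched in a few lines: each version of a document carries its own quality-of-result statistic, weighted by how close that version's content is to a reference version. Everything beyond that high-level idea — the Jaccard similarity as the content-difference measure, the sample data, the function names — is an assumption for illustration, not the patent's actual method:

```python
# Sketch of the abstract's combining step. Each document version has a
# quality-of-result statistic (e.g., a long-click rate for a query); each is
# weighted by content similarity to a reference version, so statistics earned
# by versions whose content has since changed count for less.

def content_similarity(reference: set, version: set) -> float:
    """Jaccard similarity between two sets of content terms (assumed measure)."""
    if not reference and not version:
        return 1.0
    return len(reference & version) / len(reference | version)

def weighted_quality(reference_terms, versions):
    """versions: list of (version_terms, quality_statistic) pairs."""
    weighted = [
        (content_similarity(reference_terms, terms), stat)
        for terms, stat in versions
    ]
    total_weight = sum(w for w, _ in weighted)
    if total_weight == 0:
        return 0.0
    return sum(w * s for w, s in weighted) / total_weight

# The current version matches the reference; an older version differed heavily.
reference = {"pizza", "menu", "north", "beach", "delivery"}
versions = [
    ({"pizza", "menu", "north", "beach", "delivery"}, 0.8),  # recent, high long-click rate
    ({"pizza", "coupons", "specials"}, 0.2),                 # older, different content
]
print(round(weighted_quality(reference, versions), 3))  # -> 0.725
```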

This patent tells us that search results may be ranked in an order according to scores assigned to the search results by a scoring function or process based upon things such as:

  • Where, and how often, query terms appear in the given document,
  • How common the query terms are in the documents indexed by the search engine, or
  • A query-independent measure of quality of the document itself.

Last September, I wrote about how Google might identify a category associated with a query term based upon clicks, in the post Using Query User Data To Classify Queries. In a query for [Lincoln], the results that appear in response might be about the former US President, the town of Lincoln, Nebraska, or the model of automobile. When someone searches for [Lincoln], it could be considered reasonable for Google to return any of those three as a top result. The patent I wrote about in that post told us that Google might collect information about “Lincoln” as a search entity, and track which category of results people clicked upon most when they performed that search, to determine what categories of pages to show other searchers. Again, that’s another measure of “search success” based upon past search history.

There likely is some value in working to increase the amount of dwell time someone spends upon the pages of your site, if you are already having some success in crafting page titles and snippets that persuade people to click on your pages when those appear in search results. These approaches can include such things as:

  1. Making visiting your page a positive experience in terms of things like site speed, readability, and scannability.
  2. Making visiting your page a positive experience in terms of things like the quality of the content published on your pages including spelling, grammar, writing style, interest, quality of images, and the links you share to other resources.
  3. Providing a positive experience by offering ideas worth sharing with others, and offering opportunities for commenting and interacting with others, and by being responsive to people who do leave comments.


Your ability to create pages that end in a “long click” from someone who came to your site in response to a query is also a “search success” metric on the search engine’s part, so you both succeed. Just be warned that, as the most recent patent from Google on long clicks shows us, Google will be watching to make sure that the content of your page doesn’t change too much, that people continue to click upon it in search results, and that they spend a fair amount of time upon it.

(Images for this post are from my Go Fish Digital Design Lead Devin Holmes @DevinGoFish. Thank you, Devin!)

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

Local Centroids are Now Individual Users: How Can We Optimize for Their Searches?

Posted by MiriamEllis

“Google is getting better at detecting location at a more granular level—even on the desktop. The user is the new centroid.” – David Mihm

The history of the centroid

The above quote succinctly summarizes the current state of affairs for local business owners and their customers. The concept of a centroid—a central point of relevance—is almost as old as local search. In 2008, people like Mike Blumenthal and Google Maps Manager Carter Maslan were sharing statistics like this:

“…research indicates that up to 80% of the variation in rank can be explained by distance from the centroid on certain searches.”

At that time, businesses located near town hall or a similar central hub appeared to be experiencing a ranking advantage.

Fast forward to 2013, and Mike weighed in again with an updated definition of “industry centroids”:

“If you read their (Google’s) patents, they actually deal with the center of the industries … as defining the center of the search. So if all the lawyers are on the corner of Main and State, that typically defines the center of the search, rather than the center of the city… it isn’t even the centroid of the city that matters. It matters that you are near where the other people in your industry are.”

In other words, Google’s perception of a centralized location for auto dealerships could be completely different from that for medical practices, and neither might be located anywhere near the city center.

While the concepts of city and industry centroids may still play a part in some searches, local search results in 2015 clearly indicate Google’s shift toward deeming the physical location of the desktop or mobile user a powerful factor in determining relevance. The relationship between where your customer is when he performs a search and where your business is physically located has never been more important.

Moreover, in this new, user-centric environment, Google has moved beyond simply detecting cities to detecting neighborhoods and even streets. What this means for local business owners is that your hyperlocal information has become a powerful component of your business data. This post will teach you how to better serve your most local customers.

Seeing the centroid in action

If you do business in a small town with few competitors, ranking for your product/service + city terms is likely to cover most of your bases. The user-as-centroid phenomenon is most applicable in mid-to-large sized towns and cities with reasonable competition. I’ll be using two districts in San Francisco—Bernal Heights and North Beach—in these illustrations and we’ll be going on a hunt for pizza.

On a desktop, searching for “pizza north beach san francisco” or setting my location to this neighborhood and city while searching for the product, Google will show me something like this:

Performing this same search, but with “bernal heights” substituted, Google shows me pizzerias in a completely different part of the city:

local result bernal heights pizza san francisco

And, when I move over to my mobile device, Google narrows the initial results down to just three enviable players in each district. These simple illustrations demonstrate Google’s increasing sensitivity to serving me nearby businesses offering what I want.

The physical address of your business is the most important factor in serving the user as centroid. This isn’t something you can control, but there are things you can do to market your business as being highly relevant to your hyperlocal geography.

Specialized content for the user-centroid

We’ll break this down into four common business models to help get you thinking about planning content that serves your most local customers.

1. Single-location business

Make the shift toward viewing your business not just as “Tony’s Pizza in San Francisco”, but as “Tony’s Pizza in North Beach, San Francisco”. Consider:

  • Improving core pages of your website or creating new pages to include references to the proud part you play in the neighborhood scene. Talk about the history of your area and where you fit into that.
  • Interview locals and ask them to share their memories about the neighborhood and what they like about living there.
  • Showcase your participation in local events.
  • Plan an event, contest or special for customers in your district.
  • Take pictures, label them with hyperlocal terms, post them on your site and share them socially.
  • Blog about local happenings that are relevant to you and your customers, such as a street market where you buy the tomatoes that top your pizzas or a local award you’ve won.
  • Depending on your industry, there will be opportunities for hyperlocal content specific to your business. For example, a restaurant can make sure its menu is in crawlable text and can name some favorite dishes after the neighborhood—The Bernal Heights Special. Meanwhile, a spa in North Beach can create a hyperlocal name for a service—The North Beach Organic Spa Package. Not only does this show district pride, but customers may mention these products and services by name in their reviews, reinforcing your local connection.

2. Multi-location business within a single city

All that applies to the single location applies to you, too, but you’ve got to find a way to scale building out content for each neighborhood.

  • If your resources are strong, build a local landing page for each of your locations, including basic optimization for the neighborhood name. Meanwhile, create blog categories for each neighborhood and rotate your efforts on a week by week basis. First week, blog about neighborhood A, next week, find something interesting to write about concerning neighborhood B. Over time, you’ll have developed a nice body of content proving your involvement in each district.
  • If you’re short on resources, you’ll still want to build out a basic landing page for each of your stores in your city and make the very best effort you can to showcase your neighborhood pride on these pages.

3. Multiple businesses, multiple cities

Again, scaling this is going to be key and how much you can do will depend upon your resources.

  • The minimum requirement will be a landing page on the site for each physical location, with basic optimization for your neighborhood terms.
  • Beyond this, you’ll be making a decision about how much hyperlocal content you can add to the site/blog for each district, or whether time can be utilized more effectively via off-site social outreach. If you’ve got lots of neighborhoods to cover in lots of different cities, designating a social representative for each store and giving him the keys to your profiles (after a training session in company policies) may make the most sense.

4. Service area businesses (SABs)

Very often, service area businesses are left out in the cold with various local developments, but in my own limited testing, Google is applying at least some hyperlocal care to these business models. I can search for a neighborhood plumber, just as I would a pizza:

local results plumber bernal heights san francisco

To be painstakingly honest, plumbers are going to have to be pretty ingenious to come up with a ton of engaging industry/neighborhood content and may be confined mainly to creating some decent service area landing pages that share a bit about their work in various neighborhoods. Other business models, like contractors, home staging firms and caterers should find it quite easy to talk about district architecture, curb appeal and events on a hyperlocal front.

While your SAB is still unlikely to beat out a competitor with a physical location in a given neighborhood, you still have a chance to associate your business with that area of your town with well-planned content.


Need creative inspiration for the writing projects ahead? Don’t miss this awesome wildcard search tip Mary Bowling shared at LocalUp. Add an underscore or asterisk to your search terms and just look at the good stuff Google will suggest to you:

wildcard search content ideas

Does Tony’s patio make his business one of Bernal Heights’ dog-friendly restaurants, or does his rooftop view make his restaurant the most picturesque lunch spot in the district? If so, he’s got two new topics to write about, either on his basic landing pages or his blog.

Hop over to Whitespark’s favorite takeaways from Mike Ramsey’s LocalUp presentation, too.

Citations and reviews with the user centroid in mind

Here are the basics about citations, broken into the same four business models:

1. Single-location business

You get just one citation on each platform, unless you have multiple departments or practitioners. That means one Google+ Local page, one Yelp profile, one Best of the Web listing, etc. You do not get one citation for your city and another for your neighborhood. Very simple.

2. Multi-location business within a single city

As with the single location business, you are entitled to just one set of citations per physical location. That means one Google+ Local listing for your North Beach pizza place and another for your restaurant in Bernal Heights.

A regular FAQ here in the Moz Q&A Forum relates to how Google will differentiate between two businesses located in the same city. Here are some tips:

  • Google no longer supports the use of modifiers in the business name field, so you can no longer be Tony’s Pizza – Bernal Heights, unless your restaurant is actually named this. You can only be Tony’s Pizza.
  • Facebook’s policies are different from Google’s. To my understanding, Facebook won’t permit you to build more than one Facebook Place for the identical brand name. Thus, to comply with their guidelines, you must differentiate by using those neighborhood names or other modifiers. Given that this same rule applies to all of your competitors, this should not be seen as a danger to your NAP consistency, because apparently, no multi-location business creating Facebook Places will have 100% consistent NAP. The playing field is, then, even.
  • The correct place to differentiate your businesses on all other platforms is in the address field. Google will understand that one of your branches is on A St. and the other is on B St. and will choose which one they feel is most relevant to the user.
  • Google is not a fan of call centers. Unless it’s absolutely impossible to do so, use a unique local phone number for each physical location to prevent mix-ups on Google’s part, and use this number consistently across all web-based mentions of the business.
  • Though you can’t put your neighborhood name in the title, you can definitely include it in the business description field most citation platforms provide.
  • Link your citations to their respective local landing pages on your website, not to your homepage.

3. Multiple businesses, multiple cities

Everything in business model #2 applies to you as well. You are allowed one set of citations for each of your physical locations, and while you can’t modify your Google+ Local business name, you can mention your neighborhood in the description. Promote each location equally in all you do and then rely on Google to separate your locations for various users based on your addresses and phone numbers.

4. SABs

You are exactly like business model #1 when it comes to citations, with the exception of needing to abide by Google’s rules about hiding your address if you don’t serve customers at your place of business. Don’t build out additional citations for neighborhoods you serve, other cities you serve or various service offerings. Just create one citation set. You should be fine mentioning some neighborhoods in your citation descriptions, but don’t go overboard on this.

When it comes to review management, you’ll be managing unique sets of reviews for each of your physical locations. One method for preventing business owner burnout is to manage each location in rotation. One week, tend to owner responses for Business A. Do Business B the following week. In week three, ask for some reviews for Business A and do the same for B in week four. Vary the tasks and take your time unless faced with a sudden reputation crisis.

You can take some additional steps to “hyperlocalize” your review profiles:

  • Write about your neighborhood in the business description on your profile.
  • You can’t compel random customers to mention your neighborhood, but you can certainly do so from time to time when you write responses. “We’ve just installed the first soda fountain Bernal Heights has seen since 1959. Come have a cool drink on us this summer.”
  • Offer a neighborhood special to people who bring in a piece of mail with their address on it. Prepare a little handout for all-comers, highlighting a couple of review profiles where you’d love to hear how they liked the Bernal Heights special. Or, gather email addresses if possible and follow up via email shortly after the time of service.
  • If your business model is one that permits you to name your goods or service packages, don’t forget the tip mentioned earlier about thinking hyperlocal when brainstorming names. Pretty cool if you can get your customers talking about how your “North Beach Artichoke Pizza” is the best pie in town!

Investigate your social-hyperlocal opportunities

I still consider website-based content publication to be more than half the battle in ranking locally, but sometimes, real-time social outreach can accomplish things static articles or scheduled blog posts can’t. The amount of effort you invest in social outreach should be based on your resources and an assessment of how naturally your industry lends itself to socialization. Fire insurance salesmen are going to find it harder to light up their neighborhood community than yoga studios will. Consider your options:

Remember that you are investigating each opportunity to see how it stacks up not just to promoting your location in your city, but in your neighborhood.

Who are the people in your neighborhood?

Remember that Sesame Street jingle? It hails from a time when urban dwellers strongly identified with a certain district of their hometown. People were “from the neighborhood.” If my grandfather was a Mission District fella, maybe yours was from Chinatown. Now, we’re shifting in fascinating directions. Even as we’ve settled into telecommuting to jobs in distant states or countries, Amazon is offering one-hour home delivery to our neighbors in Manhattan. Doctors are making house calls again! Any day now, I’m expecting a milkman to start making his rounds around here. Commerce has stretched to span the globe, and now it’s zooming in to meet the needs of the family next door.

If the big guys are setting their sights on near-instant services within your community, take note. You live in that community. You talk, face-to-face, with your neighbors every day and know the flavor of the local scene better than any remote competitor can right now.

Now is the time to reinvigorate that old neighborhood pride in the way you’re visualizing your business, marketing it and personally communicating to customers that you’re right there for them.



Illustrated Guide to Advanced On-Page Topic Targeting for SEO

Posted by Cyrus-Shepard

Topic n. A subject or theme of a webpage, section, or site.

Several SEOs have recently written about topic modeling and advanced on-page optimization. A few of note:

The concepts themselves are dizzying: LDA, co-occurrence, and entity salience, to name only a few. The question is “How can I easily incorporate these techniques into my content for higher rankings?”

In fact, you can create optimized pages without understanding complex algorithms. Sites like Wikipedia, IMDB, and Amazon create highly optimized, topic-focused pages almost by default. Utilizing these best practices works exactly the same when you’re creating your own content.

The purpose of this post is to provide a simple framework for on-page topic targeting in a way that makes optimizing easy and scalable while producing richer content for your audience.

1. Keywords and relationships

No matter what topic modeling technique you choose, all rely on discovering relationships between words and phrases. As content creators, how we organize words on a page greatly influences how search engines determine the on-page topics.

When we use keyword phrases, search engines hunt for other phrases and concepts that relate to one another. So our first job is to expand our keyword research to incorporate these related phrases and concepts. Contextually rich content includes:

  • Close variants and synonyms: Includes abbreviations, plurals, and phrases that mean the same thing.
  • Primary related keywords: Words and phrases that relate to the main keyword phrase.
  • Secondary related keywords: Words and phrases that relate to the primary related keywords.
  • Entity relationships: Concepts that describe the properties and relationships between people, places, and things.

A good keyword phrase or entity is one that predicts the presence of other phrases and entities on the page. For example, a page about “The White House” predicts other phrases like “president,” “Washington,” and “Secret Service.” Incorporating these related phrases may help strengthen the topicality of “White House.”

2. Position, frequency, and distance

How a page is organized can greatly influence how concepts relate to each other.

Once search engines find your keywords on a page, they need to determine which ones are most important, and which ones actually have the strongest relationships to one another.

Three primary techniques for communicating this include:

  • Position: Keywords placed in important areas like titles, headlines, and higher up in the main body text may carry the most weight.
  • Frequency: Using techniques like TF-IDF, search engines determine important phrases by calculating how often they appear in a document compared with how often they appear across the wider collection of documents.
  • Distance: Words and phrases that relate to each other are often found close together, or grouped by HTML elements. This means leveraging semantic distance to place related concepts close to one another using paragraphs, lists, and content sectioning.
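The TF-IDF idea in the frequency bullet above can be illustrated with a minimal sketch. This is a textbook simplification with naive smoothing, not any search engine's actual formula; the toy corpus is invented for the example:

```python
import math

# Minimal TF-IDF sketch: a term scores highly when it is frequent in one
# document but rare across the corpus as a whole.

def tf_idf(term: str, doc: list, corpus: list) -> float:
    tf = doc.count(term) / len(doc)                       # term frequency in this doc
    docs_with_term = sum(1 for d in corpus if term in d)  # document frequency
    idf = math.log(len(corpus) / (1 + docs_with_term))    # naive smoothing
    return tf * idf

corpus = [
    ["white", "house", "president", "washington"],
    ["white", "paint", "colors"],
    ["house", "prices", "market"],
]
doc = corpus[0]
# "president" appears only in this document, so it outscores the common "white".
print(tf_idf("president", doc, corpus) > tf_idf("white", doc, corpus))  # -> True
```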

A great way to organize your on-page content is to employ your primary and secondary related keywords in support of your focus keyword. Each primary related phrase becomes its own subsection, with the secondary related phrases supporting the primary, as illustrated here.

Keyword Position, Frequency and Distance

As an example, the primary keyword phrase of this page is ‘On-page Topic Targeting’. Supporting topics include: keywords and relationships, on-page optimization, links, entities, and keyword tools. Each related phrase supports the primary topic, and each becomes its own subsection.

3. Links and supplemental content

Many webmasters overlook the importance of linking as a topic signal.

Several well-known Google search patents and early research papers describe analyzing a page’s links as a way to determine topic relevancy. These include both internal links to your own pages and external links to other sites, often with relevant anchor text.

Google’s own Quality Rater Guidelines cite the value of external references to other sites. They also describe a page’s supplemental content, which can include internal links to other sections of your site, as a valuable resource.

Links and Supplemental Content

If you need an example of how relevant linking can help your SEO, The New York Times famously saw success, and an increase in traffic, when it started linking out to other sites from its topic pages.

Although this guide discusses on-page topic optimization, topical external links with relevant anchor text can greatly influence how search engines determine what a page is about. These external signals often carry more weight than on-page cues, but things almost always work best when on-page and off-page signals are in alignment.

4. Entities and semantic markup

Google extracts entities from your webpage automatically, without any effort on your part. These are people, places, and things that have distinct properties and relationships with each other.

• Christopher Nolan (entity, person) stands 5’4″ (property, height) and directed Interstellar (entity, movie)

Even though entity extraction happens automatically, it’s often essential to mark up your content with Schema for specific supported entities such as business information, reviews, and products. While the ranking benefit of adding Schema isn’t 100% clear, structured data has the advantage of enhanced search results.

Entities and Schema

For a solid guide to implementing schema.org markup, see Builtvisible’s excellent guide to rich snippets.
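As a concrete illustration of the structured data discussed above, here is a minimal schema.org LocalBusiness object serialized as JSON-LD, the markup format Google supports for structured data. The business details are invented for the example:

```python
import json

# A minimal schema.org LocalBusiness snippet built as a dict and serialized
# to JSON-LD. All values below are made up for illustration.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Tony's Pizza",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Columbus Ave",
        "addressLocality": "San Francisco",
        "addressRegion": "CA",
    },
    "telephone": "+1-415-555-0123",
}

# On a real page this JSON would sit inside a
# <script type="application/ld+json"> tag in the document head or body.
print(json.dumps(markup, indent=2))
```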

5. Crafting the on-page framework

You don’t need to be a search genius or spend hours on complex research to produce high quality, topic optimized content. The beauty of this framework is that it can be used by anyone, from librarians to hobby bloggers to small business owners; even when they aren’t search engine experts.

A good webpage has much in common with a high quality university paper. This includes:

  1. A strong title that communicates the topic
  2. Introductory opening that lays out what the page is about
  3. Content organized into thematic subsections
  4. Exploration of multiple aspects of the topic, with answers to related questions
  5. Provision of additional resources and external citations

Your webpage doesn’t need to be academic, stuffy, or boring. Some of the most interesting pages on the Internet employ these same techniques while remaining dynamic and entertaining.

Keep in mind that ‘best practices’ don’t apply to every situation, and as Rand Fishkin says, “There’s no such thing as ‘perfectly optimized’ or ‘perfect on-page SEO.’” Pulling everything together looks something like this:

On-page Topic Targeting for SEO

This graphic is highly inspired by Rand Fishkin’s great Visual Guide to Keyword Targeting and On-Page SEO. This guide doesn’t replace that canonical resource; instead, it should be considered a supplement to it.

5 alternative tools for related keyword and entity research

For the search professional, there are dozens of tools available for thematic keyword and entity research. This list is not exhaustive by any means, but contains many useful favorites.

1. Alchemy API

One of the few tools on the market that delivers entity extraction, concept targeting and linked data analysis. This is a great platform for understanding how a modern search engine views your webpage.

2. SEO Review Tools

The SEO Keyword Suggestion Tool was designed to return both primary and secondary related keywords, as well as options for synonyms and country targeting.

3. LSIKeywords.com

The LSIKeywords.com tool performs Latent Semantic Indexing (LSI) on the top pages returned by Google for any given keyword phrase. The tool can go down from time to time, but it’s a great one to bookmark.

4. Social Mention

Quick and easy: enter any keyword phrase, then check “Top Keywords” to see what words appear most with your primary phrase across the platforms that Social Mention monitors.

5. Google Trends

Google Trends is a powerful related-keyword research tool, if you know how to use it. The secret is downloading your results to a CSV (under settings) to get a list of up to 50 related keywords per search term.


Reblogged 5 years ago from feedproxy.google.com

The Future of Link Building

Posted by Paddy_Moogan

Building the types of links that help grow your online business and organic search traffic is getting harder. It used to be fairly straightforward, back before Google worked out how to treat links with different levels of quality and trust. However, the fact that it’s getting harder doesn’t mean that it’s dead.

What does the future hold?

I’m going to talk about links, but the truth is, the future isn’t really about the links. It is far bigger than that.

Quick sidenote: I’m aware that doing a blog post about the future of link building the week of a likely Penguin update could leave me with egg on my face! But we’ll see what happens.

Links will always be a ranking factor in some form or another. I can see the dials being turned down or off on certain aspects of links (more on that below), but I think they will always be there. Google is always looking for more data, more signals, more indicators of whether or not a certain page is a good result for a user at a certain moment in time. They will find them, too, as we can see from patents such as this. A natural consequence is that other signals may be diluted or even replaced as Google becomes smarter and understands the web and users a lot better.

What this means for the future is that the links valued by Google will be the ones you get as a result of having a great product and great marketing. Essentially, links will be symptomatic of amazing marketing. Hat tip to Jess Champion, from whom I’ve borrowed this term.

This isn’t easy, but it shouldn’t be. That’s the point.

To go a bit further, I think we also need to think about the bigger picture. In the grand scheme of things, there are so many more signals that Google can use which, as marketers, we need to understand and use to our advantage. Google is changing and we can’t bury our heads in the sand and ignore what is going on.

A quick side note on spammy links

My background is a spammy one, so I can’t help but address this quickly. Spam will continue to work for short-term hits and churn-and-burn websites. I’ve talked before about my position on this, so I won’t go into too much more detail here. I will say, though, that the people in the top 1% of spammers will continue to make money, but even for them, it will be hard to maintain over a long period of time.

Let’s move onto some more of the detail around my view of the future by first looking at the past and present.

What we’ve seen in the past

Google didn’t understand links.

The fundamental issue that Google had for a long, long time was that they didn’t understand enough about links. They didn’t understand things such as:

  • How much to trust a link
  • Whether a link was truly editorially given or not
  • Whether a link was paid for or not
  • If a link was genuinely high quality (PageRank isn’t perfect)
  • How relevant a link was

Whilst they still have work to do on all of these, they have gotten much better in recent years. At one time, a link was a link and it was pretty much a case of whoever had the most links, won. I think that for a long time, Google was trying very hard to understand links and find which ones were high quality, but there was so much noise that it was very difficult. I think that eventually they realised that they had to attack the problem from a different angle, and Penguin came along. So instead of focusing on finding the “good” signals of links, they focused on finding the “bad” signals and started to take action on them. This didn’t fix everything, but it did enough to shock our industry into moving away from certain tactics and therefore has probably helped reduce a lot of the noise that Google was seeing.

What we’re seeing right now

Google is understanding more about language.

Google is getting better at understanding everything. Hummingbird was just the start of what Google hopes to achieve on this front, and it stands to reason that the same kind of technology that helps the following query work will also help Google understand links better.

Not many people in the search industry said much when Google hired this guy back in 2012. We can be pretty sure that it’s partly down to his work that we’re seeing the type of understanding of language that we are. His work has only just begun, though, and I think we’ll see more queries like the one above that just shouldn’t work, but do. I also think we’ll see more instances of Googlers not knowing why something ranks where it does.

Google is understanding more about people.

I talk about this a little more below but to quickly summarise here, Google is learning more about us all the time. It can seem creepy, but the fact is that Google wants as much data as possible from us so that they can serve more relevant search results—and advertising of course. They are understanding more that the keywords we type into Google may not actually be what we want to find, nor are those keywords enough to find what we really want. Google needs more context.

Tom Anthony has talked about this extensively, so I won’t go into loads more detail. But to bring it back to link building, it is important to be aware of this because it means that there are more and more signals that could mean the dial on links gets turned down a bit more.

Some predictions about the future

I want to make a few things more concrete about my view of the future for link building, so let’s look at a few specifics.

1. Anchor text will matter less and less

Anchor text as a ranking signal was always something that worked well in theory but not in reality. Even in my early days of link building, I couldn’t understand why Google put so much weight behind this one signal. My main reason for this view was that using exact-match keywords in a link was not natural for most webmasters. I’d go as far as to say the only people who used it were SEOs!

I don’t think we’re at a point yet where anchor text as a ranking signal is dead, and it will take some more time for Google to turn down the dial. But we are definitely at a point where you can get hurt pretty badly if you have too much commercial anchor text in your link profile. It just isn’t natural.

In the future, Google won’t need this signal. They will be much better at understanding the content of a page and importantly, the context of a page.
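
In the meantime, it’s still worth auditing how commercial your own anchor-text profile looks. Here’s a minimal sketch of that idea; the commercial-term list and any threshold you compare the share against are assumptions for illustration, not figures Google has published.

```python
from collections import Counter

# Illustrative only: these "commercial" anchors are made up for the example.
COMMERCIAL_TERMS = {"buy laptops", "cheap laptops", "best laptop deals"}

def anchor_profile(anchors):
    """Summarise how often each anchor text appears in a link profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    commercial = sum(n for a, n in counts.items() if a in COMMERCIAL_TERMS)
    return {
        "total_links": total,
        "commercial_share": commercial / total if total else 0.0,
        "top_anchors": counts.most_common(3),
    }
```

Run this over an anchor-text export from your backlink tool of choice; a profile dominated by exact-match commercial phrases rather than brand and URL anchors is exactly the unnatural pattern described above.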

2. Deep linking will matter less and less

I was on the fence about this one for a long time but the more I think about it, the more I can see this happening. I’ll explain my view here by using an example.

Let’s imagine you’re an eCommerce website and you sell laptops. Obviously each laptop you sell will have its own product page, and if you sell different types, you’ll probably have category pages too. With a product like laptops, chances are that other retailers sell the same ones with the same specifications and probably have very similar-looking pages to yours. How does Google know which one to rank above the others?

Links to these product pages can work fine but, in my opinion, are a bit of a crude way of working it out. I think that Google will get better at understanding the subtle differences in queries from users, which will naturally mean that deep links to these laptop pages become just one of many signals they can use.

Take these queries:

  • “laptop reviews” (Context: I want to buy a laptop but I don’t know which one.)
  • “asus laptop reviews” (Context: I like the sound of Asus, I want to read more about their laptops.)
  • “sony laptop reviews” (Context: I also like the sound of Sony, I want to read more about their laptops.)
  • “sony vs asus laptop” (Context: I’m confused, they both sound the same so I want a direct comparison to help me decide.)
  • “asus laptop” (Context: I want an Asus laptop.)
You can see how the mindset of the user has changed over time, and we can easily imagine how the search results will have changed to reflect this. Google already understands this. There are other signals coming into play here too, though. What about these bits of additional information that Google can gather about us:

  • Location: I’m on a bus in London, I may not want to buy a £1,000 laptop right now but I’ll happily research them.
  • Device: I’m on my iPhone 6, I may not want to input credit card details into it and I worry that the website I’m using won’t work well on a small screen.
  • Search history: I’ve searched for laptops before and visited several retailers, but I keep going back to the same one as I’ve ordered from them before.

These are just a few that are easy to imagine Google using. There are loads more that Google could look at, not to mention signals from the retailers themselves such as secure websites, user feedback, third-party reviews, trust signals, etc.

When you start adding all of these signals together, it’s pretty easy to see why links to a specific product page may not be the strongest signal for Google to use when determining rankings.
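
To make that concrete, here is a purely hypothetical sketch of combining several weak signals into one score. Neither the signal names nor the weights come from Google; they simply illustrate how a page with weaker deep links but a strong contextual fit could come out ahead.

```python
# Hypothetical weights, invented for illustration; Google publishes
# nothing like this.
WEIGHTS = {
    "deep_links": 0.25,
    "device_fit": 0.25,
    "location_relevance": 0.25,
    "search_history_affinity": 0.25,
}

def context_score(signals):
    """Combine 0-1 signal values into a single weighted score."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)
```

Under this toy model, a product page with a 0.9 deep-link signal but weak context everywhere else scores lower than a rival with modest links and strong device, location, and history signals, which is exactly the point: no single signal has to dominate.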

Smaller companies will be able to compete more.

One of the things I loved about SEO when I first got into it was the fact that organic search felt like a level playing field. I knew that with the right work, I could beat massive companies in the search results and not have to spend a fortune doing it. Suffice to say, things have changed quite a bit now and there are some industries where you stand pretty much zero chance of competing unless you have a very big budget to spend and a great product.

I think we will see a shift back in the other direction and smaller companies with fewer links will be able to rank for certain types of queries with a certain type of context. As explained above, context is key and allows Google to serve up search results that meet the context of the user. This means that massive brands are not always going to be the right answer for users and Google have to get better at understanding this. Whether a company is classified as a “brand” or not can be subjective. My local craft beer shop in London is the only one in the world and if you were to ask 100 people if they’d heard of it, they’d all probably say no. But it’s a brand to me because I love their products, their staff are knowledgeable and helpful, their marketing is cool and I’d always recommend them.

Sometimes, showing the website of this shop above bigger brands in search results is the right thing to do for a user. Google need lots of additional signals beyond “branding” and links in order to do this but I think they will get them.

What all of this means for us

Predicting the future is hard, knowing what to do about it is pretty hard too! But here are some things that I think we should be doing.

  1. Ask really hard questions
    Marketing is hard. If you or your client wants to compete and win customers, then you need to be prepared to ask really hard questions about the company. Here are just a few that I’ve found difficult when talking to clients:

    • Why does the company exist? (A good answer has nothing to do with making money)
    • Why do you deserve to rank well in Google?
    • What makes you different to your competitors?
    • If you disappeared from Google tomorrow, would anyone notice?
    • Why do you deserve to be linked to?
    • What value do you provide for users?

    The answers to these won’t always give you that silver bullet, but they can provoke conversations that make the client look inwardly and at why they should deserve links and customers. These questions are hard to answer, but again, that’s the point.

  2. Stop looking for scalable link building tactics

    Seriously, just stop. Anything that can be scaled tends to lose quality, and anything that scales is likely to be targeted by the Google webspam team at some point. A recent piece of content we did at Distilled has so far generated links from over 700 root domains—we did NOT send 700 outreach emails! This piece took on a life of its own and generated those links after some promotion by us, but at no point did we worry about scaling outreach for it.

  3. Start focusing on doing marketing that users love

    I’m not necessarily talking about doing the next Volvo ad or being the next Old Spice guy. If you can, then great, but these are out of reach for most of us. That doesn’t mean you can’t do marketing that people love. I often look at companies like Brewdog and Hawksmoor, who do great marketing around their products but in a way that has personality and appeal. They don’t have to spend millions of dollars on celebrities or TV advertising because they have a great product and a fun marketing message. They have value to add, which is the key; they don’t need to worry about link building because they get links naturally by doing cool stuff.

    Whilst I know that “doing cool stuff” isn’t particularly actionable, I still think it’s fair to say that marketing needs to be loved. In order to do marketing that people love, you need to have some fun and focus on adding value.

  4. Don’t bury your head in the sand

    The worst thing you can do is ignore the trends and changes taking place. Google is changing, user expectations and behaviours are changing, our industry is changing. As an industry, we’ve adapted very well over the last few years. We have to keep doing this if we’re going to survive.

    Going back to link building, you need to accept that this stuff is really hard and building the types of links that Google value is hard.

In summary

Links aren’t going anywhere. But the world is changing and we have to focus on what truly matters: marketing great products and building a loyal audience. 

