How Your Brand Can Create an Enviable Customer Experience for Mobile Web Searchers

Posted by ronell-smith

[Image: a not-very-edible plate of corned beef hash]

Here I am, seated in a Manhattan, New York restaurant, staring at corned beef hash that looks and tastes like what I imagine dog food to look and taste like.

I’m pissed for two reasons:

  • It cost nearly $25 and was entirely inedible
  • I should have known better, given the images that came up when I did a Google image search for the dish at this nearby restaurant

In retrospect, I should have checked both the price and those images on my phone before ordering the $25 plate of Alpo. And though I didn’t do that, other would-be customers will, which means the business owner or SEO had better follow the steps below if they wish to stay in business.

The bad news is I no longer relish the thought of eating at high-end NY restaurants; the good news is this experience totally reshaped the way I view mobile, opening my eyes to simple but very effective tactics businesses of all types can immediately put to use for their brands.

My mobile education

We’ve all heard how mobile is transforming the web experience, reshaping the landscape for marketers, brands and consumers.


As marketers, we now have to account for how our content will be accessed and consumed on mobile devices, whether that’s a phone, tablet or phablet. As brands, we realize our efforts will be judged not only on how well or how high we show up in the SERPs, but also on how much we can delight the on-the-go prospect who needs information that’s (a) fast, (b) accurate and (c) available from any device.

As prospects and consumers, we’ve come to know and value customer experience in large part because brands that use mobile to deliver what we need, when we need it, in a way that’s easily consumed, have earned our attention — and maybe even our dollars.

But that’s where the similarities seemingly end. Marketers and brands seem to get so wrapped up in the technology (responsive design, anyone?) they forget that, at the end of the day, prospects want what they want right now — in the easiest-to-access way possible.

I’ve come to believe that, while marketers appreciate the overall value of mobile, they have yet to realize how, for customers, it’s all about what it allows them to accomplish.

At the customer/end-user level it’s not about mobile-friendly or responsive design; it’s about creating an enviable customer experience, one web searchers will reward you for with traffic, brand mentions and conversions.

I was alerted to the prominence of mobile phone use by noticing how many people sit staring at their phones while out at dinner, even as family members and friends are seated all around them. “How rude,” I thought. Then I realized it wasn’t only the people at restaurants; it’s people everywhere: walking down the street, driving (sadly and dangerously), sitting in movie theaters, at work, even texting while they talk on the phone.

One of my favorite comments with regard to mobile’s dominance comes from the Wizard of Moz himself, in a tweet and accompanying image he shared last year.

But my “aha!” moment happened last year, in Manhattan, during the corned beef hash episode.

After working until brunch, I…

  1. Opened iPhone to Google
  2. Typed “Best corned beef hash near me”
  3. Scanned the list of restaurants by distance and reviews
  4. Selected the closest restaurant with a > 4-star review rating
  5. Ended up disappointed

That’s when it hit me that I’d made errors of omission at every step, in large part by leaving one very important element out of the process, but also by not thinking like a smart web user.

When I wish to enjoy a specific meal while traveling, my process is normally as follows:

  1. Open iPhone to Google Search box
  2. Type “Best _________ near me”
  3. Scan list of restaurants by distance and reviews
  4. Select a restaurant with a > 4-star overall rating that also has excellent reviews (> 4.5) of the specific dish I want, plus great images of the dish online
  5. Delight ensues

That’s when three things occurred to me like a brickbat to the noggin’:

  • This is a process I use quite often, and it has proved nearly foolproof
  • It’s undoubtedly a process many other would-be customers are using to identify desirable products and services
  • Marketers can reverse-engineer the process to bring the customers they’re hoping for to their doors or websites.

(Eds. note: This post was created with small business owners (single or multiple location), or those doing Local SEO for SMBs, in mind, as I hope to inform them of how individuals think about and use mobile, and how marketers can get in front of them with relevant content. Also, I’d like to thank Cindy Krum of Mobile Moxie for encouraging me to write this post, and Local SEO savant Phil Rozek of Local Visibility System for making sure I colored within the lines.)

Five ways to create an enviable customer experience on mobile

#1 — Optimize your images

Image optimization is the quintessential low-hanging fruit of online marketing: easy to accomplish but typically overlooked.

For our purposes, we aren’t so much making them “mobile-friendly” as we are making them search-friendly, increasing the likelihood that Google’s crawlers can better decipher what they contain and deliver them for the optimal search query.

First and foremost, do not use a stock image if your goal is for searchers to find, read and enjoy your content. Just don’t. Also, given how much of a factor site speed is, compress your images to ensure they don’t hamper page load times.
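To make the speed point concrete, here’s a minimal sketch (file names hypothetical) of serving appropriately sized image variants via srcset, so mobile visitors aren’t forced to download desktop-sized files:

    <img src="cornedbeefhash-800.jpg"
         srcset="cornedbeefhash-400.jpg 400w, cornedbeefhash-800.jpg 800w"
         sizes="(max-width: 600px) 400px, 800px"
         alt="Corned beef hash with minced meat and diced potatoes">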

But the three main areas I want us to focus on are file name, alt text and title text, and captions. My standard for each is summed up very well in a blog post from Ian Lurie, who proposes an ingenious idea:

The Blank Sheet of Paper Test: If you wrote this text on a piece of paper and showed it to a stranger, would they understand the meaning? Is this text fully descriptive?

With this thinking in mind, image optimization becomes far simpler:

  • File name: We’re all adults here — don’t be thickheaded and choose something like “DSC9671.png” when “cornedbeefhash.jpg” clearly works better.
  • Alt text and title text: Given that, in Google’s eyes, these two are the priorities, you must make certain they’re as descriptive as possible. Clearly list what the image is and/or contains without weighing it down with unneeded text. Using the corned beef hash from above as an example, “corned beef hash with minced meat” would be great, but “corned beef hash with minced meat and diced potatoes” would work better, alerting me that the dish isn’t what I’m looking for. (I prefer shredded beef and shredded potatoes.)
  • Caption: Yes, I know these aren’t necessary for every post, but why leave your visitors hanging, especially if an optimal customer experience is the goal? Were I to caption the corned beef, it’d be something along the lines of “Corned beef hash with minced meat and diced potatoes is one of the most popular dishes at XX.” It says just enough without trying to say everything, which is the goal, says Lurie.

“‘Fully descriptive’ means ‘describes the thing to which it’s attached,’ not ‘describe the entire universe,’” he adds.
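Pulling the three areas together, here’s a minimal sketch of what well-optimized image markup might look like (the restaurant name and file path are hypothetical):

    <figure>
      <img src="/images/cornedbeefhash.jpg"
           alt="Corned beef hash with minced meat and diced potatoes"
           title="Corned beef hash with minced meat and diced potatoes">
      <figcaption>Corned beef hash with minced meat and diced potatoes is one of the most popular dishes at Hypothetical Bistro.</figcaption>
    </figure>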

Also, invite customers to take and share pictures online (e.g., websites, Instagram, Yelp, Google) and include as much rich detail as possible.

What’s more, it might behoove you to have a Google Business View photo shoot, says Rozek. “Those show up most prominently (in the Knowledge Panel) for brand-name mobile searches in Google.”

#2 — Make reviews a priority

Many prospects and customers treat reviews as a make-or-break factor when making purchases. Brands have taken note, making it their charge to get positive reviews.

But not all reviews are created equal.

Instead of making certain your brand gets positive reviews across the entirety of its products and services, redouble your efforts at getting positive reviews on your bread-and-butter services.

In many instances, what people have to say about your individual services and/or products matters more than your brand’s overall review ratings.

I learned this from talking to several uber-picky foodie friends, who shared that the main thing they look for is a brand with an overall rating (e.g., on Yelp, Google, Angie’s List, Amazon, etc.) higher than 3.5, along with customer comments glorifying the specific product they’re hoping to enjoy.

“These days, everyone is gaming the system, doing what they can to get their customers to leave favorable reviews,” said one friend, who lives in Dallas. “But discerning [prospects] are only looking at the overall rating as a beginning point. From there, they’re digging into the comments, looking to see what people have to say about the very specific thing they want. [Smart brands] would focus more on getting people to leave comments about the particular service they used, how happy they were with the result and how it compares to other [such services they’ve used]. We may be on our phones, but we’re still willing to dig into those comments.”

To take advantage of this behavior:

  • In addition to asking for a favorable review, ask customers to comment on the specific services they used, providing as much detail as possible
  • Redouble your efforts at over-delivering on quality service when it comes to your core offerings
  • Ask a few of your regulars, who have left comments on review sites, what they think meets the minimum expectation for provoking folks to leave a review (e.g., optimizing for the desired behavior)
  • Encourage reviewers to upload photos with their reviews (or even just photos, if they don’t want to review you). They’re great “local content,” they’re useful as social-proof elements, and your customers may take better pictures than you do, in which case you can showcase them on your site.
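If you also display reviews on your own site, schema.org review markup can help search engines tie those comments to the specific product or service being discussed. Here’s a minimal sketch, with every name and number hypothetical:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Corned Beef Hash",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "32"
      }
    }
    </script>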


#3 — Shorten your content

I serve as a horrible spokesperson for content brevity, but it matters a great deal to mobile searchers. What works fine on desktop is a clutter-fest on mobile, even for sites using responsive design.

As a general rule, simplicity wins.

For example, Whataburger’s mobile experience is uncluttered and appealing to the eye, and it makes clear what they want me to do: learn about their specials or make a purchase:

[Screenshot: Whataburger’s mobile homepage]

On the other hand, McDonald’s isn’t so sure what I’m looking for, apparently:

[Screenshot: McDonald’s mobile homepage]

Are they trying to sell me potatoes, to convince me of how committed they are to freshness, or to learn as much as they can about me? Or all of the above?

Web searchers have specific needs and are typically short on time and patience, so you have to get in front of them with the right message to have a chance.

When it comes to the content you deliver, think tight (shorter), punchy (attention-grabbing) and valuable (on-message for the query).

#4 — Optimize for local search

Like all of you, I’ve been using “near me” searches for years, especially when I travel. But over the last year, these searches have gotten more thorough and more accurate, in large part as a result of Google’s Mobile Update and because the search giant is making customer intent a priority.

In 2015, Google reported that “near me” searches had increased 34-fold since 2011.

And though most of these “near me” searches are for durable goods/appliances and their associated retailers, searches for services (“surgeons near me,” “plumbers near me,” “jobs near me,” etc.) and other offerings that are typically in a high consideration set are growing considerably, according to Google via its website, thinkwithgoogle.com.

A recent case study of 82 websites (41 in the control group; 41 in the test group) shows just how dramatic the impact of optimizing a site for local intent can be. By tweaking the titles, descriptions and H1s of the hours and directions pages to use the phrases “franchise dealer near me” and “nearest franchise dealer,” the test group saw its mobile “near me” impressions more than double, to 8,833 impressions and 46 clicks. (The control group’s “near me” impression share rose only 11%.)

[Chart: “near me” impression growth. Image courtesy of CDK Global]
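Here’s a sketch of the kind of tweak described above, with the dealer name and wording entirely hypothetical:

    <title>Hours and Directions | Hypothetical Motors, Your Nearest Franchise Dealer</title>
    <meta name="description" content="Searching for a franchise dealer near me? Get hours and directions for Hypothetical Motors, the nearest franchise dealer to downtown.">
    <h1>Your Nearest Franchise Dealer: Hours and Directions</h1>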

Additional steps for optimizing your site for “near me” searches

  • Prominently display your business name, address and phone number (aka, NAP) on your site
  • Use schema markup in your NAP (a minimal sketch follows this list)
  • In addition to proper setup and optimization of your Google My Business listing, provide each location with its own listing and, just as important, ensure that the business name, address and phone number of each location matches what’s listed on the site
  • Consider embedding a Google Map prominently on your website. “It’s good for user experience,” says Rozek. “But it may also influence rankings.”
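Here’s the NAP schema sketch mentioned above, using the JSON-LD format; every business detail is hypothetical:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "LocalBusiness",
      "name": "Hypothetical Bistro",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "New York",
        "addressRegion": "NY",
        "postalCode": "10001"
      },
      "telephone": "+1-212-555-0123"
    }
    </script>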

#5 — Use Google App Deep Linking

We’ve all heard the statistics: The vast majority — in some circles the figure is 95% — of apps downloaded to mobile devices are never used. Don’t be deceived, however, into believing apps are irrelevant.

Nearly half of all time spent on the web is in apps.

This means that the mobile searchers looking for products or services in your area are likely using an app or are, at the very least, being prompted to use one.

For example, when I type “thai restaurant near me,” the first organic result is TripAdvisor.

[Screenshot: mobile search results for “thai restaurant near me,” with TripAdvisor as the first organic result]

Upon entering the site, the first (and preferred) action the brand would like me to take is downloading the TripAdvisor app:

[Screenshot: TripAdvisor’s mobile site prompting an app download]

Many times, a “near me” search will take us to content within an app, and we won’t even realize it until we see the “continue in XX app or visit the mobile site” banner.

And if a searcher doesn’t have the app installed, “Google can show an app install button. So, enabling your app for Google indexing could actually increase the installed base of the app,” writes Eric Enge of Stone Temple Consulting.

For brands, App Deep Linking (ADL), which he defines as “the ability for Google to index content from within an app and then display it as mobile search results,” has huge implications if utilized properly.

“Think about it,” he writes. “If your app is not one of the fortunate few that get most of the attention, but your app content ranks high in searches, then you could end up with a lot more users in your app than you might have had otherwise.”

(To access details on how to set up Google App Deep Linking, read Enge’s Search Engine Land article: SMX Advanced recap: Advanced Google App Deep Linking)
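For context, the on-page half of Google’s app indexing was typically a rel="alternate" link in a page’s head pointing Google at the app’s version of that content. A minimal sketch, with the package name and path hypothetical:

    <link rel="alternate"
          href="android-app://com.hypothetical.dining/http/hypothetical.com/restaurants/thai" />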

If your brand has an app, this is information you shouldn’t sleep on.

Typically, when I conduct a “near me” search, I click on/look through the images until I find one that fits what I’m looking for. Nine times out of ten (depending upon what I’m looking for), I’m either taken to content within an app or taken to a mobile site and prompted to download the app.

Seems to me that ADL would be a no-brainer.

Optimizing for mobile is simply putting web searchers first

For all the gnashing of teeth Google’s many actions/inactions provoke, the search giant deserves credit for making the needs of web searchers a priority.

Too often, we, as marketers, think first and foremost in this fashion:

  1. What do we have to sell?
  2. Who needs it?
  3. What’s the cheapest, easiest way to deliver the product or service?

I think Google is saying to us that the reverse needs to occur:

  1. Make it as fast and easy as possible for people to find what they want
  2. Better understand who’s likely to be looking for it by better understanding our customers and their intent
  3. Begin the sales process by asking, “What specific needs do web searchers have that my brand is uniquely qualified to fulfill?”

In this way, we’re placing the needs of web searchers ahead of the needs of the brand, which will be the winning combination for successful companies in the days ahead.

Brands will either follow suit or fall by the wayside.


Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.


Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and the other is SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we block those in robots.txt? Should we keep them out of the index with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which means permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics, connecting those up and trying to see: Why did we go up or down? Did we have fewer pages being indexed or more? More pages getting traffic or fewer? More keywords or fewer?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like in URL and in anchor, and da, da, da, da. In anchor doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of these. Like, come on. Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you, you have all sorts of errors, and you don’t know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, X-Robots-Tag, NOODP, XML sitemaps, rel=nofollow (a few of these are sketched just after this list). The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or Mozscape, or Ahrefs, or Majestic, or SEMrush, or Alchemy API. Those APIs can have powerful things that they can do for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
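To make a few of the protocols from that list concrete, here’s a minimal sketch of how they might appear in a page’s head (URLs hypothetical):

    <!-- Consolidate duplicate URLs onto one canonical version -->
    <link rel="canonical" href="http://example.com/widgets/" />
    <!-- Keep a thin page out of the index while still letting crawlers follow its links -->
    <meta name="robots" content="noindex, follow" />
    <!-- Tell search engines where the Spanish-language version of this page lives -->
    <link rel="alternate" hreflang="es" href="http://example.com/es/widgets/" />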

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


The Linkbait Bump: How Viral Content Creates Long-Term Lift in Organic Traffic – Whiteboard Friday

Posted by randfish

A single fantastic (or “10x”) piece of content can lift a site’s traffic curves long beyond the popularity of that one piece. In today’s Whiteboard Friday, Rand talks about why those curves settle into a “new normal,” and how you can go about creating the content that drives that change.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about the linkbait bump, a classic phrase in the SEO world, and almost a little dated. I think today we’re talking a little bit more about viral content and how high-quality content, the kind that really is the cornerstone of a brand or a website, can be an incredible and powerful driver of traffic, not just when it initially launches but over time.

So let’s take a look.

This is a classic linkbait bump, viral content bump analytics chart. I’m seeing over here my traffic and over here the different months of the year. You know, January, February, March, like I’m under a thousand. Maybe I’m at 500 visits or something, and then I have this big piece of viral content. It performs outstandingly well from a relative standpoint for my site. It gets 10,000 or more visits, drives a ton more people to my site, and then what happens is that that traffic falls back down. But the new normal down here, new normal is higher than the old normal was. So the new normal might be at 1,000, 1,500 or 2,000 visits whereas before I was at 500.

Why does this happen?

A lot of folks see an analytics chart like this, see examples of content that’s done this for websites, and they want to know: Why does this happen and how can I replicate that effect? The reasons why are it sort of feeds back into that viral loop or the flywheel, which we’ve talked about in previous Whiteboard Fridays, where essentially you start with a piece of content. That content does well, and then you have things like more social followers on your brand’s accounts. So now next time you go to amplify content or share content socially, you’re reaching more potential people. You have a bigger audience. You have more people who share your content because they’ve seen that that content performs well for them in social. So they want to find other content from you that might help their social accounts perform well.

You see more RSS and email subscribers because people see your interesting content and go, “Hey, I want to see when these guys produce something else.” You see more branded search traffic because people are looking specifically for content from you, not necessarily just around this viral piece, although that’s often a big part of it, but around other pieces as well, especially if you do a good job of exposing them to that additional content. You get more bookmark and type in traffic, more searchers biased by personalization because they’ve already visited your site. So now when they search and they’re logged into their accounts, they’re going to see your site ranking higher than they normally would otherwise, and you get an organic SEO lift from all the links and shares and engagement.

So there’s a ton of different factors that feed into this, and you kind of want to hit all of these things. If you have a piece of content that gets a lot of shares, a lot of links, but then doesn’t promote engagement, doesn’t get more people signing up, doesn’t get more people searching for your brand or searching for that content specifically, then it’s not going to have the same impact. Your traffic might fall further and more quickly.

How do you achieve this?

How do we get content that’s going to do this? Well, we’re going to talk through a number of things that we’ve talked about previously on Whiteboard Friday. But there are some additional ones as well. This isn’t just creating good content or creating high-quality content; it’s creating a particular kind of content. So for this what you want is a deep understanding, not necessarily of what your standard users or standard customers are interested in, but a deep understanding of what influencers in your niche will share and promote and why they do that.

This often means that you follow a lot of sharers and influencers in your field, and you understand, hey, they’re all sharing X piece of content. Why? Oh, because it does this, because it makes them look good, because it helps their authority in the field, because it provides a lot of value to their followers, because they know it’s going to get a lot of retweets and shares and traffic. Whatever that because is, you have to have a deep understanding of it in order to have success with viral kinds of content.

Next, you want to have empathy for users and what will give them the best possible experience. So if you know, for example, that a lot of people are coming on mobile and are going to be sharing on mobile, which is true of almost all viral content today, FYI, you need to be providing a great mobile and desktop experience. Oftentimes that mobile experience has to be different, not just responsive design, but actually a different format, a different way of being able to scroll through or watch or see or experience that content.

There are some good examples out there of content that does that. It makes a very different user experience based on the browser or the device you’re using.

You also need to be aware of what will turn them off. So promotional messages, pop-ups, trying to sell to them, oftentimes that diminishes user experience. It means that content that could have been more viral, that could have gotten more shares won’t.

Unique value and attributes that separate your content from everything else in the field. So if there’s like ABCD and whoa, what’s that? That’s very unique. That stands out from the crowd. That provides a different form of value in a different way than what everyone else is doing. That uniqueness is often a big reason why content spreads virally, why it gets more shared than just the normal stuff.

I’ve talked about this a number of times, but content that’s 10X better than what the competition provides. So unique value from the competition, but also quality that is not just a step up, but 10X better, massively, massively better than what else you can get out there. That makes it unique enough. That makes it stand out from the crowd, and that’s a very hard thing to do, but that’s why this is so rare and so valuable.

This is a critical one, and I think one that, I’ll just say, many organizations fail at. That is the freedom and support to fail many times, to try to create these types of effects, to have this impact many times before you hit on a success. A lot of managers and clients and teams and execs just don’t give marketing teams and content teams the freedom to say, “Yeah, you know what? You spent a month and developer resources and designer resources and spent some money to go do some research and contracted with this third party, and it wasn’t a hit. It didn’t work. We didn’t get the viral content bump. It just kind of did okay. You know what? We believe in you. You’ve got a lot of chances. You should try this another 9 or 10 times before we throw it out. We really want to have a success here.”

That is something that very few teams invest in. The powerful thing is because so few people are willing to invest that way, the ones that do, the ones that believe in this, the ones that invest long term, the ones that are willing to take those failures are going to have a much better shot at success, and they can stand out from the crowd. They can get these bumps. It’s powerful.

Not a requirement, but it really, really helps to have a strong engaged community, either on your site and around your brand, or at least in your niche and your topic area that will help, that wants to see you, your brand, your content succeed. If you’re in a space that has no community, I would work on building one, even if it’s very small. We’re not talking about building a community of thousands or tens of thousands. A community of 100 people, a community of 50 people even can be powerful enough to help content get that catalyst, that first bump that’ll boost it into viral potential.

Then finally, for this type of content, you need to have a logical and not overly promotional match between your brand and the content itself. You can see many sites in what I call sketchy niches. So like a criminal law site or a casino site or a pharmaceutical site that’s offering like an interactive musical experience widget, and you’re like, “Why in the world is this brand promoting this content? Why did they even make it? How does that match up with what they do? Oh, it’s clearly just intentionally promotional.”

Look, many of these brands go out there and they say, “Hey, the average web user doesn’t know and doesn’t care.” I agree. But the average web user is not an influencer. Influencers know. Well, they’re very, very suspicious of why content is being produced and promoted, and they’re very skeptical of promoting content that they don’t think is altruistic. So this kills a lot of content for brands that try and invest in it when there’s no match. So I think you really need that.

Now, when you do these linkbait bump kinds of things, I would strongly recommend that you follow up, that you consider the quality of the content that you’re producing thereafter, that you invest in refreshing these resources and keeping them updated, and that you don’t simply give up on content production after this. However, if you’re a small business site, a small or medium business, you might think about only doing one or two of these a year. If you are a heavy content player, you’re doing a lot of content marketing, content marketing is how you’re investing in web traffic, I’d probably be considering these weekly or monthly at the least.

All right, everyone. Look forward to your experiences with the linkbait bump, and I will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Pinpoint vs. Floodlight Content and Keyword Research Strategies – Whiteboard Friday

Posted by randfish

When we’re doing keyword research and targeting, we have a choice to make: Are we targeting broader keywords with multiple potential searcher intents, or are we targeting very narrow keywords where it’s pretty clear what the searchers were looking for? Those different approaches, it turns out, apply to content creation and site architecture, as well. In today’s Whiteboard Friday, Rand illustrates that connection.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about pinpoint versus floodlight tactics for content targeting, content strategy, and keyword research, keyword targeting strategy. This is also called the shotgun versus sniper approach, but I’m not a big gun fan. So I’m going to stick with my floodlight versus pinpoint, plus, you know, for the opening shot we don’t have a whole lot of weaponry here at Moz, but we do have lighting.

So let’s talk through this at first. You’re going through and doing some keyword research. You’re trying to figure out which terms and phrases to target. You might look down a list like this.

Well, maybe, I’m using an example here around antique science equipment. So you see these various terms and phrases. You’ve got your volume numbers. You probably have lots of other columns. Hopefully, you’ve watched the Whiteboard Friday on how to do keyword research like it’s 2015 and not 2010.

So you know you have all these other columns to choose from, but I’m simplifying here for the purpose of this experiment. So you might choose some of these different terms. Now, they’re going to have different kinds of tactics and a different strategic approach, depending on the breadth and depth of the topic that you’re targeting. That’s going to determine what types of content you want to create and where you place it in your information architecture. So I’ll show you what I mean.

The floodlight approach

For antique science equipment, this is a relatively broad phrase. I’m going to do my floodlight analysis on this, and floodlight analysis is basically saying like, “Okay, are there multiple potential searcher intents?” Yeah, absolutely. That’s a fairly broad phrase. People could be looking to transact around it. They might be looking for research information, historical information, different types of scientific equipment that they’re looking for.


Are there four or more approximately unique keyword terms and phrases to target? Well, absolutely, in fact, there’s probably more than that. So antique science equipment, antique scientific equipment, 18th century scientific equipment, all these different terms and phrases that you might explore there.

Is this a broad content topic with many potential subtopics? Again, yes is the answer to this. Are we talking about generally larger search volume? Again, yes, this is going to have a much larger search volume than some of the narrower terms and phrases. That’s not always the case, but it is here.

The pinpoint approach

For pinpoint analysis, we kind of go the opposite direction. So we might look at a term like antique test tubes, which is a very specific kind of search, and that has a clear single searcher intent or maybe two. Someone might be looking for actually purchasing one of those, or they might be looking to research them and see what kinds there are. Not a ton of additional intents behind that. One to three unique keywords, yeah, probably. It’s pretty specific. Antique test tubes, maybe 19th century test tubes, maybe old science test tubes, but you’re talking about a limited set of keywords that you’re targeting. It’s a narrow content topic, typically smaller search volume.


Now, these are going to feed into your IA, your information architecture, and your site structure in this way. So floodlight content generally sits higher up. It’s the category or the subcategory, those broad topic terms and phrases. Those are going to turn into those broad topic category pages. Then you might have multiple, narrower subtopics. So we could go into lab equipment versus astronomical equipment versus chemistry equipment, and then we’d get into those individual pinpoints from the pinpoint analysis.
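One hypothetical way that hierarchy could map onto a site’s URL structure:

    /antique-science-equipment/                            (floodlight category page)
    /antique-science-equipment/lab-equipment/              (narrower subtopic)
    /antique-science-equipment/lab-equipment/test-tubes/   (pinpoint page)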

How do I decide which approach is best for my keywords?

Why are we doing this? Well, generally speaking, if you can take your terms and phrases and categorize them like this and then target them differently, you’re going to provide a better, more logical user experience. Someone who searches for antique scientific equipment, they’re going to really expect to see that category and then to be able to drill down into things. So you’re providing them the experience they predict, the one that they want, the one that they expect.

It’s better for topic modeling analysis and for all of the algorithms around things like Hummingbird, where Google looks at: Are you using the types of terms and phrases, do you have the type of architecture that we expect to find for this keyword?

It’s better for search intent targeting, because the searcher intent is going to be fulfilled if you provide the multiple paths versus the narrow focus. It’s easier keyword targeting for you. You’re going to be able to know, “Hey, I need to target a lot of different terms and phrases and variations in floodlight and one very specific one in pinpoint.”

There’s usually higher searcher satisfaction, which means you get lower bounce rate. You get more engagement. You usually get a higher conversion rate. So it’s good for all those things.

For example…

I’ll actually create pages for each of antique scientific equipment and antique test tubes to illustrate this. So I’ve got two different types of pages here. One is my antique scientific equipment page.


This is that floodlight, shotgun approach, and what we’re doing here is going to be very different from a pinpoint approach. It’s looking at like, okay, you’ve landed on antique scientific equipment. Now, where do you want to go? What do you want to specifically explore? So we’re going to have a little bit of content specifically about this topic, and how robust that is depends on the type of topic and the type of site you are.

If this is an e-commerce site or a site that’s showing information about various antiques, well maybe we don’t need very much content here. You can see the filtration that we’ve got is going to be pretty broad. So I can go into different centuries. I can go into chemistry, astronomy, physics. Maybe I have a safe for kids type of stuff if you want to buy your kids antique lab equipment, which you might be. Who knows? Maybe you’re awesome and your kids are too. Then different types of stuff at a very broad level. So I can go to microscopes or test tubes, lab searches.

This is great because it’s got broad intent foci, serving many different kinds of searchers with the same page because we don’t know exactly what they want. It’s got multiple keyword targets so that we can go after broad phrases like antique or old or historical or 13th, 14th, whatever century, science and scientific equipment, materials, labs, etc., etc., etc. This is a broad page that could reach any and all of those. Then there’s lots of navigational and refinement options once you get there.

Total opposite of pinpoint content.


Pinpoint content, like this antique test tubes page, we’re still going to have some filtration options, but one of the important things to note is note how these are links that take you deeper. Depending on how deep the search volume goes in terms of the types of queries that people are performing, you might want to make a specific page for 17th century antique test tubes. You might not, and if you don’t want to do that, you can have these be filters that are simply clickable and change the content of the page here, narrowing the options rather than creating completely separate pages.

So if there’s no search volume for these different things and you don’t think you need to separately target them, go ahead and just make them filters on the data that already appears on this page or the results that are already in here as opposed to links that are going to take you deeper into specific content and create a new page, a new experience.

You can also see I’ve got my individual content here. I probably would go ahead and add some content specifically to this page that is just unique here and that describes antique test tubes and the things that your searchers need. They might want to know things about price. They might want to know things about make and model. They might want to know things about what they were used for. Great. You can have that information broadly, and then individual pieces of content that someone might dig into.

This is narrower intent foci obviously, serving maybe one or two searcher intents. This is really talking about targeting maybe one to two separate keywords. So antique test tubes, maybe lab tubes or test tube sets, but not much beyond that.

Then we’re going to have fewer navigational paths, fewer distractions. We want to keep the searcher. Because we know their intent, we want to guide them along the path that we know they probably want to take and that we want them to take.

So when you’re considering your content, choose wisely between shotgun/floodlight approach or sniper/pinpoint approach. Your searchers will be better served. You’ll probably rank better. You’ll be more likely to earn links and amplification. You’re going to be more successful.

Looking forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Why We Can’t Do Keyword Research Like It’s 2010 – Whiteboard Friday

Posted by randfish

Keyword Research is a very different field than it was just five years ago, and if we don’t keep up with the times we might end up doing more harm than good. From the research itself to the selection and targeting process, in today’s Whiteboard Friday Rand explains what has changed and what we all need to do to conduct effective keyword research today.


What do we need to change to keep up with the changing world of keyword research?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about keyword research, why it’s changed from the last five, six years and what we need to do differently now that things have changed. So I want to talk about changing up not just the research but also the selection and targeting process.

There are three big areas that I’ll cover here. There’s lots more in-depth stuff, but I think we should start with these three.

1) The AdWords keyword tool hides data!

This is where almost all of us in the SEO world start and oftentimes end with our keyword research. We go to the AdWords Keyword Tool, what used to be the external keyword tool and now is the Keyword Planner inside AdWords. We go inside that tool, and we look at the volume that’s reported and we sort of record that as, well, it’s not good, but it’s the best we’re going to do.

However, I think there are a few things to consider here. First off, that tool is hiding data. What I mean by that is not that they're not telling the truth, but they're not telling the whole truth, and they're not telling nothing but the truth, because those rounded-off numbers that you always see are inaccurate. Anytime you've bought keywords, you've seen that the impression count never matches the count you see in the AdWords tool. It's not usually massively off, but it's often off by a good degree, and the only thing it's great for is telling relative volume from one term to another.

But AdWords hides data essentially by saying, "Hey, you're going to type in . . ." Let's say I type in "college tuition," and Google knows that a lot of people search for how to reduce college tuition, but that doesn't come up in the suggestions, either because it's not a commercial term or because they don't think an advertiser who bids on it will do particularly well, so they don't show it in there. I'm giving an example; they might indeed show that one.

But because that data is hidden, we need to go deeper. We need to go beyond and look at things like Google Suggest and related searches, which are down at the bottom. We need to start conducting customer interviews and staff interviews, which hopefully have always been part of your brainstorming process but really need to be now. Then you can apply that to AdWords. You can apply that to Suggest and related searches.

The beautiful thing is, once you gather these terms from places like forums, communities, and discussion boards, seeing what terms and phrases people are using, you can collect all this stuff up, plug it back into AdWords, and now they will tell you how much volume they've got. So you take that "how to reduce college tuition" term, you plug it into AdWords, and they will show you a number, a non-zero number. They were just hiding it in the suggestions because they thought, "Hey, you probably don't want to bid on that. That won't bring you a good ROI." So you've got to be careful with that, especially when it comes to SEO kinds of keyword research.
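If you're doing this at any scale, a few lines of scripting keep the list manageable. Here's a minimal Python sketch, assuming you've pasted the phrases you gathered from forums, communities, and interviews into a plain text file (the file names are hypothetical); it normalizes and dedupes them into a CSV you can plug back into the AdWords tool:

```python
# Minimal sketch: normalize and dedupe candidate phrases gathered from
# forums, communities, and interviews before plugging them back into
# the AdWords tool for volume data. File names are hypothetical.
import csv
import re

def normalize(phrase):
    """Lowercase, strip punctuation noise, and collapse whitespace."""
    phrase = phrase.lower().strip()
    phrase = re.sub(r"[^\w\s]", "", phrase)
    return re.sub(r"\s+", " ", phrase)

with open("candidate_phrases.txt") as f:
    seen = set()
    keywords = []
    for line in f:
        term = normalize(line)
        if term and term not in seen:
            seen.add(term)
            keywords.append(term)

# One phrase per row -- ready to paste into the Keyword Planner.
with open("keywords_for_adwords.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keyword"])
    for term in keywords:
        writer.writerow([term])
```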

2) Building separate pages for each term or phrase doesn’t make sense

It used to be the case that we built separate pages for every single term and phrase that was in there, because we wanted to have the maximum keyword targeting that we could. So it didn’t matter to us that college scholarship and university scholarships were essentially people looking for exactly the same thing, just using different terminology. We would make one page for one and one page for the other. That’s not the case anymore.

Today, we need to group by searcher intent. If two searchers are searching for two different terms or phrases but both have exactly the same intent (they want the same information, they're looking for the same answers, their query will be resolved by the same content), we want one page to serve both, and that has changed a bit of how we do keyword research and how we do selection and targeting as well.
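One rough way to operationalize intent grouping is to compare SERPs: if two keywords return mostly the same top results, they likely share an intent and can share a page. Here's a minimal sketch of that idea; the SERP data below is hypothetical and would come from whatever rank-tracking source you use, and the 0.5 overlap threshold is an assumption to tune:

```python
# Minimal sketch of intent grouping: treat two keywords as the same
# intent when their top results overlap heavily. The serps dict is
# hypothetical -- populate it from your own rank-tracking data.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def group_by_intent(serps, threshold=0.5):
    """Greedy clustering: each keyword joins the first group whose
    representative SERP overlaps above the threshold."""
    groups = []  # list of (representative_serp, [keywords])
    for keyword, urls in serps.items():
        for rep_urls, members in groups:
            if jaccard(urls, rep_urls) >= threshold:
                members.append(keyword)
                break
        else:
            groups.append((urls, [keyword]))
    return [members for _, members in groups]

serps = {
    "college scholarships": ["a.com", "b.org", "c.edu", "d.gov"],
    "university scholarships": ["a.com", "b.org", "c.edu", "e.com"],
    "how to reduce college tuition": ["f.com", "g.org", "h.edu"],
}
print(group_by_intent(serps))
# [['college scholarships', 'university scholarships'],
#  ['how to reduce college tuition']]
```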

3) Build your keyword consideration and prioritization spreadsheet with the right metrics

Everybody's got an Excel version of this, because there's just no awesome tool out there yet that everyone loves and that solves this problem for us, and Excel is very, very flexible. So we go into Excel, we put in our keyword and the volume, and a lot of times we almost stop there. We record keyword volume, then maybe value to the business, and then we prioritize.

What are all these new columns you're showing me, Rand? Well, here's what I'm seeing sophisticated, modern SEOs, the more advanced agencies and in-house practitioners, add to the keyword process.

Difficulty

A lot of folks have done this, but difficulty helps us say, “Hey, this has a lot of volume, but it’s going to be tremendously hard to rank.”

The difficulty score that Moz uses and attempts to calculate is a weighted average of the top 10 domain authorities. It also uses page authority, so it’s kind of a weighted stack out of the two. If you’re seeing very, very challenging pages, very challenging domains to get in there, it’s going to be super hard to rank against them. The difficulty is high. For all of these ones it’s going to be high because college and university terms are just incredibly lucrative.
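As a rough illustration, here's a minimal sketch of a difficulty-style score. Moz's exact weighting isn't spelled out here, so the 50/50 blend of page and domain authority below is an assumption, not the real formula:

```python
# Minimal sketch of a difficulty-style score: blend the page authority
# (PA) and domain authority (DA) of the current top 10 results.
# The 50/50 weighting is an assumption, not Moz's actual formula.
def difficulty(top10, pa_weight=0.5, da_weight=0.5):
    """top10: list of (page_authority, domain_authority) pairs,
    each on a 0-100 scale; returns a 0-100 difficulty estimate."""
    scores = [pa_weight * pa + da_weight * da for pa, da in top10]
    return sum(scores) / len(scores)

# Hypothetical SERP where strong .edu and .gov domains dominate.
serp = [(72, 91), (68, 88), (75, 94), (61, 85), (70, 90),
        (66, 87), (73, 92), (59, 84), (64, 86), (69, 89)]
print(difficulty(serp))  # about 78 on a 0-100 scale: a hard keyword
```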

That difficulty score can help bias you against chasing terms and phrases you are very unlikely to rank for, at least early on. If you feel like, "Hey, I already have a powerful domain. I can rank for everything I want. I am the thousand-pound gorilla in my space," great. Go after the difficulty of your choice, but this helps prioritize.

Opportunity

This is actually very rarely used, but I think sophisticated marketers are using it extremely intelligently. Essentially what they're saying is, "Hey, if you look at a set of search results, sometimes there are two or three ads at the top instead of just the ones on the sidebar, and that's biasing some of the click-through rate curve." Sometimes there's an instant answer or a Knowledge Graph or a news box or images or video, or all these kinds of things that search results can be marked up with, that are not just the classic 10 web results. Unfortunately, if you're building a spreadsheet like this and treating every single search result like it's just 10 blue links, you're going to lose out. You're missing the potential opportunity, and the opportunity cost, that comes with ads at the top or all of these kinds of features that bias the click-through rate curve.

So what I’ve seen some really smart marketers do is essentially build some kind of a framework to say, “Hey, you know what? When we see that there’s a top ad and an instant answer, we’re saying the opportunity if I was ranking number 1 is not 10 out of 10. I don’t expect to get whatever the average traffic for the number 1 position is. I expect to get something considerably less than that. Maybe something around 60% of that, because of this instant answer and these top ads.” So I’m going to mark this opportunity as a 6 out of 10.

There are two top ads here, so I'm giving this a 7 out of 10. This one has two top ads and then a news block below the first position. So again, I'm going to reduce that click-through rate. I think that's going down to a 6 out of 10.

You can get more or less scientific and specific with this. Click-through rate curves are imperfect by nature because we truly can't measure exactly how those things change. However, I think smart marketers can make some good assumptions from general click-through rate data, for which there are several resources out there, to build a model like this and then include it in their keyword research.
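Here's a minimal sketch of what such a framework might look like. The per-feature deductions are illustrative assumptions chosen to reproduce the examples above, not measured click-through data; calibrate them against whatever CTR research you trust:

```python
# Minimal sketch of an opportunity score: start at 10/10 and knock
# points off for SERP features that siphon clicks away from the
# organic results. All penalty values are illustrative assumptions.
FEATURE_PENALTY = {
    "top_ad": 1.5,          # per ad shown above the organic results
    "instant_answer": 2.5,
    "knowledge_graph": 2.0,
    "news_block": 1.0,
    "image_block": 1.0,
    "video_block": 1.0,
}

def opportunity(features):
    """features: list of feature names observed in the SERP."""
    score = 10.0
    for feature in features:
        score -= FEATURE_PENALTY.get(feature, 0.0)
    return max(score, 1.0)

print(opportunity(["top_ad", "instant_answer"]))        # 6.0
print(opportunity(["top_ad", "top_ad"]))                # 7.0
print(opportunity(["top_ad", "top_ad", "news_block"]))  # 6.0
```

With these (assumed) penalties, the three examples in the text come out as described: a top ad plus an instant answer scores 6 out of 10, two top ads score 7, and two top ads plus a news block score 6.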

This does mean that you have to run a query for every keyword you're thinking about, but you should be doing that anyway. You want to get a good look at who's ranking in those search results and what kind of content they're building. If you're running a keyword difficulty tool, you're already getting something like that.

Business value

This is a classic one. Business value is essentially saying, “What’s it worth to us if visitors come through with this search term?” You can get that from bidding through AdWords. That’s the most sort of scientific, mathematically sound way to get it. Then, of course, you can also get it through your own intuition. It’s better to start with your intuition than nothing if you don’t already have AdWords data or you haven’t started bidding, and then you can refine your sort of estimate over time as you see search visitors visit the pages that are ranking, as you potentially buy those ads, and those kinds of things.

You can get more sophisticated around this. I think a 10-point scale is just fine. You could also use a one, two, or three there; that's also fine.

Requirements or Options

Then there's a column I don't exactly know what to call. I can't remember who showed me their version that had it in there; I think they called it Optional Data or Additional SERPs Data, but I'm going to call it Requirements or Options. Requirements because this is essentially saying, "Hey, if I want to rank in these search results, am I seeing that the top two or three are all video? Oh, they're all video. They're all coming from YouTube. If I want to be in there, I've got to be video."

Or something like, "Hey, I'm seeing that most of the top results have been produced or updated in the last six months. Google appears to be biasing to very fresh information here." So, for example, if I were searching for "university scholarships Cambridge 2015," well, guess what? Google probably wants to bias toward results that are either the official page on Cambridge's website or articles from this year about getting into that university and the scholarships that are available. I saw that in two of these search results: both the college and the university scholarships terms had a significant number of SERPs where a freshness bump appeared to be required. You can tell because the date is shown ahead of the description, and the date is very fresh, sometime in the last six months or a year.
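If you want to make that freshness check repeatable, here's a minimal sketch: flag a SERP as freshness-biased when most of its dated results fall inside the last six months. The dates and the 50% threshold below are assumptions:

```python
# Minimal sketch of a "requirements" check: flag a SERP as
# freshness-biased when most of its dated results were published
# within the last ~6 months. All data here is hypothetical.
from datetime import date, timedelta

def freshness_required(result_dates, window_days=183, share=0.5):
    """result_dates: the publication dates shown ahead of snippets."""
    if not result_dates:
        return False
    cutoff = date.today() - timedelta(days=window_days)
    recent = sum(1 for d in result_dates if d >= cutoff)
    return recent / len(result_dates) >= share

serp_dates = [date.today() - timedelta(days=n)
              for n in (12, 40, 95, 200, 370)]
print(freshness_required(serp_dates))  # True: 3 of 5 within ~6 months
```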

Prioritization

Then finally I can build my prioritization. So based on all the data I had here, I essentially said, “Hey, you know what? These are not 1 and 2. This is actually 1A and 1B, because these are the same concepts. I’m going to build a single page to target both of those keyword phrases.” I think that makes good sense. Someone who is looking for college scholarships, university scholarships, same intent.

I am giving it a slight prioritization, 1A versus 1B, and the reason I do this is because I always have one keyword phrase that I'm leaning on a little more heavily. Because Google isn't perfect around this, the search results will be a little different, and I want to bias to one versus the other. In this case, since I'm targeting university more than college, my title tag might say something like "College and University Scholarships" so that "university" and "scholarships" sit nicely together, near the front of the title, that kind of thing. Then 1B, 2, 3.
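Pulling the columns together, here's a minimal sketch of how the final prioritization might be computed. The formula and every number in it are illustrative assumptions; the point is simply that volume, opportunity, and business value push a keyword up while difficulty pushes it down:

```python
# Minimal sketch of the final prioritization: combine the spreadsheet
# columns into one sortable score. The formula and the sample numbers
# are illustrative assumptions, not a standard or recommended weighting.
def priority(volume, difficulty, opportunity, value):
    """opportunity and value on 0-10 scales, difficulty on 0-100."""
    return volume * (opportunity / 10) * (value / 10) / difficulty

keywords = [
    ("university scholarships",       18000, 78, 6, 9),
    ("college scholarships",          22000, 80, 7, 9),
    ("how to reduce college tuition",  1900, 41, 9, 7),
]
# Sort descending by priority score and print the ranked list.
for kw, vol, diff, opp, val in sorted(
        keywords, key=lambda k: -priority(*k[1:])):
    print(f"{kw}: {priority(vol, diff, opp, val):.1f}")
```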

This is kind of the way that modern SEOs are building a more sophisticated process with better data, more inclusive data that helps them select the right kinds of keywords and prioritize to the right ones. I’m sure you guys have built some awesome stuff. The Moz community is filled with very advanced marketers, probably plenty of you who’ve done even more than this.

I look forward to hearing from you in the comments. I would love to chat more about this topic, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Has Google Gone Too Far with the Bias Toward Its Own Content?

Posted by ajfried

Since the beginning of SEO time, practitioners have been trying to crack the Google algorithm. Every once in a while, the industry gets a glimpse into how the search giant works, and we have an opportunity to deconstruct it. We don't get many of these opportunities, but when we do—assuming we spot them in time—we try to take advantage of them so we can "fix the Internet."

On Feb. 16, 2015, news started to circulate that NBC would start removing images and references of Brian Williams from its website.

This was it!

A golden opportunity.

This was our chance to learn more about the Knowledge Graph.

Expectation vs. reality

Often it’s difficult to predict what Google is truly going to do. We expect something to happen, but in reality it’s nothing like we imagined.

Expectation

What we expected to see was that Google would change the source of the image. Typically, if you hover over the image in the Knowledge Graph, it reveals the location of the image.

[Image: Keanu-Reeves-Image-Location.gif]

This would mean that if the image disappeared from its original source, then the image displayed in the Knowledge Graph would likely change or even disappear entirely.

Reality (February 2015)

The only problem was, there was no official source (this changed, as you will soon see) and identifying where the image was coming from proved extremely challenging. In fact, when you clicked on the image, it took you to an image search result that didn’t even include the image.

Could it be? Had Google started its own database of owned or licensed images and was giving it priority over any other sources?

In order to find the source, we tried taking the image from the Knowledge Graph and running a "search by image" on images.google.com to find others like it. For the NBC Nightly News image, Google failed to locate a match to the image it was actually using anywhere on the Internet. For other television programs, it was successful. Here is an example of what happened for Morning Joe:

[Image: Morning_Joe_image_search.png]

So we found the potential source. In fact, we found three potential sources. It seemed kind of strange, but this appeared to be the discovery we were looking for.

This looks like Google is using someone else’s content and not referencing it. These images have a source, but Google is choosing not to show it.

Then Google pulled the ol’ switcheroo.

New reality (March 2015)

Now things changed and Google decided to put a source to their images. Unfortunately, I mistakenly assumed that hovering over an image showed the same thing as the file path at the bottom, but I was wrong. The URL you see when you hover over an image in the Knowledge Graph is actually nothing more than the title. The source is different.

[Image: Morning_Joe_Source.png]

Luckily, I still had two screenshots saved on my desktop from when I first saw this. Success. One screen capture was from NBC Nightly News, and the other was from the news show Morning Joe (see above), showing that the source had changed.

[Image: NBC-nightly-news-crop.png (NBC Nightly News screenshot)]

The source is a Google-owned property: gstatic.com. You can clearly see the difference in the source change. What started as a hypothesis is now a fact. Google is certainly creating a database of images.

If this is the direction Google is moving, then it is creating all kinds of potential risks for brands and individuals. The implication is a loss of control for any brand looking to optimize its Knowledge Graph results. It also seems to pose a conflict of interest for Google, whose mission is to organize the world's information, not license and prioritize it.

How do we think Google is supposed to work?

Google is an information-retrieval system tasked with sourcing information from across the web and supplying the most relevant results to users’ searches. In recent months, the search giant has taken a more direct approach by answering questions and assumed questions in the Answer Box, some of which come from un-credited sources. Google has clearly demonstrated that it is building a knowledge base of facts that it uses as the basis for its Answer Boxes. When it sources information from that knowledge base, it doesn’t necessarily reference or credit any source.

However, I would argue there is a difference between an un-credited Answer Box and an un-credited image. An un-credited Answer Box provides a fact that is indisputable, part of the public domain, and unlikely to change (e.g., What year was Abraham Lincoln shot? How long is the George Washington Bridge?). Answer Boxes that offer more than just a basic fact (or an opinion, instructions, etc.) always credit their sources.

There are four possibilities when it comes to Google referencing content:

  • Option 1: It credits the content because someone else owns the rights to it
  • Option 2: It doesn’t credit the content because it’s part of the public domain, as seen in some Answer Box results
  • Option 3: It doesn’t reference it because it owns or has licensed the content. If you search for “Chicken Pox” or other diseases, Google appears to be using images from licensed medical illustrators. The same goes for song lyrics, which Eric Enge discusses here: Google providing credit for content. This adds to the speculation that Google is giving preference to its own content by displaying it over everything else.
  • Option 4: It doesn’t credit the content, but neither does it necessarily own the rights to the content. This is a very gray area, and is where Google seemed to be back in February. If this were the case, it would imply that Google is “stealing” content—which I find hard to believe, but felt was necessary to include in this post for the sake of completeness.

Is this an isolated incident?

At Five Blocks, whenever we see these anomalies in search results, we try to compare the term in question against others like it. This is a categorization concept we use to bucket individuals or companies into similar groups. When we do this, we uncover some incredible trends that help us determine what a search result “should” look like for a given group. For example, when looking at searches for a group of people or companies in an industry, this grouping gives us a sense of how much social media presence the group has on average or how much media coverage it typically gets.

Upon further investigation of terms similar to NBC Nightly News (other news shows), we noticed the un-credited image scenario appeared to be a trend in February, but now all of the images are being hosted on gstatic.com. When we broadened the categories to TV shows and movies, the trend persisted. Rather than showing an image in the Knowledge Graph that references the actual source, Google tends to show an image served from its own database of stored images.

And just to ensure this wasn’t a case of tunnel vision, we researched other categories, including sports teams, actors and video games, in addition to spot-checking other genres.

Unlike terms for specific TV shows and movies, terms in each of these other groups all link to the actual source in the Knowledge Graph.
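For anyone who wants to repeat the spot-check, here's a minimal sketch. The image URLs below are hypothetical stand-ins for whatever you record from the Knowledge Graph; the script simply tallies how many are served from Google's gstatic.com versus an external host:

```python
# Minimal sketch of the spot-check described above: given Knowledge
# Graph image URLs collected per search term (the data here is
# hypothetical), tally how many are served from Google's own
# gstatic.com versus an external source.
from urllib.parse import urlparse
from collections import Counter

kg_image_sources = {
    "nbc nightly news": "https://encrypted-tbn3.gstatic.com/images?q=x",
    "morning joe": "https://encrypted-tbn1.gstatic.com/images?q=y",
    "new york yankees": "https://upload.wikimedia.org/wikipedia/z.png",
}

hosts = Counter(
    "gstatic.com" if urlparse(url).hostname.endswith(".gstatic.com")
    else "external"
    for url in kg_image_sources.values()
)
print(hosts)  # Counter({'gstatic.com': 2, 'external': 1})
```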

Immediate implications

It’s easy to ignore this and say “Well, it’s Google. They are always doing something.” However, there are some serious implications to these actions:

  1. The TV shows/movies aren’t receiving their due credit because, from within the Knowledge Graph, there is no actual reference to the show’s official site
  2. The more Google moves toward licensing and then retrieving its own information, the more biased it becomes, preferring its own content over the equivalent—or possibly even superior—content from another source
  3. It feels wrong and misleading to get a Google Image Search result rather than an actual site because:
    • The search doesn’t include the original image
    • Considering how poor Image Search results are normally, it feels like a poor experience
  4. If Google is moving toward licensing as much content as possible, then it could make the Knowledge Graph infinitely more complicated when there is a “mistake” or something unflattering. How could one go about changing what Google shows about them?

Google is objectively becoming subjective

It is clear that Google is attempting to create databases of information, including lyrics stored in Google Play, photos, and, previously, facts in Freebase (which is now Wikidata and not owned by Google).

I am not normally one to point my finger and accuse Google of wrongdoing. But this really strikes me as an odd move, one bordering on a clear bias to direct users to stay within the search engine. The fact is, we trust Google with a heck of a lot of information with our searches. In return, I believe we should expect Google to return an array of relevant information for searchers to decide what they like best. The example cited above seems harmless, but what about determining which is the right religion? Or even who the prettiest girl in the world is?

[Image: Religion-and-beauty-queries.png]

Questions such as these, which Google is returning credited answers for, could return results that are perceived as facts.

Should we next expect Google to decide who is objectively the best service provider (e.g., pizza chain, painter, or accountant) and then feature them in an un-credited Answer Box? Given the direction Google is moving right now, it feels like we should be calling its objectivity into question.

But that’s only my (subjective) opinion.

