Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But I'm weighing in because of the increasing popularity of this article, and because I think there's this hopefulness from folks outside the hardcore SEO world who are looking to this piece and going, "Look, this is great. We don't have to be technical. We don't have to worry about technical things in order to do SEO."

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don't like that the article overly reduces all of this information that we have about what we've learned about Google. So he mentions two sources. One is things that Google tells us, and the other is SEO experiments. I think both of those are true. Although I'd add that there's sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, "Aha, I bet this is going to happen." They can look, and 90% of the time they're right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank are broadly correlated, but it's not one-to-one. This is not like a, "Oh, well, that's a 1.0 correlation."

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, in the 10 minutes before we filmed this video, I brainstormed a list just off the top of my head. The list was so long that, as you can see, I filled up the whole whiteboard and then didn't have any more room. I'm not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you've made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you've given them a status code of 200. Or you've used a 404 code when you should have used a 410, which signals that a page has been permanently removed, so Google gets it completely out of the index rather than revisiting it and keeping it in the index.
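The crawl- and index-control mechanisms in the list above each look a little different in practice. Here's a rough sketch of the three most common ones; the paths and URLs are made up for illustration:

```
# robots.txt — keep crawlers out of internal search results pages entirely
User-agent: *
Disallow: /search/

<!-- meta robots — let crawlers fetch the page but keep it out of the index,
     while still following its links -->
<meta name="robots" content="noindex, follow">

<!-- rel=canonical — point duplicate or parameter-laden URLs at the
     preferred version of the page -->
<link rel="canonical" href="https://www.example.com/widgets/">
```

Note the distinction: robots.txt blocks crawling, while meta robots noindex blocks indexing of a page that can still be crawled, and rel=canonical consolidates signals onto one URL.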

Downtime procedures. There's a specific 5xx code you can use for this, a 503, that says, "Revisit later. We're having some downtime right now." Google urges you to use that specific code rather than using a 404, which tells them, "This page is now an error."

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
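One way to internalize the status-code distinctions above is to write the semantics down as a tiny lookup. This is just a sketch of what each code signals to a crawler, as discussed here, not anything Google-specific:

```python
# Sketch: what each HTTP status code tells a search engine crawler.

def crawler_guidance(status: int) -> str:
    """Map an HTTP status code to what it signals to a crawler."""
    if status == 200:
        return "real page: crawl and index it"
    if status == 404:
        return "not found: may be revisited and can linger in the index"
    if status == 410:
        return "gone permanently: drop it from the index"
    if status == 503:
        return "temporary downtime: come back later, don't deindex"
    return "other: handle case by case"

# The Disney-style mistake: serving 404 for every page during an outage
# tells Google the pages are errors, when 503 was the right answer.
print(crawler_guidance(503))
print(crawler_guidance(410))
```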

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics, connecting those up and trying to see: Why did we go up or down? Do we have fewer pages indexed or more? Are more or fewer pages getting traffic? Are we ranking for more or fewer keywords?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. Inanchor doesn't work anymore, never mind about that one.

I have to go faster, because we're just going to run out of time. Like, come on.

  • Interpreting and leveraging data in Google Search Console. If you don't know how to use it, Google could be telling you that you have all sorts of errors, and you wouldn't know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords, or Mozscape, or Ahrefs, or Majestic, or SEMrush, or the Alchemy API. Those APIs can do powerful things for your site. There are some powerful problems they could help you solve if you know how to use them. It's actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There's a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
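The point above about pulling API data into a spreadsheet really doesn't take much code. Here's a minimal Python sketch that turns a JSON payload of link metrics into CSV rows ready to paste into Excel or a Google Doc; the payload shape and field names are invented for illustration, not any particular provider's real API response:

```python
import csv
import io
import json

# A hypothetical JSON payload, standing in for what a link-metrics API
# might return after an HTTP request.
payload = json.loads("""
[
  {"url": "https://example.com/a", "linking_domains": 120, "page_authority": 42},
  {"url": "https://example.com/b", "linking_domains": 35,  "page_authority": 28}
]
""")

# Write the records out as CSV, one row per URL.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["url", "linking_domains", "page_authority"])
writer.writeheader()
writer.writerows(payload)
print(buf.getvalue())
```

In a real workflow the payload would come from an authenticated HTTP call to the provider's endpoint, but the parse-and-tabulate step stays this simple.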

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

We Want Your Stories: Accepting MozCon Ignite Pitches

Posted by EricaMcGillivray

We’re thrilled to announce the addition of a networking and Ignite-style event for attendees on Tuesday night at MozCon. For years, you’ve asked us for more networking and relaxing times, and this is what we’ve dreamed up. But we need your help!

We want you to share your stories, passions, and experiences. There are 16—yes, 16—speaking slots. Ignite-style talks are 5 minutes in length and slides auto-advance. That’s right, there’s no going back, and once it’s done, it’s done!

To keep things relaxed, none of these talks will be about online marketing. Instead, we want to use this opportunity to get to know our fellow community members better. We want to hear about your passion projects, interests, and the things that fascinate you outside marketing. Tell us about how you spend weekends making support banners for your favorite soccer team or why you mentor high school students, for example.

The basic details

  • To submit, just fill out the form below.
  • Please only submit one talk! We want the one you’re most excited about.
  • Talks cannot be about online marketing.
  • They are only 5 minutes in length, so plan accordingly.
  • If you are already speaking on the MozCon stage, you cannot pitch for this event.
  • Submissions close on Sunday, May 17 at 5pm PDT.
  • Selection decisions are final and will be made in late May / early June.
  • All presentations must adhere to the MozCon Code of Conduct.
  • You must attend MozCon, July 13-15, and the Tuesday night event in person, in Seattle.

If selected, you will get the following

  • 5 minutes on the Tuesday night stage to share with our audience. The event lasts from 7-10pm and will be at Benaroya Hall (where the Seattle Symphony plays).
  • $300 off a regular-priced ticket to MozCon. (If you already purchased yours, we'll issue a $300 refund for a regular-priced ticket or $100 for an early bird ticket. Discount not available for the super early bird special.)
  • We will work with you to hone your talk!


As we want to ensure every single speaker feels comfortable and gives their best talk possible, Matt Roney and I are here to help you. We'll review your topic, settle on the title, walk through your presentation with you, and give you a tour of the stage earlier in the evening. While you do the great work, we're here to help in any way possible.

Unfortunately, we cannot provide travel coverage for these MozCon Ignite speaking slots.

What makes a great pitch

  • Focus on the five minute length.
  • Be passionate about what you’re speaking about. Tell us what’s great about it.
  • For extra credit, include links to videos of you doing public speaking.
  • Follow the guidelines. Yes, the word counts are limited on purpose. Do not submit links to Google Docs, etc. for more information. Tricky and multiple submissions will be disqualified.

We're all super-excited about these talks, and we can't wait to hear what you might talk about. Whether you want to tell us about how Frenchies are really English dogs or which coffee shop is the best in Seattle, this is going to be a blast! The amazing Geraldine DeRuiter, known for her travel blogging and witty ways, will be emceeing this event.

If you’re still needing inspiration or a little confused about an Ignite talk, watch Geraldine’s talk from a few years ago about sharing personal news online:

Like our other speaker selections, we have a small committee at Moz running through these topics to get the best variety and fun possible. While we cannot vet your topic, feel free to ask questions in the comments.

Everyone who submits an Ignite pitch will be informed either way. Best of luck!


Haven’t bought your MozCon ticket?

Buy your ticket now



It’s Your Turn: Now Accepting Community Speaker Pitches for MozCon 2015

Posted by EricaMcGillivray

Yep, it’s that time of year, friends. Time to submit your online marketing talk pitch for MozCon 2015. I’m super excited this year as we’ll have 6 community speaker slots! That’s right—you all are so amazing that we want to see more from you.

The basic details:

  • To submit, just fill out the form below.
  • Talks must be about online marketing and are only 15 minutes in length.
  • Submissions close on Sunday, April 12 at 5pm PDT.
  • Selection decisions are final and will be made in late April.
  • All presentations must adhere to the MozCon Code of Conduct.
  • You must attend MozCon in person, July 13-15 in Seattle.



If you are selected, you will get the following:

  • 15 minutes on the MozCon stage to share with our audience, plus 5 minutes of Q&A.
  • A free ticket to MozCon. (If you already purchased yours, we’ll either refund or transfer the ticket to someone else.)
  • Four nights of lodging covered by us at our partner hotel.
  • A reimbursement for your travel (flight, train, car, etc.), up to $500 domestic and $750 international.
  • A free ticket for you to give to anyone you would like and a code for $300 off another ticket.
  • An invitation for you and your significant other to join us for the speakers’ dinner.

We work with you!

Pitching for a community speaker slot can feel intimidating. A lot of times, our ideas feel like old hat, done a million times before. (When I say "our" here, I mean "mine.")

At MozCon, we work with every single speaker to ensure your presentation is the best it can be. Matt Roney and I dedicate ourselves to helping you. Seriously, you get our personal cell phone numbers. Don't get me wrong—you do the heavy lifting and the incredible work. But we set up calls, review sessions, and even take you up on the stage pre-MozCon to ensure that you feel awesome about your talk.


We’re happy to help, including:

  • Calls to discuss and refine your topic.
  • Assistance honing topic title and description.
  • Reviews of outlines and drafts (as many as you want!).
  • Best practices and guidance for slide decks, specifically for our stage.
  • A comprehensive, step-by-step guide for show flow.
  • Serving as an audience for practicing your talk.
  • Reviewing your final deck.
  • Sunday night pre-MozCon tour of the stage to meet our A/V crew, see your presentation on the screens, and test the clicker.
  • A dedicated crew to make your A/V outstanding.
  • Anything else we can do to make you successful.

Most of the above are required as part of the speaker process, so even those of you who don't always ask for help (again, talking about myself here) will be sure to get it. We want you to know that anyone, regardless of experience or level of knowledge, can submit and present a great talk at MozCon. One of our past community speakers, Zeph Snapp, wrote a great post about his experiences with our process and at the show.


For great proposals:

  • Make sure to check out the confirmed MozCon 2015 topics from our other speakers so you don’t overlap.
  • Read about what makes a great pitch.
  • For extra jazz, include links to videos of you doing public speaking and your slide deck work in the optional fields.
  • Follow the guidelines. Yes, the word counts are limited on purpose. Do not submit links to Google Docs, etc. for more information. Tricky submissions will be disqualified.

While I can’t give direct pitch coaching—it would be unfair to others—I’m happy to answer your questions in the comments.

Submissions are reviewed by a selection committee at Moz, so multiple people look at and give their opinions on each pitch. The first run-through looks at pitches without speaker information attached to them in order to give an unbiased look at topics. Around 50% of pitches are weeded out here. The second run-through includes speaker bio information in order to get a more holistic view of the speaker and what your talk might be like in front of 1,400 people.

Everyone who submits a community speaker pitch will be informed either way. If your submission doesn’t make it and you’re wondering why, we can talk further on email as there’s always next year.

Finally, a big thank you to our wonderful community speakers from past MozCons, including Stephanie Beadell, Mark Traphagen, Zeph Snapp, Justin Briggs, Darren Shaw, Dana Lookadoo, Fabio Ricotta, Jeff McRitchie, Sha Menz, Mike Arnesen, A. Litsa, and Kelsey Libert, who've all been so amazing.


Still need to confirm you’ll join us?

Buy your ticket!



Eye Tracking in 2014: How Users View and Interact with Today’s Google SERPs

Posted by rMaynes1

In September 2014, Mediative released its latest eye-tracking research, entitled "The Evolution of Google's Search Engine Results Pages and Their Effects on User Behaviour".

This large study had participants conduct various searches using Google on a desktop. For example, participants were asked, "Imagine you're moving from Toronto to Vancouver. Use Google to find a moving company in Toronto." Participants were all presented with the same Google SERP, no matter the search query.

Mediative wanted to know where people look and click on the SERP the most, what role the location of the listing on the SERP plays in winning views and clicks, and how click activity on listings has changed with the introduction of Google features such as the carousel, the knowledge graph, etc.


Mediative discovered that, just as Google's SERP has evolved over the past decade, so too has the way in which search engine users scan the page before making a click.

Back in 2005, when a similar eye-tracking study was conducted for the first time by Mediative (formerly Enquiro), it was discovered that people searched in a distinctive "triangle" pattern: starting in the top left of the search results page, where they expected the first organic listing to be located, and reading across horizontally before moving their eyes down to the second organic listing and reading horizontally, but not quite as far. This area of concentrated gaze activity became known as Google's "Golden Triangle". The study concluded that if a business's listing was not in the Golden Triangle, its odds of being seen by a searcher were dramatically reduced.

Heat map from 2005 showing the area known as Google’s “Golden Triangle”.

But now, in 2014, the top organic results are no longer always in the top-left corner where searchers expect them to be, so they scan other areas of the SERP, trying to seek out the top organic listing, but being distracted by other elements along the way. The #1 organic listing is shifting further down the page, and while this listing still captures the most click activity (32.8%) regardless of what new elements are presented, the shifting location has opened up the top of the page with more potential areas for businesses to achieve visibility.

Where scanning was once more horizontal, the adoption of mobile devices over the past 9 years has conditioned searchers to now scan more vertically—they are looking for the fastest path to the desired content, and, compared to 9 years ago, they are viewing more search results listings during a single session and spending less time viewing each one.

Searchers on Google now scan far more vertically than several years ago.


One of the biggest changes from SERPS 9 years ago to today is that Google is now trying to keep people on the result page for as long as they can.

An example is the knowledge graph. In Mediative's study, when searchers were looking for "weather in New Orleans", the results page that was presented to them showed exactly what they needed to know. Participants were asked to click on the result that they felt best met their needs, even if, in reality, they wouldn't have clicked through (in order to end that task). When a knowledge graph result exactly met the intent of the searcher, the study found 80% of people looked at that result, and 44% clicked on it. Google provided searchers with a relevant enough answer to keep them on the SERP. The top organic listing captured 36.5% of page clicks—compared to 82% when the knowledge graph did not provide the searcher with the answer they were looking for.

It's a similar case with the carousel results; when a searcher clicks on a listing, instead of going through to the listing's website, another SERP is presented specifically about the business, as Google tries to increase paid ad impressions/clicks on the Google search results page.

How can businesses stay on top of these changes and ensure they still get listed?

There are four main things to keep in mind:

1. The basic fundamentals of SEO are as important as ever

Create unique, fresh content which speaks to the needs of your customers, as this will always trump chasing the algorithm. There are also on-page and off-page SEO tactics that you can employ to increase your chances of being listed in areas of the SERP other than your website's organic listing, such as front-loading keywords in page titles and meta descriptions, getting listed on directories and ratings-and-reviews sites, and having social pages. It's important to note that SEO strategy is no longer a one-size-fits-all approach.

2. Consider using schema mark-up wherever possible

In Mediative's 2014 Google SERP research, it was discovered that blog posts that had been marked up using schema to show the picture and name of the author got a significant amount of engagement, even when quite far down the first page—these listings garnered an average of 15.5% of total page clicks.

Note: As of August 2014, Google removed authorship markup entirely. However, the results are still a good example of how schema mark-up can be used to make your business listing stand out more on the SERP, potentially capturing more views and clicks, and therefore more website traffic.

In the study, participants were asked to “Imagine that you’re starting a business and you need to find a company to host your website. Use Google to find
information about website hosting companies”. The SERP presented is shown below:

Almost 45% of clicks went to 2 blog posts titled “Five Best Web Hosting Companies” and “10 Best Web Hosting Companies”.

In general, the top clicked posts were those that had titles including phrases such as:

  • “Best…”
  • “Reviews of…”
  • “Top 5…”
  • “How-to…”

According to Google, “On-page markup helps search engines understand the information on webpages and provide richer results… Google doesn’t use markup for ranking purposes at this time, but rich snippets can make your web pages appear more prominently in search results, so you may see an increase in traffic.”

Schema markup is probably the most under-utilized tool for SEO, presenting a huge opportunity for companies that do utilize the Google-approved tool. Searchmetrics reported that only 0.3% of websites use schema markup, yet over a third of Google’s results contain rich snippets (additional text, images and links below the individual search results). BruceClay.com reports rich snippets can increase CTRs of listings by between 15% and 50%, and that websites using schema markup tend to rank higher in search results.

Schema mark-up can be used to add star ratings, number of reviews, pricing (all shown in the listing below) and more to a search results page listing.
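As an illustration, the star ratings, review counts, and pricing described above are expressed with schema.org vocabulary; a minimal sketch using the JSON-LD syntax is shown below (the product name and all values are invented for the example):

```html
<!-- Illustrative only: schema.org Product markup adding a star rating,
     review count, and price to a search listing. All values are made up. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Web Hosting Plan",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "127"
  },
  "offers": {
    "@type": "Offer",
    "price": "9.99",
    "priceCurrency": "USD"
  }
}
</script>
```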


3. Know the intent of your users

Understanding what searchers are trying to discover when they conduct a search can help determine how much effort you should put into appearing in the number one organic listing, which can be an extremely difficult task without unlimited budget and resources. Even if you do make it to the number one organic listing, traffic is not guaranteed, as discovered in this research. If you’re competing with big-name brands, or ratings and review sites, and THAT is what your customers want, then you are going to struggle to compete.

The importance of your business being the first listing vs. being on the first page, therefore, is highly dependent on the searcher’s intent, plus the strength of your brand. The key is to always keep user intent top-of-mind, and this can be established by talking to real people, rather than
guessing. What are they looking for when they are searching for your site? Structure your content around what people really want and need, list your site
on the directories that people actually visit or reference, create videos (if that’s what your audience wants)—know what your actual customers are
looking for, and then provide it.

There are going to be situations when a business can’t get to number one in the organic listings. As previously mentioned, the study shows that this is still the key place to be, and the top organic listing captures more clicks than any other single listing. But if your chances of getting to that number one spot are slim, you need to focus on other areas of the SERP, such as positions #4 or higher, which will be easier to rank for. Businesses that are positioned lower on the SERP (especially positions 2-4) see more click activity than they did several years ago, making this real estate much more valuable. As Gord Hotchkiss writes, searchers tend to “chunk” information on the SERP and scan each chunk in the same way they used to scan the entire SERP: in a triangle pattern. Getting listed at the top of a “chunk” can therefore be effective for many businesses. This idea of “chunking” and scanning can be seen in the heat map below.

To add to that, Mediative’s research showed that everything located above the top 4 organic listings (so, carousel results, knowledge graph, paid listings,
local listings etc.) combined captured 84% of clicks. If you can’t get your business listing to #1, but can get listed somewhere higher than #4, you have a good chance of being seen and clicked on by searchers. Ultimately, people expect Google to continue to do its job, and respond to search queries with the
most relevant results at the top. The study points out that only 1% of participants were willing to click through to Page 2 to see more results. If you’re
not listed on page 1 of Google for relevant searches, you may as well not exist online.

4.
A combination of SEO and paid search can maximize your visibility in SERP areas that have the biggest impact on both branding and traffic

Even though organic listings are where many businesses are striving to be listed (and where the majority of clicks take place), it’s important not to
forget about paid listings as a component of your digital strategy. Click-through rates for top sponsored listings (positions 1 and 2) have changed very
little in the past decade. Where the huge change has taken place is in the ability of sponsored ads on the right rail to attract attention and clicks.
Activity on this section of the page is almost non-existent. This can be put down to a couple of factors: searchers’ conditioned behaviour, as mentioned before, of scanning more vertically thanks to our increased mobile usage; and the fact that, over the years, we have learned that those results may not typically be very relevant, or as good as the organic results, so we tend not to even take the time to view them.

Mediative’s research also found that there are branding effects of paid search, even if not directly driving traffic. We asked participants to “Imagine you
are traveling to New Orleans and are looking for somewhere to meet a friend for dinner in the French Quarter area. Use Google to find a restaurant.”
Participants were presented with a SERP showing 2 paid ads—the first was for opentable.com, and the second for the restaurant Remoulade, remoulade.com.

The top sponsored listing, opentable.com, was viewed by 84% of participants, and captured 26% of clicks. The second listing, remoulade.com, only captured
2% of clicks but was looked at by 73% of participants. By being seen by almost 3/4 of participants, the paid listing can increase brand affinity, and
therefore purchase (or choice) consideration in other areas! For example, if the searcher comes back and searches again another time, or clicks to
opentable.com and then sees Remoulade listed, it may benefit from a higher brand affinity from having already been seen in the paid listings. Mediative
conducted a
Brand Lift study featuring Honda that found the more real estate that brands own on the SERP, the higher the
CTR, and the higher the brand affinity, brand recognition, purchase consideration, etc. Using paid search for more of a branding play is essentially free brand advertising: while you should be prepared to get the clicks and pay for them, of course, it is likely that your business listing will be seen by a large number of people without capturing the same number of clicks. Impression data can also be easily tracked with Google paid ads, so you know exactly how many times your ad was shown, and can therefore estimate how many people actually looked at it from a branding point of view.
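To make that last point concrete, here is a small sketch (using hypothetical numbers, not figures from the study) of how reported impression data can be turned into a CTR and a rough branding-exposure estimate:

```python
# Hypothetical numbers: turning paid-ad impression data into a CTR and a
# rough estimate of branding exposure (views that never become clicks).

def ctr_percent(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage of impressions."""
    return 100.0 * clicks / impressions

impressions = 50_000   # times the ad was shown (reported by the ad platform)
clicks = 1_000         # times the ad was clicked

print(f"CTR: {ctr_percent(clicks, impressions):.1f}%")  # CTR: 2.0%

# The study found a top sponsored listing was viewed by 84% of searchers
# while capturing only 26% of clicks, so impressions can proxy for brand
# exposure far beyond what click counts alone suggest.
estimated_views = int(impressions * 0.84)
print(f"Estimated branding exposures: {estimated_views:,}")
```

The 84% view rate here is borrowed from the study's eye-tracking figure for the top sponsored listing; your own view rate will vary by position and page layout.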


Rebecca Maynes is a Marketing Communications Strategist with Mediative, and was a major contributor on this study. The full study, including
click-through rates for all areas of the SERP, can be downloaded at www.mediative.com/SERP.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
