How to Combat 5 of the SEO World’s Most Infuriating Problems – Whiteboard Friday

Posted by randfish

These days, most of us have learned that spammy techniques aren’t the way to go, and we have a solid sense for the things we should be doing to rank higher, and ahead of our often spammier competitors. Sometimes, maddeningly, it just doesn’t work. In today’s Whiteboard Friday, Rand talks about five things that can infuriate SEOs with the best of intentions, why those problems exist, and what we can do about them.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

What SEO problems make you angry?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about some of the most infuriating things in the SEO world, specifically five problems that I think plague a lot of folks and some of the ways that we can combat and address those.

I’m going to start with one of the things that really infuriates a lot of new folks to the field, especially folks who are building new and emerging sites and are doing SEO on them. You have all of these best practices lists. You might look at a web developer’s cheat sheet or sort of a guide to on-page and on-site SEO. You go, “Hey, I’m doing it. I’ve got my clean URLs, my good, unique content, my solid keyword targeting, schema markup, useful internal links, my XML sitemap, and my fast load speed. I’m mobile friendly, and I don’t have manipulative links.”

Great. “Where are my results? What benefit am I getting from doing all these things, because I don’t see one?” I took a site that was not particularly SEO friendly, maybe it’s a new site, one I just launched or an emerging site, one that’s sort of slowly growing but not yet a power player. I do all this right stuff, and I don’t get SEO results.

This makes a lot of people stop investing in SEO, stop believing in SEO, and stop wanting to do it. I can understand where you’re coming from. The challenge is not that you’ve done something wrong. It’s that this stuff, all of these things that you do right, especially things that you do right on your own site or from a best practices perspective, they don’t increase rankings. They don’t. That’s not what they’re designed to do.

1) Following best practices often does nothing for new and emerging sites

This stuff, all of these best practices are designed to protect you from potential problems. They’re designed to make sure that your site is properly optimized so that you can perform to the highest degree that you are able. But this is not actually rank boosting stuff unfortunately. That is very frustrating for many folks. So following a best practices list, the idea is not, “Hey, I’m going to grow my rankings by doing this.”

On the flip side, many folks do these things on larger, more well-established sites, sites that have a lot of ranking signals already in place. They’re bigger brands, they have lots of links to them, and they have lots of users and usage engagement signals. You fix this stuff. You fix stuff that’s already broken, and boom, rankings pop up. Things are going well, and more of your pages are indexed. You’re getting more search traffic, and it feels great. This is a challenge, on our part, of understanding what this stuff does, not a challenge on the search engine’s part of not ranking us properly for having done all of these right things.

2) My competition seems to be ranking on the back of spammy or manipulative links

What’s going on? I thought Google had introduced all these algorithms to kind of shut this stuff down. This seems very frustrating. How are they pulling this off? I look at their link profile, and I see a bunch of the directories, Web 2.0 sites — I love that the spam world decided that that’s Web 2.0 sites — article sites, private blog networks, and do follow blogs.

You look at this stuff and you go, “What is this junk? It’s terrible. Why isn’t Google penalizing them for this?” The answer, the right way to think about this and to come at this is: Are these really the reason that they rank? I think we need to ask ourselves that question.

One thing that we don’t know, that we can never know, is: Have these links been disavowed by our competitor here?

I’ve got my HulksIncredibleStore.com and their evil competitor Hulk-tastrophe.com. Hulk-tastrophe has got all of these terrible links, but maybe they disavowed those links and you would have no idea. Maybe they didn’t build those links. Perhaps those links came in from some other place. They are not responsible. Google is not treating them as responsible for it. They’re not actually what’s helping them.

If they are helping, and it’s possible they are, there are still instances where we’ve seen spam propping up sites. No doubt about it.

I think the next logical question is: Are you willing to lose your site or brand? What we almost never see anymore is sites like this, ranking on the back of these things with generally less legitimate and good links, holding those rankings for two, three, or four years. You can see it for a few months, maybe even a year, but this stuff is getting hit hard and getting hit frequently. So unless you’re willing to lose your site, pursuing their links is probably not a strategy.

Then ask: What other signals, potentially links you haven’t considered, but also non-link signals, could be helping them rank? I think a lot of us get blinded in the SEO world by link signals, and we forget to look at things like: Do they have a phenomenal user experience? Are they growing their brand? Are they doing offline kinds of things that are influencing online? Are they gaining engagement from other channels that’s then influencing their SEO? Do they have things coming in that I can’t see? If you don’t ask those questions, you can’t really learn from your competitors, and you just feel the frustration.

3) I have no visibility or understanding of why my rankings go up vs down

On my HulksIncredibleStore.com, I’ve got my infinite stretch shorts, which I don’t know why he never wears — he should really buy those — my soothing herbal tea, and my anger management books. I look at my rankings and they jump all over the place all the time. Actually, this is pretty normal. We’ve done some analyses here, and the average page-one search result shifts 1.5 to 2 positions daily. That’s from the MozCast dataset, if I’m recalling correctly. That means that, over the course of a week, it’s not uncommon or unnatural for you to be bouncing around four, five, or six positions up or down, those kinds of things.

I think we should understand what can be behind these things. That’s a very simple list. You made changes, Google made changes, your competitors made changes, or searcher behavior has changed in terms of volume, in terms of what they were engaging with, what they’re clicking on, what their intent behind searches are. Maybe there was just a new movie that came out and in one of the scenes Hulk talks about soothing herbal tea. So now people are searching for very different things than they were before. They want to see the scene. They’re looking for the YouTube video clip and those kind of things. Suddenly Hulk’s soothing herbal tea is no longer directing as well to your site.

So changes like these can happen. We can’t understand all of them. I think what’s up to us is to determine the degree of analysis and action that’s actually going to provide a return on investment. Looking at these day over day or week over week and throwing up our hands and getting frustrated probably provides very little return on investment. Looking over the long term and saying, “Hey, over the last 6 months, we can observe 26 weeks of ranking change data, and we can see that in aggregate we are now ranking higher and for more keywords than we were previously, and so we’re going to continue pursuing this strategy. This is the set of keywords on which we’ve fallen the most, and here are the factors that we’ve identified that are consistent across that group.” I think looking at rankings in aggregate can give us some real positive ROI. Looking at one keyword one week versus the next probably provides very little ROI.
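If you want to run that kind of aggregate analysis yourself, here’s a minimal sketch, assuming a hypothetical rank-tracking export named rankings.csv with keyword, week, and rank columns (the file name and layout are assumptions, not something from any specific tool):

```
# Minimal sketch: compare rankings in aggregate across two halves of a
# 26-week window, assuming a hypothetical "rankings.csv" with columns
# keyword,week,rank (week 1-26, rank = best position that week).
import csv
from collections import defaultdict
from statistics import mean

ranks_by_week = defaultdict(list)    # week -> [rank, rank, ...]
keywords_by_week = defaultdict(set)  # week -> {keywords ranking that week}

with open("rankings.csv", newline="") as f:
    for row in csv.DictReader(f):
        week, rank = int(row["week"]), int(row["rank"])
        ranks_by_week[week].append(rank)
        keywords_by_week[week].add(row["keyword"])

def summarize(weeks):
    """Average rank and number of ranking keywords across a set of weeks."""
    ranks = [r for w in weeks for r in ranks_by_week[w]]
    kws = set().union(*(keywords_by_week[w] for w in weeks))
    return (mean(ranks) if ranks else float("nan")), len(kws)

for label, weeks in (("weeks 1-13", range(1, 14)), ("weeks 14-26", range(14, 27))):
    avg, kw_count = summarize(weeks)
    print(f"{label}: average rank {avg:.1f} across {kw_count} ranking keywords")
```

The point isn’t the script itself; it’s that week-over-week noise mostly washes out once you compare six-month windows instead of individual weeks.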

4) I cannot influence or affect change in my organization because I cannot accurately quantify, predict, or control SEO

That’s true, especially with things like keyword not provided and certainly with the inaccuracy of data that’s provided to us through Google’s Keyword Planner inside of AdWords, for example, and the fact that no one can really control SEO, not fully anyway.

You get up in front of your team, your board, your manager, your client and you say, “Hey, if we don’t do these things, traffic will suffer,” and they go, “Well, you can’t be sure about that, and you can’t perfectly predict it. Last time you told us something, something else happened. So because the data is imperfect, we’d rather spend money on channels that we can perfectly predict, that we can very effectively quantify, and that we can very effectively control.” That is understandable. I think that businesses have a lot of risk aversion naturally, and so wanting to spend time and energy and effort in areas that you can control feels a lot safer.

Some ways to get around this are, first off, know your audience. If you know who you’re talking to in the room, you can often determine the things that will move the needle for them. For example, I find that many managers, many boards, many executives are much more influenced by competitive pressures than they are by, “We won’t do as well as we did before,” or “We’re losing out on this potential opportunity.” Saying that is less powerful than saying, “This competitor, who I know we care about and we track ourselves against, is capturing this traffic and here’s how they’re doing it.”

Show multiple scenarios. Many of the SEO presentations that I see and have seen and still see from consultants and from in-house folks come with kind of a single, “Hey, here’s what we predict will happen if we do this or what we predict will happen if we don’t do this.” You’ve got to show multiple scenarios, especially when you know you have error bars because you can’t accurately quantify and predict. You need to show ranges.

So instead of this, I want to see: What happens if we do it a little bit? What happens if we really overinvest? What happens if Google makes a much bigger change on this particular factor than we expect or our competitors do a much bigger investment than we expect? How might those change the numbers?

Then I really do like bringing case studies, especially if you’re a consultant, but even in-house. There are so many SEO case studies on the Web today that you can almost always find someone who’s analogous or nearly analogous and show some of their data, some of the results that they’ve seen. Places like SEMrush, a tool that offers competitive intelligence around rankings, can be great for that. You can show, hey, this media site in our sector made these changes. Look at the delta of keywords they were ranking for versus ours over the next six months. Correlation is not causation, but showing those kinds of things can be a powerful influencer.

Then last, but not least, any time you’re going to get up like this and present to a group around these topics, if you possibly can, try to talk one-on-one with the participants before the meeting actually happens. I have found it almost universally the case that if you haven’t had the discussions beforehand, like “What are your concerns? What do you think is not valid about this data? Hey, I want to run this by you and get your thoughts before we go to the meeting,” people can gang up and pile on in the group setting. One person says, “Hey, I don’t think this is right,” and everybody in the room kind of looks around and goes, “Yeah, I also don’t think that’s right.” Then it just turns into warfare and conflict that you don’t want or need. If you address those things beforehand, then you can include the data, the presentations, and the “I don’t know the answer to this and I know this is important to so-and-so” in that presentation or in that discussion. It can be hugely helpful. Big difference between winning and losing with that.

5) Google is biased toward big brands. It feels hopeless to compete against them

A lot of people are feeling this hopelessness, hopelessness in SEO about competing against them. I get that pain. In fact, I’ve felt that very strongly for a long time in the SEO world, and I think the trend has only increased. This comes from all sorts of stuff. Brands now have the little dropdown next to their search result listing. There are these brand and entity connections. As Google is using answers and knowledge graph more and more, it’s feeling like those entities are having a bigger influence on where things rank and where they’re visible and where they’re pulling from.

User and usage behavior signals on the rise means that big brands, who have more of those signals, tend to perform better. Brands in the knowledge graph, brands growing links without any effort, they’re just growing links because they’re brands and people point to them naturally. Well, that is all really tough and can be very frustrating.

I think you have a few choices on the table. First off, you can choose to compete with brands where they can’t or won’t. So this is areas like we’re going after these keywords that we know these big brands are not chasing. We’re going after social channels or people on social media that we know big brands aren’t. We’re going after user generated content because they have all these corporate requirements and they won’t invest in that stuff. We’re going after content that they refuse to pursue for one reason or another. That can be very effective.

Second, you’d better be building, growing, and leveraging your competitive advantage. Whenever you build an organization, you’ve got to say, “Hey, here’s who is out there. This is why we are a uniquely better choice for this set of customers than these other ones.” If you can leverage that, you can generally find opportunities to compete and even to win against big brands. But those advantages have to become obvious and well-known, and you need to essentially build some of your brand around them, or they’re not going to help you in search. That includes media, content, any sort of press and PR you’re doing, and how you do your own messaging, all of these things.

Third, you can choose to serve a market or a customer that they don’t or won’t serve. That can be a powerful way to go about search, because usually search is bifurcated by the customer type. There will be slightly different forms of search queries that are entered by different kinds of customers, and you can pursue one of those that isn’t pursued by the competition.

Last, but not least, I think for everyone in SEO we all realize we’re going to have to become brands ourselves. That means building the signals that are typically associated with brands — authority, recognition from an industry, recognition from a customer set, awareness of our brand even before a search has happened. I talked about this in a previous Whiteboard Friday, but I think because of these things, SEO is becoming a channel that you benefit from as you grow your brand rather than the channel you use to initially build your brand.

All right, everyone. Hope these have been helpful in combating some of these infuriating, frustrating problems and that we’ll see some great comments from you guys. I hope to participate in those as well, and we’ll catch you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Technical Site Audit Checklist: 2015 Edition

Posted by GeoffKenyon

Back in 2011, I wrote a technical site audit checklist, and while it was thorough, there have been a lot of additions to what is encompassed in a site audit. I have gone through and updated that old checklist for 2015. Some of the biggest changes were the addition of sections for mobile, international, and site speed.

This checklist should help you put together a thorough site audit and determine what is holding back the organic performance of your site. At the end of your audit, don’t write a document that says what’s wrong with the website. Instead, create a document that says what needs to be done. Then explain why these actions need to be taken and why they are important. What I’ve found to be really helpful is to provide, along with your document, a prioritized list of all the actions that you would like them to implement. This list can be handed off to a dev or content team to be implemented easily. These teams can refer to your more thorough document as needed.


Quick overview

Check indexed pages  
  • Do a site: search.
  • How many pages are returned? (This number can be way off, so don’t put too much stock in it.)
  • Is the homepage showing up as the first result? 
  • If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern as Google’s John Mueller recently said that your homepage doesn’t need to be listed first.

Review the number of organic landing pages in Google Analytics

  • Does this match with the number of results in a site: search?
  • This is often the best view of how many of the pages in a search engine’s index it actually finds valuable (i.e., pages that receive organic traffic).

Search for the brand and branded terms

  • Is the homepage showing up at the top, or are correct pages showing up?
  • If the proper pages aren’t showing up as the first result, there could be issues, like a penalty, in play.
Check Google’s cache for key pages
  • Is the content showing up?
  • Are navigation links present?
  • Are there links that aren’t visible on the site?
PRO Tip: Don’t forget to check the text-only version of the cached page. Here is a bookmarklet to help you do that.

Do a mobile search for your brand and key landing pages

  • Does your listing have the “mobile friendly” label?
  • Are your landing pages mobile friendly?
  • If the answer is no to either of these, it may be costing you organic visits.

On-page optimization

Title tags are optimized
  • Title tags should be optimized and unique.
  • Your brand name should be included in your title tag to improve click-through rates.
  • Title tags should be about 55-60 characters (512 pixels) to be fully displayed. You can test this or review title pixel widths in Screaming Frog.
Important pages have click-through rate optimized titles and meta descriptions
  • This will help improve your organic traffic independent of your rankings.
  • You can use SERP Turkey for this.

Check for pages missing page titles and meta descriptions
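If you don’t have a crawler handy, this check is easy to script for a handful of key pages. Here’s a minimal sketch, assuming a placeholder URL list and the requests and beautifulsoup4 libraries (neither of which is named in this checklist):

```
# Minimal sketch: flag pages with missing or over-length titles and
# missing meta descriptions. The URL list is a placeholder.
import requests
from bs4 import BeautifulSoup

URLS = ["https://example.com/", "https://example.com/category/"]

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = (soup.title.string or "").strip() if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = (desc_tag.get("content") or "").strip() if desc_tag else ""
    if not title or len(title) > 60:
        print(f"{url}: title missing or longer than 60 characters ({len(title)})")
    if not desc:
        print(f"{url}: meta description missing")
```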
  
The on-page content includes the primary keyword phrase multiple times as well as variations and alternate keyword phrases
  
There is a significant amount of optimized, unique content on key pages
 
The primary keyword phrase is contained in the H1 tag
  

Images’ file names and alt text are optimized to include the primary keyword phrase associated with the page.
 
URLs are descriptive and optimized
  • While it is beneficial to include your keyword phrase in URLs, changing URLs requires 301 redirects and can negatively impact traffic. As such, I typically recommend optimizing URLs only when the current ones are really bad or when the URLs being changed don’t have existing external links pointing to them.
Clean URLs
  • No excessive parameters or session IDs.
  • URLs exposed to search engines should be static.
Short URLs
  • 115 characters or shorter – this character limit isn’t set in stone, but shorter URLs are better for usability.

Content

Homepage content is optimized
  • Does the homepage have at least one paragraph?
  • There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.
Landing pages are optimized
  • Do these pages have at least a few paragraphs of content? Is it enough to give search engines an understanding of what the page is about?
  • Is it template text or is it completely unique?
Site contains real and substantial content
  • Is there real content on the site or is the “content” simply a list of links?
Proper keyword targeting
  • Does the intent behind the keyword match the intent of the landing page?
  • Are there pages targeting head terms, mid-tail, and long-tail keywords?
Keyword cannibalization
  • Do a site: search in Google for important keyword phrases.
  • Check for duplicate content/page titles using the Moz Pro Crawl Test.
Content to help users convert exists and is easily accessible to users
  • In addition to search engine driven content, there should be content to help educate users about the product or service.
Content formatting
  • Is the content formatted well and easy to read quickly?
  • Are H tags used?
  • Are images used?
  • Is the text broken down into easy to read paragraphs?
Good headlines on blog posts
  • Good headlines go a long way. Make sure the headlines are well written and draw users in.
Amount of content versus ads
  • Since the implementation of Panda, the amount of ad-space on a page has become important to evaluate.
  • Make sure there is significant unique content above the fold.
  • If you have more ads than unique content, you are probably going to have a problem.

Duplicate content

There should be one URL for each piece of content
  • Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
  • Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.
Pro Tip: Exclude common parameters, such as those used to designate tracking code, in Google Webmaster Tools. Read more at Search Engine Land.
Do a search to check for duplicate content
  • Take a content snippet, put it in quotes and search for it.
  • Does the content show up elsewhere on the domain?
  • Has it been scraped? If the content has been scraped, you should file a content removal request with Google.
Sub-domain duplicate content
  • Does the same content exist on different sub-domains?
Check for a secure version of the site
  • Does the content exist on a secure version of the site?
Check other sites owned by the company
  • Is the content replicated on other domains owned by the company?
Check for “print” pages
  • If there are “printer friendly” versions of pages, they may be causing duplicate content.
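One rough way to surface exact duplicates across parameters, subdomains, or print pages is to fingerprint the rendered text of each URL. A minimal sketch, with placeholder URLs and assuming requests/BeautifulSoup:

```
# Minimal sketch: hash the visible text of each URL so exact duplicates
# (parameterized URLs, print pages, cross-category copies) group together.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

URLS = [
    "https://example.com/product/widget",
    "https://example.com/category/widget",
    "https://example.com/product/widget?print=1",
]

groups = defaultdict(list)
for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = " ".join(soup.get_text().split())  # collapse whitespace
    groups[hashlib.sha1(text.encode("utf-8")).hexdigest()].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("Possible duplicate content:", urls)
```

Near-duplicates won’t hash identically, so treat this as a first pass rather than a substitute for the quoted-snippet searches above.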

Accessibility & Indexation

Check the robots.txt

  • Has the entire site, or important content been blocked? Is link equity being orphaned due to pages being blocked via the robots.txt?
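The Python standard library can answer the “is this blocked?” question directly. A minimal sketch with a placeholder domain and URL list:

```
# Minimal sketch: test whether key URLs are blocked for Googlebot by
# robots.txt, using only the standard library.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

KEY_URLS = [
    "https://example.com/",
    "https://example.com/category/widgets",
]

for url in KEY_URLS:
    if not parser.can_fetch("Googlebot", url):
        print(f"Blocked by robots.txt: {url}")
```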

Turn off JavaScript, cookies, and CSS

Now change your user agent to Googlebot

PRO Tip: Use SEO Browser to do a quick spot check.

Check the SEOmoz PRO Campaign

  • Check for 4xx errors and 5xx errors.

XML sitemaps are listed in the robots.txt file

XML sitemaps are submitted to Google/Bing Webmaster Tools

Check pages for meta robots noindex tag

  • Are pages accidentally being tagged with the meta robots noindex command?
  • Are there pages that should have the noindex command applied?
  • You can check the site quickly via a crawl tool such as Moz or Screaming Frog.
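A crawl tool is the right answer at scale, but for a handful of key URLs a minimal sketch like this (placeholder URLs, assuming requests/BeautifulSoup) covers both the meta tag and the X-Robots-Tag response header:

```
# Minimal sketch: flag pages carrying a noindex directive, whether it is
# in a meta robots tag or an X-Robots-Tag response header.
import requests
from bs4 import BeautifulSoup

for url in ["https://example.com/", "https://example.com/thank-you"]:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_content = (meta.get("content") or "").lower() if meta else ""
    header = resp.headers.get("X-Robots-Tag", "").lower()
    if "noindex" in meta_content or "noindex" in header:
        print(f"noindex: {url}")
```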

Do goal pages have the noindex command applied?

  • This is important to prevent direct organic visits from showing up as goals in analytics

Site architecture and internal linking

Number of links on a page
Vertical linking structures are in place
  • Homepage links to category pages.
  • Category pages link to sub-category and product pages as appropriate.
  • Product pages link to relevant category pages.
Horizontal linking structures are in place
  • Category pages link to other relevant category pages.
  • Product pages link to other relevant product pages.
Links are in content
  • Does not utilize massive blocks of links stuck in the content to do internal linking.
Footer links
  • Does not use a block of footer links instead of proper navigation.
  • Does not link to landing pages with optimized anchors.
Good internal anchor text
 
Check for broken links
  • Link Checker and Xenu are good tools for this.

Technical issues

Proper use of 301s
  • Are 301s being used for all redirects?
  • If the root is being redirected to a landing page, are they using a 301 instead of a 302?
  • Use Live HTTP Headers Firefox plugin to check 301s.
“Bad” redirects are avoided
  • These include 302s, 307s, meta refresh, and JavaScript redirects as they pass little to no value.
  • These redirects can easily be identified with a tool like Screaming Frog.
Redirects point directly to the final URL and do not leverage redirect chains
  • Redirect chains significantly diminish the amount of link equity associated with the final URL.
  • Google has said that they will stop following a redirect chain after several redirects.
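Screaming Frog will surface these in bulk, but a minimal sketch (placeholder URLs, assuming the requests library) makes it easy to see every hop in a chain:

```
# Minimal sketch: print each hop of a redirect so 302s, 307s, and
# multi-hop chains stand out.
import requests

for start_url in ["http://example.com", "http://example.com/old-page"]:
    resp = requests.get(start_url, allow_redirects=True, timeout=10)
    hops = resp.history + [resp]  # intermediate responses plus the final one
    if len(hops) > 1:
        print(f"{start_url} ({len(hops) - 1} redirect hop(s)):")
        for hop in hops:
            print(f"  {hop.status_code}  {hop.url}")
```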
Use of JavaScript
  • Is content being served in JavaScript?
  • Are links being served in JavaScript? Is this to do PR sculpting or is it accidental?
Use of iFrames
  • Is content being pulled in via iFrames?
Use of Flash
  • Is the entire site done in Flash, or is Flash used sparingly in a way that doesn’t hinder crawling?
Check for errors in Google Webmaster Tools
  • Google WMT will give you a good list of technical problems that they are encountering on your site (such as: 4xx and 5xx errors, inaccessible pages in the XML sitemap, and soft 404s)
XML Sitemaps  
  • Are XML sitemaps in place?
  • Are XML sitemaps covering for poor site architecture?
  • Are XML sitemaps structured to show indexation problems?
  • Do the sitemaps follow proper XML protocols?
Canonical version of the site established through 301s
 
Canonical version of site is specified in Google Webmaster Tools
 
Rel canonical link tag is properly implemented across the site
Uses absolute URLs instead of relative URLs
  • This can cause a lot of problems if you have a root domain with secure sections.
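To spot-check the rel=canonical implementation mentioned above, a minimal sketch (placeholder URLs, assuming requests/BeautifulSoup) can compare each page’s canonical to its own URL:

```
# Minimal sketch: report pages with a missing canonical tag or a canonical
# that points somewhere other than the URL itself.
import requests
from bs4 import BeautifulSoup

for url in ["https://example.com/widgets", "https://example.com/widgets?sort=price"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    if not canonical:
        print(f"No rel=canonical: {url}")
    elif canonical.rstrip("/") != url.split("?")[0].rstrip("/"):
        print(f"{url} canonicalizes to {canonical}")
```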

Site speed


Review page load time for key pages

Make sure compression is enabled

Enable caching

Optimize your images for the web

Minify your CSS/JS/HTML

Use a good, fast host
  • Consider using a CDN for your images.
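Compression and caching are easy to verify from response headers. A minimal sketch with placeholder URLs, assuming the requests library:

```
# Minimal sketch: check whether responses come back compressed and with
# caching headers set.
import requests

for url in ["https://example.com/", "https://example.com/assets/main.css"]:
    resp = requests.get(url, timeout=10)
    print(url)
    print("  Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
    print("  Cache-Control:   ", resp.headers.get("Cache-Control", "not set"))
```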


Mobile

Review the mobile experience
  • Is there a mobile site set up?
  • If there is, is it a mobile site, responsive design, or dynamic serving?


Make sure analytics are set up if separate mobile content exists


If dynamic serving is being used, make sure the Vary HTTP header is being used

Review how the mobile experience matches up with the intent of mobile visitors
  • Do your mobile visitors have a different intent than desktop based visitors?
Ensure faulty mobile redirects do not exist
  • If your site redirects mobile visitors away from their intended URL (typically to the homepage), you’re likely going to run into issues impacting your mobile organic performance.
Ensure that the relationship between the mobile site and desktop site is established with proper markup
  • If a mobile site (m.) exists, does the desktop equivalent URL point to the mobile version with rel="alternate"?
  • Does the mobile version canonical to the desktop version?
  • Official documentation.
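That bidirectional annotation is easy to get wrong, so it’s worth spot-checking. A minimal sketch with placeholder desktop/mobile URLs, assuming requests/BeautifulSoup:

```
# Minimal sketch: confirm the desktop page points to the m. version with
# rel="alternate" and the mobile page canonicals back to the desktop URL.
import requests
from bs4 import BeautifulSoup

desktop = "https://example.com/widgets"
mobile = "https://m.example.com/widgets"

def link_hrefs(url, rel_value):
    """Return the href values of <link> tags with the given rel."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return [tag.get("href") for tag in soup.find_all("link", rel=rel_value)]

if mobile not in link_hrefs(desktop, "alternate"):
    print("Desktop page is missing rel=alternate pointing to the mobile URL")
if desktop not in link_hrefs(mobile, "canonical"):
    print("Mobile page does not canonical back to the desktop URL")
```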

International

Review international versions indicated in the URL
  • ex: site.com/uk/ or uk.site.com
Enable country based targeting in webmaster tools
  • If the site is targeted to one specific country, is this specified in webmaster tools? 
  • If the site has international sections, are they targeted in webmaster tools?
Implement hreflang / rel alternate if relevant (a quick check is sketched at the end of this section)
If there are multiple versions of a site in the same language (such as /us/ and /uk/, both in English), make sure the copy has been updated so that each version is unique
 

Make sure the currency reflects the country targeted
 
Ensure the URL structure is in the native language 
  • Try to avoid having all URLs in the default language
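For the hreflang item above, a minimal sketch (placeholder URLs, assuming requests/BeautifulSoup) that lists each page’s annotations makes missing or one-way references obvious:

```
# Minimal sketch: print the hreflang annotations on each international
# version so you can confirm they reference each other consistently.
import requests
from bs4 import BeautifulSoup

for url in ["https://example.com/us/", "https://example.com/uk/"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    print(url)
    for tag in soup.find_all("link", rel="alternate", hreflang=True):
        print(f"  hreflang={tag['hreflang']} -> {tag.get('href')}")
```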

Analytics

Analytics tracking code is on every page
  • You can check this using the “custom” filter in a Screaming Frog crawl, by looking for self-referrals, or with the quick script sketched at the end of this section.
  • Are there pages that should be blocked?
There is only one instance of a GA property on a page
  • Loading the same Google Analytics property more than once on a page will create problems with pageview-related metrics, such as inflated pageviews and pages per visit and an artificially lowered bounce rate.
  • It is OK to have multiple different GA properties listed; this won’t cause a problem.
Analytics is properly tracking and capturing internal searches
 

Demographics tracking is set up

Adwords and Adsense are properly linked if you are using these platforms
Internal IP addresses are excluded
UTM Campaign Parameters are used for other marketing efforts
Meta refresh and JavaScript redirects are avoided
  • These can artificially lower bounce rates.
Event tracking is set up for key user interactions
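A Screaming Frog custom filter is the usual route for the “tracking code on every page” check, but a minimal sketch (placeholder URLs and a hypothetical property ID) does the same for a short list of pages:

```
# Minimal sketch: count occurrences of the GA property ID in each page's
# HTML. Zero means the page is untracked; more than one of the same
# property inflates pageview metrics.
import requests

PROPERTY_ID = "UA-XXXXXX-1"  # placeholder; use your own property ID

for url in ["https://example.com/", "https://example.com/blog/"]:
    html = requests.get(url, timeout=10).text
    count = html.count(PROPERTY_ID)
    if count != 1:
        print(f"{url}: {count} occurrence(s) of {PROPERTY_ID}")
```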

This audit covers the main technical elements of a site and should help you uncover any issues that are holding a site back. As with any project, the deliverable is critical. I’ve found focusing on the solution and impact (business case) is the best approach for site audit reports. While it is important to outline the problems, too much detail here can take away from the recommendations. If you’re looking for more resources on site audits, I recommend the following:

Helpful tools for doing a site audit:

Annie Cushing’s Site Audit
Web Developer Toolbar
User Agent Add-on
Firebug
Link Checker
SEObook Toolbar
MozBar (Moz’s SEO toolbar)
Xenu
Screaming Frog
Your own scraper
Inflow’s technical mobile best practices

