The updates add guidelines on content expertise and interstitial pages, while lumping “E-A-T” in with “Page Quality.”
Google made some small tweaks to its quality raters guidelines on July 27, 2017, adding details around non-English language web pages. The previous update, on March 14, 2017, was much larger.
As an open-source ecommerce platform, Magento is flexible and accessible for developers to work with. As a result, an active community of developers has emerged on online forums and at offline meetups all over the world. Many of them were happily plugging away independently of Magento until its split from eBay in early 2015.
Free from the reins of eBay, Magento has been decisively reaching out to, promoting and rewarding the individuals, agencies and technology providers that make up its ecosystem. Last February it announced the Magento Masters Program, empowering its top platform advocates, frequent forum contributors and innovative solution implementers. Then at April’s Magento Imagine conference (the largest yet), the theme that emerged was ‘We are Magento’, in celebration of the community.
The new Xcelerate Technology Partner Program focuses not on individuals but on business partnerships formed with the technology companies that offer tools for Magento merchants to implement.
Sharing ideas, opportunities and successes:
This is the Xcelerate Program tagline, which acts as a sort of mission statement for the technology partners involved: keep considering Magento in their own technology roadmaps, and jointly communicate successes and learnings from implementations with merchants.
“In turn, the program offers members the tools to get moving, through events, resources and contacts. Our goal is to enable you to be an integral part of the Magento ecosystem,” says Jon Carmody, Head of Technology Partners.
The program in practice:
The new program is accompanied by the new Marketplace, from which extensions can be purchased and downloaded. The program splits extensions into three partnership levels:
Registered Partners – these are technology extensions that the new Magento Marketplace team tests for code quality. Extensions must now pass this initial level to be eligible for the Marketplace. With the average merchant running 15 extensions on their site, this is a win for merchants when it comes to extension trustworthiness.
Select Partners – extensions can enter this second tier if the technology falls into one of the strategic categories identified by Magento and if they pass an in-depth technical review. These will be marked as ‘Select’ in the Marketplace.
Premier Partners – this level is by invitation only, reserved for providers of crucial technology for Magento merchants (such as payments, marketing and tax software). The Magento team’s Extension Quality Program looks at coding structure, performance, scalability, security and compatibility, but influence in the community is also a consideration. dotmailer is proud to be the first Premier Technology Partner in the marketing space for Magento.
All in all, the latest move from Magento to illuminate its ecosystem should be positive for everyone: merchants can now choose from a vetted list of extensions and know when to expect tight integration; technology partners can build extensions with clearer merchant needs and extension gaps in mind, guided by Magento; and solution implementers can recommend the best extension for a merchant, knowing it will be maintained.
Posted by Cyrus-Shepard
Recently, Moz announced the results of our biennial Ranking Factors study. Today, we’d like to explore one of the most vital elements of the study: the Ranking Factors survey.
2015 Ranking Factors Expert Survey
Every two years, Moz surveys the brightest minds in SEO and search marketing with a comprehensive set of questions meant to gauge the current workings of Google’s search algorithm. This year’s panel of experts possesses a truly unique set of knowledge and perspectives. We’re thankful on behalf of the entire community for their contribution.
In addition to asking the participants about what does and doesn’t work in Google’s ranking algorithm today, one of the most illuminating groups of questions asks the panel to predict the future of search – how the features of Google’s algorithm are expected to change over the next 12 months.
Amazingly, almost all of the factors expected to increase in influence revolve around user experience.
The experts predicted that more traditional ranking signals, such as those around links and URL structures, would largely remain the same, while the more manipulative aspects of SEO, like paid links and anchor text (which is subject to manipulation), would largely decrease in influence.
The survey also asks respondents to weight the importance of various factors within Google’s current ranking algorithm (on a scale of 1-10). Understanding these areas of importance helps to inform webmasters and marketers where to invest time and energy in working to improve the search presence of their websites.
These features describe use of the keyword term/phrase in particular parts of the HTML code on the page (title element, H1s, alt attributes, etc).
Highest influence: Keyword present in title element, 8.34
Lowest influence: Keyword present in specific HTML elements (bold/italic/li/a/etc), 4.16
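To make "keyword usage" more concrete, here is a minimal sketch (not part of the survey itself) of how you might check whether a target phrase appears in a page's title, H1s and image alt attributes. It assumes the `requests` and `beautifulsoup4` packages are installed; the URL and phrase are placeholders.

```python
# Minimal sketch: report where a target phrase appears in a page's HTML.
# Assumes `requests` and `beautifulsoup4`; URL and phrase are placeholders.
import requests
from bs4 import BeautifulSoup

def keyword_usage(url: str, phrase: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    phrase = phrase.lower()

    title = (soup.title.string or "") if soup.title else ""
    h1s = [h.get_text(" ", strip=True) for h in soup.find_all("h1")]
    alts = [img.get("alt", "") for img in soup.find_all("img")]

    return {
        "in_title": phrase in title.lower(),
        "in_h1": any(phrase in h.lower() for h in h1s),
        "in_alt": any(phrase in a.lower() for a in alts),
    }

if __name__ == "__main__":
    print(keyword_usage("https://example.com/", "airplane propellers"))
```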
Titles are still very powerful. Overall, it’s about focus and matching query syntax. If your post is about airplane propellers but you go on a three-paragraph rant about gorillas, you’re going to have a problem ranking for airplane propellers.
Keyword usage is vital to making the cut, but we don’t always see it correlate with ranking, because we’re only looking at what already made the cut. The page has to be relevant to appear for a query, IMO, but when it comes to how high the page ranks once it’s relevant, I think keywords have less impact than they once did. So, it’s a necessary but not sufficient condition to ranking.
In my experience, most of the problems with organic visibility are related to on-page factors. When I look for an opportunity, I try to check for two strong things: presence of the keyword in the title and in the main content. Having both can speed up your visibility, especially on long-tail queries.
These features cover how keywords are used in the root or subdomain name, and how much impact this might have on search engine rankings.
Highest influence: Keyword is the exact match root domain name, 5.83
Lowest influence: Keyword is the domain extension, 2.55
The only domain/keyword factor I’ve seen really influence rankings is an exact match. Subdomains, partial match, and others appear to have little or no effect.
There’s no direct influence, but an exact match root domain name can definitely lead to a higher CTR within the SERPs and therefore a better ranking in the long term.
It’s very easy to link keyword-rich domains with their success in Google’s results for the given keyword. I’m always mindful about other signals that align with the domain name which may have contributed to its success. These include inbound links, mentions, and local citations.
These features describe link metrics for the individual ranking page (such as number of links, PageRank, etc).
Highest influence: Raw quantity of links from high-authority sites, 7.78
Lowest influence: Sentiment of the external links pointing to the page, 3.85
High-quality links still rule rankings. The way a brand can earn links has become more important over the years, whereas link schemes can hurt a site more than ever before. There is a lot of FUD slinging in this respect!
Similar to my thoughts on content, I suspect link-based metrics are going to be used increasingly with a focus on verisimilitude (whether content is actually true or not) and relationships between nodes in Knowledge Graph. Google’s recent issues with things, such as the snippet results for “evolution,” highlight the importance of them only pulling things that are factually correct for featured parts of a SERP. Thus, just counting traditional link metrics won’t cut it anymore.
While anchor text is still a powerful ranking factor, using targeted anchor text carries a significant amount of risk and can easily wipe out your previous success.
These features describe elements that indicate qualities of branding and brand metrics.
Highest influence: Search volume for the brand/domain, 6.54
Lowest influence: Popularity of business’s official social media profiles, 3.99
This is clearly on deck to change very soon with the reintegration of Twitter into Google’s Real-Time Results. It will be interesting to see how this affects the “Breaking News” box and trending topics. Social influencers, quality and quantity of followers, RTs, and favorites will all be a factor. And what’s this?! Hashtags will be important again?! Have mercy!
Google has to give the people what they want, and if most of the time they are searching for a brand, Google is going to give them that brand. Google doesn’t have a brand bias, we do.
It’s already noticeable; brands are more prominently displayed in search results for both informational and commercial queries. I’m expecting Google will be paying more attention to brand-related metrics from now on (and certainly more initiatives to encourage site owners to optimize for better entity detection).
These features relate to third-party metrics from social media sources (Facebook, Twitter, Google+, etc) for the ranking page.
Highest influence: Engagement with content/URL on social networks, 3.87
Lowest influence: Upvotes for the page on social sites, 2.7
Social ranking factors are important in a revamped Query Deserves Freshness algorithm. Essentially, if your content gets a lot of natural tweets, shares, and likes, it will rank prominently for a short period of time, until larger and more authoritative sites catch up.
Social popularity has several factors to consider: (1) Years ago, Google and Bing said they take into account the authority of a social profile sharing a link and the popularity of the link being shared (retweets/reshares), and there was more complexity to social signals that was never revealed even back then. (2) My experience has been that social links and shares have more power for newsy/fresh-type content. For example, a lot of social shares for a dentist’s office website wouldn’t be nearly as powerful (or relevant to consider) as a lot of social shares for an article on a site with a constant flow of fresh content.
Honestly, I do not think that the so-called “social signals” have any direct influence on the Google Algorithm (that does not mean that a correlation doesn’t exist, though). My only doubt is related to Twitter, because of the renewed contract between Google and Twitter itself. That said, as of now I do not consider Twitter to offer any ranking signals, except for very specific niches related to news and “news-able” content, where QDF plays a fundamental role.
These elements describe non-keyword-usage, non-link-metrics features of individual pages (such as length of the page, load speed, etc).
Highest influence: Uniqueness of the content on the page, 7.85
Lowest influence: Page contains Open Graph data and/or Twitter cards, 3.64
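For readers unfamiliar with that lowest-rated factor, "Open Graph data and/or Twitter cards" simply refers to meta tags in the page head. As a rough sketch of how you might check for them (assuming `requests` and `beautifulsoup4`; the URL is a placeholder):

```python
# Minimal sketch: count Open Graph and Twitter card meta tags on a page.
import requests
from bs4 import BeautifulSoup

def social_meta(url: str) -> dict:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    og = soup.find_all("meta", attrs={"property": lambda p: p and p.startswith("og:")})
    tw = soup.find_all("meta", attrs={"name": lambda n: n and n.startswith("twitter:")})
    return {"open_graph_tags": len(og), "twitter_card_tags": len(tw)}

if __name__ == "__main__":
    print(social_meta("https://example.com/"))
```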
By branching mobile search off of Google’s core ranking algorithm, having a “mobile-friendly” website is probably now less important for desktop search rankings. Our clients are seeing an ever-increasing percentage of organic search traffic coming from mobile devices, though (particularly in retail), so this is certainly not an excuse to ignore responsive design – the opposite, in fact. Click-through rate from the SERPs has been an important ranking signal for a long time and continues to be, flagging irrelevant or poor-quality search listings.
I believe many of these will be measured within the ecosystem, rather than absolutely. For example, the effect of bounce rate (or rather, bounce speed) on a site will be relative to the bounce speeds on other pages in similar positions for similar terms.
I want to answer these a certain way because, while I have been told by Google what matters to them, what I see in the SERPs does not back up what Google claims they want. There are a lot of sites out there with horrible UX that rank in the top three. While I believe it’s really important for conversion and to bring customers back, I don’t feel as though Google is all that concerned, based on the sites that rank highly. Additionally, Google practically screams “unique content,” yet sites that more or less steal and republish content from other sites are still ranking highly. What I think should matter to Google doesn’t seem to matter to them, based on the results they give me.
These features describe link metrics about the domain hosting the page.
Highest influence: Quantity of unique linking domains to the domain, 7.45
Lowest influence: Sentiment of the external links pointing to the site, 3.91
Quantity and quality of unique linking domains at the domain level is still among the most significant factors in determining how a domain will perform as a whole in the organic search results, and is among the best SEO “spot checks” for determining if a site will be successful relative to other competitor sites with similar content and selling points.
Throughout this survey, when I say “no direct influence,” this is interchangeable with “no direct positive influence.” For example, I’ve marked exact match domain as low numbers, while their actual influence may be higher – though negatively.
Topical relevancy has, in my opinion, gained much ground as a relevant ranking factor. Although I find it most at play at the page level, I am seeing significant shifts in overall domain relevancy, driven by long-tail growth or by topically-relevant domains linking to sites. One way I judge such movements is the growth of long-tail rankings relevant to the subject, when neither anchor text (exact match or synonyms) nor the exact phrase is used in a site’s content, yet it still ranks very highly for long-tail and mid-tail synonyms.
These features relate to the entire root domain, but don’t directly describe link- or keyword-based elements. Instead, they relate to things like the length of the domain name in characters.
Highest influence: Uniqueness of content across the whole site, 7.52
Lowest influence: Length of time until domain name expires, 2.45
Character length of domain name is another correlative yet not causative factor, in my opinion. They don’t need to rule these out – it just so happens that longer domain names get clicked on, so they get ruled out quickly.
A few points: (1) Google’s document inception date patents describe how Google might handle freshness and maturity of content for a query. (2) The “trust signal” pages sound like a site quality metric that Google might use to score a page on the basis of site quality. (3) Some white papers from Microsoft on web spam signals identified multiple hyphens in subdomains as evidence of web spam. (4) The length of time until the domain expires was cited as a potential signal in Google’s patent on information retrieval through historic data, and was refuted by Matt Cutts after domain sellers started trying to use that information to sell domain extensions to “help the SEO” of a site.
I think that page speed only becomes a factor when it is significantly slow. I think that having error pages on the site doesn’t matter, unless there are so many that it greatly impacts Google’s ability to crawl.
To bring it back to the beginning, we asked the experts if they had any comments or alternative signals they think will become more or less important over the next 12 months.
While I expect that static factors, such as incoming links and anchor text, will remain influential, I think the power of these will be mediated by the presence or absence of engagement factors.
The app world and webpage world are getting lumped together. If you have the more popular app relative to your competitors, expect Google to notice.
Mobile will continue to increase, with directly-related factors increasing as well. Structured data will increase, along with more data partners and user segmentation/personalization of SERPs to match query intent, localization, and device-specific need states.
User location may have more influence in mobile SERPs as (a) more connected devices like cars and watches allow voice search, and (b) sites evolve accordingly to make such signals more accurate.
I really think that over the next 12-18 months we are going to see a larger impact of structured data in the SERPs. In fact, we are already seeing this. Google has teams that focus on artificial intelligence and machine learning. They are studying “relationships of interest” and, at the heart of what they are doing, are still looking to provide the most relevant result in the quickest fashion. Things like schema that help “educate” the search engines as to a given topic or entity are only going to become more important as a result.
For more data, check out the complete Ranking Factors Survey results.
Finally, we leave you with this infographic created by Kevin Engle which shows the relative weighting of broad areas of Google’s algorithm, according to the experts.
What’s your opinion on the future of search and SEO? Let us know in the comments below.
Posted by wrttnwrd
In spite of all the advice, the strategic discussions and the conference talks, we Internet marketers are still algorithmic thinkers. That’s obvious when you think of SEO.
Even when we talk about content, we’re algorithmic thinkers. Ask yourself: How many times has a client asked you, “How much content do we need?” How often do you still hear “How unique does this page need to be?”
That’s 100% algorithmic thinking: Produce a certain amount of content, move up a certain number of spaces.
But you and I know it’s complete bullshit.
I’m not suggesting you ignore the algorithm. You should definitely chase it. Understanding a little bit about what goes on in Google’s pointy little head helps. But it’s not enough.
I have this friend.
He ranked #10 for “flibbergibbet.” He wanted to rank #1.
He compared his site to the #1 site and realized the #1 site had five hundred blog posts.
“That site has five hundred blog posts,” he said, “I must have more.”
So he hired a few writers and cranked out five thousand blog posts that melted Microsoft Word’s grammar check. He didn’t move up in the rankings. I’m shocked.
“That guy’s spamming,” he decided, “I’ll just report him to Google and hope for the best.”
What happened? Why didn’t adding five thousand blog posts work?
It’s pretty obvious: My, uh, friend added nothing but crap content to a site that was already outranked. Bulk is no longer a ranking tactic. Google’s very aware of that tactic. Lots of smart engineers have put time into updates like Panda to compensate.
He started like this:
And ended up like this:
Alright, yeah, I was Mr. Flood The Site With Content, way back in 2003. Don’t judge me, whippersnappers.
Reality’s never that obvious. You’re scratching and clawing to move up two spots, you’ve got an overtasked IT team pushing back on changes, and you’ve got a boss who needs to know the implications of every recommendation.
Why fix duplication if rel=canonical can address it? Fixing duplication will take more time and cost more money. It’s easier to paste in one line of code. You and I know it’s better to fix the duplication. But it’s a hard sell.
Why deal with 302 versus 404 response codes and home page redirection? The basic user experience remains the same. Again, we just know that a server should return one home page without any redirects and that it should send a ‘not found’ 404 response if a page is missing. If it’s going to take 3 developer hours to reconfigure the server, though, how do we justify it? There’s no flashing sign reading “Your site has a problem!”
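If it helps make that case, the check itself is trivial to script. Here is a minimal sketch (assuming the `requests` package; the URLs are placeholders) that verifies a missing page really returns a 404 and that the home page resolves without a redirect chain:

```python
# Minimal sketch: confirm the home page doesn't redirect and a missing URL returns 404.
# Assumes `requests`; URLs are placeholders.
import requests

def check_responses(home_url: str, missing_url: str) -> None:
    home = requests.get(home_url, allow_redirects=True, timeout=10)
    missing = requests.get(missing_url, allow_redirects=False, timeout=10)

    redirect_hops = len(home.history)  # each entry in .history is a 3xx hop
    print(f"Home page: {home.status_code} after {redirect_hops} redirect(s)")
    print(f"Missing page: {missing.status_code} (should be 404, not 200 or 302)")

if __name__ == "__main__":
    check_responses("https://example.com/", "https://example.com/definitely-not-a-page")
```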
Why change this thing and not that thing?
At the same time, our boss/client sees that the site above theirs has five hundred blog posts and thousands of links from sites selling correspondence MBAs. So they want five thousand blog posts and cheap links as quickly as possible.
Cue crazy music.
SEO is, in some ways, for the insane. It’s an absurd collection of technical tweaks, content thinking, link building and other little tactics that may or may not work. A novice gets exposed to one piece of crappy information after another, with an occasional bit of useful stuff mixed in. They create sites that repel search engines and piss off users. They get more awful advice. The cycle repeats. Every time it does, best practices get more muddled.
SEO lacks clarity. We can’t easily weigh the value of one change or tactic over another. But we can look at our changes and tactics in context. When we examine the potential of several changes or tactics before we flip the switch, we get a closer balance between algorithm-thinking and actual strategy.
At some point you have to turn that knowledge into practice. You have to take action based on recommendations, your knowledge of SEO, and business considerations.
That’s hard when we can’t even agree on subdomains vs. subfolders.
I know subfolders work better. Sorry, couldn’t resist. Let the flaming comments commence.
To get clarity, take a deep breath and ask yourself:
“All other things being equal, will this change, tactic, or strategy move my site closer to perfect than my competitors?”
Breaking it down:
A change takes an existing component or policy and makes it something else. Replatforming is a massive change. Adding a new page is a smaller one. Adding ALT attributes to your images is another example. Changing the way your shopping cart works is yet another.
A tactic is a specific, executable practice. In SEO, that might be fixing broken links, optimizing ALT attributes, optimizing title tags or producing a specific piece of content.
A strategy is a broader decision that’ll cause change or drive tactics. A long-term content policy is the easiest example. Shifting away from asynchronous content and moving to server-generated content is another example.
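To make one of those tactics concrete, here is a rough sketch of a broken-link check for a single page (assuming `requests` and `beautifulsoup4`; the starting URL is a placeholder). It is deliberately simple, not a crawler:

```python
# Rough sketch: list links on one page that return a 4xx/5xx status or don't resolve.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def broken_links(page_url: str) -> list[tuple[str, int]]:
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    bad = []
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])
        if not target.startswith("http"):
            continue  # skip mailto:, javascript:, tel:, etc.
        try:
            status = requests.head(target, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = 0  # unreachable
        if status == 0 or status >= 400:
            bad.append((target, status))
    return bad

if __name__ == "__main__":
    for url, status in broken_links("https://example.com/"):
        print(status, url)
```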
No one knows exactly what Google considers “perfect,” and “perfect” can’t really exist, but you can bet a perfect web page/site would have all of the following:
These 8 categories (and any of the other bazillion that probably exist) give you a way to break down “perfect” and help you focus on what’s really going to move you forward. These different areas may involve different facets of your organization.
Your IT team can work on load time and creating an error-free front- and back-end. Link building requires the time and effort of content and outreach teams.
Tactics for relevant, visible content and current best practices in UX are going to be more involved, requiring research and real study of your audience.
What you need and what resources you have are going to impact which tactics are most realistic for you.
But there’s a basic rule: If a website would make Googlebot swoon and present zero obstacles to users, it’s close to perfect.
Assume every competing website is optimized exactly as well as yours.
Now ask: Will this [tactic, change or strategy] move you closer to perfect?
That’s the “all other things being equal” rule. And it’s an incredibly powerful rubric for evaluating potential changes before you act. Pretend you’re in a tie with your competitors. Will this one thing be the tiebreaker? Will it put you ahead? Or will it cause you to fall behind?
Perfect is great, but unattainable. What you really need is to be just a little perfect-er.
Chasing perfect can be dangerous. Perfect is the enemy of the good (I love that quote. Hated Voltaire. But I love that quote). If you wait for the opportunity/resources to reach perfection, you’ll never do anything. And the only way to reduce distance from perfect is to execute.
Instead of aiming for pure perfection, aim for more perfect than your competitors. Beat them feature-by-feature, tactic-by-tactic. Implement strategy that supports long-term superiority.
Don’t slack off. But set priorities and measure your effort. If fixing server response codes will take one hour and fixing duplication will take ten, fix the response codes first. Both move you closer to perfect. Fixing response codes may not move the needle as much, but it’s a lot easier to do. Then move on to fixing duplicates.
Do the 60% that gets you a 90% improvement. Then move on to the next thing and do it again. When you’re done, get to work on that last 40%. Repeat as necessary.
Take advantage of quick wins. That gives you more time to focus on your bigger solutions.
Google has lots of tweaks, tools and workarounds to help us mitigate sub-optimal sites:
Easy, right? All of these solutions may reduce distance from perfect (the search engines don’t guarantee it). But they don’t reduce it as much as fixing the problems.
The next time you set up rel=canonical, ask yourself:
“All other things being equal, will using rel=canonical to make up for duplication move my site closer to perfect than my competitors?”
Answer: Not if they’re using rel=canonical, too. You’re both using imperfect solutions that force search engines to crawl every page of your site, duplicates included. If you want to pass them on your way to perfect, you need to fix the duplicate content.
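One way to find out whether you actually have duplication worth fixing (rather than papering over with rel=canonical) is to fingerprint your pages and group the identical ones. A minimal sketch, assuming you already have a list of URLs to check and the `requests`/`beautifulsoup4` packages:

```python
# Minimal sketch: group URLs whose extracted text content is identical.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def duplicate_groups(urls: list[str]) -> dict[str, list[str]]:
    groups = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        text = soup.get_text(" ", strip=True).lower()
        fingerprint = hashlib.sha256(text.encode("utf-8")).hexdigest()
        groups[fingerprint].append(url)
    # keep only fingerprints shared by more than one URL
    return {fp: found for fp, found in groups.items() if len(found) > 1}

if __name__ == "__main__":
    dupes = duplicate_groups([
        "https://example.com/widgets",
        "https://example.com/widgets?sort=price",
    ])
    for group in dupes.values():
        print("Consolidate (301) these into one URL:", group)
```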
When you use Angular.js to deliver regular content pages, ask yourself:
“All other things being equal, will using HTML snapshots instead of actual, visible content move my site closer to perfect than my competitors?”
Answer: No. Just no. Not in your wildest, code-addled dreams. If I’m Google, which site will I prefer? The one that renders for me the same way it renders for users? Or the one that has to deliver two separate versions of every page?
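A crude way to see whether your JavaScript-heavy pages serve real content or an empty shell is to fetch the raw HTML (no JavaScript execution, roughly what a crawler sees before rendering) and check that the copy you care about is actually in it. A sketch, with placeholder URL and phrases:

```python
# Crude sketch: confirm key phrases appear in the server-delivered HTML,
# i.e. without executing any JavaScript.
import requests

def phrases_in_raw_html(url: str, phrases: list[str]) -> dict[str, bool]:
    html = requests.get(url, timeout=10).text.lower()
    return {phrase: phrase.lower() in html for phrase in phrases}

if __name__ == "__main__":
    result = phrases_in_raw_html(
        "https://example.com/products/airplane-propellers",
        ["airplane propellers", "free shipping"],
    )
    print(result)
```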
When you spill banner ads all over your site, ask yourself…
You get the idea. Nofollow is better than follow, but banner pollution is still pretty dang far from perfect.
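If you do run banner ads, it takes only a few lines to see how many of a page's external links carry a nofollow (or sponsored/ugc) rel value versus how many are followed. A sketch, assuming `requests` and `beautifulsoup4`; the URL is a placeholder:

```python
# Sketch: count external links with and without rel="nofollow"/"sponsored"/"ugc".
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def external_link_rels(page_url: str) -> dict[str, int]:
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    own_host = urlparse(page_url).netloc
    counts = {"nofollow_or_sponsored": 0, "followed": 0}
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if not host or host == own_host:
            continue  # internal or relative link
        rel = {r.lower() for r in (a.get("rel") or [])}
        if rel & {"nofollow", "sponsored", "ugc"}:
            counts["nofollow_or_sponsored"] += 1
        else:
            counts["followed"] += 1
    return counts

if __name__ == "__main__":
    print(external_link_rels("https://example.com/"))
```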
Mitigating SEO issues with search engine-specific tools is “fine.” But it’s far, far from perfect. If search engines are forced to choose, they’ll favor the site that just works.
By the way, distance from perfect absolutely applies to other channels.
I’m focusing on SEO, but think of other Internet marketing disciplines. I hear stuff like “How fast should my site be?” (Faster than it is right now.) Or “I’ve heard you shouldn’t have any content below the fold.” (Maybe in 2001.) Or “I need background video on my home page!” (Why? Do you have a reason?) Or, my favorite: “What’s a good bounce rate?” (Zero is pretty awesome.)
And Internet marketing venues are working to measure distance from perfect. Pay-per-click marketing has the quality score: A codified financial reward applied for seeking distance from perfect in as many elements as possible of your advertising program.
Social media venues are aggressively building their own forms of graphing, scoring and ranking systems designed to separate the good from the bad.
Really, all marketing includes some measure of distance from perfect. But no channel is more influenced by it than SEO. Instead of arguing one rule at a time, ask yourself and your boss or client: Will this move us closer to perfect?
Hell, you might even please a customer or two.
One last note for all of the SEOs in the crowd. Before you start pointing out edge cases, consider this: We spend our days combing Google for embarrassing rankings issues. Every now and then, we find one, point, and start yelling “SEE! SEE!!!! THE GOOGLES MADE MISTAKES!!!!” Google’s got lots of issues. Screwing up the rankings isn’t one of them.