WordPress SEO by Yoast: features and settings

In this second video about the Yoast plugin, Doc Rama of the SEO blog http://blograma.fr covers how to handle the SEO of your WordPress blog or site …

Reblogged 4 years ago from www.youtube.com

Link Echoes (a.k.a. Link Ghosts): Why Rankings Remain Even After Links Disappear – Whiteboard Friday

Posted by randfish

One of the more interesting phenomena illustrated by Rand’s IMEC Lab project is that of “link echoes,” sometimes referred to as “link ghosts.” The idea is that if we move a page up in rankings by pointing links to it, and then remove those links, the bump in rankings often remains.

In today’s Whiteboard Friday, Rand explains what’s going on.

One quick note: Rand mentions a bit.ly link in this video that isn’t quite accurate; here’s the correct one. =)

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Howdy Moz fans and welcome to another edition of Whiteboard Friday. This week I’m going to talk a little bit about link echoes. This is the reverberation of a link’s effect across Google’s link graph and across the rankings, an impact that persists even after the link has been removed. In the past, we have also referred to these as link ghosts, but I think link echoes is actually a much better name. I appreciate some folks pointing that out for me.

Let me show you exactly what I’m talking about. So, as you might know, I’ve been running a number of tests, and those tests have been happening through a project I call IMEC Lab. If you go to http://bit.ly/imeclab, you will find this project ongoing.

We’ve been performing a number of tests over the last six months. I started with a smaller group. The group has gotten bigger. So we’ve been able to test some really fascinating things. A number of those have been around tests related to links. I’m going to share one of those tests, because it helps really highlight what’s going on with link echoes.

So we had a page ranking number 31 for a key phrase, a not very competitive search phrase, and the only reason I’m not transparently sharing these, at least not yet, is because we’d prefer that Google didn’t know all of the websites and pages that we’re pointing links from. Otherwise, they could potentially mess with the test. We like to keep the test results as clean as possible, and so we’re not disclosing these for right now.

Another page, page B, was ranking number 11 for the same query. So for this query, page A was ranking number 31 and page B was ranking number 11. Of course, our first step . . . well, this was one of the steps in our test: we pointed 22 links from 22 different websites, from the same pages on those sites, at both A and B. We were actually trying to test anchor text, so we pointed exact-match anchor text links at A and non-matching links at B. We wanted to see which one would boost it up. Some of the links we put first, some of the links we put second. We tried to control a bunch of variables.

We ran tests like these many times. I think this particular one we repeated four or five different times. In this case, we saw A, the one that was ranking number 31, move up to position one. Just 22 links were able to move it, bam. Exact-match anchor text links were able to move it up to position one. Anchor text links are obviously still pretty darn powerful. We could see that in each of our tests.

B, which we pointed those same 22 links at, moved up 6 positions. Remember, it didn’t have the exact-match anchor text, so it moved up to position five. Still quite impressive.

Then we did something else. We took those links away. We removed all the links, and this is pretty natural. We want to run more tests, and we’re going to use some of these same sites and pages, so we removed all the links; they no longer exist. The next week, the linking pages had all been re-indexed without the links. What happened?

Well, gosh, page A, which had been ranking number 31 and moved up to 1, didn’t move, even after all those pages that had been linking to it were indexed by Google with no link there anymore. It stayed in position number one. That’s pretty weird. Almost the same thing happened with result B. It moved down one position, to ranking number six.

Even weirder, this happened over four and a half months ago. We’re now in the middle-to-end of July. This was in early-to-mid April. That’s a very long time, right? Google has indexed the pages we were linking from many times since and never seen the links. As far as we can tell, there are no new links pointing to either of those pages. At least we haven’t seen them, and none of the link tools out there have seen them. So it’s possible there are some new links, maybe.

Here’s where it gets weird. This effect of the link tests, remaining in place long after the links had been removed, happened in every single link test we ran. I counted eight where I feel highly confident that there were no confounding variables and that we followed a process just like this one. The links were pointed, the rankings rose. The links disappeared, the rankings stayed high. Eight different consecutive tests, every single time. In fact, there wasn’t one test where, when we removed the links, the rankings fell back to their original position. Some of them, like this one, fell a position or two. Almost everything that we moved from page two or three stayed on page one after we linked to it, even after removing the links.

This argues strongly in favor of a phenomenon that some SEOs have speculated about for a good amount of time. I believe one of them is Martin Panayotov — I might not be pronouncing his name correctly — and, of course, Moz contributor Michael King, iPullRank. Both of them had commented on a post years ago saying link ghosts, aka link echoes, are real. You guys should look into them. Sorry it took us so long to look into this, but this is fascinating.

Now, there could be a number of explanations behind this link echo phenomenon, the continuing reverberation of a link’s effect on a ranking. It could be that maybe that page ends up performing well in Google’s analysis of its user and usage data. It ranks well for this relatively unpopular query. It’s ranking number one. And you know what? Google’s finding that the click-throughs are still pretty high. There’s not a lot of pogo sticking back to the results. Maybe they’re going, “Hey, this page looks legit. Let’s leave it here,” even after the links disappear.

It could be that the site or page was bolstered by other factors, other ranking factors that we may not know about. It could be that every one of these eight times when we moved it up, maybe by moving it up through links we inadvertently did something else to it. Maybe that helped it rank higher for other pages, and those other pages generated links each of these times. That’s fairly unlikely when you repeat the test this many times, but not impossible.

Or it could be that Google actually has something in their algorithm around link echoes, where they say, “Hey, you know what? After a link has disappeared, that doesn’t necessarily mean we should take away the value of that link as a vote forever and ever.” In fact, maybe they keep counting it for a long time, perhaps depending on how many links the page has or how uncompetitive the search results are. Maybe they say, “You know what? Let’s leave some remnant, some echo, a ghost of that link’s value in the ranking equation for the site or page.” These things are all possible.

What’s fascinating about this in practice, to me, is that it means that, for a lot of us who worry tremendously about link reclamation, about losing links on sites or pages that publish things and later remove them, or on blogs that don’t always stay consistent over time, we may be getting more value than we think from a link that disappears in the future. Of course, learning more about how Google works, about their operations, is just fascinating to me. Google says their mission is to organize the world’s information and to make it universally accessible and useful. Well, I think part of Moz’s mission and my mission is to organize information about how Google works and make it universally accessible and useful. That’s what I hope we’re doing with some of these tests, particularly around link ghosts.

So I’m looking forward to some great comments. I’m sure many of you are going to have things that you’ve observed as well. If you’d like to follow along with this and other tests, I’d suggest checking out . . . you can go to bit.ly/mozmadscience and see the full presentation from my MozCon talk, in which I talk about link ghosts and a number of other tests we’ve been performing. I’ll be sharing a few of those individually here on Whiteboard Friday as well. But link echoes is such a fascinating one, I thought we should bring that out right away.

Thanks everyone. Take care. We’ll see you again next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from feedproxy.google.com

Advanced SEO Techniques & Services Amazing Total SEO Success #6

Part 6 – My Advanced SEO Techniques provides SEO services like no other SEO service, delivering total SEO success! My Advanced SEO Techniques brings really amazing…

Reblogged 4 years ago from www.youtube.com

Search Engine Optimization Consultant – SEO Tips & Techniques

http://onlinemediamagic.com/blog – Search Engine Optimization Consultant. Get the latest SEO tips and techniques for 2012. Learn these techniques from SEO exper…

Reblogged 4 years ago from www.youtube.com

WordPress SEO by Yoast Plugin Tutorial: How To

This is a how-to tutorial on using the WordPress SEO by Yoast plugin. This is great for your new WordPress website, and the SEO by Yoast options dramatic…

Reblogged 4 years ago from www.youtube.com

What Happened after Google Pulled Author and Video Snippets: A Moz Case Study

Posted by Cyrus-Shepard

In the past 2 months, Google made big changes to its search results.

Webmasters saw disappearing Google authorship photos, reduced video snippets, changes to local packs and in-depth articles, and more.

Here at Moz, we’ve closely monitored our own URLs to measure the effect of these changes on our actual traffic. The results surprised us.

Authorship traffic—surprising results

In the early days of authorship, many webmasters worked hard to get their photo in Google search results. I confess, I doubt anyone worked harder at author snippets than me.

Search results soon became crowded with smiling faces staring back at us. Authors hired professional photographers. Publishers worked to correctly follow Google’s guidelines to set up authorship for thousands of authors.

The race for more clicks was on.

Then on June 28th, Google cleared the page. No more author photos.

To gauge the effect on traffic, we examined eight weeks’ worth of data from Google Analytics and Webmaster Tools, before and after the change. We then compared our top 15 authorship URLs (where author photos were known to show consistently) against our top 15 non-authorship URLs; a rough sketch of that comparison follows the numbers below.

The results broke down like this:

Change in Google organic traffic to Moz

  • Total Site:  -1.76%
  • Top 15 Non-Authorship URLs:  -5.96%
  • Top 15 Authorship URLs:  -2.86%
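
The group comparison above is simple arithmetic. As a rough sketch of the approach (not Moz’s actual analysis pipeline), the snippet below computes the percentage change in organic sessions for a group of authorship URLs and a group of non-authorship URLs; every URL and number in it is hypothetical.

```python
# Hedged sketch: percent change in organic traffic for URL groups,
# before vs. after a SERP change. All figures are hypothetical.

def pct_change(before: float, after: float) -> float:
    """Percentage change from `before` to `after`."""
    return (after - before) / before * 100.0

# Hypothetical 8-week organic session totals per URL (before, after).
authorship_urls = {
    "/blog/post-a": (12_400, 12_050),
    "/blog/post-b": (9_800, 10_900),   # some URLs went up...
    "/blog/post-c": (15_200, 13_700),  # ...and some went down
}
non_authorship_urls = {
    "/learn/page-x": (22_000, 20_400),
    "/learn/page-y": (8_500, 8_300),
}

def group_change(urls: dict[str, tuple[int, int]]) -> float:
    """Group-level percent change, comparing summed before/after totals."""
    before = sum(b for b, _ in urls.values())
    after = sum(a for _, a in urls.values())
    return pct_change(before, after)

print(f"Authorship URLs:     {group_change(authorship_urls):+.2f}%")
print(f"Non-authorship URLs: {group_change(non_authorship_urls):+.2f}%")

# Per-URL swings can be large even when the group-level change is small.
for url, (b, a) in authorship_urls.items():
    print(f"{url}: {pct_change(b, a):+.1f}%")
```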

Surprisingly, authorship URLs performed as well as non-authorship URLs in terms of traffic. Even though Moz was highly optimized for authors, traffic didn’t significantly change.

On an individual level, things looked much different. We actually observed big swings, with authorship URLs increasing or decreasing in traffic by as much as 45%. There is no clear pattern: some went up, some went down—exactly like any URL would over an extended time.

Authorship photos didn’t exist in a vacuum; each photo competed for attention with all the other photos on the page. Each search result is as unique as a fingerprint. What worked for one result didn’t work for another.

Consider what happens visually when multiple author photos exist in the same search result:

One hypothesis speculates that more photos have the effect of drawing eyes down the page. In the absence of rich snippets, search click-through rates might follow more closely studied models, which dictate that results closer to the top earn more clicks.

In the absence of author photos, it’s likely click-through rate expectations have once again become more standardized.

Video snippets: a complex tale

Shortly after Google removed author photos, they took aim at video snippets as well. On July 17th, MozCast reported a sharp decline in video thumbnails.

Most sites, Moz included, lost 100% of their video results. Other sites appeared to be “white-listed,” as reported by former Mozzer Casey Henry at Wistia.

A few of the sites Casey found where Google continues to show video thumbnails:

  • youtube.com
  • vimeo.com
  • vevo.com
  • ted.com
  • today.com
  • discovery.com

Aside from these “giants,” most webmasters, even very large publishers at the top of the industry, saw their video snippets vanish in search results.

How did this loss affect traffic for our URLs with embedded videos? Fortunately, here at Moz we have a large collection of ready-made video URLs we could easily study: our Whiteboard Friday videos, which we produce every, well, Friday.

To our surprise, most URLs actually saw more traffic.

On average, our Whiteboard Friday videos saw a 10% jump in organic traffic after losing video snippets.

A few other URLs with video saw dramatic increases:

The last example, the Learn SEO page, didn’t have an actual video on it, but a bug on Google’s side caused them to display an older video thumbnail. (Several folks we’ve talked to speculate that Google removed video snippets simply to clean up bugs in the system.)

We witnessed a significant increase in traffic after losing video snippets. How did this happen? 

Did Google change the way they rank and show video pages?

It turns out that many of our URLs that contained videos also saw a significant change in the number of search impressions at the exact same time.

According to Google, impressions for the majority of our video URLs shot up dramatically around July 14th.

Impressions for Whiteboard Friday URLs also rose 20% during this time. For Moz, most of the video URLs saw many more impressions, but for others, it appears rankings dropped.

While Moz saw video impressions rise, other publishers saw the opposite effect.

Casey Henry, our friend at video hosting company Wistia, reports seeing rankings drop for many video URLs that had thin or little content:

“…it’s only pages hosting video with thin content… the pages that only had video and a little bit of text went down.”

– Casey Henry

For a broader perspective, we talked to Marshall Simmonds, founder of Define Media Group, who monitors millions of daily video pageviews for large publishers.

Marshall found that, despite the fact that most of the sites they monitor lost video snippets, they observed no visible change in either traffic or pageviews across hundreds of millions of visits.

Define Media Group also recently released its 2014 Mid-Year Digital Traffic Report, which sheds fascinating light on current web traffic trends.

What does it all mean?

While we have anecdotal evidence of ranking and impression changes for video URLs on individual sites, on the grand scale across all Google search results these differences aren’t visible.

If you have video content, the evidence suggests it’s now worth more than ever to follow video SEO best practices (taken from video SEO expert Phil Nottingham); a rough markup example follows the list:

  • Use a crawlable player (all the major video hosting platforms use these today)
  • Surround the video with supporting information (caption files and transcripts work great)
  • Include schema.org video markup
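
On the schema.org point in particular, here is a loose illustration (my own sketch, not taken from Phil Nottingham’s guidance) of what VideoObject markup can look like, built as JSON-LD from Python. Every URL, title, date, and duration below is a placeholder, and JSON-LD is only one of the formats Google accepts for video markup.

```python
import json

# Hedged sketch: build schema.org VideoObject markup as JSON-LD.
# Every value below is a placeholder, not a real page or video.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example Whiteboard-style Video",
    "description": "Short summary of what the video covers.",
    "thumbnailUrl": "https://www.example.com/thumbs/example-video.jpg",
    "uploadDate": "2014-07-18",
    "duration": "PT8M30S",  # ISO 8601 duration: 8 minutes 30 seconds
    "contentUrl": "https://www.example.com/videos/example-video.mp4",
    "embedUrl": "https://www.example.com/embed/example-video",
}

# Emit the <script> tag you would place in the page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(video_markup, indent=2))
print("</script>")
```

The supporting information in the second bullet still matters: markup describes the video, but captions and transcripts give the page the crawlable text that thin video pages lack.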

SEO finds a way

For the past several years web marketers competed for image and video snippets, and it’s with a sense of sadness that they’ve been taken away.

The smart strategy follows the data, which suggest that more traditional click-through rate optimization techniques and strategies could now be more effective. This means strong titles, meta descriptions, rich snippets (those that remain), brand building and traditional ranking signals.

What happened to your site when Google removed author photos and video snippets? Let us know in the comments below.

Reblogged 4 years ago from feedproxy.google.com

The Month Google Shook the SERPs

Posted by Dr-Pete

As a group, we SEOs still tend to focus most of our attention on just one place – traditional, organic results. In the past two years, I’ve spent a lot of time studying these results and how they change over time. The more I experience the reality of SERPs in the wild, though, the more I’ve become interested in situations like this one (a search for “diabetes symptoms”)…

See the single blue link and half-snippet on the bottom-left? That’s the only thing about this above-the-fold page that most SEOs in 2014 would call “organic”. Of course, it’s easy to find fringe cases, but the deeper I dig into the feature landscape that surrounds and fundamentally alters SERPs, the more I find that the exceptions are inching gradually closer to the rule.

Monday, July 28th was my 44th birthday, and I think Google must have decided to celebrate by giving me extra work (hooray for job security?). In the month between June 28th and July 28th, there were four major shake-ups to the SERPs, all of them happening beyond traditional, organic results. This post is a recap of our data on each of those shake-ups.

Authorship photos disappear (June 28th)

On June 25th, Google’s John Mueller made a surprise announcement via Google+:

We had seen authorship shake-ups in the past, but the largest recent drop had measured around 15%. It was clear that Google was rethinking the prevalence of author photos and their impact on perceived quality, but most of us assumed this would be a process of small tweaks. Given Google’s push toward Google+ and its inherent tie-in with authorship, not a single SEO I know had predicted a complete loss of authorship photos.

Yet, over the next few days, culminating on the morning of June 28th, a total loss of authorship photos is exactly what happened:

While some authorship photos still appeared in personalized results, the profile photos completely disappeared from general results, after previously being present on about 21% of the SERPs that MozCast tracks. It’s important to note that the concept of authorship remains, and author bylines are still being shown (we track that at about 24%, as of this writing), but the overall visual impact was dramatic for many SERPs.

In-depth gets deeper (July 2nd)

Most SEOs still don’t pay much attention to Google’s “In-depth Articles,” but they’ve been slowly gaining SERP share. When we first started tracking them, they popped up on about 3.5% of the searches MozCast covers. This data seems to only get updated periodically, and the number had grown to roughly 6.0% by the end of June 2014. On the morning of July 2nd, I (and, seemingly, everyone else) missed a major change:

Overnight, the presence of in-depth articles jumped from 6.0% to 12.7%, more than doubling (a +112% increase, to be precise). Some examples of queries that gained in-depth articles include:

  • xbox 360
  • hotels
  • raspberry pi
  • samsung galaxy tab
  • job search
  • pilates
  • payday loans
  • apartments
  • car sales
  • web design

Here’s an example set of in-depth results for a term SEOs know all too well, “payday loans”:

The motivation for this change is unclear, and it comes even as Google continues to test designs with pared down in-depth results (almost all of their tests seem to take up less space than the current design). Doubling this feature hardly indicates a lack of confidence, though, and many competitive terms are now showing in-depth results.

Video looks more like radio (July 16th)

Just a couple of weeks after the authorship drop, we saw a smaller but still significant shake-up in video results, with about 28% of results MozCast tracks losing video thumbnails:

As you can see, the presence of thumbnails does vary day-to-day, but the two plateaus, before and after July 16th, are clear here. At this point, the new number seems to be holding.

Since our data doesn’t connect the video thumbnails to specific results, it’s tough to say if this change indicates a removal of thumbnails or a drop in rankings for video results overall. Considering how smaller drops in authorship signaled a much larger change down the road, I think this shift deserves more attention. It could be that Google is generally questioning the value and prevalence of rich snippets, especially when quality concerns come into play.

I originally hypothesized that this might not be a true loss, but could be a sign that some video snippets were switching to the new “mega-video” format (or video answer box, if you prefer). This does not appear to be the case, as the larger video format is still fairly uncommon, and the numbers don’t match up.

For reference, here’s a mega-video format (for the query “bartender”):

Mega-videos are appearing on such seemingly generic queries as “partition”, “headlights”, and “california king bed”. If you have the budget and really want to dominate the SERPs, try writing a pop song.

Pigeons attack local results (July 24th)

By now, many of you have heard of Google’s “Pigeon” update. The Pigeon update hit local SERPs hard and seems to have dramatically changed how Google determines and uses a searcher’s location. Local search is more than an algorithmic layer, though – it’s also a feature set. When Pigeon hit, we saw a sharp decline in local “pack” results (the groups of 2-7 pinned local results):

We initially reported that pack results dropped more than 60% after the Pigeon update. We are now convinced that this was a mistake (indicated by the “?” zone) – essentially, Pigeon changed localization so much that it broke the method we were using. We’ve found a new method that seems to match manually setting your location, and the numbers for July 29-30 are, to the best of my knowledge, accurate.

According to these new numbers, local pack results have fallen 23.4% (in our data set) after the Pigeon update. This is the exact same number Darren Shaw of WhiteSpark found, using a completely different data set and methodology. The perfect match between those two numbers is probably a bit of luck, but they suggest that we’re at least on the right track. While I over-reported the initial drop, and I apologize for any confusion that may have caused, the corrected reality still shows a substantial change in pack results.
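
For anyone who wants to run the same kind of check on their own keyword set, the measurement reduces to counting pack presence over a fixed list of tracked queries before and after the update. The sketch below is a loose illustration with invented queries, not MozCast’s actual methodology.

```python
# Hedged sketch: relative change in local-pack presence across a fixed
# set of tracked queries. All query data below is hypothetical.

tracked_queries = ["jobs", "cruises", "sofa", "mortgage", "lamps", "pilates"]

# Which queries showed a local pack before / after the update (hypothetical).
packs_before = {"jobs", "cruises", "sofa", "pilates"}
packs_after = {"cruises", "mortgage", "lamps"}

lost = sorted(packs_before - packs_after)
gained = sorted(packs_after - packs_before)

rate_before = len(packs_before) / len(tracked_queries)
rate_after = len(packs_after) / len(tracked_queries)
net_change = (rate_after - rate_before) / rate_before * 100.0

print(f"Pack presence before: {rate_before:.1%}")
print(f"Pack presence after:  {rate_after:.1%}")
print(f"Net change: {net_change:+.1f}%")  # net of losers and gainers
print("Lost packs:  ", lost)
print("Gained packs:", gained)
```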

It’s important to note that this 23.4% drop is a net change – among queries, there were both losers and winners. Here are 10 searches that lost pack results (and have been manually verified):

  • jobs
  • cars for sale
  • apartments
  • cruises
  • train tickets
  • sofa
  • wheels
  • liposuction
  • social security card
  • motorcycle helmets

A couple of important notes – first, some searches that lost packs only lost packs in certain regions. Second, Pigeon is a very recent update and may still be rolling out or being tweaked. This is only the state of the data as we know it today.

Here are 10 searches that gained pack results (in our data set):

  • skechers
  • mortgage
  • apartments for rent
  • web designer
  • long john silvers
  • lamps
  • mystic
  • make a wish foundation
  • va hospital
  • internet service

The search for “mystic” is an interesting example – no matter what your location (if you’re in the US), Google is showing a pack result for Mystic, CT. This pattern seems to be popping up across the Pigeon update. For example, a search for “California Pizza Kitchen” automatically targets California, regardless of your location (h/t Tony Verre), and a search for “Buffalo Wild Wings” sends you to Buffalo, NY (h/t Andrew Mitschke).

Of course, local search is complex, and it seems like Google is trying to do a lot in one update. The simple fact that a search for “apartments” lost pack results in our data, while “apartments for rent” gained them, shows that the Pigeon update isn’t based on a few simplistic rules.

Some local SEOs have commented that Pigeon seemed to increase the number of smaller packs (2-3 results). Looking at the data for pack size before and after Pigeon, this is what we’re seeing:

Both before and after Pigeon, there are no 1-packs, and 4-, 5-, and 6-packs are relatively rare. After Pigeon, the distribution of 2-packs is similar, but there is a notable jump in 3-packs and a corresponding decrease in 7-packs. The total number of 3-packs actually increased after the Pigeon update. While our data set (once we restrict it to just searches with pack results) is fairly small, this data does seem to match the observations of local SEOs.
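
For those tracking their own queries, the pack-size comparison is just a histogram over the packs observed before and after the update. Here is a minimal sketch using invented pack sizes rather than MozCast data:

```python
from collections import Counter

# Hedged sketch: distribution of local-pack sizes before vs. after an
# update, restricted to queries that show a pack. Sizes are hypothetical.
pack_sizes_before = [7, 7, 7, 3, 2, 7, 3, 2, 7, 5]
pack_sizes_after = [3, 3, 7, 3, 2, 3, 3, 2, 7, 4]

def distribution(sizes: list[int]) -> dict[int, float]:
    """Share of observed packs at each size (1-7)."""
    counts = Counter(sizes)
    total = len(sizes)
    return {size: counts.get(size, 0) / total for size in range(1, 8)}

before = distribution(pack_sizes_before)
after = distribution(pack_sizes_after)
for size in range(1, 8):
    print(f"{size}-pack: before {before[size]:6.1%}  after {after[size]:6.1%}")
```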

Sleep with one eye open

Ok, maybe that’s a bit melodramatic. All of the changes do go to show, though, that, if you’re laser-focused on ranking alone, you may be missing a lot. We as SEOs not only need to look beyond our own tunnel vision, we need to start paying more attention to post-ranking data, like CTR and search traffic. SERPs are getting richer and more dynamic, and Google can change the rules overnight.

Reblogged 4 years ago from feedproxy.google.com