Local SEO beyond the browser

Are you making the most of offline advertising to complement your local SEO efforts? Columnist Marcus Miller explains the benefits of connecting your digital and physical marketing.

The post Local SEO beyond the browser appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Reblogged 3 years ago from feeds.searchengineland.com

Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But the article keeps gaining popularity, and I think there’s this hopefulness from worlds outside of kind of the hardcore SEO world, people looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of the knowledge we’ve built up about Google. He mentions two sources: one is things that Google tells us, and the other is SEO experiments. Both of those are real. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank are broadly correlated, but it’s not one-to-one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five (and sort of a half) steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not-found pages. If your 404 pages all turn out to be 200 pages, you’ve made a big error there: Google could be crawling tons of 404 pages that it thinks are real pages, because you’ve returned a status code 200. Or you’ve used a 404 code when you should have used a 410, which means permanently removed, to get a page completely out of the indexes, as opposed to having Google revisit it and keep it in the index.
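As an aside, the crawl-and-index options mentioned for those thin or search-results pages each take only a line or two. A minimal sketch of the choices, not from the video itself; the example.com domain and /search/ path are placeholders:

```text
# robots.txt -- option 1: keep crawlers out of internal search-results pages
User-agent: *
Disallow: /search/

<!-- option 2, meta robots in each page's <head>:
     crawlable, but excluded from the index while its links are still followed -->
<meta name="robots" content="noindex, follow">

<!-- option 3, for a paginated series: declare the sequence with rel=prev/next -->
<link rel="prev" href="https://example.com/search/widgets/page/1/">
<link rel="next" href="https://example.com/search/widgets/page/3/">
```

Which one is right depends on whether you want the pages crawled at all, and whether they accumulate link equity worth consolidating.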

Downtime procedures. So there’s a specific 5xx code you can use for this, a 503, that tells search engines, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
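A quick way to catch soft 404s and mixed-up status codes like these is to script the diagnosis. This is a minimal sketch of the reasoning in the transcript, not tooling from the video; the function name and messages are made up for illustration:

```python
def classify_status(expected_gone: bool, status: int) -> str:
    """Diagnose a page's HTTP status given whether the page SHOULD be gone."""
    if expected_gone:
        if status == 200:
            return "soft 404: the page is gone but returns 200, so Google may keep it indexed"
        if status == 404:
            return "404: not found; Google will revisit before dropping it from the index"
        if status == 410:
            return "410: permanently removed; the fastest way out of the index"
    if status == 503:
        return "503: temporary downtime; also send a Retry-After header"
    if 500 <= status < 600:
        return f"{status}: server error; fix this before Google recrawls"
    return f"{status}: ok"

# A retired product page that still answers 200 is a classic soft 404:
print(classify_status(expected_gone=True, status=200))
```

In practice you would feed this from a crawl of your own URLs; the point is that the 200/404/410/503 distinctions are mechanical enough to check automatically.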

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics, connecting those up and trying to see: Why did we go up or down? Did we have fewer pages indexed or more, more pages getting traffic or fewer, more keywords sending traffic or fewer?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. Inanchor doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of time. Like, come on.

  • Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you that you have all sorts of errors, and you don’t know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, X-Robots-Tag, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or Mozscape, or Ahrefs or Majestic, or SEMrush, or the Alchemy API. Those APIs can do powerful things for your site. There are some powerful problems they can help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
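Several of the protocols in that list come down to a single tag in a page’s <head>. A hedged sketch with placeholder URLs, showing independent options rather than one page’s real configuration:

```html
<!-- rel=canonical: name the preferred URL for a set of duplicate pages -->
<link rel="canonical" href="https://example.com/widgets/" />

<!-- hreflang: declare language/region alternates for international targeting -->
<link rel="alternate" hreflang="en-us" href="https://example.com/widgets/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/widgets/" />
```

Knowing which tag applies where, and spotting when one is missing or contradictory, is exactly the kind of technical diagnosis the list above is about.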

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

UX, Content Quality, and SEO – Whiteboard Friday

Posted by EricEnge

Editor’s note: Today we’re featuring back-to-back episodes of Whiteboard Friday from our friends at Stone Temple Consulting. Make sure to also check out the first episode, “Becoming Better SEO Scientists” from Mark Traphagen.

User experience and the quality of your content have an incredibly broad impact on your SEO efforts. In this episode of Whiteboard Friday, Stone Temple’s Eric Enge shows you how paying attention to your users can benefit your position in the SERPs.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Hi, Mozzers. I’m Eric Enge, CEO of Stone Temple Consulting. Today I want to talk to you about one of the most underappreciated aspects of SEO, and that is the interaction between user experience, content quality, and your SEO rankings and traffic.

I’m going to take you through a little history first. You know, we all know about the Panda algorithm update that came out on February 23, 2011, and of course more recently we have the search quality update that came out on May 19, 2015. Our Panda friend had 27 different updates that we know of along the way. So a lot of stuff has gone on, but we need to realize that that is not where it all started.

The link algorithm from the very beginning was about search quality. Links allowed Google to have an algorithm that gave better results than the other search engines of their day, which were dependent on keywords. These things that I’ve just talked about, however, are still just the tip of the iceberg. Google goes a lot deeper than that, and I want to walk you through the different things that it does.

So consider for a moment, you have someone search on the phrase “men’s shoes” and they come to your website.

What is it that they want when they come to your website? Do they want sneakers, sandals, dress shoes? Well, those are sort of the obvious things that they might want. But you need to think a little bit more about what the user really wants to be able to know before they buy from you.

First of all, there has to be a way to buy. By the way, affiliate sites don’t have ways to buy. So the line of thinking I’m talking about might not work out so well for affiliate sites and works better for people who can actually sell the product directly. But in addition to a way to buy, they might want a privacy policy. They might want to see an About Us page. They might want to be able to see your phone number. These are all different kinds of things that users look for when they arrive on the pages of your site.

So as we think about this, what is it that we can do to do a better job with our websites? Well, first of all, lose the focus on keywords. Don’t get me wrong, keywords haven’t gone entirely away. But the days of overemphasizing one particular keyword over related phrases on a page are long gone, and you need a broader focus in how you approach things.

User experience is now a big deal. You really need to think about how users are interacting with your page and how that shows your overall page quality. Think about the percent satisfaction. If I send a hundred users to your page from my search engine, how many of those users are going to be happy with the content or the products or everything that they see with your page? You need to think through the big picture. So at the end of the day, this impacts the content on your page, to be sure, but more than that it impacts the design and the related items that you have on the page.

So let me just give you an example of that. I looked at one page recently that was for a flower site. It was a page about annuals on that site, and that page had no link to their perennials page. Well, okay, a fairly good percentage of people who arrive on a page about annuals are also going to want to have perennials as something they might consider buying. So that page was probably coming across as a poor user experience. So these related items concepts are incredibly important.

Then the links on your page are actually a way to get to some of those related items, and so those are really important as well. What are the related products that you link to?

Finally, really it impacts everything you do with your page design. You need to move past the old-fashioned way of thinking about SEO and into the era of: How am I doing at satisfying all the people who come to the pages of my site?

Thank you, Mozzers. Have a great day.

Video transcription by Speechpad.com


Reblogged 3 years ago from tracking.feedpress.it

Data Mining with Majestic

Majestic offers an incredible amount of data for us to use in our SEO efforts to increase rankings in the organic arena and position our brand. As the size of the website increases, so does the challenge of identifying the most appropriate dataset to answer a specific question management will formulate: Why has there been…

The post Data Mining with Majestic appeared first on Majestic Blog.

Reblogged 4 years ago from blog.majestic.com

How to Have a Successful Local SEO Campaign in 2015

Posted by Casey_Meraz

Another year in search has passed. It’s now 2015, and we have seen some major changes in local ranking factors since 2014, which I also expect to change greatly throughout 2015. For some, a new year means a fresh starting point; for others, it’s a time of reflection to analyze how successful the past year’s campaign has been. Whichever boat you’re in, make sure to sit down and read this guide.

In this guide we will cover how you can have a successful local SEO campaign in 2015 starting with the basics and getting down to five action items you should focus on now. This is not limited to Google My Business and also includes localized organic results. 

Now the question is where do you start?

Since Pigeon has now rolled out to the US, UK, Australia, and Canada, it’s important to make sure your strategies are in line with it no matter what part of the world you’re in. Your local SEO campaign in 2015 will be much more successful if you put more work into it. Don’t be fooled, though. More work by itself isn’t going to get you where you need to be. You need to work smarter toward the goals which are going to fuel your conversions.

For some industries that might mean more localized content, for others it may mean more social interaction in your local area. Whatever it ends up being, the root of it should be the same for most. You need to get more conversions for your website or your client’s website. So with this in mind let’s make sure we’re on the same page as far as our goals are concerned.

Things you need to know first

Focus on the right goals

Recently I had a conversation with a client who really wanted to nail down the point that he was not interested in traffic. He was interested in the conversions he could track. He was also interested to see how all of these content resource pieces I recommended would help. He was tired of the silly graphs from other agencies that showed great rankings on a variety of keywords when he was more interested to see which efforts brought him the most value. Instead, he wanted to see how his campaign was bringing him conversions or meaningful traffic. I really appreciated this statement and I felt like he really got it.

Still, however, far too often I have to talk to potential clients and explain to them why their sexy looking traffic reports aren’t actually helping them. You can have all of the traffic in the world but if it doesn’t meet one of your goals of conversions or education then it’s probably not helping. Even if you make the client happy with your snazzy reports for a few months, eventually they’re going to want to know their return on investment (ROI).

It’s 2015. If your clients aren’t tracking conversions properly, give them the help they need. Record their contacts in a CRM and track the source of each of these contacts. Track them all the way through the sales funnel. 

That’s a simple and basic marketing example, but as an SEO, your role has transformed. If you can show this type of actual value and develop a plan accordingly, you will be unstoppable.

Second, don’t get tunnel vision

You may wonder why I started a little more basic than normal in this post. The fact is that in this industry there is not a full formal training program that covers all aspects of what we do. 

We all come from different walks of life and experience which makes it easy for us to get tunnel vision. You probably opened this article with the idea of “How Can I Dominate My Google Local Rankings?” While we cover some actionable tips you should be using, you need to think outside of the box as well. Your website is not the only online property you need to be concerned about.

Mike Ramsey from Nifty Marketing put out a great study on measuring the click-through rates from the new local stack. In this study he measured the click-through rates of users conducting several different searches, like “Salt Lake City Hotel” in the example below. With so many different options, look where the users are clicking:

They’re really clicking all over the place! While it’s cool to be number one, it’s much better if you get clicks from your paid ad, organic result, local result, and barnacle SEO efforts (which we’ll talk about a little later). 

If you combine your conversion marketing data with your biggest priorities, you can put together a plan to tackle the most important areas for your industry. Don’t assume it’s a one-size-fits-all approach. 

Third, some spam still works. Don’t do it and rise above it.

There’s no doubt that some spammy tactics are still working. Google gets better every day, but you still see crap like the example below show up in the SERPs.

While it sucks to see that kind of stuff, remember that in time it disappears (just as it did before this article was published). If you take shortcuts, you’re going to get caught, and it’s not worth it for the client or the heartache on your side. Stay the course and do things the right way.

Now let’s get tactical and prepare for 2015

Now it’s time for some practical and tactical takeaways you can use to dominate your local search campaign in 2015.

Practical tip 1: start with an audit

Over the years, one of the best lessons I have learned is that it’s OK to say “I don’t know” when you don’t have the answer. Consulting with industry experts or people with more experience than you is not a bad thing and will likely only lead you to enhance your knowledge and gain a different perspective. It can be humbling, but the experience is amazing. It can open your mind.

Last year, I had the opportunity to work with over ten of the industry’s best minds and retained them for site audits on different matters. 

The perspective this gives is absolutely incredible and I believe it’s a great way to learn. Everyone in this industry has come from a different background and seen different things over the years. Combining that knowledge is invaluable to the success of your clients’ projects. Don’t be afraid to do it and learn from it. This is also a good idea if you feel like your project has reached a stalemate. Getting more advice, identifying potential problems, and having a fresh perspective will do wonders for your success.

As many of the experts have confirmed, ever since the Pigeon update, organic and local ranking factors have been more tied together than ever. Since they started going this direction in a big way, I would not expect it to stop. 

This means that you really do need to worry about things like site speed, content, penalties, mobile compatibility, site structure, and more. On a side note, guess what will happen to your organic results if you keep this as a top priority? They will flourish and you will thank me.

If you don’t have the budget or resources to get a third party opinion, you can also conduct an independent audit. 

Do it yourself local SEO audit resources:

Do it yourself organic SEO audit resources:

Alternatively, if you’re more in the organic boat, you should also check out this guide by Steve Webb on How To Perform The World’s Greatest SEO Audit.

Whatever your situation is, it’s worth the time to have this perspective yearly or even a couple times a year if possible.

Practical tip 2: consider behavioral signals and optimize accordingly

I remember having a conversation with Darren Shaw, the founder of Whitespark, at MozCon 2013 about his thoughts on user behavior affecting local results. At the time I didn’t do too much testing around it. However, just this year, Darren had a mind-blowing presentation at the Dallas State of Search where he threw in the behavioral signals curve ball. Phil Rozek also spoke about behavioral signals and provided a great slide deck with actionable items (included below).

We have always speculated on behavioral signals but between his tests and some of Rand’s IMEC Lab tests, I became more of a believer last year. Now, before we go too deep on this remember that your local campaign is NOT only focused on just your local pack results. If user behavior can have an impact on search results, we should definitely be optimizing for our users.


You can view Phil Rozek’s presentation below: 

Don’t just optimize for the engines, optimize for the humans. One day when Skynet is around this may not be an issue, but for now you need to do it.

So how can you optimize for behavioral signals?

There is a dark side and a light side path to this question. If you ask me I will always say follow the light side as it will be effective and you don’t have to worry about being penalized. That’s a serious issue and it’s unethical for you to put your clients in that position.

Local SEO: how to optimize for behavioral signals

Do you remember the click-through study we looked at a bit earlier from Nifty Marketing? Do you remember where the users clicked? If you look again or just analyze user and shopper behavior, you might notice that many of the results with the most reviews got clicks. We know that reviews are hard to get so here are two quick ways that I use and recommend to my clients:


1. Solicit your Gmail clients for reviews

If you have a list of happy Gmail clients, you can simply send them an email with a direct link to your Google My Business page. Just pull up your local page and copy its URL, which will look like the one below:

Once you have this URL, simply remove the /posts and replace it with /?hl=en&review=1, so it looks like this:

If your clients click on this link while logged in to Gmail, they will automatically be taken to the review page, which opens the box to leave a review, like the example below. It doesn’t get much simpler than that.
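If you’re sending this link to a long list of clients, the URL swap above is easy to script. Here’s a minimal sketch; the example URL and the /?hl=en&review=1 suffix follow the pattern described above, but Google may change the format at any time:

```python
def google_review_link(gmb_url: str) -> str:
    """Build a direct 'leave a review' link from a Google My Business
    page URL by swapping the /posts suffix for /?hl=en&review=1."""
    base = gmb_url.rstrip("/")
    if base.endswith("/posts"):
        base = base[: -len("/posts")]
    return base + "/?hl=en&review=1"

# Hypothetical example page URL:
print(google_review_link("https://plus.google.com/+ExamplePlumbing/posts"))
# https://plus.google.com/+ExamplePlumbing/?hl=en&review=1
```

From there, a mail merge with each client’s name and this link covers the whole list in one pass.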

2. Check out a service like Mike Blumenthal’s Get Five Stars for reviews

I recently used this with a client and got a lot of great feedback and several reviews.

Remember that these reviews will also help on third-party sites and can help your Google My Business ranking positions as well as click-through rates. You can check out Get Five Stars here.

Another way outside of getting reviews is to optimize the appearance of your Google My Business Page. 


3. Optimize your local photos

Your Google My Business page includes photos. Don’t use generic photos. Use high quality photos so when the users hover over your listing they get an accurate representation of what they’re looking for. Doing this will increase your click-through rate. 

Organic SEO: Optimize for Behavioral Signals

Optimization for click-through rates on organic results typically focuses on three areas. While you’re likely very familiar with the first two, you should not ignore them.


1. Title tags: optimize them for the user and engine

Optimize your meta title tags to increase click-through rates. Each page should have a unique title tag that helps the viewer with their query. The (fabricated) example below shows what NOT to do.


2. Meta descriptions: optimize them for the user

Optimize your meta description to get the user to click on the search result. If you skip this just because Google may or may not use it, you’re missing out on opportunities and clicks.
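If you want to audit titles and descriptions at scale, a rough length check is a sensible start. A sketch is below; the character limits are common rules of thumb, not official Google numbers, and they change over time:

```python
# Rough SERP snippet-length check for titles and meta descriptions.
# These limits are approximate rules of thumb, not official figures.
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 155

def snippet_warnings(title: str, description: str) -> list:
    """Return a list of likely snippet problems for one page."""
    warnings = []
    if not title:
        warnings.append("missing title tag")
    elif len(title) > TITLE_LIMIT:
        warnings.append("title may be truncated in the SERP")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append("description may be truncated in the SERP")
    return warnings

print(snippet_warnings(
    "LA Personal Injury Attorney | Example Firm",
    "Free consultations from an experienced Los Angeles personal injury team.",
))
# []
```

Run it over a crawl export and you get a quick punch list of pages whose snippets need rewriting.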


3. Review Schema markup: add this to appropriate pages

Review Schema markup is still a very overlooked opportunity. As we discussed above in the local section, if you don’t have reviews coded in Schema, you could be missing out on getting the orange stars in organic results.

Practical tip 3: don’t ignore barnacle SEO

I firmly believe that most people are still not taking advantage of barnacle SEO, and I’m a big fan. When I first heard Will Scott introduce this term at Pubcon, I thought it was spot on. According to Will Scott’s website Search Influence, barnacle SEO is “attaching oneself to a large fixed object and waiting for the customers to float by in the current.” In a nutshell, if you’re trying to rank on page one of Google, you will find others there that you may be able to attach to. If Yelp results come up for a lot of your search terms, you might identify that as an opportunity. There are three main ways you can take advantage of this.


1. You can try to have the most visible profile on that third party page

If Yelp is ranking for LA Personal Injury Attorneys, it would suit you to figure out how the top users are showing up there. Maybe your customers are headed there and then doing some shopping and making a selection. Or maybe they’re using it for a research platform and then will visit your website. If your profile looks great and shows up high on the list, you just gave yourself a better chance at getting a conversion.


2. You can try to get your page to rank

Hey, just because you don’t own Yelp.com or whatever similar site you’ve found, doesn’t mean you shouldn’t put in the effort to have it rank. If Google is already showing you that they trust a third party site by ranking it, you can use similar organic ranking techniques that you would use on your own site to make your profile page stronger. Over time you might add this to your bio on interviews or other websites to earn links. If you increase the visibility of your profile on search engines and they see your website on the same page you might increase conversions.


3. You can help your Google My Business rankings

If the site you’re using passes link juice and you earn links to the third party profile page, you will start to see some strong results. Links are a big factor in local since Pigeon this year and it’s an opportunity that should not be missed.


So how can you use this advice?

Start by finding a list of potential barnacle SEO partners for your industry. As an example, I did a search for “Personal Injury Attorneys” in Los Angeles. In addition to the law firms that showed up in the results on the first page, I also identified four additional places I may be able to show up on.

  1. Yelp
  2. Thumbtack
  3. Avvo
  4. Wikipedia

If you were an attorney, it would be worth your while to explore these and see if any make sense for you to contribute to.

Practical tip 4: earn some good links

Most people get too carried away with link building. I know because I used to do it. The key with link building is to change your approach and understand that it’s always better to get a few high-quality links than hundreds or thousands of low-quality links.

For example, a link like this one that one of our clients earned is what I’m talking about. 

If you want to increase your local rankings you can do so by earning these links to your associated Google My Business landing page.

Do you know the URL you entered in your Google My Business page when you set it up? That’s the one I’m talking about. In most cases this will be linked to either a local landing page for that location or the home page. It’s essential to your success that you earn solid links to this page.


Simple resources for link building

Practical tip 5: have consistent citations and remove duplicates

Identifying and correcting incorrect or duplicate citations has been getting easier and easier over the years. Even if you don’t want to pay someone to do it, you can sign up for some great do-it-yourself tools. Your goal with any citation cleanup program is this:

  1. Ensure there are no duplicate citations
  2. Ensure there are no incorrect citations with wrong phone numbers, old addresses, etc. 

You can ignore small differences and inconsistencies like St vs. Street. I believe the importance of citations has been greatly reduced over the past year. At the same time, you still want to be the least imperfect and provide your customers with accurate information if they’re looking on third party websites.  
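A citation-comparison script can encode exactly that tolerance. Below is a sketch that treats “St” and “Street” (and different phone formatting) as the same citation; the abbreviation table is illustrative, not exhaustive:

```python
import re

# Common address abbreviations to expand before comparing citations.
# Extend this table for your own market; it is not exhaustive.
ABBREVIATIONS = {"st": "street", "ave": "avenue", "ste": "suite", "rd": "road"}

def normalize_citation(name: str, address: str, phone: str) -> tuple:
    """Reduce a citation to a comparable key so trivial differences
    (casing, 'St' vs. 'Street', phone punctuation) don't register
    as mismatches."""
    words = re.findall(r"[a-z0-9]+", address.lower())
    words = [ABBREVIATIONS.get(w, w) for w in words]
    digits = re.sub(r"\D", "", phone)  # keep phone digits only
    return (name.lower().strip(), " ".join(words), digits)

a = normalize_citation("Example Firm", "123 Main St", "(555) 123-4567")
b = normalize_citation("example firm", "123 Main Street", "555-123-4567")
print(a == b)  # True
```

Anything that still differs after normalization is a real inconsistency worth fixing or a duplicate worth removing.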

Let’s do good things in 2015

2014 was a tough year in search altogether. We had ups, like Penguin refreshes, and downs, like the removal of authorship. I’m guessing 2015 will be no different. Staying on the roller coaster and keeping with the idea of having the “least imperfect” site is the best way to ring in the new year and march forward. If you had a tough year in local search, keep your head up, fix any existing issues, and sprint through this year by making positive changes to your site.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from moz.com

10 Predictions for the Marketing World in 2015

Posted by randfish

The beginning of the year marks the traditional week for bloggers to prognosticate about the 12 months ahead, and over the last decade I’ve made a tradition of joining in this festive custom to predict the big trends in SEO and web marketing. However, I divine the future by a strict code: I’m only allowed to make predictions IF my predictions from last year were at least moderately accurate (otherwise, why should you listen to me?). So, before I begin my crystal-ball-gazing, let’s have a look at how I did for 2014.

Yes, we’ll get to that, but not until you prove you’re a real Wizard, mustache-man.

You can find my post from January 5th of last year here, but I won’t force you to read through it. Here’s how I do grading:

  • Spot On (+2) – when a prediction hits the nail on the head and the primary criteria are fulfilled
  • Partially Accurate (+1) – predictions that are in the area, but are somewhat different than reality
  • Not Completely Wrong (-1) – those that landed near the truth, but couldn’t be called “correct” in any real sense
  • Off the Mark (-2) – guesses which didn’t come close

If the score is positive, prepare for more predictions, and if it’s negative, I’m clearly losing the pulse of the industry. Let’s tally up the numbers.

In 2014, I made 6 predictions:

#1: Twitter will go Facebook’s route and create insights-style pages for at least some non-advertising accounts

Grade: +2

Twitter rolled out Twitter analytics for all users this year (starting in July for some accounts, and then in August for everyone), and while it’s not nearly as full-featured as Facebook’s “Insights” pages, it’s definitely in line with the spirit of this prediction.

#2: We will see Google test search results with no external, organic listings

Grade: -2

I’m very happy to be wrong about this one. To my knowledge, Google has yet to go this direction and completely eliminate external-pointing links on search results pages. Let’s hope they never do.

That said, there are plenty of SERPs where Google is taking more and more of the traffic away from everyone but themselves, e.g.:

I think many SERPs that have basic, obvious functions like “timer” are going to be less and less valuable as traffic sources over time.

#3: Google will publicly acknowledge algorithmic updates targeting both guest posting and embeddable infographics/badges as manipulative linking practices

Grade: -1

Google most certainly did release an update (possibly several) targeted at guest posts, but they didn’t publicly talk about anything specifically algorithmic targeting embedded content/badges. It’s very possible this was included in the rolling Penguin updates, but the prediction said “publicly acknowledge,” so I’m giving myself a -1.

#4: One of these 5 marketing automation companies will be purchased in the 9-10 figure $ range: Hubspot, Marketo, Act-On, Silverpop, or Sailthru

Grade: +2

Silverpop was purchased by IBM in April of 2014. While a price wasn’t revealed, the “sources” quoted by the media estimated the deal in the ~$270mm range. I’m actually surprised there wasn’t another sale, but this one was spot-on, so it gets the full +2.

#5: Resumes listing “content marketing” will grow faster than either SEO or “social media marketing”

Grade: +1

As a percentage, this certainly appears to be the case. Here are some stats:

  • US profiles with “content marketing”
    • June 2013: 30,145
    • January 2015: 68,580
    • Growth: 227.5%
  • US profiles with “SEO”
    • June 2013: 364,119
    • January 2015: 596,050
    • Growth: 163.7%
  • US profiles with “social media marketing”
    • June 2013: 938,951
    • January 2015: 1,990,677
    • Growth: 212%

Granted, content marketing appears on far fewer profiles than SEO or social media marketing, but it has seen greater growth. I’m only giving myself a +1 rather than a +2 because, while the prediction was mathematically correct, the numbers for SEO and social still dwarf content marketing as a term. In fact, in LinkedIn’s annual year-end report of which skills got people hired the most, SEO was #5! Clearly, the term and the skillset continue to endure and be in high demand.

#6: There will be more traffic sent by Pinterest than Twitter in Q4 2014 (in the US)

Grade: +1

This is probably accurate, since Pinterest appears to have grown faster in 2014 than Twitter by a good amount, AND this was already true in most of 2014 according to SharedCount (though I’m not totally sold on the methodology of coverage for their numbers). However, we won’t know the truth for a few months to come, so I’d be presumptuous in giving a full +2. I am a bit surprised that Pinterest continues to grow at such a rapid pace — certainly a very impressive feat for an established social network.


SOURCE: Global Web Index

With Twitter’s expected moves into embedded video, it’s my guess that we’ll continue to see a lot more Twitter engagement and activity on Twitter itself, and referring traffic outward won’t be as considerable a focus. Pinterest seems to be one of the only social networks that continues that push (as Facebook, Instagram, LinkedIn, and YouTube all seem to be pursuing a “keep them here” strategy).

——————————–

Final Score: +3
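As a quick sanity check, the six grades above tally to the same total under the scoring scheme laid out at the start:

```python
# Scoring scheme from the grading rules above.
SCORES = {"spot on": 2, "partially accurate": 1,
          "not completely wrong": -1, "off the mark": -2}

# The six 2014 prediction grades, in order (#1 through #6).
grades_2014 = ["spot on", "off the mark", "not completely wrong",
               "spot on", "partially accurate", "partially accurate"]

final_score = sum(SCORES[g] for g in grades_2014)
print(final_score)  # 3
```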

That positive number means I’ve passed my bar and can make another set of predictions for 2015. I’m going to be a little more aggressive this year, even though it risks ruining my sterling record, simply because I think it’s more exciting 🙂

Thus, here are my 10 predictions for what the marketing world will bring us in 2015:

#1: We’ll see the first major not-for-profit University in the US offer a degree in Internet Marketing, including classes on SEO.

There are already some private, for-profit offerings from places like Fullsail and Univ. of Phoenix, but I don’t know that these pedigrees carry much weight. Seeing a Stanford, a Wharton, or a University of Washington offer undergraduate or MBA programs in our field would be a boon to those seeking options and an equal boon to the universities.

The biggest reason I think we’re ripe for this in 2015 is the LinkedIn top 25 job skills data showing the immense value of SEO (#5) and digital/online marketing (#16) in a profile when seeking a new job. That should (hopefully) be a direct barometer for what colleges seek to include in their repertoire.

#2: Google will continue the trend of providing instant answers in search results with more interactive tools.

Google has been doing instant answers for a long time, but in addition to queries with immediate and direct responses, they’ve also undercut a number of online tool vendors by building their own versions directly into the SERPs, as they do currently for queries like “timer” and “calculator.”

I predict in 2015, we’ll see more partnerships like what’s provided with OpenTable and the ability to book reservations directly from the SERPs, possibly with companies like Uber, Flixster (they really need to get back to a better instant answer for movies+city), Zillow, or others that have unique data that could be surfaced directly.

#3: 2015 will be the year Facebook begins including some form of web content (not on Facebook’s site) in their search functionality.

Facebook severed their search relationship with Bing in 2014, and I’m going to make a very risky prediction that in 2015, we’ll see Facebook’s new search emerge and use some form of non-Facebook web data. Whether they’ll actually build their own crawler or merely license certain data from outside their properties is another matter, but I think Facebook’s shown an interest in getting more sophisticated with their ad offerings, and any form of search data/history about their users would provide a powerful addition to what they can do today.

#4: Google’s indexation of Twitter will grow dramatically, and a significantly higher percentage of tweets, hashtags, and profiles will be indexed by the year’s end.

Twitter has been putting more muscle behind their indexation and SEO efforts, and I’ve seen more and more Twitter URLs creeping into the search results over the last 6 months. I think that trend continues, and in 2015, we see Twitter.com enter the top 5-6 “big domains” in Mozcast.

#5: The EU will take additional regulatory action against Google that will create new, substantive changes to the search results for European searchers.

In 2014, we saw the EU enforce the “right to be forgotten” and settle some antitrust issues that require Google to edit what it displays in the SERPs. I don’t think the EU is done with Google. As the press has noted, there are plenty of calls in the European Parliament to break up the company, and while I think the EU will stop short of that measure, I believe we’ll see additional regulatory action that affects search results.

On a personal opinion note, I would add that while I’m not thrilled with how the EU has gone about their regulation of Google, I am impressed by their ability to do so. In the US, with Google becoming the second largest lobbying spender in the country and a masterful influencer of politicians, I think it’s extremely unlikely that they suffer any antitrust or regulatory action in their home country — not because they haven’t engaged in monopolistic behavior, but because they were smart enough to spend money to influence elected officials before that happened (unlike Microsoft, who, in the 1990s, assumed they wouldn’t become a target).

Thus, if there is to be any hedge to Google’s power in search, it will probably come from the EU and the EU alone. There’s no competitor with the teeth or market share to have an impact (at least outside of China, Russia, and South Korea), and no other government is likely to take them on.

#6: Mobile search, mobile devices, SSL/HTTPS referrals, and apps will combine to make traffic source data increasingly hard to come by.

I’ll estimate that by year’s end, many major publishers will see 40%+ of their traffic coming from “direct” even though most of that is search and social referrers that fail to pass the proper referral string. Hopefully, we’ll be able to verify that through folks like Define Media Group, whose data sharing this year has made them one of the best allies marketers have in understanding the landscape of web traffic patterns.

BTW – I’d already estimate that 30-50% of all “direct” traffic is, in fact, search or social traffic that hasn’t been properly attributed. This is a huge challenge for web marketers — maybe one of the greatest challenges we face, because saying “I brought in a lot more traffic, I just can’t prove it or measure it,” isn’t going to get you nearly the buy-in, raises, or respect that your paid-traffic compatriots can earn by having every last visit they drive perfectly attributed.

#7: The content advertising/recommendation platforms will continue to consolidate, and either Taboola or Outbrain will be acquired or do some heavy acquiring themselves.

We just witnessed the surprising shutdown of nRelate, which I suspect had more to do with IAC politics than with the performance and potential of the company. But given that less than 2% of the web’s largest sites use content recommendation/promotion services, and yet both Outbrain and Taboola are expected to have pulled in north of $200m in 2014, this is a massive area for future growth.

Yahoo!, Facebook, and Google are all potential acquirers here, and I could even see AOL (who already own Gravity) or Buzzfeed making a play. Likewise, there’s a slew of smaller/other players that Taboola or Outbrain themselves could acquire: Zemanta, Adblade, Zegnet, Nativo, Disqus, Gravity, etc. It’s a marketplace as ripe for acquisition as it is for growth.

#8: Promoted pins will make Pinterest an emerging juggernaut in the social media and social advertising world, particularly for e-commerce.

I’d estimate we’ll see figures north of $50m spent on promoted pins in 2015. This is coming after Pinterest only just opened their ad platform beyond a beta group this January. But, thanks to high engagement, lots of traffic, and a consumer base that B2C marketers absolutely love and often struggle to reach, I think Pinterest is going to have a big ad opportunity on their hands.

Note the promoted pin from Mad Hippie on the right

(apologies for very unappetizing recipes featured around it)

#9: Foursquare (and/or Swarm) will be bought, merge with someone, or shut down in 2015 (probably one of the first two).

I used to love Foursquare. I used the service multiple times every day, tracked where I went with it, ran into friends in foreign cities thanks to its notifications, and even used it to see where to go sometimes (in Brazil, for example, I found Foursquare’s business location data far superior to Google Maps’). Then came the split from Swarm. Most of my friends who were using Foursquare stopped, and the few who continued did so less frequently. Swarm itself tried to compete with Yelp, but it looks like neither is doing well in the app rankings these days.

I feel a lot of empathy for Dennis and the Foursquare team. I can totally understand the appeal, from a development and product perspective, of splitting up the two apps to let each concentrate on what it’s best at, and not dilute a single product with multiple primary use cases. Heck, we’re trying to learn that lesson at Moz and refocus our products back on SEO, so I’m hardly one to criticize. That said, I think there’s trouble brewing for the company and probably some pressure to sell while their location and check-in data, which is still hugely valuable, is robust enough and unique enough to command a high price.

#10: Amazon will not take considerable search share from Google, nor will mobile search harm Google’s ad revenue substantively.

The “Google’s-in-trouble” pundits are mostly talking about two trends that could hurt Google’s revenue in the year ahead. First, mobile searchers being less valuable to Google because they don’t click on ads as often and advertisers won’t pay as much for them. And, second, Amazon becoming the destination for direct, commercial queries ahead of Google.

In 2015, I don’t see either of these taking a toll on Google. I believe most of Amazon’s impact as a direct navigation destination for e-commerce shoppers has already taken place and while Google would love to get those searchers back, that’s already a lost battle (to the extent it was lost). I also don’t think mobile is a big concern for Google — in fact, I think they’re pivoting it into an opportunity, and taking advantage of their ability to connect mobile to desktop through Google+/Android/Chrome. Desktop search may have flatter growth, and it may even decline 5-10% before reaching a state of equilibrium, but mobile is growing at such a huge clip that Google has plenty of time and even plentier eyeballs and clicks to figure out how to drive more revenue per searcher.


Reblogged 4 years ago from moz.com

Outbound Links Help SEO Rankings

Learn SEO Class http://www.learnseoclass.com Just a quick update on my seo efforts. I have been link building for my website learnseoclass.com and was curiou…

Reblogged 4 years ago from www.youtube.com