Using Term Frequency Analysis to Measure Your Content Quality

Posted by Eric Enge

It’s time to look at your content differently—time to start understanding just how good it really is. I am not simply talking about titles, keyword usage, and meta descriptions. I am talking about the entire page experience. In today’s post, I am going to introduce the general concept of content quality analysis, why it should matter to you, and how to use term frequency (TF) analysis to gather ideas on how to improve your content.

TF analysis is usually combined with inverse document frequency analysis (collectively TF-IDF analysis). TF-IDF analysis has been a staple concept of information retrieval science for a long time. You can read more about TF-IDF and other search science concepts in Cyrus Shepard’s excellent article here.

For purposes of today’s post, I am going to show you how you can use TF analysis to get clues as to what Google is valuing in the content of sites that currently outrank you. But first, let’s get oriented.

Conceptualizing page quality

Start by asking yourself if your page provides a quality experience to people who visit it. For example, if a search engine sends 100 people to your page, how many of them will be happy? Seventy percent? Thirty percent? Less? What if your competitor’s page gets a higher percentage of happy users than yours does? Does that feel like an “uh-oh”?

Let’s think about this with a specific example in mind. What if you ran a golf club site, and 100 people came to your page after searching on a phrase like “golf clubs”? What are the kinds of things they may be looking for?

Here are some things they might want:

  1. A way to buy golf clubs on your site (they would need to see a shopping cart of some sort).
  2. The ability to select specific brands, perhaps by links to other pages about those brands of golf clubs.
  3. Information on how to pick the club that is best for them.
  4. The ability to select specific types of clubs (drivers, putters, irons, etc.). Again, this may be via links to other pages.
  5. A site search box.
  6. Pricing info.
  7. Info on shipping costs.
  8. Expert analysis comparing different golf club brands.
  9. End user reviews of your company so they can determine if they want to do business with you.
  10. How your return policy works.
  11. How they can file a complaint.
  12. Information about your company. Perhaps an “about us” page.
  13. A link to a privacy policy page.
  14. Whether or not you have been “in the news” recently.
  15. Trust symbols that show that you are a reputable organization.
  16. A way to access pages to buy different products, such as golf balls or tees.
  17. Information about specific golf courses.
  18. Tips on how to improve their golf game.

This is really only a partial list, and the specifics for your site can certainly vary from what I laid out above for any number of reasons. So how do you figure out what it is that people really want? You could pull in data from a number of sources. For example, data from your site search box can be invaluable. You can do user testing on your site. You can conduct surveys. These are all good sources of data.

You can also look at your analytics data to see what pages get visited the most. Just be careful how you use that data. For example, if most of your traffic is from search, this data will be biased by incoming search traffic, and hence what Google chooses to rank. In addition, you may only have a small percentage of the visitors to your site going to your privacy policy, but chances are good that there are significantly more users than that who notice whether or not you have a privacy policy. Many of these will be satisfied just to see that you have one and won’t actually go check it out.

Whatever you do, it’s worth using many of these methods to determine what users want from the pages of your site and then using the resulting information to improve your overall site experience.

Is Google using this type of info as a ranking factor?

At some level, they clearly are. Google and Bing have evolved far beyond the initial TF-IDF concepts, but we can still use those concepts to better understand our own content.

The first major indication we had that Google was performing content quality analysis was the release of the Panda algorithm in February of 2011. More recently, we know that on April 21 Google will release an algorithm that makes the mobile-friendliness of a website a ranking factor. Pure and simple, this algo is about the user experience with a page.

Exactly how Google is performing these measurements is not known, but what we do know is their intent. They want to make their search engine look good, largely because it helps them make more money. Sending users to pages that make them happy will do that. Google has every incentive to improve the quality of their search results in as many ways as they can.

Ultimately, we don’t actually know what Google is measuring and using. It may be that the only SEO impact of providing pages that satisfy a very high percentage of users is an indirect one. That is, so many people like your site that it gets written about more, linked to more, shared widely on social media, and engaged with more deeply; Google then sees those other signals that it uses as ranking factors, and that is why your rankings improve.

But do I care if the impact is a direct one or an indirect one? Well, NO.

Using TF analysis to evaluate your page

TF-IDF analysis is more about relevance than content quality, but we can still use various precepts from it to help us understand our own content quality. One way to do this is to compare the results of a TF analysis of all the keywords on your page with those pages that currently outrank you in the search results. In this section, I am going to outline the basic concepts for how you can do this. In the next section I will show you a process that you can use with publicly available tools and a spreadsheet.

The simplest form of TF analysis is to count the number of uses of each keyword on a page. However, the problem with that is that a page using a keyword 10 times will be seen as 10 times more valuable than a page that uses a keyword only once. For that reason, we dampen the calculations. I have seen two methods for doing this, as follows:

[Image: term frequency calculation]

The first method relies on dividing the number of repetitions of a keyword by the count for the most popular word on the entire page. Basically, what this does is eliminate the inherent advantage that longer documents might otherwise have over shorter ones. The second method dampens the total impact in a different way, by taking the log base 10 for the actual keyword count. Both of these achieve the effect of still valuing incremental uses of a keyword, but dampening it substantially. I prefer to use method 1, but you can use either method for our purposes here.
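If you want to experiment with these calculations directly, here is a minimal Python sketch of both dampening methods as described above (the function names and the tokenization are my own, not from the article):

```python
import math
import re
from collections import Counter

def keyword_counts(text):
    """Raw count of each word on the page."""
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))

def tf_method_1(counts):
    """Method 1: divide each keyword's count by the count of the
    most frequently used word on the entire page."""
    max_count = max(counts.values())
    return {word: n / max_count for word, n in counts.items()}

def tf_method_2(counts):
    """Method 2: dampen each raw count by taking log base 10.
    Note that log10(1) = 0, so a common variant is 1 + log10(n)."""
    return {word: math.log10(n) for word, n in counts.items()}
```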

Once you have the TF calculated for every different keyword found on your page, you can then start to do the same analysis for pages that outrank you for a given search term. If you were to do this for five competing pages, the result might look something like this:

[Image: term frequency spreadsheet]

I will show you how to set up the spreadsheet later, but for now, let’s do the fun part, which is to figure out how to analyze the results. Here are some of the things to look for:

  1. Are there any highly related words that all or most of your competitors are using that you don’t use at all?
  2. Are there any such words that you use significantly less, on average, than your competitors?
  3. Also look for words that you use significantly more than competitors.

You can then tag these words for further analysis. Once you are done, your spreadsheet may now look like this:

[Image: second stage term frequency analysis spreadsheet]

In order to make this fit into the screenshot above and keep it legible, I eliminated some columns you saw in my first spreadsheet. However, I did a sample analysis for the movie “Woman in Gold”. You can see the full spreadsheet of calculations here. Note that we used an automated approach to marking some items as “Low Ratio,” “High Ratio,” or “All Competitors Have, Client Does Not.”
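The automated marking can be reproduced in a few lines. Here is a sketch that flags against the average competitor TF; the 0.5x and 2x thresholds are illustrative choices of mine, not the ones used in the linked spreadsheet:

```python
def flag_keyword(client_tf, competitor_tfs, low=0.5, high=2.0):
    """Compare the client page's TF for one keyword against the
    competitors' TF values and return a flag string, or None."""
    if client_tf == 0 and all(tf > 0 for tf in competitor_tfs):
        return "All Competitors Have, Client Does Not"
    avg = sum(competitor_tfs) / len(competitor_tfs)
    if avg > 0 and client_tf < low * avg:
        return "Low Ratio"
    if avg > 0 and client_tf > high * avg:
        return "High Ratio"
    return None
```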

None of these flags by themselves have meaning, so you now need to put all of this into context. In our example, the following words probably have no significance at all: “get”, “you”, “top”, “see”, “we”, “all”, “but”, and other words of this type. These are just very basic English language words.

But, we can see other things of note relating to the target page (a.k.a. the client page):

  1. It’s missing any mention of actor Ryan Reynolds.
  2. It’s missing any mention of actor Helen Mirren.
  3. The page has no reviews.
  4. Words like “family” and “story” are not mentioned.
  5. “Austrian” and “Maria Altmann” are not used at all.
  6. The phrase “woman in gold” and the words “billing” and “info” are used proportionally more than they are on the other pages.

Note that the last item is only visible if you open the spreadsheet. The issues above could well be significant: the lead actors, reviews, and similar details are indications that a page has in-depth content. We see that competing pages that rank have details of the story, so that’s an indication that this is what Google (and users) are looking for. The fact that the main key phrase, and the word “billing”, are used to a proportionally high degree also makes the page seem a bit spammy.

In fact, if you look at the information closely, you can see that the target page is quite thin in overall content. So much so that it almost looks like a doorway page. Indeed, it looks like it was put together by the movie studio itself, just not very well, as it presents little in the way of a home page experience that would cause it to rank for the name of the movie!

In the many different times I have done an analysis using these methods, I’ve been able to make many different types of observations about pages. A few of the more interesting ones include:

  1. A page that had no privacy policy, yet was taking personally identifiable info from users.
  2. A major lack of important synonyms that would indicate a real depth of available content.
  3. Comparatively low Domain Authority competitors ranking with in-depth content.

These types of observations are interesting and valuable, but it’s important to stress that you shouldn’t be overly mechanical about this. The value in this type of analysis is that it gives you a technical way to compare the content on your page with that of your competitors. This type of analysis should be used in combination with other methods that you use for evaluating that same page. I’ll address this some more in the summary section below.

How do you execute this for yourself?

The full spreadsheet contains all the formulas, so all you need to do is link in the keyword count data. I have tried this with two different keyword density tools: the one from Searchmetrics, and this one from motoricerca.info.

I am not endorsing these tools, and I have no financial interest in either one—they just seemed to work fairly well for the process I outlined above. To provide the data in the right format, please do the following:

  1. Run all the URLs you are testing through the keyword density tool.
  2. Copy and paste all the one-word, two-word, and three-word results into a tab on the spreadsheet.
  3. Sort them all so you get total word counts aligned by position as I have shown in the linked spreadsheet (or script this step; see the sketch after this list).
  4. Set up the formulas as I did in the demo spreadsheet (you can just use the demo spreadsheet).
  5. Then do your analysis!
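If you would rather script step 3 than sort by hand, here is a sketch that aligns counts by keyword across pages; it assumes you have already turned each tool run into a simple {keyword: count} mapping:

```python
def align_counts(pages):
    """pages: dict mapping each URL to its {keyword: count} data.
    Returns the URL order plus rows of (keyword, counts), zero-filled,
    so every keyword lines up by position across all pages."""
    urls = list(pages)
    all_keywords = sorted(set().union(*(set(p) for p in pages.values())))
    rows = [(kw, [pages[u].get(kw, 0) for u in urls]) for kw in all_keywords]
    return urls, rows
```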

This may sound a bit tedious (and it is), but it has worked very well for us at STC.

Summary

You can also use usability groups and a number of other methods to figure out what users are really looking for on your site. What TF analysis adds is a look at what Google has chosen to rank the highest in its search results. Don’t treat this as some sort of magic formula where you mechanically tweak the content to get better metrics in this analysis.

Instead, use this as a method for slicing into your content to better see it the way a machine might see it. It can yield some surprising (and wonderful) insights!


The Most Important Link Penalty Removal Tool: Your Mindset

Posted by Eric Enge

Let’s face it. Getting slapped by a manual link penalty, or by the Penguin algorithm, really stinks. Once this has happened to you, your business is in a world of hurt. Worse still is the fact that you can’t get clear information from Google on which of your links are the bad ones. In today’s post, I am going to focus on the number one reason why people fail to get out from under these types of problems, and how to improve your chances of success.

The mindset

Success begins, continues, and ends with the right mindset. A large percentage of people I see who go through a link cleanup process are not aggressive enough about cleaning up their links. They worry about preserving some of that hard-won link juice they obtained over the years.

You have to start by understanding what a link cleanup process looks like, and just how long it can take. Some of the people I have spoken with have gone through a process like this one:

[Image: link removal timeline]

In this fictitious timeline example, we see someone who spends four months working on trying to recover, and at the end of it all, they have not been successful. A lot of time and money have been spent, and they have nothing to show for it. Then, the people at Google get frustrated and send them a message that basically tells them they are not getting it. At this point, they have no idea when they will be able to recover. The result is that the complete process might end up taking six months or more.

In contrast, imagine someone who is far more aggressive in removing and disavowing links. They are so aggressive that 20 percent of the links they cut out are actually ones that Google has not currently judged as being bad. They also start on March 9, and by April 30, the penalty has been lifted on their site.

Now they can begin rebuilding their business, five or more months sooner than the person who does not take as aggressive an approach. Yes, they cut out some links that Google was not currently penalizing, but this is a small price to pay for getting your penalty cleared five months sooner. In addition, with this mindset-based approach, the 20 percent of links cut out unnecessarily were probably not helping much anyway, and Google might well have taken action on them in the future.

Now that you understand the approach, it’s time to make the commitment. You have to make the decision that you are going to do whatever it takes to get this done, and that getting it done means cutting hard and deep, because that’s what will get you through it the fastest. Once you’ve got your head on straight about what it will take and have summoned the courage to go through with it, then and only then, you’re ready to do the work. Now let’s look at what that work entails.

Obtaining link data

We use four sources of data for links:

  1. Google Webmaster Tools
  2. Open Site Explorer
  3. Majestic SEO
  4. ahrefs

You will want to pull in data from all four of these sources, get them into one list, and then dedupe them to create a master list. Focus only on followed links as well, as nofollowed links are not an issue. The overall process is shown here:

[Image: pulling a link set]
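Here is a minimal sketch of that merge-and-dedupe step, assuming you have exported each tool’s data as a list of (url, is_followed) pairs; the field layout is my assumption, since each tool’s export format differs:

```python
def build_master_list(sources):
    """sources: one list of links per tool (Google Webmaster Tools,
    Open Site Explorer, Majestic SEO, ahrefs), where each link is a
    (url, is_followed) tuple. Keeps followed links only and dedupes."""
    master = set()
    for links in sources:
        for url, is_followed in links:
            if is_followed:
                master.add(url.strip())
    return sorted(master)
```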

One other simplification is possible at this stage. Once you have obtained a list of the followed links, there is one more thing you can do to dramatically simplify your life.
You don’t need to look at every single link.

You do need to look at a small sampling of links from every domain that links to you. Chances are that this is a significantly smaller quantity of links to look at than the full list. If a domain has 12 links to you, and you look at three of them, and any of those are bad, you will need to disavow the entire domain anyway.

I take the time to emphasize this because I’ve seen people with more than 1 million inbound links from 10,000 linking domains. Evaluating 1 million individual links could take a lifetime. Looking at 10,000 domains is not small, but it’s 100 times smaller than 1 million. But here is where the mindset comes in.
Do examine every domain.

This may be a grinding and brutal process, but there is no shortcut available here. What you don’t look at will hurt you. The sooner you start on the entire list, the sooner you will get the job done.
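A sketch of the sampling idea: keep a few links per domain so every domain gets examined without reviewing every individual link (the sample size of three follows the example above):

```python
from collections import defaultdict
from urllib.parse import urlparse

def sample_links_per_domain(master_list, sample_size=3):
    """Group the master link list by linking domain and keep up to
    sample_size links from each, so that every domain is covered."""
    by_domain = defaultdict(list)
    for url in master_list:
        by_domain[urlparse(url).netloc].append(url)
    return {domain: links[:sample_size] for domain, links in by_domain.items()}
```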

How to evaluate links

Now that you have a list, you can get to work. This is a key part where having the right mindset is critical. The first part of the process is really quite simple. You need to eliminate each and every one of these types of links:

  1. Article directory links
  2. Links in forum comments, or their related profiles
  3. Links in blog comments, or their related profiles
  4. Links from countries where you don’t operate/sell your products
  5. Links from link sharing schemes such as Link Wheels
  6. Any links you know were paid for

Here is an example of a foreign language link that looks somewhat out of place:

[Image: foreign language link]

For the most part, you should also remove any links you have from web directories. Sure, if you have a link from DMOZ, Business.com, or BestofTheWeb.com, or from the one or two most important directories dedicated to your market space, you can probably keep those.

For a decade I have offered people a rule for these types of directories, which is “no more than seven links from directories.” Even the good ones carry little to no value, and the bad ones can definitely hurt you. So there is absolutely no win to be had running around getting links from a bunch of directories, and there is no win in trying to keep them during a link cleanup process.

Note that I am NOT talking about local business directories such as Yelp, CityPages, YellowPages, SuperPages, etc. Those are a different class of directory that you don’t need to worry about. But general purpose web directories are, generally speaking, a poison.

Rich anchor text

Rich anchor text has been the downfall of many a publisher. Here is one of my favorite examples ever of rich anchor text:

The author wanted the link to say “buy cars,” but was too lazy to fit the two words into the same sentence! Of course, you may have many guest posts that you have written that are not nearly as obvious as this one. One great way to deal with that is to take the list of links you built, sort it by URL, and look at the overall mix of anchor text. You know it’s a problem if it looks anything like this:

[Image: overly optimized anchor text]

The problem with the distribution in the above image is that the percentage of links that are non-“rich” in nature is way too small. In the real world, most people don’t conveniently link to you using one of your key money phrases. Some do, but it’s normally a small percentage.
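One way to quantify that skew, as a sketch: compute what share of your anchor text consists of money phrases. The money-phrase list is something you would supply yourself:

```python
from collections import Counter

def anchor_text_mix(links, money_phrases):
    """links: list of (source_url, anchor_text) pairs from your master
    list. Returns the share of 'rich' anchors versus everything else
    (brand names, naked URLs, 'click here', and so on)."""
    counts = Counter(anchor.strip().lower() for _, anchor in links)
    total = sum(counts.values())
    if total == 0:
        return {"rich": 0.0, "other": 0.0}
    rich = sum(n for anchor, n in counts.items() if anchor in money_phrases)
    return {"rich": rich / total, "other": (total - rich) / total}
```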

Other types of bad links

There is no way for me to cover every type of bad link in this post, but here are other types of links, or link scenarios, to be concerned about:

  1. A large percentage of your links coming from the right rail of sites, or from the footers of sites
  2. Sites that give you a site-wide link, or a very large number of links from one domain
  3. Links from sites whose IP addresses are identical in the A block, B block, and C block (read more about what these are here; see the sketch below)
  4. Links from crappy sites

The definition of a crappy site may seem subjective, but if a site has not been updated in a while, or its information is of poor quality, or it just seems to have no one who cares about it, you can probably consider it a crappy site. Remember our discussion on mindset. Your objective is to be harsh in cleaning up your links.
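For item 3 above, the block comparison is just a matter of comparing leading octets. A sketch (IPv4 only):

```python
def shared_ip_blocks(ip_a, ip_b):
    """Return which of the A, B, and C blocks two IPv4 addresses share.
    E.g. 203.0.113.5 and 203.0.113.9 share all three, which suggests
    the linking sites may sit on the same server or network."""
    a, b = ip_a.split("."), ip_b.split(".")
    shared = []
    if a[0] == b[0]:
        shared.append("A")
        if a[1] == b[1]:
            shared.append("B")
            if a[2] == b[2]:
                shared.append("C")
    return shared
```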

In fact, the most important principle in evaluating links is this:
If you can argue that it’s a good link, it’s NOT. You don’t have to argue for good quality links. To put it another way, if they are not obviously good, then out they go!

Quick case study anecdote: I know of someone who really took a major knife to their backlinks. They removed and/or disavowed every link they had that was below a Moz Domain Authority of 70. They did not even try to justify or keep any links with lower DA than that. It worked like a champ. The penalty was lifted. If you are willing to try a hyper-aggressive approach like this one, you can avoid all the work evaluating links I just outlined above. Just get the Domain Authority data for all the links pointing to your site and bring out the hatchet.

No doubt they ended up cutting out a large number of links that were perfectly fine, but their approach was far faster than doing the complete domain-by-domain analysis.
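If you want to try that hyper-aggressive variant, the filter itself is trivial. A sketch, assuming you have already pulled Domain Authority values for every linking domain (the 70 cutoff comes from the anecdote above):

```python
def hatchet_list(domain_authority, cutoff=70):
    """domain_authority: dict mapping each linking domain to its Moz
    Domain Authority. Everything below the cutoff gets disavowed."""
    return sorted(domain for domain, da in domain_authority.items() if da < cutoff)
```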

Requesting link removals

Why is it that we request link removals? Can’t we just build a disavow file and submit that to Google? In my experience, for manual link penalties, the answer to this question is no, you can’t. (Note: if you have been hit by Penguin, and not a manual link penalty, you may not need to request link removals.)

Yes, disavowing a link is supposed to tell Google that you don’t want to receive any PageRank, or benefit, from it. However, there is a human element at play here.
Google likes to see that you put some effort into cleaning up the bad links that led to your penalty. The more bad links you have, the more important this becomes.

This does make the process a lot more expensive to get through, but if you approach this with the “whatever it takes” mindset, you dive into the link removal request process and get it done.

I usually have people go through three rounds of link removal requests. This can be a very annoying process for those receiving your request, so you need to be aware of that. Don’t start your email with a line like “Your site is causing mine to be penalized …”, as that’s just plain offensive.

I’d be honest, and tell them “Hey, we’ve been hit by a penalty, and as part of our effort to recover we are trying to get many of the links we have gotten to our site removed. We don’t know which sites are causing the problem, but we’d appreciate your help …”

Note that some people will come back to you and ask for money to remove the link. Just ignore them, and put their domains in your disavow file.

Once you are done with the overall removal requests, and have had whatever success you are going to have, take the rest of the domains and disavow them. There is a complete guide to creating a disavow file here. The one incremental tip I would add is that you should nearly always disavow entire domains, not just the individual links you see.

This is important because even with the four tools we used to get information on as many links as we could, we still only have a subset of the total links. For example, the tools may have only seen one link from a domain, but in fact you have five. If you disavow only the one link, you still have four problem links, and that will torpedo your reconsideration request.

Disavowing the domain is a better-safe-than-sorry step you should take almost every time. As I illustrated at the beginning of this post, adding extra cleanup/reconsideration request loops is very expensive for your business.
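Generating the file is simple once you have your final domain list. A sketch using Google’s documented disavow format (comment lines start with #, and a `domain:` prefix disavows an entire domain):

```python
def write_disavow_file(domains, path="disavow.txt"):
    """Write one domain-level entry per line, disavowing entire
    domains rather than individual links, per the advice above."""
    with open(path, "w") as f:
        f.write("# Disavow file generated after link removal outreach\n")
        for domain in sorted(domains):
            f.write(f"domain:{domain}\n")
```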

The overall process

When all is said and done, the process looks something like this:

[Image: link removal process]

If you run this process efficiently, and you don’t try to cut corners, you might be able to get out from under your penalty in a single pass through the process. If so, congratulations!

What about tools?

There are some fairly well-known tools that are designed to help you with the link cleanup process. These include Link Detox and Remove’em. In addition, at STC we have developed our own internal tool that we use with our clients.

These tools can be useful in flagging some of your links, but they are not comprehensive—they will help identify some really obvious offenders, but the great majority of links you need to deal with and remove/disavow are not identified. Plan on investing substantial manual time and effort to do the heavy lifting of a comprehensive review of all your links. Remember the “mindset.”

Summary

As I write this post, I have this sense of being heartless because I outline an approach that is often grueling to execute. But consider it tough love. Recovering from link penalties is indeed brutal.
In my experience, the winners are the ones who come with meat cleaver in hand, don’t try to cut corners, and take on the full task from the very start, no matter how extensive an effort it may be.

Does this type of process succeed? You bet. Here is an example of a traffic chart from a successful recovery:

[Image: manual penalty recovery graph]


Local Centroids are Now Individual Users: How Can We Optimize for Their Searches?

Posted by MiriamEllis

“Google is getting better at detecting location at a more granular level—even on the desktop. The user is the new centroid.” – David Mihm

The history of the centroid

The above quote succinctly summarizes the current state of affairs for local business owners and their customers. The concept of a centroid—a central point of relevance—is almost as old as local search. In 2008, people like Mike Blumenthal and Google Maps Manager Carter Maslan were sharing statistics like this:

“…research indicates that up to 80% of the variation in rank can be explained by distance from the centroid on certain searches.”

At that time, businesses located near town hall or a similar central hub appeared to be experiencing a ranking advantage.

Fast forward to 2013, and Mike weighed in again with an updated definition of “industry centroids”:

“If you read their (Google’s) patents, they actually deal with the center of the industries … as defining the center of the search. So if all the lawyers are on the corner of Main and State, that typically defines the center of the search, rather than the center of the city… it isn’t even the centroid of the city that matters. It matters that you are near where the other people in your industry are.”

In other words, Google’s perception of a centralized location for auto dealerships could be completely different than that for medical practices, and neither might be located anywhere near the city center.

While the concepts of city and industry centroids may still play a part in some searches, local search results in 2015 clearly indicate Google’s shift toward deeming the physical location of the desktop or mobile user a powerful factor in determining relevance. The relationship between where your customer is when he performs a search and where your business is physically located has never been more important.

Moreover, in this new, user-centric environment, Google has moved beyond simply detecting cities to detecting neighborhoods and even streets. What this means for local business owners is that your hyperlocal information has become a powerful component of your business data. This post will teach you how to better serve your most local customers.

Seeing the centroid in action

If you do business in a small town with few competitors, ranking for your product/service + city terms is likely to cover most of your bases. The user-as-centroid phenomenon is most applicable in mid-to-large sized towns and cities with reasonable competition. I’ll be using two districts in San Francisco—Bernal Heights and North Beach—in these illustrations and we’ll be going on a hunt for pizza.

On a desktop, if I search for “pizza north beach san francisco”, or set my location to this neighborhood and city while searching for the product, Google will show me something like this:

Performing this same search, but with “bernal heights” substituted, Google shows me pizzerias in a completely different part of the city:

[Image: local results for bernal heights pizza san francisco]

And, when I move over to my mobile device, Google narrows the initial results down to just three enviable players in each district. These simple illustrations demonstrate Google’s increasing sensitivity to serving me nearby businesses offering what I want.

The physical address of your business is the most important factor in serving the user as centroid. This isn’t something you can control, but there are things you can do to market your business as being highly relevant to your hyperlocal geography.

Specialized content for the user-centroid

We’ll break this down into four common business models to help get you thinking about planning content that serves your most local customers.

1. Single-location business

Make the shift toward viewing your business not just as “Tony’s Pizza in San Francisco”, but as “Tony’s Pizza in North Beach, San Francisco”. Consider:

  • Improving core pages of your website or creating new pages to include references to the proud part you play in the neighborhood scene. Talk about the history of your area and where you fit into that.
  • Interview locals and ask them to share their memories about the neighborhood and what they like about living there.
  • Showcase your participation in local events.
  • Plan an event, contest or special for customers in your district.
  • Take pictures, label them with hyperlocal terms, post them on your site and share them socially.
  • Blog about local happenings that are relevant to you and your customers, such as a street market where you buy the tomatoes that top your pizzas or a local award you’ve won.
  • Depending on your industry, there will be opportunities for hyperlocal content specific to your business. For example, a restaurant can make sure its menu is in crawlable text and can name some favorite dishes after the neighborhood—The Bernal Heights Special. Meanwhile, a spa in North Beach can create a hyperlocal name for a service—The North Beach Organic Spa Package. Not only does this show district pride, but customers may mention these products and services by name in their reviews, reinforcing your local connection.

2. Multi-location business within a single city

All that applies to the single location applies to you, too, but you’ve got to find a way to scale building out content for each neighborhood.

  • If your resources are strong, build a local landing page for each of your locations, including basic optimization for the neighborhood name. Meanwhile, create blog categories for each neighborhood and rotate your efforts on a week-by-week basis. First week, blog about neighborhood A; next week, find something interesting to write about concerning neighborhood B. Over time, you’ll have developed a nice body of content proving your involvement in each district.
  • If you’re short on resources, you’ll still want to build out a basic landing page for each of your stores in your city and make the very best effort you can to showcase your neighborhood pride on these pages.

3. Multiple businesses, multiple cities

Again, scaling this is going to be key and how much you can do will depend upon your resources.

  • The minimum requirement will be a landing page on the site for each physical location, with basic optimization for your neighborhood terms.
  • Beyond this, you’ll be making a decision about how much hyperlocal content you can add to the site/blog for each district, or whether time can be utilized more effectively via off-site social outreach. If you’ve got lots of neighborhoods to cover in lots of different cities, designating a social representative for each store and giving him the keys to your profiles (after a training session in company policies) may make the most sense.

4. Service area businesses (SABs)

Very often, service area businesses are left out in the cold with various local developments, but in my own limited testing, Google is applying at least some hyperlocal care to these business models. I can search for a neighborhood plumber, just as I would a pizza:

[Image: local results for plumber bernal heights san francisco]

To be painstakingly honest, plumbers are going to have to be pretty ingenious to come up with a ton of engaging industry/neighborhood content and may be confined mainly to creating some decent service area landing pages that share a bit about their work in various neighborhoods. Other business models, like contractors, home staging firms and caterers should find it quite easy to talk about district architecture, curb appeal and events on a hyperlocal front.

While your SAB is still unlikely to beat out a competitor with a physical location in a given neighborhood, you still have a chance to associate your business with that area of your town with well-planned content.


Need creative inspiration for the writing projects ahead? Don’t miss this awesome wildcard search tip Mary Bowling shared at LocalUp. Add an underscore or asterisk to your search terms and just look at the good stuff Google will suggest to you:

[Image: wildcard search content ideas]

Does Tony’s patio make his business one of Bernal Heights’ dog-friendly restaurants, or does his rooftop view make his restaurant the most picturesque lunch spot in the district? If so, he’s got two new topics to write about, either on his basic landing pages or his blog.

Hop over to Whitespark’s favorite takeaways from Mike Ramsey’s LocalUp presentation, too.

Citations and reviews with the user centroid in mind

Here are the basics about citations, broken into the same four business models:

1. Single-location business

You get just one citation on each platform, unless you have multiple departments or practitioners. That means one Google+ Local page, one Yelp profile, one Best of the Web listing, etc. You do not get one citation for your city and another for your neighborhood. Very simple.

2. Multi-location business within a single city

As with the single location business, you are entitled to just one set of citations per physical location. That means one Google+ Local listing for your North Beach pizza place and another for your restaurant in Bernal Heights.

A regular FAQ here in the Moz Q&A Forum relates to how Google will differentiate between two businesses located in the same city. Here are some tips:

  • Google no longer supports the use of modifiers in the business name field, so you can no longer be Tony’s Pizza – Bernal Heights, unless your restaurant is actually named this. You can only be Tony’s Pizza.
  • Facebook’s policies are different than Google’s. To my understanding, Facebook won’t permit you to build more than one Facebook Place for the identical brand name. Thus, to comply with their guidelines, you must differentiate by using those neighborhood names or other modifiers. Given that this same rule applies to all of your competitors, this should not be seen as a danger to your NAP consistency, because apparently, no multi-location business creating Facebook Places will have 100% consistent NAP. The playing field is, then, even.
  • The correct place to differentiate your businesses on all other platforms is in the address field. Google will understand that one of your branches is on A St. and the other is on B St. and will choose which one they feel is most relevant to the user.
  • Google is not a fan of call centers. Unless it’s absolutely impossible to do so, use a unique local phone number for each physical location to prevent mix-ups on Google’s part, and use this number consistently across all web-based mentions of the business.
  • Though you can’t put your neighborhood name in the title, you can definitely include it in the business description field most citation platforms provide.
  • Link your citations to their respective local landing pages on your website, not to your homepage.

3. Multiple businesses, multiple cities

Everything in business model #2 applies to you as well. You are allowed one set of citations for each of your physical locations, and while you can’t modify your Google+ Local business name, you can mention your neighborhood in the description. Promote each location equally in all you do and then rely on Google to separate your locations for various users based on your addresses and phone numbers.

4. SABs

You are exactly like business model #1 when it comes to citations, with the exception of needing to abide by Google’s rules about hiding your address if you don’t serve customers at your place of business. Don’t build out additional citations for neighborhoods you serve, other cities you serve or various service offerings. Just create one citation set. You should be fine mentioning some neighborhoods in your citation descriptions, but don’t go overboard on this.

When it comes to review management, you’ll be managing unique sets of reviews for each of your physical locations. One method for preventing business owner burnout is to manage each location in rotation. One week, tend to owner responses for Business A. Do Business B the following week. In week three, ask for some reviews for Business A and do the same for B in week four. Vary the tasks and take your time unless faced with a sudden reputation crisis.

You can take some additional steps to “hyperlocalize” your review profiles:

  • Write about your neighborhood in the business description on your profile.
  • You can’t compel random customers to mention your neighborhood, but you can certainly do so from time to time when you write responses. “We’ve just installed the first soda fountain Bernal Heights has seen since 1959. Come have a cool drink on us this summer.”
  • Offer a neighborhood special to people who bring in a piece of mail with their address on it. Prepare a little handout for all-comers, highlighting a couple of review profiles where you’d love to hear how they liked the Bernal Heights special. Or, gather email addresses if possible and follow up via email shortly after the time of service.
  • If your business model is one that permits you to name your goods or service packages, don’t forget the tip mentioned earlier about thinking hyperlocal when brainstorming names. Pretty cool if you can get your customers talking about how your “North Beach Artichoke Pizza” is the best pie in town!

Investigate your social-hyperlocal opportunities

I still consider website-based content publication to be more than half the battle in ranking locally, but sometimes, real-time social outreach can accomplish things static articles or scheduled blog posts can’t. The amount of effort you invest in social outreach should be based on your resources and an assessment of how naturally your industry lends itself to socialization. Fire insurance salesmen are going to find it harder to light up their neighborhood community than yoga studios will. Consider your options:

Remember that you are investigating each opportunity to see how it stacks up not just to promoting your location in your city, but in your neighborhood.

Who are the people in your neighborhood?

Remember that Sesame Street jingle? It hails from a time when urban dwellers strongly identified with a certain district of their hometown. People were “from the neighborhood.” If my grandfather was a Mission District fella, maybe yours was from Chinatown. Now, we’re shifting in fascinating directions. Even as we’ve settled into telecommuting to jobs in distant states or countries, Amazon is offering one-hour home delivery to our neighbors in Manhattan. Doctors are making house calls again! Any day now, I’m expecting a milkman to start making his rounds around here. Commerce has stretched to span the globe and now it’s zooming in to meet the needs of the family next door.

If the big guys are setting their sights on near-instant services within your community, take note.
You live in that community. You talk, face-to-face, with your neighbors every day and know the flavor of the local scene better than any remote competitor can right now.

Now is the time to reinvigorate that old neighborhood pride in the way you’re visualizing your business, marketing it and personally communicating to customers that you’re right there for them.

