Many local business owners are surprised by the information that appears when they (and their customers) come
across their business listings on Google and Bing. Often, incorrect or out-of-date information shows up with
no explanation of where it comes from.
In some cases, even business owners who have already claimed their listings at major search engines like Google
and Bing continue to see incorrect information displayed about their businesses, which understandably adds
to their frustration.
The reason this happens is that these search giants pull in business information from a variety of other sources,
in addition to maintaining their own business databases. They both do the best they can to match the data that
comes in from these other sources with what they have in their own index, but sometimes that doesn’t happen properly.
If the information is different enough from the correct listing, search engines might think it’s a different
business. Or they might conclude that the wrong information appears so many times in the other places from
which they get their data that it must actually be “right.”
The sources that Google and Bing pull information from vary from country to country. Each has its own set of
important players, known as data aggregators.
These aggregators have typically accumulated their business databases by scanning and transcribing things
like phone records, utility records, business registration websites, and printed yellow pages directories.
Google also crawls the web looking for business information wherever it can find it: online yellow pages directories,
review sites, local newspaper sites, and blogs. Many of these sources get their information from the same aggregators
that Google does—just one more reason you need to make sure your business information is correct at the handful
of primary providers in your country. If your data is wrong at those aggregators, it’s likely to be wrong in many
places across the web, including Google.
The data aggregators of the future
Factual is a relatively new player on the scene; they were hardly on anyone’s radar
less than two years ago.
And yet today, if you visit their homepage, you see a who’s who of local
search portals, including Yelp, Bing, and TripAdvisor. It’s clear they’re a force to be reckoned with, especially globally.
Even for experts, the local search ecosystem is incredibly confusing! But hopefully browsing the local search ecosystem
graphic relevant to your country will give you a better understanding of how these local data sites fit together,
and identify places to clean up incorrect listing information you might not otherwise have known about.
Every business that competes in a local market and for the display of localized results in SERPs
will likely need to conduct a local SEO audit at some point. Whether or not you’ve hired an SEO in
the past, the best way to beat the competition is to know where
you stand and what you need to fix, and then develop a plan to win based
on the competitive landscape.
While this may seem like a daunting task, the good news is that you can
do this for your business or your client. This Ultimate Local SEO
guide was created as a complete checklist and will show you which areas
you should focus on, what needs to be optimized, and what you need to do
to fix any problems you encounter. To make things easier, I have also
included many additional resources for further reading on these topics.
In this guide I am going to cover the top areas we review for clients
who either want to know how they can improve or who need a
local SEO audit. To make it easier, I have included detailed explanations
of the topics and an Excel template you can use to conduct the audit.
Also, since the Pigeon update, local search has started to weigh organic
factors more heavily, so I have included them in this audit. However, if
after you have read this you’re looking for an even deeper audit for
organic SEO, you should also check out Steve Webb’s article,
“How to Perform the World’s Greatest SEO Audit.”
Who is this guide for?
This guide is intended for businesses that already have an
existing Google My Business page. It’s also mostly geared towards
brick-and-mortar stores. If you don’t have a public address and you’re a
service-area business, you can ignore the parts where I mention
publishing your physical address. If you don’t have a listing set up
already, it’s a little bit harder to audit. That being said, new
businesses can use this as a road map.
What we won’t cover
The local algorithm is complicated and ever-evolving. Although we can
look at considerations such as proximity to similar businesses or
driving-directions requests, I have decided not to include these, since
we have limited control over them. This audit mainly covers the items
the website owner is in direct control over.
A little background
Being ready and willing to adopt change in online marketing is an
important factor on the path to success. Search changes, and you have to
be ready to change with it. The good news is that if you’re constantly
trying to do the right thing while being the least imperfect, your results
will only get better with updates.
Some goons will always try to cheat the system for a quick win, but
they will get caught and penalized eventually. However, if you stick
with the right path, you can sleep easier at night knowing you don’t have
to worry about penalties.
But why are audits so important?
At my company we have found, through a lot of trial and error, that we
can provide the best results for our clients when we start a project
with a complete understanding of it, as opposed to
just bits and pieces. If we have a complete snapshot of their SEO
efforts along with their competition, we can create a plan that is going
to be much more effective and sustainable.
We now live in a world where marketers not only need to be
forward-thinking with their strategies, but must also evaluate and consider
the work done by prior employees and SEOs who have worked on the website
in the past. If you don’t know what potential damage has been done, how
could you possibly be sure your efforts will help your client long term?
Given the impact and potential severity of penalties, it’s
irresponsible to ignore this or to participate in activities that can harm
the client in the long run. Again, sadly, this is a lesson I have
learned the hard way.
What aspects does this local SEO audit cover?
Knowing what to include in your audit is a great first step. We have
broken our audit down into several different categories we find to be
essential to local SEO success. They are:
1) Google My Business page audit
2) Website & landing page audit
3) Citation analysis
4) Organic link & penalty analysis
5) Review analysis
6) Social analysis
7) Competition analysis
8) Ongoing strategy
Analyzing all of these factors will allow you to develop a strategy
with a much better picture of the major problems and of what you’re up
against as far as the competition is concerned. If you don’t have the
full picture with all of the details, you’re likely to uncover more problems down the road.
Before we get started, a disclaimer
In this guide I am going to try to break things down to make it easy
for beginners and advanced users. That being said, it’s a wise idea to
seek advice or read more about a topic if you don’t quite understand it.
If something is over your head, please don’t hesitate to reach out for
clarification. It’s always better to be safe than sorry.
How to use this guide for your local SEO audit
This guide is broken up into two parts: this post and a
spreadsheet. The written part that you are reading now is accompanied by
this spreadsheet, which
will allow you to collect pertinent client intake information, record
problems, and serve as an easy reference for your ultimate goal
for each of the items.
To use the spreadsheet, click the link and then go to File > Make A Copy. The
spreadsheet includes five tabs that each serve a different purpose. They are:
Current info – This tab allows you to record the information
the customer submits and compare it against the Google My Business
information you find. It also allows you to record your notes for any
proposed changes. This will help you when it comes time to report on the changes you make.
Questions to ask – These are some basic questions you can ask your clients up front that may save a lot of time in the long run.
Competitor information – You can use this tab to track your competitors and compare your metrics side by side.
Top 50 citations audit – This is the list of the top 50 citation sources as provided by Whitespark.
Audit steps – For the more advanced user, I took everything in
this long document and condensed it into this easy-to-use spreadsheet with
an audit checklist and some short notes on what you’re checking for.
Get your audit shoes on. Now let’s get started
Step 1: Gather the facts
Whether you’re conducting this audit for a client or for your own business,
it’s important to start off with the right information. If clients fill
out this information properly, you can save a lot of time and also
identify major issues right off the bat. Not only can we identify
some of the common local SEO issues, like inconsistent NAP, with this
information, we can also record it in the spreadsheet I mentioned earlier.
Since this is an audit, the spreadsheet has a column for the
current information and a column for proposed changes for the client.
Later, these will be used as action items.
The first tab in this spreadsheet, the company information tab, has
everything we need to get started. This includes all of the basic
information we will need to be successful.
This information should be provided by the client up front so that we
can compare it to the information already existing on the web.
You can use the audit spreadsheet and enter this under the “Provided Information” column. This will help us identify problems easily as we collect more information.
The basic information we will need to get started includes NAP
information and other items. A sample of this can be seen below:
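As an illustration, a basic intake record could be sketched as follows; all field names and values here are hypothetical examples, not a required format:

```python
# Hypothetical client-intake record; field names loosely mirror the
# "Provided Information" column of the audit spreadsheet.
intake = {
    "business_name": "Example Plumbing Co.",
    "address": "123 Main St, Suite 4",
    "city": "Springfield",
    "state": "IL",
    "zip": "62701",
    "phone": "(555) 555-0199",
    "website": "https://www.example.com",
    "categories": ["Plumber", "Contractor"],
    "hours": "Mon-Fri 8am-5pm",
}

# The NAP triple (Name, Address, Phone) is the core data that must stay
# consistent everywhere the business is listed on the web.
nap = (intake["business_name"], intake["address"], intake["phone"])
print(nap)
```

Recording the intake this way makes it trivial to compare against what you later find on Google My Business and in citations.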
Questions to ask up front
Once we have the basic company information, we can also ask some
questions. Keep in mind that the goal here is to be the least imperfect.
While some of these factors are more important than others, it’s always
good to do more and gain a better understanding of the potential issues
rather than taking shortcuts. Shortcuts will just create more work down the road.
Feel free to edit the spreadsheet and add more questions to your copy based on your experience.
1) Have you ever been penalized or think you may have been? The
client should have a good idea if they were penalized in the past.
2) Have you ever hired anyone to build citations for you? If they
hired anyone to build citations for them, they should have some
documentation, which will make the citation portion of the audit easier.
3) Have you ever hired an SEO company to work with you? If they
hired an SEO in the past, it’s important to check any work they completed.
4) Have you ever hired anyone to build links for you? If they have
hired anyone in the past to build links they will hopefully have
documentation you can review. If you see bad links you know you will
have your work cut out for you.
5) What are the primary keywords you want to rank for? Knowing
what the client wants and developing a strategy based off this is
essential to your local SEO success.
6) Have you ever used another business name in the past? Companies
that used a different name or that were acquired can have NAP issues.
7) Is your business address a PO Box? PO Boxes and UPS boxes are a
no-no. It’s good to know this up front before you get started.
8) Is your phone number a landline? Some local SEOs claim that
landlines may provide some benefit. Regardless, it’s good to know where
the phone number is registered.
9) Do other websites 301 redirect to your website? If other
websites redirect to their domain, you may need to do an analysis on
these domains as well, specifically for penalty evaluation.
10) Did you ever previously use call tracking numbers? Previously
used call tracking numbers can be a nightmare as far as local SEO is
concerned. If a client previously used call tracking numbers you will
want to search for these when we get to the citation portion of this
document. Cleaning up wrong phone numbers, including tracking numbers,
in the local ecosystem is essential to your local success.
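As a rough sketch of that cleanup step, the snippet below normalizes phone numbers and flags citation sources still showing a retired tracking number. The citation data, function names, and number formats are all hypothetical:

```python
import re

def normalize_phone(raw):
    """Keep digits only, dropping a leading US country code."""
    digits = re.sub(r"\D", "", raw)
    return digits[-10:] if len(digits) >= 10 else digits

def find_stale_numbers(citations, old_tracking_numbers):
    """Return citation sources still showing a retired tracking number."""
    stale = {normalize_phone(n) for n in old_tracking_numbers}
    return [source for source, phone in citations.items()
            if normalize_phone(phone) in stale]

# Hypothetical citation data gathered by hand or exported from a tool.
citations = {
    "yelp.com": "(555) 555-0100",
    "yellowpages.com": "555-555-0123",   # old tracking number still live
    "citysearch.com": "+1 555 555 0100",
}
print(find_stale_numbers(citations, ["5555550123"]))  # -> ['yellowpages.com']
```

Normalizing before comparing matters because the same number appears in many formats across the local ecosystem.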
Local SEO audit phase 1: Google My Business page
The Google My Business dashboard
has a lot of useful information. Although I reference the Google
guidelines below, be sure to check them often. Google does change these
sometimes, and you won’t really get any official notice. This happened
rather recently when they started allowing descriptive words in the
business name. Keep in mind that if any changes were recently made to
your Google My Business page, they may not show in the live version. It
may take up to three days for these to show in the search results. The
information collected below should be put in the “Current Info” tab on
the spreadsheet under the Google My Business information. This will also
help us identify discrepancies right away when we look at the rest of the data.
1. Locate the proper Google My Business page we should be working with
We can’t really get started with an audit unless we know the proper
page we’re working with. Usually, if a client hires you, they already have one.
How to do this: If your client already has a Google My Business login, log in to their dashboard
using the proper credentials. The back end of the dashboard
should show the businesses associated with this account. Copy this URL
and confirm with the business owner that this is the page they intend to
use. If it’s not their primary one, we will correct this a bit later.
Goal: We want to find and record the proper Google My Business URL in our Local SEO Audit Spreadsheet.
2. Find and destroy duplicate pages
Editor’s Note: Since the publication of this post,
Google has shut down Mapmaker. For a current list of best practices for
managing duplicate GMB listings, read: https://moz.com/blog/delete-gmb-listing.
Duplicate Google My Business listings can be one of the greatest threats to any local SEO campaign.
How to: There are several ways to find possible
duplicate pages, but I have found the easiest way is to use Google
MapMaker. To do this, log in to your Google account and visit
http://www.google.com/mapmaker or https://plus.google.com/local.
From either page you can search the business phone number, such as
555-555-5555, or the business name. If you see multiple listings you
didn’t know about, a major priority is to record those URLs and delete the duplicates.
I personally see a lot of issues when dealing with attorneys, where each
attorney has their own profile, or in cases where an office has
moved. There should only be one listing, and it should be 100% correct.
Goal: Make sure there are no duplicate listings. Kill any duplicates.
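To illustrate the idea, here is a minimal sketch that groups listing URLs by normalized phone number, so any group with more than one entry stands out as a likely duplicate set. The listing URLs and numbers are made up:

```python
import re
from collections import defaultdict

def normalize_phone(raw):
    """Strip formatting so differently written numbers compare equal."""
    digits = re.sub(r"\D", "", raw)
    return digits[-10:]

def find_duplicates(listings):
    """Group listing URLs by normalized phone; any group with more than
    one URL is a likely duplicate set to investigate."""
    groups = defaultdict(list)
    for url, phone in listings:
        groups[normalize_phone(phone)].append(url)
    return {phone: urls for phone, urls in groups.items() if len(urls) > 1}

# Hypothetical results from searching the business phone number.
listings = [
    ("https://plus.google.com/110000000000000000001", "(555) 555-0100"),
    ("https://plus.google.com/110000000000000000002", "555.555.0100"),
    ("https://plus.google.com/110000000000000000003", "(555) 555-0111"),
]
print(find_duplicates(listings))
```

The same grouping trick works for any citation export, not just Google listings.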
3. Ensure that the local listing is not penalized (IMPORTANT!)
Figuring out Google penalties in the local landscape is not usually a
walk in the park. There are a lot of variables to consider, and
this is a bigger deal post-Pigeon, as more organic signals are
involved. We will look at other types of penalties later in this guide.
Unlike organic penalties, Google does not notify businesses of local
penalties unless the account is suspended, with a big red warning on the
back end of the My Business page.
According to Phil Rozek from Local Visibility System “My first
must-look-at item is: is the client’s site or Google Places page being
penalized, or at risk of getting penalized?”
How to do this: If your keyword is “Los Angeles personal
injury attorney,” you could search for this keyword in both Google Maps
and Google Search results. If your business listing appears on the maps
side, in position C for example, but does not appear at all in the local
results of a normal Google search, then it’s likely there
is a penalty in place. Sometimes you see listings that are not
suppressed on the maps side but are suppressed on the places side. This
is an easy way to take a look.
Goal: Do your best to determine that the listing is not penalized. If it is, consult a penalty expert for further guidance.
4. Is the Google My Business page associated with an email address on the customer’s domain?
In my experience, it’s best practice to have the login information for
the business under an email address associated with the domain name.
Additionally, this ensures that the client has primary control of their
listing. As an example, if you ran Moz.com and had local listings, your
Google My Business login should be an email address on your own domain
rather than a generic free address. This helps establish that you are indeed the owner.
Rumors are flying about Google’s upcoming mobile-friendly update, and
bits of reliable information have come from several sources. We wanted
to cut through the noise and bring online marketers a
clearer picture of what’s in store later this month. In this post,
you’ll find our answers to nine key questions about the update.
1. What changes is Google making to its algorithm on April 21st?
Answer: Recently, Google has been rolling out lots of
changes to apps, Google Play, the presentation of mobile SERPS, and some
of the more advanced development guidelines that impact mobile; we
believe that many of these are in preparation for the 4/21 update.
Google has been downplaying some of these changes, and we have no
exclusive advanced knowledge about anything that Google will announce on
4/21, but based on what we have seen and heard recently, here is our
best guess of what is coming in the future (on 4/21 or soon thereafter):
We believe Google will launch a new mobile crawler (probably with an
Android user-agent) that can do a better job of crawling single-page web
apps, Android apps, and maybe even deep links in iOS apps. The new
mobile-friendly guidelines that launched last month focus on exposing JS
and CSS because Android apps are built in Java, and single-page web
apps lean heavily on JavaScript and CSS.
Some example sites that use responsive design well in a single-page app architecture are:
Google has also recently been pushing for more feeds from Trusted
Partners, which are a key component of both mobile apps and single-page
web apps since Phantom JS and Prerender IO (and similar technologies)
together essentially generate crawlable feeds for indexing single-page
web apps. We think this increased focus on JS, CSS, and feeds is also
the reason why Google needs the
additional mobile index that Gary Illyes mentioned in his “Meet the Search Engines” interview
at SMX West a couple weeks ago, and why suddenly Google has been
talking about apps as “first class citizens,” as called out by Mariya
Moeva in the title of her SMX West presentation.
A new mobile-only index to go with the new crawler also makes sense
because Google wants to index and rank both app content and deep links
to screens in apps, but it does not necessarily want to figure them into
the desktop algorithm or slow it down with content that should never
rank in a desktop search. We also think that the recent increased focus
on deep links and the announcement from Google about Google Play’s
new automated and manual review process
are related. This announcement indicates, almost definitively, that
Google has built a crawler that is capable of crawling Android apps. We
believe that this new crawler will also be able to index more than one
content rendering (web page or app screen data-set) to one URL/URI and
it will probably focus more on feeds, schema, and sitemaps for its
own efficiency. Most of the native apps that would benefit from deep
linking are driven by data feeds, and crawling the feeds instead of the
apps would give Google the ability to understand the app content,
especially for iOS apps (which they are still not likely able to
crawl), without having to crawl the app code. Then, it can crawl the
deep-linked web content to validate the app content.
FYI: Gary Illyes mentioned that Google is retiring their
old AJAX indexing instructions, but did not say how they would be replaced, except to specify in a Google+
post that Google would not click links to get more content. Instead,
they would need an OnLoad event to trigger further crawling. These
webmaster instructions for making AJAX crawlable were often relied on as
a way to make single-page web apps crawlable, and we think that feeds
will play a role here, too, as part of the replacement. Relying more
heavily on feeds also makes it easier for Google to scrape data directly
into SERPS, which they have been doing more and more. (See the appendix
of this slide deck,
starting on slide 30, for lots of mobile examples of this change in
play already.) This probably will include the ability to scrape forms
directly into a SERP, à la the form markup for auto-complete that Google just announced.
We are also inclined to believe that the use of the new
“Mobile-Friendly” designation in mobile SERPs may be temporary, lasting
only as long as SEOs and webmasters need the incentive to make their CSS
and JavaScript crawlable.
“Mobile-Friendly” in the SERP is a bit clunky and takes up a lot of
space, so Google may decide to switch to something else, like the
“slow” tag shown
to the right, originally spotted in testing by Barry Schwartz. In fact,
showing the “Slow” tag might make sense later in the game, after most
webmasters have made the updates, and Google instead needs to create a
more serious and impactful negative incentive for the stragglers. (This
is Barry’s image; we have not actually seen this one yet).
In terms of the Mobile-Friendly announcement, it is surprising that
Google has not focused more on mobile page speed, minimizing redirects
and avoiding mobile-only errors—their historical focus for mobile SEO.
This could be because page speed does not matter as much in the
evaluation of content if Google is getting most of its crawl information
from feeds. Our guess is that things like page speed and load time will
rebound in focus after 4/21. We also think mobile UX indicators that
are currently showing at the bottom of the Google PageSpeed tool (at the
bottom of the “mobile” tab) will play into the new mobile algorithm—we
have actually witnessed Google testing their inclusion in the
Mobile-Friendly tool already, as shown below, and of course, they were
recently added to everyone’s Webmaster Tools reports. It is possible
that Google simply wants as many pages in the new index as possible at launch.
2. If my site is not mobile-friendly, will this impact my desktop rankings as well?
Answer: On a panel at SMX Munich (2 weeks after SMX
West) Zineb from Google answered ‘no’ without hesitation. We took this
as another indication that the new index is related to a new crawler
and/or a major change to the infrastructure they are using to parse,
index, and evaluate mobile search results but not desktop results. That
said, you should probably take some time soon to make sure that
your site works—at least in a passable way—on mobile devices, just in
case there are eventual desktop repercussions (and because this is a
user experience best practice that can lead to other improvements that are still desktop ranking factors, such as decreasing your bounce rate).
3. How much will mobile rankings be impacted?
Answer: On the same panel at SMX Munich (mentioned
above), Zineb said that this 4/21 change will be bigger than the Panda
and Penguin updates. Again, we think this fits well with an
infrastructure change. It is unclear if all mobile devices will be
impacted in the change or not. The change might be more impactful for
Android devices or might impact Android and iOS devices equally—though
currently we are seeing significant differences between iOS and Android
for some types of search results, with more significant changes
happening on Android than on iOS.
Deep linking is a key distinction between mobile SERPs on the Android
OS and SERPs on iOS (currently, SERPs only display Android app deep
links, and only on Android devices). But there is reason to believe this
gap will be closing. For example, in his recent Moz post and in his
presentation at SMX West, Justin Briggs mentioned that a few sample
iOS deep links were validating in Google’s deep link tool.
This may indicate that iOS apps with deep links will be easier to
surface in the new framework, but it is still possible that won’t make
it into the 4/21 update. It is also unclear whether or not Google will
maintain its stance on tablets being more like desktop experiences than
they are like mobile devices, and what exactly Google is considering
“mobile.” What we can say here, though, is that Android tablets DO
appear to be including the App Pack results, so we think Google will
change their stance here and start to classify tablets as mobile at some point.
Welcome to our newest installment of our educational Next Level
series! In our last episode, Jo Cameron taught you how to whip up intelligent SEO reports
for your clients to deliver impressive, actionable insights. Today, our
friendly neighborhood Training Program Manager, Brian Childs, is here
to show you an easy workflow for targeting multiple keywords with a
single page. Read on and level up!
For those who have taken any of the Moz Training Bootcamps,
you’ll know that we approach keyword research with the goal of
identifying concepts rather than individual keywords. A common term for
this in SEO is “niche keywords.” I think of a “niche” as a set of
related words or concepts that are essentially variants of the same idea.
Let’s pretend my broad subject is: Why are cats jerks?
Some niche topics within this subject are:
Why does my cat keep knocking things off the counter?
Why does my cat destroy my furniture?
Why did I agree to get this cat?
I can then find variants of these niche topics using Keyword Explorer or another tool, looking for the keywords with the best qualities (Difficulty, Search Volume, Opportunity, etc).
By organizing your keyword research in this way, it conceptually aligns with the search logic of Google’s Hummingbird algorithm update.
Once we have niche topics identified for our subject, we then
dive into specific keyword variants to find opportunities where we can
rank. This process is covered in-depth during the Keyword Research Bootcamp class.
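As a toy illustration of sorting variants into niche topics, the sketch below buckets keywords by simple cue words. A real workflow would rely on a keyword tool’s related-topic data rather than string matching, and all the niches, cues, and keywords here are hypothetical:

```python
# Map each niche topic to cue words that signal a keyword belongs to it.
NICHES = {
    "knocking things over": ["knock", "counter", "push"],
    "destroying furniture": ["scratch", "furniture", "couch"],
}

def assign_niche(keyword):
    """Return the first niche whose cue words appear in the keyword."""
    kw = keyword.lower()
    for niche, cues in NICHES.items():
        if any(cue in kw for cue in cues):
            return niche
    return "unsorted"

keywords = [
    "why does my cat knock glasses off the counter",
    "cat scratching couch deterrent",
    "stop cat from scratching furniture",
]
print({kw: assign_niche(kw) for kw in keywords})
```

Bucketing variants like this keeps each page focused on one niche rather than one keyword.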
Should I optimize my page for multiple keywords?
The answer for most sites is a resounding yes.
If you develop a strategy of optimizing your pages for only one
keyword, this can lead to a couple of issues. For example, if a content
writer feels restricted to one keyword for a page they might develop
very thin content that doesn’t discuss the broader concept in much
useful detail. In turn, the marketing manager may end up spreading
valuable information across multiple pages, which reduces the potential
authority of each page. Your site architecture may then become larger
than necessary, making the search engine less likely to distinguish your
unique value and deliver it into a SERP.
As recent studies
have shown, a single high-ranking page can show up in dozens — if not
hundreds — of SERPs. A good practice is to identify relevant search
queries related to a given topic and then use those queries as your H2 headings.
So how do you find niche keyword topics? This is the process I use that relies on a relatively new SERP feature: the “People also ask” boxes.
How to find niche keywords
Step 1: Enter a relevant question into your search engine
Question-format search queries are great because they often generate featured snippets. Featured snippets
are the little boxes that show up at the top of search results, usually
displaying one- to two-sentence answers or a list. Recently, when
featured snippets are displayed, there is commonly another box nearby
showing “People also ask.” This second box allows you to peer into the
logic of the search algorithm. It shows you what the search engine
“thinks” are closely related topics.
Step 2: Select the most relevant “People also ask” query
Take a look at those initial “People also ask” suggestions. They are often
different variants of your query, representing slightly different search
intent. Choose the one that most aligns with the search intent of your
target user. What happens? A new set of three “People also ask”
suggestions, associated with the option you chose, will populate at the bottom of the list.
This is why I refer to these as choose-your-own-adventure boxes. With
each selection, you dive deeper into the topic as defined by the search engine.
Step 3: Find suggestions with low-value featured snippets
The answer shown for each “People also ask” suggestion is a featured snippet. As you dig deeper
into the topic by selecting one “People also ask” after another, keep an
eye out for featured snippets that are not particularly helpful. This
is the search engine attempting to generate a simple answer to a
question and not quite hitting the mark. These present an opportunity.
Keep track of the ones you think could be improved. In the following
example, we see the Featured Snippet being generated by an article that
doesn’t fully answer the question for an average user.
Step 4: Compile a list of “People also ask” questions
Once you’ve explored deep into the algorithm’s contextually related results
using the “People also ask” box, make a list of all the questions you
found highly related to your desired topic. I usually just pile these
into an Excel sheet as I find them.
Step 5: Analyze your list of words using a keyword research tool
With a nice list of keywords that you know are generating featured snippets, plug the words into Keyword Explorer
or your preferred keyword research tool. Now just apply your normal
assessment criteria for a keyword (usually a combination of search
volume and competitiveness).
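That assessment step can be sketched as a simple scoring pass. The weighting formula and the metric values below are assumptions for illustration, not output from any particular tool:

```python
def score(volume, difficulty):
    """Favor high search volume, penalize high difficulty (0-100 scale).
    The weighting here is an arbitrary illustrative choice."""
    return volume * (100 - difficulty) / 100

# Hypothetical (volume, difficulty) pairs from a keyword research tool.
candidates = {
    "why does my cat knock things off the counter": (1300, 22),
    "why does my cat destroy my furniture": (400, 35),
    "cat behavior problems": (2900, 71),
}

# Rank the question list so the strongest targets surface first.
ranked = sorted(candidates, key=lambda k: score(*candidates[k]), reverse=True)
print(ranked[0])
```

Whatever formula you use, the point is to rank the whole question list with one consistent criterion before picking page targets.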
Step 6: Apply the keywords to your page title and heading tags
Once you’ve narrowed the list to a set of keywords you’d like to target on
the page, have your content team go to work generating relevant,
valuable answers to the questions. Place your target keywords as the
heading tags (H2, H3) and a concise, valuable description immediately
following those headings.
Do you find it difficult to decipher the different components of Google’s search results these days?
You’re not alone. It’s hard for professional SEOs, let alone local business owners, to figure out how
and from where Google is surfacing various pieces of information. Knowing which search engine results
represent paid advertisements vs. natural or organic results vs. social results vs. local results is basically a job in itself.
For local queries in particular, Google tends to return results that include a blend of both website
and Local information—mostly stemming from Google+ Local pages.
Here’s a screenshot of what that looks like. Note that for this query, “car insurance,” the searcher
does not even need to specify where he’s looking for car insurance—Google guesses his location based
on a variety of signals.
Since the Spring of 2012, Google has returned an increasing percentage of these “blended” results for local
queries—meaning it’s important to optimize not only your website, but your Google+ Local page, and
all associated local and social media profiles.
Conceptualizing the Online Landscape
This graphic is meant to represent the influence of both organic search ranking factors
(things related to your website) and social media factors (such as reviews left for your business
on your Google+ Local Page, Yelp, Citysearch, and other sites around the web).
In order to truly succeed in local search marketing, your business will need to make both organic
and social media efforts. But not every business should focus on the same
mix of techniques to achieve success.
Where Should You Prioritize Your Resources?
If your business sells products or services to customers located in your geographic area, optimizing
for local search will almost always be a must for you.
Google and Bing have indicated that over 20% of all desktop search queries are local in nature and that
somewhere around 50% of queries on mobile phones and tablets are local.
These percentages will only increase in the coming years.
Depending on your business model, local search may be a key component of your overall marketing mix.
But it shouldn’t necessarily be the first place you start with your online marketing.
Many factors like the age of your website, whether you have someone available in-house to work on
your online marketing, and the physical location of your business all influence whether local
search should be your primary focus, or whether you might be better served taking a look at
organic search or social media first.
Google, Bing, and the other search engines have revolutionized how we learn, how we collaborate,
how we shop and how we interact within our local communities. Today, Google alone handles more
than 100 billion searches per month
around the world.
From these numbers, we can extrapolate that there are approximately seven billion unique local searches
per month on Google in the United States.
Google, Yahoo!, and Bing are all currently returning local
results that have challenged traditional print Yellow Pages and, in many
areas, exceeded their usage as the preferred method for discovering
local businesses and local information. As of March 9, 2009, Google began showing local results for generic queries,
meaning that Internet users no longer need to include any city or
geographic terms in their search to be shown results that are local to them.
Additionally, mobile search is absolutely exploding. Mobile searches primarily pull their results from local search engines.
What this means for your business: The potential to attract new customers via local search is enormous.
Depending upon your business model, your marketing budget and your
resources, local search may be the right match for your business—or,
conversely, other forms of marketing may be smarter for you. Visit and
read all three of the articles in the “keep learning”
section below to determine the best possible marketing channels for your business.
Few marketing terms light up local business owners’ eyes like the words “social media”.
Most business owners are already using social media in their personal lives, and it’s
only natural to want to use this channel to grow your business. However, many business
owners dive headfirst into social media without an understanding of what kinds of
content to post, how to attract followers and fans, or the nuances of each social platform—let
alone a concrete strategy to make the most of their time spent.
The essence of social media is really a public conversation with your customers and prospective
customers—sort of like one big dinner party. Just as partygoers aren’t really attending with the
intent of being sold to, neither are social media followers looking for a constant sales pitch.
And no partygoer likes to be cornered by someone who only talks about themselves, so don’t be “that guy”
on social media either! Engage your followers, ask their opinions, and give them a sense of investment in your business.
In terms of making the most of your time, spend it on the sites where your customers are.
Just because you think you should be on Facebook doesn’t necessarily mean that it will pan out for you.
Survey your customers to find out which sites they’re on, whether in-store, via email, or when they call for an appointment.
You’re likely to get a running start on your social media campaign if you “fish where the fish are.”
Check out the “keep learning” section below. There, you’ll find
specific guides to help you build and engage a following on the major
networking sites for local businesses.
A few weeks ago, rankings for pages on a key section of my
site dropped an average of a full position in one day. I’ve been an SEO
for 7 years now, but I still ran around like a chicken with my head cut
off, panicked that I wouldn’t be able to figure out my mistake. There
are so many things that could’ve gone wrong: Did I or my team
unintentionally mess with internal link equity? Did we lose links? Did
one of Google’s now-constant algorithm updates screw me over?
Since the drop happened to a group of pages, I made the
assumption it had to do with our site or page structure (it didn’t). I
wasted a good day focused on technical SEO. Once I realized my error, I
decided to put together a guide to make sure that next time, I’ll do my
research effectively. And you, my friends, will reap the rewards.
First, make sure there’s actually a rankings change
Okay, I have to start with this: before you go down this rabbit hole of rankings changes, make sure there was actually a rankings change.
Your rankings tracker may not have localized properly, or have picked
up on one of Google’s rankings experiments or personalization.
Has organic traffic dropped to the affected page(s)?
We’re starting here because this is the most reliable data you
have about your site. Google Search Console and rankings trackers are
trying to look at what Google’s doing; your web analytics tool is just
tracking user counts.
Compare organic traffic to the affected page(s) week-over-week
both before and after the drop, making sure to compare similar days of the week.
Is the drop more significant than most week-over-week changes?
Is the drop over a holiday weekend? Is there any reason search volume could’ve dropped?
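If you export daily organic sessions from your analytics tool, this comparison is easy to script. Here's a minimal sketch; the traffic numbers and the two-standard-deviation threshold are illustrative assumptions, not a standard:

```python
from statistics import mean, stdev

def wow_changes(daily_sessions, window=7):
    """Week-over-week percentage change for each day, compared
    against the same weekday one week earlier."""
    return [
        (daily_sessions[i] - daily_sessions[i - window]) / daily_sessions[i - window]
        for i in range(window, len(daily_sessions))
    ]

def is_drop_unusual(daily_sessions, threshold_sd=2.0):
    """Flag the latest week-over-week change if it falls more than
    threshold_sd standard deviations below the historical average."""
    changes = wow_changes(daily_sessions)
    history, latest = changes[:-1], changes[-1]
    return latest < mean(history) - threshold_sd * stdev(history)

# Four steady weeks of daily sessions, then a sharp drop on the last day
sessions = [100, 110, 105, 98, 102, 95, 90] * 4 + [60]
```

A drop that clears this bar on a normal week (no holiday, no obvious seasonality) is worth investigating further.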
Use the Search Analytics section to see clicks, impressions, and average position for a given keyword, page, or combo.
Does GSC show a similar rankings drop to what you saw in your
rankings tracker? (Make sure to run the report with the appropriate date range selected.)
Does your rankings tracker show a sustained rankings drop?
I recommend tracking rankings daily for your important keywords,
so you’ll know if the rankings drop is sustained within a few days.
If you’re looking for a tool recommendation, I’m loving Stat.
If you’ve just seen a drop in your rankings tool and your
traffic and GSC clicks are still up, keep an eye on things and try not
to panic. I’ve seen too many natural fluctuations to go to my boss as
soon as I see an issue.
But if you’re seeing that there’s a rankings change, start going through this guide.
Figure out what went wrong
1. Did Google update their algorithm?
Google rolls out a
new algorithm update at least every day, most of them silently. The good news is that
there are leagues of SEOs dedicated to documenting those changes.
Are there any SEO articles or blogs talking about a change around the date you saw the change? Check out:
Do you have any SEO friends who have seen a change? Pro tip: Make friends with SEOs who run sites similar to yours, or in your industry. I can’t tell you how helpful it’s been to talk frankly about tests I’d like to run with SEOs who’ve run similar tests.
If this is your issue…
The bad news here is that if Google’s updated their algorithm, you’re going
to have to change your approach to SEO in one way or another.
Your next move is to put together a strategy to either pull yourself out of
this penalty, or at the very least to protect your site from the next update.
2. Did your site lose links?
Pull the lost links report from Ahrefs or Majestic. They’re the most reputable link counters out there, and their indexes are updated daily.
Has there been a noticeable site-wide link drop?
Has there been a noticeable link drop to the page or group of pages you’ve seen a rankings change for?
Has there been a noticeable link drop to pages on your site that link to the page or group of pages you’ve seen a rankings change for?
Run Screaming Frog on your site to find which pages link internally to the
affected pages. Check internal link counts for pages one link away from the affected pages.
Has there been a noticeable link drop to inbound links to the page or group of pages you’ve seen a rankings change for?
Use Ahrefs or Majestic to find the sites that link to your affected pages.
Have any of them suffered recent link drops?
Have they recently updated their site? Did that change their URLs, navigation structure, or on-page content?
If this is your issue…
The key here is to figure out who you lost links from and why, so you can try to regain or replace them.
Can you get the links back?
Do you have a relationship with the site owner who provided the links? Reaching out may help.
Were the links removed during a site update? Maybe it was accidental. Reach
out and see if you can convince them to replace them.
Were the links removed and replaced with links to a different source? Investigate
the new source — how can you make your links more appealing than
theirs? Update your content and reach out to the linking site owner.
Can you convince your internal team to invest in new links to quickly replace the old ones?
Show your manager(s) how much a drop in link count affected your rankings and ask for the resources it’ll take to replace them.
This will be tricky if you were the one to build the now-lost links in the
first place, so if you did, make sure you’ve put together a strategy to
build longer-term ones next time.
3. Did you change the affected page(s)?
If you or your team changed the affected pages recently, Google may not
think that they’re as relevant to the target keyword as they used to be.
Did you change the URL?
DO NOT CHANGE URLS. URLs act as unique identifiers for Google; a new URL means a new page, even if the content is the same.
Has the target keyword been removed from the page title, H1, or H2s?
Is the keyword density for the target keyword lower than it used to be?
Can Google read all of the content on the page?
Look at Google’s cache by searching for cache:www.yourdomain.com/your-page to see what Google sees.
Can Google access your site? Check Google Search Console for server and crawl reports.
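Keyword density is easy to check yourself against the old and new copy. Here's a rough sketch using plain word matching; real relevance scoring is far more sophisticated, so treat this only as a before/after comparison (the page copy below is hypothetical):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of a (possibly multi-word) keyword divided by the
    total word count of the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = re.findall(r"[a-z0-9']+", keyword.lower())
    if not words or not kw:
        return 0.0
    # Count phrase matches by sliding a window of len(kw) over the words
    hits = sum(
        words[i:i + len(kw)] == kw
        for i in range(len(words) - len(kw) + 1)
    )
    return hits / len(words)

# Hypothetical page copy, before and after an edit
old_copy = "car insurance quotes for cheap car insurance online"
new_copy = "compare quotes for cheap coverage online"
```

Run it over the old and new body copy (and titles and headings) to see whether an edit quietly diluted the target keyword.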
If this is your issue…
Good news! You can probably revert your site and regain the traffic you’ve lost.
If you changed the URL, see if you can change it back. If not, make sure the old URL is 301 redirecting to the new URL.
If you changed the text on the page, try reverting it back to the old
text. Wait until your rankings are back up, then try changing the text
again, this time keeping keyword density in mind.
If Google can’t read all of the content on your page, THIS IS A BIG DEAL.
Communicate that to your dev team. (I’ve found dev teams often
undervalue the impact of SEO, but “Googlebot can’t read the page” is a
pretty understandable, impactful problem.)
4. Did you change internal links to the affected page(s)?
If you or your team added or removed internal links, that could change the
way link equity flows through your site, changing Google’s perceived
value of the pages on your site.
Did you or your team recently update site navigation anywhere? Some common locations to check:
Suggested blog posts
Did you or your team recently update key pages on your site that link to target pages? Some pages to check:
Top category pages
Linkbait blog posts or articles
Did you or your team recently update anchor text on links to target pages? Does it still include the target keyword?
If this is your issue…
Figure out how many internal links have been removed from pointing to your
affected pages. If you have access to the old version of your site, run
Screaming Frog (or a similar crawler) on the new and old versions of
your site so you can compare inbound link counts (referred to as inlinks
in SF). If you don’t have access to the old version of your site, take a
couple of hours to compare navigation changes and mark down wherever
the new layout may have hurt the affected pages.
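With inlink counts exported from both crawls, a small diff makes the losses obvious. Here's a sketch assuming you've already loaded each export into a URL-to-count dictionary; the URLs and counts are hypothetical:

```python
def inlink_losses(old_counts, new_counts):
    """Return {url: (old_count, new_count)} for every URL whose
    internal inlink count dropped between two crawls."""
    return {
        url: (old, new_counts.get(url, 0))
        for url, old in old_counts.items()
        if new_counts.get(url, 0) < old
    }

# Hypothetical inlink counts from the old and new site crawls
old_crawl = {"/widgets/": 40, "/widgets/blue/": 12, "/about/": 5}
new_crawl = {"/widgets/": 40, "/widgets/blue/": 3, "/about/": 5}
```

Pages that show up in the result are the ones whose lost internal links you'll want to restore or replace.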
How you fix the
problem depends on how much impact you have on the site structure. It’s
best to fix the issue in the navigational structure of the site, but
many of us SEOs are overruled by the UX team when it comes to primary
navigation. If that’s the case for you, think about systematic ways to
add links where you can control the content. Some common options:
In the product description
In blog posts
In the footer (since, as UX will generally admit, few people use the footer)
Keep in mind that removing links and adding them back later, or from
different places on the site, may not have the same effect as the
original internal links. You’ll want to keep an eye on your rankings,
and add more internal links than the affected pages lost, to make sure
you regain your Google rankings.
5. Google’s user feedback says you should rank differently.
Google is using machine learning to determine rankings. That means they’re at
least in part measuring the value of your pages based on their
click-through rate from SERPs and how long visitors stay on your page
before returning to Google.
Did you recently add a popup that is increasing bounce rate?
Is the page taking longer to load?
Check server response time. People are likely to give up if nothing happens for a few seconds.
Check full page load. Have you added something that takes forever to load and is causing visitors to give up quickly?
Have you changed your page titles? Is that lowering CTR? (I optimized page titles in late November, and that one change moved the average rank of 500 pages up from 12 to 9. One would assume things can go in reverse.)
If this is your issue…
If the issue is a new popup, do your best to convince your marketing team to test a different type of popup. Some options:
Stable banners at the top or bottom of the page (with a big CLICK ME button!)
If your page is taking longer to load, you’ll need the dev team. Put
together the lost value from fewer SEO conversions now that you’ve lost
some rankings and you’ll have a pretty strong case for dev time.
If you’ve changed your page titles, change them back, quick! Mark this
test as a dud, and make sure you learn from it before you run your next test.
6. Your competition made a change.
You may have changed rank not because you did anything, but because your
competition got stronger or weaker. Use your ranking tool to identify
competitors that gained or lost the most from your rankings change. Use a
tool like Versionista (paid, but worth it) or Wayback Machine (free, but spotty data) to find changes in your competitors’ sites.
Which competitors gained or lost the most as your site’s rankings changed?
Has that competition gained or lost inbound links? (Refer to #2 for detailed questions)
Has that competition changed their competing page? (Refer to #3 for detailed questions)
Has that competition changed their internal link structure? (Refer to #4 for detailed questions)
Has that competition started getting better click-through rates or dwell
time to their pages from SERPs? (Refer to #5 for detailed questions)
If this is your issue…
You’re probably fuming, and your managers are probably fuming at you. But
there’s a benefit to this: you can learn about what works from your
competitors. They did the research and tested a change, and it paid off
for them. Now you know the value! Imitate your competitor, but try to do
it better than them this time — otherwise you’ll always be playing catch-up.
Now you know what to do
You may still be panicking, but hopefully
this post can guide you to some constructive solutions. I find that the
best response to a drop in rankings is a good explanation and a plan.
And, to the Moz community of other brilliant SEOs: comment below if you see something I’ve missed!
Domain Authority (DA) is a search engine ranking score developed by Moz that
predicts how well a website will rank on search engine result pages
(SERPs). A Domain Authority score ranges from one to 100, with higher
scores corresponding to a greater ability to rank.
Domain Authority is calculated by evaluating multiple factors, including
linking root domains and number of total links, into a single DA score.
This score can then be used when comparing websites or tracking the
“ranking strength” of a website over time. Domain Authority is not a metric used by Google in determining search rankings and has no effect on the SERPs.
You can view a website’s DA by using MozBar (a free Chrome-extension), Link Explorer (a backlink analysis tool), the SERP Analysis section of Keyword Explorer, and dozens of other SEO tools across the web.
How is Domain Authority scored?
We score Domain Authority on a 100-point logarithmic scale. Thus, it’s
significantly easier to grow your score from 20 to 30 than it is to grow
from 70 to 80.
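A toy model makes the logarithmic effect concrete. The formula below is purely illustrative (Moz's actual calculation is a machine learning model, not this equation); it just shows how each 10-point jump requires roughly an order of magnitude more raw link strength:

```python
import math

# Illustrative only: invert a 100-point logarithmic scale back to a
# raw "link strength" value. MAX_STRENGTH is an arbitrary assumption.
MAX_STRENGTH = 10_000_000

def links_needed(score):
    return 10 ** (score / 100 * math.log10(1 + MAX_STRENGTH)) - 1

low_jump = links_needed(30) - links_needed(20)   # going from 20 to 30
high_jump = links_needed(80) - links_needed(70)  # going from 70 to 80
```

Under this toy scale, moving from 70 to 80 takes thousands of times more raw link strength than moving from 20 to 30.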
What is a “good” Domain Authority?
Generally speaking, sites with a very large number of high-quality external links
(such as Wikipedia or Google.com) are at the top end of the Domain
Authority scale, whereas small businesses and websites with fewer inbound links may have a much lower DA score. Brand-new websites will always start with a Domain Authority score of one.
While Domain Authority is meant to be a predictor of a site’s ranking
ability, having a very high DA score shouldn’t be your only goal. Look
at the DA scores for the sites you’re directly competing with in the
SERPs and aim to have a higher score than your competitors. It’s best
used as a comparative metric (rather than an absolute, concrete
score) when doing research in the search results and determining which
sites may have more powerful/important link profiles than others. Because it’s a comparative tool, there isn’t necessarily a “good” or “bad” Domain Authority score.
How to use Domain Authority correctly
Domain Authority vs. Page Authority
Whereas Domain Authority measures the predictive ranking strength of entire domains or subdomains, Page Authority measures the strength of individual pages.
Where can you find Domain Authority?
Domain Authority metrics are incorporated into dozens of SEO and online marketing platforms across the web.
Domain Authority is based on data from our Link Explorer web index and uses
dozens of factors in its calculations. The actual Domain Authority
calculation itself uses a machine learning model to predictively find a
“best fit” algorithm that most closely correlates our link data with
rankings across thousands of actual search results that we use as
standards to scale against.
Since Domain Authority is based on machine
learning calculations, your site’s score will often fluctuate as more,
less, or different data points are used in the calculation — for
instance, if Facebook were to acquire a billion new links, everyone’s PA
and DA would drop relative to Facebook. For this reason, keep in mind
that you should always use Domain Authority as a relative metric
to compare against the link profiles of other sites, as opposed to an
absolute value scoring the efficacy of your internal SEO efforts.
How do I influence Domain Authority?
Domain Authority is difficult to influence directly. It is made up of an
aggregate of metrics and link data that have an impact on the authority
score. This was done intentionally; this metric is meant to approximate
how competitive a given site is in Google search results. Since Google
takes a lot of factors into account, a metric that tries to calculate it
must incorporate a lot of factors as well.
The best way to influence the Domain Authority metric is to improve your
overall SEO. In particular, you should focus on your link profile by getting more links from other well-linked-to pages.
Why did my Domain Authority change?
Because Domain Authority (and, for that matter, Page Authority)
is comprised of multiple metrics and calculations, pinpointing the
exact cause of a change can be a challenge. If your score has gone up or
down, there are many potential influencing factors, including things like:
Your link profile growth hasn’t yet been captured in our web index.
The highest-authority sites experienced substantial link growth, skewing the scaling process.
You earned links from places that don’t contribute to Google ranking.
We crawled (and included in our index) more or fewer of your linking domains than we had previously.
Your Domain Authority is on the lower end of the scoring spectrum and is thus more impacted by scaling fluctuation.
You can read more about how to interpret these (and other) fluctuations in Authority scores here.
The key to understanding Page and Domain Authority fluctuations is that
these metrics don’t exist in a vacuum — they depend on many positive and
negative factors so that even if a given site improves its SEO, its
Authority score(s) may not always reflect it. A good metaphor to help
understand why is how “best of” rankings work. Let’s look at an example:
If Singapore has the best air quality in 2015, and improves it even
further in 2016, are they guaranteed to remain at #1? What if Denmark
also improves its air quality, or New Zealand (which, say, had been left
out of the rankings in 2015) joins the rating system? Maybe countries
2–10 all improved dramatically and Singapore has now fallen to #11, even
though they technically got better, not worse. Because there are many
other factors at play, Singapore’s ranking could change in spite of any
action (or inaction) whatsoever on their part.
(and Page Authority) work in a similar fashion. Since they’re scaled on a
100-point system, after each update, the recalculations mean that
the Authority score of a given page/site could go down even if that
page/site has improved their link quantity and quality. Such is the
nature of a relative, scaled system. As such — and this is important
enough that we’ll emphasize it once more — Authority scores are best viewed as comparative rather than absolute metrics.
How can you effectively apply link metrics like Domain Authority and Page Authority alongside your other SEO metrics? Where and when does it make sense to take them into account, and what exactly do
they mean? In today’s Whiteboard Friday, Rand answers these questions
and more, arming you with the knowledge you need to better understand
and execute your SEO work.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about when and how to use Domain Authority and Page Authority and link count metrics.
So many of you have written to us at Moz over the years, and certainly I go
to lots of conferences and events and speak to folks who are like,
“Well, I’ve been measuring my link building activity with DA,” or, “Hey,
I got a high DA link,” and I want to confirm when is it the right time
to be using something like DA or PA or a raw link count metric, like
number of linking root domains or something like Spam Score or a traffic estimation, these types of metrics.
I’m going to walk you through kind of these three — Page Authority,
Domain Authority, and linking root domains — just to get a refresher
course on what they are. Page Authority and Domain Authority are
actually a little complicated. So I think that’s worthwhile. Then we’ll
chat about when to use which metrics. So I’ve got sort of the three
primary things that people use link metrics for in the SEO world, and
we’ll walk through those.
So to start, Page Authority is basically — you can see I’ve
written a ton of different little metrics in here — linking URLs,
linking root domains, MozRank, MozTrust,
linking subdomains, anchor text, linking pages, followed links, nofollowed
links, 301s, 302s, new versus old links, TLD, domain name,
branded domain mentions, Spam Score, and many, many other metrics.
Basically, what PA is, is it’s every metric that we could
possibly come up with from our link index all taken together and then
thrown into a model with some training data. So the training
data in this case, quite obviously, is Google search results, because
what we want the Page Authority score to ultimately be is a predictor of
how well a given page is going to rank in Google search results
assuming we know nothing else about it except link data. So this is
using no on-page data, no content data, no engagement or visit data,
none of the patterns or branding or entity matches, just link data.
So this is everything we possibly know about a page from its link
profile and the domain that page is on, and then we insert that in as
the input alongside the training data. We have a machine learning model
that essentially learns against Google search results and builds the
best possible model it can. That model, by the way, throws away some of
this stuff, because it’s not useful, and it adds in a bunch of this
stuff, like vectors or various attributes of each one. So it might say,
“Oh, anchor text distribution, that’s actually not useful, but Domain
Authority ordered by the root domains with more than 500 links to them.”
I’m making stuff up, right? But you could have those sorts of filters
on this data and thus come up with very complex models, which is what
machine learning is designed to do.
All we have to worry about
is that this is essentially the best predictive score we can come up
with based on the links. So it’s useful for a bunch of things. If we’re
trying to say how well do we think this page might rank independent of
all non-link factors, PA, great model. Good data for that.
Domain Authority is once you have the PA model in your head and
you’re sort of like, “Okay, got it, machine learning against Google’s
results to produce the best predictive score for ranking in Google.” DA is just the PA model at the root domain level. So
not subdomains, just root domains, which means it’s got some weirdness.
It can’t, for example, say that randfishkin.blogspot.com is different
than www.blogspot.com. But obviously, a link from www.blogspot.com
is way more valuable than from my personal subdomain at Blogspot or
Tumblr or WordPress or any of these hosted subdomains. So that’s kind of
an edge case that unfortunately DA doesn’t do a great job of handling.
What it’s good for is it’s relatively well-suited to be
predictive of how a domain’s pages will rank in Google. So it removes
all the page-level information, but it’s still operative at the domain
level. It can be very useful for that.
Linking Root Domain
Then linking root domains is the simplest one. This is basically a count of all the unique root domains with at least one link on them that point to a given page or a site.
So if I tell you that this URL A has 410 linking root domains, that
basically means that there are 410 domains with at least one link
pointing to URL A.
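In code, counting linking root domains is just deduplicating the registrable domain across linking URLs. Here's a naive sketch with made-up example links; real tools use the Public Suffix List, so multi-part TLDs like .co.uk would need extra handling:

```python
from urllib.parse import urlparse

def linking_root_domains(linking_urls):
    """Count unique root domains among a set of linking URLs.
    Naive: keeps only the last two host labels of each URL."""
    roots = {
        ".".join(urlparse(url).netloc.split(".")[-2:])
        for url in linking_urls
    }
    return len(roots)

# Three links, but only two unique root domains
links = [
    "https://blog.example.com/post-1",
    "https://www.example.com/resources",
    "http://news.another-site.org/story",
]
```

Two links from the same root domain count once, which is exactly why this metric resists inflation from sitewide links.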
What I haven’t told you is whether they’re followed or nofollowed.
Usually, this is a combination of those two unless it’s
specified. So even a nofollowed link could go into the linking root
domains, which is why you should always double check. If you’re using
Ahrefs or Majestic or Moz and you hover on the whatever, the little
question mark icon next to any given metric, it will tell you what it
includes and what it doesn’t include.
When to use which metric(s)
All right. So how do we use these?
Well, for month over month link building performance, which is
something that a lot of folks track, I would actually not suggest making
DA your primary one. This is for a few reasons. So Moz’s index, which
is the only thing currently that calculates DA or a machine
learning-like model out there among the major toolsets for link data,
only updates about once every month. So if you are doing your report
before the DA has updated from the last link index, that can be quite misleading.
Now, I will say we are only a few months away from a new index
that’s going to replace Mozscape that will calculate DA and PA and all
these other things much, much more quickly. I know that’s been something
many folks have been asking for. It is on its way.
But in the meantime, what I recommend using is:
1. Linking root domains, the count of linking root domains and how that’s grown over time.
2. Organic rankings for your targeted keywords.
I know this is not a direct link metric, but this really helps to tell
you about the performance of how those links have been affected. So if
you’re measuring month to month, it should be the case that any links
you’ve earned in a 20 or 30-day period Google probably has counted and
recognized within a few days of finding them, and Google is pretty good
at crawling nearly the whole web within a week or two weeks. So this is
going to be a reasonable proxy for how your link building campaign has
helped your organic search campaign.
3. The distribution of Domain Authority.
So I think, in this case, Domain Authority can be useful. It wouldn’t
be my first or second choice, but I think it certainly can belong in a
link building performance report. It’s helpful to see the high DA links
that you’re getting. It’s a good sorting mechanism to sort of say,
“These are, generally speaking, more important, more authoritative links.”
4. Spam Score. I like this
as well, because if you’ve been doing a lot of link building, it is the
case that Domain Authority doesn’t penalize or doesn’t lower its score
for a high Spam Score. It will show you, “Hey, this is an authoritative
site with a lot of DA and good-looking links, but it also looks quite
spammy to us.” So, for example, you might see that something has a DA of
60, but a Spam Score of 7 or 8, which might be mildly concerning. I
start to really worry when you get to like 9, 10, or 11.