Search engine optimization and internet marketing often have you reaching for a tool or two from an unfamiliar source. People who love analytics and metrics may need data that is only available from paid tools. Fortunately, Google offers tools that can cover many SEO needs. Some of these are features built into our daily search tasks, but we often fail to use them.
Ryan Johnson, the Senior Sales Manager from Digital Services, has carefully chosen Google tools to help your SEO campaign and internet marketing strategies succeed:
1. Google Analytics.
This tool can help you gain insight into the right content to include in future posts. It contains metrics that surface useful information for planning.
2. Google Autocomplete Feature.
This feature enables you
to see some completed search suggestions whenever
you are in the midst of typing something in the Google search bar.
These suggestions come from Google's own search metrics, giving you and other marketers hints about queries that have historically had high search volume. Compiling a long list of auto-completed suggestions can seed content ideas that bring high rankings.
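As a rough sketch, these suggestions can also be collected programmatically. The snippet below targets Google's unofficial "suggest" endpoint (`suggestqueries.google.com`), which is not a documented API and may change at any time; the sample response here is made up for illustration rather than captured live:

```python
import json
from urllib.parse import urlencode

# Unofficial, undocumented endpoint; responses are shaped like
# ["query", ["suggestion 1", "suggestion 2", ...]].
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_suggest_url(query: str) -> str:
    """Build the autocomplete request URL for a seed query."""
    return SUGGEST_URL + "?" + urlencode({"client": "firefox", "q": query})

def parse_suggestions(raw_json: str) -> list[str]:
    """Extract the suggestion list from a raw suggest response."""
    payload = json.loads(raw_json)
    return payload[1]

# Illustrative sample response (not live data):
sample = '["dog food", ["dog food", "dog food for skin allergies"]]'
print(parse_suggestions(sample))
```

Feeding each suggestion back in as a new seed query is a quick way to build the long list of ideas described above.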
3. Google Search Console.
This tool can help you
monitor and improve your ranking in the search
results for particular keywords. For instance, open "Search Analytics" in the Search Console dashboard to view the clicks, click-through rate, impressions, and average search position for up to 999 keywords related to your website.
4. Google's "Searches Related To" Section.
This section, found at the bottom of the Google search results page, can be your next key to success. It lists related queries in longer form, and these can become your next blog topics or post titles. For instance, a search for "dog food" may surface phrases like "best dog food for skin allergies." Collecting a long list of these ideas can help you build a blog that is authoritative in a particular niche.
5. Google AdWords.
If keywords already have a high search volume on Google, there is little need to spend money on a separate SEO tool. Anyone running paid search ads can access this data from the keywords tab in their Google Analytics campaign. Here you can find the actual search queries worth investing in. Google AdWords' keyword research feature is one of the best and most reliable.
People carrying out search engine optimization may need a tool or more. Countless tools are emerging from different sources that promise to deliver but at times scam users and fail to help. Google, however, offers tools that anyone doing SEO and internet marketing can use. Some of these tools exist as features of the search engine itself but often go ignored. From the tools recommended by Semalt, it is clear that numerous SEO tasks, such as keyword research and working with Google's bots and crawlers, are possible using tools Google itself provides. As a result, there is no need to gamble on unproven tools.
Whenever Google releases information about the procedures that govern website rankings, I advise clients not to change anything hastily. Our company has never been involved in search-engine junk tactics like the creation of insignificant, low-quality incoming links. Creating exclusive content and quality links has always been our motivation.
Still, clients regularly come to us saying that a Google update, like the Penguin 2.1 refresh, hurt their rankings. Clients usually assume the problem lies somewhere in their own process, but we discover that most of them had bad incoming links created by the previous SEO company they worked with.
Incoming links are an
essential element in determining a website's ranking in Google. In the past, a website with more incoming links would rank higher than a competitor with fewer backlinks. Firms would create masses of incoming links regardless of quality or relevance, since Google paid little attention to the relationship between the links and the websites.
Google then revolutionized its algorithm. Panda was introduced in early 2011, and Penguin followed in 2012. All of a sudden, the quality of incoming links mattered more than the quantity, and the firms that had amassed large numbers of backlinks were exposed for using the wrong techniques in creating content. According to Semalt expert Andrew Dyhan, an online marketing specialist, Penguin finished what Panda started. Many websites witnessed a drastic fall in their rankings, revenue, and traffic, and one of our clients was caught up in it. This came as a great shock to the affected webmasters whose sites got hit.
Bad incoming links can look harmless until, one day, they hurt a great deal. Clients rarely grasp the urgency of getting rid of backlinks until we show them an analysis of their account. In the early stages, removing bad links is neither very time-consuming nor expensive; later, once the damage is bad enough, removing them becomes an urgent priority you would pay almost any amount to fix. We usually see positive results soon after we embark on bad-link removal, but unfortunately, some clients still expect immediate gains.
To retain clients, any SEO firm should immediately look out for bad links and eliminate them. We now tell potential customers about bad-link removal up front and, where necessary, advise them to invest in it for the first couple of months. The greater lesson is to understand where Google is headed and steer clients in that direction. For SEO customers, staying up to date with the trend is no longer optional if they want to succeed. Likewise, dismissing the need for help from an SEO firm will very likely hand your competitor the win. Following up with professionals is therefore important.
Search engine optimization (SEO) experts understand the importance of link building. Essentially, link building is one of the cornerstones of an effective SEO strategy, because Google's algorithm relies heavily on inbound links to determine a website's authority, which in turn influences organic search rankings.
Links are a critical factor in increasing brand visibility and referral traffic. In spite of this, a recent survey indicates that only 62 percent of marketers engage in link building. So why do some marketers avoid this strategy? Andrew Dyhan, the Customer Success Manager of Digital Services, explains the factors that make link building a critical aspect of SEO.
Fear of Google penalties is the primary reason many marketers avoid building links. The fear is understandable, but in most cases the threat is overrated. Google's penalties stem from its Penguin update: if you build links that violate Google's terms of service, the search engine responds by burying your website in a deep sea of content where users won't find you. That translates into less traffic and lower rankings.
So, what are the unhealthy links that earn you penalties?
Links from bad sites
Low-authority sources and spammy sites are the first type of links you want to avoid. At the most basic level, the value of a link is determined by the authority of the site it comes from.
In other words, if you source links from high authority sites, you
command more authority on your site. On the other hand, if you
build links from a questionable or spammy site, the authority of your
domain takes a beating.
Contextually inappropriate links
Unlike in the past,
Google’s algorithms are advanced enough to detect
how content fits the needs of the audience and the natural use of
language. In simple words, if you link to content that has nothing to do with the piece, Google will flag you and penalize you for trying to mislead users.
Keyword-stuffed anchor text
Initially, it was common practice to include keywords in the anchor
text of your links. Today, doing that might get you penalized by Google
because SEO enthusiasts started abusing the practice by stuffing
keywords into links where they did not belong. Despite this, you can still optimize your anchor text; it just has to be contextually appropriate for the link.
Spammy links include
posting comments on a forum with just a link
to your website and no other content. Why? Because the main goal of such
a link is to drive traffic to your site without giving any
value to readers. In addition, Google can penalize you if you place
links on the same pages of the site repeatedly.
Links from schemes
Any link you build with
the sole intention of driving traffic to
your site without giving the user any valuable information is suspect
and subject to Google penalties. There are a number of such
links including reciprocal links and link wheels where the intention is
to pass authority to sites within the wheel. To find out
what Google considers as link schemes, read their article on the subject
to avoid getting in trouble with the search engine.
Other techniques of manipulating site rankings
Normally, Google’s main
aim is to reduce the possibility of SEO
enthusiasts manipulating their site rankings using links. As long as
you are using links in a way that is beneficial to users, there
is nothing to worry about. However, if you are trying to use underhand methods to drive traffic and manipulate rankings, you are setting yourself up for a Google penalty.
Ultimately, an official Google penalty is a manual action, similar to blacklisting. This is what strikes fear into every webmaster, but most of the time Google's heavy hand only comes down on intentional offenders. Webmasters often panic and think they have been penalized when their site simply experiences a drop in traffic. If you avoid running afoul of Google's guidelines, or work with a specialized SEO services provider who tracks your website's performance, you will have nothing to worry about.
A few weeks ago, rankings for pages on a key section of my
site dropped an average of a full position in one day. I’ve been an SEO
for 7 years now, but I still ran around like a chicken with my head cut
off, panicked that I wouldn’t be able to figure out my mistake. There
are so many things that could’ve gone wrong: Did I or my team
unintentionally mess with internal link equity? Did we lose links? Did
one of Google’s now-constant algorithm updates screw me over?
Since the drop happened to a group of pages, I made the
assumption it had to do with our site or page structure (it didn’t). I
wasted a good day focused on technical SEO. Once I realized my error, I
decided to put together a guide to make sure that next time, I’ll do my
research effectively. And you, my friends, will reap the rewards.
First, make sure there’s actually a rankings change
Okay, I have to start with this: before you go down this rabbit hole of rankings changes, make sure there was actually a rankings change.
Your rankings tracker may not have localized properly, or have picked
up on one of Google’s rankings experiments or personalization.
Has organic traffic dropped to the affected page(s)?
We’re starting here because this is the most reliable data you
have about your site. Google Search Console and rankings trackers are
trying to look at what Google’s doing; your web analytics tool is just
tracking user counts.
Compare organic traffic to the affected page(s) week-over-week
both before and after the drop, making sure to compare similar days of the week.
Is the drop more significant than most week-over-week changes?
Is the drop over a holiday weekend? Is there any reason search volume could’ve dropped?
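A quick way to sanity-check the first two questions is a small script. In this sketch the session numbers and the 10% "typical swing" threshold are illustrative; use your own site's normal weekly variance:

```python
# Sketch: flag a suspicious organic-traffic drop by comparing totals
# against the same weekdays one week earlier.
def week_over_week_change(this_week: list[int], last_week: list[int]) -> float:
    """Percent change in total sessions vs. the same days last week."""
    return (sum(this_week) - sum(last_week)) / sum(last_week) * 100

def looks_like_real_drop(change_pct: float, typical_swing_pct: float = 10.0) -> bool:
    """Only worry when the drop exceeds your site's normal weekly swing."""
    return change_pct < -typical_swing_pct

last_week = [520, 540, 510, 530, 500, 210, 190]   # Mon..Sun sessions (made up)
this_week = [410, 400, 395, 405, 390, 180, 160]

change = week_over_week_change(this_week, last_week)
print(round(change, 1), looks_like_real_drop(change))
```

If the computed drop stays inside your normal swing, keep watching rather than panicking.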
Use the Search Analytics section to see clicks, impressions, and average position for a given keyword, page, or combo.
Does GSC show a similar rankings drop to what you saw in your
rankings tracker? (Make sure to run the report with the selected date range.)
Does your rankings tracker show a sustained rankings drop?
I recommend tracking rankings daily for your important keywords,
so you’ll know if the rankings drop is sustained within a few days.
If you’re looking for a tool recommendation, I’m loving Stat.
If you’ve just seen a drop in your rankings tool and your
traffic and GSC clicks are still up, keep an eye on things and try not
to panic. I’ve seen too many natural fluctuations to go to my boss as
soon as I see an issue.
But if you’re seeing that there’s a rankings change, start going through this guide.
Figure out what went wrong
1. Did Google update their algorithm?
Google rolls out a
new algorithm update at least every day, most silently. Good news is,
there are leagues of SEOs dedicated to documenting those changes.
Are there any SEO articles or blogs talking about a change around the date you saw the change? Check out:
Do you have any SEO friends who have seen a change? Pro tip: Make friends with SEOs who run sites similar to yours, or in your industry. I can’t tell you how helpful it’s been to talk frankly about tests I’d like to run with SEOs who’ve run similar tests.
If this is your issue…
The bad news here is that if Google has updated their algorithm, you're going to have to change your approach to SEO in one way or another. Your next move is to put together a strategy to either pull yourself out of this penalty, or at the very least to protect your site from the next update.
2. Did your site lose links?
Pull the lost links report from Ahrefs or Majestic. They’re the most reputable link counters out there, and their indexes are updated daily.
Has there been a noticeable site-wide link drop?
Has there been a noticeable link drop to the page or group of pages you’ve seen a rankings change for?
Has there been a noticeable link drop to pages on your site that link to the page or group of pages you’ve seen a rankings change for?
Run Screaming Frog on your site to find which pages link internally to the affected pages, then check internal link counts for pages one link away from the affected pages.
Has there been a noticeable link drop to inbound links to the page or group of pages you’ve seen a rankings change for?
Use Ahrefs or Majestic to find the sites that link to your affected pages.
Have any of them suffered recent link drops?
Have they recently updated their site? Did that change their URLs, navigation structure, or on-page content?
If this is your issue…
The key here is to figure out who you lost links from and why, so you can try to regain or replace them.
Can you get the links back?
Do you have a relationship with the site owner who provided the links? Reaching out may help.
Were the links removed during a site update? Maybe it was accidental. Reach
out and see if you can convince them to replace them.
Were the links removed and replaced with links to a different source? Investigate
the new source — how can you make your links more appealing than
theirs? Update your content and reach out to the linking site owner.
Can you convince your internal team to invest in new links to quickly replace the old ones?
Show your manager(s) how much a drop in link count affected your rankings and ask for the resources it’ll take to replace them.
This will be tricky if you were the one to build the now-lost links in the
first place, so if you did, make sure you’ve put together a strategy to
build longer-term ones next time.
3. Did you change the affected page(s)?
If you or your team changed the affected pages recently, Google may not
think that they’re as relevant to the target keyword as they used to be.
Did you change the URL?
DO NOT CHANGE URLS. URLs act as unique identifiers for Google; a new URL means a new page, even if the content is the same.
Has the target keyword been removed from the page title, H1, or H2s?
Is the keyword density for the target keyword lower than it used to be?
Can Google read all of the content on the page?
Look at Google’s cache by searching for cache:www.yourdomain.com/your-page to see what Google sees.
Can Google access your site? Check Google Search Console for server and crawl reports.
If this is your issue…
Good news! You can probably revert your site and regain the traffic you’ve lost.
If you changed the URL, see if you can change it back. If not, make sure the old URL is 301 redirecting to the new URL.
If you changed the text on the page, try reverting it back to the old
text. Wait until your rankings are back up, then try changing the text
again, this time keeping keyword density in mind.
If Google can’t read all of the content on your page, THIS IS A BIG DEAL.
Communicate that to your dev team. (I’ve found dev teams often
undervalue the impact of SEO, but “Googlebot can’t read the page” is a
pretty understandable, impactful problem.)
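If you plan to retry the copy change with keyword density in mind, a rough density check is easy to script. The copy and the keyword below are invented examples, and "occurrences per 100 words" is just one simple way to define density:

```python
import re

# Sketch: compare keyword density before and after a copy change.
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of the keyword phrase per 100 words of copy."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return hits / max(len(words), 1) * 100

old_copy = "Dog food guide: choose dog food your dog will love. Dog food matters."
new_copy = "A guide to choosing meals your pet will love. Nutrition matters."

print(keyword_density(old_copy, "dog food"), keyword_density(new_copy, "dog food"))
```

A rewrite that drops the target phrase entirely, as in `new_copy` here, is exactly the kind of change that can cost relevance.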
4. Did you change internal links to the affected page(s)?
If you or your team added or removed internal links, that could change the
way link equity flows through your site, changing Google’s perceived
value of the pages on your site.
Did you or your team recently update site navigation anywhere? Some common locations to check:
Suggested blog posts
Did you or your team recently update key pages on your site that link to target pages? Some pages to check:
Top category pages
Linkbait blog posts or articles
Did you or your team recently update anchor text on links to target pages? Does it still include the target keyword?
If this is your issue…
Figure out how many internal links have been removed from pointing to your
affected pages. If you have access to the old version of your site, run
Screaming Frog (or a similar crawler) on the new and old versions of
your site so you can compare inbound link counts (referred to as inlinks
in SF). If you don’t have access to the old version of your site, take a
couple of hours to compare navigation changes and mark down wherever
the new layout may have hurt the affected pages.
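Once you have inlink counts from the old and new crawls, the comparison itself can be scripted. The URLs and counts below are stand-ins for parsed Screaming Frog exports:

```python
# Sketch: compare "inlinks" counts from two crawls (old vs. new site)
# to spot pages that lost internal links.
def lost_inlinks(old_counts: dict[str, int], new_counts: dict[str, int]) -> dict[str, int]:
    """Map each URL to how many internal inlinks it lost."""
    losses = {}
    for url, old_n in old_counts.items():
        new_n = new_counts.get(url, 0)   # page may have been removed entirely
        if new_n < old_n:
            losses[url] = old_n - new_n
    return losses

old_crawl = {"/shoes/": 42, "/shoes/running/": 18, "/about/": 7}
new_crawl = {"/shoes/": 40, "/shoes/running/": 3, "/about/": 7}

print(lost_inlinks(old_crawl, new_crawl))
```

Pages with the biggest losses are the first candidates for restored navigation links.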
How you fix the
problem depends on how much impact you have on the site structure. It’s
best to fix the issue in the navigational structure of the site, but
many of us SEOs are overruled by the UX team when it comes to primary
navigation. If that’s the case for you, think about systematic ways to
add links where you can control the content. Some common options:
In the product description
In blog posts
In the footer (since, as UX will generally admit, few people use the footer)
Keep in mind that removing links and adding them back later, or from
different places on the site, may not have the same effect as the
original internal links. You’ll want to keep an eye on your rankings,
and add more internal links than the affected pages lost, to make sure
you regain your Google rankings.
5. Google’s user feedback says you should rank differently.
Google is using machine learning to determine rankings. That means they’re at
least in part measuring the value of your pages based on their
click-through rate from SERPs and how long visitors stay on your page
before returning to Google.
Did you recently add a popup that is increasing bounce rate?
Is the page taking longer to load?
Check server response time. People are likely to give up if nothing happens for a few seconds.
Check full page load. Have you added something that takes forever to load and is causing visitors to give up quickly?
Have you changed your page titles? Is that lowering CTR? (I optimized page titles in late November, and that one change moved the average rank of 500 pages up from 12 to 9. One would assume things can go in reverse.)
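For the load-time questions, a minimal timing sketch might look like the following. It measures total fetch time from one machine, not a full browser render; the `data:` URL in the demo just keeps it self-contained, so swap in your page's real URL:

```python
import time
import urllib.request

# Sketch: time the full response for a URL, enough to spot a server
# that has started responding slowly.
def timed_fetch(url: str, timeout: float = 10.0) -> tuple[bytes, float]:
    """Return (body, seconds_elapsed) for a single request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return body, time.perf_counter() - start

# A data: URL keeps the demo self-contained; use your page's URL in practice.
body, seconds = timed_fetch("data:text/plain,hello")
print(len(body), seconds)
```

Run it a few times at different hours before concluding the server has slowed down.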
If this is your issue…
If the issue is a new popup, do your best to convince your marketing team to test a different type of popup. Some options:
Stable banners at the top or bottom of the page (with a big CLICK ME button!)
If your page is taking longer to load, you’ll need the dev team. Put
together the lost value from fewer SEO conversions now that you’ve lost
some rankings and you’ll have a pretty strong case for dev time.
If you’ve changed your page titles, change them back, quick! Mark this test as a dud, and make sure you learn from it before you run your next test.
6. Your competition made a change.
You may have changed rank not because you did anything, but because your
competition got stronger or weaker. Use your ranking tool to identify
competitors that gained or lost the most from your rankings change. Use a
tool like Versionista (paid, but worth it) or Wayback Machine (free, but spotty data) to find changes in your competitors’ sites.
Which competitors gained or lost the most as your site’s rankings changed?
Has that competition gained or lost inbound links? (Refer to #2 for detailed questions)
Has that competition changed their competing page? (Refer to #3 for detailed questions)
Has that competition changed their internal link structure? (Refer to #4 for detailed questions)
Has that competition started getting better click-through rates or dwell
time to their pages from SERPs? (Refer to #5 for detailed questions)
If this is your issue…
You’re probably fuming, and your managers are probably fuming at you. But
there’s a benefit to this: you can learn about what works from your
competitors. They did the research and tested a change, and it paid off
for them. Now you know the value! Imitate your competitor, but try to do
it better than them this time; otherwise you’ll always be playing catch-up.
Now you know what to do
You may still be panicking, but hopefully
this post can guide you to some constructive solutions. I find that the
best response to a drop in rankings is a good explanation and a plan.
And, to the Moz community of other brilliant SEOs: comment below if you see something I’ve missed!
How can you effectively apply link metrics like Domain Authority and Page Authority alongside your other SEO metrics? Where and when does it make sense to take them into account, and what exactly do
they mean? In today’s Whiteboard Friday, Rand answers these questions
and more, arming you with the knowledge you need to better understand
and execute your SEO work.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about when and how to use Domain Authority and Page Authority and link count metrics.
So many of you have written to us at Moz over the years, and certainly I go
to lots of conferences and events and speak to folks who are like,
“Well, I’ve been measuring my link building activity with DA,” or, “Hey,
I got a high DA link,” and I want to confirm when is it the right time
to be using something like DA or PA or a raw link count metric, like
number of linking root domains or something like Spam Score or a traffic estimation, these types of metrics.
I’m going to walk you through kind of these three — Page Authority,
Domain Authority, and linking root domains — just to get a refresher
course on what they are. Page Authority and Domain Authority are
actually a little complicated. So I think that’s worthwhile. Then we’ll
chat about when to use which metrics. So I’ve got sort of the three
primary things that people use link metrics for in the SEO world, and
we’ll walk through those.
So to start, Page Authority is basically — you can see I’ve
written a ton of different little metrics in here — linking URLs,
linking root domains, MozRank, MozTrust,
linking subdomains, anchor text, linking pages, followed links, no
followed links, 301s, 302s, new versus old links, TLD, domain name,
branded domain mentions, Spam Score, and many, many other metrics.
Basically, what PA is, is it’s every metric that we could
possibly come up with from our link index all taken together and then
thrown into a model with some training data. So the training
data in this case, quite obviously, is Google search results, because
what we want the Page Authority score to ultimately be is a predictor of
how well a given page is going to rank in Google search results
assuming we know nothing else about it except link data. So this is
using no on-page data, no content data, no engagement or visit data,
none of the patterns or branding or entity matches, just link data.
So this is everything we possibly know about a page from its link
profile and the domain that page is on, and then we insert that in as
the input alongside the training data. We have a machine learning model
that essentially learns against Google search results and builds the
best possible model it can. That model, by the way, throws away some of
this stuff, because it’s not useful, and it adds in a bunch of this
stuff, like vectors or various attributes of each one. So it might say,
“Oh, anchor text distribution, that’s actually not useful, but Domain
Authority ordered by the root domains with more than 500 links to them.”
I’m making stuff up, right? But you could have those sorts of filters
on this data and thus come up with very complex models, which is what
machine learning is designed to do.
All we have to worry about
is that this is essentially the best predictive score we can come up
with based on the links. So it’s useful for a bunch of things. If we’re
trying to say how well do we think this page might rank independent of
all non-link factors, PA, great model. Good data for that.
Domain Authority is once you have the PA model in your head and
you’re sort of like, “Okay, got it, machine learning against Google’s
results to produce the best predictive score for ranking in Google.” DA is just the PA model at the root domain level. So
not subdomains, just root domains, which means it’s got some weirdness.
It can’t, for example, say that randfishkin.blogspot.com is different
than www.blogspot.com. But obviously, a link from www.blogspot.com
is way more valuable than from my personal subdomain at Blogspot or
Tumblr or WordPress or any of these hosted subdomains. So that’s kind of
an edge case that unfortunately DA doesn’t handle well.
What it’s good for is that it’s relatively well-suited to be
predictive of how a domain’s pages will rank in Google. So it removes
all the page-level information, but it’s still operative at the domain
level. It can be very useful for that.
Linking Root Domain
Then linking root domains is the simplest one. This is basically a count of all the unique root domains with at least one link on them that point to a given page or a site.
So if I tell you that this URL A has 410 linking root domains, that
basically means that there are 410 domains with at least one link
pointing to URL A.
What I haven’t told you is whether they’re followed or nofollowed. Usually, this is a combination of the two unless otherwise specified. So even a nofollowed link could count toward the linking root
domains, which is why you should always double check. If you’re using
Ahrefs or Majestic or Moz and you hover on the whatever, the little
question mark icon next to any given metric, it will tell you what it
includes and what it doesn’t include.
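The linking-root-domains count described above can be sketched in a few lines. Note that the naive "last two labels" rule below breaks on suffixes like .co.uk; real link tools consult the Public Suffix List, but this shows the idea:

```python
from urllib.parse import urlparse

# Sketch: count unique linking root domains from a backlink URL list.
def root_domain(url: str) -> str:
    """Naive root-domain extraction: keep the last two host labels."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def linking_root_domains(backlink_urls: list[str]) -> int:
    return len({root_domain(u) for u in backlink_urls})

backlinks = [
    "https://blog.example.com/post-1",      # same root domain as the next one
    "https://www.example.com/resources",
    "https://news.another-site.org/story",
]
print(linking_root_domains(backlinks))
```

Here three backlinks collapse to two root domains, which is exactly why the metric differs from a raw link count.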
When to use which metric(s)
All right. So how do we use these?
Well, for month over month link building performance, which is
something that a lot of folks track, I would actually not suggest making
DA your primary one. This is for a few reasons. So Moz’s index, which
is the only thing currently that calculates DA or a machine
learning-like model out there among the major toolsets for link data,
only updates about once every month. So if you run your report before DA has updated from the last link index, the numbers can be quite out of date.
Now, I will say we are only a few months away from a new index
that’s going to replace Mozscape that will calculate DA and PA and all
these other things much, much more quickly. I know that’s been something
many folks have been asking for. It is on its way.
But in the meantime, what I recommend using is:
1. Linking root domains, the count of linking root domains and how that’s grown over time.
2. Organic rankings for your targeted keywords.
I know this is not a direct link metric, but this really helps to tell
you about the performance of how those links have been affected. So if
you’re measuring month to month, it should be the case that any links you’ve earned in a 20- or 30-day period Google has probably counted and recognized within a few days of finding them, and Google is pretty good
at crawling nearly the whole web within a week or two weeks. So this is
going to be a reasonable proxy for how your link building campaign has
helped your organic search campaign.
3. The distribution of Domain Authority.
So I think, in this case, Domain Authority can be useful. It wouldn’t
be my first or second choice, but I think it certainly can belong in a
link building performance report. It’s helpful to see the high DA links
that you’re getting. It’s a good sorting mechanism to sort of say,
“These are, generally speaking, more important, more authoritative links.”
4. Spam Score.
I like Spam Score as well, because if you’ve been doing a lot of link building, it is the
case that Domain Authority doesn’t penalize or doesn’t lower its score
for a high Spam Score. It will show you, “Hey, this is an authoritative
site with a lot of DA and good-looking links, but it also looks quite
spammy to us.” So, for example, you might see that something has a DA of
60, but a Spam Score of 7 or 8, which might be mildly concerning. I
start to really worry when you get to like 9, 10, or 11.
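Points 3 and 4 above (DA distribution plus Spam Score) combine naturally in a small report script. The domains, DA values, and Spam Scores below are invented, and the risk threshold follows the rough "7-8 concerning, 9+ worrying" bands mentioned above:

```python
from collections import Counter

# Sketch for a monthly link report: bucket new links by Domain Authority
# and flag ones whose Spam Score looks risky.
def da_distribution(links: list[dict]) -> Counter:
    """Count links per DA decade bucket, e.g. DA 62 -> '60-69'."""
    return Counter(f"{(l['da'] // 10) * 10}-{(l['da'] // 10) * 10 + 9}" for l in links)

def risky_links(links: list[dict], threshold: int = 7) -> list[str]:
    """Domains whose Spam Score meets or exceeds the threshold."""
    return [l["domain"] for l in links if l["spam_score"] >= threshold]

new_links = [
    {"domain": "siteA.com", "da": 62, "spam_score": 2},
    {"domain": "siteB.com", "da": 68, "spam_score": 8},
    {"domain": "siteC.com", "da": 31, "spam_score": 10},
]
print(da_distribution(new_links), risky_links(new_links))
```

Note siteB.com: a healthy DA bucket does not clear it, since DA doesn't lower its score for a high Spam Score.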
Whether you’re brand new to link building or have been
doing it for a while, we’re sure you’ll find something useful in this
guide. The landscape of SEO and link building is always changing, and
today, the importance of building high-quality links has never been
higher. The need to understand and implement high-quality campaigns is
essential if you’re going to compete and thrive online, and that isn’t
going to change any time soon. This guide is designed to get you going
quickly and in the right direction. There is a lot to take in, but we’ve
broken everything up into easy-to-digest chapters and have included
lots of examples along the way. We hope you enjoy The Beginner’s Guide
to Link Building!
Link building is the process of acquiring hyperlinks from
other websites to your own. A hyperlink (usually just called a link) is a
way for users to navigate between pages on the internet. Search engines
use links to crawl the web; they will crawl the links between the
individual pages on your website, and they will crawl the links between
entire websites. There are many techniques for building links, and while
they vary in difficulty, SEOs tend to agree that link building is one
of the hardest parts of their jobs. Many SEOs spend the majority of
their time trying to do it well. For that reason, if you can master the
art of building high-quality links, it can truly put you ahead of both
other SEOs and your competition.
Why is link building important for SEO?
The anatomy of a hyperlink
In order to understand the importance of link building, it’s
important to first understand the basics of how a link is created, how
the search engines see links, and what they can interpret from them.
Start of link tag: Called an
anchor tag (hence the “a”), this opens the link tag and tells search
engines that a link to something else is about to follow.
Link referral location: The
“href” stands for “hyperlink referral,” and the text inside the
quotation marks indicates the URL to which the link is pointing. This
doesn’t always have to be a web page; it could be the address of an
image or a file to download. Occasionally, you’ll see something other
than a URL, beginning with a # sign. These are local links, which take
you to a different section of the page you’re already on.
Visible/anchor text of link:
This is the little bit of text that users see on the page, and on
which they need to click if they want to open the link. The text is
usually formatted in some way to make it stand out from the text that
surrounds it, often with blue color and/or underlining, signaling to
users that it is a clickable link.
Closure of link tag: This signals the end of the link tag to the search engines.
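The parts of the tag described above, the referral location and the anchor text in particular, can be pulled out of real HTML with just the standard library; the sample snippet here is illustrative:

```python
from html.parser import HTMLParser

# Sketch: extract (href, anchor_text) pairs from HTML, the two pieces of
# a link that search engines read.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []       # collected (href, anchor_text) pairs
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:      # only collect text inside <a>...</a>
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = LinkExtractor()
parser.feed('<p>See <a href="https://moz.com/blog">the Moz blog</a> for more.</p>')
print(parser.links)
```

This is essentially what a crawler does at scale: harvest every href on a page, then queue those URLs for its next crawl.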
What links mean for search engines
There are two fundamental ways that the search engines use links:
To discover new web pages
To help determine how well a page should rank in their results
Once search engines have crawled pages on the web, they can
extract the content of those pages and add it to their indexes. In this
way, they can decide if they feel a page is of sufficient quality to be
ranked well for relevant keywords (Google created a short video
to explain that process). When they are deciding this, the search
engines do not just look at the content of the page; they also look at
the number of links pointing to that page from external websites and the
quality of those external websites. Generally speaking, the more
high-quality websites that link to you, the more likely you are to rank
well in search results.
Links as a ranking factor are what allowed Google to start to
dominate the search engine market back in the late 1990s. One of
Google’s founders, Larry Page, invented PageRank,
which Google used to measure the quality of a page based in part on the
number of links pointing to it. This metric was then used as part of
the overall ranking algorithm and became a strong signal because it was a
very good way of determining the quality of a page.
It was so effective because it was based upon the idea that a
link could be seen as a vote of confidence about a page, i.e., it
wouldn’t get links if it didn’t deserve to. The theory is that when
someone links to another website, they are effectively saying it is a
good resource. Otherwise, they wouldn’t link to it, much in the same way
that you wouldn’t send a friend to a bad restaurant.
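The "vote of confidence" idea can be illustrated with a toy PageRank computation. The three-page web and the 0.85 damping factor below are a simplified sketch of the published algorithm, not Google's actual production system:

```python
# Sketch: power-iteration PageRank over a made-up three-page web.
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # each page splits its current rank among the pages it links to
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# A links to B and C; B links to C; C links back to A.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
print({p: round(r, 3) for p, r in ranks.items()})
```

Page C ends up with the highest score because it collects "votes" from both A and B, which is the intuition behind links as a quality signal.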
However, SEOs soon discovered how to manipulate PageRank and
search results for chosen keywords. Google started actively trying to
find ways to discover websites which were manipulating search results,
and began rolling out regular updates which were specifically aimed at filtering out websites that didn’t deserve to rank.
This has also led to Google starting to discount a number of
link building techniques that were previously deemed fine, for example,
submitting your website to web directories and getting a link in return.
This was a technique that Google actually recommended at one point, but
it became abused and overused by SEOs, so Google stopped passing as much value from those sorts of links.
More recently, Google has actively penalized the rankings of websites that have overused these techniques, often referred to as over-optimisation, in their link building. Google’s
regular Penguin updates
are one such example. Knowing which link building techniques to avoid
and stay within Google’s guidelines is an important subject that we’ll
discuss later in this guide.
We don’t know the full algorithm that Google uses to
determine its search results—that’s the company’s “secret sauce.”
Despite that fact, the general consensus among the SEO community
(according to the 2015 Moz search ranking factors survey) is that links still play a big role in that algorithm. They represent the largest two slices of the pie chart below.