Building a website and getting traffic to it are two different things.
Launching an online store is easy, but attracting customers through the
internet can be hard. The problem intensifies as competitors limit your
chances of reaching the market. Once you have conducted research and set
up a website, you need to gain recognition from search engines such as
Google, which means you will need search engine optimization (SEO). This
process involves modifying your site's features to match the requirements
of search engine algorithms in order to earn a top position for specific
key terms.
Jason Adler, a Customer Success Manager, gives valuable tips on how to make your content authoritative and your website optimized to help you rank higher in search engines:
Improve the speed and response of your site
Work on your meta titles and descriptions to be more compelling and actionable
Use longer, more unique product descriptions and long tail keywords
Add customer reviews to attract more clients
Post content mentioning what customers are searching for
When doing SEO, several tools could be beneficial to your efforts. These tools include:
1. PicMonkey and Kraken. In combination, these tools give you
web design optimization features. They can edit and compress the images on your pages, increasing the speed and
responsiveness of your website.
2. GetFiveStars. This tool helps those who have concerns about their
customer feedback. GetFiveStars sends an email to your clients with
the necessary information, captures positive or negative
feedback before it spreads across the internet, and helps improve your ranking.
3. Autopilot. This is an affordable tool that can automate some aspects
of your marketing campaigns. The app comes with a 30-day free trial followed by a $25 monthly subscription.
4. This tool can evaluate the meta titles and descriptions
of a website. You can use it to make descriptions compelling and to guide
the content on each webpage. Brian Dean of Backlinko developed
this tool. It is a premium service: users pay to raise their
click-through rates.
5. Semalt Keyword Suggestions.
This is a keyword research tool that can provide long-tail keywords. It
is useful for surfacing the search terms shoppers use when buying
online, and it can give you keywords for building a listing or creating
online content.
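The meta-evaluation step performed by tool 4 above can be sketched in a few lines. This is a minimal illustration, not the tool's actual logic: the function name is my own, and the 60- and 160-character thresholds are common rules of thumb for SERP display, not official limits.

```python
# A sketch of a meta title/description audit. Limits are common rules
# of thumb for avoiding truncation in search results, not exact cutoffs.

def audit_meta(title, description, title_limit=60, desc_limit=160):
    """Return a list of warnings for a page's meta title and description."""
    warnings = []
    if len(title) > title_limit:
        warnings.append(
            f"Title is {len(title)} chars; may be truncated after {title_limit}.")
    if not description:
        warnings.append(
            "Description is missing; the SERP snippet will be auto-generated.")
    elif len(description) > desc_limit:
        warnings.append(
            f"Description is {len(description)} chars; may be truncated after {desc_limit}.")
    return warnings

print(audit_meta("Buy Handmade Leather Wallets Online", ""))
```

Running a check like this across every page quickly surfaces titles and descriptions that need a rewrite.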
Building a website and
getting a flow of visitors can be difficult
to achieve. With SEO, things become a bit simpler, and there is
some hope. Optimizing your website through SEO tactics like
content selection and backlinking can bring your site traffic.
Using eCommerce SEO tools, you can establish a consistent, effective
practice to ensure that your website stays in top search positions and,
consequently, receives a steady flow of visitors.
It has become a common perspective that as long as you have good
content, the rest will take care of itself. It is true that SEO and
content marketing are inextricably linked: good content is necessary
for SEO. However, a successful optimization campaign requires much more
than just good content.
Nik Chaykovskiy, the Customer Success Manager of Digital Services, explains why quality content alone is not enough for running SEO efficiently.
In theory, the idea is accurate. All search engines strive to provide
their users with the best content, so their algorithms rank good content
higher. By producing more content, you create more indexable topics that
cover more search requests. Furthermore, as long as the content is good,
more users will visit your site.
On the other hand, if you have no content at all, you do not stand
a chance of efficient SEO. If the material is poor or untrustworthy,
your results will be similarly poor. To qualify as good, your content
has to meet a variety of standards, ranging from uniqueness to
practicality, relevance, and entertainment value.
Let's assume that your content is good and that you produce it
on a regular basis. This content is futile as long as it is not
visible. If your users are not aware of your work, they cannot
read or view it. For all of Google's advances, it still relies on the
feedback of its users to help rank the quality of content. Thus,
if users cannot view your work, Google cannot judge the quality of it.
Much of this feedback is usually provided through shares and links,
which Google considers trustworthy. By earning lots of links, you
can be seen as a good source of content, and as a result you will
rise in the search rankings. However, these links are not earned by good
content alone. You have to take the initiative by promoting and
syndicating your links, and sometimes by building manual links as well.
Having good content on
your site is a good start. However, you should
never neglect the technical factors that are necessary for your site to
rank highly in search engine results. Most template platforms, like
WordPress and Wix, come equipped with a technical structure that makes
it easier for a search engine to index them.
This, however, is still not enough. You will need to create meta data
and title tags, improve the security of your site, update your
robots.txt file, create and maintain a sitemap, and increase the speed
of your site if you want it to be in fighting shape.
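The sitemap step above can be sketched with the standard library. The URLs below are hypothetical placeholders; a real sitemap would list your site's actual pages, and would typically include last-modified dates as well.

```python
# A minimal sketch of generating a sitemap.xml using only the standard
# library. The example URLs are invented.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml document (as a string) for the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    "https://example.com/",
    "https://example.com/about",
]))
```

You would then reference the generated file from robots.txt (`Sitemap: https://example.com/sitemap.xml`) and submit it in the search engine's webmaster console.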
You can only tap into
the true power of content if you are able to
integrate it with a variety of other marketing strategies. For instance,
you can use email marketing and social media marketing to the
benefit of your content. By using these strategies in conjunction with
each other, you stand a better chance of getting the most out of your
content. SEO is a complex strategy; it cannot be boiled down to a single tactic.
If you are a website
owner who wants to get the best out of SEO, you need
to understand that the timing of your SEO strategies and exercises is
just as important as the strategies or exercises themselves.
Whether you are doing a
website launch or redesign, the best time to
start doing SEO is before you release it to search engines for indexing
and ranking. In other words, SEO work and website design
should start at the same time.
Some aspects of SEO are unique to a new website, some are unique to
a redesigned website, and others are common to both cases. Nik Chaykovskiy, an expert at
Digital Services, explains the specifics of doing SEO for each purpose.
SEO For New Websites
For new websites, SEO
should run concurrently with web design and user
experience (UX). This means the SEO strategy will determine the kind
of content that will be used and where it will be placed on
the website. Success in SEO is, thus, a result of the effort put
towards intertwining web design, UX, and SEO. Starting SEO during
the web design process accelerates the results of SEO.
However, this has
another implication to your business: your UX and
web design experts must understand SEO, and your SEO specialists must
understand UX and web design. These experts might need training in four
critical areas (design, UX, front-end development, and SEO), but the
effort is definitely worthwhile. It will be easier to brainstorm, and
the chances of the new website succeeding are significantly higher.
SEO Tips For Website Redesign
If you’ve already
launched your site, the best thing to start with is
an SEO audit. An SEO audit helps you to know your SEO strengths, such
as the pages that rank highly in search engines. You also learn
what SEO aspects make a page perform well, so that you can
preserve or replicate those strengths as you redesign the site.
Also important during
website redesign is having a 301 redirect plan.
As the old URLs are being changed, the redesign team should carefully
redirect traffic to the new URLs so that traffic to the site
is not lost once the redesigned website is fully functional. It would
be, and has been for a few businesses, lethal for the business
if this SEO aspect is forgotten. Imagine what would happen if your
site’s visitors were greeted by a “404 – Page Not Found” error when
they tried accessing your site’s webpages.
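A redirect plan like the one described can be sanity-checked before launch. Here is a minimal sketch under the assumption that the plan is a simple old-URL-to-new-URL map; the paths are invented, and a real audit would also verify the new URLs actually resolve.

```python
# A sketch of validating a 301 redirect plan: every old URL should map
# to a final destination without chains or loops. Paths are hypothetical.

def check_redirects(redirect_map, old_urls):
    """Report unmapped old URLs and redirect chains/loops."""
    problems = []
    for url in old_urls:
        if url not in redirect_map:
            problems.append(f"unmapped: {url} would 404 after launch")
            continue
        seen, current = {url}, redirect_map[url]
        while current in redirect_map:  # follow chained redirects
            if current in seen:
                problems.append(f"loop: {url}")
                break
            seen.add(current)
            current = redirect_map[current]
        else:
            if len(seen) > 1:  # more than one hop to reach the final URL
                problems.append(f"chain: {url} takes {len(seen)} hops")
    return problems

print(check_redirects({"/old-shop": "/shop", "/a": "/b", "/b": "/c"},
                      ["/old-shop", "/a", "/forgotten-page"]))
```

Chains and loops waste crawl budget and dilute link equity, so flattening every old URL to a single-hop redirect is worth the effort.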
SEO Is An Ongoing Activity
For both new websites
and redesigned ones, SEO is an ongoing activity.
In today’s fast-paced technological atmosphere, most of the SEO aspects
change within short time spans. At one time you'll need to
install an SEO plugin, and at another, shape your content
according to the current trend. Failing to keep up with SEO
trends could be fatal to a business, especially if sales
heavily rely on digital marketing.
Know the right SEO steps
to take now to make your SEO strategy and
website more successful. If your site is already running, engage an SEO
professional and make sure a thorough SEO audit is done. This
is the best way to know what actions best suit your website with regard
to SEO success. And if you are planning on building a new site,
involve your SEO team right from the start. SEO done during the early
stages of website development helps to avoid SEO mishaps in the
structure of the site that could cost you time and money in the future.
Rumors are flying about Google’s upcoming mobile-friendly update, and
bits of reliable information have come from several sources. My
colleagues and I wanted to cut through the noise and bring online marketers a
clearer picture of what's in store later this month. In this post,
you’ll find our answers to nine key questions about the update.
1. What changes is Google making to its algorithm on April 21st?
Answer: Recently, Google has been rolling out lots of
changes to apps, Google Play, the presentation of mobile SERPS, and some
of the more advanced development guidelines that impact mobile; we
believe that many of these are in preparation for the 4/21 update.
Google has been downplaying some of these changes, and we have no
exclusive advanced knowledge about anything that Google will announce on
4/21, but based on what we have seen and heard recently, here is our
best guess of what is coming in the future (on 4/21 or soon thereafter):
We believe Google will launch a new mobile crawler (probably with an
Android user-agent) that can do a better job of crawling single-page web
apps, Android apps, and maybe even Deep Links in iOS apps. The new
Mobile-Friendly guidelines that launched last month focus on exposing JS
and CSS because Android apps are built in Java and single-page web
apps are built on JavaScript.
Google has also recently been pushing for more feeds from Trusted
Partners, which are a key component of both mobile apps and single-page
web apps since Phantom JS and Prerender IO (and similar technologies)
together essentially generate crawlable feeds for indexing single-page
web apps. We think this increased focus on JS, CSS, and feeds is also
the reason why Google needs the
additional mobile index that Gary Illyes mentioned in his “Meet the Search Engines” interview
at SMX West a couple weeks ago, and why suddenly Google has been
talking about apps as “first class citizens,” as called out by Mariya
Moeva in the title of her SMX West presentation.
A new mobile-only index to go with the new crawler also makes sense
because Google wants to index and rank both app content and deep links
to screens in apps, but it does not necessarily want to figure them into
the desktop algorithm or slow it down with content that should never
rank in a desktop search. We also think that the recent increased focus
on deep links and the announcement from Google about Google Play’s
new automated and manual review process
are related. This announcement indicates, almost definitively, that
Google has built a crawler that is capable of crawling Android apps. We
believe that this new crawler will also be able to index more than one
content rendering (web page or app screen data-set) to one URL/URI and
it will probably focus more on feeds, schema, and sitemaps for its
own efficiency. Most of the native apps that would benefit from deep
linking are driven by data feeds, and crawling the feeds instead of the
apps would give Google the ability to understand the app content,
especially for iOS apps (which they are likely still unable to
crawl), without having to crawl the app code. Then, it can crawl the
deep-linked web content to validate the app content.
FYI: Gary Illyes mentioned that Google is retiring their
old AJAX indexing instructions, but did not say how they would be replaced, except to specify in a Google+
post that Google would not click links to get more content. Instead,
they would need an OnLoad event to trigger further crawling. These
webmaster instructions for making AJAX crawlable were often relied on as
a way to make single-page web apps crawlable, and we think that feeds
will play a role here, too, as part of the replacement. Relying more
heavily on feeds also makes it easier for Google to scrape data directly
into SERPS, which they have been doing more and more. (See the appendix
of this slide deck,
starting on slide 30, for lots of mobile examples of this change in
play already.) This probably will include the ability to scrape forms
directly into a SERP, à la the form markup for auto-complete that Google just announced.
We are also inclined to believe that the use of the new
“Mobile-Friendly” designation in mobile SERPs may be temporary, lasting
only as long as SEOs and webmasters need an incentive to make their CSS
and JS accessible. The “Mobile-Friendly” label in the SERP is a bit
clunky and takes up a lot of space, so Google may decide to switch to
something else, like the
“slow” tag shown
to the right, originally spotted in testing by Barry Schwartz. In fact,
showing the “Slow” tag might make sense later in the game, after most
webmasters have made the updates, and Google instead needs to create a
more serious and impactful negative incentive for the stragglers. (This
is Barry’s image; we have not actually seen this one yet).
In terms of the Mobile-Friendly announcement, it is surprising that
Google has not focused more on mobile page speed, minimizing redirects
and avoiding mobile-only errors—their historical focus for mobile SEO.
This could be because page speed does not matter as much in the
evaluation of content if Google is getting most of its crawl information
from feeds. Our guess is that things like page speed and load time will
rebound in focus after 4/21. We also think mobile UX indicators that
are currently showing at the bottom of the Google PageSpeed tool (at the
bottom of the “mobile” tab) will play into the new mobile algorithm—we
have actually witnessed Google testing their inclusion in the
Mobile-Friendly tool already, as shown below, and of course, they were
recently added to everyone's Webmaster Tools reports. It is possible
Google wants as many pages in the new index as possible at launch.
2. If my site is not mobile-friendly, will this impact my desktop rankings as well?
Answer: On a panel at SMX Munich (2 weeks after SMX
West) Zineb from Google answered ‘no’ without hesitation. We took this
as another indication that the new index is related to a new crawler
and/or a major change to the infrastructure they are using to parse,
index, and evaluate mobile search results but not desktop results. That
said, you should probably take some time soon to make sure that
your site works—at least in a passable way—on mobile devices, just in
case there are eventual desktop repercussions (and because this is a
user experience best practice that can lead to other improvements that are still desktop ranking factors, such as decreasing your bounce rate).
3. How much will mobile rankings be impacted?
Answer: On the same panel at SMX Munich (mentioned
above), Zineb said that this 4/21 change will be bigger than the Panda
and Penguin updates. Again, we think this fits well with an
infrastructure change. It is unclear if all mobile devices will be
impacted in the change or not. The change might be more impactful for
Android devices or might impact Android and iOS devices equally—though
currently we are seeing significant differences between iOS and Android
for some types of search results, with more significant changes
happening on Android than on iOS.
Deep linking is a key distinction between mobile SERPs on the Android
OS and SERPs on iOS (currently, SERPs only display Android app deep
links, and only on Android devices). But there is reason to believe this
gap will be closing. For example, in his recent Moz post and in his
presentation at SMX West, Justin Briggs mentioned that a few sample
iOS deep links were validating in Google’s deep link tool.
This may indicate that iOS apps with deep links will be easier to
surface in the new framework, but it is still possible that won’t make
it into the 4/21 update. It is also unclear whether or not Google will
maintain its stance on tablets being more like desktop experiences than
they are like mobile devices, and what exactly Google is considering
“mobile.” What we can say here, though, is that Android tablets DO
appear to be including the App Pack results, so we think Google will
change its stance here and start classifying tablets as mobile going forward.
For small sites on common CMSes (think WordPress), this can be
accomplished. If that’s you, PageSpeed Insights is a great place to
start. For most sites, a perfect score isn't realistic. So where do we start?
That’s what this post is about. I want to make three points:
Latency can hurt load times more than bandwidth
PageSpeed Insights scores shouldn’t be taken at face value
Improvement starts with measurement, goal setting, and prioritization
I’m writing with SEO practitioners in mind. I’ll skip over some
of the more technical bits. You should walk away with enough perspective
to start asking the right questions. And you may make better
recommendations as a result.
Disclaimer: HTTP2 improves some of the issues
discussed in this post. Specifically, multiple requests to the same
server are less problematic. It is not a panacea.
Latency can hurt load times more than bandwidth
A first look at PageSpeed Insights' rules could make you think it's all
about serving fewer bytes to the user: minify, optimize, compress. But size
is only half the story. It also takes time for your request simply
to reach a server, and then it takes time for the server to respond.
What happens when you make a request?
If a user types a
URL into a browser address bar and hits enter, a request is made. Lots
of things happen when that request is made. The very last part of that
is transferring the requested content. It’s only this last bit that is
affected by bandwidth and the size of the content.
Fulfilling a request requires (more or less) these steps:
Find the server
Connect to the server
Wait for a response
Transfer the content
Each of these steps takes time, not just the last. The
first three are independent of file size; they are effectively constant
costs. These costs are incurred with each request regardless of whether
the payload is a tiny, minified CSS file or a huge uncompressed image.
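The constant-cost nature of the first three steps can be made concrete with a toy model. The millisecond figures below are illustrative defaults, not measurements:

```python
# A toy model of request time: DNS lookup, connection, and server wait
# are fixed per-request costs; only the transfer step scales with size.
# All numbers are illustrative, not measurements.

def request_time_ms(size_kb, dns=20, connect=50, wait=100,
                    bandwidth_kb_per_ms=1000):
    """Estimate total request time in ms for a payload of size_kb."""
    fixed = dns + connect + wait              # paid on every request
    transfer = size_kb / bandwidth_kb_per_ms  # the only size-dependent part
    return fixed + transfer

print(request_time_ms(5))    # a tiny, minified 5 KB CSS file
print(request_time_ms(500))  # a huge 500 KB uncompressed image
```

Under these assumptions, the 100x larger file takes under 1ms longer end to end: the fixed latency dwarfs the transfer time, which is why cutting the *number* of requests often beats shrinking each one.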
Why does it take time to get a response?
The factor we
can’t avoid is that network signals can’t travel faster than the speed
of light. That’s a theoretical maximum; in reality, it will take longer
than that for data to transfer. For instance, it takes light about 40ms for a round trip between Paris and New York.
If it takes twice that time for data to actually cross the Atlantic and
back, then the minimum time it will take to get a response from a server
is about 80ms.
This is why CDNs are commonly used. CDNs put servers physically
closer to users, which is the only way to reduce the time it takes to
reach the server.
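The 40ms figure above is easy to verify with a back-of-the-envelope calculation. The distance is approximate, and real fiber routes are both slower than light in a vacuum and longer than the great-circle path, so actual round trips take longer:

```python
# Physical floor on Paris-New York round-trip latency: light in a
# vacuum over the approximate great-circle distance. Real networks
# (fiber, routing detours, switching) are considerably slower.
SPEED_OF_LIGHT_KM_S = 299_792
PARIS_NY_KM = 5_837  # approximate great-circle distance

round_trip_ms = 2 * PARIS_NY_KM / SPEED_OF_LIGHT_KM_S * 1000
print(round(round_trip_ms))  # ~39 ms
```

No optimization can beat this floor, which is exactly why moving servers closer to users (via a CDN) is the only real lever on this part of the request.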
The life of a request, measured by Chrome Dev Tools.
All of the values in the red box are what I’m considering
“latency.” They total about 220ms. The actual transfer of content took
0.7ms. No compression or reduction of filesize could help this; the only
way to reduce the time taken by the request is to reduce latency.
Don’t we need to make a lot of requests to load a page?
It usually takes more than one request to load all of the content
necessary to render a page. If that URL corresponds to a webpage, the
browser will usually discover that it needs to load more resources to
render it, and it will go through the same steps listed above to load
each of these files.
Fortunately, once a server has been found (“DNS Lookup” in the
image above), the browser won’t need to look it up again. It will still
have to connect, and we’ll have to wait for a response.
A skeptical read of PageSpeed Insights tests
All of the
PageSpeed Insights evaluations cover things that can impact site speed.
For large sites, some of them aren’t so easy to implement. And depending
on how your site is designed, some may be more impactful than others.
That’s not to say you have an excuse not to do these things — they’re
all best-practice, and they all help. But they don’t represent the whole
site speed picture.
With that in mind, here’s a “skeptical reading” of each of the PageSpeed Insights rules.
Tests focusing on reducing bandwidth use
Unless you have huge images, this might not be a big deal. This is
only measuring whether images could be further compressed — not whether
you’re loading too many.
Compression is easy. You should use it. It also may not make much
difference if your files are already small.
Will likely reduce overhead only by tens of KB. Latency will have a bigger impact than response size.
Probably not as important as consolidating JS into a single file to reduce the number of requests that have to be made.
Tests focusing on reducing latency
Leverage browser caching
Definitely cache your own files. But lots of the
files that could benefit from caching are probably hosted on third-party
servers, and you'd have to host them yourself to change their cache times.
Reduce server response time
The threshold on PSI is too high. It also tries to exclude
the physical latency of the server, instead looking only at how long it
takes the server to respond once it receives a request.
Avoid landing page redirects
A valid concern, but one that can be frustratingly difficult to fix.
Having zero requests on top of the initial page load to render
above-the-fold content isn't necessary to meet most performance goals.
Prioritize visible content
Actually kind of important.
Don’t treat these as the final word
on site performance! Independent of these tests, here are some things to
think about. Some aren’t covered at all by PageSpeed Insights, and some
are only covered halfway:
Caching content you control.
Reducing the amount of content you’re loading from 3rd-party domains.
Reducing server response time beyond the minimum required to pass PageSpeed Insights’ test.
Moving the server closer to the end user. Basically, use a CDN.
Reducing blocking requests. Ensuring you’re using HTTP2 will help here.
Instead of trusting the PageSpeed Insights tool alone, go ahead and
load your page in Chrome. Check out how it performs. Look at what
requests actually seem to take more time. Often the answer will be obvious: too much time will be spent loading ads, for instance.
If a perfect PageSpeed Insights score isn’t
your goal, you need to know what your goal will be. This is important,
because it allows you to compare current performance to that goal. You
can see whether reducing bandwidth requirements will actually meet your
goal, or whether you also need to do something to reduce latency (use a
CDN, handle fewer requests, load high-priority content first).
Prioritizing page speed “fixes” is important, but that's not the only type
of prioritization. There's also the question
of what actually needs to be loaded. PageSpeed Insights does try to
figure out whether you’re prioritizing above-the-fold content. This is a
great target. It’s also not a perfect assessment; it might be easier to
split content into “critical” and “non-critical” paths, regardless of
what is ostensibly above the fold.
For instance: If your site relies on ad revenue, you might load
all content on the page and only then begin to load ads. Figuring out
how to serve less is a challenge best tackled by you and your team.
After all, PageSpeed Insights is a one-size-fits-all solution.
The story so far has been that PageSpeed
Insights can be useful, but there are smarter ways to assess and improve
site speed. A perfect score doesn’t guarantee a fast site.
Ben is a Principal Consultant who joined Distilled
in 2010. Now he focuses on leveling up our team. Through group training
and internal consultation, he guides team members as they effect change
for our clients.
Welcome to our newest installment of our educational Next Level
series! In our last episode, Jo Cameron taught you how to whip up intelligent SEO reports
for your clients to deliver impressive, actionable insights. Today, our
friendly neighborhood Training Program Manager, Brian Childs, is here
to show you an easy workflow for targeting multiple keywords with a
single page. Read on and level up!
For those who have taken any of the Moz Training Bootcamps,
you’ll know that we approach keyword research with the goal of
identifying concepts rather than individual keywords. A common term for
this in SEO is “niche keywords.” I think of a “niche” as a set of
related words or concepts that are essentially variants of the same topic.
Let’s pretend my broad subject is: Why are cats jerks?
Some niche topics within this subject are:
Why does my cat keep knocking things off the counter?
Why does my cat destroy my furniture?
Why did I agree to get this cat?
I can then find variants of these niche topics using Keyword Explorer or another tool, looking for the keywords with the best qualities (Difficulty, Search Volume, Opportunity, etc).
By organizing your keyword research in this way, it conceptually aligns with the search logic of Google’s Hummingbird algorithm update.
Once we have niche topics identified for our subject, we then
dive into specific keyword variants to find opportunities where we can
rank. This process is covered in-depth during the Keyword Research Bootcamp class.
Should I optimize my page for multiple keywords?
The answer for most sites is a resounding yes.
If you develop a strategy of optimizing your pages for only one
keyword, this can lead to a couple of issues. For example, if a content
writer feels restricted to one keyword for a page they might develop
very thin content that doesn’t discuss the broader concept in much
useful detail. In turn, the marketing manager may end up spreading
valuable information across multiple pages, which reduces the potential
authority of each page. Your site architecture may then become larger
than necessary, making the search engine less likely to distinguish your
unique value and deliver it into a SERP.
As recent studies
have shown, a single high-ranking page can show up in dozens — if not
hundreds — of SERPs. A good practice is to identify relevant search
queries related to a given topic and then use those queries as your H2 headings.
So how do you find niche keyword topics? This is the process I use that relies on a relatively new SERP feature: the “People also ask” boxes.
How to find niche keywords
Step 1: Enter a relevant question into your search engine
Question-format search queries are great because they often generate featured snippets. Featured snippets
are the little boxes that show up at the top of search results, usually
displaying one- to two-sentence answers or a list. Recently, when
featured snippets are displayed, there is commonly another box nearby
showing “People also ask.” This second box allows you to peer into the
logic of the search algorithm. It shows you what the search engine
“thinks” are closely related topics.
Step 2: Select the most relevant “People also ask” query
Take a look at those initial “People also ask” suggestions. They are often
different variants of your query, representing slightly different search
intent. Choose the one that most aligns with the search intent of your
target user. What happens? A new set of three “People also ask”
suggestions, associated with the option you chose, will populate at the
bottom of the list. This is why I refer to these as
choose-your-own-adventure boxes. With each selection, you dive deeper
into the topic as defined by the search engine.
Step 3: Find suggestions with low-value featured snippets
Behind each “People also ask” suggestion is a featured snippet. As you dig deeper
into the topic by selecting one “People also ask” after another, keep an
eye out for featured snippets that are not particularly helpful. This
is the search engine attempting to generate a simple answer to a
question and not quite hitting the mark. These present an opportunity.
Keep track of the ones you think could be improved. In the following
example, we see the Featured Snippet being generated by an article that
doesn’t fully answer the question for an average user.
Step 4: Compile a list of “People also ask” questions
Once you've explored deep into the algorithm's contextually related results
using the “People also ask” box, make a list of all the questions you
found highly related to your desired topic. I usually just pile these
into an Excel sheet as I find them.
Step 5: Analyze your list of words using a keyword research tool
With a nice list of keywords that you know are generating featured snippets, plug the words into Keyword Explorer
or your preferred keyword research tool. Now just apply your normal
assessment criteria for a keyword (usually a combination of search
volume and competitiveness).
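The "normal assessment criteria" step can be sketched as a simple scoring pass over a candidate list. The keywords, volumes, and difficulty numbers below are invented for illustration, and the formula is just one reasonable weighting, not a standard metric:

```python
# A sketch of ranking candidate keywords by a score that rewards search
# volume and penalizes difficulty. All data below is made up.

keywords = [
    {"term": "why does my cat knock things over", "volume": 900, "difficulty": 35},
    {"term": "cat destroying furniture",          "volume": 400, "difficulty": 20},
    {"term": "are cats jerks",                    "volume": 150, "difficulty": 55},
]

def score(kw):
    # Higher volume is better; higher difficulty (0-100 scale) is worse.
    return kw["volume"] * (100 - kw["difficulty"]) / 100

for kw in sorted(keywords, key=score, reverse=True):
    print(f"{kw['term']}: {score(kw):.0f}")
```

Whatever formula you choose, applying it consistently lets you compare niches against each other rather than judging each keyword in isolation.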
Step 6: Apply the keywords to your page title and heading tags
Once you've narrowed the list to a set of keywords you'd like to target on
the page, have your content team go to work generating relevant,
valuable answers to the questions. Place your target keywords as the
heading tags (H2, H3) and a concise, valuable description immediately
following those headings.
Measure niche keywords in your campaign
While your content writers are generating the content, you can update your Moz Pro campaign
and begin baselining your rank position for the keywords you’re using
in the heading tags. Add the keywords to your campaign and then label them appropriately. I recommend using a label associated with the niche topic.
For example, let’s pretend I have a business that helps people
find lost pets. One common niche topic relates to people trying to find
the phone numbers of kennels. Within that topic area, there will be
dozens of variants. Let’s pretend that I write a useful article about
how to quickly find the phone numbers of nearby animal shelters and kennels.
In this case, I would label all of the keywords I target in that
article with something like “kennel phone numbers” in my Moz Pro
campaign rankings tool.
Then, once the post is written, I can report on the average
search visibility of all the search terms I used, simply by selecting
the label “kennel phone numbers.” If the article is successful, I should
see the rank positions moving up on average, showing that I’m ranking
for multiple keywords.
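The label-based reporting described above boils down to grouping rank positions by label and averaging. Here is a minimal sketch; the rank data is invented, and in practice it would come from your rank-tracking tool's export:

```python
# A sketch of averaging rank position per keyword label, mirroring the
# "kennel phone numbers" example. All rank data here is invented.
from collections import defaultdict

rankings = [
    {"keyword": "kennel phone number near me", "label": "kennel phone numbers", "rank": 4},
    {"keyword": "animal shelter phone number", "label": "kennel phone numbers", "rank": 7},
    {"keyword": "find lost cat",               "label": "lost pets",            "rank": 12},
]

def average_rank_by_label(rows):
    """Map each label to the average rank of its keywords."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["label"]].append(row["rank"])
    return {label: sum(r) / len(r) for label, r in grouped.items()}

print(average_rank_by_label(rankings))
```

Tracking this average over time shows whether one article is gaining visibility across its whole niche, not just on a single keyword.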
Want to learn more SEO shortcuts?
If you found this kind of article helpful, consider signing up for the How to Bring SEO In-House
seminar. The class covers things like how to set up your team for
success, tips for doing research quickly, and how to report on SEO to stakeholders.
Can you learn SEO in an hour? Surprisingly, the answer is yes, at least when it comes to the fundamentals.
From Rand Fishkin, we present you with a six-part series of roughly ten-minute-long videos designed to deliver core SEO concepts.
This short course is perfect for a wide range of people, including
beginner SEOs, clients, and team members. Each video covers an important
SEO concept. Ready to dive in?
Content and Additional Resources
Part 1: SEO Strategy
Kicking things off is the man who wrote the original guide on SEO,
our friend Rand Fishkin, covering topics from ranking for low-demand,
high-conversion keywords (or high-demand, low-competition keywords) to
building links with content. Even experienced SEOs sometimes forget
these lessons, so this is a good place to start.
Part 2: Keyword Research
Before doing any SEO work, it’s important to get a handle on your keyword research.
Aside from helping to inform your strategy and structure your content,
you’ll get to know the needs of your searchers, the search demand
landscape of the SERPs, and what kind of competition you’re up against.
Part 3: Searcher Satisfaction
Satisfying your searchers is a big part of what it means to be successful in modern SEO. And optimal searcher satisfaction
means gaining a deep understanding of them and the queries they use to
search. In this video, Rand covers everything you need to know about how
to satisfy searchers, including the top four priorities you need to
have and tips on how to avoid pogo-sticking in the SERPs.
Part 4: Keyword Targeting & On-Page Optimization
We’ve covered strategy, keyword research, and how to satisfy searcher intent — now it’s time to tackle on-page SEO!
In this video, Rand offers up an on-page SEO checklist to start you off
on your way towards perfectly optimized and keyword-targeted pages.
Part 5: Technical SEO
Get ready for one of the meatiest SEO topics in our series: technical SEO.
In this lesson, Rand covers essential technical topics from
crawlability to internal link structure to subfolders and far more.
Watch on for a firmer grasp of technical SEO fundamentals!
Part 6: Link Building
The final lesson deals with a topic that’s a perennial favorite among SEOs: link building. Today, learn why links are important to both SEO and to Google, how Google likely measures the value of links, and a few key ways to begin earning your own.