At a glance:
- 1 Initial Analysis
- 2 Content Strategy
- 3 Link Strategy
- 4 What did Google’s PageRank do?
- 5 Results
- 6 Conclusion
Since starting TSI’s organic SEO services, I’ve realized that working with my own sites is hugely different from working with clients’ sites, especially when the website has weak foundations.
I know how I want my money sites to look, so I build them with rigorous attention to detail. But if you take over a website developed by somebody without a certain level of SEO knowledge, there’s normally quite a lot of on-site and off-site work to fix.
Here’s how my team broke down the initial analysis:
- Keyword Research
- On-Site Audit
- Backlink Audit
- Competitor Analysis
My team broke down keyword research into two main workflows: one is used to monitor the health of a website, and the other is for content gap analysis.
When we’re looking to track keywords for a website, we want to track some of the core terms, but also terms that are having problems. If a term is suffering from keyword cannibalization that we’re trying to fix – it’s worth tracking this daily until it’s resolved.
Since this client needed a huge content strategy, we did both a health check and initial content gap analysis. This approach included breaking down all keywords for that industry into topics of relevant terms.
In total, this process took over 20 hours and included thousands of keywords chunked into neat topics. This work later helped with choosing page titles, headings and content.
Here’s an example of how we did it:
Step 1. Search Broad Keywords
Step 2. Review Parent Topics
Step 3. Find Competitors for Parent Topics
Step 4. Reverse Engineer Competitor’s Keywords
Step 5. Exclude Outdated Keywords
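The competitor step above (Step 4) boils down to a set difference: which keywords does the competitor rank for that you don’t? A minimal sketch, assuming the keyword lists below stand in for an Ahrefs/Semrush export:

```python
# Hypothetical content-gap check; these keyword lists are illustrative
# stand-ins for a real keyword-tool export.
our_keywords = {"best dog food", "dog food reviews"}
competitor_keywords = {
    "best dog food", "healthiest dog food",
    "top rated dog food", "grain free dog food",
}

# Keywords the competitor ranks for that we don't cover yet
content_gap = sorted(competitor_keywords - our_keywords)
print(content_gap)
```

The gap list then feeds directly into the topic-chunking work described below.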
You can also export all of these keywords into Excel documents and filter them that way. But most of the time, a lot of the top keywords are fairly similar.
Here’s an example for the best dog food term:
- best dog food
- best dog foods
- healthiest dog food
- what is the best dog food
- top rated dog food
- best food for dogs
While each keyword is unique, they all follow a single intent: the users are interested in finding out which dog foods on the market are best. If you can include 1 – 2 exact matches for each of these terms in the text, you’re going to have a solid foundation.
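Grouping keyword variants like these by shared intent can be roughed out programmatically. Here’s a minimal sketch, assuming a naive normalization (lowercase, drop filler words, strip a plural “s”) is good enough for near-duplicates like “best dog food” / “best dog foods”:

```python
# Crude intent clustering: keywords with the same normalized token set
# are treated as one intent. Stopword list and singularizer are rough
# assumptions, not a production-grade normalizer.
from collections import defaultdict

STOPWORDS = {"what", "is", "the", "for", "a"}

def signature(keyword: str) -> frozenset:
    tokens = keyword.lower().split()
    # strip a trailing "s" as a crude singularizer; drop filler words
    return frozenset(t.rstrip("s") for t in tokens if t not in STOPWORDS)

keywords = [
    "best dog food", "best dog foods", "healthiest dog food",
    "what is the best dog food", "top rated dog food", "best food for dogs",
]

clusters = defaultdict(list)
for kw in keywords:
    clusters[signature(kw)].append(kw)

for sig, kws in clusters.items():
    print(sorted(sig), "->", kws)
```

Four of the six terms collapse into one cluster, which is exactly the “singular intent” point above.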
Finding all the technical and content issues with the website requires a full on-site audit. However, while big reports are easy on the eyes, it’s small changes that make the difference.
We audited the website and found a whole bunch of technical issues, from a lack of breadcrumbs and poor internal link structures to low-quality anchor text and unoptimized titles.
A full on-site audit tutorial is too big for this post (perhaps coming soon), but here are some quick tips:
- Screaming Frog – A cheap way to regularly crawl your website. There are lots of ways to find errors, redirects, and missing metadata. You can also use a custom search to find all references of your keywords.
- Sitebulb – This tool is more expensive and is a monthly recurring fee. However, it gives you lots of extra data that would be impossible to spot manually and hard with Screaming Frog. An example would be empty hyperlink references.
- Site Search – By using Google’s site search (site:domain.com) and operators, you can find hundreds of issues with index management, outdated page titles, and multiple pages targeting the same keyword. There are a lot of quick wins here.
- Page Titles – If you wrote your page titles 1 – 2 years ago, you may find that they’re outdated now. A quick site search with “intitle:2018” will find all your content that is either not updated or not yet crawled by Google.
- Internal Links – A major way to pass relevance signals and authority to your core pages is through internal links. Make sure that your pages are well interlinked and you’re not using low-quality anchors from your power pages, such as “click here” or “more information”.
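Two of the checks above (stale page titles and low-quality anchors) are easy to script against a crawl export. A minimal sketch, where the rows and column names below stand in for a Screaming Frog export rather than the tool’s actual headers:

```python
# Hypothetical crawl-export rows; in practice these come from a
# Screaming Frog CSV export.
import re

pages = [
    {"url": "/guide", "title": "Best Widgets 2018 - Updated Guide"},
    {"url": "/services", "title": "Widget Repair Services"},
]
anchors = [
    {"source": "/power-page", "anchor": "click here"},
    {"source": "/guide", "anchor": "widget repair pricing"},
]

# Titles still carrying an old year are likely stale or not yet re-crawled
stale = [p["url"] for p in pages if re.search(r"\b(2017|2018)\b", p["title"])]

# Generic anchors leak no relevance signal from your power pages
LOW_QUALITY = {"click here", "more information", "read more"}
weak = [a["source"] for a in anchors if a["anchor"].lower() in LOW_QUALITY]

print(stale, weak)
```

The same pattern scales to a full export: load the CSV, apply the filters, and you have a prioritized fix list.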
We focused on fixing around 5 issues at a time, varying from small changes like improving accessibility to bigger changes like introducing breadcrumbs on a custom-built website.
The website had a relatively small backlink profile, which meant it lacked authority, relevance signals and entry points for crawling. It also meant that a full in-depth link analysis was unnecessary for this campaign. When a website has an algorithmic penalty, a manual penalty, or has fared poorly after an update, it’s a great decision to audit your link profile page by page.
In this instance, the initial check revealed there was nothing to be concerned about, so we moved on to technical implementation as soon as possible.
Had the website experienced problems with the link profile, we would have done a full backlink audit to try and recover this. Here’s what to look out for:
- Link Distribution – Pointing too many links toward internal pages instead of your homepage can cause lots of issues. So make sure that you’re not overdoing it.
- Anchor Text Analysis – Using exact match, partial match and topical anchors is a great way to pass third-party relevance signals. Too many and you’ll be caught out over-optimizing, but too few and you won’t be competitive. Read more about anchor optimization.
- Referring IP Analysis – There are a finite number of IPv4 Addresses, so this isn’t often a big cause for concern. However, it’s worth making sure that you’ve not got too many links from the same IP address.
- Autonomous System Numbers – Since a server can be assigned any number of IP addresses, these systems often include an ASN. This is another way that Google could flag large numbers of websites from the same origin.
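The referring-IP check above is just a frequency count once you’ve resolved each referring domain to its IP. Here’s a minimal sketch with made-up backlink data (the same grouping works for ASNs if you swap the field):

```python
# Count referring domains per IP to spot link networks; the backlink
# rows here are illustrative, not real audit data.
from collections import Counter

backlinks = [
    {"domain": "blog-a.com", "ip": "203.0.113.10"},
    {"domain": "blog-b.com", "ip": "203.0.113.10"},
    {"domain": "blog-c.com", "ip": "203.0.113.10"},
    {"domain": "news-site.com", "ip": "198.51.100.7"},
]

ip_counts = Counter(link["ip"] for link in backlinks)

# Flag any IP hosting a suspicious share of your referring domains
flagged = [ip for ip, n in ip_counts.items() if n >= 3]
print(flagged)
```

The threshold of 3 is an arbitrary example; what counts as “too many” depends on the size of the profile.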
My team did a case study on how to remove an algorithmic penalty; a lot of these audits come included in any penalty removal campaign.
The difference between a search analyst and a data scientist is how each approaches the search engines. An analyst is focused on reviewing the SERPs and finding what is working best today, while a data scientist wants to understand how things work.
We built our team to include both since competitor analysis requires a keen eye for reviewing the SERPs and algorithm analysis requires solid data scientists.
If you want to do SEO at a high level, you’ve got to constantly be reviewing competitors. You will notice that tons of best practices get ignored in the top positions and the devil is in the details.
In this instance, we found that both more content and more links would be required for long-term success.
Building any long-term authority website in competitive industries will include both an authoritative link profile and content plan.
My team reviewed their existing content, looked at how other websites in their industry wanted to help users and then addressed these four cornerstones:
- User Intent – before we did anything, we wanted to nail the user intent on every page. This research meant that we identified three pillars of content for their site. We’ll get into this in further detail below.
- Service Pages – these pages were dedicated to explaining what service was offered, how to contact and what was included with that offering.
- Blog Content – these posts were dedicated to providing non-commercial, informative content that was interesting to the reader.
- Resource Center – this section was dedicated to giving basic information about topics in their industry. Instead of using Wikipedia for all our links to authority content, we wanted to use internal links instead.
Here’s a little bit about each section and our strategy for them:
The biggest mistake I see all the time is the simplest thing to check:
What types of content is Google ranking in the top 10 positions?
If you’re serving 10,000 words of content in a huge blog post, but Google is only interested in serving service pages with 50 words of content – you’ve missed the point.
Another commonly found problem is including too much content in a single post, when your competitors have several shorter posts.
One of the main attractions of Thailand is its yoga retreats. If you’re searching for this from America, you’re expecting to find destinations. Let’s take a look:
The first position is Yoga Journal, which includes almost no content aside from images and headings. That’s exactly what the users were looking for.
Other websites offer a similar service and can help you make bookings, while others have written blog posts clearly labelling what they believe are the best retreats.
Notice that none of these pages include information such as “what is yoga?”. It’s because that’s not the user intent.
It’s not that these websites excluded the term because nobody is searching for it:
Instead, they created content specifically designed to rank for that keyword:
The first piece of content that we wanted to create was the service pages. These are often short in content and outline your basic service.
You want to include your contact details, list of services, unique selling points and any niche-relevant interactive content.
The other key point on service pages is that they don’t require lots of content. Often 3 – 4 bullet points with some fancy icons can go a long way.
This is something that even Google has documented.
The second piece of content that we needed to create were informational blog posts. There were a lot of opportunities for interesting pieces of content in this niche.
While the service pages were designed to answer a purchase-based user intent, this was informational. It can help boost the rankings of your service page through topical clusters, internal linking and as a buffer for tiered links.
We wanted these pieces of content to focus on ‘Top 10 Tips to deal with X’ type queries.
Our target audience for blog posts is people who would be considered middle-of-the-funnel. So instead of providing basic explanations, we wanted to focus on creative solutions.
Lastly, we wanted to create a resource center for all things to do with this niche. If your niche is technical, it’s a great idea to have one of these.
Let me explain…
One of the most abused SEO strategies is linking to irrelevant Wikipedia articles. This tactic is used because people heard that external links are good for rankings, which is true.
But we’re talking long-term success for a real business, not flipping an affiliate site. So, we wanted to become that external link for others; the person you would link toward when you needed to add some context.
It also helped because we could answer a different set of questions. Where the blog would answer stuff like ‘Top 10 Tips to deal with X’, this section would focus on ‘what is X’.
It’s a very different user-intent and can sometimes be separated into a completely different parent topic. Rather than cram one page full of thousands of words in content, why not create two pages and utilize the benefits of internal linking?
The bread and butter of any campaign is a solid link strategy. This campaign was no different in that regard, but we identified a few really important link categories for the website.
To get the most out of our links my team decided to take advantage of the PageRank algorithm. We identified Tier 1 links that had high levels of trust and then sent juice to them.
I’m going to cover why we chose to take advantage of PageRank algorithm in the following section, but here’s what we found would work for the website:
- Internal Link Building – One of the cheapest ways to improve rankings is by building links internally. This fact is why a solid content strategy is so important.
- Diversity Links – Any audiophiles know there’s a constant battle between signal-to-noise. The same is true for search engine optimization.
- Outreach Links – We wanted to reach out to big names in the industry to acquire link placements. We use a white-hat link strategy and managed to get some incredible links.
- Tier 2 Link Building – Links pass more authority the higher their URL Rating. To boost the authority passed from outreach links, we did some tier 2 links too.
Internal Link Building
One of the initiatives that we’ve started at TSI is to refer to internal linking as ‘internal link building.’
This is how seriously we take it.
Every great link building strategy includes internal links built in the supplementary content.
Google knows that links play a major part in their algorithm, so they monitor it closely. To bypass some of these filters you can just add supplementary content and power up those articles too.
This is great for your user-intent analysis that we’ve already done above. Now, instead of one article, you can have links pointed toward 10 different articles all clustered.
With this campaign, a simple test was to review the competitors and find common trends in the top 10 positions. One easy-to-spot trend was that we needed to add breadcrumbs. However, a less obvious find is when you have multiple pages targeted with the same internal anchor text.
Finding this is really simple:
Step 1. Crawl your website with Screaming Frog
Step 2. Bulk Export all Anchor Text
Step 3. Filter By Anchor Text Containing Keyword
Step 4. Reduce Anchor Text Cannibalization
BONUS: You can use the Configuration > Custom > Search feature in Screaming Frog to parse specific keywords across your entire website.
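Steps 2 – 4 above can be sketched in a few lines once you have the bulk export. The rows and column names below are illustrative stand-ins for a Screaming Frog “All Anchor Text” export:

```python
# Detect internal anchor-text cannibalization: the same keyword-bearing
# anchor pointing at more than one target page. Export rows are made up.
from collections import defaultdict

anchor_rows = [
    {"anchor": "best dog food", "target": "/best-dog-food"},
    {"anchor": "best dog food", "target": "/dog-food-reviews"},
    {"anchor": "puppy training", "target": "/puppy-training"},
]

KEYWORD = "best dog food"

# Step 3: keep only anchors containing the keyword
targets = defaultdict(int)
for row in anchor_rows:
    if KEYWORD in row["anchor"].lower():
        targets[row["target"]] += 1

# Step 4: more than one target page for the same anchor = cannibalization
cannibalized = len(targets) > 1
print(dict(targets), cannibalized)
```

When `cannibalized` comes back true, pick one canonical target for the anchor and vary the wording on the rest.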
It’s super simple, here’s a quick video by my Head of Strategies talking through the process:
My team and I noticed that directory links were something that the competition had that we didn’t. This is an easy win and helped us to accelerate the link profile rapidly.
Not only did we get to create tons of new entry points to the website, improve anchor text ratios and link distribution; we also gained link velocity.
This helped to quickly boost the website’s rankings and was all built within the first month.
Tier 1 – Outreach Links
After a lot of research, my team decided that very small numbers of high-quality links were going to be better for trust in this niche.
We targeted extremely specific websites that built their entire domain around each keyword we targeted on the service pages.
For example, if your website is in the fitness niche, there are going to be websites focused on weightlifting, others on running, and others on diet or wellbeing. Rather than matching up the running domains with a weightlifting page, try to keep them on topic.
This looks the most natural, so it’s the least likely to be picked up by Google.
The trick was to write the content ourselves and focus on providing external links that made sense for the piece of content. It’s a form of guest posting, but because these sites have lots of traffic, content quality needs to be better than merely algorithm-friendly.
You can read these guest posting tips to get ahead of the game.
It also provided us the opportunity to tailor our content relevance as much as possible – but the real magic comes in the next step.
We recently published how we converted to a scalable white hat link building scheme; it shows you exactly how my team does outreach campaigns.
Tier 2 – Guest Posting
Once you’ve built super relevant content on high-authority websites, you need to power it up.
A big misconception in the industry is that people think Google works at the domain level. In reality, they’re working at a page-by-page level.
This is called Google’s PageRank algorithm and is actually the crux of this entire strategy.
So what does that mean for you?
Google takes the links pointed toward each page and gives that page a value. Then the number of links coming out of that page will be a fraction of the power of that page.
This means that even if a page has a high domain authority, it’s not necessarily passing a lot of authority to your website.
But you can change that with this approach:
With the previous step, we had already acquired a DR70+ link, but we needed to give it some juice to power up our rankings.
Since we wrote the first tier link content ourselves, we built it to rank for specific keywords that discussed the need for our client’s services. This way we could build guest posts that would not only help to pass authority, but also build up organic traffic.
Then we could benefit from both an improvement in organic traffic but also referral traffic. Truthfully, this is a bit overkill, but we wanted to see how much we could squeeze it.
What did Google’s PageRank do?
As mentioned before, the element that brought the biggest bang for the buck in this campaign was the use of second-tier link building.
It helped us to take a relatively small number of links (4 – 5 every two months) and then power them up the following month.
So, here’s everything you need to know about Google’s PageRank.
What is Google PageRank?
Google uses a citation-based algorithm that looks at the relationship between individual pages. While referring domains matter, it’s the referring pages that really count.
This diagram in Google’s patent helps us to see that PageRank is calculated by the number of links pointed towards a page, as well as the number of external links from that page.
It’s clear to see why this strategy works, but what’s the math behind it?
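The math is a simple recurrence from the original PageRank paper: each page’s score depends on the scores of the pages linking to it, split across their outbound links. A minimal sketch on a toy three-page graph (the graph is illustrative; real PageRank runs over the whole web):

```python
# Simplified PageRank recurrence:
#   PR(p) = (1 - d) + d * sum(PR(q) / outlinks(q) for q linking to p)
graph = {            # page -> pages it links out to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

d = 0.85             # damping factor
pr = {page: 1.0 for page in graph}

for _ in range(50):  # iterate until the values settle
    new_pr = {}
    for page in graph:
        inbound = sum(pr[q] / len(graph[q]) for q in graph if page in graph[q])
        new_pr[page] = (1 - d) + d * inbound
    pr = new_pr

# C receives links from both A and B, so it ends up with the most PageRank
print({p: round(v, 3) for p, v in pr.items()})
```

Notice the division by `len(graph[q])`: every extra outbound link on a page dilutes what each individual link passes, which is exactly why tier 2 links pointed at your tier 1 placements raise what those placements pass on.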
How does Google PageRank work?
Earlier this year Majestic put out this amazing video that goes through the exact science of how PageRank actually works. This video is extremely detailed.
They break down how Domain Ratings provide radically different results from URL Ratings. It’s seriously worth a watch.
Why did we choose it for this client?
My team tailors the strategy for each campaign based on the initial analysis. So, what made us decide that PageRank sculpting was the best strategy for this site?
The nature of the client’s website meant that bloggers aren’t really the industry experts that we needed, but we know it’s a strategy that works well.
So we decided to alternate between first tier links one month, followed by second tier guest posting the following month.
Truthfully, the blogger approach probably could still have worked in this industry too.
We figured that in such a trust-dominated industry, it might be worth focusing on high-quality websites rather than a high quantity of websites.
The problem with always getting high-quality links is that often you’re getting fresh articles without much PageRank (i.e. link juice) in very small quantities.
They’re good – but they don’t always outweigh the cost-to-result ratio of blogger links.
We chose to use guest posting en masse to improve crawl rates, anchor text and authority toward these pages.
I advise that you avoid using auto-generated link services as your second tier links. While they will definitely improve crawl rates of your page – they’re not going to help with authority.
You also run the risk of having your first tier links de-indexed – which is not what you want… and definitely not what your referring domains want.
Instead, focus on good quality blogs that would be safe if they pointed toward your own page and then just direct them toward these trustworthy sources.
In this case, we had the time and resources to give this a shot at the start of the campaign instead of later down the line – which garnered some excellent results.
The campaign saw a steady increase after the first month, reaching a peak through October and November.
There was an exponential increase in keywords toward the start of the campaign, with stability through the year of updates.
There is a direct correlation between the links we built and subsequent increases in the number of keywords they began to rank highly for:
While their social media campaigns and paid search have seen some modest increases in traffic, it’s their organic traffic that has seen a 432.69% increase when comparing the end of year performance against the month they joined.
As you’ve likely experienced in your own SEO career, the paradigm of “one size fits all” really has no place in our industry.
Every niche is different and every site has its own particular needs.
In this particular case, you’ve seen how to apply extensive keyword research to develop a content strategy.