Keywords and content length

As an active Quora user, I keep seeing the same few questions in my feed, all variants of things like:

What is the best way to improve SEO?

How can I improve our SEO performance?

How can I improve my SEO ranking to reach the first page?

What are the quickest ways to drive SEO traffic to a new website?

The answers to these questions are just as redundant. You’ve probably seen the same old combination of “create original content”, “install Yoast”, “off/on-site SEO”, “create powerful titles”, etcetera. To a lesser extent, you will also see users self-promoting their “white hat” link building services.

What is rarely promoted is the meat and potatoes of any high-ranking blog: better content. Don’t believe us? Just take the word of Google’s search liaison, Danny Sullivan:

Want to do better with a broad change? Have great content. Yeah, the same boring answer. But if you want a better idea of what we consider great content, read our raters guidelines. That's like almost 200 pages of things to consider: — Danny Sullivan (@dannysullivan) August 1, 2018 


Some of the biggest voices in SEO have shown that longer content tends to rank higher. In this analysis by HubSpot, posts of 2,000 words or more seemed to be the sweet spot for optimal search engine traffic.

Short posts under-perform.

When it comes to social media shares, which mean more backlink and traffic potential, extremely long articles usually receive the most engagement.

This doesn’t mean stuffing words into every nook and cranny, like the intro or conclusion. It means packing as much valuable content into your post as possible to earn more shares, mentions, and backlinks.
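If you draft posts programmatically or through a CMS, a quick pre-publish check can flag drafts that fall below the rough length bands discussed above. This is a minimal sketch in Python; the 2,000-word target is the HubSpot figure cited earlier, not a hard rule, and the band labels are our own assumptions:

```python
import re

def word_count(text: str) -> int:
    """Count words, ignoring punctuation and extra whitespace."""
    return len(re.findall(r"[A-Za-z0-9']+", text))

def length_check(text: str, target: int = 2000) -> str:
    """Classify a draft against rough length bands (thresholds are illustrative)."""
    n = word_count(text)
    if n < 500:
        return f"{n} words: likely too thin to stand out"
    if n < 1000:
        return f"{n} words: short; consider expanding"
    if n < target:
        return f"{n} words: solid, below the ~{target}-word sweet spot"
    return f"{n} words: at or above the ~{target}-word sweet spot"
```

A check like this is only a guardrail; padding a thin draft to hit a number defeats the purpose.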

Obviously, creating a long article takes an investment of time, or money if you hire a content writer. This is why the typical blog post contains 500 words or fewer, its author assuming brevity is sufficient to deliver a message. Even a thoughtful 1,000-word blog post can be enough to stand out from the sea of mediocrity.

Authenticity

Everyone has access to Google and Wikipedia, so there is no shortage of spun content or regurgitated ideas on the net. This is why Google’s developers are actively sharpening their indexing algorithm to surface content that’s truthful and authoritative.

Taking a look at Google’s Medic update, websites that littered the web with nonfactual or biased content seemed to be on the receiving end. While the update affected a broad range of categories, it particularly targeted charlatans making exaggerated claims without evidence or credentials to back them up. Google stated that they did not target health blogs specifically; rather, health sites just happened to be the majority of offenders.

This was made apparent by Google’s John Mueller in this Reddit thread:

“At the point where they’re hand-picking what they consider to be “real data,” it’s really hard to make objective recommendations :-). Would comparing with other sites be useful?

One approach might be to do an A/B test based on their requests, update some part of the site accordingly, and see what happens (probably not much). Doing that the other way around wouldn’t work though, improving just a part of the site’s general quality (in whatever ways make sense for a site like that) is kinda hard without improving the whole site.

My thoughts are that if the site was ranking reasonably previously, then technically it’s unlikely to be that bad. There’s always something that can be tweaked & improved from a technical point of view, and these can give you incremental wins, and there’s also a clean-out of duplicate content, which is somewhere between technical & quality, which can help over the long run. However, if you’ve seen a significant, steady change around the core algorithm updates, then you probably want to go past incremental updates and instead rethink things overall.

The quality rater’s guidelines & the old Panda blog post (“More guidance on building high-quality sites”) are good places to get ideas. The important point (in my eyes) is that this is not a “tweak h1’s, inject keywords, get links” kind of traditional SEO work, but rather you’d want to step back, understand where the site’s audience is & where it’s going, and rethink how you’d like to position the site within the 2019+-web. As an SEO consultant, you’ve probably seen a lot of potential directions, and how they’ve evolved over the years, so you might be in a good place to make informed recommendations.”

Long-tail Keywords

When researching keywords to base your article on, you may be tempted to pick the ones with the biggest numbers. What you need to consider is that most simple keywords will be highly competitive for a fresh blog on the web.

Did you know that most search engine queries are actually long-tail keywords? Internet users are more complex than we thought and they are looking for specific answers to specific questions.

In fact, less than 30% of searches are simple, head-term queries. Ironically, it’s these queries that receive the bulk of SEO investment, even though their conversion rates are considerably lower.

Source: Neil Patel

To find long-tail keywords, look for phrases like “red converse shoes for girls” instead of “converse shoes”. As you venture into more specific long-tail keywords, you will probably face less competition.

As stated above, your content should have a decent length (1,000–2,000+ words), which also helps you build natural long-tail keywords without research. If you are writing readable content with a variety of information, you will probably hit on phrases that humans naturally search for. Especially with Google trying to push out robotic, keyword-stuffed content, let the long-tail keywords flow naturally.
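To see which long-tail phrases are already emerging in a draft, you can simply count repeated word n-grams. This is a toy sketch, not a real keyword tool; the phrase length and the idea that 3+ word phrases approximate long-tail keywords are our assumptions:

```python
import re
from collections import Counter

def long_tail_candidates(text: str, n: int = 4, top: int = 5):
    """Return the `top` most frequent n-word phrases in a draft.

    Repeated phrases of three or more words are a rough proxy for the
    long-tail keywords a post naturally targets.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    grams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    return Counter(grams).most_common(top)

# Example on a tiny snippet:
draft = "red converse shoes for girls are popular red converse shoes for girls"
print(long_tail_candidates(draft, n=5, top=1))
# → [('red converse shoes for girls', 2)]
```

Real keyword research would layer search-volume and competition data on top of this, but even a crude count shows whether your draft repeats the phrases you hope to rank for.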

The Off-Site SEO Minefield

Believe it or not, Google actually doesn’t want you amassing links to get ranked. While the top-ranking blogs in any SERP will have a high number of links, most of those links should be obtained naturally.

An SEO agency using old-fashioned methods may use private blog networks (PBNs), content farms built on expired domains, to build backlinks anchored on keywords. These unnatural backlink profiles lead Google to push down a website’s rankings, or even de-index otherwise legitimate domains.

Final Thoughts

It is becoming apparent that Google no longer wants SEO agencies poking and prodding their way to the top. The “Wild West” internet, where everyone’s opinion carried equal weight, may be coming to an end: the best way to rank as a business is to have industry experts giving valid answers to organic search queries. It’s a grim future for SEO agencies, but good news for knowledge seekers and professional writers alike.