After a decade of practicing SEO and following a few constant principles, I felt it might be time to write an article about them.
They are closely related to topics that are continuously and heavily debated in the industry, mostly in heated arguments on social networks and forums.
The concept of “Deserved rankings”
I’d like to introduce you to the concept I base my entire SEO philosophy on. I call it the “Deserved Ranking Level”, or DRL. The basic premise is this:
As long as the website is at the level of its deserved rankings, there is almost no chance for it to be brought down by Google algorithm updates.
The deserved ranking is the level of rankings that “feels just right”, given the level of trust users have in the site’s content. It plays closely with the EAT concept, but is not quite the same. EAT stands for Expertise, Authority, Trustworthiness. It is a concept first mentioned in Google’s Search Quality Rater Guidelines. However, my definition of this concept is a few years older and not quite the same thing. But let’s leave that for later.
Unfortunately, DRL depends on factors that are too intangible. For this reason, I don’t have a formula to quantify it and show it as a number. At least not for now. But it still fits my workflow very nicely. My clients have had no loss of traffic in any algorithm update since the early 2010s.
Two different kinds of SEO
So if there is a deserved ranking level, then there must be websites that are below it, and websites that are above it. Both situations are bad, each for its own reasons.
SEO when a website is below its Deserved Ranking Level
Let’s say that an article is great and given all the domain and other factors, it currently deserves to be ranked at #5 on Google for a particular search query.
If the content is blocked by robots.txt, or if it is noindexed, then it really doesn’t matter what it deserves. It simply won’t rank anywhere. As soon as we unblock it, it will take its rightful place.
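As a quick illustration, Python’s standard library can show how a robots.txt rule keeps a crawler out regardless of what the page deserves. The robots.txt content and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The blocked article cannot be fetched, so it cannot rank,
# no matter how good it is. The rest of the site is unaffected.
print(parser.can_fetch("Googlebot", "https://example.com/blog/great-article"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/products"))            # True
```

Remove the `Disallow` line (or the noindex tag, in that case) and the page is free to take its deserved spot.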
The same goes for technical SEO as a whole. Setting up Search Console? Submitting the sitemap? Making sure the protocol is HTTPS? Checking duplicate content? Having meta tags? Fixing crawl errors? URL structure? Image tags? There are dozens of things found on most SEO checklists. None of those is any kind of ranking signal. All they do is clear the way for the page to rank where it was already supposed to be ranking.
Technical SEO is relatively quick to learn and quite easy. Tedious sometimes, but easy.
If the website is below its DRL, it requires technical SEO to raise it to its Deserved Ranking Level. This is a boost nonetheless, but it is one that a junior SEO can perform with ease.
Backlinks are a very sensitive topic, and we will deal with them later. Right now, let’s just say that a website is at its level of deserved rankings if it has collected a “reasonable number” of real votes. This number depends on the age of the website and the linked article.
I like referring to backlinks as “votes”, because that’s precisely what they are. It is very hard to quantify, but there is definitely a phenomenon where content has too many links, no matter whether they are spammy or perfectly valid.
Standard technical SEO cannot push the website’s rankings. It can only move the site up if it is below its deserved position. The thing is, it often is below the deserved position. This is where “easy money” is made. Just imagine how big a “guru” an SEO must be to make a site rank within a few days of doing “his magic”. Site owners don’t have to know that all he did was release the brakes, right? ;)
SEO to push the Deserved Ranking level up
OK, so if 90% of what we usually call SEO cannot push the site beyond its already deserved position, then what can?
Well, it is complicated, unfortunately. There is no one-size-fits-all formula. Everything starts with the strategy. Part of it is evaluating things that happened long before the website was even taken into consideration.
Is the company worth boosting?
As an Executive MBA and a former business consultant, I cannot unlearn certain concepts that I find very important while doing the initial SEO research. So who is behind the company? What are their motives? Are they really good at what they do? Are their products and services exceptional? How do they communicate their values? How do they treat their customers?
If the company is not good, then what is the purpose of boosting it? There is no SEO that can change this. From the first day of doing SEO for such a company, it is all swimming against the tide. There is also a moral question. As SEOs, we are primarily serving end users. Are we ready to throw garbage at them just to take the money?
This philosophy is very much in line with Google or any other search engine. If they rank bad stuff, people will slowly leave them for their competitors. Yes, they need to push ads and they need to make money. But as much as they may want to, they will never be able to kick the organic results out. So since they need them to be there, they also need them to be exceptional, and to meet the user’s intent and desired user experience in the best possible way.
Even when they serve paid ads, they still have relevance scores that make ads more or less expensive based on various quality factors. Their primary goal is to treat the searcher like a king, in order to make more money. As simple as that.
So, is the website that sits below the Deserved ranking level harmful for Google? No!
It is simply not optimized well, but it still may be fantastic for the user. How many times have we heard something like “but, but… They are not SEO optimized at all, and they are still ranking well“.
Google will do everything in its ability to serve even under-optimized websites, if they meet the searcher’s intent and the expected user experience.
OK, that is clear. But what about websites that rank higher than they deserve? Well, that is the domain of the dreaded “falling on Algorithm updates” situation.
There is no way Google would destroy a website that ranks at its deserved ranking position. I mean, it happened in the past as a result of mistakes that were mostly undone. But it never happens as a result of some “vendetta” of Google’s against certain websites.
I leave some room for objections about Google favoring certain big players in some niches, but even there, it maintains some level of fairness. It could certainly improve in that department. But again, websites do not fall without a reason. They fall because they rank higher than they deserve.
How to rank a website higher than it deserves (Above its DRL)?
We have established that this is bad. Right?
OK, so how to do it? Simple: by manipulation. What are the manipulation methods? Well, it almost entirely falls to link building. Any kind of link building. Literally.
We have already used the common comparison of links and votes. They are very similar in nature. A vote is valid only if it comes from a sane person expressing their free will. We cannot cast votes on behalf of our entire family and friends at elections. It doesn’t work like that.
Backlinks come from various domains. Some are weaker, and some are stronger. When I say stronger, I mean they have accumulated more trust from Google. They can rank their content better and faster. Also, Google does less split testing with them, moving them up and down to establish the best deserved position. They already have some credit.
Their credit and trust extends to their voting power. If they link to someone, then that someone must be doing something good. If a lot of trusted sites are linking or “voting” for a website, then it should rank better. That is the general idea.
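This “trusted sites pass their credit through links” idea is essentially what the original PageRank algorithm formalized. Here is a toy sketch in Python; the link graph, damping factor and iteration count are illustrative assumptions, not Google’s real model:

```python
# Toy PageRank-style "voting": each page spreads its credit across the
# pages it links to. Domains here are made up for illustration.
links = {
    "trusted-guide.example": ["bakery.example"],
    "blog-a.example": ["bakery.example", "trusted-guide.example"],
    "bakery.example": [],
}

def pagerank(links, damping=0.85, iterations=50):
    n = len(links)
    rank = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / n for page in links}
        for page, outgoing in links.items():
            if not outgoing:
                # Dangling page: spread its credit evenly to all pages.
                for other in new_rank:
                    new_rank[other] += damping * rank[page] / n
            else:
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

ranks = pagerank(links)
# The page collecting the most "votes" ends up with the highest score.
print(max(ranks, key=ranks.get))  # bakery.example
```

The page that everyone links to accumulates the most credit, even though it casts no votes itself. That is the mechanical core of the “votes” metaphor.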
Any kind of link building, including guest posting, is taking a shortcut and abusing the trust Google has in the linking website. It is not a real vote. It is merely the owner of the linked website pretending to be someone else voting for his/her own website. Yes, we can vote for ourselves in elections, but we cannot do that in someone else’s name.
It is hard for Google to prove that. But it is investing insane amounts of resources and money into state-of-the-art AI and machine learning methods, to weed out false votes.
The premises for proper SEO were laid out not years, but decades ago. Each of Google’s algorithm changes, or at least 90% of them, is Google trying to get closer to enforcing what it has preached since forever.
There are other manipulation methods, but most of them are more technical and they were already dealt with by older algorithm updates.
A website that is above its DRL was pushed there artificially. It does rank. It may even be outranking competitors that have far better content and UX. This behavior is a target of Google algorithm updates. On each update, these websites swap their higher positions with the ones that are at their DRL, or even below it.
So, is any encouragement of links a manipulation then? No, not exactly. If we push the deserved ranking level up, then it is OK to nudge others to vote faster. The site will align its actual ranking level with the deserved one more quickly. There will be no mismatch, and the website is safe.
This is all nice on paper, but how do we do that? Well, the answer comes from proper research. I think it was Brian Dean who said that nobody ever links to a website; they link to resources. I read that years after having pursued the very same tactic myself, because it was perfectly logical.
I’ll try to stay general and on topic in this article, but here is just a single example of what I consider to be proper link building. One of our clients is a wholesale gluten-free bakery. What makes them special is exceptional quality and country-wide, daily-fresh gluten-free bread. They had already been in business for two years, but without a website. The problem was, the website was scheduled to be finished in March, and they asked if they could target tourists for the summer season with organic rankings. Well… That is not so easy. I knew that despite all the local SEO, local landing pages, targeting the right languages, etc., I simply had to reach for some very strong backlinks.
To cut a long story short, after doing the research and the gap analysis, I decided to create a precise interactive map of all the places in the country where fresh gluten-free bread can be bought. This is of high value to the target customers, especially tourists.
So, after creating the map, I searched for who was already ranking in the top spots for my key target search phrases. Luckily, there were only a few that covered almost the entire demand. The second good thing was that none of the ranking websites were competitors. They didn’t make products; they were travel guides. So basically, along with the link juice, I was able to harness a lot of referral traffic from those pages.
So it was just a matter of reaching out and saying how great their content was. It was a bit dated though, so it was a good time for them to enrich their content with this great resource… A few responded and gladly placed the links. They also added surrounding content explaining how the map is very useful (for their readers). The result: doubled sales in the summer season. The company has been growing strongly ever since, and today they even rank at #2 for the tough YMYL phrase “what is gluten”, plus top five in tons of local searches targeting all major cities in the country. Almost all websites listed below them have much higher DR and DA ratings.
A YMYL website is another Google Quality Raters Guide term, meaning “Your Money or Your Life”. Since such websites affect human lives and wealth, it is harder to rank for them because the criteria are theoretically stricter. DR is “Domain Rating”, and DA is “Domain Authority”. Both are basically ways to rate websites based on the “strength” of their backlinks. Yet, many low-DR or low-DA websites are outranking stronger ones. This is because their deserved ranking level is more realistic. Many high-DR or high-DA websites are strong mathematically, but their deserved ranking level forces them to rank lower, so they do.
There is a difference between “Links” and “Link building”
Any larger SEO Social Media group or forum has threads where people are asking about most important ranking signals and factors.
Almost all the time, someone says “links”. And yes, they are right. On the other hand, we hear questions like “Can we rank a website without links?”, and then this person comes under fire for being so ignorant. “Of course you cannot.”
But is that person really ignorant? Or was the question badly asked? The heated debate is mostly due to a simple misunderstanding. People often confuse “links” and “link building”. You cannot rank without links, but you can rank without active link building. Links will still come if the site is trustworthy and link-worthy; they will just come at a slower pace. This is often much safer in the long term, because there will be no reason for future algorithm updates to adjust the DRL.
But what if a company wants to sell a highly trendy product, like a fidget spinner? Is traditional SEO good for it? Of course not. By the time the website starts to rank naturally, the opportunity is long gone. If the company wants to utilize at least some of the organic channel to cut overall costs, it has to go nasty here. It must utilize the blackest-hat tactics possible. Yes, the whole thing will be penalized, but they don’t care, as they will have collected the profits. The website was hopefully not their main domain, but an expendable one.
Why am I saying this? Because of the main topic of this article, which is SEO debates. Am I for or against link building? Neither. It is just a matter of doing it correctly and safely, if safety is needed. Am I against so-called black-hat SEO tactics? No. If the company and the product are good, and they need to shine and burn quickly, then that may be the solution.
Debates are meaningless. Everything has its time and place.
How to make the website deserve higher rankings?
Good question. We cannot go into endless details here, but we can make a high-level overview. Everything starts with a problem that formed in the user’s head. At some point, he decided to search for a solution to it on the Internet.
In this process, he could be in any of the major user categories. There are several ways to categorize searchers, but to keep it simple, let’s say that he is either a cold, warm or a hot lead.
- Cold lead – has no knowledge of the proposed solution or the company
- Warm lead – knows generally about the solution, but has no idea about the company or the particular product or service
- Hot lead – Knows what he wants and searches for the shortest way to get it
There is no way to provide the same level of UX (user experience) for all three of them on a single landing page. Actually, there is, with very advanced copywriting, but for now let’s say that we must create separate content for all three, and each has a different goal. The cold lead needs to be warmed up, the warm lead needs to be heated, and the hot lead needs to buy our product or service.
The best way to plan this is in the early phase of setting up the website’s content hierarchy. Also, the sales funnel needs to be aligned with it.
Targeting the right “user temperature” is the first step of providing great UX, but it is just the beginning. The rest of the UX is gained by Frictionless flow, easy site navigation, logical structure, great content writing and copywriting, easy content readability, site design, site performance and a ton of other things.
On top of this, each one of the three parts of the EAT package needs to be at the highest level possible, and certainly above the competition.
The last sentence is where I get roasted, and for a good reason.
Google has confirmed that it doesn’t measure E.A.T.
Yes. That’s correct. EAT is not a ranking signal. It is merely a concept for Search Quality Raters. Something for them to strive for. So wait a minute, am I contradicting myself?
Not at all.
Google has no way to measure EAT, because it is hard to quantify and prove. At least for now. But what is the purpose of Search Quality Raters? Their job is not to manually correct the indexing of something. They have been given a picture of the ideal world—something Google wants to target with its algorithm. So by using the raters, Google measures how close the next iteration of the algorithm has got. And it is getting closer, and it is getting smarter.
Machine learning and big data are allowing Google to get better at spotting a myriad of signals that can raise websites more closely aligned with the idealized EAT.
So while EAT is not a ranking signal per se, it is still a good measure for a site that wants to do great on Google. Why? Well, because in most niches EAT is exactly what users want.
In most niches, people want to hear advice from an expert, provided he is not dull. So he needs to be authoritative. This authority is offline as much as it is online. The more famous the person is, and the more he is associated with success in his niche, the more perceived authority he has. If this happens consistently over a long period of time, it makes him and his company trustworthy. If a company and its people are trustworthy, then the website is trustworthy as well, provided that it has great design and UX.
So, would you like a website that embodies EAT for you to rank at the top for your search?
Of course you would. And so would Google. EAT starts offline, and it happens across many channels. EAT is the one that makes natural links come at a faster pace. It is the one that generates great reviews and shares. It is the one that generates more traffic and higher conversion rates. It is the one that brings more sales. EAT is everything.
EAT, which is not a ranking signal at all, eventually becomes the strongest ranking factor. Much like gravity: it is roughly 10⁴⁰ times weaker than the electromagnetic force. It is beyond measure weaker than the nuclear force. But eventually, on the exponential curve, it beats them all. It ignites stars, holds galaxies together and creates black holes. It even keeps light from escaping.
Again, if we have EAT, that means we also have great backlinks, UX, engagement, site performance, conversion optimization, copywriting… Basically, all confirmed or unconfirmed ranking factors and signals are part of it.
Ranking Signals vs Ranking Factors
I have used the words factors and signals interchangeably, so it may be a good idea to explain why. There is a difference.
A ranking signal is a direct, confirmed, tangible thing that can boost rankings. HTTPS is one of them, but it is a rather weak signal. Speed is also a ranking signal, especially mobile speed. So are mobile-friendliness and quality backlinks. We could say keyword intent as well, but that can also fall under ranking factors. There is a popular belief that Google has over 200 ranking signals. Whatever they are, they are like Coca-Cola’s recipe: a securely kept secret.
There is a newer ranking signal called “RankBrain”. It is based on machine learning and has a “mind of its own”. Google’s engineers like to say that they have no clue why it selects certain pages over others. We don’t know if it has replaced any of the old signals, but we may very well assume that it might, as it gets smarter and better.
It is not hard to imagine a not-so-distant future where RankBrain becomes the ultimate boss.
Ranking factors are things that strongly correlate with better rankings. This is mostly user experience and user intent, as well as all three of the EAT family: Expertise, Authority and Trustworthiness. In short, the more consistently awesome the website is for the searcher, the higher it will eventually rank.
It is what people actually want. Which brings us to another point:
Googlebots are like Commander Data: they strive to be as human as possible
All ranking factors are based on real human interactions. If an unknown person approaches us, we instinctively don’t trust them. We might start trusting them sooner or later. What would help? Well, appearance, for starters. We trust nicer-looking people (or nicer-looking websites). Our trust also depends on the context in which we meet people. If we meet a person in a bank as a personal banker, we trust them when they talk about financial topics. If we run into that person at night in an empty street, we are cautious. If we meet a random person on the street talking about finances, we take everything they say with a grain of salt. All this is about relevancy.
We trust a person more if they are literate and talk normally. We even trust them more if they have a lower voice pitch. (Content presentation and copywriting.) By the way, did you know Margaret Thatcher worked hard to lower her voice? Who would trust a high-pitched Iron Lady?
I could go on, but you get the point. The direction Googlebots are heading in is getting as close to humans as possible. Until we develop real AI, machine learning is the next best thing, and Google is getting better at it every day.
This again brings us back to improving UX, design and copywriting, and increasing conversion rates to increase the DRL.
Conversion rates are a ranking factor? You’re nuts, I’m outta here!
Hold on, think about it. What is increasing when Conversion rates are increasing?
Trust is. So are topical authority and expertise, UX, blah blah… do I need to repeat those every time?
If there is one single metric that clearly measures an increase in DRL, then it is the conversion rate. The number itself means nothing.
-What’s your conversion rate?
-4%.
-Thanks, what now?
Nothing. It means nothing. It only starts to mean something if we take steps to increase the conversion rate(s). So if we did proper CRO (Conversion Rate Optimization), and we increased conversion rates from 4% to 6%, that’s a 50% increase in trust! And profits, BTW.
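The arithmetic behind that 50% figure is simple relative change:

```python
# Relative increase between two conversion rates, as in the 4% -> 6% example.
def relative_increase(before, after):
    return (after - before) / before

# 4% -> 6% is a 50% relative increase, even though the absolute
# difference is only 2 percentage points.
print(f"{relative_increase(0.04, 0.06):.0%}")  # 50%
```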
Is there anyone who dares to argue that a 50% increase in a website’s trust won’t do a thing for SEO?
The only problem is how to do it properly. The first thing to do is clean out all spam and get real numbers. Then the entire website needs to be tagged properly, ideally using GTM (Google Tag Manager). All “user temperatures” have their own goals. In most niches, we cannot expect a cold lead to convert immediately. But there are still goals for them; those goals are simply worth less. The ultimate goal is the financial transaction, or a contact from a hot lead. These are the most valuable goals.
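A minimal sketch of how such goal weighting might look once the tagging is in place. The goal names and weights below are hypothetical, not a GTM configuration:

```python
# Hypothetical goal weights per "user temperature": a purchase by a hot
# lead counts far more than a newsletter signup by a cold one.
GOAL_WEIGHTS = {
    "cold_newsletter_signup": 1,
    "warm_brochure_download": 3,
    "hot_contact_form": 10,
    "hot_purchase": 25,
}

def weighted_score(completed_goals):
    """Sum the weights of all goals completed in a session or period."""
    return sum(GOAL_WEIGHTS[goal] for goal in completed_goals)

# A session that signed up for the newsletter and later purchased.
print(weighted_score(["cold_newsletter_signup", "hot_purchase"]))  # 26
```

Tracking a weighted score like this, rather than a single flat conversion rate, keeps the cheaper goals visible without letting them drown out the valuable ones.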
Informational websites that serve ads have their own goals. Those are not merely clicks. They are also how effectively and for how long viewers are exposed to those ads. Are there attention catchers that slow users down before the ad is served?
Purely informational websites that don’t sell anything obviously have goals in form of scroll depth, time on page and visited pages.
Increasing conversion rates is one of the best and most measurable ways to push a website’s DRL up.
What about keyword research?
Ah, thanks for the reminder, I had almost forgotten it. :)
Keyword research is important. This is because, no matter how hard Google tries to move beyond keywords, content is still made of words, written or spoken (in voice search).
If we have words, then we also have key-words. Those are the ones most relevant to the topic.
Google is getting brilliant at understanding the relations between keywords and entities, and this makes one part of the SEO’s job much easier. There is no more need to repeat keywords a certain number of times, or to use variations, exact matches, etc.
Still, a good SEO needs to be able to derive all the main keywords for the targeted topic. These keywords will be fantastic guides for the content writers, telling them what to write about, while knowing what people actually want to hear and are most interested in.
Good keyword research is able to produce great content. It gives so many ideas about subtopics that would otherwise be forgotten.
To do keyword research, SEOs rely mostly on paid or free tools that generate those keywords. The two most common data sources are paid keywords used in searches (Google Keyword Planner) and Google Suggest, which can be seen while typing a query into Google Search.
Both are useful, but they have some major flaws. Google Keyword Planner relies on paid-search data, which comes from someone selecting keywords for their ads. Google Suggest will only suggest keywords that are often used along with the main (seed) keyword. So far, there is no tool able to generate totally different keywords that are not associated with the seed keyword but are associated with the overall topic of the article.
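For context, suggest-style tools typically work by probing autocomplete with “seed plus letter” variations, which is exactly why every result they return contains the seed. A rough sketch of how those probe queries are built (the seed phrase is just an example, and real tools would send each probe to an autocomplete endpoint):

```python
import string

# Build the classic "seed + a", "seed + b", ... probe queries that
# suggest-scraping tools feed to autocomplete. Every probe (and thus
# every suggestion it yields) contains the seed, which is the
# limitation described above.
def suggest_probes(seed):
    return [f"{seed} {letter}" for letter in string.ascii_lowercase]

probes = suggest_probes("gluten free bread")
print(probes[:3])  # ['gluten free bread a', 'gluten free bread b', 'gluten free bread c']
```

Topically related keywords that do not contain the seed can never fall out of this loop; finding them still takes human research.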
That is one of the situations where an SEO earns his paycheck. It takes creativity and unconventional research methods to uncover those keywords. And the last place to search for them is a competitor’s website. Researching the competition for this purpose should be the last step, done only to verify that our content is better. We cannot be better if we are copying them, and our content needs to be many times better than theirs.
This is where SEO becomes exciting, despite the fact that not even keywords are a ranking factor. They too are just a guide to steer Google toward properly displaying search results and understanding the content.
Keywords are another topic of heated debate, with the “just write naturally” argument at one end, and “use as many keywords as you can” at the other.
The truth is that it is hard to write precisely targeted content just by writing naturally, even for an expert in the topic. Keyword research provides structure, continuity and security. But one does not have to be a slave to density ratios and content analysis plugins.
So, do you mean that SEO plugins are useless?
I didn’t say that. SEO plugins are essential, because popular CMS systems like WordPress don’t even have meta tags out of the box, and they create unwanted thin content like media attachment pages and various useless archives. A good SEO plugin will take care of the sitemap and keep it in sync with the content’s index status, for example. It will take care of many essential technical aspects of SEO.
Even the content analysis feature can be useful, mostly to beginners. But just like any other keyword tool, it is limited, and it suggests the same basic methods that most of our competition is using. It won’t provide a framework for outstanding content that will eventually kill off the competition. It will not give that exciting “predator feeling” when we zero in on a heavily contested search phrase and watch our content slowly reach the top and “cement there forever”, leaving the competition in despair.
Can’t we just hire a Copywriter?
Sure. By all means, that is precisely what a big company should do. Why a “big” company? Well, most companies cannot afford it. This article would cost about $17,000 to be written by a copywriter. With all the research maxed out, it could reach $50,000.
– You’re kidding right?
No, of course not. Those are official prices. Oh wait! Did you mean a content writer?
There’s a big difference there, and the two are often confused, leading to another heated debate.
A Content writer does some research, reads up what’s on the Internet, and writes his/her own piece. Some are very talented and they are a pleasure to read.
But there’s one problem—people don’t read.
A copywriter will sell the product or the idea even to those who don’t read, which is the majority of people on the Internet. When was the last time you read an entire page of something that was not of direct interest to you? We don’t do that.
We are in search mode, skimming around and looking for quick information. You have read this article all the way down only because you are my hot lead. You are deeply interested in SEO. Maybe you will later send me an email asking me to create an entire SEO strategy for your business, and this article gave you the confidence to do that. Or you are a competitor. :)
But hot leads are a minority. A good copywriter will also move both cold and warm leads into the funnel in a much greater percentage.
Imagine the landing page of a flagship smartphone that is projected to make $50,000,000 in the next six months. A good copywriter will make it earn at least $75,000,000. A good conversion optimizer will add $25,000,000 more.
So what businesses usually do is search for cheap content writers, who often pitch their services as copywriting. That content often won’t be read. That’s OK. A copywriter’s content won’t be read by most people either. But it will generate a huge difference in sales, and the copywriter’s content will provide hugely better UX, which ultimately dances with SEO.
There is a flaw in both of them: neither is an expert in the company’s niche, and neither is the voice of the company. Sure, they can write in the name of the alleged voice, but this pretending is not nearly as effective as a real, live company face. One that is turned into a star by smart PR. One that extends its EAT to the company and its website.
So what is the second-best solution? There is only one: someone in the company needs to make a sacrifice. Ideally, he/she should have some writing talent and, if possible, a bit of charisma. Then an expert should provide intermediate copywriting education for this person. It is a quantifiable science. It can be taught and trained.
Once the person is trained, an SEO should constantly provide new topics with pre-researched keywords, headings and content structure to write around. People start to have fun with this approach. It is similar to a great coach who only asks questions, while the person already has all the answers.
It all leads to increasing the Deserved Ranking Level of the website: a sure way to stay strong and grow on Google for many years to come. Google algorithm update? Can’t wait for the next one!
Instead of the conclusion…
At 5,000+ words, this article is already too long. We do have to stop at some point. Let’s do it now. I am not even targeting any keyword with it, lol. Or maybe DRL becomes a new buzzword? :) We’ll see. See you in the next article!