Bill Slawski of SEO By The Sea just published an interesting post about a patent that Google filed a couple of years ago. In the patent, Google describes a system where they deliberately mislead webmasters they suspect of spamming during their transition from low rank to high rank. Basically, they give the webmaster undesirable results for his efforts and track his responses. If they don’t like his responses, he may end up ranking lower than he originally did.
QUOTE – “The rank transition may provide confusing indications of the impact on rank in response to rank-modifying spamming activities. Implementations consistent with the principles of the invention may also observe spammers’ reactions to rank changes to identify documents that are actively being manipulated.”
QUOTE – “If the document is determined to be subjected to rank-modifying spamming, then the document, site, domain, and/or contributing links may be designated as spam. This spam can either be investigated, ignored, or used as contra-indications of quality (e.g., to degrade the rank of the spam or make the rank of the spam negative).”
Google Does Use This
As you may have figured out, just because Google files a patent doesn’t mean they use it. But I’ve personally seen this one in action, and so has every “black-hat SEO” who’s tried to rank a new page for a desired keyword. I don’t do black-hat SEO anymore, but I did just go and check around the black-hat community to see if this is still in action, and it definitely is.
What This Algorithm Actually Does
First of all, I’m referring to how this algorithm responds to link building. I never performed or observed any dirty on-page tactics. When I created my pages, I created content for humans and made sure they got what they were looking for. Also, I’m going to try not to give out too many details because I didn’t create this site to teach black-hat SEO tactics.
When you first start building links on a new site, there won’t be much of a response for a while. It’s kinda funny. When I got started, I knew pretty much everything about Google’s algorithm — everything except for the delay algorithm. As you can imagine, I was very disturbed when I had more going for my site than my competitors did and still found my page invisible for the keywords I was chasing. It took me completely by surprise, to say the least.
The vets told me to just keep doing what I was doing and everything would work out. They said more than that, but, as I said earlier, I’m intentionally leaving out details. I’m really not trying to make this site a go-to spot for black-hats. Black-hat sucks. It’s stressful, boring, and honestly, unnecessary. There are far more effective and fulfilling ways to build links that don’t violate Google’s guidelines. Anyhow.
Eventually, the delay expires and you start moving up and down in the rankings, sometimes dramatically. Then the ranking jumps expire and you hit another delay. That continues in cycles until the algorithm determines that you’re trustworthy. That’s what I saw, anyway.
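To make that cycle concrete, here’s a toy simulation in Python. To be clear, this is not Google’s code: the delay lengths, rank numbers, and noise range below are all invented purely to illustrate the delay-then-jumps pattern I just described.

```python
import random

def toy_rank_transition(true_rank, day, delay_days=60, cycle_days=30, noise=10):
    """Toy model of the delay-then-jumps cycle: an initial delay where
    links appear to do nothing, then noisy ranking jumps, then another
    delay, repeating. All parameters are made up for illustration."""
    if day < delay_days:
        return 100  # initial delay: new links have no visible effect
    phase = (day - delay_days) % (2 * cycle_days)
    if phase < cycle_days:
        # jump period: rank oscillates around the "deserved" position
        return max(1, true_rank + random.randint(-noise, noise))
    return 100  # another delay period before the next round of jumps

# example: a page whose links "deserve" position 5
random.seed(0)
for day in (10, 70, 100, 130):
    print(day, toy_rank_transition(5, day))
```

Day 10 and day 100 fall inside delay windows, so the page sits at an invisible rank 100 no matter how strong its links are; days 70 and 130 land in jump windows, where the rank bounces around position 5.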
Does this algorithm deter spammers? I don’t know, but I do know that it scares the hell out of newbies. To tell you the truth, it scares some of the veterans too. Think about it this way: every time you go to rank a new site, you have to wonder whether this will be the time it doesn’t work. The transition period isn’t all that consistent, and when your rankings don’t settle as quickly as they did last time, it can be a little nerve-racking.
Here are a couple more quotes from the patent to show that this is the same algorithm.
QUOTE – “Correlation can be used as a powerful statistical prediction tool. In the event of a delayed (positive) rank response, the changes made during the delay period that impact particular documents can be identified. In the event of a negative initial rank response, correlation can be used to identify reversion changes during the initial negative rank response. In either case, successive attempts to manipulate a document’s rank will be highlighted in correlation over time. Thus, correlation over time can be used as an automated indicator of rank-modifying spam.”
QUOTE – “Alternatively, or additionally, noise may be injected into the document’s rank determination. This noise might cause random, variable, and/or undesirable changes in the document’s rank in an attempt to get the spammer to take corrective action. This corrective action may assist in identifying the document as being subjected to rank-modifying spamming.”
What I Didn’t Know
I did notice something in there that I had never seen or heard of before. The patent makes multiple references to noting changes to “suspicious documents” so as to better identify web spam. I hadn’t heard of that before I read the patent, and honestly, I’m kinda disappointed I didn’t think of it myself. Every good black-hat knows that Google tailors their algorithm around human behavior. It makes sense that they would note changes to web pages during suspicious periods to better understand web-spam behavior. It’s difficult to say how well it works without seeing the data. I’ll definitely be even more mindful when I’m making changes to my sites in the future. Even now that I’m strictly white-hat, I still find myself being careful not to do anything that might be misinterpreted.
Hat tip to Bill Slawski for finding that patent.