SEO, or search engine optimization, has changed significantly since its early days, when the term was first coined over a decade ago. Now the leaders of the online industry are hinting that another fundamental and philosophical shift is underway.
They claim that traditional SEO methods no longer work, and that those previous methods need to be put into perspective. SEO has always been reactive by nature: experts would attempt to predict what a particular search engine such as Google uses to qualify a site for better indexing.
These experts would then find better, more efficient methods to satisfy the beast, all so that their sites would perform better in the search engines' organic rankings.
The majority of these tactics have evolved over time, following the blueprint of the classic cat-and-mouse game, a chase these online marketers were forced to play.
They would analyze and attempt to reverse engineer the search engines' ranking algorithms, all to catapult their sites to better results.
Once Upon A Time In The Wild World Wide Web
In the pioneer days of SEO, the search engines relied heavily on webmasters and their proper use of HTML meta tags, which identified the exact keywords related to the content on a page.
The search engine robots would then prioritize organic rankings based strictly on characteristics such as keyword density, which is how often a particular "keyword" occurred on the page, to determine ranking order.
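As a rough illustration only (the exact formulas the early engines used were never public), keyword density can be sketched as the share of words on a page that match a target keyword:

```python
import re

def keyword_density(page_text: str, keyword: str) -> float:
    """Fraction of the page's words that match the keyword (hypothetical metric)."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# Example page text: 3 of 8 words are "cheap", so density is 0.375
density = keyword_density("Cheap flights! Book cheap flights to cheap destinations.", "cheap")
```

Under a crude metric like this, repeating a keyword mechanically drives the score up, which is exactly why it invited abuse.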
Then the craftiest webmasters began "gaming" the search rankings through what's known as "keyword stuffing": a site would place countless repetitions of the same exact keyword near the top or bottom of its webpage.
The text would usually be the same color as the background, hiding it from human eyes; the reader would never see it, while the search engine "bots" could still read it.
Then There Was Link Building
Eventually, the algorithms of Google in particular were tweaked from keyword content to link building. A page's ranking was determined by how many links pointed to the site, as a measure of its significance.
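A crude sketch of this idea (not Google's actual algorithm, which also weighted links by the authority of the linking page) is simply counting inbound links across a link graph. The domain names below are hypothetical:

```python
from collections import Counter

# Hypothetical link graph: each pair is (linking_page, linked_to_page).
links = [
    ("blog.example", "shop.example"),
    ("news.example", "shop.example"),
    ("blog.example", "news.example"),
]

# Count how many links point at each page, then rank by that count.
inbound = Counter(target for _, target in links)
ranking = sorted(inbound, key=inbound.get, reverse=True)
# shop.example has two inbound links and news.example has one,
# so shop.example ranks first under this naive scheme.
```

Because a raw link count is so easy to inflate, it was only a matter of time before webmasters started manufacturing links, which is exactly what happened next.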
Initially, webmasters would simply maintain a "links" page on their site for trading or selling reciprocal links: "I'll link to you if you link back to me."
Google eventually flushed this out as well, and the practice became more complex as various services offered three-way reciprocal linking, a method that was more sophisticated and harder for Google to detect, temporarily anyway.
The search engines continued to tweak their algorithms, refining the quality of the links they expected, while the various SEO tacticians once again attempted to game Google by trying to trick it.
The Attempts To Outsmart The Search Engines
The tactics and attempts by savvy SEO experts to "fool" Google continued to evolve as they tried to stay ahead of the curve. The search engines, however, became efficient at debunking these efforts.
Google in particular continued to fight the battle by releasing update after update, such as "Vince," for instance, which was a major shift, as larger, well-known brand sites began to be treated differently in its organic search results.
The Evolution Of Better Search Results
The search engines then began to factor in user behavior as a better indicator of a website's quality. The key signal was how long a visitor would stay on a particular page before returning to the search results.
Then, in 2010, Google began to consider social signals, looking at how frequently a site was mentioned across social media. These new criteria set a platform for increased scrutiny of sites, based on offline reputation and what readers thought of them.
Collectively, these efforts favored overall brand reputation and user preference, moving away from the tactical methods once used by SEO experts and webmasters.
Welcome To The Zoo We Have Cookies
Google then became even more vigilant, rolling out its most aggressive algorithm update, which allowed its "bots" to detect quality signals beyond just content and links as it raised the quality bar.
When Google's initial Panda update was activated, it made significant sweeping changes to the organic search results, wiping out over 12 percent of sites from its index based solely on what Google judged to be weak content.
Google continued to bring down the hammer as numerous other algorithm releases followed, such as the Penguin update, which began discounting once-evergreen inbound linking structures.
As a result, countless online businesses and authority sites that had been built up successfully over the years suddenly lost 50 percent or more of their traffic, literally overnight.
These sites believed they were complying with the rules, but most of them ignored one significant point: their entire site structure was designed to rank well on Google alone, and they didn't bother with anywhere else on the web.
So What To Do Now
Because of these sweeping changes, SEO experts are beginning to make a fundamental shift. Until now, everything involving SEO was reactive in nature: anticipating what the search engines wanted, then tailoring and adapting sites to it.
But most now believe this won't work for much longer. Online businesses will need to begin thinking beyond the search engine rankings, as traditional SEO no longer appears to work that well.