By advertisers we’re referring to re-sellers of some outside product, subscription, or service. These are typically affiliate groups who earn a sizable bounty (ranging from $15 to $100) when you sign up for something as harmless as a free trial. So why do these affiliate sites so thoroughly outrank pages that, you know, actually answer the question? To answer that properly, let’s take a quick trip in Rob’s magical time machine, back to a time long, long ago: the mid-90s.
The Birth of Link-Based Algorithms
In 1993, the Mosaic browser was released and the world was given access to a graphical web. It wasn’t fast, and it wasn’t especially pretty, but it was visual – and that marked the start of a public internet. Sites were originally categorized into independently owned directories, and those directories were the primary way typical users navigated the chaos that was the web. Then innovative search engines took the field, with key players at the time including Yahoo!, Lycos, and AltaVista (feel free to give those names a moment of silence). These original search engines crawled actual web pages and indexed the content found inside. They read the text on a page and matched it against your query. The theory was simple: if a site uses the words you’re searching for, it’s probably about the topic you’re searching for.
Quickly, however, the search engine result pages descended into chaos. The cause was the first wave of search engine optimization: practitioners recognized how sites were being ranked and tried to hit exactly the right number of keyword repetitions (that’s where the “keyword saturation” idea, and the obsession some SEOs still have with it, comes from).
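To see why that gaming worked so well, here is a minimal sketch of a first-generation, keyword-frequency ranker. The scoring rule and the two sample pages are invented for illustration (no real engine was quite this simple), but the sketch shows how a keyword-stuffed page naturally floats to the top:

```python
# A toy illustration of first-generation keyword search: score each page
# by how often the query terms appear in its text. The page names and
# contents below are hypothetical examples, not real sites.

def keyword_score(query, page_text):
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

pages = {
    "honest-guide.example": "a short guide to choosing web hosting",
    "stuffed-spam.example": "web hosting web hosting cheap web hosting "
                            "best web hosting web hosting deals",
}

query = "web hosting"
# The stuffed page wins easily: repetition is all this ranker rewards.
for url, text in sorted(pages.items(), key=lambda kv: -keyword_score(query, kv[1])):
    print(keyword_score(query, text), url)
```

Under a rule like this, repeating the query terms is a direct, mechanical path to the #1 spot – which is exactly what early SEOs did.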
The gaming got bad fast. Beyond simply optimizing their sites for the given keywords, webmasters and first-generation SEOs were creating duplicate pages that had the exact same content but lived at different addresses. An entire page of the search results could come from the exact same publisher.
That’s where Google came in. Larry Page and Sergey Brin were certain they could build a better search engine. This time the ranking wouldn’t hinge on the keywords found within a site (although those would, and still do, play a role). Rather, they were hunting for a method by which a site’s popularity could be tracked automatically. What they arrived at was the idea of link popularity, which in the early phases was largely based on the site’s PageRank.
PageRank is, in the broadest (and admittedly least thorough) summary possible, a way of measuring how many links point to a site. Links carry more weight if they originate from sites that have a high PageRank themselves. So, theoretically, as people went around sharing a link – be it on their own site, on a blog, or on any other medium that could be indexed – those links would serve as evidence of the quality of the site being linked to.
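For the curious, here is a minimal sketch of the classic PageRank computation: each page repeatedly passes its score along its outbound links, with a damping factor modeling a surfer who occasionally jumps to a random page. The three-page graph is hypothetical, purely for illustration; real-world implementations add many refinements.

```python
# A minimal sketch of the PageRank idea: repeatedly distribute each page's
# score to the pages it links to. The 0.85 damping factor is the commonly
# cited default; the tiny link graph below is invented for illustration.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with equal scores

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its score everywhere
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: both A and B link to C, so C ends up with
# the highest score - the "links as votes" intuition described above.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Notice that C outranks A and B without anyone counting keywords at all: its score comes entirely from who links to it.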
Immediately, Google’s algorithm gave results that were substantially better than those on Yahoo! and the other major search sites (whose result pages were, by then, riddled with spam). However, it didn’t take shady SEOs long to catch up.
The Next Generation of Spam
Search engine optimization is, overall, a good thing. It’s a channel through which webmasters can communicate with the search engines, telling them what their site is about, what they want to be discovered for, and so on. The problem is that certain optimizers use tactics that sabotage the aim of the search engines (which is surfacing high-quality content for searchers). In the Google era, this spamming has taken the form of link generation. More inbound links bolster your search engine ranking, so it’s not hard to see the way to increase rank: get more links! Rather than earning those links through word-of-mouth marketing, guest content, and the like, however, most spam groups these days rely on networks of sites that sell links, self-publishing article sites, and other mediums that let you push links by yourself.
The Capital Power Conundrum
So the quick answer to why Google serves up spam results is that, since it relies on an automated system, people are always trying to game the SERPs. A fully manual review system is impossible to afford, and a fully automated one is impossible to spam-proof. But there’s one other question that deserves attention: why is it mainly spam sites doing this? Well, obviously, it’s not just spam sites, but the sites that are re-selling for someone have a huge advantage: an advertising and SEO budget. Since they can make serious money with every signup on the site, they generate hundreds, thousands, or tens of thousands of dollars each month that can be turned inward for buying links, article submission packages, and so forth. As the profit (i.e., capital) is re-invested, they become more profitable and more powerful.
What about the sites you actually want to see? What you’re probably looking for is an informational site that provides direct, honest, and unbiased information. In other words, you’re looking for a group that will charitably give you information with little hope of a large return. While you may be fine with an ad or two on these sites, the revenue per visitor, and per ad you click, is substantially lower. So the honest, informational sites have less incentive to hire SEOs, and fewer starting and ongoing resources to invest in optimization.
Why Social Web, Curation, Etc., Won’t Stop Gaming
There have been numerous proposals for how this situation could be improved or fixed. Social sharing as a ranking factor (including Twitter shares, Facebook likes, and Google +1s) sounds great. Human curation and contributions (such as we see on Blekko) are fantastic conceptually. Here’s the problem: the moment one of these factors becomes significant enough, it absolutely, definitely will be gamed. The social web is on the rise, and once it hits the mainstream, you can bet there will be groups who create hundreds – even thousands – of social accounts for the simple purpose of selling you their “likes.” In the same way, if Blekko became popular, SEO groups would hire third parties or build networks to game the human contributions.
For every step the search sites take toward “fixing” the problem and establishing countermeasures, the SEOs come up with one more way around the countermeasure. The only long-term fixes would involve either an idealistic (probably impossible) way to detect those who are gaming the search sites or full, single-group human curation (which is far too expensive to pull off). Sadly, there isn’t a quick fix. The search sites will continue to build well-rounded algorithms that are more likely to surface good results, but it’s impossible to “kill spam” completely – at least in today’s world.
The good news is that webmasters who don’t invest in gaming will still see the best long-term results. Focusing on quality, basic promotion through guest blogging or social sites, and honestly providing value will get organic results over time – and won’t be tossed to the sidelines with algorithm updates.
And, on the whole, things are improving. The number of spam sites filtered out of the top results has multiplied like bunnies, and upcoming ideas such as social feedback, site blocking, and other user contributions will have a valuable role to play. The mistake is thinking that these new ideas are solutions, when in fact they’re just one more territory where the search sites will wage war with the black-hat SEOs. But each of these territories gives Google, Bing, and the other search sites an additional advantage – and the overall picture, believe it or not, is getting gradually less spammy.
Source: http://www.searchenginejournal.com/