Those who have been around a while will tell you: if you don't like the way things are on the Internet, just wait a few days… it'll change. Actually, it changes more often than that. Google makes upwards of 500 changes per year to its algorithm alone. Services come and go, as do the companies that would make use of them; what works best for rankings changes periodically; what's acceptable and unacceptable by the search engines' standards varies; and the shiny object that seems to grab nearly everyone's attention can change as quickly as the tide.
With all that said, though, the changes that take place in terms of what does and doesn't work in optimization are typically not major. More often than not, they simply involve a bit more emphasis here, a bit less there.
Major algorithm updates from Google are usually only seen once or twice a year, if that. The recent Panda update could certainly be called a major update, although in reality, it just started doing what Google had been talking about for some time. From my point of view, the cascading changes that seem to be tied to the launch of Google+ are greater changes than were seen with Panda.
For instance, the removal of citations and third-party reviews from Places will hopefully help mitigate some of the abuse that's being seen there. Under the surface, of course, it acts as just another magnet to draw more people in to use Google. I suspect that's the main goal of the transformation we're experiencing.
Regardless of the magnitude of the change, however, there often (usually?) comes an accompanying wave of misinformation, and care must be taken not to jump at every new shiny penny that is waved in front of you. Just because someone says it on the internet doesn't necessarily mean it's true.
We put this question to our contributors:
2. If you could mandate just one change to the dynamics of search ranking, what would that change be?
Mandate? Yeah. That SEOs learn that User Experience is the foundation on which SEO goes from okay to great. From myopic to sustainable.
Removing the maps from desktop searches. It's an incredibly useful feature for mobile searches, but it's amazingly confusing while you're sitting at your desk.
I would like to back some of the social influence out of the result pages for a while. I like that it is there and I think it adds significant authority to some results to see influential connections listed there, but too many people are still doing social wrong for it to be universally appropriate.
Less emphasis on external linking and anchor text. As we have seen over the years, it is a highly manipulated metric upon which to base trust, relevance, and authority – and thus search ranking.
I think links as a core relevance metric are rapidly losing their value. Search needs to take it to the next level with accurate sentiment analysis, and reliable authenticity metrics need to be invented – if they don't exist already.
That link spam didn't work and was not part of the mix in the SERPs.
I'd like to see quality content rewarded even more. I feel like sites tend to earn a reputation that lets any of their content rank well, even if it's not the best.
Mandate one change? For me, that would be the ability to switch between personalized search and a clean SERP, more from a search user's perspective. I love how Google is integrating XFN and FOAF into our SERPs, but sometimes you just want a clean result, signed in or out.
To stop having changes in the Google algorithm – LOL! Since I know that the changes will always happen no matter what, I just have to eat humble pie and deal with it 😉 In other words, I just have to 'future-proof' my backlinks.
Relevancy rather than backlinks.
"Thin" and stupid content would automatically be detected and kicked out of the index. And the site owner would instantly receive an email that said, "What are you thinking? This content is crap!" I know Panda did this to a certain extent – but you still see stupid content with top rankings.
Without a doubt, it would be lowering the importance of anchor text within backlinks.
Remove all links that aren't in the flow of page content from the ranking equation. Only award authority for in-article or in-text links. That would end practices like comment spam, forum spam, and sitewide footer links. Reputable SEO firms still use these techniques because they still work, and because they can't beat the folks who are using them.
Remove those types of links and link spam loses a lot of its value.
Get on that, Google, will ya?
From a practitioner's perspective, I'd like to close that irritating loophole that allows sites to rank for scraped content. From a user's perspective, I'd like to see the age of a specific page taken into account in the SERPs. Pulling up single information pages, reading really good information, and finding out that it's from three years ago and no longer relevant (or viable) is highly irritating.
While I'm wishing, can we get eHow taken off the map, and can Google just hand me a list of their algorithms? I'll take those with steak, please.
Other than the obvious choice of making all my sites rank #1, I am not sure I would. 😛 Honestly, despite the amount of bickering that SEOs do in regard to rankings, I think the engines, especially Google, have taken Information Retrieval to a new level, and I am constantly amazed at their abilities.
I'd like to see sites that offer a good user experience rewarded. If a website is able to take a user from the search results to what they need in two clicks or fewer, that site should be rewarded for it. Sites that offer difficult navigation, or don't present their information in a logical manner, should be penalized for it.
I would be inclined to push for search engines to drop the preference they currently hold for exact-match domain names. I think it's an outdated method of deciding the importance of a website, and unfair to the smaller operators out there who haven't got the liquidity or budgets to run off and purchase single, exact-match words for the extortionate amounts they can be bought and sold for across nearly every vertical.
The ownership of an exact-match domain has no real-world relevance to the quality of the underlying site, either. I have seen occasions when the most ludicrous sites obtain highly targeted traffic incredibly quickly, with little effort, above sites that have spent ten times as much in expenditure and effort to produce quality over a much longer period of time.
If I could change just one thing in determining how relevant a page is to a search, I would decrease the value of an exact domain match. The fact that someone was savvy enough in the '90s to snatch a domain with valuable keywords in it does not mean they are determined to provide the content most relevant to those keywords.
I don't think a single change to any of the many dynamics that influence search ranking would make a difference, because I don't believe there is a single variable that both negatively impacts results overall and is weighted heavily enough to effect positive change in SERP relevancy if it were removed or altered.
I would let Google have what it wants – the ability to return quality SERPs.
Google should do more to fight the influence and power of obviously manipulative and spammy links. There are still far too many sites/pages ranking because of spam-based link building, and until this ends, too many marketers will be tempted to take the easy path. I wrote about this in depth here: http://www.seomoz.org/blog/im-getting-more-worried-about-the-effectiveness-of-webspam
I may hear some gasps with this answer, but I would place much less emphasis on the role of links in SEO. The level of importance link building has risen to is not only part of what attracts so many fraudsters to the industry, but has, over the years, played a large role in turning the internet into a giant spam-filled cesspool.
A change needs to be made that cannot be easily manipulated and automated through the use of robots. Unfortunately, hindsight is 20/20, and I doubt that Larry Page could have foreseen the issues this would cause when PageRank was first conceived.
On the bright side, search engines – most notably Google – are moving toward a more social-centric method of ranking pages. I can see some areas where this can be just as easily manipulated as links are, so it is a wait-and-see game at this time as to how well that will help to clean up much of the spam we see today.
I’m not one that lives and dies by search rankings so I have no mandate on what I would change or should be changed. I try to use search engine traffic as an accent or supporting role to the overall traffic for the web sites that we work on. This gives us the freedom not be tied to the demands of the search engines while at the same time giving us the opportunity to chase better converting traffic than you can get from natural search results.
Some way to truly give those who work hard to build a legitimate, informative site (or blog) their proper spot in the search results, instead of getting knocked out of the top by those who do not deserve to be there. There must be a way to give the little guys a fair shake at being on the front page of the results, instead of some big-name brand ranking just because of who they are. Some of the branded sites are the most useless sites around, and it just does not seem fair when there are so many small guys working their hearts out (legitimately) to get on the front page.
As both a user and a marketer, I'd like to see a stronger focus on the freshness of content. Over and over again, I run into search results with 4-5-year-old information that isn't even relevant anymore to what I'm searching for. The search engines need to start considering how recent certain results are for various queries. A good example: you're having trouble with your latest Apple/Windows update, you search on it, and the results that show up are from 2-3 versions ago.
I honestly don't know, but I think their QDF algorithm could use some work, at least for non-English languages. Sometimes really random, old pages show up for a day or two, just to disappear again. I wouldn't mind that, but they are often irrelevant to the search term.
Wouldn't we all want our sites to always appear at the top, and the emphasis to switch to ranking our great content over that useless drivel our competition writes? Unfortunately, search rankings are always going to be gamed. As the search engines make more use of block segmentation, we should hopefully see a slight bias toward content linked to within articles, not just in the footer of the article, bios, and similar. As emphasis and content are analysed – not just words – and with additional technologies, content hopefully will return to being king and I will not have to worry about this SEO malarkey.
As a searcher, I'd like the ability to choose the algorithm that I use to search with. Let me decide whether I want the help of a reference librarian to power my results, or the village expert on a specific subject, or a buying agent who has my best interests in mind, or the most knowledgeable tour guide.
A reference librarian mode might focus on finding useful informational results, geared towards my knowledge level, giving me access to both mainstream sources and more narrow niche results.
The village expert mode might help me find answers from my friends, or people who have shown some level of expertise on a topic, whether those answers are coming in response to my questions in real time, or giving me access to answers for the same question in the past.
A buying agent mode might bring me consumer-report-quality reviews of products and services, and reviews from my peers and others who might be in a similar situation to mine. I'd have access to product searches and reviews, to manufacturers' websites, to comparison shopping and booking, and to relevant news. I would be able to make informed decisions with the help of the buying agent mode.
A tour guide mode would give me access to events and businesses and services in an area of my choosing. It would enable me to find out about the history of somewhere I've travelled to, places to stay, things to see.
I could see other modes of searching that might be useful, each defined by a specific persona I would like help from. If I'm a Java programmer visiting Java searching for a cup of java, I'd prefer to provide a little information up front to a search engine about the intent of my search rather than have it try to guess.
You can begin with Chapter One of Critical Thinking for the Discerning SEO here.
1 thought on “Ch. 3 – Managing Change – Critical Thinking”
The one thing I’d change in a search algorithm is to offer the logged-in user an Advanced option to fine-tune various factors and save those settings. This would give the search engineers an additional insight. I am thinking that a person’s age or other circumstances might require a different set of results for the same queries.