An important aspect of Search Engine Optimization is making your website easy for both users and search engine robots to understand. Although search engines have become increasingly sophisticated, in many ways they still can’t see and understand a web page the same way a human does. SEO helps the engines figure out what each page is about, and how it may be useful for users.
A Common Argument Against SEO
We frequently hear statements like this:
“No smart engineer would ever build a search engine that requires websites to follow certain rules or principles in order to be ranked or indexed. Anyone with half a brain would want a system that can crawl through any architecture, parse any amount of complex or imperfect code and still find a way to return the best and most relevant results, not the ones that have been ‘optimized’ by unlicensed search marketing experts.”
Imagine you posted a picture of your family dog online. A human might describe it as “a black, medium-sized dog that looks like a Lab, playing fetch in the park.” The best search engine in the world, by contrast, would struggle to understand the photo at anywhere near that level of sophistication. How do you make a search engine understand a photograph? Fortunately, SEO allows webmasters to provide “clues” that the engines can use to understand content. In fact, adding proper structure to your content is essential to SEO.
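For the dog-photo example, those “clues” usually take the form of markup around the image. Here is a minimal sketch (the filename, alt text, and caption are invented for illustration):

```html
<!-- A descriptive filename, alt text, and caption give the engine
     textual clues about an image it cannot "see" -->
<figure>
  <img src="black-lab-playing-fetch.jpg"
       alt="Black medium-sized Labrador playing fetch in the park">
  <figcaption>Our Lab chasing a ball at the dog park.</figcaption>
</figure>
```

Each of these fields is plain text the spider can index, so the photo can be matched to searches like “black lab playing fetch” even though the engine never interprets the pixels themselves.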
Understanding both the abilities and limitations of search engines allows you to properly build, format and annotate your web content in a way that search spiders can digest. Without SEO, many websites remain invisible to search engines.
The Limits of Search Engine Technology
The major search engines all operate on the same principles, as explained in Chapter 1. Automated search bots crawl the web, follow links, and index content in massive databases. They accomplish this at a scale and speed that is nothing short of amazing. That said, modern search technology is not all-powerful. Technical limitations of all kinds can cause serious problems with both inclusion and rankings. We’ve listed the most common below:
1. Spidering and Indexing Problems
- Search engines aren’t good at completing online forms (such as a login), and thus any content contained behind them may remain hidden.
- Websites using a CMS (Content Management System) often create duplicate versions of the same page – a major problem for search engines looking for completely original content.
- Errors in a website’s crawling directives (robots.txt) may lead to blocking search engines entirely.
- Poor link structures lead to search engines failing to reach all of a website’s content. In other cases, poor link structures allow search engines to spider content, but leave it so minimally exposed that it’s deemed “unimportant” by the engine’s index.
- Interpreting non-text content: although the engines are getting better at reading non-HTML text, content in rich media formats is traditionally difficult for search engines to parse. This includes text in Flash files, images, photos, video, audio, and plug-in content.
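The robots.txt pitfall above is easy to demonstrate. The sketch below uses Python’s standard-library `urllib.robotparser`; the rules and URLs are made up for illustration, but the failure mode is real – a single stray `/` turns a narrow exclusion into a site-wide block:

```python
from urllib.robotparser import RobotFileParser

# Broken: "Disallow: /" tells every crawler to skip the ENTIRE site.
broken = ["User-agent: *", "Disallow: /"]

# Intended: only the /private/ section should be off-limits.
intended = ["User-agent: *", "Disallow: /private/"]

rp_broken = RobotFileParser()
rp_broken.parse(broken)

rp_intended = RobotFileParser()
rp_intended.parse(intended)

# With the broken file, even ordinary pages are blocked:
rp_broken.can_fetch("*", "https://example.com/products/")    # False

# With the intended file, crawling works as expected:
rp_intended.can_fetch("*", "https://example.com/products/")  # True
rp_intended.can_fetch("*", "https://example.com/private/x")  # False
```

Checking your live robots.txt against a parser like this is a quick way to catch directives that accidentally hide the whole site.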
2. Content to Query Matching
- Text that is not written in the common terms people use to search. For example, writing about “food cooling units” when people actually search for “refrigerators.”
- Language and internationalization subtleties. For example, “color” vs. “colour.” When in doubt, check what people are searching for and use exact matches in your content.
- Location targeting, such as publishing content in Polish when the majority of the people who would visit your website are from Japan.
- Mixed contextual signals. For example, the title of your blog post is “Mexico’s Best Coffee” but the post itself is about a vacation resort in Canada that happens to serve great coffee. These mixed messages send confusing signals to search engines.
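Real engines match queries to content with far more sophistication, but the core problem in the first bullet can be shown with a toy overlap score (the function and sample text below are illustrative, not how any engine actually works):

```python
def term_overlap(query, page_text):
    """Fraction of query terms that literally appear in the page text --
    a toy stand-in for query-to-content matching."""
    query_terms = set(query.lower().split())
    page_words = set(page_text.lower().split())
    return len(query_terms & page_words) / len(query_terms)

# A page written in industry jargon never mentions the word
# searchers actually type, so it scores zero for that query.
page = "our food cooling units keep produce fresh"

term_overlap("refrigerators", page)  # 0.0 -- no match at all
term_overlap("food cooling", page)   # 1.0 -- every term matches
```

The lesson is the same one the bullet makes: if your page never contains the words people search for, even a very literal matching system has nothing to connect.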
3. The “Tree Falls in a Forest”
SEO isn’t just about getting the technical details of search-engine friendly web development correct. It’s also about marketing. This is perhaps the most important concept to grasp about the functionality of search engines. You can build a perfect website, but its content can remain invisible to search engines unless you promote it. This is due to the nature of search technology, which relies on the metrics of relevance and importance to display results.
The “tree falls in a forest” adage postulates that if no one is around to hear the sound, it may not exist at all – and this translates perfectly to search engines and web content. Put another way – if no one links to your content, the search engines may choose to ignore it.
The engines by themselves have no inherent gauge of quality and no way of their own to discover fantastic pieces of content on the web. Only humans have this power – to discover, react to, comment on, and link to content. Thus, great content cannot simply be created – it must be shared and talked about. Search engines already do a great job of promoting high-quality content on websites that have become popular, but they cannot generate this popularity – that is a task that demands talented Internet marketers.
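The actual importance metrics the engines use are proprietary, but the idea that links confer importance can be sketched with a toy PageRank-style iteration. The graph, damping factor, and iteration count below are all illustrative assumptions, not any engine’s real algorithm:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank: rank flows along links, so pages with more
    (and better-ranked) inbound links accumulate more importance.

    links: dict mapping each page to the list of pages it links to.
    """
    nodes = list(links)
    n = len(nodes)
    rank = {p: 1.0 / n for p in nodes}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in nodes}
        for page, outs in links.items():
            if not outs:
                # Dangling page: spread its rank evenly (a common convention).
                for p in nodes:
                    new[p] += damping * rank[page] / n
            else:
                for target in outs:
                    new[target] += damping * rank[page] / len(outs)
        rank = new
    return rank

# "c" is linked to by both "a" and "b"; "b" has no inbound links.
ranks = pagerank({"a": ["c"], "b": ["c"], "c": ["a"]})
# ranks["c"] ends up highest, ranks["b"] lowest.
```

The unlinked page `b` ends up with the minimum possible score – which is exactly the “tree falls in a forest” point: without links, even good content registers as unimportant.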