First, a little lingo lesson. A “bot” is a computer program that, once put into action, runs automatically. The internet is often referred to as “the web” because it functions much like a spider’s web, with many different ways to connect between two specific points.
Google uses a bot (aptly named “Googlebot”) to crawl the web. This type of bot, also used by other search engines, is often referred to as a spider. Got it?
Googlebot crawls the web looking for things that have changed. It wants to find new pages, new links, deleted content, and dead links. Links in and out of web pages are particularly important to search engines, so dead links are a sign that something is wrong and will be a problem for users – and that information ultimately weighs negatively when search results are compiled. Maintaining logical links (like those within a site) as well as high-quality incoming links (like those from reputable directories) is a key process in SEO.
Since search engines are text-based, Googlebot and other spiders can only read text-based content. That means they do not know what images represent unless those images are appropriately described in what is called “alt text.” This also makes it particularly important to use a text-based navigation system rather than something fancier, however visually appealing.
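For example, here is a minimal sketch of what descriptive alt text and plain text navigation look like in a page's HTML (the filenames and link targets are illustrative, not from any real site):

```html
<!-- Descriptive alt text tells spiders (and screen readers) what the image shows -->
<img src="blue-widget.jpg" alt="Blue widget with a chrome handle">

<!-- Plain text links are readable by spiders; image- or script-only menus are not -->
<nav>
  <a href="/products">Products</a>
  <a href="/about">About Us</a>
</nav>
```

An image with an empty or missing alt attribute contributes nothing a search engine can index, no matter how relevant the picture is to your visitors.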
Spider accessibility is also determined by how quickly the hosting server delivers the site when it is requested. Googlebot won’t hang around waiting (and neither will your potential customers).
Keeping your website clean, readable, and readily available to those spiders is essential to getting your site’s content indexed and presented in results.