While most businesses that coordinate marketing dollars for online advertising overwhelmingly choose to do so on Google's AdWords/Analytics platform, other judicious marketers also divvy some of their funds to Bing Ads/Search.
Microsoft senior program manager Frederic Dubut took to the stage at SMX East to explain how Bing crawls, indexes and renders websites to produce its search results. Dubut's presentation was particularly interesting to SEO specialists who want to expand their reach beyond Google's platform but need to know exactly how their sites fare in Bing Search.
Starting with a baseline understanding of what a crawler is in digital marketing terms (an algorithm designed to fetch online HTML for indexing), Dubut explained why a 'crawl manager' is used on the Bing platform.
Like most good supervisors, the crawl manager's job is to listen to signals and set a budget: it estimates how much it can crawl a site without hurting the site's performance (informally called the "crawl budget"). When the crawl manager senses that it's crawling too much, it backs off; as long as it hasn't identified a point of diminishing returns, it keeps increasing the crawl rate.
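Dubut didn't share implementation details, but the behavior he describes maps loosely onto a classic additive-increase/multiplicative-decrease control loop. Here is a minimal Python sketch of that idea; the class, thresholds and step sizes are all hypothetical illustrations, not Bing's actual code:

```python
# Hypothetical sketch of a crawl-budget controller in the spirit of
# Dubut's description: ramp up gently while the site stays healthy,
# back off sharply at the first sign of strain. Not Bing's code.

class CrawlBudgetController:
    def __init__(self, initial_rate=1.0, max_rate=100.0):
        self.rate = initial_rate   # requests per second for this host
        self.max_rate = max_rate

    def update(self, avg_response_ms, error_ratio):
        """Adjust the crawl rate from one monitoring window of signals."""
        site_is_struggling = avg_response_ms > 2000 or error_ratio > 0.05
        if site_is_struggling:
            # Crawling too much hurts the site: back off quickly.
            self.rate = max(self.rate * 0.5, 0.1)
        else:
            # No diminishing returns detected yet: keep increasing.
            self.rate = min(self.rate + 0.5, self.max_rate)
        return self.rate

controller = CrawlBudgetController()
print(controller.update(avg_response_ms=350, error_ratio=0.0))    # ramps up
print(controller.update(avg_response_ms=4200, error_ratio=0.12))  # backs off
```

The design intuition is the one Dubut outlines: increase crawling while the signals look healthy, and cut the rate hard as soon as the site shows signs of hurting.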
In particular, the crawl manager uses subdomains, domains, servers and IP addresses as its identifying signals, and when those identifiers are optimized, organic search results deliver higher value.
What does all of this mumbo-jumbo mean for marketers?
The Bing search platform relies heavily on the Bing crawler, so it's important that SEO specialists make use of the Crawl Control feature in Bing Webmaster Tools to get the best return on their investment in the platform.
Here are a handful of fundamentals to put in place for a great 'crawl' effect:
Free up server resources by:
- Rejecting malicious actors through server-side security work.
Find ways to reduce crawl demand:
- Remove duplicate content or leverage canonical tags
- Consolidate redirects
- Use an XML sitemap and include "lastmod" (see the sketch after this list)
- Remove unnecessary URL parameters
- Remove any junk URLs or unused resources
- Consider performance optimization for heavy, slow pages
- If leveraging a separate mobile configuration, consider switching to responsive web design
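To make the sitemap item concrete, here is a small sketch that generates a sitemap with per-URL "lastmod" dates using only Python's standard library; the URLs and dates are placeholders:

```python
# Minimal sketch: generate an XML sitemap with "lastmod" dates so the
# crawler can skip pages that haven't changed. URLs/dates are made up.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [
    ("https://www.example.com/", "2018-11-01"),
    ("https://www.example.com/pricing", "2018-10-15"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod  # W3C date format

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```

An accurate "lastmod" lets the crawler skip unchanged pages, which is exactly the kind of wasted crawl the budget tips above are meant to avoid.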
Since each bottleneck has its own crawl budget, monitor the budget for each property, domain and IP (or IP range).
During major URL shifts, allow roughly two weeks for Bing to recrawl everything (a URL shift temporarily doubles crawl demand, since both the old and the new URLs have to be crawled).
While SEO marketers are plugging away at these refinements to their domains, Dubut says the Bing team is busy queuing up improvements to the platform's crawl efficiency. Bing measures efficiency as the number of useful crawls (fetches that find new pages, new links or updated content) versus the total number of crawls for the site.
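In other words, crawl efficiency is a simple ratio. A toy calculation (the function and the counts below are invented for illustration, not Bing's) might look like this:

```python
# Toy calculation of the crawl-efficiency metric Dubut describes:
# useful crawls (new pages + new links + updated content) over the
# total number of crawls. All numbers here are invented.
def crawl_efficiency(new_pages, new_links, updated_pages, total_crawls):
    useful = new_pages + new_links + updated_pages
    return useful / total_crawls

# e.g. 1,200 useful fetches out of 10,000 total -> 12% efficiency
print(f"{crawl_efficiency(500, 300, 400, 10_000):.0%}")
```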
For more details about the Bing crawler and its plans for improvement, visit the company's new blog series here.