Microsoft’s search engine, Bing, has announced plans to change the way search engines discover new and updated content.
At the moment, search engines generally find new and updated content by crawling pages. When a search engine’s bot lands on a page, it collects the links on that page and follows them, repeating the process on each page it reaches. This gradually builds a fairly robust index of a website.
But Bing wants to change this, or at least hopes to.
Instead, Bing is encouraging publishers to submit their new and updated content through its URL submission tool (via SEL). The tool lets a publisher ask Bing directly to crawl a specific URL, rather than relying on Bing to discover the page through links from other pages.
Bing hopes that webmasters, publishers, SEOs and content management systems, such as WordPress, will use its tool or API to submit new URLs. This would reduce Bing’s reliance on crawling and, in turn, the resources crawling consumes.
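As a rough illustration of how a programmatic submission might look, the sketch below prepares a single-URL request for Bing’s Submit URL API. The endpoint path reflects Bing’s documented Webmaster API at the time of writing, but the API key and URLs are placeholders; check the Bing Webmaster Tools documentation for current details before relying on this.

```python
import json

# Assumed endpoint for Bing's single-URL submission API; verify against
# current Bing Webmaster Tools documentation before use.
SUBMIT_ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl"

def build_submit_request(api_key, site_url, page_url):
    """Build the endpoint URL (with API key) and JSON body for one submission.

    All three arguments are placeholders supplied by the caller; the key
    comes from a verified site in Bing Webmaster Tools.
    """
    endpoint = f"{SUBMIT_ENDPOINT}?apikey={api_key}"
    body = {"siteUrl": site_url, "url": page_url}
    return endpoint, body

endpoint, body = build_submit_request(
    "YOUR_API_KEY",                      # placeholder key
    "https://example.com",               # the verified site
    "https://example.com/new-article",   # the new or updated page
)
print(json.dumps(body))
# The request itself would then be sent with an HTTP POST, e.g.:
#   requests.post(endpoint, json=body)
```

Submissions count against the per-site quota mentioned below, so a CMS integration would typically batch or rate-limit calls rather than submit on every minor edit.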
The URL submission tool will employ a quota-based system to prevent spam. Having a verified site in Bing Webmaster Tools that has accrued some age will help raise the quotas for that particular website.
For now, not much will change. Publishers are encouraged to use the URL submission tool, but Bing will continue to crawl the web as normal for the next few years.
The Bing Webmaster Tools team has published a guide on how to submit a URL to Bing using the tool.