Robots Are Our Friends

For a variety of reasons, cultural heritage organizations often have robots.txt files that restrict what web crawlers (also known as robots) can see on a website. This is a bad thing: it means the content that libraries, archives, and museums put online becomes virtually invisible to search engines like Google, Bing, and Yahoo; is less likely to be shared on social media sites like Facebook, Twitter, and Pinterest; and stands less of a chance of being incorporated into datasets such as Wikipedia. The Robots Are Our Friends campaign aims to promote an understanding of the role that robots.txt plays in determining the footprint our cultural heritage collections have on the Web.
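The difference between an invisible collection and a crawlable one can come down to a single line of robots.txt. A minimal sketch of the two cases (the paths and sitemap URL here are hypothetical):

<pre>
# Overly restrictive robots.txt: bars every crawler from the entire site
User-agent: *
Disallow: /
</pre>

<pre>
# Friendlier robots.txt: blocks only staff-only paths (hypothetical here)
# and points crawlers at a sitemap for the collections
User-agent: *
Disallow: /admin/
Disallow: /search
Sitemap: http://example.org/sitemap.xml
</pre>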

== Background ==

== Typical Reasons for not allowing Robots ==

== Throttling ==

== Sitemaps ==

== HTML5 Microdata ==

== Services ==

* [http://jronallo.github.com/blog/dpla-strawman-technical-proposal/ Collection Achievements and Profiles System and DPLA Crawler Services]
* [https://www.google.com/webmasters/tools Google Webmaster Tools]
* Python's [http://docs.python.org/2/library/robotparser.html robotparser] (see the sketch below)
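A minimal sketch of checking a site's robots.txt with Python 2's robotparser module, as linked above (in Python 3 the same class lives in urllib.robotparser; the host and item path below are hypothetical):

<pre>
import robotparser

# Fetch and parse the site's robots.txt (URL is a placeholder)
rp = robotparser.RobotFileParser()
rp.set_url("http://example.org/robots.txt")
rp.read()

# True if robots.txt allows any crawler ("*") to fetch this URL
print rp.can_fetch("*", "http://example.org/collections/item/1")
</pre>

Run against a site whose robots.txt disallows everything, can_fetch() returns False for every URL, which is exactly the invisibility this page describes.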

== Form Letter ==