If you ever need a list of web user agents of all kinds (spiders, robots, crawlers, browsers), you can check the list here. It is split into three large pages with descriptions and links to the vendors’ sites.
This might be useful when writing a robots.txt file, or when serving different content depending on the web client.
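As a rough illustration of the second use case, here is a minimal sketch of detecting crawlers by substring-matching the User-Agent header; the bot names below are just a tiny hand-picked sample, nowhere near a complete list like the one linked above:

```python
def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a known crawler.

    A hypothetical, minimal check: real user-agent lists contain
    hundreds of entries, so this is only a sketch of the idea.
    """
    known_bots = ("googlebot", "bingbot", "yandexbot", "baiduspider")
    ua = user_agent.lower()
    return any(bot in ua for bot in known_bots)


# Example: a crawler UA matches, a typical browser UA does not.
print(is_crawler("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(is_crawler("Mozilla/5.0 (Windows NT 10.0) Firefox/120.0"))
```

A real site would usually hook a check like this into its request handler (or rely on robots.txt for well-behaved crawlers) rather than hard-coding names.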