LinksManager.com runs an automated Reciprocal
Link Checker and Dead Link Checker (User-agent: linksmanager and
User-agent: linksmanager_bot) for all
LinksManager customers' web sites. If you are linking with a website that
is powered by LinksManager.com, you might see LinksManager.com/linkchecker.html listed in your
server log reports.
LinksManager's Link Checkers are part of the LinksManager Service and behave like search engine spiders
and verifiers, checking to see
if customer account links are dead and/or if the link swap partners are linking back to the
customer's web site.
Both of these automated processes abide by any relevant instructions in a
website's robots.txt file. The checkers only browse text and do not download high bandwidth files such as
graphics or rich media. The checkers use proxy servers and are programmed to be
"smart", looking for links pages by first spidering paths that include
the words "link" or "resources". The checkers are intentionally programmed not to
put excessive load on any one site: there is at least an eight-second pause
between page requests, and the checkers never request more than 300 pages.
Most checks find the link quickly and move on. In the rare event that 300 pages were requested, it would be over a period of more than 40 minutes and
shouldn't put any stress on the server being checked.
You can decrease spidering time by making sure the word "link" is in
the path to your links pages. For example, if the link is located on a page
whose path contains "link", it will be spidered quickly and only a handful of
pages should be requested from your server. If the link is located on a page
whose path does not contain "link", it will take longer for our spider to find
your links pages, as it will start at yourdomain.com and work its way through
your site looking for our customer's link.
Some webmasters keep their links pages on third party domains.
LinksManager's Link Checkers do not cross domains to third parties.
You can "steer" LinksManager in the right direction with
robots.txt by excluding directories that do not contain your links pages.
LinksManager abides by robots.txt and will not enter directories that are
excluded for User-agent: linksmanager.
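For instance, a robots.txt entry that keeps the checker out of directories that do not contain your links pages might look like the sketch below. The directory names are placeholders; substitute the actual paths on your site:

```
User-agent: linksmanager
Disallow: /images/
Disallow: /cgi-bin/
```

Any directory not listed under a Disallow line remains open to the checker, so it can reach your links pages directly.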
For more information about how to create a robots.txt file for your website, see
the Robot Exclusion Standard.
A good primer on the use of robots.txt can be found at http://www.freefind.com/library/howto/robots.
Here are some examples of how robots.txt can be used to allow or
disallow robot traffic:
This example allows all robots to visit all
files. * means wildcard or all robots:
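Such a file contains only a wildcard User-agent line and an empty Disallow line (an empty Disallow means nothing is blocked):

```
User-agent: *
Disallow:
```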
This example blocks ALL robots:
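A file that blocks every compliant robot from the whole site looks like this:

```
User-agent: *
Disallow: /
```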
This example blocks all robots from only the images and richmedia directories
but it allows robots to visit the rest of the site:
(make sure to use the correct paths for your site; this is just an example)
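Assuming the two directories are /images and /richmedia, the file would read:

```
User-agent: *
Disallow: /images/
Disallow: /richmedia/
```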
This example bans Google from spidering the entire site:
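Google's crawler identifies itself as Googlebot, so such a file would read:

```
User-agent: Googlebot
Disallow: /
```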
This example allows LinksManager to visit only the links pages on this site:
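Assuming the links pages live in /links, a sketch of such a file might look like this (the other directory names are placeholders for the rest of your site):

```
User-agent: linksmanager
Disallow: /images/
Disallow: /store/
Disallow: /blog/
```

Note that /links is deliberately absent from the Disallow lines, so it stays open to the checker.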
By not including a "Disallow: /links" line, LinksManager will still be able to
visit the /links directory in the above example.
Make sure your robots.txt file uses the correct paths for your website.
LinksManager's Link Checkers were designed to be sensitive to webmasters' concerns about robots.
LinksManager saves our customers incredible amounts of time by automating the tasks of verifying reciprocal link swaps and
checking for 404/dead link status.
If you do not want your site checked by LinksManager.com, place a robots.txt file
on your website that excludes User-agent: linksmanager. If you have questions about the checkers,
feel free to Contact Us.
When contacting us, please make sure to include the full address to your website.
Feel free to review our Code of Ethics
page, which discusses the LinksManager concept and additional LinksManager services.