As you read this, broken links could be damaging your website. Aside from the annoyance they cause visitors, they can have an impact on rankings, traffic and conversions. Dead Link Checker crawls through your website, identifying dead links for you to correct.
Why you need to check for dead links
Dead links are bad news. They can damage your website by:
- Preventing search engine crawlers (Google, Yahoo, Bing, etc.) from indexing the page, which negatively impacts your rankings.
- Damaging user experience by leading visitors to 404 error pages. The irritation this causes can then have a negative impact on traffic and conversions.
What Dead Link Checker can be used for
Dead Link Checker is a handy tool for webmasters, SEO professionals and agencies. Webmasters can use Dead Link Checker to:
- Identify and correct broken links on their site.
- Analyse broken links on their competitors' sites.
- Identify and correct broken links on their clients' sites.
- Analyse broken links on their clients' competitors' sites.
- Identify opportunities for broken link building.
What we offer
- Four ways to check your websites.
- Free manual checking - for single and multiple sites.
- Three great value subscription packages for automated link checking - ideal for agencies and webmasters who need to regularly monitor more than one site.
How it works
The URL you provide is scanned for links to other web pages. Any links which are found are then checked to confirm that they exist. If they do exist, and if they are on the same website as the original URL, then they too will be scanned for links.
The depth of this search depends on the 'scan type' selected - a quick scan only checks the links on the page provided; a full scan follows links down to a maximum depth of ten levels.
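To make the process concrete, here is a minimal sketch of this kind of crawl in Python. It is purely illustrative rather than Dead Link Checker's own code, and it assumes the requests and beautifulsoup4 packages are installed; the start URL and depth limit shown are placeholders.

    # Illustrative sketch only - not Dead Link Checker's implementation.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    def find_dead_links(start_url, max_depth=10):
        """Fetch pages breadth-first, check every link found, and collect the dead ones."""
        site = urlparse(start_url).netloc
        seen, dead = set(), []
        queue = [(start_url, 0)]
        while queue:
            url, depth = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                resp = requests.get(url, timeout=10)
            except requests.RequestException:
                dead.append(url)                  # could not connect at all
                continue
            if resp.status_code >= 400:
                dead.append(url)                  # e.g. 404 Not Found
                continue
            # Only pages on the original website are scanned for further links,
            # and only down to the selected maximum depth.
            same_site = urlparse(url).netloc == site
            is_html = "text/html" in resp.headers.get("Content-Type", "")
            if same_site and is_html and depth < max_depth:
                soup = BeautifulSoup(resp.text, "html.parser")
                for a in soup.find_all("a", href=True):
                    queue.append((urljoin(url, a["href"]), depth + 1))
        return dead

    # A quick-scan style run: only the links on the start page are checked.
    print(find_dead_links("https://www.example.com/", max_depth=1))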
The scan will respect the target site's robots.txt file; for more information, visit robotstxt.org.
Excluding directories from the scan
The user-agent used by this site when scanning is 'www.deadlinkchecker.com'. You can exclude directories from the scan by adding a section to robots.txt, such as:
User-agent: www.deadlinkchecker.com
Disallow: /shoppingbasket/
Crawl-delay: 1
This will disallow access to any pages in the shopping basket directory or its subdirectories. The optional crawl-delay directive tells the scanner to wait at least one second between page requests on this domain. This can be used to avoid putting a high load on the server, but it may result in the full scan taking significantly longer to complete.
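As an aside, a crawler written in Python could honour such a robots.txt section with the standard urllib.robotparser module. The sketch below uses a placeholder domain and is illustrative only, not the scanner's own code; the user-agent string matches the one given above.

    # Illustrative sketch only, using a placeholder domain.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    agent = "www.deadlinkchecker.com"
    print(rp.can_fetch(agent, "https://www.example.com/shoppingbasket/item1"))  # False: disallowed above
    print(rp.crawl_delay(agent))                                                # 1: wait at least one second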
For more information on broken links and link rot, see here.
Other free to use utilities from DLCWebsites.com
- RandomNumberGenerator.com - generates random numbers within any range that you specify. Fully customisable.
- WhatAreCookies.com - details of how to enable and disable cookies on your browser. Includes Internet Explorer, Chrome, Firefox and several others.
Contact us
If you have any questions or feedback about the tool, please email us.

Dead Link Checker
DLC Websites
118 Moggs Mead
Petersfield
Hampshire
United Kingdom
GU31 4PY