It would seem that link building tools are just like buses – you wait ages for one to come along and then two come at once. Case in point: this week I came across two new tools from Citation Labs called Outbound Scraper and Contact Finder.
The idea behind these tools (scraping a page, extracting its outbound URLs, then finding contact details for those URLs) is nothing new. What is different is that, unlike previous tools that have promised these features, these actually work, and more importantly, they work well.
Let’s start from the beginning though. Why exactly would you need these tools?
As I’m sure anybody who has ever done any link building will know, lists are a link builder’s friend, and the Internet is full of lists. It’s a safe bet that whatever industry you’re working in, somebody somewhere will have compiled a list of the top blogs or sites in that industry, and more often than not it will be updated annually.
The problem with lists, especially long ones, is that extracting the URLs, visiting the sites and finding contact details and site metrics can be very time-consuming (not to mention boring). This is where Outbound Scraper and Contact Finder come in.
I thought the best way to demonstrate how useful these two tools are was to show a real-life example. We are currently working on promoting an infographic for a fashion client, a big part of which is conducting blogger outreach to the fashion blog community, so this was my starting point.
The first step is to find a list to scrape. In this instance I have two lists of fashion blogs that I am going to combine:
To use the Outbound Scraper you simply enter the URL of the page you want to scrape, give the job a title and hit the scrape outbound links button. Typically this process will only take a few seconds and will provide you with a CSV file of all that page’s outbound link URLs via the “Outbounds” tab.
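Outbound Scraper is a hosted tool, but the core idea behind this step — pull every off-site link out of a page — can be sketched in a few lines of Python using only the standard library (the page content and URLs here are made-up placeholders):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class OutboundLinkParser(HTMLParser):
    """Collects absolute URLs from <a href> tags that point off-site."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_domain = urlparse(base_url).netloc
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative links, then keep only those on a different domain
        absolute = urljoin(self.base_url, href)
        domain = urlparse(absolute).netloc
        if domain and domain != self.base_domain:
            self.outbound.append(absolute)

# A tiny stand-in for a fetched "top blogs" list page
page_html = '<a href="http://fashionblog.example/">Blog</a> <a href="/about">About</a>'
parser = OutboundLinkParser("http://listsite.example/top-blogs")
parser.feed(page_html)
print(parser.outbound)  # only the off-site link survives
```

In practice you would fetch the page first (e.g. with `urllib.request`) and feed the response body to the parser, but the filtering logic is the same.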
I would recommend quickly opening the list, checking it and filtering out any unwanted URLs. Usually you will find Twitter, Facebook and the like showing up.
Now that you have all the scraped URLs, you can enter them into the Contact Finder. Simply copy and paste the full list into the tool and run the report. Again, this will typically take only a few seconds and will provide you with a CSV of all the URLs with accompanying email addresses and/or contact page URLs. Obviously, the success rate for finding the correct email address is not perfect, but it is very good, and when the tool is unable to find an email address it will nearly always return a valid contact page or form URL. In this example the Contact Finder managed to find contact details for 160 out of 252 URLs, and across the 13 reports I have run so far, the tool has had a 68% success rate at finding a contact detail.
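For a rough idea of how this kind of contact discovery works under the hood, here is a simplified sketch that pulls the first email address and the first contact-page link out of a page’s HTML. This is not how Contact Finder is actually implemented (that isn’t public); it is just an illustration of the technique, and the sample HTML is invented:

```python
import re

# Loose patterns: good enough for illustration, not production-grade validation
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
CONTACT_RE = re.compile(r'href="([^"]*contact[^"]*)"', re.IGNORECASE)

def find_contact(html):
    """Return (email, contact_page_url); either may be None."""
    emails = EMAIL_RE.findall(html)
    pages = CONTACT_RE.findall(html)
    return (emails[0] if emails else None, pages[0] if pages else None)

page_html = '<a href="/contact-us">Get in touch</a> or mail editor@blog.example'
email, page = find_contact(page_html)
print(email, page)  # editor@blog.example /contact-us
```

A real implementation would also handle `mailto:` links, obfuscated addresses and contact forms, which is presumably where the tool’s 68% hit rate comes from.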
So now that you have a long list of target URLs and the relevant contact information, what’s next? Usually I would want to prioritise my list of targets, and to do this I would need accompanying metrics for the sites in my list: things such as number of backlinks and linking domains, PageRank, DA, PA and geographic location, to name but a few. Collecting these manually for a list of more than a hundred sites would take a very long time. Fortunately, this is where another great backlink tool comes in – Backlink Profiler from Link Research Tools.
Although this isn’t really what this tool is designed for, it is very useful. To automatically collect all these metrics, follow these steps:
- Go back to your scraped list of URLs, copy and paste the list into a new spreadsheet, insert a cell at the top of the list and type “URL” into it. Save this as a .csv file. It should look something like this:
- Now start a Backlink Profiler Report (BPR) in Link Research Tools (LRT).
- It doesn’t matter what you enter in the URL field, as we’re not actually interested in finding any backlinks, but you do need to enter a valid URL. For this example I’m going to enter www.epiphanysolutions.co.uk.
- Click the choose file button and select the CSV we created earlier with our list of URLs. Ensure the “Analyze Uploaded File Only” option is checked, so that instead of retrieving the metrics for the URL we entered, the tool will only get them for the URLs listed in the CSV.
- Next, select which metrics you wish to use. This is entirely dependent on what you want to achieve. For this example I have chosen Basic SEO Stats, Google Metrics, Link Source Country, SEOmoz metrics and Majestic Metrics.
- Hit “run report”.
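The CSV preparation in the first step above can itself be scripted. A small sketch using Python’s `csv` module (the filename and URLs are placeholders):

```python
import csv

# The scraped (and filtered) URL list from earlier
urls = ["http://blogone.example/", "http://blogtwo.example/"]

# One column, with the "URL" header cell the BPR upload expects
with open("scraped_urls.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["URL"])
    writer.writerows([u] for u in urls)
```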
The tool will now go and collect all the metrics you requested for all the URLs in your list.
Now all you need to do is export and combine this data with the contact details, and you have a comprehensive list of target sites and contact details which can be prioritised however you like. For example, perhaps you only want to target sites hosted in the UK, or maybe you want to prioritise the most authoritative first based on Domain Authority. With all this data, that is easily achievable with some simple Excel filtering. This whole process, from scraping the URLs, to finding contact details and gathering all the site metrics, took about ten minutes!
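The same join-and-prioritise step works just as well in Python as in Excel. A sketch, assuming two exported CSVs with made-up column names (`URL`, `Email`, `DA`, `Country`) — adjust the headers to whatever your exports actually contain:

```python
import csv
from io import StringIO

# Stand-ins for the two exported files
contacts_csv = """URL,Email
http://blogone.example/,editor@blogone.example
http://blogtwo.example/,hello@blogtwo.example
"""
metrics_csv = """URL,DA,Country
http://blogone.example/,34,UK
http://blogtwo.example/,51,US
"""

contacts = {r["URL"]: r for r in csv.DictReader(StringIO(contacts_csv))}
metrics = list(csv.DictReader(StringIO(metrics_csv)))

# Join the two exports on URL
combined = [
    {**m, "Email": contacts[m["URL"]]["Email"]}
    for m in metrics
    if m["URL"] in contacts
]

# Example prioritisation: UK-hosted sites only, highest DA first
uk_targets = sorted(
    (r for r in combined if r["Country"] == "UK"),
    key=lambda r: int(r["DA"]),
    reverse=True,
)
print([r["URL"] for r in uk_targets])
```

With real files you would replace the `StringIO` wrappers with `open(...)` calls; the join and sort are unchanged.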
Any thoughts? @alexp622