Blog Posts About


This section is all about things I do for a living. I currently work at , where my main task is to acquire new clients for the business. If you haven’t guessed, this section is almost exclusively devoted to Internet Marketing with a few posts about code mixed in.

Diary WordPress

Merged All My Old Web Properties

I could go on a rant about how I want to create a stronger personal brand, but that would be bullshit.

While doing some autumn cleaning, I discovered a few funky files informing me that I’d been hacked.

I had neglected to update core WordPress, themes and plugins on a lot of my first web projects. I have paid the price for my laziness.

I feel like it will be easier to manage and less risky if I just dump all my content here.

I had not found a niche that I was particularly passionate about. I also hadn’t/haven’t found the willpower to force myself into becoming obsessed with a topic. So I am not too affected by the hack (most of the sites had been abandoned); it’s just pretty annoying to have to half-ass redirect everything late at night.

In any case, there’s a bunch more cleaning to be done in the next few days.

AdWords JavaScript

AdWords Broken URL Checker That Won’t Timeout Ever

A common problem with broken URL checker scripts is that they tend to timeout when running on large AdWords accounts. This is because Google would rather you use their API.

I’d love to start using the Google API and I even got my test MCC account up and running. The problem is that I’m not really at that level of coding comfort yet, so I’ve started playing with scripts. The following Broken URL Checker was my first time using JavaScript!

Don’t get me wrong, there’s a few great link checker AdWords scripts out there and some not so great ones.

Here’s one script that works perfectly fine on smaller AdWords accounts.

Google also has its own Link Checker script that checks 800 URLs every execution. It’s actually pretty neat, but I haven’t tested it out.

In theory, Google’s Link Checker should also work if the Free AdWords Scripts version keeps timing out. Instead, I decided to write my own little version of that script, one that allows me to segment my account. In other words, instead of running one script to check all my destination pages for errors, I run multiple scripts that each check a unique set of campaigns.

It’s heavily inspired by (partially stolen from) Russ Savage’s script, but there are some key differences.

The most notable difference is the fact that I no longer take a look at keyword destination URLs. I don’t have that many in my account so I have a separate script that doesn’t use my custom campaign filter.

In fact, the filter in the AdGroup selector is the only other difference other than some comments that I just added.

My Broken URL Checker Code

/* Broken URL Checker /w Segments
 * Version 1.0
 * Created by: Philip Tomlinson (@philtomm)
 * Modified version of Russ Savage's Broken URL Finder
 */
function main() {
  // You can add more response codes if you want:
  var BAD_CODES = [404, 500];
  var TO = [''/* insert your email in the quotes */];
  var SUBJECT = 'Broken AdWords for Campaign URL Report - ' + _getDateString();
  // You may want to add which letters you're targeting to the subject line
  var HTTP_OPTIONS = {
    muteHttpExceptions: true
  };
  // This is the start of my changes
  // Establishing variables for the alphabet filter
  var alphaIndex;
  var alphabet = ["A","B","C","D","E","F","G","H","I","J","K","L","M",
                  "N","O","P","Q","R","S","T","U","V","W","X","Y","Z"];
  var iters = [];
  // This filter selects all campaigns starting with the letters A to G.
  for (alphaIndex = 0 /* starting letter number */; alphaIndex < 7 /* stop after the 7th letter (G) */; ++alphaIndex) {
    var iterPush = [
      AdWordsApp.ads()
        .withCondition("Status = 'ENABLED'")
        .withCondition("AdGroupStatus = 'ENABLED'")
        .withCondition("CampaignStatus = 'ENABLED'")
        .withCondition("Type = 'TEXT_AD'")
        .withCondition("CampaignName STARTS_WITH '" + alphabet[alphaIndex] + "'")
        .get()
    ];
    iters = iters.concat(iterPush);
  }
  /* Everything else is from Russ Savage's original post. */
  var already_checked = {};
  var bad_entities = [];
  for (var x in iters) {
    var iter = iters[x];
    while (iter.hasNext()) {
      var entity = iter.next();
      if (entity.getDestinationUrl() == null) { continue; }
      var url = entity.getDestinationUrl();
      if (url.indexOf('{') >= 0) {
        // Let's remove the ValueTrack parameters
        url = url.replace(/\{[0-9a-zA-Z]+\}/g, '');
      }
      if (already_checked[url]) { continue; }
      var response_code;
      try {
        Logger.log("Testing url: " + url);
        response_code = UrlFetchApp.fetch(url, HTTP_OPTIONS).getResponseCode();
      } catch (e) {
        // Something is wrong here, we should know about it.
        bad_entities.push({e: entity, code: -1});
      }
      if (BAD_CODES.indexOf(response_code) >= 0) {
        // This entity has an issue. Save it for later.
        bad_entities.push({e: entity, code: response_code});
      }
      already_checked[url] = true;
    }
  }
  var column_names = ['Type','CampaignName','AdGroupName','Id','Headline/KeywordText','ResponseCode','DestUrl'];
  var attachment = column_names.join(",") + "\n";
  for (var i in bad_entities) {
    attachment += _formatResults(bad_entities[i], ",");
  }
  if (bad_entities.length > 0) {
    var options = { attachments: [Utilities.newBlob(attachment, 'text/csv', 'bad_urls_' + _getDateString() + '.csv')] };
    var email_body = "There are " + bad_entities.length + " urls that are broken. See attachment for details.";
    for (var i in TO) {
      MailApp.sendEmail(TO[i], SUBJECT, email_body, options);
    }
  }
}

// Formats a row of results separated by SEP
function _formatResults(entity, SEP) {
  var e = entity.e;
  if (typeof(e['getHeadline']) != "undefined") {
    // this is an ad entity
    return ["Ad", e.getCampaign().getName(), e.getAdGroup().getName(), e.getId(),
            e.getHeadline(), entity.code, e.getDestinationUrl()].join(SEP) + "\n";
  } else {
    // and this is a keyword
    return ["Keyword", e.getCampaign().getName(), e.getAdGroup().getName(), e.getId(),
            e.getText(), entity.code, e.getDestinationUrl()].join(SEP) + "\n";
  }
}

// Helper function to format today's date
function _getDateString() {
  return Utilities.formatDate((new Date()), AdWordsApp.currentAccount().getTimeZone(), "yyyy-MM-dd");
}

N.B. I did not test my code after adding some comments. If this doesn’t work, please tell me in the comments! I’m a known typo machine.


Steal backlinks from a competitor’s product pages

Last month, a guest post by Chris Laursen caught my eye. It was about link building tactics for eCommerce that do not require quality content. One prospecting tactic Chris used was uncovering the backlink profiles of closed businesses. I decided to test something slightly different. Rather than look at closed businesses, the goal will be to steal backlinks from an active competitor.

In theory, webmasters should want to refer their readers to a place where they can actually buy the mentioned product, right? That’s why I decided to uncover how many broken links to product pages and links to out-of-stock products I could find for one company. If the results are satisfactory, I might test out outreach with a real competitor of mine.

I decided to run my prospecting test with SSENSE. Why them? I have a friend that works there and I don’t want to warn my competitors that I’m planning to steal backlinks from them.

Getting Those Dirty Leads

I was pleasantly surprised to see that SSENSE has a great URL structure. Because the URL itself specifies that a page is a product page, I’ll be able to search for backlinks using the prefix mode in Ahrefs.


Of course, it also means I’ll only be checking the Men’s section. By checking only one backlink per domain, I’ve gotten 1510 results. Not bad!

If You Don’t Got That Prefix

If SSENSE had a flat URL structure, I would have needed to figure out a unique footprint associated with the product pages, scrape the Link URLs for it, remove the fat and continue to the next step.

Find Those 404 Errors

I’m going to assume that my fake eCommerce store carries an identical inventory to SSENSE. If that wasn’t the case, I might be interested in cutting out various brands or item categories.

Some people might be tempted to do this:


Doing such a check with SeoTools for Excel isn’t 100% wrong. It would just be a waste of time because there are definitely duplicate URLs in that Ahrefs export. In this case, I was able to reduce the list by 1/3 by copying the Link URLs to another sheet and removing duplicates. Not too shabby.
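Outside of Excel, the same deduplication takes only a few lines in most languages. Here’s a minimal JavaScript sketch, assuming the Link URLs from the export have been read into an array (the URLs below are made up for illustration):

```javascript
// Hypothetical list standing in for the "Link URL" column of an Ahrefs export.
var linkUrls = [
  "https://www.ssense.com/en-ca/men/product/a",
  "https://www.ssense.com/en-ca/men/product/b",
  "https://www.ssense.com/en-ca/men/product/a"
];

// Keep only the first occurrence of each URL.
function dedupe(urls) {
  var seen = {};
  return urls.filter(function (url) {
    if (seen[url]) { return false; }
    seen[url] = true;
    return true;
  });
}

console.log(dedupe(linkUrls)); // two unique URLs left to status-check
```

Every duplicate you drop here is one fewer HTTP status check to wait on later.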

If you know some free tools that do reliable HTTP status checks really fast, I’d love to hear about them in the comments because it’s annoying to wait for this check to end.

Once that’s done, copy and overwrite the column by pasting the results as values rather than formulas. This is a habit I have developed when dealing with large columns of formulas, and it can really save you headaches.


For some weird reason, SSENSE redirects users to their 404 page. These redirections only account for less than 10% of all 301 redirects. The other redirects were due to a change in the URL structure. They were not redirecting sold out products to their home page or related pages.

Using VLOOKUP, I was able to confirm that there was only one domain per 404 error. In any case, that’s still 9 potential links to steal if you’re carrying the product or something extremely similar!

Discover What’s Out of Stock

While I was waiting for the HTTP status check to finish, I confirmed that sold-out product pages aren’t redirected and are easily identifiable.


Because SSENSE has implemented rich snippets for products, it’s really easy to scrape their product availability using XPath.

If you’re using SeoTools for Excel, don’t be tricked into using =XPathOnUrl(H2,"//meta[@itemprop='availability']/@content"). That function will not give you the content of the meta tag; it will only confirm that it exists. You must use =XPathOnUrl(H3,"//meta[@itemprop='availability']","content") instead to see the actual contents.
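For readers working outside Excel, here’s a rough JavaScript sketch of the same idea: a regex standing in for that XPath query, run against the raw HTML you’d fetch from each product page. The sample markup is hypothetical, and the regex assumes the itemprop attribute appears before content in the tag:

```javascript
// Regex stand-in for //meta[@itemprop='availability']/@content.
// Assumes itemprop comes before content in the tag, as in typical markup.
function getAvailability(html) {
  var match = html.match(
    /<meta[^>]*itemprop=['"]availability['"][^>]*content=['"]([^'"]+)['"]/i
  );
  return match ? match[1] : null; // null when no availability meta tag exists
}

var sampleHtml = '<meta itemprop="availability" content="InStock">';
console.log(getAvailability(sampleHtml)); // "InStock"
```

A proper HTML parser would be more robust than a regex, but for a quick scrape of pages with consistent rich snippet markup this gets the job done.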

Once again, there’s a bit of wait.


Out of all the valid URLs, over 75% were out of stock! If I were SSENSE, I’d be checking to see how much referral traffic the product pages are getting…

However, that’s good news for the people who still have those products in stock and want to steal those links.

Alternative Method

If product rich snippets aren’t implemented, you could always use ScrapeBox to check if “Sold Out” is present on the page.

What’s Left To Do

Before even beginning to harvest emails, you’ll just need to make sure you have the item in stock.

The only roadblock I can see is that some of these backlinks are in articles about SSENSE. Therefore, it may be hard to pitch a replacement link. However, if you have the product in stock and a good price, you may still be able to get a link on the same page if the webmaster is open to it.

I’m still amazed that over 1000 external links were to out of stock products.

If you’ve tried this method, I’d love to hear what your results were.


Google+ profile links show up as nofollow when logged in

The second link building tactic from Backlinko’s blog post also ended up being a bit trickier than expected. This time it wasn’t because of a WordPress RSS feed issue, it’s because Google+ shows all profile links as nofollow when you’re logged in.

What is even more misleading is that selecting to view your public profile also shows a nofollow link.


I decided to double check some other Google+ profiles. All of them actually had dofollow links in their story section. I logged out and checked my true public profile, and the link became dofollow!

dofollow google plus profile link

I have no idea why Google+ changes the follow attribute of profile links this way. Maybe Google is trying to limit SEO abuse?


How to fix invalid WordPress RSS feed errors

I recently read a great article by Brian Dean about untapped backlink sources. While some are definitely not untapped, I decided to try two of them on Another Marketer. The first source I tapped was blog aggregators. I had no problems submitting my blog feed until I hit Alltop. According to Alltop, I had submitted an invalid WordPress RSS feed.

I checked the RSS feed on and, to my surprise, my feed actually had some syntax errors. I’m not sure why this happened, as I hadn’t touched any of the source files for the WordPress RSS feed, and I doubt any of the Bone WordPress theme functions had anything to do with it.

Easy Invalid WordPress RSS Feed Fix

All I needed to do was this:

  1. Navigate to the WordPress Reading Settings
  2. Change how the articles were being displayed from full to summary

Fix invalid WordPress RSS feed errors by selecting a new RSS feed option

I’m really not sure why this worked, but it did. If you’re having the same problems but your setting is at summary, try switching it to full instead.


Two scalable branded link reclamation tactics

This post is a work in progress. It requires more images and maybe a video tutorial!

There are a lot of tutorials and bloggers that state branded link reclamation is really easy. According to these SEOs, all you need to do is follow these 4 simple steps:

  1. Navigate to Google
  2. Search for your brand
  3. Find a non-linking brand mention
  4. Send an email

The four-step process is a lie. Combing through your brand mentions like this takes time and isn’t scalable if you’re working for Toyota or any other large brand.

These types of businesses get mentioned multiple times every day and often own a large number of other brands. By taking a look at all these mentions manually, not only will you encounter a high number of linking mentions, but a high number of unusable ones will also show up in search results.

This means countless hours are wasted just to turn one mention into a link. You’ll be wasting your time for a very low return. In this post, I’ll show two branded link reclamation tactics that I have used with great success. I’ll also explain how to check for non-linking mentions as well as an easy way to find contact information in order to tell them what’s up.

Reclaiming Branded Links with Scrapebox

If you aren’t using Scrapebox because you’ve heard it’s a blackhat tool, you need to wise up. While some people like to use it for comment spam, it’s actually a great tool for uncovering guest posting opportunities, eliminating dead links from your Ahrefs or Majestic backlink exports, verifying the DA of a huge number of domains and a lot of other everyday SEO tasks. I’m going to assume readers of this blog know their Scrapebox basics or have at least read Jacob King’s Ultimate Guide.

Setting Up Your Scrape

Personally, I don’t think you need to do much here. If you have a list of the branded terms you’re looking for, it should be enough. The reason I don’t suggest using the Scrapebox Keyword Scraper is that you might end up having to waste some time cleaning it up.

Due to the fact that I work mainly with Canadian brands, I will merge a list of footprints tied to site: queries with country, province and city name mentions within my keyword list. Lastly, I’ll ensure to have in my global footprint.

After that’s done, you really just need to start harvesting. It’s really that easy to get more than a thousand URLs that have potentially mentioned your brand.

Cleaning Your Results

One of the main issues with a Scrapebox harvest is that it tends to be very dirty. If you were looking for brand mentions of a Jeep Grand Cherokee, you’ll definitely get some sites that are actually about Native Americans instead. An easy way to eliminate those sites is by cleaning your harvest in Excel.

Using SeoTools for Excel, it’s extremely easy to check whether pages mention any keywords that would signify a scrape error. In our example, I’d most likely run a regex similar to, but more complex than, this: =IFERROR(RegexpFindOnUrl("","Native"), TRUE). I’d keep all TRUE URLs and manually check the others if I had the time or motivation.
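The same false-positive check can be scripted outside Excel. Below is a small JavaScript sketch, assuming you’ve already fetched each harvested page’s text; the keyword list and sample strings are hypothetical, following the Jeep Grand Cherokee example:

```javascript
// Flag pages whose text contains any keyword that signals a scrape error.
// pageText would come from fetching each harvested URL.
function isLikelyScrapeError(pageText, badKeywords) {
  return badKeywords.some(function (kw) {
    return pageText.toLowerCase().indexOf(kw.toLowerCase()) >= 0;
  });
}

console.log(isLikelyScrapeError("A history of Native American tribes", ["Native"])); // true
console.log(isLikelyScrapeError("2014 Jeep Grand Cherokee review", ["Native"]));     // false
```

URLs flagged true get dropped or manually reviewed; the rest move on to the link check.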

Monitoring Brand Mentions for Link Reclamation

Now if you aren’t comfortable getting your hands a bit dirty with some PHP code, you won’t be able to actually use this second link reclamation tactic. It’s a more scalable version of Moz’s Fresh Web Explorer and services like

Creating a Huge Number of Google Alerts

You’ll want to set up a lot of Google Alerts. I’m talking 20 to 50 different types of alerts. The easiest way to do this without getting suicidal thoughts is to automate it. An easy and efficient solution is to just create a ReMouse macro using your browser and Excel columns with each alert.

Setting Up SimplePie

SimplePie is a great tool for filtering out useless mentions. I’m working on getting it to do the actual link verification for me. Until then, here are my two introductory posts on SimplePie:

How to merge your Google Alerts RSS feeds and filtering out items

Removing duplicate items and an another method to filter out items

Once you’ve set this up, all you will need to do is hit your page up once every few days to find new brand mentions.

Finding Non-Linking Brand Mentions

After getting our leads using either Scrapebox or our custom monitoring system built with SimplePie, we need to start turning those non-linking brand mentions into links.

This is fairly easy. Just install the Link Checker add-on. You’ll need to create two text files. One will contain the list of sites you scraped using one of the two methods above. The other must contain the list of domains you are interested in checking.

I suggest running the link checker multiple times while cleaning out positive hits. Depending on the number of URLs on the same domain and other factors, I tend to get a few false negatives on my first link checker runs.

Scraping Emails for Mass Mailing

While I know there are some other tools that will scrape email addresses from sites, I haven’t had the budget to invest in any of them yet. My current technique is to just use Scrapebox. I take the list of non-linking URLs and trim to root. I move this list to Excel and create a list where I concatenate “/about” and another where I concatenate “/contact”. These three lists paired with the Scrapebox Email Grabber tend to give me great results.
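If you’d rather script that Excel step, here’s a rough JavaScript sketch of the same idea: take the trimmed-to-root domains and generate the /about and /contact variants to feed into the email grabber. The example domain is a placeholder:

```javascript
// Build the three URL lists (root, /about, /contact) from trimmed-to-root domains.
function buildContactUrls(roots) {
  var urls = [];
  roots.forEach(function (root) {
    var base = root.replace(/\/+$/, ''); // trim trailing slashes
    urls.push(base, base + '/about', base + '/contact');
  });
  return urls;
}

console.log(buildContactUrls(['http://example.com/']));
// ['http://example.com', 'http://example.com/about', 'http://example.com/contact']
```

The output can be pasted straight back into a text file for Scrapebox.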

If you can suggest a great email scraper in the comments, I’d be enormously grateful.

Remember to have “Email Grabber: Save url with emails” enabled in options.

Preventing Spam Accusations

Now that you have your emails, you just need to start sending out some messages. The best way to prevent spam accusations is to not mass mail the entire list you just got, and to be polite…

eCommerce SEO

Creating my first eBay listings

I’ve been working on helping a brick and mortar store build their first eCommerce site. I’ve been running into roadblocks due to their lack of inventory management practices. The store manager refuses to install a POS and has never had employees track their inventory daily. In order to start training the staff to keep track of their inventory and fix some small kinks in our shipping strategy, I decided to create my first eBay listings.

It may seem counterintuitive to start an eBay account for the future eCommerce store, but these are the main reasons I have decided to do it:

  • Highlight importance of proper inventory management
  • Ensure shipping runs smoothly
  • Start making some first sales
  • Motivate staff by teaching them a new skill

An interesting point brought up by Jack Stonehouse on God of SEO is that these product listings and the eBay profile itself will create some powerful links pointing to the eCommerce websites. I hadn’t even thought about that, but it’s definitely another advantage to creating my first eBay listings.


If you’re a legitimate company, you’ll be able to create an eBay business account first. The advantage of this type of account is that you can keep your business name as your username. The disadvantage is that it requires your business tax numbers. If your company isn’t government approved, you’ll have to stick with a personal eBay account.

Here’s a slightly edited version of the eBay listing tutorial that I created for the store’s employees.

Creating Your First eBay Listings


After you log in to the eBay account, navigate to the “Sell an item” page. If you’re having trouble finding it, the page is in the “Sell” section of the menu.

Creating a Listing Versus Using a Template

While you can use a template and only modify the important part of the listing such as the title, item attributes and product description, this tutorial will be showing you how to create a new eBay listing from scratch. After you have created your first item, you may want to save your personal template for future use.

The only difference between a new listing and a template is that most of the information will be filled out already. It won’t necessarily be the right information for the new product you want to list.

difference between a new eBay listing and a template

The first thing you will want to do is enter the name of your product. It should have all the important descriptors such as the brand, size, color and any other fact about the product that may be interesting for the typical user.

Choosing the eBay Category

choose a category for your first ebay listings

eBay will suggest a category for your listing but, most of the time, it will be a category that doesn’t truly reflect the users we are targeting. In addition, you’ll want to test out various categories to see which one works best for which type of products. That’s why you often will want to choose the category yourself or use a category that you know has worked in the past.

Describing Your eBay Listing With Item Attributes

Adding item attributes

The next step will be to fill in the attributes. Depending on the category you selected, there will be a few default ones. You’ll definitely want to create some new ones. The more item attributes the product has, the better.

Creating custom eBay item attributes

Remember to keep the same formatting as the past attributes. The title goes on the top and the value goes on the bottom.

Adding Pictures to Your eBay Listing

Uploading pictures to eBay

The next step is to take and upload pictures of the item. Sometimes, they’ll show up upside down. If that’s the case, you can rotate them directly in eBay. In some cases, there will be no good images to use for the product. While I don’t suggest it, you can look online for a better image and upload it instead.

Writing Your Product Listing Description

Adding a description to your eBay listing

After you’re done with the images, it’s time to write the product description. I have written an HTML template which is not present in this blog post with a basic design that can be used. All that is needed from you is to change the titles and ensure the description adequately describes the product.

Don’t be shy. If you feel like writing more text, do it!

Setting Up Your Auction Price

Setting up your listing's price

I will not be getting into fixed-price listings for now, because we will be focusing on liquidating caps through auctions. If we decide to go forward and create an eBay store, we will most likely start creating fixed-price listings instead. It is the only way to add item variations and showcase an inventory for a specific product on eBay.

Remember to confirm with the big boss what starting prices he would like. Reserve prices cost money, and we currently don’t want to invest a penny in the sales.

Selecting payment, shipping and refund options

I have set default values for all these options. Until further notice, please just keep the default ones.


How to remove duplicate URLs and irrelevant pages from RSS feeds with SimplePie

At the end of my last post about using SimplePie, I mentioned that I did not think I would have any issues with removing duplicate URLs. I actually did have some issues with the appearance of duplicate URLs and irrelevant pages in my results. By irrelevant articles, I mean articles where reclaiming a brand mention would not make any sense. In any case, I’ll be showcasing the modifications I have done to my last snippets of code in this post. For those who wish to see both posts combined, there will be an excellent tutorial on branded link reclamation coming up in the next month.

Using Regex To Filter Out Non Relevant Titles

Most of the irrelevant pages were created by sites trying to sell products related to the brand. While I could have still used an array to look for specific substrings in the titles, I decided to use a regex variable. Regex is not only more flexible, but also requires less code to actually filter out results. Instead of looping through each string in the array and comparing it to the item in the RSS feed, only one check is needed.

<?php $titlePattern = "/(U|u)sed|(f|F)or.(S|s)ale/"; ?>

As you can see, the filter looks quite different from the one created previously for domain names. If you look below, you’ll see that instead of doing a check on each item using stristr, you’ll need to use preg_match, which will check for all the combinations in the regex variable at once.

<?php $filtration = 0; ?>
<!-- title check -->
<?php if(preg_match($titlePattern, $item->get_title()) != false){
        $filtration = 1;
      } else {
        //if the title check passes, check the domain
        foreach($filter as $token){
          if(stristr($item->get_permalink(), $token) !== false){
            $filtration = 1;
          }
        }
      } ?>

How to Remove Duplicate URLs & Preserve Relevant Information

You’ll notice that there’s a new variable included in this filter called $filtration. This works as a flag that tells the filter that the URL did not pass the various domain and title filters. The next step is to remove all duplicate domains while preserving only the information we want to display. Unlike the answer in this Stack Overflow question, the filtration creates items that include both the title and the permalink instead of just the title.

<?php //remove duplicates that pass the filter
if($filtration != 1){
    $filteredFeed[$item->get_title()] = array('title' => $item->get_title(), 'permalink' => $item->get_permalink());
}
endforeach; ?>

If you were wondering how this removes duplicate URLs, it’s because the items are keyed by the title of the RSS feed item. Whenever there’s a duplicate title, the later copy simply overwrites the earlier one, so only one entry remains in the list!


Starting an Etsy shop

My girlfriend has made some really cool creations during the Holidays as gifts for our family. In fact, she made a variety of felt foods, a dog coat, and a set of plush owl pillows. Due to the fact that she would have to tailor her creations to her customers and the creation process could take a few days, it seemed smarter to start our first foray into eCommerce by starting an Etsy shop together. Actually, she’ll be the one starting the shop. I’m just going to bug her to work on it every day because I’m helpful.

What You Need to Make an Etsy Shop

You’ll need more than just an idea in mind to create your first Etsy store. It doesn’t take much, but there are a few things you’ll want to have handy during the process.

The obvious first thing you will want to do is choose both a username and a shop name. I’m not really sure why you’re forced to do both. It’s pretty weird that you cannot create more than one shop per username. In fact, you’re encouraged to just register a new username if you want to manage a second Etsy shop.

As I haven’t done much research on Etsy SEO, I won’t claim that a keyword rich shop name will help you in the internal search results even if my intuition suggests it.

etsy shop name

Next step is to start listing some products. In case you didn’t do your research before reading this post, Etsy charges $0.20 for each product listing and takes a 3.5% commission on your sales. Having taken a look at eBay and Amazon listing fees lately, it doesn’t seem that bad in comparison.

Remember, good pictures and copywriting are the core of any successful eCommerce product. It’s no different for Etsy. My girlfriend wrote some decent copy, but she will definitely need to get a better picture of her handmade decorative owls.

In addition, you need to know how you will price your items. While looking at your competition can help you see what people are willing to pay, following a simple pricing formula like this one can be helpful. You should have the weight and the size of your products on hand as you will also need to figure out the shipping costs.

Last but not least, you’ll need to have your checkbook handy. Etsy deposits money to your bank account, so you’ll have to share your banking information. It’s kind of like they are an employer that wants to deposit your pay directly into your bank account.

That’s all folks. You now have a basic Etsy shop running. The next step will be to read more about Etsy SEO and to design the look of the store.


Don’t trust anchor text distribution tools provided by any of the popular backlink checkers

At my day job, I’ve been working on a dashboard to begin monitoring the backlink portfolio of some of our clients. The goal is to be able to provide a detailed report that takes into account our content promotion efforts as well as our clients’ natural link acquisition month by month. I’ve been using this opportunity to do my own backlink checker review. Unlike Matthew Woodward’s review, I wanted to focus on the anchor text distribution and destination URLs of all true live links. These two metrics are much more meaningful when it comes to deciding on the proper offsite tactics to implement.

Since I have just started to perform my own tests, I don’t have any conclusive data as to which backlink checker is the best. Yet I have not seen any backlink checker that doesn’t have at least 10% to 20% dead backlinks in its reports. While that number may look relatively small, it can lead to some horrible conclusions if you’re trying to see whether certain keywords or landing pages should be avoided.

Dead Links Often Skew the Numbers

Here’s an example to illustrate my point. I have a hypothetical client that Ahrefs reports as having exactly 1000 backlinks before I verify their validity using Scrapebox. Out of those links, 50% are said to have branded anchor text, 30% have targeted anchor text and the last 20% are a mix of brand hybrids and targeted. Since I’m interested in supporting my anchor text choices for content promotion campaigns, I take a deeper look at those 300 links with exact targeted text and discover that 100 of them are using “goat cheese” as an anchor. That would be almost 1/3 of targeted links using the same short tail keyword! Most SEOs would then decide to focus on other short tail keywords for the sake of creating natural diversity. That would be wrong.
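To make the skew concrete, here’s a small JavaScript sketch with made-up data showing how a couple of dead links can inflate an anchor’s apparent share of a backlink profile:

```javascript
// Compute an anchor's share of a link pool, optionally counting only
// links that a live-check (e.g. via Scrapebox) confirmed are still up.
function anchorShare(links, anchor, liveOnly) {
  var pool = liveOnly ? links.filter(function (l) { return l.live; }) : links;
  var hits = pool.filter(function (l) { return l.anchor === anchor; }).length;
  return hits / pool.length;
}

// Hypothetical backlink rows: anchor text plus live-check result.
var links = [
  { anchor: "goat cheese", live: false }, // dead link still in the export
  { anchor: "goat cheese", live: true },
  { anchor: "brand name",  live: true },
  { anchor: "brand name",  live: true }
];

console.log(anchorShare(links, "goat cheese", false)); // 0.5 counting dead links
console.log(anchorShare(links, "goat cheese", true));  // ~0.33 counting live links only
```

With the dead link included, “goat cheese” looks like half the profile; prune it and the anchor drops to a third, which can completely change which keywords you decide to avoid.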

In almost all cases where I noticed a certain anchor or landing page appearing to take an unnaturally high percentage of the overall link profile, or of one of the smaller subgroups, it was due to the presence of dead backlinks. In the example above, a mere 10% error would be enough to make most SEOs avoid a potentially lucrative keyword. When it comes to data, your priority should always be exactitude, not speed of reporting.

Another New Project: An Anchor Text Distribution Tool

With the help of Scrapebox and Excel, I’ll be sharing with my small readership how I am creating my own anchor text distribution tool as part of my backlink portfolio management project.