SEO

Steal backlinks from a competitor’s product pages

Last month, a guest post by Chris Laursen caught my eye. It was about link building tactics for eCommerce that don’t require quality content. One prospecting tactic Chris used was uncovering the backlink profiles of closed businesses. I’ve decided to test something slightly different. Rather than looking at closed businesses, the goal will be to steal backlinks from an active competitor.

In theory, webmasters should want to refer their readers to a place where they can actually buy the mentioned product, right? That’s why I decided to uncover how many broken links to product pages, and links to products that are out of stock, I could find for one company. If the results are satisfactory, I might test out outreach with a real competitor of mine.

I decided to run my prospecting test with SSENSE. Why them? I have a friend who works there, and I don’t want to tip off my competitors that I’m planning to steal backlinks from them.

Getting Those Dirty Leads

I was pleasantly surprised to see that SSENSE has a great URL structure. Because the URL itself identifies a page as a product page, I’ll be able to search for backlinks using the prefix mode in Ahrefs.

[Image: SSENSE URL structure]

Of course, it also means I’ll only be checking the Men’s section. By limiting results to one backlink per domain, I got 1510 results. Not bad!

If You Don’t Got That Prefix

If SSENSE had a flat URL structure (http://www.domain.com/product-name), I would have needed to figure out a unique footprint associated with the product pages, scrape the Link URLs for it, trim the fat and continue to the next step.

Find Those 404 Errors

I’m going to assume that my fake eCommerce store carries an identical inventory to SSENSE. If that wasn’t the case, I might be interested in cutting out various brands or item categories.

Some people might be tempted to do this:

[Image: HTTP status check done the wrong way]

Running such a check with SeoTools for Excel isn’t 100% wrong. It would just be a waste of time, because there are definitely duplicate URLs in that Ahrefs export. In this case, I was able to reduce the list by a third by copying the Link URLs to another sheet and removing duplicates. Not too shabby.

If you know of any free tools that do reliable HTTP status checks really fast, I’d love to hear about them in the comments, because it’s annoying to wait for this check to end.
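
In the meantime, here’s a quick and dirty PHP loop that does the same check outside of Excel. This is a minimal sketch: the urls.txt file name is just an example, and it assumes you’ve already deduplicated your Link URLs.

<?php
// Minimal status checker: reads one URL per line and prints its HTTP code.
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($urls as $url) {
	$ch = curl_init($url);
	curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, we only want the status code
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
	curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // keep the 301s visible instead of following them
	curl_setopt($ch, CURLOPT_TIMEOUT, 10);
	curl_exec($ch);
	echo $url . "\t" . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
	curl_close($ch);
}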

Once that’s done, copy the column and overwrite it by pasting the results as values rather than formulas. This is a habit I’ve developed when dealing with large columns of formulas, and it can really save you headaches.

[Image: SSENSE HTTP status results]

For some weird reason, SSENSE 301 redirects users to their 404 page. These redirections account for less than 10% of all the 301s; the other redirects were due to a change in the URL structure. They were not redirecting sold out products to their home page or to related pages.

Using VLOOKUP, I was able to confirm that there was only one linking domain per 404 error. In any case, that’s still 9 potential links to steal if you’re carrying the product or something extremely similar!

Discover What’s Out of Stock

While I was waiting for the HTTP status check to finish, I confirmed that sold out product pages aren’t redirected and are easily identifiable.

[Image: SSENSE sold out product page]

Because SSENSE has implemented rich snippets for products, it’s really easy to scrape their product availability using XPath.

If you’re using SeoTools for Excel, don’t be tricked into using =XPathOnUrl(H2,"//meta[@itemprop='availability']/@content"). That function will not give you the content of the meta tag; it will only confirm that the tag exists. You must use =XPathOnUrl(H2,"//meta[@itemprop='availability']","content") instead to see the actual contents.
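
If you’d rather do the scrape outside of Excel, PHP’s DOMXPath runs the same query without that gotcha, since /@content actually returns the attribute value there. A rough sketch, with a placeholder product URL:

<?php
// Sketch: pull the availability value from a product page's rich snippet markup.
$html = @file_get_contents('http://www.example.com/some-product-page'); // placeholder URL
$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings caused by sloppy markup
$xpath = new DOMXPath($doc);
$nodes = $xpath->query("//meta[@itemprop='availability']/@content");
echo ($nodes->length > 0) ? $nodes->item(0)->nodeValue : 'no availability markup found';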

Once again, there’s a bit of wait.

[Image: SSENSE out of stock results]

Out of all the valid URLs, over 75% were out of stock! If I were SSENSE, I’d be checking to see how much referral traffic the product pages are getting…

However, that’s good news for the people who still have those products in stock and want to steal those links.

Alternative Method

If product rich snippets aren’t implemented, you could always use ScrapeBox to check if “Sold Out” is present on the page.
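
And if you don’t have ScrapeBox handy, the same check is only a few lines of PHP. This sketch assumes a valid-urls.txt file and that the store literally prints “Sold Out” somewhere in its page source:

<?php
// Sketch: flag pages that mention "Sold Out" when rich snippets aren't available.
$urls = file('valid-urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($urls as $url) {
	$html = @file_get_contents($url);
	if ($html !== false && stripos($html, 'Sold Out') !== false) {
		echo $url . "\n"; // candidate out of stock page
	}
}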

What’s Left To Do

Before even beginning to harvest emails, you’ll just need to make sure you have the item in stock.

The only roadblock I can see is that some of these backlinks are in articles about SSENSE. Therefore, it may be hard to pitch a replacement link. However, if you have the product in stock and a good price, you may still be able to get a link on the same page if the webmaster is open to it.

I’m still amazed that over 1000 external links were to out of stock products.

If you’ve tried this method, I’d love to hear what your results were.

SEO

Google+ profile links show up as nofollow when logged in

The second link building tactic from Backlinko’s blog post also ended up being a bit trickier than expected. This time it wasn’t because of a WordPress RSS feed issue; it’s because Google+ shows all profile links as nofollow when you’re logged in.

What is even more misleading is that selecting the option to view your public profile also shows a nofollow link.

[Image: Google+ profile link shown as nofollow]

I decided to double check some other Google+ profiles. All of them actually had dofollow links in their story section. I logged out, checked my true public profile, and the link became dofollow!

[Image: dofollow Google+ profile link]

I have no idea why Google+ changes the follow attribute of profile links this way. Maybe Google is trying to limit SEO abuse?

PHP SEO

Two scalable branded link reclamation tactics

This post is a work in progress. It requires more images and maybe a video tutorial!

There are a lot of tutorials and bloggers stating that branded link reclamation is really easy. According to these SEOs, all you need to do is follow these 4 simple steps:

  1. Navigate to Google
  2. Search for your brand
  3. Find a non-linking brand mention
  4. Send an email

The four step process is a lie. Combing through your brand mentions like this takes time and isn’t scalable if you’re working for Toyota or any other large brand.

These types of businesses get mentioned multiple times every day and often own a large number of other brands. If you look at all these mentions manually, not only will you encounter a high number of already-linking mentions, but a high number of unusable ones will also show up in the search results.

This means countless hours are wasted just to turn one mention into a link. You’ll be wasting your time for a very low return. In this post, I’ll show two branded link reclamation tactics that I have used with great success. I’ll also explain how to check for non-linking mentions as well as an easy way to find contact information in order to tell them what’s up.

Reclaiming Branded Links with Scrapebox

If you aren’t using Scrapebox because you’ve heard it’s a blackhat tool, you need to wise up. While some people like to use it for comment spam, it’s actually a great tool for uncovering guest posting opportunities, eliminating dead links from your Ahrefs or Majestic backlink exports, verifying the DA of a huge number of domains and a lot of other everyday SEO tasks. I’m going to assume readers of this blog know their Scrapebox basics or have at least read Jacob King’s Ultimate Guide.

Setting Up Your Scrape

Personally, I don’t think you need to do much here. A list of the branded terms you’re looking for should be enough. The reason I don’t suggest using the Scrapebox Keyword Scraper is that you might end up wasting time cleaning up its output.

Due to the fact that I work mainly with Canadian brands, I merge a list of footprints tied to site: queries with country, province and city name mentions within my keyword list. Lastly, I make sure to have -site:brand.com in my global footprint.

After that’s done, you really just need to start harvesting. It’s really that easy to get more than a thousand URLs that have potentially mentioned your brand.

Cleaning Your Results

One of the main issues with a Scrapebox harvest is that it tends to be very dirty. If you were looking for brand mentions of a Jeep Grand Cherokee, you’ll definitely get some sites that are actually about Native Americans instead. An easy way to eliminate those sites is by cleaning your harvest in Excel.

Using SeoTools for Excel, it’s extremely easy to check whether pages mention any keywords that would signify a scrape error. In our example, I’d most likely run a more complex version of this regex: =IFERROR(RegexpFindOnUrl("http://www.brandmention.com","Native"), TRUE). I’d keep all TRUE URLs and manually check the others if I had the time or motivation.
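
If the harvest is too big for Excel, the same cleanup can be scripted. Here’s a rough PHP equivalent; harvest.txt and the “Native” pattern are just the Jeep example from above:

<?php
// Sketch: split a Scrapebox harvest into keepers and pages needing a manual look.
$urls = file('harvest.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($urls as $url) {
	$html = @file_get_contents($url);
	if ($html === false) {
		continue; // dead page, not a lead anyway
	}
	echo (preg_match('/Native/i', $html) ? 'CHECK' : 'KEEP') . "\t" . $url . "\n";
}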

Monitoring Brand Mentions for Link Reclamation

Now, if you aren’t comfortable getting your hands a bit dirty with some PHP code, you won’t be able to use this second link reclamation tactic. It’s a more scalable version of Moz’s Fresh Web Explorer and services like Mention.net.

Creating a Huge Number of Google Alerts

You’ll want to set up a lot of Google Alerts. I’m talking 20 to 50 different types of alerts. The easiest way to do this without getting suicidal thoughts is to automate it. An easy and efficient solution is to create a ReMouse macro using your browser and Excel columns containing each alert.

Setting Up SimplePie

SimplePie is a great tool for filtering out useless mentions. I’m working on getting it to do the actual link verification for me. Until then, here are my two introductory posts on SimplePie:

How to merge your Google Alerts RSS feeds and filtering out items

Removing duplicate items and another method to filter out items
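
For the impatient, the core of it is tiny. Here’s a minimal sketch using SimplePie’s multifeed support; the alert feed URLs below are placeholders for your own:

<?php
// Minimal SimplePie merge: one page that pulls all of your Google Alerts feeds.
require_once 'SimplePie.compiled.php';

$feed = new SimplePie();
$feed->set_feed_url(array(
	'https://www.google.com/alerts/feeds/YOUR-ID/ALERT-1', // placeholder
	'https://www.google.com/alerts/feeds/YOUR-ID/ALERT-2', // placeholder
));
$feed->enable_cache(false); // skip caching for the sketch; use set_cache_location() in production
$feed->init();

foreach ($feed->get_items() as $item) {
	echo '<a href="' . $item->get_permalink() . '">' . $item->get_title() . '</a><br>';
}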

Once you’ve set this up, all you’ll need to do is hit your page every few days to find new brand mentions.

Finding Non-Linking Brand Mentions

After getting our leads using either Scrapebox or our custom monitoring system built with SimplePie, we need to start turning those non-linking brand mentions into links.

This is fairly easy. Just install the Link Checker add-on. You’ll need to create two text files. One will contain the list of sites you scraped using one of the two methods above. The other must contain the list of domains you are interested in checking.

I suggest running the link checker multiple times while cleaning out positive hits. Depending on the number of URLs on the same domain and other factors, I tend to get a few false negatives on my first link checker runs.

Scraping Emails for Mass Mailing

While I know there are other tools that will scrape email addresses from sites, I haven’t had the budget to invest in any of them yet. My current technique is to just use Scrapebox. I take the list of non-linking URLs and trim to root. I move this list to Excel and create one list where I concatenate “/about” and another where I concatenate “/contact”. These three lists, paired with the Scrapebox Email Grabber, tend to give me great results.
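
If you’d rather skip the Excel step, a few lines of PHP can build all three lists at once. A quick sketch, assuming roots.txt contains your trimmed-to-root URLs:

<?php
// Sketch: build the three target lists for the Scrapebox Email Grabber.
$roots = file('roots.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$targets = array();
foreach ($roots as $root) {
	$root = rtrim($root, '/');
	$targets[] = $root;              // home page
	$targets[] = $root . '/about';   // likely about page
	$targets[] = $root . '/contact'; // likely contact page
}
file_put_contents('email-grab-targets.txt', implode("\n", $targets));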

If you can suggest a great email scraper in the comments, I’d be enormously grateful.

Remember to have “Email Grabber: Save url with emails” enabled in the options.

Preventing Spam Accusations

Now that you’ve got your emails, you just need to start sending out some messages. The best way to prevent spam accusations is to not mass mail the list of emails you just got, and to be polite…

eCommerce SEO

Creating my first eBay listings

I’ve been working on helping a brick and mortar store build their first eCommerce site. I’ve been running into roadblocks due to their lack of inventory management practices. The store manager refuses to install a POS and has never had employees track inventory daily. In order to start training the staff to keep track of their inventory, and to fix some small kinks in our shipping strategy, I decided to create my first eBay listings.

It may seem counterintuitive to start an eBay account for the future eCommerce store, but these are the main reasons I decided to do it:

  • Highlight importance of proper inventory management
  • Ensure shipping runs smoothly
  • Start making some first sales
  • Motivate staff by teaching them a new skill

An interesting point brought up by Jack Stonehouse on God of SEO is that these product listings and the eBay profile itself will create some powerful links pointing to the eCommerce website. I hadn’t even thought about that, but it’s definitely another advantage to creating my first eBay listings.

[Image: logging in to eBay]

If you’re a legitimate company, you’ll be able to create an eBay business account first. The advantage of this type of account is that you can keep your business name as your username. The disadvantage is that it requires your business tax numbers. If your company isn’t government approved, you’ll have to stick with a personal eBay account.

Here’s a slightly edited version of the eBay listing tutorial that I created for the store’s employees.

Creating Your First eBay Listings

[Image: selling eBay items]

After you log in to the eBay account, navigate to the “Sell an item” page. If you’re having trouble finding it, the page is in the “Sell” section of the menu.

Creating a Listing Versus Using a Template

While you can use a template and only modify the important parts of the listing, such as the title, item attributes and product description, this tutorial will show you how to create a new eBay listing from scratch. After you have created your first item, you may want to save your own template for future use.

The only difference between a new listing and a template is that most of the information will already be filled out. It won’t necessarily be the right information for the new product you want to list.

[Image: difference between a new eBay listing and a template]

The first thing you will want to do is enter the name of your product. It should include all the important descriptors, such as the brand, size, color and any other facts about the product that may interest the typical buyer.

Choosing the eBay Category

[Image: choosing a category for your first eBay listings]

eBay will suggest a category for your listing but, most of the time, it will be a category that doesn’t truly reflect the users we are targeting. In addition, you’ll want to test out various categories to see which ones work best for which types of products. That’s why you will often want to choose the category yourself or use a category that you know has worked in the past.

Describing Your eBay Listing With Item Attributes

[Image: adding item attributes]

The next step will be to fill in the attributes. Depending on the category you selected, there will be a few default ones. You’ll definitely want to create some new ones. The more item attributes the product has, the better.

[Image: creating custom eBay item attributes]

Remember to keep the same formatting as the previous attributes. The title goes on top and the value goes on the bottom.

Adding Pictures to Your eBay Listing

[Image: uploading pictures to eBay]

The next step is to take and upload pictures of the item. Sometimes, they’ll show up upside down. If that’s the case, you can rotate them directly in eBay. In some cases, there will be no good images to use for the product. While I don’t suggest it, you can look online for a better image and upload it instead.

Writing Your Product Listing Description

[Image: adding a description to your eBay listing]

After you’re done with the images, it’s time to write the product description. I have written an HTML template (not included in this blog post) with a basic design that can be used. All you need to do is change the titles and ensure the description adequately describes the product.

Don’t be shy. If you feel like writing more text, do it!

Setting Up Your Auction Price

[Image: setting up your listing’s price]

I will not be getting into fixed price listings for now, because we will be focusing on liquidating caps at auction. If we decide to go forward and create an eBay store, we will most likely start creating fixed price listings instead. They are the only way to add item variations and showcase inventory for a specific product on eBay.

Remember to confirm with the big boss what starting prices he would like. Reserve prices cost money, and we currently don’t want to invest a penny in the sales.

Selecting payment, shipping and refund options

I have set default values for all these options. Until further notice, please just keep the default ones.

SEO

Don’t trust anchor text distribution tools provided by any of the popular backlink checkers

At my day job, I’ve been working on a dashboard to begin monitoring the backlink portfolio of some of our clients. The goal is to be able to provide a detailed report that takes into account our content promotion efforts as well as our clients’ natural link acquisition month by month. I’ve been using this opportunity to do my own backlink checker review. Unlike Matthew Woodward’s review, I wanted to focus on the anchor text distribution and destination URLs of all true live links. These two metrics are much more meaningful when it comes to deciding which offsite tactics to implement.

Since I have just started to perform my own tests, I don’t have any conclusive data as to which backlink checker is the best. Yet, I have not seen a single backlink checker that doesn’t have at least 10% to 20% dead backlinks in its reports. While that number may look relatively small, it can lead to some horrible conclusions if you’re trying to decide whether certain keywords or landing pages should be avoided.

Dead Links Often Skew the Numbers

Here’s an example to illustrate my point. I have a hypothetical client that Ahrefs reports as having exactly 1000 backlinks before I verify their validity using Scrapebox. Out of those links, 50% are said to have branded anchor text, 30% have targeted anchor text and the last 20% are a mix of brand-hybrid and targeted anchors. Since I’m interested in supporting my anchor text choices for content promotion campaigns, I take a deeper look at those 300 links with exact targeted text and discover that 100 of them are using “goat cheese” as an anchor. That would be a third of targeted links using the same short tail keyword! Most SEOs would then decide to focus on other short tail keywords for the sake of creating natural diversity. That would be wrong.

In almost every case where I noticed a certain anchor or landing page taking an unnaturally high percentage of the overall link profile or of one of its smaller subgroups, it was due to the presence of dead backlinks. In the example above, a mere 10% error would have caused most SEOs to avoid a potentially lucrative keyword. When it comes to data, your priority should always be accuracy, not speed of reporting.
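
To illustrate the idea, here’s a bare-bones PHP sketch of the counting step. It assumes a CSV export with the anchor text in the first column and the verified HTTP status in the second; the file name is made up:

<?php
// Sketch: anchor text distribution computed over live links only.
$counts = array();
$total = 0;
$fh = fopen('backlinks-checked.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
	list($anchor, $status) = $row;
	if ((int) $status !== 200) {
		continue; // dead links would skew the percentages
	}
	$counts[$anchor] = isset($counts[$anchor]) ? $counts[$anchor] + 1 : 1;
	$total++;
}
fclose($fh);
arsort($counts);
foreach ($counts as $anchor => $n) {
	printf("%s\t%d\t%.1f%%\n", $anchor, $n, 100 * $n / $total);
}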

Another New Project: An Anchor Text Distribution Tool

With the help of Scrapebox and Excel, I’ll be sharing with my small readership how I am creating my own anchor text distribution tool as part of my backlink portfolio management project.

SEO

Potential Google algorithm updates as imagined by a paranoid link builder in 2014

I’m not going to state that any of these potential Google algorithm updates will actually happen. Ever since Matt Cutts ranted about guest posts, I’ve been asking myself how Google will tweak itself so as to render guest posting even riskier than it currently is.

Recent posts by Sugar Rae and Aaron Wall got my brain juices flowing. As they both highlight, no scalable link building tactic has escaped being branded as spam by Google. This means that no link builder should assume any scalable tactic will remain risk free, even if it’s just reclaiming brand mentions.

So instead of ranting, I’ve decided to brainstorm about the various ways a Google algorithm update could systematically affect the links I build every day using tactics like guest posting and link reclamation.

If you think I’m an idiot, feel free to tell me in the comments.

Guest Blogging: Looking at Personas & Other Footprints

I’ve already mentioned before that personas leave footprints that can be used during manual reviews. As far as I know, Google has not begun mining data about specific guest bloggers but, in theory, it could be fairly easy to begin associating bloggers with specific domains.

For example, Nenad SEO exposed Abigail Clark as an Expedia link builder. Let’s scale his process:

Potential Algorithm Update #1

  1. Google begins associating authors to specific posts and taking note of all external links contained in that post.
  2. Google notices the author using optimized anchor text for specific domains at an unnatural rate.
  3. Google devalues all links mentioned by that author and penalizes the websites.

It doesn’t even really matter how good the content is or what website the author posts it on; Google can clearly argue that the links were placed for their SEO value. Yet, SEOs aren’t usually that stupid. We’ll vary our personas, in some cases never even using the same name twice, and make our anchor text appear natural. That wouldn’t be an issue for this update:

Potential Algorithm Update #2

  1. Google scrapes pages for mentions of “guest post,” “guest blog,” “guest author” and other important keywords and flags the pages.
  2. Google looks at the external links in the content of flagged pages.
  3. If a domain has a high number of backlinks on flagged pages with rich anchor text, it gets penalized.

The lesson here is to avoid any type of footprint when buying links. It seems like it might actually be less risky to buy links on recent content than actually guest post on high quality blogs.

Link Reclamation: Looking at Frequency of Page Updates

Link reclamation seems untouchable because it always brings value to the website. This is especially true when SEOs help webmasters fix broken links. Yet, reclaiming brand mentions also directs users to content that they may want to see.

We all know that bringing value is not what Google cares about. As soon as a link building tactic can game their algorithm, they will attack it. It will be no different for any type of link reclamation.

Here’s how I think they could do it:

Potential Algorithm Update #3

  1. Google flags pages with recent link modifications and takes note of the new URL.
  2. Google notices that many links are being modified to point to a new page on a specific domain.
  3. Google devalues the new links and penalizes the page or domain.

Potential Algorithm Update #4

  1. Google flags pages with recent link additions and takes note of the new URL.
  2. Google notices that many links are being created using preexisting anchor text on a massive number of domains.
  3. Google devalues the new links and penalizes the linked domain for gaming the system.

The lesson here is that even a safe tactic can be detected by Google and systematically penalized if they choose to do it.

It remains to be seen what actual algorithm updates Google will pull out in 2014. All I know is that SEOs better be prepared for the worst as usual.

SEO

Ditching Google won’t save you from the Google webspam team

There’s been a lot of talk recently about the Google webspam team monitoring your emails and the lengths Google will go to hunt blackhat SEOs, following posts by God of SEO and Agent Blackhat.

While I do agree that the findings and correlations they expose are problematic, I strongly believe that, for most SEOs, reducing the amount of information you give to Google will not slow the webspam team down much. Sure, you can avoid Chrome, use foreign IPs to connect to all your fake Google+ accounts and stop using Gmail. I just don’t think it matters all that much.

Without going into too much paranoia about dealing with multiple IP addresses and tracking cookies, here are just a few issues I have with those calling for an exodus from Google. I’d love to hear how wrong I am, because I seriously lack knowledge about online privacy and security issues.

[Image: Google webspam team]

Sending Emails to Gmail Accounts

If you stop using Gmail to prevent the Google webspam team from reading your emails, you have to realize that you should also avoid sending and receiving any incriminating emails from people who use Gmail. For link builders, this seems like a near impossible requirement just to avoid being caught by Google. What would stop Google from identifying one link seller using Gmail and destroying all his clients’ efforts, while keeping his site alive as a honeypot?

While I understand the paranoia, I just don’t think it makes sense. Someone really needs to do a case study comparing response rates using identical emails from different providers.

Backlink Footprints

The easiest way for the webspam team to identify individual link builders is to check for similar backlink profiles between websites. Depending on the level of automated link building used and the variety of niches your sites are in, it may be quite easy to identify a loose network of sites that should get penalized just by looking at similarities.

This doesn’t really have anything to do with the issues raised about the disavow tool. Most good link builders and link brokers know how to make their individual links look natural by placing them on contextually relevant domains and surrounding them with semantically relevant content. The problem happens when a link builder becomes lazy and decides to use the same domains over and over again for the same client.

Yet, I’m not sure how many SEOs spend the time to ensure that they are not building quasi-identical backlink portfolios for their sites. Someone should make a small tool for this in the future, because I haven’t found any good ones.
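
Until someone builds that tool, here’s the kind of quick check I have in mind, sketched in PHP. It assumes you’ve exported one referring domain per line for each site; the file names are made up:

<?php
// Sketch: measure how much two sites' backlink profiles overlap.
$a = array_unique(file('site-a-domains.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
$b = array_unique(file('site-b-domains.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
$shared = array_intersect($a, $b);
$union = array_unique(array_merge($a, $b));
printf("%d shared referring domains, %.1f%% overlap\n", count($shared), 100 * count($shared) / count($union));

A high overlap between two sites that have no business sharing links is exactly the kind of footprint the webspam team would look for.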

[Image: link building personas]

Using Multiple Personas & Retiring Them Early Can Help Mitigate The Risk

Let’s say you build links for a variety of travel clients and you’re abusing guest posts like an idiot; do yourself a favor and create multiple personas. If you never use the same persona for different clients, chances are only one of your clients will get penalized if Google decides they don’t like your persona’s guest posts. If you’re smarter, you’ll also vary personas depending on the quality and risk of the blog, ensuring that better links don’t get caught by a simple “John Doe ‘guest post’” Google search.

In any case, the smartest thing to do is to never abuse a domain for links and never do shady guest posts. Being smart and listening to Google is always the safest route, even if it usually doesn’t get you actual results in the short term. It’s really the only way to save yourself from dealing with the Google webspam team.

Coding SEO

Troubleshooting a massive link reclamation project

Now that the weekend is here, it’s time for some more development posts. My goals for this weekend are:

  • Create a site taking advantage of SimplePie.
  • Get a decent Magento eCommerce store running without bugs.
  • Continue learning Corona SDK.
  • Make my first attempt at learning SASS to style my <ul> and <ol> tags.

I’ll be learning how to use the SimplePie PHP library because I’ve recently begun preparing a massive link reclamation campaign for one of my biggest clients. Link reclamation is supposed to be one of the easiest link building tactics to scale, but so far it’s given me a headache.

According to everything I’ve read about preparing link reclamation campaigns, it’s as easy as setting up some Google Alerts. I’ve only gotten a bit further and I’ve started to run into three interesting issues. This is why you should never trust SEO blogs.

[Image: preparing a massive link reclamation with Google Alerts]

Millions of Potential Keywords to Monitor

The first thing I realized is that merely covering every single brand associated with my client would require over 100 Google Alerts, and that’s just to track brand mentions. If this sounds scary to you, learn how to use ReMouse. Any repetitive task can be done while you eat lunch if you use your brain and ReMouse.

In any case, the overabundance of potential exact brand mentions was a warning sign that I might have to deal with very dirty Google Alerts, but I decided to see what would happen. In other words, I’m lazy.

Combine RSS Feeds or Use a Reader?

The next step was simplifying my life by combining the RSS feeds together. I didn’t want to have to check each individual feed for quality sites. In addition, I wanted to be able to use the Copy All Links Firefox plugin to simplify checking SEO stats and scraping emails.

I knew I could use an RSS reader, but I wanted to see some alternatives first.

RSS Mix

The first service I found was RSS Mix. It’s free and simple to use. However, I didn’t want a public RSS feed. I also didn’t feel like wasting my time seeing whether the tactic below was doable with RSS Mix.

[Image: combining RSS feeds with RSS Mix]

Yahoo Pipes

The next tool I tested was Yahoo Pipes. The main advantage of Yahoo Pipes over RSS Mix was the ability to add filters to the feed. I’d be able to remove duplicate posts and YouTube videos. While Yahoo Pipes is really powerful, it seemed better suited to smaller tasks.

If you take a look at the image below, you can see that the filter can only take one pipe. Sure, I can merge up to 50 RSS feeds using the union filter, but it doesn’t leave much leeway for changes such as adding RSS feeds, removing RSS feeds, or identifying which RSS feeds are providing too much dirty data.

[Image: merging RSS feeds for link reclamation using Yahoo Pipes]

Even though I really want to learn how to use Yahoo Pipes in the future, I decided it was not right for the task at hand for the reasons stated above.

Digg Reader

In the end, it looked like a reader would be the fastest way to combine the RSS feeds so that I could start evaluating the data. Setting up Digg Reader was fast, but it still didn’t feel quite right. The reader does not allow you to filter the feeds nor is it optimized for Copy All Links.

SimplePie

I decided it might be best to use some personal time to get comfortable with SimplePie. Not only will it make it easy to merge the RSS feeds, I’ll also be able to filter them and display them to my exact specifications.

This is why I was happy that the weekend was coming up. At work, I don’t have the time nor the billable hours to waste on trying to make an idea a reality using a PHP library like SimplePie. In fact, I don’t even really get to code anything. I’m not sure how far I will get this weekend, but I’ll do my best to document my efforts with SimplePie and setting up massive link reclamation campaigns in a future post.

Abundance of Irrelevant Data

My intuitions about dirty data were confirmed after pulling the data into Digg Reader. I was getting alerted to a lot of brand mentions that would never lead to links.

[Image: dirty Google Alerts results]

The current setup created dirty data for three reasons:

  1. I ignored advanced search operators.
  2. I selected to receive all results instead of only the best.
  3. I did not segment queries using Google Blog or Google News alerts.

In any case, making my queries a bit more specific and using SimplePie to remove duplicates and alerts from sources that I have deemed useless for link building should hopefully solve all my problems.
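
Here’s roughly what that filtering will look like in SimplePie once it’s wired up. A sketch: it assumes $feed is a SimplePie object already initialized with all the alert feeds, and the blacklisted hosts are just examples to build on as the dirty data rolls in.

<?php
// Sketch: skip duplicate URLs and alerts from sources deemed useless for link building.
$blacklist = array('www.youtube.com', 'www.pinterest.com'); // example hosts
$seen = array();

foreach ($feed->get_items() as $item) { // $feed initialized as in my introductory posts
	$url = $item->get_permalink();
	$host = parse_url($url, PHP_URL_HOST);
	if (isset($seen[$url]) || in_array($host, $blacklist)) {
		continue; // duplicate item or useless source
	}
	$seen[$url] = true;
	echo '<a href="' . $url . '">' . $item->get_title() . '</a><br>';
}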

SEO

Are link building personas deceptive?

I’m not going to lie and say that using link building personas isn’t sketchy. Trying to lie to yourself with half-assed arguments claiming personas are neither fundamentally deceptive nor necessarily deceptive won’t make those statements any truer.

Fundamentally Deceptive, Neutral or Authentic?

Michael Martinez of SEO Theory argues that personas are a legitimate way to represent yourself in specific online environments. They are real in so far as they are an extension of yourself online. Therefore, he states that there is no deception happening when you use a persona to represent yourself online. Link building personas are not fundamentally deceptive.

Martinez does admit that creating a persona that impersonates someone who actually exists is a deceptive representation of that person. However, it’s not deceptive to represent myself as Tina Samosa using a female stock photo. Tina is a real, not deceptive, representation of myself in the online ecosystem.

According to Martinez, a.k.a. SEO Theory, Tina Samosa is just like a pen name, like GoogleGuy on WebmasterWorld or the names that Popes take when they are chosen. The main issue here is that we know who these latter personas represent. No one except me would know that Tina is actually a man named Philip Tomlinson.

So why, according to Martinez, do we often think link building personas are deceptive? It’s all about communication. If Tina Samosa tries to seduce men online, the deception happens not in my representation of myself but in my communication with others. The problem is rooted in my use of the persona, not its creation.

Deceptive representation versus deceptive communication is not a coherent way to explain our intuition. In fact, representation almost always communicates something, regardless of the creator’s intention. The mere presence of Tina communicates to others that there exists a woman named Tina and that this is what she does online. She does not even have to actively communicate with others online, or to have been created to express that specific message; the mere existence of her profile communicates a message. And if Tina’s online existence communicates this message, her representation is fundamentally deceptive.

I’m not even going to go into his arguments about why personas are real, because it actually makes no difference to his argument. Reality is often deceptive; just look at optical illusions.

Can Link Building Personas Be Deception Free?

I’m not even sure why I’m going to go over how Megan Brown from iAquire believes we can use link building personas in a non-deceptive way. She even goes as far as to say that SEOs have a responsibility to themselves, their clients and the industry to avoid all deception.

She argues that if you use your real name and picture, but invent all other aspects to better connect with your target niche, your persona is not deceptive as long as you don’t steal or impersonate.

[Image: link building personas are deceptive]

Here are 11 non-deceptive tactics, according to Brown, that you should use with your new persona:

  1. Make it look like you are part of your client’s company.
  2. Don’t mention you are an SEO.
  3. Spend time researching who your persona’s friends should be.
  4. Learn the language to fit in.
  5. Give a new personality to your persona.
  6. Create a background for your persona that includes likes and dislikes outside of the vertical.
  7. Don’t forget your back story.
  8. Don’t forget what you dislike and like.
  9. Be consistent and explain changes using personal life tweets.
  10. Don’t add your friends, people you care about, family or work unless they are influencers.
  11. Write a guideline for your persona so you can get someone else to be you when you’re busy.

Nothing on this list is authentic. From what I understand, Megan Brown believes it’s not deceptive to create a new you to better fit into a new vertical. Is it really authentic to learn a new jargon, plan out who your friends will be, build relationships based on ROI, create a new biography with new likes and new dislikes, and kill the old you? It isn’t.

An Armchair Psychologist Conclusion

The ethics of link building is part of the white hat versus black hat debate which has polluted the internet for quite some time. SEO Theory and iAquire have found a sneaky way to rehash this controversy by hiding it under a new veil.

Martinez argues that black hat personas that spam your blog should not be called fake people but spam people. True marketers do not use spam people; they use non-deceptive personas. Brown claims that only great agencies use personas non-deceptively, and that’s why they don’t care about leaving footprints tied to their real names. It’s not about sheer laziness at all. And it’s too bad Megan Brown is such a generic name that it’s being used by multiple guest posters.

If you actually want to learn about using personas without all those ethical hang-ups, Kaiser does a good job of sticking to the facts here.

PHP SEO WordPress

Modifying the length of Yoast Breadcrumbs

If you don’t know Yoast’s WordPress SEO Plugin, you’re not a real SEO. Sure, he puts a link in your Sitemap, but it’s a small price to pay for an amazing WordPress plugin.

As a reward for finishing up two posts, I decided to spend some time making some initial design changes to Bones. Most of it was pretty simple. Yet, I did have to pause to fix a minute detail that bugged me about the SEO plugin’s auto-generated breadcrumbs: if the title of a blog post exceeded a certain number of characters, the Yoast breadcrumbs would wrap onto two lines. I decided to take a look at how I could shorten the titles in the breadcrumbs when they went over a certain number of characters.

Getting everything to fit on one line wasn’t as easy as it looked.

How to Modify the Length of Yoast Breadcrumbs in WordPress

The first step is as simple as getting the string that is generated by the plugin as follows.

<?php if ( function_exists('yoast_breadcrumb') ) {
	// Grab the breadcrumb as a string instead of echoing it directly.
	$yoast = yoast_breadcrumb("","",false);

Then, you just need to create an if statement that takes into consideration the length of the current breadcrumb string. You’ll notice that the length is remarkably long for something that usually displays relatively little text. That’s because the breadcrumb string also includes HTML markup, and you’ll have to truncate all of it.

You might have to test different lengths depending on the width of your content div.

	if(strlen($yoast) < 585) { ?>
		<p id="breadcrumbs">
			<?php echo $yoast; ?>
		</p>
	<?php } else { ?>
		<p id="breadcrumbs">
			<?php // Cut the string, HTML markup included; the open tags get closed below. ?>
			<?php echo substr($yoast, 0, 585)."..."; ?>

As you might have guessed, the last piece of the puzzle is to close the tags that were cut off by only using a substring of $yoast. If you did not select the option to bold the current page in the breadcrumb settings, you can omit the </strong>.

			</strong>
			</span>
		</p>
<?php } } // close the else block and the function_exists() check ?>