Directory submissions were once a valid, white hat SEO strategy. A directory offered a site where content could be placed and linked back to another website. At the time, it was an easy way to get lots of inbound links, which made it attractive for SEO purposes. However, with recent updates to Google's algorithm, specifically the Penguin update, a once popular and legitimate practice has become black hat.
How time changes things
As Google's algorithm has advanced, the way links are weighted in determining a site's ranking has changed drastically. There was a time when the more links pointed to your site, the better its chance of ranking well. With the most recent changes, however, Google now looks for links that are relevant and come from high quality sites. Accumulating links indiscriminately, no matter how many, is no longer white hat SEO. When one site links to another, it acts like a vote telling Google the site is worth visiting. Through directories, though, website owners were essentially "voting" for their own sites. This flood of self-serving links is one of the reasons Google made these algorithmic changes in the first place. Links from directories are no longer considered high quality or relevant, and sites with low quality links are likely to receive penalties, which is counterproductive for search engine optimization.
What is an article directory?
An article directory is a site where users post short articles along with an author bio. The article itself usually contains no anchor text, but the author may include a limited number of links, typically up to three, in the bio. Articles are uploaded with two goals in mind: the author wants to publicize their site, hence the links in the bio, and hopes someone will like the article enough to republish it on their own site. This creates several problems. First, the content on these sites is typically low quality; according to Google's own Matt Cutts, they tend toward being "spammy." Beyond providing links with no relevance, the duplicate content they scatter across the web is not beneficial. Even though this practice worked well for SEO just a few years ago, it is not recommended today, and it certainly is not an effective way to build beneficial links. Cutts goes so far as to say these practices simply are not recommended for building links.
What to Do with Directory Links
The website owner or SEO specialist is responsible for making sure that all inbound links to a site are legitimate, so checking inbound links should be a regular practice. A site's inbound links can be pulled from a Google Webmaster Tools account, and paid alternatives such as Moz offer tools that do the same. Using more than one tool helps ensure that all inbound links are found. Once you have the list, go through it and analyze each link. Hopefully there are a few sites you'll recognize as "good," such as a site where you have guest blogged or comment regularly; these are precisely the high quality links Google is looking for. Links from sources you don't immediately recognize are the ones that need further investigation.
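The triage described above can be sketched in a few lines of code. This is only an illustrative example: the domain names and URLs below are hypothetical placeholders, and the "known good" list stands in for whatever sites you personally recognize from guest blogging or regular commenting.

```python
# Illustrative sketch: split a list of inbound link URLs into ones you
# already recognize as legitimate and ones that need further review.
# All domains here are hypothetical placeholders, not real data.
from urllib.parse import urlparse

# Domains you know are "good" -- e.g. blogs you guest post on (hypothetical)
KNOWN_GOOD = {"exampleblog.com", "industrynews.example"}

def triage_links(inbound_urls):
    """Return (recognized, needs_review) lists of inbound link URLs."""
    recognized, needs_review = [], []
    for url in inbound_urls:
        # Normalize the domain: lowercase and drop a leading "www."
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        (recognized if domain in KNOWN_GOOD else needs_review).append(url)
    return recognized, needs_review

links = [
    "https://www.exampleblog.com/guest-post",
    "https://free-directory.example/listing/123",
]
good, review = triage_links(links)
```

Anything landing in the `review` bucket is what you would investigate by hand, checking whether the linking site is a directory or another low quality source.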
Creating Legitimate Links
Google has been pushing site owners to gain their links naturally, or organically, and is penalizing sites that use link building tactics such as directories. There are better ways to earn valuable, relevant links. There is no need today to request links or to use spammy methods to obtain them. Instead, create good content that engages site visitors and gives them something of value they want to share with others. Good, shareable content generates exactly the kind of high quality links Google is looking for today.