Digital Marketing Mastery: Insights, Strategies, and Tactics for Success

What is Cloaking?

Cloaking is a search engine optimization (SEO) technique used by webmasters to deceive search engine crawlers by presenting different content to the search engine than is presented to users. Search engines such as Google consider this a violation of their webmaster guidelines, and sites caught doing it face penalties. Current digital marketing strategies must avoid cloaking, and SEO experts warn against it. When Googlebot, Google's web crawler (the software that 'crawls' the web to discover new and updated pages for Google's index), detects cloaking, the site can be penalized or removed from the index. It is therefore necessary to be able to recognize cloaking and avoid it.

Cloaking works by delivering content based on the IP address or the User-Agent HTTP header of the client requesting a page. When a request is identified as coming from a web crawler, or 'spider,' a server-side script delivers a different version of the page than it delivers to ordinary visitors, containing content that never appears on the page users see. Webmasters who practice cloaking typically do it to trick a search engine into ranking a page that would not otherwise rank. Google's algorithms can detect cloaking, and the consequences are severe: Google may remove a site that uses cloaking entirely from its index. Cloaking is closely associated with doorway pages, which are web pages created for spamdexing. Spamdexing, or search engine spamming, is content created solely to manipulate a search engine. Doorway pages are also referred to as portal pages, jump pages, or gateway pages.
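To make the mechanism concrete, here is a minimal sketch of the kind of server-side branching cloaking relies on, written in plain Python so the pattern is easy to recognize. The bot tokens and page bodies are hypothetical examples; this illustrates the pattern to watch for, not a real implementation.

```python
# Illustrative sketch of the server-side branching behind cloaking.
# The bot tokens and page bodies below are hypothetical examples.

CRAWLER_TOKENS = ("googlebot", "bingbot", "duckduckbot")

def is_crawler(user_agent: str) -> bool:
    """Guess whether a request comes from a search engine crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def choose_page(user_agent: str) -> str:
    """Return a different page body depending on who is asking.

    Serving crawlers content that human visitors never see is
    exactly what search engines penalize as cloaking."""
    if is_crawler(user_agent):
        return "<html>Keyword-rich page shown only to crawlers</html>"
    return "<html>The page ordinary visitors actually see</html>"

# A crawler and a browser receive different content.
print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(choose_page("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
```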

Methods of cloaking

There are a number of cloaking techniques; all of them violate search engine guidelines and should be avoided.

User-Agent cloaking

This method delivers different versions of a website based on the User-Agent header: requests whose User-Agent string identifies a known search engine crawler receive one version of a page, while ordinary browsers receive a totally different version. A closely related variant, IP delivery, makes the same decision based on the requester's IP address, serving one version to known crawler address ranges and another to everyone else.
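A sketch of the IP-based variant, using Python's standard ipaddress module; the crawler ranges below are made-up placeholders, since real crawler IP ranges are published by each search engine and change over time.

```python
import ipaddress

# Hypothetical placeholder ranges; real crawler ranges are published
# by each search engine and change over time.
CRAWLER_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),   # example Googlebot-style range
    ipaddress.ip_network("157.55.39.0/24"),   # example Bingbot-style range
]

def is_crawler_ip(client_ip: str) -> bool:
    """Check whether the client address falls in a known crawler range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in CRAWLER_NETWORKS)

def choose_page(client_ip: str) -> str:
    """IP delivery: the same deceptive branching, keyed on address."""
    if is_crawler_ip(client_ip):
        return "crawler-only version"
    return "visitor version"

print(choose_page("66.249.66.1"))   # -> crawler-only version
print(choose_page("203.0.113.7"))   # -> visitor version
```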

HTTP_REFERER cloaking

Though less common, this type of cloaking exploits the fact that web crawlers such as Googlebot rarely send the HTTP_REFERER (Referer) header, so requests that arrive without it are assumed to be crawlers and are served a different version of the page.
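A sketch of the referer-based check, again with hypothetical page bodies; here headers is assumed to be a dictionary of incoming HTTP headers.

```python
def choose_page(headers: dict) -> str:
    """HTTP_REFERER cloaking: requests without a Referer header are
    assumed to be crawlers and get a different page. The heuristic is
    crude -- many human visitors also arrive without a referer --
    which is one reason this variant is less common."""
    referer = headers.get("Referer", "")
    if not referer:
        return "crawler-only version"
    return "visitor version"

print(choose_page({}))                                  # treated as a crawler
print(choose_page({"Referer": "https://example.com"}))  # treated as a visitor
```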

JavaScript cloaking

This method targets JavaScript-enabled browsers, showing one version to them and a different version to users who have JavaScript turned off. It works because search engine crawlers typically do not execute JavaScript.
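A minimal sketch of what this looks like in the page itself; the markup below is a hypothetical example, printed from Python to stay consistent with the other sketches. The static HTML carries the version a non-JS crawler indexes, and an inline script swaps in different content for JavaScript-enabled browsers.

```python
# Illustrative sketch of JavaScript cloaking: the static HTML (what a
# non-JS crawler indexes) differs from what a JS-enabled browser renders.
# All content below is a hypothetical example.

PAGE = """<html><body>
  <div id="content">Keyword-stuffed text a non-JS crawler would index.</div>
  <script>
    // Runs only in JavaScript-enabled browsers, replacing the
    // crawler-facing content with what human visitors actually see.
    document.getElementById("content").innerHTML =
        "The page ordinary visitors actually see.";
  </script>
</body></html>"""

print(PAGE)  # both audiences receive this HTML, but render it differently
```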

Black hat cloaking

Black hat cloaking typically refers to unethical methods used purely to obtain higher rankings; it is considered deceptive because it violates search engine guidelines. This type of cloaking relies on aggressive SEO tactics such as invisible text and keyword stuffing.
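For recognition purposes, here is what a classic invisible-text tactic looks like in markup; the snippet is a hypothetical example, printed from Python to stay consistent with the other sketches.

```python
# Classic "invisible text": keyword-stuffed copy styled so human
# visitors cannot see it, while crawlers still read it. Hypothetical.
SNIPPET = """<body style="background:#fff">
  <p style="color:#fff; font-size:1px">
    cheap flights cheap flights cheap flights best cheap flights
  </p>
</body>"""
print(SNIPPET)
```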

How to avoid it

Googlebot’s crawl process starts with a list of URLs generated from previous crawls, augmented with sitemap data provided by webmasters. As Googlebot visits each website, it detects the links on each page and adds them to its list of pages to crawl. This process tells Googlebot which sites are new, which links are dead, and when changes have been made to existing websites, and Google’s index is updated accordingly. The same process surfaces cloaking and other illicit webmaster practices, and Google may take action by removing a site from its index.
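The crawl loop itself is simple to sketch. Below is a toy version of the idea using Python's standard library; the seed URL and the naive link-extraction regex are simplified stand-ins for what a production crawler like Googlebot actually does.

```python
import re
import urllib.request
from collections import deque
from urllib.parse import urljoin

def crawl(seed_urls, max_pages=10):
    """Toy crawl frontier: fetch a page, harvest its links, queue them.
    A real crawler adds politeness, robots.txt handling, rendering,
    and freshness/change detection on top of this loop."""
    frontier = deque(seed_urls)
    seen = set(seed_urls)
    while frontier and len(seen) <= max_pages:
        url = frontier.popleft()
        try:
            html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # a dead link; a real crawler records this for the index
        # Naive link extraction; real crawlers parse HTML properly.
        for href in re.findall(r'href="([^"#]+)"', html):
            link = urljoin(url, href)
            if link.startswith("http") and link not in seen:
                seen.add(link)
                frontier.append(link)
    return seen

print(crawl(["https://example.com"]))
```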

All cloaking creates a negative user experience and is therefore frowned upon by all search engines. In fact, Matt Cutts, head of the webspam team at Google, has said that Google considers all cloaking to be ‘black hat,’ meaning none of it qualifies as ‘white hat,’ or harmless. If Google finds these or other illicit practices, it will often remove your site entirely from the Google index. Your site can also be hurt by an algorithmic demotion or a manual spam action; a site affected by a spam action may no longer appear in Google’s results or on any of its partner sites.

SEO experts recommend reading search engine guidelines, such as Google’s quality guidelines, in order to stay within the rules. Google has made it clear that its quality guidelines are not comprehensive. Because it regards all cloaking and other deceptive practices as unethical, Google is on constant lookout for illicit behavior from webmasters, and it frequently changes its algorithms to discourage and stop cloaking and other techniques that create a negative user experience. It is important to understand that just because Google’s quality guidelines do not mention a particular type of cloaking or other deceptive behavior, that does not mean Google won’t act on it. In fact, new methods of hacking, redirecting, cloaking, and other deceptive practices are developed all the time, and Googlebot is constantly being updated to detect them.
