Cloaking is a black hat SEO technique in which the content presented to the search engine is different from that delivered to the user's browser. It can be defined as a technique used to deliver different web pages under different circumstances.
The primary reasons people use page cloaking are:
i) It lets site owners create one search-engine-friendly page for the search engine and another, attractive and well-designed page for visitors. When the search engine crawls the site, it is served the page built for it; when human visitors arrive, they are served the pages designed for them. Visitors never see the pages built for search engines, which are meant for crawlers only and may contain heavy repetition of keywords or key phrases.
ii) The SEO-optimized code of the pages is hidden from everyone except the search engines that index it. Competitors therefore cannot see the targeted keywords or key phrases, and so cannot copy the code.
Implementation Of Page Cloaking:
A cloaking script is installed on the server to detect whether a web page is being requested by a search engine or by a human visitor, and the appropriate version of the page is then served accordingly.
How, then, does the cloaking script detect who is visiting the website?
i) The simplest way is to check the User-Agent header. Whenever anyone (a human or a search engine spider) requests a page from a site, the request reports a User-Agent name to the site, and a spider's User-Agent contains the name of its search engine. If the script finds a known search engine name in the User-Agent, it delivers the properly optimized page; otherwise it delivers the page designed for humans.
ii) Another, more complex method is I.P.-based cloaking. The script maintains a database containing the I.P. addresses of all the major search engines. Each time a web page is requested, the script checks the requester's I.P. address: if the address is listed in the database, the request is treated as coming from a search engine; if not, the page designed for visitors is served. I.P.-based cloaking is more reliable, since a User-Agent string is far easier to spoof than an I.P. address.
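The detection logic described above can be sketched in a few lines of Python. This is an illustration only, not a recommendation: the function names, the sample I.P. addresses, and the User-Agent substrings below are all assumptions made for the example, not taken from any real cloaking product or crawler list.

```python
# Illustrative sketch of a cloaking script's detection step.
# The I.P. addresses and agent substrings below are made-up examples.

# Hypothetical database of search engine crawler I.P. addresses.
CRAWLER_IPS = {"203.0.113.10", "198.51.100.25"}

# Substrings that identify a spider in its User-Agent string.
CRAWLER_AGENTS = ("googlebot", "bingbot", "slurp")

def is_search_engine(user_agent: str, ip_address: str) -> bool:
    """Return True if the request appears to come from a search engine.

    The I.P. check runs first because it is more reliable; the
    User-Agent check serves as a fallback.
    """
    if ip_address in CRAWLER_IPS:
        return True
    return any(name in user_agent.lower() for name in CRAWLER_AGENTS)

def choose_page(user_agent: str, ip_address: str) -> str:
    """Pick which version of the page to deliver for this request."""
    if is_search_engine(user_agent, ip_address):
        return "optimized_page.html"  # keyword-heavy page for spiders
    return "visitor_page.html"        # well-designed page for humans
```

For example, a request with User-Agent `Mozilla/5.0 (compatible; Googlebot/2.1)` would receive the optimized page, while an ordinary browser request from an unlisted I.P. would receive the visitor page.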
Should You Use Page Cloaking or Not?
Search engines do not like websites that use page cloaking, and they may even ban your site if they discover that you engage in it. The reason is that cloaking prevents search engines from indexing the same pages that visitors actually see, so they cannot deliver accurate results to their users. If search engines tolerated page cloaking, users would lose trust in their results and eventually abandon them.