1. Hidden Content
Top of our list of black hat SEO techniques is hidden content. Hidden content comes in many forms, but the basic principle is the same: the code of the page contains content stuffed with keywords, and that content is never visible to the end user of the site.
One way of doing this is by using comment tags.
Comment tags look like this:
<!-- Comment Tag -->
The real purpose of comment tags is to let developers/webmasters add useful reminders within their code explaining what a given piece of code does.
Here’s an example of the comment tag being used correctly:
<!-- Start of the Main Content -->
Here’s an example of a comment tag being used in a black hat manner, in a bid to promote a hypothetical page targeting search engine optimization:
<!-- Search engine optimization, SEO, professional search engine optimization company, spamming search engines -->
Another popular way of hiding content is the use of the <noscript> tag. The <noscript> tag is meant to inform a user that a script is being used but their browser either doesn’t support the scripting language or has scripting turned off.
Here’s an example of the <noscript> tag being used correctly:
<script type="text/javascript">
<!--
document.write("Hello World!")
//-->
</script>
<noscript>Your browser does not support JavaScript!</noscript>
Here’s an example of the <noscript> tag being used as a black hat SEO technique, again in a bid to promote a hypothetical page, this time targeting car rentals:
<noscript>
AJAX Car Rental Company does Car Rental which is very affordable so if you want to hire a car call our car rental firm because we are the best car hire rental in the world
</noscript>
Other HTML tags misused in similar ways include the <noframes> tag and hidden inputs in forms.
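Here’s a hypothetical illustration of how those two are abused (the keyword text is invented for this example):

```html
<!-- keyword-stuffed <noframes> block, invisible in any frames-capable browser -->
<noframes>
cheap car rental, car rental, best car rental company, car hire
</noframes>

<!-- a hidden form input stuffed with keywords the user never sees -->
<input type="hidden" name="description" value="car rental, cheap car rental, car hire" />
```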
Content can also be hidden from the end user with CSS: for example, with excessively small text, or with text coloured to match the background it sits on.
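A minimal illustration of those CSS tricks on a hypothetical page (shown only so they can be recognised, not copied):

```html
<!-- display:none removes the block from view entirely -->
<div style="display:none">cheap flights cheap flights cheap flights</div>

<!-- 1px text is effectively invisible to the reader -->
<p style="font-size:1px">best hotel deals best hotel deals</p>

<!-- white text on a white background -->
<p style="color:#ffffff; background-color:#ffffff">car hire car hire car hire</p>
```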
All of these techniques are frowned upon by search engines and, if detected, can mean your website is penalised or even banned. To the untrained eye, some of them can be very difficult to spot.
2. Meta Keyword Stuffing
There are two meta tags that are generally used to inform search engines about the content of a page. They reside within the <head> section of the page, and when misused they can alert a search engine that a site is using spam techniques in an attempt to improve its ranking.
Meta Description
The meta description should describe the content of your page honestly and concisely, in one or two sentences, three at most.
Here’s an example of the meta description being used in the correct manner:
<meta name="description" content="CoJV is an Online Marketing agency providing a full range of digital marketing services throughout Greater Chicago and all of the Midwest. If you need Search Engine marketing (SEM), Search Engine Optimization (SEO) or Pay per Click (PPC), we can help you. Contact us now." />
Here’s an example of the meta description tag being used in a black hat manner for a page promoting a restaurant called “Imaginary”:
<meta name="description" content="Imaginary restaurant website is the best Imaginary restaurant website, our restaurant is better than any restaurant,great restaurant,best food restaurant,visit our restaurant" />
3. Meta Keywords
Meta keywords should be a short list of words that indicate the main focus of the page. They have been so misused in the past that few if any search engines still take any heed of them.
Here’s an example of the meta keywords being used in the correct manner:
<meta name="Keywords" content="Online marketing, digital marketing, search marketing, search engine marketing, e-mail marketing, SEO" />
Here’s an example of the meta keywords tag being used in a black hat manner for a page promoting a restaurant called “Imaginary”:
<meta name="keywords" content="Restaurant,restaurants,food,feed,take away food,fast food,junk food,eat,eating out,dinner,dining,meal,eating,Imaginary,steak and chips,chicken and chips,pie and chips,pudding,dessert,big restaurant,small restaurant,best restaurant,great restaurant, exclusive restaurant,cocktails,wine,drink,pizza,salads" />
4. Doorway or Gateway Pages
Doorway or gateway pages are pages designed for search engines rather than for the end user. They are essentially fake pages, stuffed with content and highly optimised for one or two keywords, that link to a target or landing page. The end user never sees them, because visitors are automatically redirected to the target page.
Low Tech Delivery
There are various ways to deliver doorway pages. The low-tech way is to create and submit a page that is targeted toward a particular phrase. Some people take this a step further and create a page for each phrase and for each search engine.
One problem with this is that these pages tend to be very generic. It’s easy for people to copy them, make minor changes, and submit the revised page from their own site in hopes of mimicking any success. Also, the pages may be so similar to each other that they are considered duplicates and automatically excluded by the search engine from its listings.
Another problem is that users don’t arrive at the goal page. Say they did a search for "golf clubs," and the doorway page appears. They click through, but that page probably lacks detail about the clubs you sell. To get them to that content, webmasters usually propel visitors forward with a prominent "Click Here" link or with a fast meta refresh command.
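The “fast meta refresh” mentioned above is a single tag in the doorway page’s <head> that forwards the visitor straight to the real page (the URL here is a made-up placeholder):

```html
<head>
  <!-- a zero-second refresh sends the visitor immediately to the target page -->
  <meta http-equiv="refresh" content="0; url=http://www.example.com/golf-clubs.html" />
</head>
```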
By the way, this gap between the entry and the goal page is where the names "bridge pages" and "jump pages" come from. These pages either "bridge" or "jump" visitors across the gap.
Some search engines no longer accept pages using fast meta refresh, to curb abuses of doorway pages. To get around that, some webmasters submit a page, and then swap it on the server with the "real" page once a position has been achieved.
This is "code-swapping," which is also sometimes done to keep others from learning exactly how the page ranked well. It’s also called "bait-and-switch." The downside is that a search engine may revisit at any time, and if it indexes the "real" page, the position may drop.
Another note here: simply taking meta tags from a page (“meta jacking”) does not guarantee a page will do well. In fact, sometimes resubmitting the exact page from another location does not gain the same position as the original page.
There are various reasons why this occurs that go beyond the scope of this article, but the key point is that you aren’t necessarily finding any “secrets” by viewing source code, nor are you necessarily giving any away.
Agent Delivery
The next step up is to deliver a doorway page that only the search engine sees. Each search engine reports an "agent" name, just as each browser reports a name.
The advantage of agent name delivery is that you can send the search engine to a tailored page while directing users to the actual content you want them to see. This eliminates the “bridge” problem altogether. It also has the added benefit of “cloaking” your code from prying eyes.
Well, not quite. Someone can telnet to your web server and report their agent name as being from a particular search engine. Then they see exactly what you are delivering. Additionally, some search engines may not always report the exact same agent name, specifically to help keep people honest.
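Agent name delivery boils down to branching on the User-Agent string of each request. A minimal sketch in JavaScript, where the agent substrings and page names are illustrative assumptions, not a real spider list:

```javascript
// Known search engine agent substrings (illustrative list, not exhaustive).
const SPIDER_AGENTS = ["Googlebot", "Slurp", "bingbot"];

// Decide whether a request's User-Agent looks like a search engine spider.
function isSpider(userAgent) {
  return SPIDER_AGENTS.some((name) => userAgent.includes(name));
}

// Pick which page to serve: the keyword-stuffed doorway for spiders,
// the real content for everyone else.
function pageFor(userAgent) {
  return isSpider(userAgent) ? "doorway.html" : "real-content.html";
}
```

As the paragraph above notes, this check is trivially fooled: anyone can send a request claiming to be “Googlebot” and see the doorway page.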
IP Delivery / Page Cloaking
Time for one more step up. Instead of delivering by agent name, you can deliver pages to the search engines by IP address, assuming you’ve compiled a list of spider IP addresses and keep it maintained.
Everyone and everything that accesses a site reports an IP address, which is often resolved into a host name. For example, I might come into a site while connected to AOL, which in turn reports an IP of 199.204.222.123 (FYI, that’s not real, just an example). The web server may resolve the IP address into a host name: ww-tb03.proxy.aol.com.
If you deliver via IP address, you guarantee that only something coming from that exact address sees your page. Another term for this is page cloaking, with the idea that you have cloaked your page from being seen by anyone but the search engine spiders.
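The same branching sketch works for IP delivery, only keyed on the remote address instead of the agent name. The IP values and page names below are invented for illustration:

```javascript
// Spider IP addresses you have compiled and maintain (purely illustrative values).
const SPIDER_IPS = new Set(["66.249.66.1", "157.55.39.10"]);

// Serve the cloaked page only when the requesting IP is a known spider;
// every other visitor gets the normal page.
function pageForIp(remoteIp) {
  return SPIDER_IPS.has(remoteIp) ? "optimised-for-spiders.html" : "normal.html";
}
```

Unlike agent names, a visitor cannot simply claim a spider’s IP address in a request header, which is why this variant is harder to detect from outside.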
5. Link Farming
Link farms or free-for-all (FFA) pages have no purpose other than to list links to unrelated websites. Many service providers promise to boost your link popularity by automatically entering you into link exchange programs they operate, often linking your page with websites that have nothing to do with your content. However, search engines such as Google consider link farming a form of spam and have been implementing procedures to banish sites that participate in it, so be careful.