Top Dirty Linking Tricks

Posted on 2024-04-10 12:53:46 by ascender in category SEO

Search engines use links to determine authority and relevance to particular keywords and phrases. For both ecommerce SEO and local SEO, reciprocal linking, or link trading, remains one of the most commonly used methods of acquiring links. Unfortunately, it can also leave your link partner with a better deal than you receive. Use the following top linking tricks to determine whether you're getting a fair deal.

Meta Tag Masking
 
This old trick simply uses server-side scripting such as Perl, PHP, or ASP to hide the Meta tags from browsers while allowing search engines to actually see them.

You can catch this trick by viewing the search engine's cache of the page and/or using a tool to view the page as Google or Bing.
 
Robots Meta Instructions
 
Using noindex and nofollow attributes lets the novice link partner see the visible page with their link while telling the search engines to ignore the page and the links found on it. Nofollow can also be used while still allowing the page to be indexed, which gives the impression that the search engines will eventually count the link.

You can catch this trick by viewing the search engine's cache of the page and/or using a tool to view the page as Google or Bing.
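Beyond checking the cache, you can inspect the page source directly. The sketch below uses Python's standard-library HTML parser to pull the directives out of a page's robots Meta tag; the HTML string is a made-up example for illustration.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "robots":
                content = a.get("content") or ""
                self.directives += [d.strip().lower() for d in content.split(",")]

# Hypothetical partner page: the link is visible, but the Meta tag kills it.
html = ('<html><head><meta name="robots" content="noindex, nofollow"></head>'
        '<body><a href="https://example.com/">Your link</a></body></html>')
parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)  # ['noindex', 'nofollow']
```

If either noindex or nofollow shows up in the output, the page is telling search engines to discount your link.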
 
Rel=nofollow Attributes
 
When first recommended by Google, this attribute value was not part of the HTML standards. Since the drafts of HTML5, the world has recognized "nofollow" as an accepted link type in the standards. The value indicates that the author of the linking page does not endorse the target page or that the link represents an advertisement for the target page.

In most cases, you can look at the source code of the linking page and identify if the link contains the rel="nofollow" attribute. Verify your findings by examining Google's cache of the page; sometimes people use scripting to exclude the rel="nofollow" when a normal browser requests the page.
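Scanning the source by hand works, but a short script makes the check repeatable. This sketch, again using the standard-library HTML parser on an invented snippet, records each link's href together with whether its rel attribute contains "nofollow".

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Maps each <a> href to True if the link carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = {}

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            rels = (a.get("rel") or "").lower().split()
            self.links[a.get("href")] = "nofollow" in rels

# Hypothetical linking page with one nofollowed and one normal link.
page = ('<p><a href="https://example.com/" rel="nofollow">Trade link</a> '
        '<a href="https://example.org/">Normal link</a></p>')
checker = NofollowChecker()
checker.feed(page)
print(checker.links)
```

Run the same check against both the live page and the cached copy; a mismatch suggests the scripting trick described above.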
 
Dynamic Listing
 
Dynamic listings result from the "fairness" idea: links are automatically rotated among multiple positions within the category, giving everyone a chance at being seen. The link positions change upon each pageview. To search engines, the content appears to change normally, so they return more often. However, your link moves within the spectrum and provides no real value to you.

You can catch this trick by refreshing the page and watching the links change positions, disappear from the page, and reappear on the page.
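A rotation is easy to confirm programmatically: capture the link order on two visits and compare. This minimal sketch assumes you have already extracted the hrefs from each pageview into lists (the URLs here are placeholders).

```python
def links_rotated(snapshot_a, snapshot_b):
    """True if both snapshots hold the same links but in a different order."""
    return sorted(snapshot_a) == sorted(snapshot_b) and snapshot_a != snapshot_b

# Link order seen on two successive refreshes of a hypothetical links page.
first = ["https://a.example/", "https://b.example/", "https://c.example/"]
second = ["https://c.example/", "https://a.example/", "https://b.example/"]
print(links_rotated(first, second))  # True
```

The same comparison also flags links that disappear between refreshes, since the sorted lists then differ.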
 
Floating List
 
 
Unlike the dynamic list, the floating list finds your link moving linearly across the spectrum of pages, never to return to the first page. This gives the link to your page a lifespan determined by how quickly other links are added above it. The search engines will find no value in your link.

You can catch this trick by checking the page periodically (at least twice per month). Make note of your position each time you visit. If your position continually drops, then you're on a floating list.
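Once you have a few recorded positions, the drift is trivial to detect: on a floating list, your position only ever gets worse. A minimal sketch, assuming you log the position number on each visit:

```python
def is_floating(positions):
    """True if the recorded positions strictly increase, i.e. the link
    only ever drifts further down the list and never recovers."""
    return all(later > earlier for earlier, later in zip(positions, positions[1:]))

# Hypothetical log: position of your link on four successive visits.
observed = [3, 7, 12, 18]
print(is_floating(observed))  # True
```

A dynamic (rotating) list would show positions that go both up and down, which this check correctly rejects.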
 
Old Cache
 
 
Search engines make copies of each page they visit, called a cache. The cache date indicates the date the search engine last cached the page. If the cache is more than six months old, it can be surmised that the search engine has little or no desire to revisit the page.

Always check the cache of the page on which your link exists.
 
Denver Pages
 
While Denver, CO is a nice place to visit, Denver Pages are not a place you want to find your link in a trade. Denver Pages typically cram a large number of links, grouped into categories, onto the same page. Some people call this the mile high list. These pages have no true value in the search engines and are not topically matched to your site.

You can catch this trick by visiting the linking page. If you see links segmented into multiple unrelated categories, you can bet it's a Denver Page.
 
Muddy Water Pages
 
Your link will be piled in with non-topically matched links with no sense of order. It's like someone took all the links and threw them in the air to see where they land. These are worse than the Denver Pages.

These pages are dangerous but easy to spot: a kaleidoscope of entries with no semantic value.
 
Cloaking
 
Cloaking is the process of serving one page to people while serving a different page to search engines. You might see your link on the Web page, but the search engines may never see it because they are given a different copy.

Checking the search engine's cache is the only way to catch this ploy.
 
Dancing Robots
 
This trick can be easily performed with server-side scripting like PHP and is rarely easy to catch. People who attempt to view the robots.txt file receive a copy that does not include exclusion instructions for the search engines. However, when the search engines request the robots.txt file, they receive the exclusion instructions. In this situation the links pages will never be indexed, and you'll never know why without expert assistance.

Catch this trick by checking multiple URLs within the link or resource folder and the homepage. If the cache of the homepage does not include a link to the link or resource folder, consider the link or resource directory new. If the cache of the homepage does include a link to the link or resource folder, then consider the robots.txt file you receive a fraud.
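You can demonstrate the effect of the two robots.txt copies with Python's standard urllib.robotparser module. The two robots.txt bodies below are invented for illustration: one is what a browser might be served, the other what the crawler might be served.

```python
from urllib import robotparser

def parse_rules(robots_txt):
    """Parse a robots.txt body into a RobotFileParser for rule checks."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp

# Hypothetical "dancing" robots.txt: browsers see an open file,
# crawlers see the links directory excluded.
browser_copy = "User-agent: *\nAllow: /"
crawler_copy = "User-agent: *\nDisallow: /links/"

links_page = "https://example.com/links/partners.html"
seen_by_browser = parse_rules(browser_copy).can_fetch("Googlebot", links_page)
seen_by_crawler = parse_rules(crawler_copy).can_fetch("Googlebot", links_page)
print(seen_by_browser, seen_by_crawler)  # True False
```

The gap between the two results is exactly what makes the trick invisible to anyone who only checks robots.txt from a browser.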
 
Meta Tags and Robots.txt Confusion
 
In most cases, the robots.txt file sets the rules for exclusion. However, search engines may still crawl into the excluded sections and find pages whose robots Meta instructions contradict the robots.txt exclusion rules by allowing indexing and following. On the other hand, a page may include robots Meta instructions prohibiting indexing and/or following while the robots.txt file does not restrict access.

Examine the robots.txt file and the robots Meta instructions; the page instructions always take precedence over the robots.txt instructions.
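The two checks can be combined into one verdict. This sketch assumes the Meta directives have already been parsed out of the page (the set below is a made-up example) and applies them on top of the robots.txt result.

```python
from urllib import robotparser

# Hypothetical situation: robots.txt allows crawling the links page...
robots_txt = "User-agent: *\nAllow: /"
rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# ...but the page itself carries <meta name="robots" content="noindex, follow">.
meta_directives = {"noindex", "follow"}

crawl_allowed = rp.can_fetch("*", "https://example.com/links.html")
# Page-level instructions override robots.txt for indexing.
indexable = crawl_allowed and "noindex" not in meta_directives
print(crawl_allowed, indexable)  # True False
```

A page that is crawlable but not indexable is still a bad home for your traded link.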
 
Link the Head
 
While these links do not count in the search engines and do not show up on the Web page, they do get counted by scripts or programs designed to verify that the links exist. These programs only look for the URL within the source code of the Web page.

Always verify by viewing the page source code and verifying where the link exists.
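One way to automate that verification is to distinguish URLs that appear only inside the head section (for example in <link> tags) from real anchors in the body. A sketch using the standard-library parser on an invented page:

```python
from html.parser import HTMLParser

class HeadLinkDetector(HTMLParser):
    """Separates URLs found in <head> <link> tags from real <a> anchors."""
    def __init__(self):
        super().__init__()
        self.head_urls, self.anchor_urls = [], []
        self._in_head = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "head":
            self._in_head = True
        elif tag == "link" and self._in_head:
            self.head_urls.append(a.get("href"))
        elif tag == "a":
            self.anchor_urls.append(a.get("href"))

    def handle_endtag(self, tag):
        if tag == "head":
            self._in_head = False

# Hypothetical page: your URL is buried in the head, not a real anchor.
page = ('<html><head><link rel="alternate" href="https://example.com/"></head>'
        '<body><a href="https://example.org/">Real link</a></body></html>')
detector = HeadLinkDetector()
detector.feed(page)
print(detector.head_urls, detector.anchor_urls)
```

If your URL turns up in head_urls but not anchor_urls, a naive link-verification script would still report it as present.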
 
Empty Anchors
 
This is a nasty trick, but can be an honest mistake. The links exist and are counted by the search engines, but unfortunately are neither visible nor clickable on the Web page. So, there are no traffic values from the link.

Verify that your desired anchor text appears on the page, and click it to ensure it leads to the page you intended.
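Empty anchors can also be caught mechanically by collecting every link whose visible text is blank. A sketch on an invented snippet, using the standard-library HTML parser:

```python
from html.parser import HTMLParser

class EmptyAnchorFinder(HTMLParser):
    """Collects the hrefs of <a> tags whose visible text is empty."""
    def __init__(self):
        super().__init__()
        self.empty = []
        self._href = None
        self._text = ""

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = ""

    def handle_data(self, data):
        if self._href is not None:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            if not self._text.strip():
                self.empty.append(self._href)
            self._href = None

# Hypothetical page: first anchor has no text, second is a normal link.
page = ('<a href="https://example.com/"></a>'
        '<a href="https://example.org/">Visible link</a>')
finder = EmptyAnchorFinder()
finder.feed(page)
print(finder.empty)  # ['https://example.com/']
```

Any URL in the output is a link the search engines may count but no visitor can ever click.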
 

Remember, your goal is to attract quality traffic from your link partners. Appearing higher in the search results is an added benefit. Therefore, look for these top dirty linking tricks to help ensure you receive a fair deal in your link trading activities.

 
