Case Study: International Medical Supply Company Dramatically Improves Organic Visibility

Est. Reading Time: 3 minutes

In the Wake of a Google Panda Update, Beacon Guides an Ecommerce Website to Increased Rankings and Visibility

Challenge: Client was facing duplicate content issues in the wake of a Google Panda update because their ecommerce website had many similar products with almost identical content. In this case, it was the near-duplicate content that was hurting the rankings of the entire site. The frustrated client had run into trouble with Google because they had a very large number of products and didn't understand how the content on individual pages impacted the site as a whole. The algorithm change was causing a rapid decline in search engine traffic.

Business Solution: Beacon first created a robots.txt file to block Google from crawling dynamic, unnecessary, and weak pages. A robots.txt file is the easiest way to block URL parameters and entire folders. Many pages were being replicated through the website's CMS (Content Management System) platform. Over time, the robots.txt file was able to get most of the duplicate content removed from the index.
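
The case study doesn't reproduce the actual file, but a minimal robots.txt along these lines illustrates the approach; the folder and parameter names here are hypothetical, not the client's real configuration:

    # Applies to all crawlers
    User-agent: *

    # Block entire folders of weak or dynamic pages
    Disallow: /search/
    Disallow: /print/

    # Block any URL carrying a session parameter
    # (Google honors the * wildcard in robots.txt rules)
    Disallow: /*?sessionid=

Note that robots.txt stops crawling rather than deleting pages outright, which is why the de-indexing happened gradually rather than all at once.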

Beacon then faced the issue of duplicate URLs due to pagination. This paginated duplicate content had the negative side effect of keeping some products out of the index entirely. Through parameter handling in Google Webmaster Tools, Beacon provided suggestions to the Google crawler to ignore certain parameters, producing a more efficient crawl with less duplication.
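
The parameter settings themselves are configured through the Google Webmaster Tools interface rather than in code, but the duplication they resolve looks roughly like this (the URLs are hypothetical):

    # Three parameterized URLs that all render essentially the same listing:
    #   https://www.example.com/widgets?page=2
    #   https://www.example.com/widgets?page=2&sort=price
    #   https://www.example.com/widgets?page=2&sort=price&view=grid
    # Telling the crawler that "sort" and "view" don't change page content
    # lets it collapse these to one representative URL per page of results.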

While targeting the duplicate content issue, a URL rewrite was implemented that created more search-engine-friendly URLs. The rewrite also prevents duplicate content from being created going forward, since each new page now resolves to a single unique URL. New unique content was also created, with internal links that more efficiently pass PageRank to internal category and product pages. Improving the content quality was imperative to create compelling pages that improved the user experience while also improving SEO results.
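
The case study doesn't name the server platform, but as a sketch, assuming Apache with mod_rewrite, rules along these lines map a dynamic CMS URL to a keyword-friendly one. The file names, paths, and URL patterns are all hypothetical:

    # .htaccess sketch -- illustrative only, not the client's actual rules.
    RewriteEngine On

    # 301-redirect requests for the old dynamic URL to the clean form, so
    # each product keeps exactly one URL in circulation. THE_REQUEST holds
    # the original request line, so the internal rewrite below does not
    # re-trigger this rule and cause a redirect loop. A real implementation
    # would look up the product's slug instead of a generic "item" prefix.
    RewriteCond %{THE_REQUEST} \s/catalog\.php\?product_id=([0-9]+)\s
    RewriteRule ^catalog\.php$ /products/item-%1? [R=301,L]

    # Internally serve the clean URL /products/surgical-gloves-123 from the
    # CMS's dynamic script without exposing the query string to visitors:
    RewriteRule ^products/([a-z0-9-]+)-([0-9]+)$ catalog.php?product_id=$2 [L,QSA]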

An important final step was to survey the URLs that Google had indexed for the site. While querying the Google index, it was discovered that a test site had been inadvertently crawled, causing major site duplication. Blocking the test site greatly increased the efficiency of the web crawl.
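
Both steps can be reproduced with simple tools; the domain names below are hypothetical:

    # A site: query in Google surveys everything indexed for the domain,
    # and stray results on a host like test.example.com expose the duplicate:
    #   site:example.com

    # robots.txt served from the test host, blocking all crawling:
    User-agent: *
    Disallow: /

Password-protecting the test environment is an even stronger option, since robots.txt only prevents future crawling, but either approach stops the crawler from wasting its budget on duplicate pages.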

Results: The removal of duplicate content increased the number of unique pages in the Google index and improved Google's ability to crawl the site. Although it sounds counter-intuitive to remove hundreds of thousands of URLs, doing so improved the rankings for thousands of product pages in Google. A rising tide lifts all boats, and the website's rankings improved significantly. The client is now able to focus on building out brand signals and improving user engagement and usability.