Case Study: International Medical Supply Company Dramatically Improves Organic Visibility

Beacon Blog Article

Published June 4, 2015 | Categories: Case Studies

In the Wake of a Google Panda Update, Beacon Guides an Ecommerce Website to Increased Rankings and Visibility

Challenge: The client was facing duplicate content issues in the wake of a
Google Panda update because their ecommerce website had many
similar products with almost identical content. In this case, it was
the near-duplicate content that was hurting the rankings of the entire
site. The frustrated client had run into trouble with Google because they
carried a very large number of products and didn't understand how the
content on individual pages affected the site as a whole. The algorithm
change was causing a rapid decline in search engine traffic.

  • Client Facts:
      • American-based medical supplier
      • Online sales of $200K in 2011
      • Staff size below 50 employees
  • Goals:
      • Reduce duplicate content and mitigate the Google Panda algorithm update
      • Improve website indexing of 60,000 products and increase search traffic
  • Results:
      • Website visits: +99.05%
      • Pageviews: +40.79%
      • Landing page entrances: +420.94%
      • Conversions: +54.90%
      • Online revenue: +45.29%

Business Solution: Beacon first created a robots.txt file to block Google
from crawling dynamic, unnecessary, and weak pages. A robots.txt file
is the easiest way to block URL parameters and entire folders. Many
pages were being replicated through the website's CMS (Content
Management System) platform. Over time, the robots.txt file got most
of the duplicate content removed from the index.
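For illustration, a robots.txt along these lines blocks a CMS-generated
folder and a dynamic URL parameter (the paths and parameter names here
are hypothetical, not the client's actual rules; Googlebot honors the *
wildcard in Disallow patterns):

    User-agent: *
    # Keep crawlers out of a weak, CMS-generated folder (hypothetical path)
    Disallow: /catalog/print/
    # Skip dynamic URLs that carry a session parameter (hypothetical name)
    Disallow: /*?sessionid=
    Disallow: /*&sessionid=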
Beacon then faced the issue of duplicate URLs caused by pagination.
Paginated duplicate content had the negative side effect of leaving
products unindexed entirely. Through parameter handling in Google
Webmaster Tools, Beacon provided suggestions that led the Google
crawler to ignore certain parameters, producing a more efficient
crawl and less duplication.
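To make the problem concrete, the listing below sketches how a single
paginated category page can splinter into many crawlable variants (the
URLs and parameter names are invented for illustration), and how the
URL Parameters setting in Webmaster Tools addresses it:

    # One category page reachable under several parameter variants:
    /catalog/bandages?page=2
    /catalog/bandages?page=2&sort=price
    /catalog/bandages?page=2&sessionid=8f3a
    # In Webmaster Tools, the URL Parameters setting lets the site owner
    # flag parameters like "sort" and "sessionid" as not changing page
    # content, suggesting that Googlebot crawl only one representative URL.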

While targeting the duplicate content issue, a URL rewrite was
implemented to create more search-engine-friendly URLs. The
rewrite also prevents duplicate content from being created, since
each new page gets a single, unique URL. New unique content was also
created, with internal links that pass PageRank more efficiently to
internal category and product pages. Improving content quality was
imperative to create compelling pages that enhanced the user
experience while also improving SEO results.
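The case study doesn't name the server platform, so the following is
only a sketch of the technique, assuming Apache mod_rewrite and
invented URL patterns: the old dynamic address is permanently
redirected to one friendly URL, which is then served internally by the
original script, so each product ends up with a single address.

    # .htaccess sketch (hypothetical patterns; assumes Apache mod_rewrite)
    RewriteEngine On
    # 301-redirect the old dynamic form, /product.php?id=123, to /products/123/.
    # Matching THE_REQUEST (the original request line) keeps the internal
    # rewrite below from re-triggering this redirect in a loop.
    RewriteCond %{THE_REQUEST} \s/product\.php\?id=([0-9]+)\s
    RewriteRule ^product\.php$ /products/%1/? [R=301,L]
    # Serve the friendly URL from the original script, invisibly to the user.
    RewriteRule ^products/([0-9]+)/$ product.php?id=$1 [L]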

An important final step was to survey the URLs that Google had
indexed for the site. While querying the Google index, Beacon discovered
that a test site had been inadvertently crawled, causing major site
duplication. Blocking the test site greatly increased the efficiency of the
web crawl.
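As an illustration of that survey (the hostnames below are
hypothetical; the case study doesn't name the test site), a site: query
in Google reveals what has been indexed, and a catch-all robots.txt on
the test host shuts crawling down:

    # Run as Google searches to audit the index:
    site:example.com
    site:test.example.com

    # robots.txt served from the test host, blocking all crawlers:
    User-agent: *
    Disallow: /

One caveat worth noting: Disallow stops future crawling but does not by
itself remove URLs already in the index, so a block like this is
typically paired with the URL removal tool in Webmaster Tools.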

Results: The removal of duplicate content increased the number of
unique pages in the Google index and improved Google's ability to crawl
the site. Although it sounds counterintuitive to remove hundreds of
thousands of URLs, doing so improved the rankings of thousands of
product pages in Google. A rising tide lifts all boats, and the website's
rankings improved significantly. The client is now able to focus on
building out brand signals and improving user engagement and usability.

Let's get to work!

Contact Us