Logan Ray

About Logan Ray

With a B.S. degree from the University of North Carolina at Greensboro, Logan Ray joins Beacon as a Digital Marketing Specialist. Outside the workplace, Logan’s interests include spending time with his wife and dog, board sports and outdoor adventuring.

Improve Your Site Search with These SEO Principles

While just about every e-commerce website I’ve worked with optimizes for Google search, it amazes me how many underestimate the value of effective internal site search. If you’re not dedicating some effort to improving your internal search, you’re losing business.

Conversions aren’t the only thing impacted by a properly optimized site search (although they’re certainly the most important). Internal site search can provide clues with implications across the board, and it can help you manage your brick-and-mortar business more effectively, too.

Here’s a short list of SEO principles that, if employed, will help you get the most from your internal site search (and boost conversions!):

Set up site search tracking in Google Analytics. It’s simple and you can do this right inside your current Google Analytics account.

Step One: Determine the query parameter your site search uses. Perform any search on your site, look at the URL of the results page, and identify the designation immediately following the question mark. For the example below, I ran a search on a popular outdoor clothing and equipment site.

As you can see, it’s a “q” in this case.
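If you’d rather script this than eyeball URLs, the same check can be done with a few lines of Python using only the standard library. The URL and parameter name below are made up for illustration:

```python
from urllib.parse import urlparse, parse_qs

def search_terms(results_url, param="q"):
    """Pull the search term(s) out of a results-page URL."""
    query = parse_qs(urlparse(results_url).query)
    return query.get(param, [])

# Hypothetical results URL; the parameter name varies by platform.
url = "https://www.example-outfitter.com/search?q=down+jacket&page=1"
print(search_terms(url))  # ['down jacket']
```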

[Screenshot: the search query parameter in the results URL]

Step Two: Go into your Google Analytics account, click Admin > View Settings, and scroll down to Site Search Settings.

[Screenshot: Google Analytics Site Search Settings]

Step Three: Under the heading Site Search Tracking, click the slider to the “on” position. Under the heading Query parameter, add the character that designates a search in your URL. In this example, it was “q”. Turn on Strip query parameters out of URL and save.

This will enable you to view the terms visitors searched on your site within Google Analytics.

Add often-searched query terms to your keyword research. Once you know what people are searching for while on your website, you’ll not only gain insight into ways to optimize your product and/or category terms, but you’ll also get great ideas for new products.

NoIndex your search result pages. If someone performs a generic Google search hoping for a quick answer and lands on your internal search results page instead, it’s unlikely to be a good experience. Google’s own guidelines discourage letting internal search results into the index. That being the case, it makes sense to block them.
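To make the idea concrete, here’s a minimal Python sketch of the decision. The assumption that internal search lives under /search is mine; adjust the path to fit your site, and render the result into a <meta name="robots"> tag in the page head:

```python
def robots_meta(path):
    """Return the robots meta content for a page.

    Assumes internal search results live under /search (a hypothetical
    path; swap in whatever your platform actually uses)."""
    if path.startswith("/search"):
        # noindex keeps the page out of the index; follow lets link
        # equity keep flowing through the results.
        return "noindex, follow"
    return "index, follow"

print(robots_meta("/search?q=boots"))   # noindex, follow
print(robots_meta("/products/boots"))   # index, follow
```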

Never display “No Results Found”. It serves no useful purpose to you or the user. If the item they’re searching for is out of stock, consider displaying related products. If the term is completely unrelated to any of your products, serve up a list of your best selling items with a blurb that reads “We couldn’t find the item you’re looking for. Perhaps you’re interested in one of the best sellers below.”
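The fallback logic above can be sketched in a few lines of Python; the catalog and best-seller lists are invented for illustration:

```python
def search_results(query, catalog, best_sellers):
    """Return matching products, or a best-seller fallback with a
    friendly message instead of 'No Results Found'."""
    matches = [p for p in catalog if query.lower() in p.lower()]
    if matches:
        return matches, None
    message = ("We couldn't find the item you're looking for. "
               "Perhaps you're interested in one of the best sellers below.")
    return best_sellers, message

# Toy data standing in for a real product catalog.
catalog = ["Trail Boots", "Rain Jacket", "Canvas Tent"]
best = ["Rain Jacket", "Trail Boots"]
print(search_results("jacket", catalog, best))  # (['Rain Jacket'], None)
print(search_results("kayak", catalog, best))   # falls back to best sellers
```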

Don’t pass up a chance to ask for a sale. Proper use and optimization of internal site search has been shown to dramatically improve conversions, even doubling them in some cases. A designated Google Analytics Partner since 1998, the digital marketing team at Beacon has a wealth of knowledge regarding internal site search and the ways it can be leveraged to improve your bottom line. Questions? Email me or call a Beacon team member at 1.866.585.6350. We’re ready to discuss ways we can help you make your online business more profitable than ever before.

September 13th, 2017 | SEO

SEO & Your Redesign: Don’t Be a Flintstone

If you’re reading this article on a desktop monitor, guess what: you’re a dinosaur, my friend. More than likely, far more people are reading this article on handheld devices, particularly those between the ages of 18 and 29*. But since there are still holdouts like yourself, any website needs to account for this audience as well.

With that in mind, it’s time we talk about responsive website redesign. So, put those rocks down and follow me.  We’re going to cover a lot of ground in just a few words. We’ll look at some major SEO considerations with an added emphasis on mobile redesign factors.

Structured Schema Data & The Loyal Order of Water Buffaloes

A redesign is a great time to give your new website an organic visibility boost. By adding structured or schema markup, you can give search engines a little bit more help with interpreting your pages, thereby improving the quantity and quality of your traffic. Think of it as the secret handshake of Lodge brothers. You establish a familiarity on a higher level – it’s an engagement booster. And when better engagement metrics follow, so do better search rankings.

HTML tells the user’s browser how to display the information in a tag. Schema markup, a microdata format, tells search engines what the text within the tag means. There are markup types for different business categories, entertainment types, articles and review sites, products, and events, and those are just a few. You can see why schema markup is a great way to showcase your website content.
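Schema markup is easiest to see in its JSON-LD form. Here’s a hypothetical schema.org Product entry built with Python’s json module; all the values are made up:

```python
import json

# A hypothetical Product in schema.org JSON-LD form.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Canvas Tent",
    "offers": {
        "@type": "Offer",
        "price": "259.50",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output inside <script type="application/ld+json"> ... </script>
# in the page's <head>.
print(json.dumps(product, indent=2))
```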

Yabba Dabba DON’T forget about page speed.

Page speed has always been important, but with mobile devices it’s absolutely critical. Consider the numbers given earlier regarding mobile users: they’re young. Very young. That means the average mobile user has an attention span similar to that of a common housefly. Factor in questionable mobile connections and it doesn’t take a genius to figure it out: there is no such thing as too fast.

Make sure you address page speed. If you can’t get a straight answer from your design partner regarding site speed, use Google’s Mobile-Friendly Test to find out for yourself. It will measure mobile friendliness and compare the load times of your site on mobile and desktop. Additionally, the tool will suggest improvements. Take this handy little report and give it to your design partner.

CSS, JavaScript & The Great Gazoo

In the Jurassic days of website design (last Thursday), it wasn’t unusual to lean on JavaScript to ensure your design rendered as intended. Page speed wasn’t as great a factor in Google’s search algorithm as it is today, and pages didn’t have to account for mobile use.

However, side-by-side analysis suggests that overusing JavaScript where CSS would do can be a big mistake. Recent split tests indicate that stripping the JavaScript from a page and rendering it with CSS instead improves performance. In fact, in split tests of JavaScript-reliant pages versus pages free of that reliance, the latter saw roughly 5% more organic traffic.

Like the character in the cartoon, JavaScript means well, but it’ll likely create more problems than it solves.

They don’t use Flash down at the quarry anymore.

There are some good things about Flash. It’s fine for simple animations and online presentations. Now let’s list a few of its limitations.

  • It’s not supported on iPhone.
  • Or on much of anything else, really.
  • Google has trouble crawling and indexing it. That means lousy search visibility.
  • It isn’t easily updated. If you have very, very deep pockets, this isn’t a problem.
  • It’s proprietary so Adobe can hold you hostage.

With changing technologies, Flash simply shouldn’t be used any longer. It isn’t supported on mobile devices, and if your redesign partner insists on using it in your new design, run for the hills.

Bold choice, Mr. Flintstone! You’ll go far in this company.

While all of the above must be considered in your redesign efforts, it all starts with your choice of a forward-thinking design and development partner. That’s where Beacon comes in. If you have any questions regarding SEO and your upcoming redesign project, I invite you to email me or call a member of our digital marketing team at 1.855.847.5440.

*According to the Pew Research Center

July 26th, 2017 | SEO

eCommerce Analysis: Using Google Analytics to Identify Your Whales and Minnows

If you manage an eCommerce site, you probably spend a lot of time in Google Analytics. There are tons of great metrics and reports to check out, like multi-channel attribution, average order value by channel, eCommerce conversion rates, and so on. You’ve probably even segmented data by demographic, device type, or geography. All of this is great; high-five yourself if you’re doing these things, because you’re probably a few steps ahead of your competition. But have you ever segmented by transaction dollar amount? I’m about to enlighten you on a couple of advanced segments to help you identify your whales (biggest customers) and your minnows (smallest customers), and how to get more whales and fewer minnows.

OK, got your coffee? Let’s go!

First thing we’re going to do is figure out your top 10% and bottom 10% transaction thresholds.

Go to Conversions > Ecommerce > Sales Performance. Expand your list to show all available transactions, then export to XLSX.

[Screenshot: Sales Performance transaction report]

The next step is to identify the thresholds for your top 10% of transactions. Open up the XLSX file, put filters in your headings, and remove the last row of data where your totals show (keeping it in skews your sorting).

Sort the revenue column from largest to smallest, then apply some conditional formatting, as seen in the screenshot below. Repeat this step with the bottom 10%.

*Side note: If you’re not familiar with this feature in Excel, I highly recommend becoming a conditional-formatting ninja; it will shave tons of time off your analysis.

[Screenshot: conditional formatting highlighting the top 10% of transactions]

Once you’ve got both conditional formats applied, simply look at the lowest dollar amount in your upper 10% grouping, and the highest dollar amount in your lower 10% grouping. These are your thresholds. In my case, using the Google Merch Store test account, I’ve identified $259.50 as the upper threshold, and $13.59 as the lower threshold.
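If you’d rather skip the spreadsheet, the same thresholds can be computed with a few lines of Python; the revenue figures below are toy data standing in for your export:

```python
def thresholds(revenues, pct=0.10):
    """Return (upper, lower): the smallest order inside the top pct of
    transactions and the largest order inside the bottom pct."""
    ordered = sorted(revenues, reverse=True)
    k = max(1, int(len(ordered) * pct))  # how many orders make up 10%
    upper = ordered[k - 1]   # lowest value inside the top group
    lower = ordered[-k]      # highest value inside the bottom group
    return upper, lower

# Toy revenue data; real numbers come from the exported XLSX.
revs = [500, 400, 300, 259.50, 120, 80, 45, 20, 13.59, 5]
print(thresholds(revs))  # (500, 5)
```

With ten transactions, the top 10% is a single order, so the thresholds collapse to the max and min; on a real export with thousands of rows you get meaningful cutoffs.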

Now the good stuff begins.

Head back into your Google Analytics account and create a new advanced segment. We’re looking for users who have a per-user revenue of $259.50 or greater, so we create the segment as shown below:

[Screenshot: advanced segment configuration]

Now that you’ve got a segment created, apply it, and you’re free to check out other reports to analyze where these users come from, how they interact with the site, and what you can do to acquire more people that would fall into this segment.

A few good starting points for conducting this analysis are:

  • The Source/Medium report in Acquisition – Perfect for learning how your whales found you
  • The Mobile Overview report in Audience – This is great for device type analysis. i.e. are your whales coming through mobile or desktop? Assumptions can be dangerous, so it’s always a good idea to investigate
  • The Landing Page report in Behavior – Surely, you’ve got some pages on your site that are more likely to drive purchase, but which ones are the best?

Of course, there are plenty of other reports you can gain insights from; this list is only intended to kick-start your analysis engine.

Once you’ve conducted your whales analysis, circle back and repeat the process for minnows. The idea is the same, but this time around you’re trying to identify ways to attract fewer minnows. So check out the same reports in Google Analytics and identify channels that drive small transactions, landing pages that don’t perform as well, and so on.

If you do this on a regular basis and make adjustments to your marketing efforts accordingly, you’ll start to see your thresholds shifting upwards, as well as increases in your average order value.

August 12th, 2016 | Ecommerce, Google Analytics

Here’s How to Use Google Analytics to Improve Your Search Results Appearance

It’s not every day that the SEO world catches a break from Google, but they’ve recently connected two of their greatest resources for us. You may have noticed a new tool in the Acquisition section of your Google Analytics reporting: the Search Console Beta. This report closes the gap between Search Console and Google Analytics, providing a new level of insight into what’s happening with your SEO efforts. Now, in addition to seeing the typical behavior and conversion metrics next to your landing pages, you can see the information that has been siloed in Search Console for years:

  • Impressions
  • Clicks
  • CTR
  • Average Position
  • Sessions

Now, how do you leverage this data to improve your visibility and organic traffic? The simplest thing you can do is identify pages that aren’t garnering the number of clicks they should be. In the video below, I use advanced filters to show only pages with an average position on the first page of Google (by selecting less than 11) that also have a click-thru rate of less than 1.5%. Of course, you can set these thresholds to whatever you like, but this is a good starting point. In my opinion, any URL ranking on the first page of search results should have a pretty good CTR.

Once you’ve filtered out the better-performing URLs, you’re left with a list of URLs that need some improvement. Export this list and make updates to your title tags and meta descriptions. It’s helpful to run a few queries and see what other sites show up on the results pages alongside your pages. There’s clearly something better about their results, so take a close look and see how you can out-write them with stronger title tags and meta descriptions.

Don’t forget to come back in a few weeks, or however long it takes to collect significant data, and conduct a before-and-after analysis on those URLs. *Take note that the first thing I do is set the date range to include 90 days’ worth of data, but not the two most recent days; Search Console data lags a couple of days and only retains 90 days’ worth.
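The same filter can be applied in Python to an exported report. The pages and numbers below are invented for illustration:

```python
# Hypothetical rows as (page, avg_position, ctr) tuples from the export.
pages = [
    ("/blue-widgets",           3.2, 0.041),
    ("/widgets-that-are-blue",  8.7, 0.009),
    ("/green-widgets",         14.0, 0.012),
    ("/red-widgets",            5.1, 0.006),
]

# First page of Google (position < 11) but under-performing on
# clicks (CTR < 1.5%): these URLs need better titles and descriptions.
needs_work = [p for p, pos, ctr in pages if pos < 11 and ctr < 0.015]
print(needs_work)  # ['/widgets-that-are-blue', '/red-widgets']
```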

July 25th, 2016 | SEO

Google’s Latest Algorithm Update: Bacon

The sizzling algorithm update you probably haven’t heard about: Bacon.

It’s no secret that Google releases dozens, if not hundreds, of updates to their ranking algorithm every year. Some of these updates have a larger impact on search results than others; you may be familiar with the Panda and Penguin updates, which targeted content and link quality. Both have seen a series of ongoing refreshes over the last few years as Google continually tweaks the de-spamming of its search results pages.

Most of the time, Google’s updates are focused on demoting websites that are using black-hat SEO tactics, have low content quality, or have a bunch of junky links pointing to the site. Historically speaking, these are all the things “spammers” have been doing for years to try and game the system.

[Image: Google “Bacon” logo]

So what’s the deal with the Bacon update? Well, as with all things, Bacon makes it better! Simply putting the word ‘Bacon’ in your on-page optimization efforts is a sure-fire way to improve the rankings of your site. Recent studies have shown that bacon-optimized pages are 17 times more likely to rank on the first page, experience a 243% increase in click-thru rate, and cut bounce rates in half! How do you go about making updates to optimize your site for Bacon? Follow these simple steps:

  • Add Bacon to the beginning of your page titles
  • Work Bacon-related content into your keyword-rich topic pages
  • Include the word Bacon in your page headings
  • Configure URLs to include Bacon
  • Anchor-text links loaded with Bacon
  • Add Bacon images on every page of your site

With these bacon-optimized updates, you’ll be cooking in no time!

*If you made it this far and haven’t looked at the nearest calendar yet: April Fools! This is, of course, total nonsense, and you should in no way, shape, or form make any of the changes mentioned above (unless you actually run a bacon-related website).

April 1st, 2016 | SEO

404 Error Pages: Best Practices

Ya know what grinds my gears?! When I click a link (internal or external) and I get one of these:

[Image: a generic 404 error page]

You know what I do? Leave.

And so do 99.9999% of everyone else that hits an error like this. So how do you avoid this? Add a custom 404 page using these best practices:

  1. Your 404 page should actually return a 404 HTTP Status Code. This may seem obvious, but I’ve seen plenty of 404 pages out there throwing 200. From a user perspective, this is fine. But search engines should be notified of the broken/dead page via a 404 status code. When your error page returns a 200, search engines can misunderstand this and index pages that no longer exist. This is particularly important for anyone concerned with SEO!
  2. Your 404 Page should look like any other interior page on your site. Include the usual header, navigation, and footer.
  3. Acknowledge the error and include a message notifying users that the page they’re trying to access no longer exists. Something like “Oops, the page you’re trying to access no longer exists” should suffice. As a bonus, you could try one of the following tactics:
    • Relevant imagery. Here’s one of my favorites from LEGO.
    • Suggest other sections of your site that might be of interest. TED does a great job with this.
    • Be funny about it like Trek Bicycles. Laugh at yourself, your visitors will appreciate it.
    • Include a ‘Search’ bar like MailChimp.

*Notice that all the examples above return proper 404 status codes.
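The status-code point in step 1 can be expressed as a tiny classifier. “Soft 404” is the usual name for an error page that returns 200; the boolean flag is a stand-in for whatever check you use to decide the body reads like an error page:

```python
def classify_error_page(status_code, says_not_found):
    """Spot 'soft 404s': pages whose body says 'not found' but whose
    HTTP status claims everything is fine.

    says_not_found: whether the body reads like an error page."""
    if status_code == 404:
        return "proper 404"
    if status_code == 200 and says_not_found:
        return "soft 404 -- fix the status code"
    return "ok"

print(classify_error_page(404, True))  # proper 404
print(classify_error_page(200, True))  # soft 404 -- fix the status code
print(classify_error_page(200, False)) # ok
```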

[Image: 404 best practices]

No site is perfect; error pages will be hit. It’s a fact of life that we must deal with, so we may as well deal with it the best possible way, right? If you’re still using a generic 404 page, you’re losing people. No one is going to go to their URL bar and remove everything after the .com to get back to your homepage. They’re going to leave your site and check out the competition. If you’re showing a custom 404 page, you’re far more likely to retain visitors, especially if you implement one (or more) of the suggestions above.

It’s 2016; I think it’s high time we retired the generic 404 page. I hope this post has convinced you to add a custom 404 to your site.

March 8th, 2016 | Digital Marketing

How-To: Robots.txt Disaster Prevention

It’s any SEO’s worst nightmare: the production robots.txt file got overwritten with the test version. Now all your pages have been disallowed and are dropping out of the index. And to make it worse, you didn’t notice immediately, because how would you?

Wouldn’t it be great if you could prevent robots.txt file disasters? Or at least know as soon as something goes awry? Keep reading, you’re about to do just that.

The ‘How’

Our team recently began using Slack. Even if you don’t need a new team communication tool, it’s worth having for this purpose alone. One of Slack’s greatest features is integrations, and luckily for us SEOs, there’s an RSS integration.

5 Simple Steps for Quick Robots.txt Disaster Remediation:

  1. Take the URL for your robots file and drop it into Page2Rss.
  2. Configure the Slack RSS integration.
  3. Add the Feed URL (from Page2RSS) for your robots file.
  4. Select the channel to which notifications will post. (I suggest having a channel for each website/client; more on why below.)
  5. Relax and stop worrying about an accidental ‘disallow all’.
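If you’d rather roll your own monitor than rely on Page2Rss, here’s a minimal sketch of the same idea using only the Python standard library: fingerprint the file, compare against yesterday’s fingerprint, and alert your channel on change. The URL is hypothetical:

```python
import hashlib
from urllib.request import urlopen

def robots_fingerprint(url):
    """Fetch robots.txt and return a short hash of its contents."""
    with urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()[:12]

def changed(previous_fingerprint, current_fingerprint):
    """True when the file differs from the last time we looked."""
    return previous_fingerprint != current_fingerprint

# Run on a schedule (cron, etc.); post to your channel when it's True.
# fp = robots_fingerprint("https://www.example.com/robots.txt")
```

The scheduling and the Slack posting are left out; the point is that a hash comparison is all the change detection you need.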

The Benefits of this Method

4 Benefits of Using Page2Rss & Slack to Watch Your Robots File:

  1. You can add your teammates to channels, so key people can know when changes occur! One person sets up the feed once, many people get notified.
  2. Page2Rss will check the page at least once a day. This means you’ll never go more than 24 hours with a defunct robots file.
  3. No one on your team has to check clients’ robots files for errors.
  4. You’ll know what day your dev team updated the file. This enables you to add accurate annotations in Google Analytics.

[Image: robots.txt damage prevention]

The ‘Why’

OK, cool, but why is this necessary? Because you care about your sites’ reputation with search engines, that’s why!

Mistakes happen with websites all the time, and search engines know that. They’re not in the business of destroying websites. But they are in the business of providing the best results to their customers. So if you neglect to fix things like this with a quickness, you’re risking demotion.

I’ve seen sites go weeks with a bad robots file, and it is not pretty. Once search engines have removed your site from the index, it is quite difficult to get back; it can sometimes take weeks to regain the indexation you had before. Don’t undo the hard work put into search engine optimization because your file was overwritten. Do yourself a favor and set up this automated monitoring.

I’ve armed you with this info, now there is no excuse for getting de-indexed due to robots.txt issues. If this has happened to someone you know, please share this post with them!

December 1st, 2015 | SEO

How to Properly Handle Pagination for SEO [Free Tool Included]

Let’s start out by defining what I mean by ‘pagination’. This mostly applies to ecommerce sites where there are multiple pages of products in a given category, but it can occasionally be seen on lead-gen sites as well. Here’s an example of what this might look like:

  • http://www.freakerusa.com/collections/all
  • http://www.freakerusa.com/collections/all?page=1
  • http://www.freakerusa.com/collections/all?page=2
  • http://www.freakerusa.com/collections/all?page=3
  • http://www.freakerusa.com/collections/all?page=4

(pages 3 & 4 don’t actually exist on this site, but it helps illustrate my example a little bit more)

In this case, you’ve got 4 pages all with the same meta data. It’s likely that search engines are going to index all of the pages listed above, and count the pages with parameters as duplicates of the original first page. You’ve also got a duplicate hazard with /collections/all and /collections/all?page=1. If you’re concerned with search engine optimization and your organic visibility, you’re going to want to keep reading.

Proper Pagination for SEO

So, how do you go about solving this problem? Fortunately, all the major search engines recognize and obey rel= tags: rel=canonical, rel=prev, and rel=next. The canonical tag says “hey, we know this page has the same stuff as this other page, so index our preferred version.” The prev and next tags say “we know these pages are paginated and have duplicate meta elements, so here’s the page that comes next, and here’s the one that precedes it.”

There are HTML tags for each of these that you’ll need your dev team to add to the <head> section of the pages. Rather than show you what these tags are and how to generate them for each page, I’ve built an Excel spreadsheet that will generate all the necessary tags (for paginated categories up to 20 pages deep); all you need to do is add your base URL at the top and hit enter. By “base URL” I mean this: “http://www.freakerusa.com/collections/all?page=”, basically the paginated URL without the actual page number.
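As an alternative to the spreadsheet, here’s a short Python sketch that generates the prev/next tags for each page of a category; the freakerusa.com base URL comes from the example above:

```python
def pagination_tags(base_url, page, last_page):
    """Generate the <head> link tags for one paginated category page.

    base_url is the paginated URL minus the page number, e.g.
    'http://www.freakerusa.com/collections/all?page='."""
    tags = []
    if page > 1:  # every page after the first points back
        tags.append(f'<link rel="prev" href="{base_url}{page - 1}">')
    if page < last_page:  # every page before the last points forward
        tags.append(f'<link rel="next" href="{base_url}{page + 1}">')
    return tags

base = "http://www.freakerusa.com/collections/all?page="
for n in range(1, 5):
    print(f"page {n}:", pagination_tags(base, n, 4))
```

The first page gets only a next tag and the last page only a prev tag, which is exactly the pattern search engines expect.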

[Download: pagination tag builder spreadsheet]

October 18th, 2015 | SEO

Should I Use Canonicals or 301 Redirects?

Should you 301 redirect that page to another, or should you use a rel=canonical tag? There are tons of reasons why you might have some redundancy on your site. If it’s an eCommerce site, you’re probably displaying product listing pages a few different ways (sort by price, color, rating, etc.), or you might have navigation pages that are similar to your SEO landing pages. Whatever the case may be, chances are pretty good that you have some form of duplication on your site that needs addressing. This topic has been debated for years, but the real answer lies in one simple question:

Should people be able to access both pages in question?

[Image: should I use canonicals or 301 redirects?]

If the answer to this question is Yes, you want to use rel=canonical. Doing so will point search engines toward your preferred page without preventing people from accessing, reading, and interacting with both pages. Here are some situations where you might see the rel=canonical tag in action:

  • www & non-www versions of URLs
  • parameters that change how a product listing page is sorted
  • navigation pages that point to an equivalent SEO landing page (it doesn’t always make sense to put content on a nav page)

If the answer to your question is No, you should remove that page and 301 redirect it. Page removal is most common on eCommerce sites, where products are discontinued but you can’t just delete the page (what if someone is linking to it?!?). Occasionally, you’ll see cases where this needs to be done for SEO landing pages. In large SEO projects with hundreds or thousands of keywords, content can get duplicated easily. Keeping a perfect account of every single SEO landing page that’s been written is basically impossible, so you might end up with two different pages at URLs like /blue-widgets and /widgets-that-are-blue. Obviously, even if the content isn’t identical, you can’t keep both of those pages around. Figure out which one has the most authority, links, and traffic; keep that one, and redirect the other to it.
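The whole decision can be boiled down to a small helper. The Apache-style redirect syntax and the example paths are illustrative, not prescriptive:

```python
def resolve_duplicate(duplicate_url, preferred_url, keep_both_accessible):
    """Emit the fix for a duplicate page: a canonical tag if both pages
    should stay reachable, otherwise an Apache-style 301 redirect rule."""
    if keep_both_accessible:
        return f'<link rel="canonical" href="{preferred_url}">'
    return f"Redirect 301 {duplicate_url} {preferred_url}"

# Duplicate landing pages: only one should survive.
print(resolve_duplicate("/widgets-that-are-blue", "/blue-widgets", False))
# Sorted/paginated listings: both must stay accessible.
print(resolve_duplicate("/collections/all?page=1", "/collections/all", True))
```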

Next time you come to this fork in the road, remember to ask yourself whether or not there is value in people being able to see both versions.

October 8th, 2015 | SEO

Case Study: International Furniture Company Leverages Pinterest to Increase ROI

Generating Web Traffic and Revenue through a New Social Bookmarking Platform: Pinterest.

Challenge: Along with investing in more traditional web marketing activities like SEO, PPC, and conversion optimization, companies now treat social media as a major component of the overall marketing strategy. Beacon Technologies began social media efforts for an international furniture retailer in early 2010. As new social networks emerge, Beacon researches opportunities and provides discovery, insights, and feedback. Pinterest is a new player in the world of social media and was a perfect fit for this client.

Client Facts:
  • Online retailer of healthcare products incorporated in 1997
  • >$13.2M annual revenue (2012)

Goals:
  • Increase brand visibility online by expanding the client’s social media presence
  • Increase the number of unique websites linking to the site

Results:
  • Added 200+ external links leading to the client’s website within 60 days of release, including unique “C” blocks and social media shares
  • Increased social media referral traffic by more than 6%

Business Solution: In response, Beacon created an infographic: a piece of visually entertaining content that conveys industry-relevant information and encourages sharing by bloggers and social media users. When a visitor shares it using the embed code placed immediately underneath the image, a link back to the client’s website is included.

Beacon wrote entertaining copy, created an enticing image, and began promoting it through various channels, including social media, the newsletter subscriber list, and infographic directories.

Results: Unique “C” blocks doubled from 220 to 440 in only 60 days! Total links increased by over 20,000, and external “followed” links increased by 30,000.

Not only did the infographic help replenish links lost through attrition; it was often shared on social media platforms, helping to boost brand awareness and reinforce our client as a thought leader in the industry.

When comparing the 60 days before and after the release of the info-graphic, visits from social media sites increased by 6.02%, visits via social referral improved by 5.47%, and conversions from social media sources improved 3.37%.

September 24th, 2015 | Case Studies