
Power Up Online Visibility to Your University with SEO

June 17th, 2016 | Categories: Higher Education

Universities and colleges are beginning to realize the true value of SEO for higher education. Unfortunately, the SEO landscape is volatile and ever-changing. However, some best practices have been fairly consistent over the last couple of years. We believe these best practices will continue to be relevant throughout 2016 and will work to improve online visibility.

Keyword Research and Targeting Specific URLs 

Higher education institutions should embrace keyword research and use it to build relevant pages centered on the core programs they offer. To get started, conduct keyword research that will drive content creation. You can start with a free keyword research tool from SEObook, which pulls data from Google and Bing's databases. Simply input the term you want information on, and the tool returns monthly search volumes along with related terms.

[Image: keyword research for higher ed]

Next, you want to build a page around a theme of closely related keywords. These keywords should relate to the educational programs your university offers.

For example, while conducting some research I came across a few universities that kept important content within PDFs. Rather than a dedicated landing page, one university offered downloadable PDFs of its courses.

[Image: landing page example]


An alternative would be to build a landing page around this content and include some sort of conversion point. If you're not familiar with the term, a micro conversion is a small step a visitor takes on the path toward your website's primary goal.

Avoid Duplicate Content

Google estimates that 25-30% of the web is duplicate content. However, we can still take measures to avoid it. To get an idea of whether your site has duplicate content, simply use Siteliner to run a quick diagnostic.

[Image: identify duplicate content]

I can see this college has around 10% duplicate content, but the free version of this tool will only look at 250 pages. A way to fix duplicate content is to write unique copy on key pages and block unnecessary pages from being crawled by search engines with a robots.txt file. The robots.txt is a nifty text file you can create to instruct search engine robots on where and where not to crawl your site.

Get a Grip on Local Visibility 

[Image: local search visibility]


Colleges are not immune to the kinds of competition businesses face online. As you can see, competition is fierce: prospective students have a lot of options when deciding where to further their education. Local visibility, or local SEO, has risen in prominence over the last several years. But how can higher education institutions benefit?

Google has begun to provide more individualized search features. These personalized features are based on a person’s geographic location and search history. You can take advantage of this by:

Local Citation Corrections – Your first job is to make sure your college’s address is easily found on key pages. Then you need to ensure it is listed correctly throughout the web.

[Image: local citation reviews]


Local Content – Once your institution's citations are handled, it's time to produce local content. It's important that this content resonates with your audience. Maybe you build a page dedicated to 'College Life' and talk about activities around your college's campus.
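Circling back to citation corrections: one way to state your institution's name and address unambiguously, beyond listings alone, is schema.org structured data. Below is a minimal sketch using the real CollegeOrUniversity type; every value shown (name, address, phone number) is a placeholder, not a real institution:

```json
{
  "@context": "https://schema.org",
  "@type": "CollegeOrUniversity",
  "name": "Example University",
  "url": "https://www.example.edu/",
  "telephone": "+1-336-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Campus Drive",
    "addressLocality": "Greensboro",
    "addressRegion": "NC",
    "postalCode": "27401"
  }
}
```

Keeping this markup identical to the address on your contact pages and in your citations reinforces the consistency that local search rewards.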

Don’t Block Important Resources

Blocking content that is important to searchers is detrimental to your visibility in search engines. A common way important information gets blocked is with a robots.txt file.

A robots.txt file is a simple file that holds significant power: it tells search engines where they may and may not crawl when they visit a website. As you can imagine, it can be easy to accidentally block important resources you would otherwise want indexed. To check yours, simply visit "example.com/robots.txt", replacing "example.com" with your university's domain name.

[Image: robots.txt sample]


A good rule of thumb is to not block key pages from search engines with the robots.txt file. For example, you would not want to block important categories such as the Admissions, Academics, or Athletics sections.
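A minimal robots.txt along those lines might look like the following sketch; the disallowed paths are hypothetical examples, not recommendations for any particular site:

```
# Hypothetical robots.txt for a university site
User-agent: *
# Block low-value duplicates such as internal search results and print views:
Disallow: /search/
Disallow: /print/
# Note: key sections like /admissions/, /academics/, and /athletics/
# are deliberately NOT listed here, so they remain crawlable.

Sitemap: https://www.example.edu/sitemap.xml
```

Listing the sitemap at the end is optional but helps search engines find the pages you do want crawled.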

Keep in mind, if you use Google Site Search as your internal site search, blocking a section with the robots.txt file will also prevent it from showing up properly in your internal search as well.

Higher education institutions may not always take digital marketing or search engine optimization into consideration. However, no website is immune from the grasp of Google.

Take advantage of digital marketing and SEO to get in front of more prospective students. By conducting keyword research, avoiding duplicate content issues, and gaining more local visibility, you can reach a wider, more targeted audience.


Google’s New SERP Layout: How It Affects You

March 1st, 2016 | Categories: PPC

What Happened With The SERPS?

Starting February 18th, Google issued a worldwide roll-out of a new ad placement format: 4 paid ads now display at the top of desktop SERPs (search engine results pages) instead of the usual 3. In addition, Google is removing paid ads from the right-hand side of SERPs, with the bottom of the SERPs featuring 3 paid ads. These changes, once fully rolled out, will be active across all languages. The change isn't completely unexpected, as Google has been testing the four-ad format for months now, but we should see it take more permanent effect in the next few weeks. The one constant amid all of these changes is the Product Listing Ads (PLAs), which will remain atop the SERPs or on the now-free right-hand side.

How Does The SERP Change Affect Me?

In the industry, speculation abounds as to what effect these changes will have on:


  • Paid Search Cost: One of the main concerns from the change has been how much more expensive clicks will become with fewer ad spaces. Will advertisers with smaller budgets be unable to compete and fade away? So far, the results don't show this is the case. According to several early case studies, CPCs haven't shown any increase, and some industry experts argue that the increase in ad inventory at the top of the page will drive down CPCs.

[Image: new organic results]

  • Organic Visibility: Another cause for concern is that the space once held by organic listing #1 is now occupied by paid ad #4, meaning organic results and organic top performers have been pushed down the page and lose visibility. Organic real estate has been shrinking for years, thanks to features such as news, images, videos, local/map packs, the Knowledge Graph, and new ad formats and features like hotel/flight search. The conclusion is that results have not changed enough to start worrying or to make drastic changes, but if you have third-party tools or SEO management platforms to monitor position changes, you should use them to stay ahead of anything new.


  • Product Listing Ads: One of the biggest gains from the SERP change came in the form of Google Shopping ads. Product listing ads and the knowledge panel are the only listings that will appear on the right, beside the top four paid search ads. Early results suggest that the retained position on the right-hand side is helping PLAs attract a slightly higher CTR as well as a larger share of the paid clicks from the SERP. If you currently have PLAs, take advantage of them by expanding your product groups and enjoy the new exposure until competition creeps in. If you aren't taking advantage of PLAs and have shopping feeds, work with your developer and/or marketing team to create a Merchant Center account and tap into this prime retail space.


  • Search Volume: Many marketers have been concerned that fewer ads above the fold would translate into decreased impressions and search volume. Although some paid search users have complained of fewer impressions and clicks, this seems to be more an issue of seasonality and the normal shopping cycle than of the SERP change. Marketers focused on spots 3-4 have been the biggest winners from the SERP change, as impressions and clicks went through the roof, especially at the 3rd position (some are seeing CTR double or triple). The good news is that right-hand and bottom-of-page ads accounted for less than 15% of click volume before the changes, so the true impact isn't as great as some marketers would lead you to believe.

In Summary

The SERP change in February 2016 shook up the digital marketing world, as do all global changes Google makes to search engine results pages. The biggest winners were paid search advertisers, especially those in position 3, followed distantly by those in the brand-new position 4. Position 3 advertisers saw click-through rates increase over 20%, while position 4 saw increases of up to 15%. The biggest losers were advertisers in positions 5-11. Those unfortunate enough to remain in those positions, or to be bumped down into them, have seen significant losses of impression share and total click volume due to being evicted from the right-hand side of the SERPs.

It is still unknown whether organic results will win or lose, as data is inconclusive and more case studies need to be performed. Having moved down one more position, and possibly below the fold, businesses relying on organic listings may see drops in search volume and website sessions. If you currently rank high organically, look to your digital agency or marketing team for third-party tools or SEO-management platforms to help monitor position changes over the next 30 days.



RankBrain in 2016

January 18th, 2016 | Categories: SEO


Google has long used word frequency, word distance, and world knowledge based on co-occurrences to connect the many relations between words and serve up answers to search queries. But now, thanks to recent breakthroughs in language translation and image recognition, Google has turned to powerful neural networks that can connect the best answers to the millions of search queries Google receives daily.

RankBrain is the name of the Google neural network that scans content and is able to understand the relationships between words and phrases on the web.

Why is it better than the previous methods? In a nutshell, RankBrain is a better method because it is a self-improving deep-learning system. Training itself on pages within the Google index, RankBrain examines the relationships between search queries and the content contained within the Google index.

How does it do this? Neural networks are very good at reading comprehension based on examples and at detecting patterns in those examples. Google's vast database of website documents provides the necessary large-scale training sets. During training, Google converts key phrases or words into mathematical entities called vectors, which act as signals. RankBrain then runs an evaluation similar to a cloze test. A cloze test is a reading comprehension activity where words are omitted from a passage and then filled back in. With a cloze test there may be many possible answers, but ongoing training on a vast data set allows for a better understanding of the linguistic relationships between these entities. Let's look at an example:

The movie broke all (entity1) over the weekend.

Hollywood’s biggest stars were out on the red carpet at the (entity2) premiere.

After deciphering the intricate patterns of the vectors, RankBrain can deliver an answer to a query such as "Which movie had the biggest opening at the box office?" by using vector signals from entities that point to the search result entity receiving the most attention. It does this without any specific coding, rules, or semantic markup. Even for vague queries, the neural network can outperform humans.
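To make the vector idea concrete, here is a toy sketch with invented three-dimensional vectors (real systems learn vectors with hundreds of dimensions, from data rather than by hand). Cosine similarity scores related terms as close and unrelated terms as distant:

```javascript
// Toy illustration: words as vectors, compared by cosine similarity.
// These 3-dimensional vectors are made up for the example.
const vectors = {
  "premiere":   [0.9, 0.8, 0.1],
  "box office": [0.8, 0.9, 0.2],
  "admissions": [0.1, 0.2, 0.9],
};

// Cosine similarity: dot product divided by the product of the lengths.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na  += a[i] * a[i];
    nb  += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Movie-related terms land close together; unrelated terms do not.
console.log(cosine(vectors["premiere"], vectors["box office"])); // high, ~0.99
console.log(cosine(vectors["premiere"], vectors["admissions"])); // low, ~0.30
```

The same geometry is what lets a system map a never-before-seen query onto familiar, well-understood entities.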

With RankBrain, meaning is inferred from use. As RankBrain’s training and comprehension improves, it can focus on the best content that it believes will help answer a search query. As a result, RankBrain can understand search queries never seen before. In 2016, be prepared to provide the contextual clues that RankBrain is looking for.


Case Study: Medical School Uses Google’s Async Code to Provide Accurate and Flexible Tracking

July 14th, 2015 | Categories: Case Studies

Updating Website to Asynchronous Tracking Code from Traditional Tracking Code Follows Best Practices, Improves Site Performance, and Allows for Greater Flexibility in Code Installation.

Challenge: Client was re-launching website in a new format which offered an opportunity to update the site’s existing traditional Google Analytics snippet to the asynchronous tracking snippet. In addition to staying on the cutting edge with Google’s best practices, this was done to improve site speed and also produce more accurate results. This would also provide Beacon some added flexibility in where the snippet could be placed and still perform at optimal levels.

  • Client Facts:
      • For-profit medical & veterinary school
      • Valued at $115M in 2008
      • Faculty size greater than 200
      • 2,000+ student body hailing from 24 countries across six continents
  • Goals:
      • Update the website with the asynchronous GA tracking code for optimal flexibility
      • Improve site performance in both speed and accuracy with the installation
  • Installation Notes:
      • Tracking code moved into the header file for consistency and easier maintenance
      • Virtual pageviews and events updated to track users who completed a predetermined step or goal
  • Results:
      • Client's tracking script adheres to Google's best practices and works at the optimal load time and accuracy levels available through GA


Business Solution: Previously, the Google Analytics tracking code for the client's website was located just before the closing </body> tag at the end of the page. This was considered an ideal location for the JavaScript snippet because it gave the page an opportunity to load before the snippet fired. However, due to the structure of the client's website, the code had to be included in a number of template files to render consistently across every page. Any update to the code required changes to all of these files, which increased the potential for errors.

Google's asynchronous tracking code offers greater flexibility than the traditional snippet. When the client re-launched its site on a new Drupal platform, it was a great opportunity to upgrade to the new code, which could be placed in a universal header file and still function with greater accuracy than the previous version. From this point forward, any change to the code could be made once and applied site-wide.

Changes to the actual code involved adjusting to the new format. The traditional code used the pageTracker object for its tracking operations; the new async code is built around the _gaq.push function. Once these changes were made to the primary tracking snippet, the final task was to go through the site's pages and ensure that any virtual pageviews or event tracking were updated to the new code as well. These are instances where specialized tracking snippets are inserted at the point of action to record specific events in Google Analytics. Every instance previously tracked under the traditional snippet was replaced. Beacon was able to conduct a global find-and-replace to work more efficiently and minimize client hours used.
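As a sketch of the migration, the legacy async snippet queues commands in a global _gaq array that ga.js processes once it loads. The property ID and the virtual pageview path below are placeholders for illustration, not the client's actual values:

```javascript
// Legacy Google Analytics asynchronous snippet: commands are queued
// in the _gaq array and processed once ga.js loads.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']); // placeholder property ID
_gaq.push(['_trackPageview']);            // standard pageview

// Virtual pageview for a goal step that has no real URL
// (hypothetical path used for illustration):
_gaq.push(['_trackPageview', '/virtual/apply-complete']);

// The traditional snippet's equivalent call, for comparison:
//   pageTracker._trackPageview('/virtual/apply-complete');
```

Because commands are queued rather than executed immediately, the snippet can safely live in a universal header file and run before the page finishes loading.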

Results: The adjustments to the asynchronous snippet were made without issue. The client also appreciated that the upgraded tracking code would lead to quicker load times and better accuracy for its Google Analytics data. The new site invested heavily in a number of Flash elements, so load time for all installed scripts was critical to ensure a high-quality user experience. In terms of accuracy, the client serves students from all over the world, and having accurate data on segments from each region helps determine marketing strategy and financing for various countries.


Beacon goes to Moogfest 2014

May 2nd, 2014 | Categories: Beacon News

Beacon had the opportunity this past weekend to participate in Synthesis, an event by NCTA, at Moogfest in Asheville, NC. This portion of Moogfest was a high energy tech expo featuring a wide range of technology companies. Live entertainment was there, in true Moogfest spirit, and various speakers from the tech field presented throughout the day.

Beacon was able to participate in the various Moogfest workshops and presentations as well as enjoy all the live music the festival had to offer. The one thing that united all the performers and speakers throughout Moogfest was the use of technology and innovation in their art and music. It was fascinating to see how technology is being used in both audio and visual experiences and to envision where these technologies will take us in the future. Excitement about interactivity and communication was a common theme throughout Moogfest.

At the Google Creators Panel Discussion, the Google team talked about creating tools for mobile devices that allow users to create their own music. Mobile apps are giving people easier access to all sorts of tools, breaking down barriers, and giving them new ways to create and learn. The team also showed a project that combined data from Google Maps with images and music to create an audio/video experience personalized to the user's childhood neighborhood, taking the music in the project to a new personal level. I think personalization and interactivity are always marketable things to integrate into a software product. When you can build a relationship between the user and the experience, there is an attachment that helps build loyalty, and this ties back to user experience and brand loyalty.

Moogfest is mostly about music, but there was definitely a lot to take away about design and user experience. NCTA's Synthesis event was a great avenue for people to see how technology is applied across a wide range of areas. A big thank-you goes out to NCTA for putting on this event and bridging the gap between technology in the business world and technology in the music world.



Social Media Best Practices

May 2nd, 2012 | Categories: Social Media

Now that we have been practicing social media for some time, we are aware of good (and bad) ways to employ it. Opinions will vary about the "best of the best"; that is the beauty of the web: it always changes. From a book summary I read on social media, here are some of the best practices according to the author at this point in time. This is not an exhaustive, all-inclusive list, but it covers the basics and should align with your current techniques for using social media effectively.

There are multiple platforms* available for your content, and they should be used relative to the target audience. Having likeable content is a fundamental criterion for success. Get your customer to listen to you, and then….

  1. Listen first, and never stop listening – You want to know what customers think. Ask and they will tell you. Next, the most important thing to do is listen. Closing the loop by acting on what your customers tell you will prove that you not only listened, but that you understand and can do something about it.
  2. Define your target audience better than ever – There are many tools that allow you to focus on your true demographics for your product/service(s). Define them and determine what will make them “like” your content.
  3. Think – and act – like your consumer – Remember, it’s about them, not you. Don’t sell them; instead, provide content that is of interest to them. Get them talking about topics of interest and find ways to integrate your wares into their lifestyle.
  4. Invite your customers to be your first fans – Word of mouth (WOM) is key here. The more likes you get, the better your credibility. Be clear about your value proposition and define what is in it for them. Remember, there is no value-add if only your employees are interacting with your content.
  5. Create true dialogue with, and between, your customers – Related to listening and being genuine. Get them talking about you to leverage the WOM effect. When your customers share tips and tricks with others, it proves they are engaged. It also saves you from providing customer support directly. Help guide the discussion by acknowledging comments – and, correct where needed.
  6. Be Authentic – Get connected by demonstrating an interest in your customers. Personalize it by including your name.
  7. Be honest and transparent – You can spot a phony a mile away. Your customers can too.
  8. Integrate social media into the entire customer experience – Another fundamental for success and cannot be stressed enough. Make sure everyone who interacts with your customers has the same message and is aware of promotions and specials. Regardless of how they find you, it should be a consistent message. The last thing you want is a disconnect among channels and mismanaged expectations from your customers. If they are online, they can tell their network about you – the good and the bad.
  9. Don’t sell! Just make it easy and compelling for customers to buy – They already found your content and are engaged. Don’t insult them with a bland sales pitch. State the (relative) value proposition clearly and make it easy to “Add to Cart”.

Having a dialogue with your customers is easy using a social media platform. I would add that you keep in mind how you want to be treated. After all, we all are consumers in the end.

*Platforms include Facebook, Twitter, YouTube, Foursquare, LinkedIn, Google+, blogs, and specialized networks (e.g., Flickr, Yelp).



Google Correlate

January 5th, 2012 | Categories: SEO

As a software development project manager at Beacon, I’m also proud to say that I’m both an NPR and data geek, so I was elated to hear a story this week that united all of my passions:  Google Searches Are A Window Into Our Culture.  The tool “Google Correlate” is actually a fascinating window into how people are searching for not only one specific term, but an entire web of other related (or maybe not-so-related) terms.

The political example given in the story was somewhat predictable (Democrats: veggie-loving fitness buffs; Republicans: meat-loving weight-loss program participants), but my own searches turned up some interesting results on Google Correlate.  I am just starting work on a new website redesign for a well-known business school and was wondering what kind of associations I'd find using terms related to that school (thinking I might be able to use this information with regard to site design and features).  Here are the terms I tried:

  • business school— while many of the U.S.’s top business schools are listed, I was surprised to see the appearance of “art schools” and “art colleges” as correlating terms.  Wonder if my client has considered cross-promotion with this demographic?  Could a more “artsy” site design have benefits in this area?
  • management school— like the "art" association listed above, I was surprised to find "hospital association" as a correlation with "management school."  Perhaps another marketing opportunity here?  Would a site feature that included possible hospital careers be helpful to these visitors?
  • mba school— oddly, this was a much more common search term in Utah than any other state.  Not sure how we can leverage this, but I’m sure we’ll bat it around for a while!


Also, don't miss the comic book on the Google Correlate site – fun!  The most important point the comic book emphasizes bears repeating here: "Remember: correlation is not causation."  Google doesn't attempt to explain the correlation between terms; it just shows it to us to interpret and leverage.  Happy correlating!


Google search, map and mail tips and tricks

September 1st, 2011 | Categories: SEO

Found a super article on "How to Google Like Google Googles" at PCWorld today.  Here are a couple of my favorite tips:

  1. It is a constant battle with my middle and high school students to get them to use “authoritative sources” for school work (“but Mooooom, EVERYONE uses Wikipedia and my teachers don’t care!”).  Having seen for myself the misinformation purposely posted on Wikipedia, I still insist on .edu, .gov, etc. sites for research and this tip makes that a bit easier (though the battle rages on…)

    Search certain types of sites or just certain sites. You can restrict a search to a particular type of site with the site: operator. For example, [penguins site:.edu] searches for penguins across all .edu sites, and [crater image site:nasa.gov] searches for crater images across NASA.gov.

  2. Would have been soooo helpful on my three day, agonizing move across the country last year with the dog, two distraught teenagers and a dying minivan:

    Find hotel prices directly on Google Maps. No more copying and pasting the address from one site into a map to see its location–for several major cities in the United States, you can easily see nightly rates when you search for hotels in Google Maps. Try it now: Search for a “hotel in Los Angeles” on Google Maps

  3. Not a particularly helpful tip, but makes you really want to be a “Google Master” doesn’t it?  Or is that just me???

    Gmail is a very deep program, with too many tips and tricks to list in this article. In fact, Google categorizes its Gmail user tips into four stages–white belt, green belt, black belt, and master. The tips for each belt can be found at Google's "Become a Gmail ninja" site. There's even a printable guide; after all, even ninjas forget their moves once in a while.


Beacon Technologies Through the Eyes of an Intern – Week 8

July 8th, 2011 | Categories: Social Media

Eight weeks down, two more to go.  The past two months have really gone by quickly.  I spent some time today reflecting on what I have done and learned so far here at Beacon.  I've learned a lot, but I realize there is still a lot I don't know.  Since this week was the start of a new month, I spent a lot of time transitioning the accounts I was working on to other members of the WMS team.  This involved some meeting time and talking about what I had been doing, what I planned to do, and what steps could come next.  The other major thing I did this week was compile monthly reports for the clients I had been covering.  Like I mentioned a few weeks back, that's not the most glamorous task, but it's really not so bad.  I found it very rewarding to see growth in the clients I covered and to see things I had done start to show results.

The other big thing I did this week was to sit down and map out a final two week plan to help market Beacon itself.  I’m excited to be getting into this since my background from undergrad is marketing.  Some of the things I am going to be doing involve PPC campaigns, setting up various tracking measures, creating possible promotions, and a few other ideas.  This will be fun.

On a side note, I have been doing a lot of work with social media for several clients as well as for Beacon, and the more involved I get, the more I learn.  I’ve always been comfortable with Facebook, but I never really have had much exposure to outlets such as blogs, Twitter, and Foursquare.  That has changed during my time at Beacon.  I’ve learned how to utilize various outlets to accomplish different tasks.  For instance, I’ve learned that using Twitter can be very valuable for interacting with customers and is a great tool for promotional contests.  Another thing I learned is that Foursquare, which is a location based check-in service, is great for driving foot traffic into a business.  The way that is done is by first setting a location for the business within Foursquare.  Then you can set up options where special offers will pop up on someone’s cell phone if they are running the Foursquare app and they are within a specified geographical area of your business’s location.

The final thing I want to talk about relates to social media as well.  The “new kid on the block” is Google+.  I was able to get an invite to join Google+ today.  For those who are unfamiliar with Google+, it basically is a social media outlet similar to Facebook.  There are subtle differences between the two that I have observed, but overall it seems to be more or less the same.  The concept is almost the same as Facebook, and the only real differences at the moment are that Google+ calls features by different names than they are called within Facebook.  I’m not entirely sure if it’s something I’ll stay with but I’m willing to give it the “old college try.”


Beacon Technologies Through the Eyes of an Intern – Week 4

June 10th, 2011 | Categories: Social Media

This week has gone well.  Along with the weekly meeting, I was able to participate in a couple of other meetings.  One was with a client I am working on.  It was a standard "checking in" type of meeting.  I was able to hear some ideas the client would like to see worked on in the near future.  As part of that discussion, I was able to view some work for other clients that I was unaware of, and that gave me some good ideas for features I may bring up with my other clients that could potentially use the same kind of thing on their websites.  I also sat in on a brief training session on a tool for unifying social media efforts.  Hootsuite allows you to manage social media updates and accounts from a centralized location.  You can connect Facebook and Twitter and monitor activity on each, as well as schedule updates to release when you want rather than the moment you type them up.

As far as the type of work I did this week, I mostly worked on finishing up monthly reports and getting them sent out.  Once that was finished, I switched over to keyword discovery.  I spent some time learning how to find new keywords, as well as how to see how websites rank for keywords they are targeting versus not targeting.  I did this for a couple of clients, which gave me a chance to approach it from different perspectives.  The other thing I did in relation to keywords was to try to determine how difficult it would be to rank for a keyword based on the current content of the website and the amount of competition for the keyword.  This was interesting because I was not aware of everything that goes into determining how a website ranks for a keyword.  Some of the important areas, which are kind of common-sense ideas, are having the keyword appear in the text of the URL, the title of the page, and/or the content of the page.

In closing, my productivity this week fought a long battle with my ability to be distracted and my ability to be amused by simple things.  I blame Google for this.  As most people are likely aware by now, Google created a special logo to honor Les Paul’s 96th birthday.  The logo was a playable guitar that even let people record short songs if they were in the US.  Google kept the logo up today, Friday, even though Les Paul’s birthday was Thursday.  This is by far the best Google logo I have seen.  The second best was the playable version of Pac-man.  I did manage to get several tasks completed this week, but I also managed to rock out from time to time.  While I am not an overly talented Google guitarist, I did manage to find several YouTube videos of people who are good.  Just head over to Google and search “Google guitar” and you should be able to find them.
