
The end of Google Authorship. What Next?

That’s it. Google Authorship. It’s all over! Google’s experiment with allowing authors to add their profiles to their content in search results has come to an end.

 

Launched in 2011, Google’s Authorship mark-up was billed as “a way to connect authors with their content on the web”.

 

By implementing authorship mark-up, content creators could have their author name and G+ profile picture appear in search results, next to their ranking content. Google started off big, marking up content from The New York Times, The New Yorker and The Washington Post, before announcing it open to all webmasters and authors.

 

However, in its three-year tenure, Authorship fell victim to low adoption rates and problems with correct implementation. Late last week, Google’s John Mueller announced on Google+ that the Authorship program was coming to an end and that, as a result, Google would no longer use any data associated with the ‘rel=author’ markup.

 

 

The History of Google Authorship

 

Mueller’s announcement comes as no surprise to most webmasters and bloggers. We saw reports in December 2013 that there had been a noticeable reduction in the number of author photos appearing in the search results. Further questions were raised in June as to the future of Authorship, when we saw the removal of all author photos and a controversial statement that images had no impact on click-through rates.

 

Before Mueller’s June announcement, Google had spoken only with confidence about the benefits and future of Authorship. Only a year ago, Google’s Tech Lead Maile Ohye said that the “Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic.”

 

In June 2013 Matt Cutts said: “I’m pretty excited about the ideas behind authorship. Basically, if you can move from an anonymous web to a web where you have some notion of identity.”

 

You can’t help but wonder: what changed for Google to backtrack on this positive outlook for the future of Authorship?

 

 

Why have they killed Google Authorship?

 

Google is relentless in its testing of search quality. Simply put – anything that doesn’t meet Google’s goals, doesn’t provide significant value to the user or doesn’t have sufficient adoption – gets the chop.

 

An in-depth article by Search Engine Land on the end of Authorship includes study data confirming low adoption rates among authors and webmasters. To highlight this, Search Engine Land references a 2012 Forbes study of the 50 most influential social media marketers, which reported that only 30% of them used authorship markup.

 

In addition to the low adoption rates, Google now claims that Authorship was of low value to searchers. In Mueller’s Google+ announcement in June, he stated that authorship had little impact on click-through behaviour. In his most recent announcement, Mueller goes as far as to say that the markup may even have distracted users from the results that displayed it.

 

What does this mean for the future? 

 

According to Mueller, removing authorship doesn’t generally seem to reduce traffic to sites, and if you leave the rel=author markup on your website, nothing will happen – Google will simply ignore that data from now on. It’s also worth noting that Google says signed-in users will still see the Google+ posts from their friends and pages that are relevant to their search query.

 

In terms of Google’s future plans, Mueller claims that:

 

“Going forward, we’re strongly committed to continuing and expanding our support of structured markup (such as schema.org). This markup helps all search engines better understand the content and context of pages on the web, and we’ll continue to use it to show rich snippets in search results.”

 

Google Authorship in principle remains a good concept – connecting content with its rightful owner online. I believe the abandonment of authorship signals a concept that hasn’t quite worked this time, but will return in another form. After all, Author Rank – which adjusts the ranking of a page depending on the authority of its author – still lives on.

 

So what should you do? 

 

In essence, continue to create high-quality content that helps to establish you as an authoritative author in your industry. If you share your content via Google+, continue to do so. And if you aren’t already considering implementing structured markup for your website, such as schema.org, then start. Schema can be used to mark up anything from your brand and business opening hours to your video content.
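For example, opening hours can be described with a JSON-LD block such as the sketch below. The business name, URL and hours are invented for illustration; the LocalBusiness type and openingHours property are defined by schema.org:

```html
<!-- Hypothetical example: schema.org LocalBusiness markup with opening hours, embedded as JSON-LD -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Cafe",
  "url": "http://www.example.com",
  "openingHours": ["Mo-Fr 08:00-17:00", "Sa 09:00-13:00"]
}
</script>
```

The same approach extends to other schema.org types, such as VideoObject for your video content.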

 

So has Google authorship just been a big waste of time?

 

Over to you: How do you feel about Google scrapping Authorship? Did you invest your time into Google Authorship? Leave your comments below.

 

 

 


Google Authorship downgrade: Is this the end of Google+?

Google’s John Mueller recently announced on Google+ that the company is removing authors’ profile pictures and Google+ circle counts from search results.

 

All that visibly remains in the search results is the clickable author name in the byline.

 

[Before and after screenshots: a search result with the author’s photo and circle count, and the same result reduced to a name-only byline]

 

 

Images c/o Searchengineland.com

 

 

An improvement to user experience?

 

Google said the changes will improve the user experience by returning a less cluttered search result across all devices – especially mobile search.

 

One of the main advantages of authorship is the positive impact that profile pictures in search results have on click-through rates. However, Mueller said the new picture-less results exhibited a ‘similar click-through behavior’ to the previous results shown with a profile picture.

 

Google updated the visual appearance of its search results earlier this year with a change of font and the removal of the title underline, so this recent visual update to authorship doesn’t seem completely out of place. However, there is speculation around the real motives behind Google’s recent changes to authorship.

 

 

Was authorship impacting AdWords click-through rates?

 

This eye-tracking research and the Moz.com authorship study support the claim that authorship pictures in search results improve click-through rates. Even Google has spoken about the positive impact of social annotations in search results, and its paid shopping ad program uses ads with images to attract clicks.

 

Mueller’s statement that click-through rates remained similar contradicts many of these previous studies, raising questions about whether authorship photos were negatively impacting AdWords click-through rates.

 

Moz.com founder Rand Fishkin put forward his opinion about the changes shortly after the announcement from Google.

[Tweet from Rand Fishkin, image c/o www.twitter.com]

 

 

Is this the end of Google+ as we know it?

 

Removing a significant feature and one of the biggest incentives of Google+ suggests potential changes are afoot. Authorship images were a huge incentive for marketers to join Google+; in fact, the potential to get your profile picture into the search results was probably one of the main reasons many people signed up and became active.

 

Removing one of the most notable features associated with Google+, coupled with the departure of Google+’s chief evangelist Vic Gundotra in April, could suggest that Google is evolving Google+ as a product, having finally come to terms with the fact that it cannot compete with Facebook in the social space.

 

Is there any value left in authorship?

 

Yes. It’s probable that some of the motive behind this change to authorship is related to improving the experience for people searching on mobile devices; however, that’s unlikely to be the whole story.

 

Google’s contradictory statement over the non-impact on click-through rates suggests that something bigger is happening in the background. However, I still believe it is worth verifying your content using the “rel=author” tag as a way of claiming your content contributions across the Internet. Plus, authors still get their name in the byline of the search results.
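For reference, the authorship markup discussed here was typically just a link from the page to the author’s Google+ profile. A minimal sketch, with a placeholder profile ID, looks like this:

```html
<!-- Sketch of the historical rel=author markup; the Google+ profile ID below is a placeholder -->
<link rel="author" href="https://plus.google.com/112233445566778899000"/>

<!-- or, as part of a visible byline: -->
<a rel="author" href="https://plus.google.com/112233445566778899000">Jane Author</a>
```

Verification also relied on the Google+ profile linking back to the site from its “Contributor to” section.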

 

While Google may see an impact on the number of people signing up for Google+ and implementing authorship, the company has said it is still committed to promoting authors with authority. How you establish yourself as a trusted authority in Google’s eyes may be evolving, but it isn’t going to disappear completely.

 

Therefore, if you are already participating in authorship you should continue, and mark up your content using the appropriate tag. If you aren’t actively using the authorship markup, then it should be a consideration for any future content you create.

 

Authorship is only one consideration for anyone looking to improve their visibility in search engine results. Search Academy’s Glass Hat technology can identify the other factors and what tasks you should focus on to bring you the best return. For more information contact Search Academy today.

 

YOUR SAY:  How do you feel about the Authorship change? Is this a good move or bad move by Google? Do you think your search results will suffer? Leave your comments below.

 

The SEO of JavaScript

Google has been trying to understand web pages better for a long time now. They have quickly moved from indexing textual content through to the execution of JavaScript code within a browser-like environment, with some notable limitations.  For Web developers and SEO professionals it can be difficult to identify the right approach for developing SEO-friendly JavaScript. Luckily, Google has provided us with some guidance:

 

  1. Reduce the complexity of scripts (make it easier for Googlebot)
  2. Ensure resources like JavaScript and CSS are not blocked in your robots.txt (see the sketch after this list)
  3. Ensure your server can handle crawl request traffic
  4. Apply Graceful Degradation development practices
  5. Avoid JavaScript that removes too much content
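On point 2, a common pitfall is a robots.txt file that blocks script and style directories wholesale, which stops Googlebot from rendering pages the way a browser does. The paths below are purely illustrative:

```
# Anti-pattern (illustrative): blocking script and style folders prevents
# Googlebot from rendering the page as a user would see it.
#   User-agent: *
#   Disallow: /js/
#   Disallow: /css/

# Crawler-friendly alternative: only block what genuinely must stay hidden,
# and leave JavaScript and CSS crawlable.
User-agent: *
Disallow: /private/
```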

 

Graceful Degradation vs. Progressive Enhancement

We have some great advice, but I would suggest we can improve it a little further by understanding why Google has recommended Graceful Degradation and why the alternative approach of Progressive Enhancement may achieve better results.

 

Why mention Graceful Degradation?

All browsers are different: they each sport different features, and sometimes common bugs, that make the display of content and the execution of JavaScript inconsistent.

 

The top four browser rendering engines:

  • Trident: IE
  • Gecko: Firefox
  • Webkit: Chrome, Safari, Apple & Android mobile devices, Opera (soon)
  • Presto: Opera, Opera mini

 

Display and JavaScript variation was very common a few years ago during the Browser Wars. Web developers still work very hard to ensure that the Web does not appear ‘broken’ across different browsers, often applying fixes of their own to mitigate browser bugs and work around missing features.

 

In order to explain the approaches taken to solve these problems and why Progressive Enhancement may improve SEO for JavaScript, we will need a quick summary of the techniques required for bulletproof Web design:

 

Graceful Degradation

Graceful Degradation is an older technique than Progressive Enhancement: it builds for the richer experience first and gradually degrades in less capable browsers. It is a principle of fault-tolerant design, enabling a system to operate as planned and continue to work at a reduced level within a less capable environment, instead of failing completely when a fault is encountered.

 

Progressive Enhancement

Progressive Enhancement was coined by Steven Champeon in a series of articles and presentations for Webmonkey and the SXSW Interactive conference between March and June 2003. Core functionality and content are prioritised (e.g. a user should be able to ‘add to basket’ without JavaScript, or be able to read important content without barriers.) Richer features are layered on as more browser capabilities are detected.
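As a small sketch of that principle (the basket URL and element ID below are invented), the link works for every visitor and crawler on its own, and JavaScript only layers richer behaviour on top when the browser is capable:

```html
<!-- Core functionality: a plain link that works without JavaScript -->
<a id="basket-link" href="/basket/add?item=123">Add to basket</a>

<script>
// Enhancement layer: only attach richer behaviour if the browser supports what we need
var link = document.getElementById('basket-link');
if (link && window.XMLHttpRequest && document.addEventListener) {
  link.addEventListener('click', function (e) {
    e.preventDefault();                      // keep the user on the page
    var xhr = new XMLHttpRequest();          // add the item in the background instead
    xhr.open('GET', link.href, true);
    xhr.send();
  });
}
</script>
```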

 

To achieve this, Web developers conceptually build escalators not lifts.

 

“An escalator can never break – it can only become stairs. You would never see an ‘Escalator Temporarily Out Of Order’ sign, just ‘Escalator Temporarily Stairs. Sorry for the convenience. We apologize for the fact that you can still get up there.’” – Mitch Hedberg.

 

Regressive Enhancement

This is a much newer technique that utilises all of the latest browser features (typically HTML5 and CSS3) and then replaces the missing functionality in less capable browsers (e.g. older versions of Internet Explorer) via JavaScript, using polyfills.
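A minimal sketch of the idea, assuming a hypothetical polyfill file for the HTML5 placeholder attribute, is to feature-detect and then load the patch only where it is needed:

```html
<script>
// Regressive Enhancement sketch: build for modern browsers, then patch older ones.
// 'placeholder-polyfill.js' is a hypothetical script that emulates the HTML5
// placeholder attribute in browsers that lack it (e.g. old Internet Explorer).
if (!('placeholder' in document.createElement('input'))) {
  var s = document.createElement('script');
  s.src = '/js/placeholder-polyfill.js';
  document.getElementsByTagName('head')[0].appendChild(s);
}
</script>
```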

 

Summary

All the techniques above are different strategies for applying ‘fault-tolerant design’ to make an unstable browser environment appear seamless and uninterrupted for users. This situation has improved greatly over recent years, with an impressive effort from Microsoft to make Internet Explorer – a much-maligned browser – more standards compliant.

 

Take Away

Progressive Enhancement, from the ground up, is a strategy for making content and core functionality available to the widest audience possible. This includes screen readers and devices not capable of handling JavaScript. In contrast to Graceful Degradation, Progressive Enhancement has a goal of ensuring core features are supported in the lowest capable target browsers.

 

This also means that Web sites applying this strategy will have a good chance of being found by search engine spiders that now understand JavaScript (not just Googlebot), because features are progressively layered based on capability.


Crunching the numbers: How to measure ROI from your SEO campaign

If you’re curious about exactly what kind of returns you can expect from an SEO campaign, we don’t blame you. The reason most people invest in SEO in the first place is conversions, sales and ultimately increased revenue, and SEO is proven to have one of the highest returns on investment of any marketing or advertising channel. The problem is that SEO is not an exact science, and measuring its specific ROI can get even more complicated. That said, SEO is not an immeasurable free-for-all, and there are some actions you can take to better calculate and predict your outcomes. They may just not be quite what you expected.

 

Think about the big picture

 

As much as we wish we could make accurate short, medium and long term predictions about the efficacy of each brand’s SEO campaign, in reality it’s just not that simple, at least in the short term. When deciding on which keywords to target, we take into account a variety of factors that basically come down to what we already know, what we have previously witnessed, and what we can expect from the future.

 

Of course, in practice this analysis is a lot more complicated, and it is because of the sheer number of variables involved that ROI in an SEO campaign is best tracked over the long term. It may take longer to get the results you are looking for, but these results will also be more valuable and more sustainable for your business.

 

Exclude branded keywords

 

If you are seeing an increase in traffic to your website, that’s generally good news no matter how you spin it. The trick is figuring out how much of this traffic is coming from branded keyword searches and how much is from non-branded, because it’s the latter that shows your SEO efforts are really starting to pay off.

 

In simpler terms, if your website is experiencing more traffic from keywords that are non-branded, it means that users searching for goods or services are landing on your website – without even looking for your company specifically. This in all likelihood means that your website is on the first SERP (Search Engine Results Page) for these keywords, which is great news.

 

You can review the non-branded keywords driving traffic to your website in Google Analytics. More detailed instructions can be found here.
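To make the branded/non-branded split concrete, here is a small sketch in plain JavaScript; the brand terms and keyword figures are invented, and in practice the keyword report would be exported from your analytics package:

```javascript
// Hypothetical sketch: split keyword traffic into branded and non-branded buckets
var brandTerms = /acme|acme widgets/i;   // assumed brand terms for an invented company

var keywordReport = [                    // invented sample data
  { keyword: 'acme widgets login', visits: 420 },
  { keyword: 'buy blue widgets online', visits: 180 },
  { keyword: 'widget price comparison', visits: 95 }
];

var branded = 0, nonBranded = 0;
keywordReport.forEach(function (row) {
  if (brandTerms.test(row.keyword)) { branded += row.visits; }
  else { nonBranded += row.visits; }
});

console.log('Branded visits:', branded, 'Non-branded visits:', nonBranded);
// Growth in the non-branded bucket over time is the signal that SEO is paying off.
```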

 

N.B. Direct traffic from branded search terms can also be influenced by organic searches, but that’s a story for another time.

 

Know the value of your customers

 

Or more importantly, know their value in relation to their CPA (Cost Per Action). If you are not making a profit from the value of your customers versus how much it takes to acquire them, then your SEO strategy needs an overhaul.
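As a quick worked example (every figure below is invented), the comparison is simply lifetime customer value against cost per acquisition:

```javascript
// Invented figures for illustration only
var averageOrderValue = 120;      // revenue per order
var ordersPerCustomer = 3;        // repeat purchases over the customer's lifetime
var grossMargin = 0.4;            // proportion of revenue kept as profit

var customerValue = averageOrderValue * ordersPerCustomer * grossMargin;  // 144
var costPerAcquisition = 90;      // what SEO spend works out to per new customer

console.log('Profit per customer:', customerValue - costPerAcquisition);  // 54
// A negative number here is the sign that the SEO strategy needs an overhaul.
```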

 

But remember our point about SEO being a long-term strategy, and don’t be afraid to invest a few extra acquisition dollars in order to attract good, rather than average customers who will contribute more to your business in the long run.

 

At the end of the day, tracking the dollars you are making from SEO is tricky, but not impossible. The key is to start thinking about the long term, big picture rather than attempting to calculate your ROI month by month. If you review your SEO campaign with this in mind, you will be in a better position to assess the real value it is bringing to your business.

Panda 4.0 – Is this latest Google update a Hummingbear?

The SEO world is again abuzz with speculation over the impact of two recent Google updates: Pay Day Loan 2.0 rolled out 20th May (more on this later) and Panda 4.0, which commenced rollout on 21st May.

 

As with the Hummingbird update, we applaud this latest Panda update because for SEO it means that efforts made towards the creation of good-quality web page content and well-designed information taxonomy will earn higher search engine visibility.

 

We are hoping this update is more of a “Hummingbear.”

With Hummingbird, Google sought to better understand the intent of a user’s search query and thereby improve their indexing of websites. Now, with the latest Panda [Bear] update, improved semantic search is more accurately matched to quality website content. I give you – a Hummingbear!

 

What we know        

  • Google updates its Panda algorithm regularly
  • The Panda algorithm is centred on content quality
  • This update is “Bigger”

 

 

What we think:

  • Google is broadening the meaning of “Poor Quality Content” to include more than the page copy.  We believe the metrics of ‘Content Quality’ will include user experience, information taxonomy, device responsiveness and more.

 

What is good quality content?

Here is a short checklist for you to use to evaluate the quality of your website’s content:

 

  • Is your webpage well written with clear text blocks and good use of headings and sub headings?
  • Is your web page rich in information for your audience, rather than advertising and links to other websites?
  • Do visitors stay on your web pages – indicating that your content is engaging?
  • Do you have content that visitors like to share?
  • Is the copy on each web page unique compared to the other pages on your site?
  • Is the content of each web page kept fresh and up to date?

 

Google Guide on Creating High Quality Websites

Here is a guide from Google that dates back to 2012 but remains a clear guide to which factors make a high-quality website and which are considered low quality.

 

http://googlewebmastercentral.blogspot.com.au/2012/04/another-step-to-reward-high-quality.html

 

If you are a client of Search Academy, your project manager will be happy to discuss this latest Google update with you.


How science can help solve SEO

At Search Academy, our belief is that search engine optimisation (SEO) is a scientific problem that needs a scientific solution. I have written before on how every industry behaves differently online, and how GlassHat technology identifies which SEO task to focus on next for the maximum return on investment. I am often asked to explain the underlying features of how our technology works, and today I want to explain some of GlassHat’s very basic principles, and why we built it to help guide us.

 

What is a regression model?

 

A regression model simply comes down to comparing historical data patterns to predict outcomes. In its most basic form, take the example of how the time it takes you to go somewhere is determined by how fast you are travelling. If I travel at 20km per hour, it will take me 2.5 hours (150 minutes) to travel 50km. If I travel twice as fast, it will only take me half that time (i.e. at 40km per hour it will take me 75 minutes).

 

This pattern continues – you can predict how long it will take you to travel at any speed because you have built a “regression model” of T = D/S, where T = Time, D = Distance and S = Speed. If that sounds complicated, let me show you by way of a diagram as below, where Speed is on the X-axis and the corresponding Time is on the Y-axis. For this analysis I used D = 50.
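The same relationship can be reproduced in a few lines of JavaScript, using the figures from the example above (D fixed at 50km):

```javascript
// T = D / S, expressed in minutes, for a fixed distance of 50 km
var distanceKm = 50;

[10, 20, 40, 50, 100].forEach(function (speedKmh) {
  var timeMinutes = (distanceKm / speedKmh) * 60;
  console.log(speedKmh + ' km/h -> ' + timeMinutes + ' minutes');
});
// 20 km/h -> 150 minutes and 40 km/h -> 75 minutes, matching the worked example above.
```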

 

 

How is regression modelling applied to SEO?

 

In SEO you can use indicative factors such as “number of external backlinks,” “how fast a website loads” or “how rich and fresh your content is” to determine how high you rank in search engines. If you can find a pattern between one of these factors and rank, you can make a judgement as to which factors will drive rankings for your website. Simple, eh?
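As a hedged sketch of that idea – the data points below are invented, and a real model would need far more than one factor – a simple least-squares line relating log(backlinks) to rank looks like this:

```javascript
// Invented sample: [log10(external backlinks), search rank] for a handful of sites
var data = [[1.0, 48], [2.0, 35], [2.5, 22], [3.0, 15], [3.5, 9], [4.0, 5]];

// Ordinary least squares: rank ≈ intercept + slope * log10(links)
var n = data.length, sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
data.forEach(function (p) {
  sumX += p[0]; sumY += p[1]; sumXY += p[0] * p[1]; sumXX += p[0] * p[0];
});
var slope = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
var intercept = (sumY - slope * sumX) / n;

console.log('Predicted rank with 10,000 links:', intercept + slope * 4);
// A negative slope suggests that, in this invented sample, more links go with a better (lower) rank.
```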

 

Why a simple regression model is not enough

 

Although this sounds great in principle, it’s far from that simple. There are many problems with this approach – consider the scatterplot below for an actual dataset from one day’s worth of data for a set of websites competing in the hospitality industry.

 

Although there is evidence of a pattern that shows more links cause a higher rank (red arrow), there are also a number of distinct problems:

 

1. NOISE – It’s obvious this isn’t the only factor that affects ranking for this category. We know of course that Google uses well over 300 factors to determine relevance, so we also know that this data point alone could not completely explain rank.

 

2. OUTLIERS – Considering that we are using a logarithmic scale to plot the number of links, the single data point at the top right represents a website with a very high volume of links but a comparatively low ranking. This could be caused by a Google penalty, and if so it shouldn’t influence our model. In the bottom left are two data points that rank very well but have a low number of links. What is the reason for this? Is it something we can’t replicate or explain?

 

3. INSUFFICIENT DATA –  Lacking in the model above is consideration for more keywords, the current rank of the website in question, the number of days of data, and how the keywords map to the website. Collecting and analysing more data complicates our problem.

 

4. NO CONSIDERATION FOR EFFORT – How much effort goes into creating links, anyway? Is it easier to create the first few links, but harder to create the rest? If so, at which point is this less effective than other methods?

 

5. NO CONSIDERATION FOR LIMITATIONS – External links can take time to build, and it’s generally not a good idea to build too many too soon, so there is a limitation on time. It’s not simply the case that we can move this data point without knowing what its limitations are. One of the questions we get asked the most is about restrictions, or difficulty in implementing SEO recommendations. Knowing what these limitations are enables us to understand the do-ability of moving each data point to the precise position.

 

6. THE MODEL CHANGES – We know that Google changes their algorithm all the time. What worked yesterday will not always work today or tomorrow. How do you stay on top of these changes and determine exactly how to spend your next SEO dollar?

 

Although this is not an exhaustive list, it’s easy to see how science can help guide SEO – but not without considerable challenges.

 

In future postings, I will explain these problems in more detail, and how we overcome them using our GlassHat technology platform.

Every industry behaves differently: Why there is no such thing as a ‘one size fits all’ SEO plan.


When it comes to search engine optimisation, there can be no “one size fits all” approach if businesses wish to succeed online. Using GlassHat technology, we have analysed hundreds of campaigns and thousands of keywords, yet we have still to find two sets of priority SEO factors that are statistically related to the same outcome. What works for one business may be vastly different from what works for another.

 

Recently, we analysed the importance of various priority factors in both the insurance and weight loss industries. The results[i] demonstrate the difference in the effect of various SEO factors on ranking for websites in these two industries. More specifically, we observed three key insights from this GlassHat analysis.

 

 

1.       Some factors have a similar effect in different industries

 

In both the insurance and weight loss industries, external backlinks from government websites have a positive effect on a similar number of websites. Evidently, there are some cases where a similar approach may be beneficial across more than one industry, and a streamlined SEO approach may be possible for some factors. However, this is only the beginning of the story.

 

2.       Some SEO factors carry more weight in different industries

 

 

For several of the factors we analysed, there was a markedly stronger positive effect on ranking in the insurance industry than in the weight loss industry. Factors such as having an ACRank of 5 – that is, having 24 or more domains linking to the webpage in question – had a significantly more positive effect in the insurance industry, as did having keywords in the anchor text. There may be similarities across industries in some cases, but frequently the impact of one SEO factor is much more pronounced in one industry than another.

 

3.       Other factors can have the opposite effect in different industries

 

For some of the SEO factors we analysed, what is true for one industry was not only different but the complete opposite for another. Specifically, we found that having keywords in the h2 tag has a positive effect on rankings for a large number of insurance websites. However, this same factor is not only less effective for weight loss websites, but in fact has a negative SEO effect on some of these sites, making it an overall poor SEO strategy for businesses in the latter industry.

 

What these results demonstrate is that when it comes to SEO and ranking well on the search engines, it always comes back to a custom plan for each individual business. To perform online, different actions must be taken in different industries. A “one-size-fits-all” approach cannot succeed given the variety of SEO factors and how differently they perform for websites in different industries.

 

Not only is the impact of different SEO factors vastly different between industries, but SEO itself is subject to frequent and often unpredictable change. Pinning down the SEO priorities and needs of different clients in different industries is an ongoing task, and one that will always require the right technology to keep up with this rapid rate of change.

 

 


[i] Results calculated using Spearman’s Correlation Coefficient using sample data for illustrative purposes during January to May 2014 – GlassHat 2014.
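For readers curious what that calculation looks like, here is a minimal sketch of Spearman’s rank correlation in plain JavaScript; the factor scores and rank positions are invented, and this simple formula assumes no tied values:

```javascript
// Spearman's rho for untied data: rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))
function spearman(x, y) {
  function ranks(values) {
    var sorted = values.slice().sort(function (a, b) { return a - b; });
    return values.map(function (v) { return sorted.indexOf(v) + 1; });
  }
  var rx = ranks(x), ry = ranks(y), n = x.length, sumD2 = 0;
  for (var i = 0; i < n; i++) {
    var d = rx[i] - ry[i];
    sumD2 += d * d;
  }
  return 1 - (6 * sumD2) / (n * (n * n - 1));
}

// Invented example: an SEO factor score vs. search rank position for six websites
var factorScore = [12, 30, 25, 60, 44, 70];
var rankPosition = [50, 28, 33, 9, 14, 4];
console.log(spearman(factorScore, rankPosition));
// -1 here: a perfect inverse relationship (higher factor score, better rank position)
```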

GlassHat: how it works and the benefits explained


 

 

After almost three years of hard work and more than two billion algorithmic calculations, our GlassHat team have released the above video to explain how our technology works.

 

Once all the mathematical jargon is out of the way, Search Academy’s offering provides three core benefits:

 

1.     We know which SEO task to focus on next

 

With limited time and resources, as well as barriers to implementation, most companies find the overload of search information difficult to navigate, to say the least. Not only do many businesses experience these obstacles to effective SEO, but most of the professional advice available relies on an individual’s opinion and judgement to figure out what to do. What this essentially boils down to is glorified guesswork, repackaged as expertise.

 

GlassHat technology removes this guesswork. We use a scientific model to compare real-world website rankings and their online footprint in order to understand why some websites rank above others. GlassHat translates this analysis into a prioritised, actionable activity plan specific to the target website and the keywords on which we are focussing. Furthermore, the GlassHat activity plan is based on up-to-date search engine algorithms, which can change up to several times per day.

 

2.     We provide you the expertise to complete SEO tasks correctly

 

The SEO world is overflowing with experts, but it’s often hard to know who you can trust and exactly what their limitations are. Frequently, we have found that most of the resources that companies need in order to take action in this space are already in place within their existing teams. The real value that can be gained from SEO is in identifying and implementing a broad set of activities, quickly.

 

Most SEO experts have a working knowledge across one or two of these activities or areas, but with the rate of change and breadth of topics, it is impossible to find all of the necessary skills in one person alone.

 

At Search Academy, we offer first-rate Project Management to ensure that each SEO plan is kept on track and to maximise the impact of our clients’ existing expertise, as well as identifying where more support is needed. We structure our internal teams around four key pillars of expertise – on site, off site, technical and client services – in order to enable fast, quality implementation at industry leading standards.

 

3.     Our technology continues to learn which tasks work best

 

After three years of development and six years in business, we have not only developed an unrivalled database of which tasks to implement and when, but we have also built technology that continues to learn and adapt as the search engine algorithms change. We systematically and continuously collect feedback on which tasks have the most impact, how long they take to implement and how relevant they are in certain situations or within specific limitations.

 

Additionally, we continue to test different approaches to tasks as determined by our pool of experts, in order to understand the most effective and efficient methods of White Hat SEO implementation. Importantly, we test these new approaches for efficacy, but never rely solely on one opinion to guide our methods.

 

If your business wishes to compete online and improve its presence on the search engines, there is no other approach that can yield faster, lower-risk results.

 

For more information, contact us today.

The state of SEO in 2014 – Search Engine Optimisation or Optimism?

 

In 2014, there is no greater, yet more underinvested, marketing opportunity than Search Engine Optimisation.

 

Search Engines allow marketers a unique opportunity not available via any other channel. It is the only place to accurately target a potential customer at the precise moment they are searching for goods and services. This means high conversion rates and low cost-per-acquisition in a market with growing customer demand.

 

Why is it then, that while over 80% of search engine users click on the organic results, Australian businesses are investing only 3% of their Search Marketing budget[i] in SEO? Google turned over $60 billion globally last year selling paid clicks, so there is no shortage of demand from advertisers for search engine traffic.

 

Most Australian businesses would like to spend more on SEO, so why don’t they?

 

The issue with SEO is risk – the business case doesn’t stack up because results are unpredictable, and it relies too much on one person’s opinion.

 

The search engine algorithm changes on average more than once per day, and behaves differently for every keyword you may wish to target. Google themselves have made the challenge even greater by sharing less individual keyword performance data, both in their research tools and via their Analytics platform. More of the algorithm is now dedicated to identifying strong natural language rather than keywords alone, as evidenced by the Hummingbird update. Additionally, many companies who have employed traditional SEO efforts have fallen foul of the dreaded Google penalty, whose impact can be enough to make a website all but impossible to find. Things are set to become harder still with Google’s growing interest in Artificial Intelligence resources such as DeepMind[ii].

 

It’s a frustrating situation for advertisers and SEO providers alike. What usually eventuates is advertisers relying on an SEO expert, who in turn uses a large mixture of tools and some gut instinct based on their experience in order to predict outcomes.

 

Search Engine Optimisation is hard to scale

 

Once an organisation gets good results from an SEO investment, the next issue becomes one of scale and sustainability. If it was a “guru approach” that helped the advertiser achieve a positive outcome, how does one repeat this process? Furthermore, the business risk is heightened due to the outcomes being reliant on an individual alone.

 

Other advertisers have wisely resorted to developing unique, fresh and rich content on their website, an undisputed strategy that every company can employ. The limitation of this approach is that Google looks at hundreds of factors when ranking websites, and content is only one consideration. At Search Academy, we have built proprietary technology specifically aimed at resolving this scientific problem. Having analysed hundreds of campaigns and many thousands of keywords, we have yet to find two sets of priority factors that are statistically related to the same outcome. Depending on your target market, you may need to address how fast the page should load, how much content should be off-page versus on-page, the importance of social media engagement, or even something else entirely.

[Figure: Example output using GlassHat to identify priority factors for an SEO campaign]

 

A single, linear approach to solving the SEO problem simply does not work.

 

 

So, with the undeniable size of the SEO opportunity, what must businesses do next?

 

The path to long-term, sustainable SEO results is by no means easy or uncomplicated. Search remains an area of rapid growth and change, and Google is well aware of this, as evidenced by its constant updates and harsh penalties for spammers. SEO will remain notoriously hard to track, scale and predict, but with such an opportunity available, the most uneconomical response is to do nothing at all. Too many advertisers continue to rely on SEO experts using imprecise strategies for what is essentially a scientific problem. Too often, these strategies land businesses in penalties, leaving them in a worse situation than before. It will only be possible to properly address the SEO needs of any business after doing away with the inexact opinions of gurus and employing logical, statistically sound answers. Just as the Google algorithm was built from the ground up using science, so too must our SEO solutions be.

 

 

What do these solutions look like?

 

Our next blog will be the first in our series of GlassHat Weekly Insights. We will explain how industries behave differently in search, and therefore have different SEO priorities. We will also look at how GlassHat technology shows us that there is no “one size fits all” approach when it comes to search.

 

Which SEO factors should one business focus on in order to rank well on the search engines? The answer is more complicated than you may think.



[i] Search Academy research “State of Search 2014” White Paper, January 2014. Excludes independent consultants, in-house head count, and other marketing efforts that can coincidentally assist SEO results.

Don’t feed the trolls: responding to negative feedback


 

It’s almost too easy for a few unhappy customers to send a business’s marketing department into a tailspin these days, and all it takes is a couple of well-placed negative reviews or Facebook comments. But while this sort of response can be extremely frustrating – and, you may think, unjustified – there’s nothing you can do to take back negative feedback once it has been preserved in the never-ending story that is the Internet. So, how to respond?

 

What you should definitely not do is react with anger and vitriol towards the negative reviewers – this will only fuel the fire. If you want any more proof of this, check out the hilariously unfortunate meltdown of the owners of Arizona restaurant Amy’s Baking Company Boutique & Bistro. They took to Facebook to (loudly) air their discontent after featuring in an episode of Gordon Ramsay’s Kitchen Nightmares and receiving a slew of negative feedback on Yelp and Reddit.

 

Their reaction – predictably – only made things much worse. Of course, any small business that isn’t quite up to speed on the dynamics of the Internet is ill-equipped to respond to a thousand faceless trolls, but there are a few things you can do to recover more quickly. And guess what? It all comes down to using your words. If your problem is only a few negative reviews or comments, don’t ignore them – respond! If a consumer has a genuine concern with your products or services, simply addressing and resolving it can turn a simmering, negative experience into a positive one for your brand, and maybe even win a return customer.

 

Of course, if your online problem is a little more widespread than just a few unhappy shoppers, a more coordinated strategy is going to be necessary. The mistake Amy’s Baking Company made was leaving themselves open to ridicule, both by cyber-shouting at their critics and, in doing so, coming across as a couple of Luddites. The lesson here is simple: whatever the catalyst happens to be, a widespread negative response to your brand represents widespread dissatisfaction – and that is not a badge you can afford to wear for long.

 

This means that you will have to own up to what has gone wrong with your business – whether you are at fault or not – and commit to changing it. How you go about achieving this depends on both what has caused this spike in negativity, as well as what kind of business you are running. A couple of brands that managed to recover from these types of dreaded disasters include Dominos and Nestlé. Dominos made a fast recovery after two employees posted a video of themselves, erm…violating certain health and safety regulations while preparing customers’ food. The video soon went viral, but Dominos managed to nip the negativity in the bud by quickly releasing an apology video, and taking legal action against the employees in question.

 

Nestlé’s problem was a Greenpeace video that criticised the company for using palm oil sourced from deforested areas. The problem only became worse when the food and beverage giant attempted to have the video taken down and deleted any negative Facebook comments. Unsurprisingly, the comments just kept coming with renewed fervour, and the video in question was uploaded multiple times by other YouTube users. Being such a major brand, Nestlé eventually managed to recover, but it took them almost 50% longer than the likes of Dominos. This can most likely be accounted for by their strategy of censoring, denying and censoring some more. In the age of the Internet, it has become impossible to truly eradicate anything that has been said about your company, ever. Once upon a time, complaints happened in the form of letters and phone calls that no one else was privy to, but those days are long gone.

 

So if your business runs into some unexpected online negativity – whether on a large scale or small – the best thing to do is to respond with the same level of professionalism and transparency you would bring to any other business venture. The Internet and social media have made business owners more accountable than ever to their customers, and whether this is a blessing or a curse for your brand is entirely up to you.
