
Google Authorship downgrade: Is this the end of Google+?

Google’s John Mueller recently announced on Google+ that the company is removing authors’ profile pictures and Google+ circle counts from search results.


All that visibly remains in the search results is the clickable author name in the byline.





An improvement to user experience?


Google said the changes will improve the user experience by returning a less cluttered search result across all devices – especially mobile search.


One of the main advantages of authorship is the positive impact that profile pictures in search results have on click-through rates. However, Mueller said the new picture-less results exhibited a ‘similar click-through behavior’ to the previous results shown with a profile picture.


Google updated the visual appearance of its search results earlier this year with a change of font and the removal of the title underline, so this recent visual update to authorship doesn’t seem completely out of place. However, there is speculation around the real motives behind Google’s recent changes to authorship.



Was authorship impacting AdWords click-through rates?


This eye-tracking research and the authorship study support the claim that authorship pictures in search results improve click-through rates. Google itself has spoken about the positive impact of social annotations in search results, and includes ads with images in its paid shopping ad program to attract clicks.


Mueller’s statement that click-through rates remain similar contradicts many of these previous studies, raising questions around whether authorship photos were negatively impacting AdWords click-through rates.


Moz founder Rand Fishkin put forward his opinion about the changes shortly after the announcement from Google.





Is this the end of Google+ as we know it?


Removing a significant feature and one of the biggest incentives of Google+ suggests potential changes are afoot. Google authorship images were a huge incentive for marketers to join Google+; in fact, the potential to get your profile picture into the search results was probably one of the main reasons many people signed up and became active.


Removing one of the most notable features associated with Google+, coupled with the departure of Google+’s chief evangelist Vic Gundotra in April, could suggest that Google is evolving Google+ as a product, having finally come to terms with the fact that Google+ cannot compete with Facebook in the social space.


Is there any value left in authorship?


Yes. It’s probable that some of the motive behind this change to authorship is related to improving the experience for people searching on mobile devices; however, it’s unlikely to be the whole story.


Google’s contradictory statement over the non-impact on click-through rates suggests that something bigger is happening in the background. However, I still believe it to be worth verifying your content using the “rel=author” tag as a method of claiming your content contributions across the Internet. Plus, authors still get their name in the byline of the search results.


While Google may see an impact on the number of people signing up for Google+ and implementing authorship, the company has said it is still committed to promoting authors with authority. How you establish yourself as a trusted authority to Google may be evolving, but it isn’t going to disappear completely.


Therefore, if you are already participating in authorship you should continue, and mark up your content using the appropriate tag. If you aren’t actively using the authorship markup, it should be a consideration for any future content you create.
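At the time of writing, the markup itself is a single link attribute; a sketch (the profile URL and author name are placeholders):

```html
<!-- On each article, link the byline to the author's Google+ profile: -->
<a rel="author" href="https://plus.google.com/[your-profile-id]">Jane Author</a>
```

The Google+ profile should also link back to the site from its “Contributor to” section, so the claim is verified in both directions.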


Authorship is only one consideration for anyone looking to improve their visibility in search engine results. Search Academy’s Glass Hat technology can identify the other factors and what tasks you should focus on to bring you the best return. For more information contact Search Academy today.


YOUR SAY:  How do you feel about the Authorship change? Is this a good move or bad move by Google? Do you think your search results will suffer? Leave your comments below.


The SEO of JavaScript

Google has been trying to understand web pages better for a long time now. They have quickly moved from indexing textual content through to the execution of JavaScript code within a browser-like environment, with some notable limitations.  For Web developers and SEO professionals it can be difficult to identify the right approach for developing SEO-friendly JavaScript. Luckily, Google has provided us with some guidance:


  1. Reduce the complexity of Scripts (make it easier for Googlebot)
  2. Ensure resources like JavaScript and CSS are not blocked in your robots.txt
  3. Ensure your server can handle crawl request traffic
  4. Apply Graceful Degradation development practices
  5. Avoid JavaScript that removes too much content
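Point 2 is worth checking first: a single Disallow rule can hide your scripts and styles from Googlebot. A hypothetical robots.txt that keeps rendering resources crawlable:

```
# Hypothetical robots.txt: keep JavaScript and CSS crawlable
User-agent: *
Disallow: /admin/
# Avoid rules like these, which block rendering resources:
# Disallow: /js/
# Disallow: /css/
```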


Graceful Degradation vs. Progressive Enhancement

We have some great advice, but I would suggest we can improve it a little further by understanding why Google has recommended Graceful Degradation and why the alternative approach of Progressive Enhancement may achieve better results.


Why mention Graceful Degradation?

All browsers are different: each sports different features, sometimes with common bugs, making the display of content and execution of JavaScript inconsistent.


The top four browser rendering engines:

  • Trident: Internet Explorer
  • Gecko: Firefox
  • WebKit: Safari, Apple and Android mobile devices
  • Blink (a fork of WebKit): Chrome, Opera (Opera Mini still uses Presto)


Display and JavaScript variation was very common a few years ago during the Browser Wars. Web developers still work very hard to ensure that the Web does not appear ‘broken’ across different browsers, often applying fixes of their own to mitigate browser bugs and work around missing features.


In order to explain the approaches taken to solve these problems and why Progressive Enhancement may improve SEO for JavaScript, we will need a quick summary of the techniques required for bulletproof Web design:


Graceful Degradation

This is an older technique than Progressive Enhancement: it builds primarily for a richer experience, which gradually degrades in less capable browsers. It is a principle of fault-tolerant design, enabling a system to continue working at a reduced level in a less capable environment, instead of failing completely when a fault is encountered.


Progressive Enhancement

Progressive Enhancement was coined by Steven Champeon in a series of articles and presentations for Webmonkey and the SXSW Interactive conference between March and June 2003. Core functionality and content are prioritised (e.g. a user should be able to ‘add to basket’ without JavaScript, or be able to read important content without barriers.) Richer features are layered on as more browser capabilities are detected.
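As a rough sketch of this layering in JavaScript (the capability flags below are stand-ins for real browser feature detection, such as checking for `window.fetch`):

```javascript
// Progressive Enhancement in miniature: core features are always present;
// richer ones are layered on only when a capability is detected.
function buildExperience(capabilities) {
  // Core experience: works with no JavaScript at all (plain HTML links/forms).
  const features = ['readable content', 'add-to-basket form'];

  if (capabilities.javascript) {
    features.push('inline basket updates'); // an enhancement, not a requirement
  }
  if (capabilities.geolocation) {
    features.push('nearby store suggestions');
  }
  return features;
}

// A text-only crawler or screen reader still gets the core experience:
console.log(buildExperience({}));
// A modern browser gets every layer:
console.log(buildExperience({ javascript: true, geolocation: true }));
```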


To achieve this, Web developers conceptually build escalators, not lifts.


“An escalator can never break – it can only become stairs. You would never see an ‘Escalator Temporarily Out Of Order’ sign, just ‘Escalator Temporarily Stairs. Sorry for the convenience. We apologize for the fact that you can still get up there.’” – Mitch Hedberg


Regressive Enhancement

This is a much newer technique that utilises all of the latest browser features (typically HTML5 and CSS3), then recreates any missing functionality in less capable browsers (e.g. older versions of Internet Explorer) via JavaScript, using polyfills.
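The polyfill pattern at the heart of this approach is straightforward: detect the missing feature, then patch it in so the rest of the code can assume the modern API. A minimal sketch:

```javascript
// Regressive Enhancement in miniature: write against the modern API,
// and backfill it only where the environment lacks it.
if (!Array.prototype.includes) {
  Array.prototype.includes = function (value) {
    return this.indexOf(value) !== -1; // older, universally supported API
  };
}

// Application code can now call the modern method unconditionally,
// whether it is native or polyfilled:
const engines = ['Trident', 'Gecko', 'WebKit'];
console.log(engines.includes('Gecko'));
```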



All the techniques above are different strategies for applying ‘fault-tolerant design’ to make an unstable browser environment appear seamless and uninterrupted for users. This situation has improved greatly over recent years, with an impressive effort from Microsoft to make Internet Explorer – a much-maligned browser – more standards-compliant.


Take Away

Progressive Enhancement, from the ground up, is a strategy for making content and core functionality available to the widest audience possible. This includes screen readers and devices not capable of handling JavaScript. In contrast to Graceful Degradation, Progressive Enhancement has a goal of ensuring core features are supported in the lowest capable target browsers.


This also means that Web sites applying this strategy will have a good chance of being found by search engine spiders that now understand JavaScript (not just Googlebot), because features are progressively layered based on capability.


Crunching the numbers: How to measure ROI from your SEO campaign

If you’re curious about exactly what kind of returns you can expect from an SEO campaign, we don’t blame you. The reason most people invest in SEO in the first place is for conversions, sales and ultimately increased revenue, and SEO is proven to have one of the highest returns on investment of any marketing or advertising channel. The problem is that SEO is not an exact science, and actually measuring its specific ROI can get even more complicated. That said, SEO is not an immeasurable free-for-all, and there are some actions you can take to better calculate and predict your outcomes. They may just not be quite what you expected.


Think about the big picture


As much as we wish we could make accurate short-, medium- and long-term predictions about the efficacy of each brand’s SEO campaign, in reality it’s just not that simple. When deciding which keywords to target, we take into account a variety of factors that basically come down to what we already know, what we have previously witnessed, and what we can expect in the future.


Of course, in practice this analysis is a lot more complicated, and it is because of the sheer amount of variables involved that ROI in an SEO campaign is best tracked over the long term. It may take longer to get the results you are looking for, but these results will also be both more valuable and more sustainable to your business.


Exclude branded keywords


If you are seeing an increase in traffic to your website, that’s generally good news no matter how you spin it. The trick here is figuring out how much of this traffic is coming from branded keyword searches and how much is from non-branded, because it’s the latter that means your SEO efforts are really starting to pay off.


In simpler terms, if your website is experiencing more traffic from keywords that are non-branded, it means that users searching for goods or services are landing on your website – without even looking for your company specifically. This in all likelihood means that your website is on the first SERP (Search Engine Results Page) for these keywords, which is great news.


You can review the non-branded keywords driving traffic to your website in Google Analytics. More detailed instructions can be found here.
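If you prefer to script the split yourself from an exported keyword report, it comes down to a simple filter. A sketch with made-up data (‘acme’ stands in for your brand name):

```javascript
// Hypothetical rows exported from an analytics keyword report.
const rows = [
  { keyword: 'acme widgets', visits: 120 },        // branded
  { keyword: 'buy widgets online', visits: 300 },  // non-branded
  { keyword: 'acme widget reviews', visits: 80 },  // branded
  { keyword: 'cheap widgets sydney', visits: 150 } // non-branded
];

const brandTerms = ['acme']; // include common misspellings of your brand too

const isBranded = (keyword) =>
  brandTerms.some((term) => keyword.toLowerCase().includes(term));

const nonBranded = rows.filter((row) => !isBranded(row.keyword));
const nonBrandedVisits = nonBranded.reduce((sum, row) => sum + row.visits, 0);
const totalVisits = rows.reduce((sum, row) => sum + row.visits, 0);

// Share of traffic arriving without searching for you by name:
console.log(Math.round((100 * nonBrandedVisits) / totalVisits) + '%');
```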


N.B. Direct traffic from branded search terms can also be influenced by organic searches, but that’s a story for another time.


Know the value of your customers


Or more importantly, know their value in relation to their CPA (Cost Per Action). If the value of your customers does not exceed what it costs to acquire them, then your SEO strategy needs an overhaul.
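The underlying check is simple arithmetic; a sketch with illustrative numbers (the figures are not benchmarks):

```javascript
// Profit per acquired customer: what a customer is worth to you,
// minus what it costs to acquire them (CPA).
function acquisitionProfit(customerValue, costPerAction) {
  return customerValue - costPerAction;
}

const customerValue = 400; // illustrative average lifetime value
const cpa = 150;           // illustrative average cost per action

const profit = acquisitionProfit(customerValue, cpa);
console.log(profit > 0
  ? 'Profitable: the strategy is paying for itself'
  : 'Loss-making: the strategy needs an overhaul');
```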


But remember our point about SEO being a long-term strategy, and don’t be afraid to invest a few extra acquisition dollars in order to attract good, rather than average, customers who will contribute more to your business in the long run.


At the end of the day, tracking the dollars you are making from SEO is tricky, but not impossible. The key is to start thinking about the long term, big picture rather than attempting to calculate your ROI month by month. If you review your SEO campaign with this in mind, you will be in a better position to assess the real value it is bringing to your business.

Panda 4.0 – Is this latest Google update a Hummingbear?

The SEO world is again abuzz with speculation over the impact of two recent Google updates: Payday Loan 2.0, rolled out on 20th May (more on this later), and Panda 4.0, which began rolling out on 21st May.


As with the Hummingbird update, we applaud this latest Panda update, because for SEO it means that efforts made towards creating good-quality web page content and well-designed information taxonomy will earn higher search engine visibility.


We are hoping this update is more of a “Hummingbear.”

With Hummingbird, Google sought to better understand the intent of a user’s search query and thereby improve their indexing of websites. Now, with the latest Panda [Bear] update, improved semantic search is more accurately matched to quality website content. I give you – a Hummingbear!


What we know        

  • Google updates its Panda algorithm regularly
  • The Panda algorithm is centred on content quality
  • This update is “Bigger”



What we think:

  • Google is broadening the meaning of “Poor Quality Content” to include more than the page copy.  We believe the metrics of ‘Content Quality’ will include user experience, information taxonomy, device responsiveness and more.


What is good quality content?

Here is a short checklist for you to use to evaluate the quality of your website’s content:


  • Is your web page well written, with clear text blocks and good use of headings and subheadings?
  • Is your web page rich in information for your audience, rather than advertising and links to other websites?
  • Do visitors stay on your web pages – indicating that your content is engaging?
  • Do you have content that visitors like to share?
  • Is the copy on each web page unique within your site?
  • Is the content on each web page fresh and regularly updated?


Google Guide on Creating High Quality Websites

Here is a guide from Google that dates back to 2012 but remains a clear explanation of what factors make a high-quality website and what factors are considered low quality.


If you are a client of Search Academy, your project manager will be happy to discuss this latest Google update with you.


How science can help solve SEO

At Search Academy, our belief is that search engine optimisation (SEO) is a scientific problem that needs a scientific solution. I have written before on how every industry behaves differently online, and how GlassHat technology identifies which SEO task to focus on next for the maximum return on investment. I am often asked to explain the underlying features of how our technology works, and today I want to explain some of GlassHat’s very basic principles, and why we built it to help guide us.


What is a regression model?


A regression model simply comes down to comparing historical data patterns to predict outcomes. In its most basic form, consider how long it takes to go somewhere: the time is determined by how fast you are travelling. If I travel at 20km per hour, it will take me 2.5 hours (150 minutes) to travel 50km. If I travel twice as fast, it will only take me half that time (i.e. at 40km per hour it will take me 75 minutes).


This pattern continues – you can predict how long it will take you to travel at any speed because you have built a “regression model” of T = D/S, where T = Time, D = Distance and S = Speed. If that sounds complicated, let me show you by way of the diagram below, where Speed is on the X-axis and the corresponding Time is on the Y-axis. For this analysis I used D = 50.
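The same model in code, to make it concrete:

```javascript
// T = D / S: time (hours) as a function of distance (km) and speed (km/h).
function travelTimeHours(distanceKm, speedKmh) {
  return distanceKm / speedKmh;
}

const distance = 50; // km, as in the example above

console.log(travelTimeHours(distance, 20) * 60); // 150 minutes at 20 km/h
console.log(travelTimeHours(distance, 40) * 60); // 75 minutes at 40 km/h
```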



How is regression modelling applied to SEO?


In SEO you can use indicative factors such as “number of external backlinks,” “how fast a website loads” or “how rich and fresh your content is” to model how high you rank in search engines. If you can find a pattern between a factor and rank, you can make a judgement as to which factors will drive rankings for your website. Simple, eh?
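Finding such a pattern is, in its simplest form, an ordinary least-squares fit between a factor and rank. A sketch with fabricated data (real SEO data is far noisier):

```javascript
// Ordinary least-squares fit of y = slope * x + intercept.
function linearFit(xs, ys) {
  const n = xs.length;
  const meanX = xs.reduce((a, b) => a + b, 0) / n;
  const meanY = ys.reduce((a, b) => a + b, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - meanX) * (ys[i] - meanY);
    den += (xs[i] - meanX) ** 2;
  }
  const slope = num / den;
  return { slope, intercept: meanY - slope * meanX };
}

// Fabricated example: log10(number of backlinks) vs. search position
// (a lower position number is a better rank).
const logLinks = [1, 2, 3, 4];
const positions = [40, 28, 18, 6];

const model = linearFit(logLinks, positions);
console.log(model.slope < 0); // negative slope: more links, better rank
```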


Why a simple regression model is not enough


Although this sounds great in principle, it’s far from that simple. There are many problems with this approach – consider the scatterplot below for an actual dataset from one day’s worth of data for a set of websites competing in the hospitality industry.


Although there is evidence of a pattern that shows more links cause a higher rank (red arrow), there are also a number of distinct problems:


1. NOISE – It’s obvious this isn’t the only datapoint that affects ranking for this category. We know of course that Google uses well over 300 factors to determine relevance, so we also know that this datapoint could not completely explain the cause of rank.


2. OUTLIERS – Considering that we are using a logarithmic scale to plot the number of links, the top-right data point represents a website with a very high volume of links but a comparatively low ranking. This could be caused by a Google Penalty, and if so, it shouldn’t influence our model. In the bottom left are two data points that rank very well but have a low number of links. What is the reason for this? Is it something we can’t replicate or explain?


3. INSUFFICIENT DATA –  Lacking in the model above is consideration for more keywords, the current rank of the website in question, the number of days of data, and how the keywords map to the website. Collecting and analysing more data complicates our problem.


4. NO CONSIDERATION FOR EFFORT – How much effort goes into creating links, anyway? Is it easier to create the first few links, but harder to create the rest? If so, at which point is this less effective than other methods?


5. NO CONSIDERATION FOR LIMITATIONS – External links can take time to build, and it’s generally not a good idea to build too many too soon, so there is a limitation on time. It’s not simply the case that we can move this data point without knowing what its limitations are. One of the questions we get asked the most is about restrictions, or difficulty in implementing SEO recommendations. Knowing what these limitations are enables us to understand the do-ability of moving each data point to the precise position.


6. THE MODEL CHANGES – We know that Google changes their algorithm all the time. What worked yesterday will not always work today or tomorrow. How do you stay on top of these changes and determine exactly how to spend your next SEO dollar?


Although not an exhaustive list, it’s easy to see how using science can help guide SEO, but not without considerable challenges.


In future postings, I will explain these problems in more detail, and how we overcome them using our GlassHat technology platform.

Every industry behaves differently: Why there is no such thing as a ‘One Size fits all’ SEO plan.


When it comes to search engine optimisation, there can be no “one size fits all” approach if businesses wish to succeed online. Using GlassHat technology, we have analysed hundreds of campaigns and thousands of keywords. However, we have yet to find two sets of priority SEO factors that are statistically related to the same outcome. What works for one business may be vastly different from what works for another.


Recently, we analysed the importance of various priority factors in both the insurance and weight loss industries. The results[i] demonstrate the difference in the effect of various SEO factors on ranking for websites in these two industries. More specifically, we observed three key insights from this GlassHat analysis.



1. Some factors have a similar effect in different industries


In both the insurance and weight loss industries, external backlinks from government websites have a positive effect on a similar proportion of websites. Evidently, there are some cases where a similar approach may be beneficial across more than one industry, and a streamlined SEO approach may be possible for some factors. However, this is only the beginning of the story.


2. Some SEO factors carry more weight in different industries



For various factors that we analysed, there was a vastly more measurable positive effect on ranking in the insurance industry than in the weight loss industry. Factors such as having an ACRank of 5 – that is, having 24 or more domains linking to the webpage in question – had a significantly more positive effect in the insurance industry, as did having keywords in the anchor text. There may be similarities across industries in some cases, but frequently the impact of one SEO factor is much more pronounced in one industry than another.


3. Other factors can have the opposite effect in different industries


For some of the SEO factors we analysed, what is true for one industry was not only different, but the complete opposite for another. Specifically, we found that having keywords in the h2 tag has a positive effect on rankings for a large number of insurance websites. However, this same factor is not only less effective for weight loss websites, but in fact has a negative SEO effect on some of these sites, making it an overall poor SEO strategy for businesses in the latter industry.


What these results demonstrate is that when it comes to SEO and ranking well on the search engines, it always comes back to a custom plan for each individual business. In order to perform online, different actions must be taken in different industries. A “one-size-fits-all” approach cannot succeed given the variety of SEO factors and how differently they perform for websites in different industries.


Not only is the impact of different SEO factors vastly different between industries, but SEO itself is subject to frequent and often unpredictable change. Pinning down the SEO priorities and needs of different clients in different industries is an ongoing task, and one that will always require the right technology to keep up with this rapid rate of change.



[i] Results calculated using Spearman’s correlation coefficient on sample data for illustrative purposes, January to May 2014 – GlassHat 2014.

GlassHat: how it works and the benefits explained




After almost three years of hard work and more than two billion algorithmic calculations, our GlassHat team has released a video explaining how our technology works.


Once all the mathematical jargon is out of the way, Search Academy’s offering provides three core benefits:


1. We know which SEO task to focus on next


With limited time and resources, as well as barriers to implementation, most companies find the overload of search information difficult to navigate, to say the least. Not only do many businesses experience these obstacles to effective SEO, but most professional advice available resorts to an individual’s opinion and judgement in order to figure out what to do. What this essentially boils down to is glorified guesswork, repackaged as expertise.


GlassHat technology removes this guesswork. We use a scientific model to compare real-world website rankings and their online footprint in order to understand why some websites rank above others. GlassHat translates this analysis into a prioritised, actionable activity plan specific to the target website and the keywords on which we are focussing. Furthermore, the GlassHat activity plan is based on up-to-date search engine algorithms, which can change several times per day.


2. We provide you with the expertise to complete SEO tasks correctly


The SEO world is overflowing with experts, but it’s often hard to know who you can trust and exactly what their limitations are. Frequently, we have found that most of the resources that companies need in order to take action in this space are already in place within their existing teams. The real value that can be gained from SEO is in identifying and implementing a broad set of activities, quickly.


Most SEO experts have a working knowledge across one or two of these activities or areas, but with the rate of change and breadth of topics, it is impossible to find all of the necessary skills in one person alone.


At Search Academy, we offer first-rate Project Management to ensure that each SEO plan is kept on track and to maximise the impact of our clients’ existing expertise, as well as identifying where more support is needed. We structure our internal teams around four key pillars of expertise – on site, off site, technical and client services – in order to enable fast, quality implementation at industry leading standards.


3. Our technology continues to learn which tasks work best


After three years of development and six years in business, we have not only developed an unrivalled database of which tasks to implement and when, but we have also built technology that continues to learn and adapt as the search engine algorithms change. We systematically and continuously collect feedback on which tasks have the most impact, how long they take to implement and how relevant they are in certain situations or within specific limitations.


Additionally, we continue to test different approaches to tasks as determined by our pool of experts, in order to understand the most effective and efficient methods of  White Hat SEO implementation. Importantly, we test these new approaches’ efficacy, but never rely solely on one opinion to guide our methods.


If your business wishes to compete online and improve its presence on the search engines, there is no other approach that can yield faster, lower-risk results.


For more information, contact us today.

The state of SEO in 2014 – Search Engine Optimisation or Optimism?


In 2014, there is no greater, yet more underinvested, marketing opportunity than search engine optimisation.


Search Engines allow marketers a unique opportunity not available via any other channel. It is the only place to accurately target a potential customer at the precise moment they are searching for goods and services. This means high conversion rates and low cost-per-acquisition in a market with growing customer demand.


Why is it, then, that while over 80% of search engine users click on the organic results, Australian businesses are investing only 3% of their search marketing budget[i] in SEO? Google turned over $60 billion globally last year selling paid clicks, so there is no shortage of demand from advertisers for search engine traffic.


Most Australian businesses would like to spend more on SEO, so why don’t they?


The issue with SEO is risk – the business case doesn’t stack up because results are unpredictable, and it relies too much on one person’s opinion.


The search engine algorithm changes on average more than once per day, and behaves differently for every keyword you may wish to target. Google themselves have made the challenge even greater by sharing less individual keyword performance data, both in their research tools and via their Analytics platform. More of the algorithm is now dedicated to identifying strong natural language rather than keywords alone, as evidenced by the Hummingbird update. Additionally, many companies who have employed traditional SEO efforts have fallen foul of the dreaded Google Penalty, whose impact can be enough to make your website all but impossible to find. Things are set to become harder still with Google’s growing interest in Artificial Intelligence resources such as DeepMind[ii].


It’s a frustrating situation for advertisers and SEO providers alike. What usually eventuates is advertisers relying on an SEO expert, who in turn uses a large mixture of tools and some gut instinct based on their experience in order to predict outcomes.


Search Engine Optimisation is hard to scale


Once an organisation gets good results from an SEO investment, the next issue becomes one of scale and sustainability. If it was a “guru approach” that helped the advertiser achieve a positive outcome, how does one repeat this process? Furthermore, the business risk is heightened due to the outcomes being reliant on an individual alone.


Other advertisers have wisely resorted to developing unique, fresh and rich content on their website, an undisputed strategy that every company can employ. The limitation of this approach is that Google looks at hundreds of factors when ranking websites, and content is only one consideration. At Search Academy, we have built proprietary technology specifically aimed at resolving this scientific problem. Having analysed hundreds of campaigns and many thousands of keywords, we have yet to find two sets of priority factors that are statistically related to the same outcome. Depending on your target market, you may need to address how fast the page should load, how much content should be off-page versus on-page, the importance of social media engagement, or even something else entirely.

Example output using GlassHat to identify priority factors for an SEO campaign


A single, linear approach to solving the SEO problem simply does not work.



So, with the undeniable size of the SEO opportunity, what must businesses do next?


The path to long-term, sustainable SEO results is by no means easy or uncomplicated. Search remains an area of rapid growth and change, and Google is well aware of this, as evidenced by their constant updates and harsh penalties for spammers. SEO will remain notoriously hard to track, scale and predict, but with such an opportunity available, the most uneconomical response is to do nothing at all. Too many advertisers continue to rely on SEO experts using imprecise strategies for what is essentially a scientific problem. Too often, these strategies land businesses in penalties, leaving them in a worse situation than before. It will only be possible to properly address the SEO needs of any business after doing away with the inexact opinions of gurus and employing logical, statistically sound answers. Just as the Google algorithm was built from the ground up using science, so too must our SEO solutions be.



What do these solutions look like?


Our next blog will be the first in our series of GlassHat Weekly Insights. We will explain how industries behave differently in search, and therefore have different SEO priorities. We will also look at how GlassHat technology shows us that there is no “one size fits all” approach when it comes to search.


Which SEO factors should one business focus on in order to rank well on the search engines? The answer is more complicated than you may think.

[i] Search Academy research “State of Search 2014” White Paper, January 2014. Excludes independent consultants, in-house head count, and other marketing efforts that can coincidentally assist SEO results.

Don’t feed the trolls: responding to negative feedback



It’s almost a little too easy for a few unhappy customers to send a business’ marketing department into a tailspin these days, and all it takes is a couple of well-placed negative reviews or Facebook comments. But while this sort of response can be extremely frustrating – and you may think unjustified – there’s nothing you can do to take back the negative feedback once it has been preserved in the never-ending story that is the Internet. So, how to respond?


What you should definitely not do is react with anger and vitriol towards the negative reviewers – this will only fuel the fire. If you want any more proof of this, then check out the hilariously unfortunate meltdown of the owners of Arizona restaurant Amy’s Baking Company Boutique & Bistro. They took to Facebook to (loudly) air their discontent after featuring on an episode of Gordon Ramsay’s Kitchen Nightmares and receiving a slew of negative feedback on Yelp and Reddit.


Their reaction – predictably – only made things much worse. Of course, any small business that isn’t quite up to speed on the dynamics of the Internet is ill-equipped to respond to a thousand faceless trolls, but there are a few things you can do to recover more quickly. And guess what? It all comes down to using your words. If your problem is only a few negative reviews or comments, then don’t ignore them – respond! If a consumer has a genuine concern with your products or services, simply addressing and resolving it can turn a simmering negative experience into a positive one for your brand, and maybe even win a return customer.


Of course, if your online problem is a little more widespread than just a few unhappy shoppers, a more coordinated strategy is going to be necessary. The mistake that Amy’s Baking Company made was leaving themselves open to ridicule by cyber-shouting at their critics and, in doing so, coming across as a couple of luddites. The lesson here is simple: whatever the catalyst happens to be, a widespread negative response to your brand represents widespread dissatisfaction – and that is not a badge you can afford to wear for long.


This means that you will have to own up to what has gone wrong with your business – whether you are at fault or not – and commit to changing it. How you go about achieving this depends on both what has caused this spike in negativity and what kind of business you are running. A couple of brands that managed to recover from these types of dreaded disasters include Domino’s and Nestlé. Domino’s made a fast recovery after two employees posted a video of themselves, erm…violating certain health and safety regulations while preparing customers’ food. The video soon went viral, but Domino’s managed to nip the negativity in the bud by quickly releasing an apology video and taking legal action against the employees in question.


Nestlé’s problem was a Greenpeace video that criticised them for using palm oil sourced from deforested areas. But this problem only became worse when the food and beverage giant attempted to have the video taken down, and deleted any negative Facebook comments. Unsurprisingly, the comments just kept on coming with renewed fervour, and the video in question was uploaded multiple times by other YouTube users. Being such a major brand, they eventually managed to recover, but it took them almost 50% longer than the likes of Domino’s. This can most likely be put down to their attempted strategy of censoring, denying and censoring some more. In the age of the Internet, it has become impossible to truly eradicate anything that has been said about your company, ever. Once upon a time, complaints happened in the form of letters and phone calls that no one else was privy to, but those days are long gone.


So if your business runs into some unexpected online negativity – whether on a large scale or small – the best thing to do is to respond with the same level of professionalism and transparency you would bring to any other business venture. The Internet and social media have made business owners more accountable than ever to their customers, and whether this is a blessing or a curse for your brand is entirely up to you.

Three ways to fail spectacularly at SEO in 2014


If you’re anything like us, you spend a great deal of your spare time pawing through the hidden troves and treasures of the Internet, looking for golden nuggets of SEO information. What we have found lately is a plethora of how-to guides for everything digital you could ever want. So in the spirit of kinda-sorta originality, we have done a 180° on the advice yardstick and compiled a short list of SEO things you should just never do. Don’t worry, this has definitely never been done before. So enjoy, and take with a few generous lashings of salt.


1. Write for search engines and algorithms, but never those pesky humans.


Unless you have been living under a rock (pronounced: “don’t work in SEO”), then by now you will know all about the Google Hummingbird update, and that it generally means that quality content is in, and keyword stuffers are out. Since Hummingbird was launched around August/September of last year (the exact date is fuzzy), Google searches have been optimised for webpages that answer conversational enquiries, within context. So while once upon a time a search for “best pizza near me” would likely only take into account the keywords “best” and “pizza”, you can almost guarantee that Google will now show you a map that pinpoints the exact location of various Italianesque eateries in your general vicinity. Bellissimo!


Delicious continental cuisine aside, the point is that back in the days of old school SEO, you could kind of get away with just stringing together keywords in the form of nonsensical, below-par content, complete with irrelevant backlinks on webpages not even tangentially related to your business. This has never exactly flown with Google, and now they have changed the game to make it even harder for this sort of black hatting to gain any traction. So if you’re still indulging in any of the above practices – or you suspect someone contributing to your website may be – then stop now, and start creating quality, readable content that actually provides value for the user. As Matt Cutts, Google’s Head of Webspam, so aptly puts it,


“If you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits.”


2. Don’t optimise your website for mobile, because smartphones are on their way out, y’know?


We hate to break it to you, but mobile Internet devices are here to stay. This being said, if you are yet to optimise your company’s website so that it looks just as good on a teeny tiny smartphone screen as it does on a giant, hulking desktop, then you’re not alone. Only about half of big brands currently have their websites optimised for mobile, and that equals a whole lot of unsatisfactory user experiences. But being in the other (and arguably, better) 50% isn’t as complicated as you may think. It doesn’t involve creating a whole new, miniature website or even generating any more content. It’s a simple edit of your site to make the user experience smoother on a mobile device. This means articles will have larger text to make them readable on a small screen, the site will have simplified navigation, and you may choose not to include all of your website’s features and content. The latter may be because it isn’t likely to be accessed by many users on mobile devices, or because it could slow down loading times significantly, or a combination of both.


Whatever your reasoning is, the time to optimise for mobile is now, and your brand risks getting left behind, or even being stamped with the dreaded ‘outdated’ seal, if you don’t make a move to the benefit of the small screen. If you have any doubts about the validity of mobile Internet browsing, then the next time you’re on a train, take a look around at what most people are doing (hint: not reading a book).


3. Don’t vary your content, because it’s consistent and not at all boring.


This may well be completely true, but if there is one thing we have learned about the Internet, it’s that users have extremely short attention spans. In fact, if you’re doing your job right, most people who end up on your website having a nice long read of your well-penned content probably didn’t intend to be there in the first place. They were enticed by a catchy title, and informed or entertained by the content, so they stayed. For an endless supply of examples, check out this Upworthy clickbait generator. Brilliant, right?

Maybe the first fifty times. More to the point, this is the law of the online land these days, and if you want to compete then you’ve got to keep it fresh. This doesn’t just mean blogging about different topics, but also varying the medium. Try a combination of short blogs, in-depth articles, videos, infographics, or even stories told in gifs and witty one-liners (à la Buzzfeed). Don’t try all of these at once, but certainly try your hand at one or two in addition to your preferred content form. Internet users (and humans, in general) naturally seek variety and, as we already know, are easily bored. So start upping the entertainment stakes, and take a risk with a short video or gif set. You could even end up reaching another niche of your target demographic, and wind up with increased interaction on your website (that is the end goal, after all).

