The truth about Android vs iOS numbers

Are Google’s Android team leaning on their Analytics colleagues to present their numbers in the best possible light?

or: “Hey Google Analytics, play fair with your Mobile Operating System numbers!”

Google have been shouting about their impressive Android activation numbers for several quarters now (less so since the latest iPhone and iPad, I notice). Yet they still seem to need the Analytics team to present those Android numbers in the best possible light.

Notice how the iPad, iPhone and iPod touch are all regarded as separate ‘Operating Systems’ yet all the myriad devices that run Android (tablets, phones, media players, TVs and whatever else) are all included in that Android number.

And they’re still behind!

Android as a combined entity is still behind the iPhone on its own. What are you afraid of, GOOG?

So what are all these Android users doing with their phones?

(Because let’s face it, nobody’s buying the tablets.) What they’re doing with them is exactly what people did with feature phones: they take home the shiniest phone the salesperson in the mobile operator’s retail store or telesales department will give them for free, then use it for playing games, sending texts and making phone calls. The things they’re not doing: buying apps, buying music or browsing the web. These recent numbers from AdWords alternative Chitika bear that out.

Image credit: Robert Nelson via Creative Commons on Flickr.

2 years with Advanced Web Ranking

Over the space of the two years I’ve been using Advanced Web Ranking it has become an essential part of my digital marketing toolset.

Do rankings really matter?

In this ‘new reality’ of Google Search Plus Your World, personalised and localised results, you’d think that tracking rankings for your keyword phrases wasn’t all that important. To some extent, yes, your search engine position for any particular phrase isn’t static like it used to be – there are lots of variables that come into play. On the other hand there’s been backlash recently about personalised search results and this is something that the engines can’t really ignore.

Yes they do

Rankings still matter, and tracking them is important, because:

  1. Not every query is personalised
  2. Not every user is logged in when they search
  3. Tracking rankings can help early diagnosis of other problems

Why use software to do it?

There are a variety of ways to track your rankings – online subscription-based web services, the new Google Analytics & Webmaster Tools data sharing features, and desktop software like Advanced Web Ranking – so why go the desktop route?

Cloud based services – pros and cons

Web-based (cloud-based) subscription services sound great in principle – no need to remember to run software on your computer (AWR has a scheduler, so that’s not such a big deal) and no large up-front cost. However, you’re then tied into paying a monthly fee to access your reports and data. Stop paying? No more access.

Google Analytics’ new Webmaster Tools ranking data integration

Aside from the obvious fact that this only shows you how you rank on Google (and not any local search engines that are important to you), the data is widely held to be unreliable and doesn’t necessarily track your choice of keywords. There’s also no way to group keywords in order to focus more on how you rank for your most important terms. Google effectively decides what they are for you. You also have to use Google Analytics to get this information.

Advanced Web Ranking’s strengths

By running software on your own computer you get several advantages:

Control over when ranking checks are performed

You get to decide when the checks are done, manually or to a schedule you define.

The data is yours to keep forever

With Google data it’s never certain how long it will be kept for. With a web-based subscription service, you stop paying and you lose access.

Single up-front cost to buy for 1 year, reasonably priced maintenance plan

You have a one-off cost up-front with a year’s search engine updates included (new search engines added and updates to search engine profiles to ensure that the rank checking software gathers results correctly) and extensions to the maintenance plan are inexpensive compared to monthly subscription services. If you’re out of the maintenance plan the software carries on working (so you can look at your data) and will still run ranking checks, but should search engine specs change, you’ll need to buy a renewal. Renewals are more expensive than extensions but at least you’re not left needing to buy a whole new full license.

Track results on many more search engines

With 2,000 search engines already included, and requests taken to add others, you can see how your site is doing on many more search engines than the web-based options offer.

Unlimited websites

This is a huge deal if you have more than a couple of websites you’re responsible for tracking. Subscription-based services often give you tracking for up to 3 sites, with a heftier price to be paid when you go beyond that.

Be in more than one place at once

With customisable location settings for your Google queries, you can see how you’re doing from different locations very easily. There’s also built-in proxy support so you can both increase the throughput of your rank-checking queries as well as ensure that the results you’re getting are valid for the audience a site is targeting, not just your location.

Go get it

Advanced Web Ranking starts from just $99 for the standard version – enough for a small business to track their own sites – and the top-end Server version, which comes with a company-wide license to run the software and connect to a shared database (great for SEO and marketing agencies), is just $599.

Credit where it’s due

How to attribute credit to your brand building and offline efforts in Google Analytics

An often-mocked excuse given by proponents of non-direct-response advertising, when charged that such advertising is wasteful, is that the money spent on the ads ‘builds brand awareness’ or ‘improves brand recall’.

Did you know it’s actually pretty easy to measure that online? Instead of the credit going to ‘search’, where it’s far too easy to think that the traffic is down to online efforts such as SEO, you can measure the traffic coming from branded searches and direct web address type-ins.

What you need to do is set up an Advanced Segment in Google Analytics, called “Brand Traffic”, that matches:

All organic traffic where the keyword is one of your brand terms

This is all the people typing your brand search terms into Google, Bing et al. This is called a ‘navigational’ search. Don’t forget to include a keyword that matches your domain name – you’d be surprised by how many people get the location bar and the search box confused.

PLUS

All direct traffic

This is everyone who comes to your website from a bookmark, a link in an email from a friend (unless they were using a webmail client, in which case the referrer will be the webmail domain – add some more lines to your filter if you want to separate that traffic out) or a link clicked from a Twitter client rather than the Twitter website – in each case where the link doesn’t have any utm parameters on it to reclassify the traffic.
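If you want to sanity-check the segment logic outside of the Google Analytics interface, here’s a minimal sketch in Python of the same two rules applied to an exported list of (medium, keyword) pairs. The brand terms and domain variant are made up for the example – swap in your own.

```python
import re

# Hypothetical brand terms - replace with your own brand (and domain) variants.
BRAND_PATTERN = re.compile(r"acme\s*widgets|acmewidgets\.com", re.IGNORECASE)

def is_brand_traffic(medium, keyword):
    """Mirror the 'Brand Traffic' segment: organic visits on a brand keyword,
    plus all direct visits (medium '(none)' in Google Analytics exports)."""
    if medium == "(none)":
        return True
    return medium == "organic" and bool(keyword) and bool(BRAND_PATTERN.search(keyword))

# Example rows as they might appear in a keyword/medium export.
visits = [
    ("organic", "acme widgets reviews"),   # navigational/brand search
    ("organic", "acmewidgets.com"),        # domain typed into the search box
    ("organic", "cheap widgets"),          # generic search - not brand
    ("(none)", None),                      # bookmark, type-in or untagged app click
    ("referral", None),                    # referral - not part of this segment
]

brand_visits = [v for v in visits if is_brand_traffic(*v)]
print(f"{len(brand_visits)} of {len(visits)} visits fall into the Brand Traffic segment")
```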

There’s no such thing as 100% accuracy

One of the first things to accept about web analytics is that there’s no such thing as 100% accuracy. There will always be cases when things aren’t tracked properly but we’re not really looking for exact numbers, just a trend.

Watch your branded traffic trend

When you’re done with setting up this advanced segment you’ll be able to watch how the traffic driven by your brand-building efforts outside of search changes over time. Make sure you use the Annotations feature of Analytics to mark any big offline efforts you’re involved in.

 

Image credit: All my life’s logos by captcreate via Creative Commons on Flickr.

If you’re going to use QR codes…

Please don’t make the mistake the advertisers behind this QR code (spotted at a metro station) made. Their QR code goes to their homepage. Not the product detail page for the camera they’ve clearly spent a lot of money advertising. QR codes finally give you a metric of success for your offline advertising (whether you use them to compare different placements or different media types, as long as you’re doing a fair comparison the data can be incredibly useful). Here’s a quick three-point checklist:

Does the QR code get the user to the right page – a landing page that is relevant to the product being advertised?

Have you included tracking variables in the URL (such as Google Analytics utm_medium, utm_source and utm_campaign parameters)?

Are you paying attention to the statistics being generated?

Because otherwise you’re just perpetuating the old problem of offline advertising: not knowing for sure what works and what doesn’t.
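For the second point on that checklist, here’s a minimal sketch (in Python, with made-up product and campaign names) of building the campaign-tagged landing page URL before you ever generate the QR code image; the utm parameter names are the standard Google Analytics ones.

```python
from urllib.parse import urlencode

def qr_landing_url(landing_page, source, medium, campaign):
    """Append standard Google Analytics campaign parameters to a landing page URL."""
    params = urlencode({
        "utm_source": source,      # where the code appeared, e.g. a specific station poster
        "utm_medium": medium,      # the media type, e.g. poster, press ad, flyer
        "utm_campaign": campaign,  # the campaign or product being advertised
    })
    separator = "&" if "?" in landing_page else "?"
    return f"{landing_page}{separator}{params}"

# Hypothetical example: send people to the camera's product page, not the homepage.
print(qr_landing_url(
    "https://www.example.com/cameras/superzoom-x100",
    source="central-station-poster",
    medium="poster",
    campaign="superzoom-x100-launch",
))
# Feed the printed URL into your QR code generator; each placement gets its own source.
```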

Tracking (and fixing) site search

Nobody goes to a hardware megastore for fun; we usually need something specific. We wander down the aisles (‘if the adhesives are here, then tile adhesive must be here too, right?’) until we give in and ask a disinterested-looking pimply youth for help, and they begrudgingly motion in the general direction of where we should find our grail. Now imagine there’s no pimply youth to help you. That’s what most e-commerce sites are like.

Riffing off of a recent Seth Godin post where he points out “Broken search = no sale”…

Tracking site search (whether you’re running an e-commerce site or not) is an important place where you can go the extra mile to make sure that your audience/customers/readers are finding what they want.

A lot of people just browse a site, navigating to what they want by clicking through your (hopefully well-thought-out) site hierarchy, letting you lead them to the product you want to sell them. Some people, however, want to get right to it: they search. If they don’t find what they’re looking for, they’re gone.

As Avinash Kaushik points out, tracking WHAT people are searching for is super easy with Google Analytics, but that doesn’t make your task – improving the search results – as easy as it could be.

You know how Google has a search quality team? For your site, that team is you.

Whether you track the inputs (search terms) and outcomes (number of results returned) using your own back-end systems or Google Analytics’ custom variables isn’t as important as the act of paying attention to what they tell you.
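If you go the back-end route, the logging itself can be as simple as one row per search. A rough sketch in Python (the product search function is a placeholder for whatever your platform already does):

```python
import csv
from datetime import datetime, timezone

LOGFILE = "site_search_log.csv"   # columns: timestamp, user_id, query, result_count

def run_product_search(query):
    """Placeholder for your real search back end."""
    return []

def handle_search(query, user_id):
    """Run the search, then record what was asked for and how much came back."""
    results = run_product_search(query)
    with open(LOGFILE, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            user_id,
            query.strip().lower(),
            len(results),
        ])
    return results
```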

Where to start?

Begin with the low hanging fruit and work your way up the tree – your biggest problems probably lie with the searches that return zero results.

No results found

This is the worst thing a site can return – it’s a door slammed in the user’s face. There are a number of actions you should take here:

Start with the simple, single search parameter queries. If you provide filtering functionality, ignore queries for ‘1TB Hard Drive’ with a price filter set to ‘below $10’, at least until that becomes a reasonable possibility.

1) Check your ‘no results found’ message.

If you’re going to slam a door in someone’s face, at least try to do it gently. Perhaps with some tips on how they might open it again – ‘try widening your search, use fewer words’ (contextually, of course!) – or how you can help open it for them: a contact form with the subject pre-filled (‘I searched for “creme brulée torch” and didn’t find anything’), or even a phone number. Something that has the customer understanding that ‘your search is important to us’. Oh yeah, and it helps if you actually respond to those requests. I’ve seen ‘no results found’ pages that are completely blank. That’s the opposite of helpful.

2) Start with the largest volume.

Whatever reporting tool you’re using, sort by volume – by unique users, since some people will search multiple times, the logic being ‘that can’t be right, they must have something for this search’ (which, incidentally, is the definition of madness). Examine each query: is it a misspelling? A synonym (hard disk vs hard drive)? A product you don’t sell (but do sell an alternative to)? What does your site software allow you to do about it? Apply autocorrect filters? Add keywords to products? Make a search ‘fuzzy’? Show a message? Suggest alternatives? If the answer is none of those, it’s time to hire a developer, find/buy a plugin or nag your e-commerce platform vendor. Work on the top ten once a week until the top ten is all low single digits – that’s when you’ve squeezed all the juice out of this.
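Assuming you’ve been logging searches along the lines of the earlier sketch (timestamp, user, query, result count), pulling out that top ten of zero-result queries by unique users is a few lines of Python:

```python
import csv
from collections import defaultdict

def top_zero_result_queries(logfile="site_search_log.csv", top_n=10):
    """Rank zero-result queries by the number of distinct users who hit them."""
    users_by_query = defaultdict(set)
    with open(logfile, newline="") as f:
        for timestamp, user_id, query, result_count in csv.reader(f):
            if int(result_count) == 0:
                users_by_query[query].add(user_id)
    ranked = sorted(users_by_query.items(), key=lambda kv: len(kv[1]), reverse=True)
    return [(query, len(users)) for query, users in ranked[:top_n]]

if __name__ == "__main__":
    for query, unique_users in top_zero_result_queries():
        print(f"{unique_users:>5}  {query}")
```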

Found something, exited on that page

These are the people who gave up in disgust – the search returned something, but nothing they saw was worth clicking on, so they left. It could be they were just checking prices, but whatever happened, they didn’t like what they saw enough to stick around any longer.

1) Filter out bots

This isn’t an issue if you’re using Google Analytics, as bots don’t load the tracking JavaScript, but make sure you exclude them if you’re using a back-end report.

2) Start at the top ten again and see what they saw

Do the same search yourself and see what you find. It could be that the search returned some results but not all, or the order of the search results meant that they didn’t see the product they were interested in on the first page. A search for ‘Photoshop’ might return an alpha-sorted list of products with Photoshop in the name; if a company called Aardvark makes plugins and they show up first, the user doesn’t see the Adobe products in the first ten. Think about how you can provide more information to make it easier for the user to drill down and refine. Showing brand logos or a brand filter at the top is a simple solution and easier than writing your own ranking algorithm (Google have more PhDs working for them than you do) – see the sketch below.
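As a rough illustration of that brand-filter idea (with invented product data), counting results per brand is enough to drive a simple filter bar above the list – far cheaper than writing your own ranking algorithm:

```python
from collections import Counter

# Hypothetical results already matched for the query "photoshop".
results = [
    {"name": "Aardvark Blur Plugin for Photoshop", "brand": "Aardvark"},
    {"name": "Aardvark Sharpen Plugin for Photoshop", "brand": "Aardvark"},
    {"name": "Adobe Photoshop CS6", "brand": "Adobe"},
    {"name": "Adobe Photoshop Elements 10", "brand": "Adobe"},
]

def brand_facets(results):
    """Count results per brand so the UI can offer a one-click brand filter."""
    return Counter(r["brand"] for r in results).most_common()

for brand, count in brand_facets(results):
    print(f"{brand}: {count}")
# A user who meant Adobe clicks the 'Adobe' facet instead of paging past
# every alphabetically-sorted plugin vendor.
```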

3) Keep notes

There are some queries that you just can’t do anything about – make a note on them so you don’t duplicate your efforts – revisit occasionally.

Where next?

You could go crazy with this. Conversion rate is affected by a number of things: site design, purchase cycle duration, pricing, availability, visit intent (order tracking for an outstanding order isn’t much of a conversion opportunity). The best approach is to implement a form of user testing. There are free and paid for options. I’ve been using 4qsurvey’s free service. Whilst it has some aspects that aren’t quite optimal, you can’t argue with the price. And whilst you’ll only get a fraction of a percent of your traffic to fill in the survey, and some of the responses will be positive (pats on the back are lovely but don’t help you to improve), you will find the odd gem amongst the responses – the annoyances, frustrations and disappointments are what will drive you forward.

Image credit: D’arcy Norman via Creative Commons on Flickr.

Finally some sense about Google Instant

When Google announced Google Instant yesterday, where the search results change as you type, some people proclaimed SEO dead, whilst displaying a stunning lack of understanding of how Google Instant works, how users interact with it and what SEO really is.

If you’re not getting Google Instant, you should check that you’re signed in and using Google.com or .co.uk. Other Google domains don’t appear to be enabled yet. If you’re using it and it bothers you, turn it off with a control to the right of the search box.

Here are three excellent explanations of why Google Instant is a GOOD THING for SEO, and far from the Extinction Level Event claimed, by people who really know their stuff. Google Instant didn’t come to bury SEO but to praise it.

If you’re curious as to the effect of Google Instant on your traffic, here are a couple of how-tos for segmenting out that traffic in Google Analytics – essentially you get to see the original query (e.g. “choco”) and the expanded query that Google displayed the results for (“chocolate”).

As an aside, anyone else amused by Google Instant coming on the heels of the Caffeine update?

Image credit: Roadside Pictures via Creative Commons on Flickr

Is Yahoo Directory still worth it?

Is paying $300 a year for a Yahoo directory listing still worth it? Google aren’t saying much and the lift it gave in your Yahoo results is gone.

Now that Bing is powering Yahoo’s search results, anyone tracking their search engine positions across both engines will have seen their Yahoo rankings move into line with their Bing rankings.

In a particular case of a client with a paid Yahoo Directory listing, the Yahoo rankings have dropped from a probably unrealistic first place all the way down to 21. Theoretically the directory listing didn’t affect the organic placement but that’s pretty hard to believe on the basis of this evidence.

This leads me to question the value of a Yahoo Directory listing. The listing in the directory itself has brought just 3 visits so far this year, suggesting that practically no humans are using the directory. As for that top ranking, there wasn’t much organic traffic coming from Yahoo anyway (less than 10% of the traffic that came from Google).

A Yahoo Directory listing costs $299 a year, so now it’s just down to gut feeling on whether a known paid link from it still carries any value in the other search engines’ algorithms. Google no longer suggest getting listed there on their webmaster advice pages. Is this an indication that it’s time to cut that expense and spend the money on something more trackable and productive?

Image credit: David Erickson via Creative Commons on Flickr

Back to School

A few weeks ago I sat an exam for the first time in many years. It was the online certification test for Inbound Marketing University, a project driven by Hubspot, providers of a web-based software solution designed to assist SMEs with, well, inbound marketing. The ‘professors’ of Inbound Marketing University are all high-level practitioners in their respective fields – it reads like a who’s who of ‘new marketing’ types.

What’s Inbound Marketing anyway?

Inbound Marketing is the antithesis of many elements of traditional marketing – it’s about creating relationships and establishing a presence, making potential customers aware of you in a more natural way than interruptive tactics like TV advertising.

Course overview

The course was broken into three areas: Get Found, Convert and Analyze. I’ll give an overview of each.

Get Found is centred on being there when people are looking for you, meeting them on their terms, in their places like Twitter and Facebook. But it’s a permission-based relationship – someone chooses to follow you on Twitter or become a fan on Facebook because of the value you provide to the community through the content that you create. The stand-out class in this section was David Meerman Scott’s; if you’ve read his books New Rules of Marketing and PR and World Wide Rave then you already know his approach, and it’s dead on. Another key component of the Get Found section is search engine optimisation, with two very strong classes on the subject.

The Convert part of the course is all about how you handle leads: whether someone is ready to buy or not, there is a way to strengthen the relationship. Providing useful content doesn’t stop once you’ve got someone interested. The classes covered landing pages, lead nurturing and email marketing.

The Analyze part of the course was handled in one lesson, and if I have any criticism of the course, it’s here. There was only one class on this and it kept things pretty simple. That’s not a criticism of the class in itself – everybody needs a 101 in analytics, and it had important and valuable lessons. However, if you want to get really deep into the analysis you’d be better served by picking up a copy of Web Analytics: An Hour a Day by Avinash Kaushik (Google Analytics evangelist) and pre-ordering his forthcoming second book too.

Effort and reward

I gave myself two weeks to get through all the classes and took the exam on the first day of the exam period in mid-August. People who took it in the first exam period in June had a little less time to take it all in – the live webinars were spread over two days. Whilst you miss out a little on the experience by watching the classes later instead of live, don’t be put off by that at all. You’ll find a lot of the ‘ask the teacher’ questions you come up with during the course of a class are asked by others at the end anyway.

I am pleased to say that I passed the exam with a decent score, and I’m proudly displaying my Inbound Marketing Certified Professional badge in the sidebar of my blog. I would like to publicly thank everyone at Hubspot for the immense effort that they put into this project, all the professors who created such excellent classes, and a hat tip to the infrastructure sponsors that made it possible.

If you’re interested in taking the qualification, jump on over to Inbound Marketing University, read up a little more, watch one or two classes, create an account and get notified when the signups are open for the next exam session in October. Happy studying!

Image credit: Robert Crum

Why Twitter needs to take credit for links

Update, mid August 2011

Wow – just two years after I wrote this, Twitter are finally doing what’s suggested here: through the use of their own URL shortening service, they’re ‘owning’ the clicks. Read more in this post by Tom Critchlow of Distilled.

 

So Twitter are spending time amping up the retweeting functionality on the site and in the API. All well and good but that’s not really going to make a huge difference.

Here’s an idea that will benefit both Twitter and webmasters the world over


If you use Twitter via the web interface and click a link in a tweet, most URL shorteners (which is how most links are presented in tweets) correctly pass the referrer value in the format ‘twitter.com/username/’ or if from an individual tweet: ‘twitter.com/username/status/tweetid/’.

If you’re using a third-party Twitter client and you click on a link, no referrer information is sent to the site you visit. In analytics terms, that information is lost, so the traffic is presented as direct traffic.

As Twitter grows in reach and more users switch from using the Twitter web interface to clients like TweetDeck, Nambu, Tweetie etc and the myriad mobile clients, the scale of this problem will continue to grow.

Why would you (as a website owner) care?

Because knowing where your traffic is coming from allows you to understand your visitors better, segment them better and improve your site. It also allows you to know the true efficacy of your efforts on Twitter – are these links you’re sending out yourself, links that have been created by others, or retweets?

There are three technical approaches to solve this:

  1. Twitter clients could expand shortened links through the unshortening APIs provided by the URL shortening services (some already do this anyway) and then put a ‘utm_medium’ parameter on the URL. See the Google Analytics documentation for more about utm parameters.
  2. URL shorteners could all be required to add/replace utm parameters in the URL with those specified by the Twitter client – adding the parameters utm_medium = twitter and utm_source = /username/ (a rough sketch of this tagging follows the list).
  3. Twitter clients could be compelled (want API access for your app? Gotta play by the rules) to send any links via a redirection page on Twitter that records at least the user, maybe even the tweet id of the tweet the link was in. That page would set the browser’s referrer at that point, allowing any analytics package at all to correctly attribute the traffic as a referral from twitter.com.
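To make options 1 and 2 concrete, here’s what that tagging step amounts to once a client (or shortener) has the expanded URL – a purely illustrative Python sketch using the utm_medium and utm_source values described above:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_expanded_link(expanded_url, username):
    """Add/replace Google Analytics campaign parameters on an expanded link,
    crediting Twitter as the medium and the tweeting user as the source."""
    parts = urlparse(expanded_url)
    query = dict(parse_qsl(parts.query))
    query.update({"utm_medium": "twitter", "utm_source": f"/{username}/"})
    return urlunparse(parts._replace(query=urlencode(query)))

# A client expands a short link to its long URL, then tags it before opening it:
print(tag_expanded_link("http://example.com/article?id=42", "someuser"))
# -> http://example.com/article?id=42&utm_medium=twitter&utm_source=%2Fsomeuser%2F
```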

The problems with methods 1 & 2

utm values are a kludge and not well understood/universally accepted
Using utm parameters simply isn’t something that every web analytics person is up to speed with. Not everyone has read Web Analytics: An Hour a Day and gone into painstaking detail. Most people are new to this stuff: they look at the pie chart of direct vs organic vs paid vs referrals and that’s as far as they want to go. It also creates very ugly, less shareable URLs.

Too many people involved
Requiring Twitter client developers AND URL shortening services to change the way they work is a bit much – plus there’s no enormous benefit to the URL shorteners in getting involved. Let alone the number of white-label/roll-your-own URL shorteners that have sprung up since the talk of tr.im shutting down got people realising that they don’t own their short URLs.

Why don’t people posting links put the ‘utm’ parameters for analytics onto the long-form link before shortening?

This doesn’t necessarily solve the problem – it assumes that the shortened link is never going to be shared in another medium. Someone could take a shortened link used in a tweet and paste it into a Facebook comment: voilà, different medium, same utm values. Then there’s the issue that this only works for links you create yourself. No regular person is going to tag these values onto the end of a link before shortening it.

Twitter taking credit for links, as the referrer, is the only sensible solution

  • It requires no learning on the part of the vast majority of web analytics users, putting the information into a recognisable, universally understood format
  • It underlines the importance of Twitter as a traffic acquisition medium
  • It gives website owners a true picture of where their traffic comes from

So, Twitter, quit messing around with the ego-massaging retweeting changes and do yourselves a favour – put your stamp on the traffic you bring to people.

Image credit: Elvis Payne