Or is Google’s algorithm a little off?
Whilst checking into some background info for yesterday’s blog post about the Jury’s Inns hotel chain, I did the obligatory Google search for Jury’s Inns. I’m not sure what you’ll see if you try the same search, given Google’s penchant for showing different results based on your location or whether you’re logged in.
Click the image to see the results I got. Almost the entire first page of Google results for that term is either the main site or a search-engine-friendly subdomain for a Jury’s Inn hotel in a particular city.
If I’d searched for jurysinns.com, seeing this wouldn’t have surprised me at all (and that is exactly what does come up, much as it does for ibm.com or allcapitals.com). If I’d searched for Jury’s Inn Hotels London, I wouldn’t have been surprised to see the custom subdomain come up first or second either.
As subdomains, they likely have far fewer incoming links, so how did they beat out the page-two listings like TripAdvisor, Late Rooms and others? Jury’s Inns clearly have people working on SEO for them; otherwise they wouldn’t have subdomains like this. My question is whether there’s such a thing as too much SEO. Does a first page of Google SERPs consisting purely of the main site and its subdomains help the searcher? Wouldn’t it be better for both the searcher and the company if there were more third-party sites in the results, all of which link to you anyway? Third-party sites confer more authority on the main site simply because they flesh out the kind of information that is available: TripAdvisor’s user-generated content, the property detail pages of third-party booking sites, and more.
The only ‘solution’ to this, if indeed one is desired, is for Google’s algorithm to be altered with regard to subdomains, and the last time any changes were mentioned in that area was in this blog post from almost two years ago by Matt Cutts, discussed on Search Engine Land. At the time it seemed that things would change so that subdomains would be treated in a similar way to subdirectories, or at least that seeing a whole page of results from a single domain would become “less likely”. I’m not sure whether these results indicate a reversal of that, or whether they fall under the exception Matt Cutts alludes to: “This change doesn’t apply across the board; if a particular domain is really relevant, we may still return several results from that domain.”
Until there’s a way to tell Google which terms you’d like a subdomain not to rank for, it’s unlikely this will get any clearer, or that SERPs will meet the needs of both the searcher and the brand, unless the geniuses at Google somehow build telepathy into the algorithm.
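To be fair to Google, the controls site owners actually have today are all-or-nothing, which is part of the problem. Here’s a minimal sketch (the hostname is illustrative, and the final directive is invented purely to show what doesn’t exist): robots.txt can keep crawlers off a subdomain entirely, but it has no way to say “don’t rank me for this term”.

```
# Illustrative robots.txt on a city subdomain, e.g. london.jurysinns.com
# (hostname assumed for the example). This blocks crawling of the whole
# subdomain; it is all or nothing.
User-agent: *
Disallow: /

# Something like the line below is what would be needed, but no such
# directive exists in the robots.txt standard:
# Do-not-rank-for: jurys inns
```

The per-page alternative, a `<meta name="robots" content="noindex">` tag, is just as binary: a page is either eligible for the index or it isn’t, regardless of which query brought the searcher there.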