Another Look at Google Place Pages

When I wrote my original post on Google Place Pages I was under incredible time pressure. But I wanted to take another look at what Google has done, because it’s potentially quite significant — although less significant than if Google were indexing these pages in search results (more on that below).

Here’s a search for SOMA, San Francisco — a neighborhood:

Picture 52

Click the map and then “more info” and you arrive at the new page, which is being created for every business, landmark/POI, neighborhood and city:

Picture 53

The new SOMA Place Page:

Picture 54

Picture 51

What’s there:

  • Ads for things related to SOMA (although the Bing ad really isn’t related)
  • Related maps that can offer a lot of value (often created by users, e.g., “Ben’s Guide to San Francisco”)
  • Popular places (being indexed here may be significant for certain venues and attractions)
  • Video and images
  • Street View

That these pages are engaging and create highly targeted ad inventory should be obvious. They each have a URL, but Google told me they won’t be indexed (but see Mike B’s post). That’s probably a political decision to keep people from screaming that Google’s favoring its own properties. The only way then to get to these pages is to click on “more info.” Thus they’re somewhat buried. But people will probably discover them, given the visibility of Google Maps and the volume of searches there.

The big departure for Google, beyond the new format, is the creation of these pages for neighborhoods, landmarks and cities. Formerly information was mostly available for business locations. The addition of places and POIs makes these pages a potentially great discovery tool for travel and tourism. Indeed, Hitwise categorizes Google Maps under Travel. 

Here’s a comparable page for a local business:

Picture 58

Picture 57

My belief is that increasing numbers of local businesses will claim these pages because they will be visible and widely consulted by consumers (notwithstanding my “buried” remark above). And again they’ll have their own URL so local businesses can link to them. 

Imagine if Google were to index these pages in search results; their impact would be huge. But Google is saying it’s not going to. These pages aren’t yet available for mobile devices but some version of them will be in time. 

What do you think the impact of these changes will be, if any, on:

  • Small businesses
  • Consumer behavior
  • Locally targeted AdWords 

40 Responses to “Another Look at Google Place Pages”

  1. Sebastien Provencher Says:

    If they were to index these pages in their SERPs, it would be a game changer and a serious blow to directory publishers and other local media players. But because they need those large local media players to sell AdWords, I don’t think they will do it.

  2. predictabuy Says:

    I think it can be seen as very similar to Seth Godin’s controversial ‘Brands In Public’ – just on a massive scale and with ‘opt-out’ as the default (which is smart). What if they allow businesses – the nominal owners of the page – to ‘opt-in’ to having them included in the SERP? In essence, haven’t they created a default web page for everyone who wants it?

    I also think the new design makes it much easier for them to (selectively) experiment with different content and see how users respond to it — that’s probably another big reason for the re-design. Imagine them showing real-time information in another ‘box’ on the page. This design is much easier to experiment with and optimize.

  3. Greg Sterling Says:

    I agree Seb. However that same concern wouldn’t apply to indexing places/POIs/neighborhoods.

  4. Sebastien Provencher Says:

    @Greg With the current Yellow Pages publishers’ commercial mission, I agree. But it definitely would be in their interest to try to structure activity around non-commercial places/POIs/neighborhoods going forward. I wouldn’t leave that space to Google.

  5. Mike Blumenthal Says:

    I may have this wrong but see my post today: Where are Google Places Pages Going?. It appears to me that Google may be indexing them…

  6. Greg Sterling Says:

    Mike: That’s interesting because I asked them lots of questions about whether they were going to index and they said the pages would only be accessible via Maps.

  7. Sebastien Provencher Says:

    @Mike Whoa, that’s definitely a game changer. The Burdick Chocolate (http://www.google.com/search?hl=en&client=safari&rls=en&q=Burdick+Chocolate+Cafe+Boston&aq=f&oq=&aqi=) example clearly shows indexing. I wonder how they’re going to determine ranking and relevancy for their own pages.

  8. Greg Sterling Says:

    Seb: I don’t think the IYPs can match this. Maybe they can prove me wrong. Kosmix a long time ago started to do something very similar to this: http://www.kosmix.com/topic/las_vegas

  9. Barry Hunter Says:

    They won’t be indexed anymore, because
    Disallow: /places/
    appears in robots.txt.

    But that wasn’t there right at the beginning (possibly an oversight, or only added in reaction to community feedback), so some pages have been indexed, but they will (or should) fade.
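The effect of the Disallow rule Barry quotes can be checked with Python’s standard-library robots.txt parser. A minimal sketch — the robots.txt contents and the maps.google.com URLs here are illustrative, not Google’s actual file:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt like the one described above (hypothetical contents).
robots_txt = """\
User-agent: *
Disallow: /places/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Anything under /places/ is off-limits to crawlers; other paths are fine.
print(parser.can_fetch("Googlebot", "http://maps.google.com/places/soma"))  # False
print(parser.can_fetch("Googlebot", "http://maps.google.com/maps"))         # True
```

Note that this only governs *crawling* — as the discussion below makes clear, a disallowed URL can still surface in results as an uncrawled reference.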

  10. Greg Sterling Says:

    Thanks Barry. It would be very bad form for Google to say X here and do Y.

  11. Barry Hunter Says:

    Ah, only just read Mike’s post; it’s try that they won’t be indexed in themselves. They can still be ranked by links to them, in which case a result appears as a simple link – no snippet.

    Interestingly, it’s mostly the places mentioned in all the blog posts talking about Place Pages that are, of course, being linked up…

  12. Barry Hunter Says:

    ‘try’ in the first sentence was meant to be ‘true’ 😉

    continuing… I’m not sure the ‘maps’ people can do anything about this type of indexing; it’s effectively out of their control. Google ‘search’ tries to be helpful in indexing all it can. The keywords in the URL obviously can’t be hidden. It would probably need a ‘fix’ by the search team to exclude these.

  13. predictabuy Says:

    Here’s another (possibly harebrained) scheme. Think of these pages as landing pages. Google could provide tools that optimize the performance of these pages both for incoming ‘organic’ traffic and for AdWords campaigns — possibly in a fully automated way. This could lead to a greatly simplified way for merchants to advertise using AdWords (or other ad vehicles for that matter – mobile, display, etc.).

    (And they could give merchants a Google Voice number for tracking the calls.)

  14. Mike Blumenthal Says:

    @predictabuy Not harebrained at all… it makes the 60% of SMBs that don’t have websites better targets for AdWords — if not by Google, then by resellers.

  15. Sebastien Provencher Says:

    @Mike can a third party manage a merchant page? If yes, then your reseller idea makes even more sense.

  16. Greg Sterling Says:

    The idea that these could function as landing pages is not far fetched at all.

  17. Mike Blumenthal Says:

    @Sebastien it is not ideal but yes they can manage the business listing in the LBC with the businesses permission and help with verification.

  18. Mike Blumenthal Says:

    Although it will need to be tested, as Google’s algorithm-based content checking in the LBC flags anything with the word Google in it, even a Google Sites URL…

  19. Greg Sterling Says:

    Seb: Look at it this way — anyone can link to these pages and anyone (unless claimed) can edit them.

  20. predictabuy Says:

    Just wrote a post outlining in more detail how the Place Pages are designed for optimization and could be used as landing pages: http://bit.ly/2JJKZm.

  21. Where Are Google Places Pages Going? To the Index? | Understanding Google Maps & Local Search Says:

    […] Place’s pages were introduced it was noted that they were not going to be indexed (there is a great discussion going on at Greg’s blog now) leaving the impression amongst many that they would sit, […]

  22. What Are the SEO Implications of Google Places Pages? Find Out at SMX East Says:

    […] wrote a post that discusses the potential indexing of Places Pages. And on my personal blog, Screenwerk, there’s a discussion in the comments about whether this is inevitable and the potential […]

  23. Google Plays Monopoly and Owns Every Place on the Board | Stever.ca Says:

    […] get indexed and start appearing in search results for local places and local business names then it becomes a game changer. The major Internet Yellow Pages (IYP) sites must all be doing laundry right now to clean out the […]

  24. google » What Are The SEO Implications Of Google Place Pages? Find Out At … Says:

    […] wrote a post that discusses the potential indexing of Places Pages. And on my personal blog, Screenwerk, there’s a discussion in the comments about whether this is inevitable and the potential […]

  25. Another Look at Google Place Pages « Business News Says:

    […] Click the map and then “more info” and you arrive at the new page — being created for everySource: Screenwerk RSS Feed […]

  26. Lior Ron Says:

    Greg,

    No conspiracy theory here; as I mentioned when we talked, Place Pages are not meant to be crawlable with this launch. This was an oversight on our part – we didn’t block all URL paths and left maps.google.com/place open. It’s now closed in robots.txt, and we’ll make sure all other paths are blocked as well.

    Lior.

  27. Federico Says:

    Greg,
    I always read your blog, and one of the things that’s always around is “the local flavor.”
    How do you think this new release from Google will compete against some already well-established local guides like Yelp, Yellow Pages, Citysearch, etc.?
    At first sight it seems that it’s direct competition… but Google will have to collect the data it shows from somewhere (maybe from these sites?)

  28. Greg Sterling Says:

    Thanks Lior for clarifying

  29. Greg Sterling Says:

    Google has said these pages won’t be indexed, so competition will be indirect. They will, however, strengthen Google Maps as an overall competitor.

  30. Chris Silver Smith Says:

    Lior, you might want to read Eric Enge’s interview with Matt Cutts or speak directly to Matt about this:

    http://www.stonetemple.com/articles/interview-matt-cutts.shtml

    Google’s interpretation of robots.txt rules has been a bit more literal than other search engines’ – Google will index pages that are disallowed under robots.txt; it just won’t crawl them.

    So, you may be disallowing and you may be closing up link leaks in your structure, but these Place pages will be indexed as people outside of Google link to them, so you’ll have to do something else to keep Google from making those appear in the index altogether.

    Incidentally, I’d be very interested in hearing what that would be. As Google now interprets more Javascript and Flash along with this literal interpretation, there are fewer and fewer ways of keeping some URLs from being indexed. If you have some other tool or protocol for doing so, it could be very helpful to the SEO community (I’ve been called in, for instance when highly sensitive banking industry pages have been accidentally indexed, and if there are links out of one’s control pointing to them, it can be quite difficult to put the cat back in the bag, so to speak.)

  31. Danny Sullivan Says:

    I know it’s easy to assume Google has a big plan to take over with these pages, but they said they wouldn’t be indexed. So why are they showing up? Easy. They failed to understand the difference between the robots.txt file and the meta robots tag (which isn’t hard; site owners struggle with this). Many people have been telling Google for years that the robots.txt block should be enough to totally keep pages out of the index, but oh no, Google just had to have a way to keep showing pages. Well, now it bites them in their own butt a bit. I’ve added a postscript to our SEL post to explain this more.

  32. Barry Hunter Says:

    @Danny,

    How does the meta robots tag have anything to do with this?

    The meta tag appears in the actual page, so it can only have an effect if the page is actually crawled. But because the page can’t be crawled due to the robots.txt rule, Googlebot will never see such a meta tag.

    And if you mean a ‘nofollow’ in the page containing the link – for the most part those pages are going to be outside the control of Google; for example, all these blog posts talking about Places are creating loads of links to Place Pages (which are then potentially ranked).

    I agree with Chris that there is no real way to remove these results without higher intervention from the web search team. However, maybe the ‘Remove Directory’ tool in Webmaster Tools could be used, if the Maps team has access to it for maps.google.com.

    • stroseo Says:

      @Barry and Danny,

      I initially agreed with Danny in that the meta noindex tag would definitely prevent the pages from appearing in the index. My thought was that Googlebot was disregarding the robots.txt file and partially indexing the page because of heavy linking. However, look at the title they are using for the recently popular search for “burdick chocolate cafe boston”

      The Google Places search result listing uses the title:

      “Burdick Chocolate Cafe in Boston”

      However, the html page title on the page is:

      “Burdick Chocolate Cafe – Google Maps”

      This must mean that Googlebot is never accessing the pages. Google will modify meta descriptions but never page titles. So is Google getting this information directly from the anchor text that is linking to this page?

      • stroseo Says:

        Sorry, but over at TechCrunch, Matt Cutts followed up with another reply regarding the URLs showing up in search results even though robots.txt files prevented the crawling of the pages.

        He referred to a previous post of his regarding this issue here: http://www.mattcutts.com/blog/googlebot-keep-out/

        Matt said:

        “You might wonder why Google will sometimes return an uncrawled url reference, even if Googlebot was forbidden from crawling that url by a robots.txt file.”

        “There’s a pretty good reason for that: back when I started at Google in 2000, several useful websites (eBay, the New York Times, the California DMV) had robots.txt files that forbade any page fetches whatsoever. Now I ask you, what are we supposed to return as a search result when someone does the query [california dmv]? We’d look pretty sad if we didn’t return http://www.dmv.ca.gov as the first result. But remember: we weren’t allowed to fetch pages from http://www.dmv.ca.gov at that point. The solution was to show the uncrawled link when we had a high level of confidence that it was the correct link. Sometimes we could even pull a description from the Open Directory Project, so that we could give a lot of info to users even without fetching the page.”

  33. Chris Silver Smith Says:

    Barry, the noindex meta tag will tell the engine not to INDEX the page. So, yes, it has to be crawled for that to happen, but it would keep the page from showing up in search results listings.

    So, this is partly what I was alluding to — if you’re Google Maps and you don’t want to spend tons of unnecessary CPU cycles delivering pages to Googlebot that will NOT be indexed, then you’re out of luck. You either have to expend those cycles telling Googlebot not to index, or else you end up not truly keeping pages out of the index — which is their current dilemma.
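The distinction Chris and Barry are drawing — robots.txt stops crawling, while a meta robots noindex tag stops indexing but can only be seen if the page *is* crawled — can be illustrated with a small sketch that does what a crawler would do after fetching a page: scan the head for robots directives. The class name and sample markup below are mine, purely for illustration:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in a.get("content", "").split(","))

# Hypothetical page head using the mechanism discussed above: crawlable,
# but asking engines not to index it.
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'

finder = RobotsMetaFinder()
finder.feed(page)
print("noindex" in finder.directives)  # True
```

The point of the dilemma: this tag is only reachable if the page is served to the bot first, so a site can’t combine it with a robots.txt Disallow — the two mechanisms are mutually exclusive per URL.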

  34. Michael Bauer Says:

    I just did a quick analysis over at http://www.seeingforests.com/google-places-redux/ on the different kinds of content available at different page “levels”. It was useful in the context of looking at the balance here between the auto-generated and the human-curated place pages. I’m not sure about the indexing, but I’m wondering whether simply having more of this kind of content might address the “Local Paradox” by making people more aware that they CAN search for this kind of content and by so doing drive traffic across the local spectrum.

  35. Barry Hunter Says:

    Sorry yes, that was my point, they need robots.txt to prevent crawling, in which case meta tag is useless. Wasn’t considering they could do it without the robots.txt rule 🙂

  36. Rich Rosen Says:

    Is this new?

    Blue Bottle Cafe
    This place has unverified edits. Show all edits »

    The reveal presents the history of edits. I haven’t noticed this before.

  37. Greg Sterling Says:

    Not sure re the history of edits. It could be.

  38. More Info on Google Local Listing Ads « Screenwerk Says:

    […] issue Mike, David and I talked about was Google’s non-indexing of the Place Pages. The decision was likely made to avoid alienating Google’s reseller partners, many of which […]

Comments are closed.