Why is the map listing split this way?
-
I am trying to figure out why map listings get split up this way. Some people get their map listing to the right, while the generic map listing block appears lower down, after a few organic listings.
Didn't they at some point combine organic + map results so that only one shows up in the SERPs?
-
Thanks for your input
-
Hi Saijo,
There are quite a few different ways in which Google displays local results, and your screenshot shows one of them. As to why Google tries so many different layouts, I believe they are always testing to see which layouts serve users (and Google) best.
Regarding your question:
"Didn't they at some point of time club orgainc + map together so only one shows up in SERPs ?"
I believe what you are referring to is blended results. A blended result is one in which there is both information coming from Google's local product (like reviews) and information coming from the business's website (like the title tag and URL). The majority of local listings are blended now and will appear on a page alongside organic results, as in your screenshot. So, what you are seeing is normal.
Hope this helps!
Miriam
Related Questions
-
Product Listing Pages
Hi, I had a question regarding product pages and the best way to display them for SEO. For example, is it best to have one page for Blue Euro Containers that includes a table of the capacity options you can buy, or to split each product out so it has its own product page (60L Blue Euro Container, etc.)? I know a lot of the information will be fairly similar, with the capacity being the one major difference - is this a bad thing? Some of our product tables are too big and the idea was to split them out. Thanks!
Algorithm Updates | BeckyKey
-
Website dance on Google Map results and organic seo results
My website shows a different position on maps.google.com every day. For the last few days, for example, yesterday it was in 21st position for one keyword and today it is nowhere to be found, and the same is happening with other keywords. Is this a Google Dance? How long can it last, and what is the solution for handling it?
Algorithm Updates | mnkpso
-
What is the point of XML site maps?
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page via an XML sitemap, that page will have no link juice and will appear very low in search results, if at all. The priority field in XML sitemaps also seems pretty much irrelevant to me: Google determines the priority of a page based on the number of inbound links to it, and if your site is designed properly the most important pages will have the most links. The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't needed. Plus, for most of the web, the significant content of an existing page doesn't change regularly; instead, new pages are added with new content. This leaves the lastmod field as potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently. From what I've seen, most sitemap generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields, they won't put any weight on them. It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML sitemaps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
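The lastmod idea the question describes can be sketched in a few lines of Python. This is a minimal, hypothetical generator (the example.com URLs and dates are invented for illustration) that emits a sitemap containing only loc and lastmod, restricted to pages changed in the last week:

```python
from datetime import date, timedelta
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a sitemap <urlset> with only loc and lastmod,
    the two fields the discussion treats as potentially useful."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages and modification dates.
today = date(2013, 6, 1)
pages = [
    ("https://example.com/", today),
    ("https://example.com/blog/new-post", today - timedelta(days=2)),
    ("https://example.com/about", today - timedelta(days=400)),
]

# Include only pages changed in the last 7 days, so the sitemap stays
# small and points the crawler at fresh content.
recent = [(loc, mod) for loc, mod in pages if (today - mod).days <= 7]
xml = build_sitemap(recent)
print(xml)
```

In this sketch the stale /about page is dropped entirely, which matches the question's point: a sitemap of only recently changed URLs, with trustworthy lastmod values, is the one variant that could plausibly make crawling more efficient.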
Algorithm Updates | pasware
-
Are Some Websites "White Listed"?
I track several niches that I am not in, so I am not too biased toward my own, and I noticed one site that, despite its rather mediocre quality, never moves. I have seen other websites rise and fall in rank, a few with pretty good content. He writes reviews, but has very obviously never touched the products he reviews. However, I see other sites with real photos and good advice for making a decision, and they sit on page two or three. I haven't done a lot of research other than comparing the size of the sites and the links, and they are about equal. Sometimes the ranking site is smaller (it's about 90 pages in Google). The other sites I have seen also have more content on the topic, which is interesting: Google opts for his one-page "once over" review over something more in-depth and authentic. It got me thinking about whether some sites are whitelisted by Google, as in hand-picked to rank regardless of what else is out there. Is this possible?
Algorithm Updates | PrivatePartners
-
Double Listings On Page One
I've been noticing a trend over the past month and a half. My sites that used to get more than one page listed in certain SERPs are now being adjusted. It almost looks manual, but I know it is most likely a change in the algorithm. Let's say my site was showing two different sub-pages in a single SERP, at #4 and #6; now one page is being pushed up to #3 while the other is being pushed back past the first page. I'm not worried about penalizations or loss of value. I have been seeing this across many of my clients' sites. I just wanted to confirm that others were seeing it as well (so I'm not going crazy) and/or whether Google has made any announcements or leaks regarding this shift. Maybe it's just my sites coming of age or something, but I would love to be able to explain it more knowledgeably than with a "Google might be doing this". BTW - this is not affecting any of my brand SERPs.
Algorithm Updates | BenRWoodard
-
TripAdvisor multiple listings
Why do certain sites, for example TripAdvisor, rank with multiple international sites (e.g. TripAdvisor.co.uk, TripAdvisor.com, TripAdvisor.fr, TripAdvisor.de) for an English search phrase? From my viewpoint they are just spamming the index with the same content. These searches are performed on google.co.uk.
Algorithm Updates | NeilTompkins
-
What is the best way for a local business site to come up in the SERPs for a town that they are not located in?
At our agency, we work with many local small business owners who often want to come up in multiple towns near their business where they do not have a physical address. We explain to them again and again that, with the recent changes Google in particular has made to its algorithms, it is very difficult to come up in the new "blended" organic and Places results in a town where you don't have a physical address. However, many of these towns are within 2 or 3 miles of the physical location and well within driving distance for potential new clients. Google, in its infinite wisdom, doesn't seem to account for areas of the country, such as New Jersey, where these limitations can seriously affect a business's bottom line. What we would like to know is: what are other SEOs doing to help their clients come up in neighboring towns in a way that is both organic and white hat?
Algorithm Updates | Mike-i
-
Best way to find new studies?
I'm wondering how sites break news without dedicating too much time to research. Are there any tools that you use? For example, if you wanted to be one of the first people to know when a celebrity was diagnosed with a health condition, how would you go about doing that?
Algorithm Updates | nicole.healthline