Would duplicate listings affect a client's ranking if they used the same address?
-
Lots of duplication on directory listings using a similar or identical address, just different company names... like so-and-so carpet cleaning, and another listing for so-and-so janitorial services.
Now my client went from ranking around 3-4 to not even in the top 50 within a week.
--- Would duplication cause this sudden drop?
There isn't a lot of competition for this client's keyword (janitorial services nh);
--- would a competitor that recently optimized a site cause this sudden drop?
The client does need to optimize for this keyword, and they do need to clean up this duplication. (Unfortunately, this drop happened the first of March. I provided the audit and recommendations, and I am still awaiting the thumbs up to continue with implementation.)
--- Did Google make a change, possibly find these discrepancies within the listings, and suddenly drop this client's ranking?
And then there's Google Places:
The client usually ranks #1 in Google Places with up to 12 excellent reviews, so they are still getting a good spot on the first page. The very odd thing, though, is that Google is still saying that they need to re-verify their Google Places listing.
I would really like to know how a Google Places account could still need verification and yet rank so well within the Google Places results on the page. Is it because of the great reviews? --- Any ideas here, too?
_Cindy
-
Glad to be of help, Cindy. Good luck!
Miriam
-
Miriam,
Thank you very much. I believe I am starting to see the whole picture of possibilities for this drop in rank. And excellent info about Google Places.
I do need to visit the Places help forum and add it to the list of resources!
Much appreciated.
_Cindy
-
Hi Cindy,
I would definitely suspect either a penalty or a bug of some sort regarding the drop from the top rankings to outside the top 50.
What happens when you do a direct business name or phone number search within maps.google.com for the business? Are you able to find their Place Page, or are you possibly getting a 'Do Not Support' message?
Check out this thread at the Google Places Help Forum to see if you recognize your problem in it:
https://productforums.google.com/forum/#!category-topic/business/technical-issue/9JszqkewMVU
Regardless of this, you have clearly identified problems with the business's data that need to be fixed ASAP. There is no question in my mind that the data confusion could be precisely what has caused the ranking drop. Data consistency is the number one requirement of a good Places record, but a business can get by for months or years with high rankings and bad data. Then, one day, it all goes away. This is something I see being reported constantly in the Google Places Help Forum, and the major lesson is that if you find problems, make every effort to clean them up as swiftly as possible. Then, once you've cleaned up the record to the best of your ability, you need to wait for the effects of what you've done to settle in. So, it becomes about patience.
Hopefully, the client will give you the go-ahead to get cracking on this. Good luck!
-
Perfect, thanks Cody, that helps a lot.
And thank you for the advice on matching the contact page and Google Places!
_Cindy
-
In the local search realm, you want every directory listing to appear exactly as it does on your contact page. This allows search engines to associate those listings with your business without a doubt. As a good practice, I'd go ahead and make sure they all match, especially Google Places vs. your client's Contact Us page on their website. A rough way to automate that check is sketched below.
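For illustration, here's a minimal Python sketch of that kind of consistency check. Everything in it is hypothetical: it assumes you've hand-collected the name, address, and phone (NAP) from each listing, and the sources, values, and normalization rules are placeholders to adapt.

```python
import re

# Hypothetical records from a manual citation audit; the sources and
# values are placeholders, not real listings.
listings = [
    {"source": "website contact page", "name": "So-and-So Janitorial Services",
     "address": "12 Main St, Manchester, NH 03101", "phone": "(603) 555-0100"},
    {"source": "directory A", "name": "So-and-So Carpet Cleaning",
     "address": "12 Main St, Manchester, NH 03101", "phone": "603-555-0100"},
]

def norm_text(value):
    # Lowercase, drop punctuation, collapse whitespace.
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", value.lower())).strip()

def norm_phone(value):
    # Compare digits only, so formatting differences aren't flagged.
    return re.sub(r"\D", "", value)

canonical = listings[0]  # treat the contact page as the master record
for listing in listings[1:]:
    for field, norm in (("name", norm_text), ("address", norm_text),
                        ("phone", norm_phone)):
        if norm(listing[field]) != norm(canonical[field]):
            print(f"{listing['source']}: {field} mismatch -> "
                  f"{listing[field]!r} vs {canonical[field]!r}")
```

Run against real citation data, this would surface exactly the kind of name confusion described in the question: different company names attached to the same address.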
Also, Penguin hit directories pretty hard. From what I have read there is no penalization going on, but there has been some major devaluation of links. It's likely that your client is not getting nearly as much weight from the directory links as they once did. Check out the link profile and see how many of these there are; a quick way to tally them is sketched below. I would imagine more reputable links now need to be acquired.
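As a rough way to do that tally, here's a hedged Python sketch. It assumes the link profile has been exported to a CSV with a source_url column; the file name, column name, and list of directory domains are all assumptions to adjust.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Substrings that identify common directories; purely illustrative.
directory_markers = ("yellowpages", "manta", "superpages", "citysearch", "hotfrog")

counts = Counter()
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        domain = urlparse(row["source_url"]).netloc.lower()
        if any(marker in domain for marker in directory_markers):
            counts[domain] += 1

print(f"{sum(counts.values())} directory links across {len(counts)} domains")
for domain, n in counts.most_common():
    print(f"  {domain}: {n}")
```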
About the Places listing, I'm not sure why that would occur, but it really doesn't matter for the end result here. If the Big G says you need to re-verify, I would do it. It's a quick thing to do, and an unverified listing can't be helping their rankings.
Related Questions
-
How to get into Google's Top Stories?
Hi All, I have been doing research for a few weeks and I cannot for the life of me figure out why I cannot get my website (Racenet) into the top stories in Google. We are in Google News, have "news article" schema, have AMP pages. Our news articles also perform quite well organically and we typically dominate the Google News section. We have two main competitors (Punters and Just Horse Racing) who are both in top stories and I cannot find anything that we are doing that they aren't. Apparently the AMP "news article" schema is incorrect and that could be the reason why we aren't showing up in Google Top Stories, but I can't find anything wrong with the schema and it looks the same as our competitors. For example: https://search.google.com/structured-data/testing-tool/u/0/#url=https%3A%2F%2Fwww.racenet.com.au%2Fnews%2Fblake-shinn-booked-to-ride-doncaster-handicap-favourite-alizee-20190331%3FisAmp%3D1 Does anyone have any ideas of why I cannot get my site into Google Top Stories? Any and all help would be greatly appreciated. Thanks! 🙂
Technical SEO | Saba.Elahi.M.
Why is robots.txt blocking URLs in the sitemap?
Hi Folks, Any ideas why Google Webmaster Tools is indicating that my robots.txt is blocking URLs linked in my sitemap.xml, when in fact it isn't? I have checked the current robots.txt declarations and they are fine, and I've also tested it in the 'robots.txt Tester' tool, which indicates that the URLs it suggests are blocked in the sitemap in fact work fine. Is this a temporary issue that will be resolved over a few days, or should I be concerned? I recently removed the declaration from the robots.txt that would have been blocking them and then uploaded a new, updated sitemap.xml. I'm assuming this issue is due to some sort of crossover. Thanks Gaz
Technical SEO | PurpleGriffon
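One way to sanity-check this outside of Google's tools: Python's standard library can fetch the live robots.txt and evaluate it against the flagged URLs. A minimal sketch, with the domain and paths as placeholders:

```python
from urllib.robotparser import RobotFileParser

# Parse the live robots.txt once, then test each URL the report flagged.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

flagged_urls = [
    "https://www.example.com/some-sitemap-page/",
    "https://www.example.com/another-sitemap-page/",
]
for url in flagged_urls:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}: {url}")
```

If this reports everything as allowed while Webmaster Tools still complains, stale data on Google's side (the crossover the poster suspects) is the likely explanation, and the warnings should clear on their own.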
Duplicate Page Title for a Large Listing Website
My company has a popular website that has over 4,000 crawl errors showing in Moz, most of them coming up as Duplicate Page Title. These duplicate page titles are coming from pages where the title is the keyword followed by a location, such as: "main keyword" North Carolina, "main keyword" Texas ... and so forth. These pages are ranked and get a lot of traffic. I was wondering what the best solution is for resolving these types of crawl errors without it affecting our rankings. Thanks!
Technical SEO | StorageUnitAuctionList
Case-sensitive URLs
Hi, Really appreciate advice on this one in advance! We had a problem with case-sensitive URLs (e.g. /web-jobs or /Web-jobs). We added code to convert all URLs into lowercase letters, with a 301 redirect. We are now experiencing problems with duplicate page content. Each time a URL contains a capital letter it is converted and redirected to the lowercase URL. I can convert all URLs into lowercase letters (in all places), but the problem now is that Google has already indexed the URLs, so they may cause a duplicate content issue. The solution: Remove the 301 redirection added to convert URLs into lowercase. Add a canonical URL which points to the all-lowercase version, so Google indexes content only from the canonical URL. But I am a little confused about what will happen to already-indexed pages with caps in the URL. Appreciate any advice you can give? Simon
Technical SEO | simmo235
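For what it's worth, the redirect half of this setup fits in a few lines. Below is a minimal, framework-agnostic WSGI sketch of the "force lowercase with a 301" behavior described above; it is illustrative only, and the real version would live in whatever server or framework the site actually runs.

```python
def lowercase_redirect(app):
    """Wrap a WSGI app so any path containing uppercase letters
    is 301-redirected to its lowercase form."""
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "")
        if path != path.lower():
            location = path.lower()
            if environ.get("QUERY_STRING"):
                location += "?" + environ["QUERY_STRING"]
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)
    return middleware

# usage: application = lowercase_redirect(application)
```

With 301s like this in place, already-indexed uppercase URLs are typically recrawled and consolidated over time, so a canonical tag pointing at the lowercase URL works better as a belt-and-braces addition than as a replacement for the redirect.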
According to 1 of my PRO campaigns - I have 250+ pages with Duplicate Content - Could my empty 'tag' pages be to blame?
Like I said, one of my Moz reports is showing 250+ pages with duplicate content. Should I just delete the tag pages? Is that worth my time? How do I alert SEOmoz that the changes have been made, so that they show up in my next report?
Technical SEO | TylerAbernethy
I'm redesigning a website which will have a new URL format. What's the best way to redirect all the old URLs to the new ones? Is there an automated, fast way to do this?
For example, the new URL will be: https://oregonoptimalhealth.com/about_us.html while the old ones were like this: http://www.oregonoptimalhealth.com/home/ooh/smartlist_1/services.html I have to redirect almost 100 old pages to the correct new pages. What's the best and easiest way to do this?
Technical SEO | PolarisMarketing
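Since the question asks for an automated approach: if the old and new URLs can be paired up in a spreadsheet, a few lines of Python can generate the server rules in bulk. A sketch assuming an Apache server and a hypothetical headerless redirect_map.csv of old_url,new_url rows:

```python
import csv

# Read "old_url,new_url" pairs and emit Apache RedirectPermanent rules.
with open("redirect_map.csv", newline="") as src, open("redirects.conf", "w") as out:
    for old_url, new_url in csv.reader(src):
        # Apache matches on the path, so strip scheme and host from the old URL.
        parts = old_url.split("/", 3)
        old_path = "/" + parts[3] if len(parts) == 4 else old_url
        out.write(f"RedirectPermanent {old_path} {new_url}\n")
```

The generated file can then be included in the site's Apache config or pasted into .htaccess; on other servers the same mapping would feed nginx rewrite rules or framework-level redirects instead.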
Can I format my H1 to be smaller than H2s and H3s on the same page?
I would like to create a web design with a 12px H1, and for subheadings on the page to be more like 24px. Will search engines see this and dislike it? The reason for doing it is that I want to put a generic page title in the banner and more poetic headings above the main body. Example: Small H1: Wholesale coffee, online coffee shop and London roastery Large H2: Respect the bean... Thanks
Scott
Technical SEO | Crumpled_Dog
Ranked on second page for keyword; now not within first 1000 listings?
Our website www.clientfirstfunding.com ranked 13th for the keyword "structured settlement". After this weekend we are no longer ranking for this keyword at all. We haven't made any changes at all to the site, and I haven't gained any backlinks that appear to be spammy. We have held this position for the last several months. I can understand a drop in SERPs, but one this drastic is shocking. Any ideas as to what could have caused this would be greatly appreciated.
Technical SEO | Tony1986