Pages with duplicate meta descriptions
-
We have around 17 pages that have underscores in the URL. Of those 17 pages, we have changed the URLs of 3 so far; for example, if the URL was test_sample_demo.html, we changed it to test-sample-demo.html.
After the updates, we set up a 301 redirect as follows:
Redirect 301 test_sample_demo.html test-sample-demo.html
At present, Google Webmaster Tools shows "Pages with duplicate meta descriptions" and "Pages with duplicate title tags" warnings for the changed pages.
How can we fix this? Please help us.
-
Hi,
It sounds like either the 301 isn't implemented correctly, or Google hasn't re-crawled the old URLs since you implemented the redirect.
How long ago did you change the URLs? If it was only a few days ago, I'd just wait for Google to crawl your old URLs again and detect the 301.
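On the first possibility: if the directive is written exactly as posted, the redirect may indeed be broken, because Apache's `Redirect` directive expects the old URL to be given as a URL-path beginning with a slash. A minimal sketch of the corrected rule (assuming the pages live at the document root and the directive sits in the site config or an `.htaccess` file):

```apache
# The old path must be a URL-path starting with "/"; the target may be
# a path on the same host or an absolute URL.
Redirect 301 /test_sample_demo.html /test-sample-demo.html
```

The same pattern applies to the other underscore URLs. After changing the config, confirm each old URL answers with a 301 status and the new address in its Location header.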
Hope it helps.
-
This is likely an error in Google's auditing system (within Google Search Console) and is (probably) not your fault. If pages which are redirecting are being flagged as having duplicate Meta descriptions, that is demonstrably and necessarily factually inaccurate. A page which redirects somewhere else **never serves its source code** to Google, users or anyone else (assuming that the redirect is global, of course). If the source code is never seen, Google should not be able to find any Meta description, let alone a duplicate one.
In all likelihood Google is comparing the new URLs against cached versions of the old pages (instead of re-visiting the old addresses as live URLs like it should). As such it believes there's duplicate Meta data. When it eventually bothers to _actually_ re-crawl the old URLs, it will work out its issues and fix itself. If you want to speed it along, Fetch and Render the old URLs so that Google knows they are actually redirecting now. Following that, spam the 'mark as fixed' button until it complies with your work.
If however you are exempting Google from those particular redirects (maybe via the Googlebot user-agent), then obviously it can't see the redirects and is still accessing the old page-versions. Make sure that Google follows 301 redirects in the same way that users are forced to.
Be sure to test the redirects manually, using something like a redirect-path Chrome extension, to confirm they actually work. Set your user-agent to Googlebot, do a hard refresh to clear your cache, and try the page again. Try using a VPN to access the redirects from servers in different locations (try the UK, somewhere in Europe, the USA). Sometimes redirects are 'conditional', and if Google is somehow slipping through the net, that's a problem for you. Never just accept "well, someone told me it was coded like this so it must always apply". Test manually and work out the real truth.
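If you'd rather script the check than click through a browser, here is a minimal sketch in Python. It spins up a throwaway local server that 301s the old underscore URL (a stand-in for your real site, whose domain isn't shown in the question) and then fetches it with a Googlebot user-agent, without following the redirect, to inspect the raw status and Location header:

```python
import http.client
import http.server
import threading

# Stand-in for the real site: a tiny local server that 301-redirects
# the old underscore URL to the hyphenated version.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/test_sample_demo.html":
            self.send_response(301)
            self.send_header("Location", "/test-sample-demo.html")
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# http.client never follows redirects, so we see the raw 301 response --
# exactly what a crawler sees on its first request to the old URL.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/test_sample_demo.html",
             headers={"User-Agent": "Googlebot"})
resp = conn.getresponse()
status = resp.status
location = resp.getheader("Location")
print(status, location)  # expect: 301 /test-sample-demo.html

conn.close()
server.shutdown()
```

Point the same client code at your live hostname instead of the local server and you can verify every redirect from a script; anything other than a single 301 to the hyphenated URL means the redirect isn't doing what Google needs.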
Hope that helps!