GWT and HTML improvements
-
Hi all
I am dealing with duplicate content issues in Webmaster Tools, but I still don't understand what's happening, as the number of issues keeps changing. Last week the duplicate meta descriptions were 232, then they went down to 170, and now they are back up to 218.
Same story for duplicate meta titles: 110, then 70, now 114. These ups and downs have been going on for a while, and for the past two weeks I stopped changing things to see what would happen.
Also, the issues reported in GWT are different from the ones shown in the Crawl Diagnostics on Moz.
Furthermore, most URLs were changed (more than a year ago) and 301 redirects have been implemented, but Google doesn't seem to recognize them.
Could anyone help me with this?
Also can you suggest a tool to check redirects?
Cheers
Oscar
-
Thank you guys for your answers, I will look into it, and try to solve the problems.
I think many pages are self-canonicalized, but I see that many URLs haven't been redirected to the new ones, so I will start fixing the redirects.
The top pages report, though, shows just the new URLs.
Anyway, I will keep you updated on this, as I am not too sure how to tackle it.
Thanks a lot.
Cheers
-
Had a few minutes and wanted to help out...
Google doesn't always crawl/index the same number of pages week over week, so this could be the cause of the differences you are seeing in your indexing reports. Also, if you are working on the site and making changes, you should see these numbers improve (depending on site size, of course).
Enterprise sites might take more time to go through and fix up, so the numbers might look like they are holding steady if your site is huge.
To help with your 301 issue, I would definitely download Screaming Frog's SEO Spider. It's a great tool for identifying potential problems on a site, and it's very easy to download and use. It might take some getting used to, but the learning curve isn't steep. Once you've used it a few times to diagnose problems, or to watch things you are working on improve across multiple crawls, it will also show you other things that might not be working so you can plan fixes for those too.
As well, make sure to review your .htaccess file and how you have written your 301s. If you are using Apache, the 301-related article linked here is a great resource to help you along.
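For illustration, here are a couple of common .htaccess patterns; the paths are hypothetical, so swap in your own old and new URLs:

```apache
# Hypothetical one-to-one redirect of an old page to its replacement
Redirect 301 /old-page.html /new-page/

# Hypothetical pattern-based rule: strip .aspx extensions site-wide
RewriteEngine On
RewriteRule ^(.*)\.aspx$ /$1 [R=301,L]
```

The first form is fine for a handful of URLs; for whole sections that changed structure, a RewriteRule keeps the file manageable.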
Make sure to manually check all 301 redirects using the data/URLs from the Screaming Frog crawl. Type the old URLs in and visually confirm that you get redirected to the new page/URL. If you do, the redirects are working correctly, and I'm sure it's only a matter of time before Google fixes its index and displays the right URL. You can also use an online redirect checker (linked here) to test the old URLs and see how they respond.
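If there are too many URLs to check by hand, a short script can do the status-code check for you. This is a minimal sketch using only Python's standard library; the tiny local server and the `/old-page.aspx` URL are made-up stand-ins for your real site:

```python
import http.server
import threading
import urllib.error
import urllib.request

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from silently following redirects so we can inspect them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # makes urllib raise HTTPError with the redirect status

_opener = urllib.request.build_opener(_NoRedirect)

def check_redirect(url):
    """Return (status_code, Location header or None) for a single URL."""
    try:
        resp = _opener.open(url)
        return resp.status, None
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

# --- tiny local demo server standing in for a real site ---
class _DemoSite(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page.aspx":      # hypothetical old URL
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
    def log_message(self, *args):              # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), _DemoSite)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_port

print(check_redirect(base + "/old-page.aspx"))  # (301, '/new-page')
```

Run it against a list of your old URLs and flag anything that doesn't come back as a 301 pointing at the expected new location (watch out for 302s, which don't pass signals the same way a 301 does).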
Hope some of this helps get you off to testing and fixing! Keep me posted if you run into trouble or need someone to run a few tests from another location.
Cheers!
-
We had the same issue on one of our sites. Here is how I understand it after looking into it and talking to some other SEOs.
The duplicate Title and Meta description reports seem to lag any 301 redirects or canonicals that you implement. We went through a massive site update and had 301s in place for over a year, with "duplicates" still showing up in GWT for both old and new URLs. Just to be clear, we had the old URLs 301ing to the new ones for over a year.
What we found, too, was that if you look in GWT under the top landing pages, old URLs would be listed there as well.
The solution was to put self-canonicalizing links on all pages that were not canonicalized to another one. This cleaned things up over the next month or so. I had already checked my 301 redirects, removed all links to old content on my site, etc.
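For anyone wondering what that looks like in practice, a self-referencing canonical is just a link element in the page's head; the URL below is a made-up example:

```html
<head>
  <!-- On https://www.example.com/new-page itself, the canonical points back at the same URL -->
  <link rel="canonical" href="https://www.example.com/new-page" />
</head>
```

Pages that should consolidate to a different URL point their canonical at that target instead of at themselves.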
What I still find are a few more "duplicates" in GWT. This happens with two types of URLs:
-
We have to change a URL for some reason, so we put in the 301. It takes a while for Google to pick that up and apply it to the duplicate content report, even when we see the change reflected in the index pretty quickly. As I said, the duplicate report seems to lag the other reports.
-
We still have some very old URLs, and it has taken Google a while to "circle back" to check them, see the 301 and the self-canonical, and fix them.
I am honestly flabbergasted and surprised at how slow Google is about this. I have talked with a bunch of people just to make sure we are not doing anything wrong with our 301s, etc. So, while I understand what is happening, and can see it improving, I still don't have a good "why" for it when, technically, I have everything straight (as far as I know). The self-canonical was the solution, but it seems a 301 should be enough on its own. I know there are still old links to old content out there (that is the one thing I cannot update), but I'm not sure why that would matter.
It is almost like Google has an old sitemap it keeps crawling, but again, I have cleared that out in Google as well.
If you double check all your stuff and if you find anything new, I would love to know!
Cheers!