Best Way To Go About Fixing "HTML Improvements"
-
So I have a site on which I was creating dynamic pages for a while; what happened was that some of them accidentally ended up with lots of similar meta tags and titles. I then changed up my site but left those duplicate tags in place for a while, not knowing what had happened. Recently I began my SEO campaign once again and noticed that these errors were there, so I did the following:
-
Removed the pages.
-
Removed the directories that contained these dynamic pages using the removal tool in Google Webmaster Tools.
-
Blocked Google from crawling those pages with robots.txt.
I have verified that the robots.txt works and the pages are no longer in Google search; however, they still show up in the HTML Improvements section a week later (the report has updated a few times). So I decided to remove the robots.txt rules and add 301 redirects instead.
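For the 301 step, here is a minimal sketch of a redirect map for retired dynamic pages. The paths are hypothetical placeholders, not the poster's real URLs; the same idea applies whether the mapping lives in server config or application code.

```python
# Hypothetical mapping from retired dynamic pages to the pages that
# replaced them. Each old URL should 301 to its closest equivalent,
# not just to the homepage.
REDIRECT_MAP = {
    "/old-dynamic/page-1": "/new-section/",
    "/old-dynamic/page-2": "/new-section/",
}

def resolve(path):
    """Return (status, location): a 301 for retired paths, 200 otherwise."""
    target = REDIRECT_MAP.get(path)
    if target is not None:
        return 301, target
    return 200, path
```

Unlike a robots.txt block, a 301 lets Googlebot revisit the old URL, see the redirect, and consolidate it, which is what eventually clears the duplicate-title report.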
Does anyone have any experience with this, and am I going about this the right way? Any additional info is greatly appreciated. Thanks.
-
-
Great advice here.
Just to add: Google Search Console seems to update its reports more slowly than the search index, so it is possible to see old errors for longer than they exist, until everything is re-indexed.
Kind Regards
Jimmy
-
Hi there
I wouldn't remove pages just because they had issues. Some of that content may hold value; it's just a matter of making sure that your on-site SEO is unique to those pages. Your users may be searching for it, so make sure you research and tailor those pages to your users' intent.
Google also offers advice on duplicate content, including parameters and dynamic pages, so make sure you read through that before you just start discarding pages/content.
Hope this helps! Good luck!
Related Questions
-
Best way to set up URL structure for reviews off of PDP pages.
We are adding existing customer reviews to Product Detail Pages (PDPs). There are about 300 reviews per product, so we're going to have to paginate reviews off of the PDP. I'm wondering what the best URL structure for review pages is to get the most SEO benefit. For example, would it be something like this: site.com/category/product/reviews/page-1, or something that uses parameters, such as site.com/reviews?product=a? Also, what is the best way to show that the internal link on the PDP to "All Reviews" is a higher-priority link than the other links on the page?
Intermediate & Advanced SEO | | katseo10 -
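The path-style option from the question above can be sketched as a small URL builder. This only illustrates the structure being asked about; it does not settle whether path segments or query parameters rank better.

```python
def review_page_url(base, product_slug, page):
    """Build a path-style URL for paginated product reviews,
    following the site.com/category/product/reviews/page-N pattern
    from the question. Inputs here are hypothetical examples."""
    root = "%s/%s/reviews" % (base.rstrip("/"), product_slug)
    if page == 1:
        # Keep the first page at the canonical .../reviews/ location
        # rather than .../reviews/page-1/.
        return root + "/"
    return "%s/page-%d/" % (root, page)
```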
What is the best way to go about product comparison text?
Our website is in the midst of a massive content enrichment project: we're moving from mostly catalog content to optimized web content. Our catalog and copy teams are hoping to include more product comparisons on the web (e.g. "unlike composite basketballs, rubber ones are more X..."), which can certainly provide useful information to our shoppers! However, from an SEO standpoint, we seem to have confused search engines when doing this in the past (the example above currently ranks for a "composite basketball" term, not a rubber one). So... what is the best way to provide useful product comparisons without confusing search engines?
Intermediate & Advanced SEO | | laurenf0 -
Google webmaster tools showing "no data available" for links to site, why?
In my Google Webmaster Tools account I'm seeing all the data in the other categories except links to my site. When I click "links to your site" I get a "no data available" message. Does anyone know why this is happening, and if so, how to fix it? Thanks.
Intermediate & Advanced SEO | | Nicktaylor10 -
Use the "If-Modified-Since" HTTP header
I'm working on an online Brazilian marketplace (it looks like Etsy in the US) and we have a huge number of pages... I've been studying this a lot, and I was wondering about using If-Modified-Since so Googlebot could check whether pages have been updated; if a page hasn't, there is no reason to get a new copy of it, since Googlebot already has a current copy in the index. This uses the 304 status code: "if a search engine crawler sees a web page status code of 304, it knows that web page has not been updated and does not need to be accessed again." As someone put it before me: "Since Google spiders billions of pages, there is no real need to use their resources or mine to look at a webpage that has not changed. For very large websites, the crawling process of search engine spiders can consume lots of bandwidth and result in extra cost." Googlebot could spend more time on pages that actually changed, or on new stuff! However, I've checked Amazon, Rakuten, Etsy, and a few other competitors, and none of them use it! I'd love to know what you folks think about it 🙂
Intermediate & Advanced SEO | | SeoMartin10 -
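The conditional-GET behavior described above can be sketched server-side. This is a minimal illustration of the If-Modified-Since / 304 exchange, not Googlebot's actual crawl logic:

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def conditional_get(last_modified, if_modified_since):
    """Decide a response for a conditional GET.

    Returns (status, headers): 304 with no body when the client's cached
    copy is still current, otherwise 200 with a fresh Last-Modified header.
    """
    if if_modified_since:
        try:
            cached = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            cached = None  # malformed header: fall through to a full 200
        # HTTP dates carry whole-second precision, so compare accordingly.
        if cached is not None and last_modified.replace(microsecond=0) <= cached:
            return 304, {}
    return 200, {"Last-Modified": format_datetime(last_modified, usegmt=True)}
```

The bandwidth saving comes from the empty 304 body; the crawler re-requests cheaply and only downloads pages that actually changed.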
Why is Google rewriting titles with the brand name at the front followed by a colon " : ", i.e. Brandname: the rest of the title
Example: https://www.google.nl/search?q=providercheck.nl&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a#bav=on.2,or.r_cp.r_qf.&ei=9xUCUuH6DYPePYHSgKgJ&fp=96e0b845c2047734&q=www.providercheck.nl&rls=org.mozilla:en-US:official&sa=X&spell=1&ved=0CC4QBSgA Look at the first result: www.providercheck.nl
Intermediate & Advanced SEO | | Zanox0 -
Does "Noindex" lead to Loss of Link Equity?
Our company has two websites with about 8,000 duplicate articles between them. Yep, 8,000 articles were posted on both sites over the past few years. This is the definition of cross-domain duplicate content. Plan A is to set all of the articles to "noindex,follow" on the site that we care less about (site B). We are not redirecting since we want to keep the content on that site for on-site traffic to discover. If we do set them to "noindex," my concern is that we'll lose massive amounts of link equity acquired over time...and thus lose domain authority...thus overall site rankability. Does Google treat pages changed to "noindex" the same as 404 pages? If so, then I imagine we would lose massive link equity. Plan B is to just wait it out since we're migrating site B to site A in 6-9 months, and hope that our more important site (site A) doesn't get a Panda penalty in the meantime. Thoughts on the better plan?
Intermediate & Advanced SEO | | M_D_Golden_Peak0 -
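Before relying on Plan A, it is worth verifying that the robots meta tag actually reads "noindex,follow" on every affected page. A small checker, sketched with the standard-library HTML parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_noindex_follow(html):
    """True when the page carries both noindex and follow directives."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives and "follow" in parser.directives
```

The "follow" half matters for the link-equity question: it tells crawlers to keep following (and passing value through) the page's outbound links even though the page itself is dropped from the index.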
Use of rel="alternate" hreflang="x"
Google states that use of rel="alternate" hreflang="x" is recommended when:
- You translate only the template of your page, such as the navigation and footer, and keep the main content in a single language. This is common on pages that feature user-generated content, like a forum post.
- Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland.
- Your site content is fully translated. For example, you have both German and English versions of each page.
Does this mean that if I write new content in a different language for a website hosted on my subdomain, I should not use this tag? Regards, Shailendra Sial
Intermediate & Advanced SEO | | IM_Learner0 -
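For the fully-translated case in the question above, the annotations are a set of reciprocal link tags. A small sketch (the example.com URLs are placeholders) of generating them:

```python
def hreflang_links(variants):
    """variants maps an hreflang code (e.g. "en-us") to that version's URL.

    Every language/region version of the page should carry the full set
    of <link> tags, including one pointing at itself, so the annotations
    are reciprocal.
    """
    return [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
        for code, url in sorted(variants.items())
    ]
```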
One of the sites I work on keeps having its home page "de-indexed" by Google every few months. I then apply for a review and they put it back up, but I have no idea why this keeps happening, and it's only the home page
One of the sites I work on (www.eva-alexander.com) keeps having its home page "de-indexed" by Google every few months. I then apply for a review and they put it back up, but I have no idea why this keeps happening, and it's only the home page. I have never experienced this before.
Intermediate & Advanced SEO | | GMD10