Campaign shows 5,000 warnings from a shared database feed; we made the pages nofollow and noindex. Are we OK now?
-
One of our campaigns shows 5,000 warnings for duplicate content, duplicate meta descriptions, and duplicate URLs.
These come from an XML database feed that is shared throughout the industry. We made the pages nofollow and noindex, but the Moz crawl still reports the warnings. There are no warnings in Webmaster Tools.
Should we ignore these warnings? Are we OK now, or is there more work to do?
-
I think best practice is to make them "noindex,follow". You'll still get the warnings in Moz; they're OK to ignore once you have noindexed the pages.
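For reference, a "noindex,follow" directive is a single meta tag in the page head (or the equivalent X-Robots-Tag HTTP header for non-HTML resources):

```html
<!-- In the <head> of each feed-generated page: tells search engines not
     to index the page, but still to follow the links on it -->
<meta name="robots" content="noindex,follow">
```

Note that for a crawler to see this tag, the page must not also be blocked in robots.txt; a blocked page can't be fetched, so the noindex is never read.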
Related Questions
-
Page Indexing without content
Hello. I have a problem with pages being indexed without content. My website is in three languages; two of the language versions are indexing just fine, but one (the most important one) is being indexed without content. When searching with site: the page comes up, but when searching for unique keywords I should definitely rank for, nothing comes up. The page was indexing fine until a couple of days ago, when the problem appeared right after a Google update finished. Looking further, the problem is language-related: every newly indexed page in that language has the problem, while pages last crawled around a week ago are fine. Has anyone run into this type of problem?
Technical SEO | AtuliSulava -
Getting 'Indexed, not submitted in sitemap' for around a third of my site. But these pages ARE in the sitemap we submitted.
As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we have submitted and the URLs are definitely in the sitemap. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham Sitemap it is located on: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml
Technical SEO | TTYH -
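One sanity check worth running before blaming Search Console (a sketch, not Moz or Google tooling; the sitemap content below is a hypothetical stand-in for a downloaded sitemap file): parse the submitted sitemap and confirm the exact URL string appears, since GSC matches URLs verbatim, including scheme, case, and trailing slashes.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample standing in for a downloaded sitemap file (bytes,
# as returned by an HTTP fetch).
SITEMAP_XML = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.teacherstoyourhome.co.uk/german-tutor/Egham</loc></url>
  <url><loc>https://www.teacherstoyourhome.co.uk/german-tutor/Staines</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_bytes):
    """Return the exact <loc> strings from a sitemap document."""
    root = ET.fromstring(xml_bytes)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

urls = sitemap_urls(SITEMAP_XML)
# Compare the exact string -- a lowercase "egham" would NOT match here.
print("https://www.teacherstoyourhome.co.uk/german-tutor/Egham" in urls)
```

If the exact strings do match, the report is often just a lag between crawling and sitemap processing rather than a real omission.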
Moving from no follow to follow links on our eCommerce site
Hi everyone, I recently took over an SEO eCommerce account and found that all the footer links have a nofollow attribute. I've requested that the nofollow tags be removed, as the pages are quite valuable (about us, finance, recycling, help centre, etc.). I've been asked what the risks are, and all I can think of is a slightly increased number of pages for Google to crawl. Are there any other risks you can think of? Does anyone have experience making this type of change? As for benefits, I believe it will make our content look more trustworthy to Google and help drive traffic to those pages from the SERPs. Any other pros you can think of would be a great help.
Technical SEO | RebekahVP -
Home Page Ranking Instead of Service Pages
Hi everyone! I've noticed that many of our clients have pages addressing specific queries related to specific services, but the home page is increasingly showing as the ranking page. For example, a plastic surgeon we work with has a page specifically about his breast augmentation procedure in Miami, FL, but instead of THAT page showing in the search results, Google is using his home page. I'm noticing this across the board. Any insights? Should we still be optimizing these specific service pages? Should I spend time trying to make sure Google ranks the page that specifically addresses the query, since it SHOULD perform better? Thanks for the help. Confused SEO :/, Ricky Shockley
Technical SEO | RickyShockley -
New pages need to be crawled & indexed
Hi there, When you add pages to a site, do you need to regenerate the XML sitemap and resubmit it to Google/Bing? I see the option in Google Webmaster Tools under the "Fetch as Google" tool to submit individual pages for indexing, which I am doing right now. Thanks,
Sarah
Technical SEO | SSFCU
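For context, regenerating a sitemap for new pages just means adding a <url> entry per page; an updated lastmod hints to crawlers that the file has changed. A minimal sketch (the URL and date below are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/new-page</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```

Once the sitemap URL is registered in Webmaster Tools, search engines re-fetch it periodically on their own; individual-page submission is a way to speed things up, not a requirement.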
2 links on home page to each category page ..... is page rank being watered down?
I am working on a site whose home page contains two links to each category page: one text link and one image link. I think I'm right in thinking that Google will only pay attention to the anchor text/alt text of the first link it spiders, with the anchor text/alt text of the second being ignored. That is not my question, however. My question is about the PageRank passed to each category page. Because of the double links on the home page, my reckoning is that PageRank is being divided up twice as many times as necessary. Am I also right in thinking that if Google ignores the second identical link on a page, only one share of this divided-up PageRank is passed to each category page rather than two, hence horribly watering down the link juice passed to each category page? Please help me win this argument with a developer and improve the ranking potential of the category pages on the site 🙂
Technical SEO | QubaSEO -
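The arithmetic behind the worry can be made concrete with the textbook PageRank split. This is a toy model only; how Google actually treats duplicate links to the same target is not public. Under this model, if both links pass equity, each category's share is unchanged; dilution only happens in the scenario the question describes, where the duplicate inflates the divisor but passes nothing.

```python
# Toy illustration of the classic PageRank split (simplified textbook
# model; Google's real treatment of duplicate links is not public).
def share_per_category(n_categories, links_per_category, links_counted):
    """Fraction of the home page's passable equity reaching each category,
    where the divisor counts every outlink but only `links_counted` of
    each category's links actually pass equity."""
    total_links = n_categories * links_per_category
    return links_counted / total_links

# 10 categories, one link each: every category gets 1/10 of the equity.
one_link = share_per_category(10, 1, 1)      # 0.1
# Text + image link, both passing equity: still 1/10 each.
both_pass = share_per_category(10, 2, 2)     # 0.1
# Text + image link, duplicate ignored but still in the divisor --
# the "watered down" scenario in the question: half the share.
one_ignored = share_per_category(10, 2, 1)   # 0.05
print(one_link, both_pass, one_ignored)
```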
What do we do now?
OK, where do we start? Let's go back to June 2011.
June 2011: We left the SEO company that had been looking after our account. After a year with them and not moving one place in the SERPs, we cut ties and moved to a new SEO company. Over the first six months things seemed to be going well: for our main keyword "car warranty" we climbed from #6 to #4 to #2, and we moved up for other keywords too. One thing I did notice was that the new SEO company did very little on-site optimisation.
January 2012: Over the Christmas period the usual happened: our rankings stayed the same but traffic dropped, which is normal for that time of year. But on the 7th of January 2012 our ranking dropped from #2 to #10. We contacted the SEO company, who reported that a server hosting 20 links pointing back to our site had crashed on Christmas Day and the links had been de-indexed from Google. They said to give it 2-3 weeks for the links to be re-indexed, and we should pop back up on Google. Three weeks later we were still in exactly the same position; the good thing is that because we run a very good AdWords campaign, traffic to our site didn't drop. The SEO company then said it looked like we had been placed in a 60-day filter by Google, and once the 60 days were up we should pop back. 65 days came and went and we were still in the same position. After waiting around for months to see if the rankings improved, we decided to leave the SEO company and move all our SEO work back in-house.
February 2012: Once we had full control over the account again, we made changes to the on-site optimisation: we improved page titles, descriptions, and tags, and our content was re-written. We then waited for Google to pick up the changes and re-index the site.
Yesterday: We checked our rankings. Some of our longer-tail keywords had improved, but our main big-traffic keyword had dropped even further, from #10 to #16. With the algorithm update yesterday targeting spammy websites, we feared we had been hit by it. We cleaned up some links that looked spammy and filed a reconsideration request. Looking deeper into our backlinks, we still have some spammy, non-relevant links as well as a few big sitewide links: one sitewide link alone provides 732,667 links to our homepage, and our total links indexed by Google is only 759,144.
What do you think we should do?
Wait for Google to come back to us after the reconsideration request?
Remove more backlinks?
Build more high-value links?
If anyone can provide me with more information it would be great.
Thanks,
Scott
Technical SEO | ScottBaxterWW
Warnings on Pages excluded from Search Engines
I am new to this, so my question may seem a little rookie type... When looking at my crawl diagnostic errors, there are 1,604 warnings for "302 redirects". Of those 1,604 warnings, 1,500 are for the same page with different product IDs, such as:
www.soccerstop.com/EMailproduct.aspx?productid=999
www.soccerstop.com/EMailproduct.aspx?productid=998
In our robots.txt file we have Disallow: /emailproduct.aspx. Wouldn't that take care of this problem? If so, why am I still getting these warning errors? Does the crawl report take our robots.txt file into account? Thanks for any help you can provide.
James
Technical SEO | SoccerStop
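One detail worth checking here (an observation based on the Robots Exclusion Protocol, not a confirmed diagnosis): robots.txt path matching is case-sensitive, so Disallow: /emailproduct.aspx does not match the capitalized /EMailproduct.aspx URLs shown above. A rule matching the actual casing would look like:

```text
# robots.txt -- paths are matched case-sensitively, so the rule must
# match the URL's actual capitalization
User-agent: *
Disallow: /EMailproduct.aspx
```

Also note that robots.txt only blocks crawling, not indexing: a crawler may still surface warnings for such URLs if it discovered them via links before the rule took effect.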