What does the blocking_google property in the CSV export mean, and what to do about it?
-
The CSV export for Crawl Diagnostics contains a column named "blocked_google". It lists a blocking date/time, but the flag doesn't appear on all of our webpages, not even on all pages of the same type/structure.
There are no other flags on these records that would explain Google being blocked; none of the other agents are flagged, and our robots.txt doesn't contain any blocks either. The only flag the records have in common is "Page Title > 70 characters". Of course, I could just assume this is the reason for the "blocking_google", but is it?
What evaluation makes the crawler fill in this property, and how can we handle/resolve its occurrence?
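One way to narrow this down before contacting support is to filter the exported CSV for the flagged rows and count which other columns they share. This is a hedged sketch: the column names `blocked_google` and the flag encoding are assumptions based on the export described above, so adjust them to match the actual file header.

```python
import csv
from collections import Counter

def blocked_rows(csv_path, flag_column="blocked_google"):
    """Return rows where the (assumed) blocked_google column is non-empty."""
    with open(csv_path, newline="") as f:
        return [row for row in csv.DictReader(f) if (row.get(flag_column) or "").strip()]

def common_flags(rows, flag_column="blocked_google"):
    """Count how often each other column is non-empty across the blocked rows,
    to spot a flag (like a long page title) that they all share."""
    counts = Counter()
    for row in rows:
        for col, val in row.items():
            if col != flag_column and str(val or "").strip():
                counts[col] += 1
    return counts
```

Columns that are non-empty on every blocked row (but not on unflagged rows) are candidates for a correlated cause, like the "Page Title > 70 characters" flag mentioned above, though correlation alone won't prove it triggered the block.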
-
That page seems fine, and it's also indexed by Google, so I'm not sure what the story is there. Might be best to contact SEOmoz support (help@moz.com, I think).
-
Thank you again for your response.
The following link forwards to one of the affected pages: http://bit.ly/1b3IVYB
I'm downloading Screaming Frog right now and will give it a try; thank you for the advice.
-
Can you give an example of one of the pages that is "blocked_google"?
Btw, try Screaming Frog (you can crawl 500 pages for free); it should pick up the same error and might explain why.
-
Thank you for your response.
Our CMS generates the following meta-tags on the affected pages:
Pragma is for cache control; content-type and Google verification are clear. "P:" is for Pinterest social media, and the rest is standard again. I searched the page source for "robots" as well, but got no results.
-
Check that those pages don't have something like:
<meta name="robots" content="NOINDEX,NOFOLLOW" />
in the source code.
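As a quick sketch of that check (this is not Moz's actual evaluation, just one way to scan a saved page's HTML for a robots directive that would block Google):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> or <meta name="googlebot"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() in ("robots", "googlebot"):
            self.directives.append((attrs.get("content") or "").lower())

def blocks_google(html):
    """True if the page carries a noindex directive that applies to Google."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Note that a page can also be blocked via an `X-Robots-Tag` HTTP response header, which never appears in the HTML source, so it's worth checking the headers too if the source looks clean.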
Related Questions
-
What is the Impact of Duplicate Content on Multiple Managed Property Domains?
Hi Moz Community! Our team is having an internal (and external) debate regarding the extent and implications of duplicate content for a hospitality client that I would love to get some feedback on. I unfortunately cannot divulge the brand/URL, but will give as much info as possible.

The brand in question manages dozens of properties in the US and worldwide and has recently rolled up all of the domains under a singular brand.com domain. So whereas the properties used to have their own domains (property1.com, property2.com, etc.), they are now housed in sub-folders (brand.com/property1, brand.com/property2, and so forth). The concern we have is that they launched the new brand site with all of the property sites/content rolled up under the new brand.com domain; however, all of the individual property sites and their pages are still live as well. All of the canonicals on both brand.com and property1.com (property2.com, property3.com, etc.) are self-referencing (so the canonicals for brand.com/property1 and all of its sub-pages do not point to the still-live property1.com and all of its sub-pages, for example). On the brand side, they believe this is the best path forward as brand.com grows and gains some authority, with the intent of eventually redirecting the individual property domains - but we are unclear on that timeline (though we do think it's more months as opposed to days/weeks).

So our questions for the community here are:

1. What is the perceived impact of this state of limbo on the individual property sites? (Ideally they house the original content and have the history, but could Google still give preference to the brand.com/property URLs, and/or could both of them suffer in rank/search experience from the duplicate content and non-uniform presentation?)
2. Could brand.com be "dinged", so to speak, for launching with this much duplicate content? (And if so, could that affect how quickly normalization occurs after the property sites are finally redirected?)
3. Anything else we should consider / any other feedback from the community?

Thank you all for your time and support!
Technical SEO | | imiJoe0 -
How to arrange taxonomies when many mean the same thing?
Hello 🙂 I'm trying to figure out the category/taxonomy structure for my website, which will be selling "Colored Contact Lenses". I'm a bit confused because there are several search queries which sort of mean the same thing. For example, "Halloween Contacts" are sort of the same thing as "Colored Contacts": people searching for colored contacts may potentially be looking for "natural" styles, but many are looking for crazy styles, aka "Halloween Contacts" or "Crazy Contacts". Crazy contacts and Halloween contacts are the exact same thing, just a different choice of words from the searcher. So I'm trying to think of what to do for categories/link structure... I believe I should start with a primary category, .com/colored-contacts/, and then .com/colored-contacts/halloween-contacts/. But what about crazy contacts? Should I keep going with .com/colored-contacts/crazy-contacts/, which will have the exact same products listed? I'm kind of going crazy thinking about this, lol. Any thoughts and advice would be highly appreciated. Thank you!
Technical SEO | | abuntlysupport0 -
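One common way to handle synonym categories like the ones in the question above is to keep a single landing page per product set and map the synonym slugs to it (via a canonical tag or a redirect), rather than publishing two identical category pages. A hypothetical sketch (the slugs come from the question; the mapping itself is an assumption, not platform code):

```python
# Map synonym category slugs to one canonical category path, so
# "crazy-contacts" and "halloween-contacts" don't compete as duplicates.
CANONICAL_CATEGORY = {
    "halloween-contacts": "/colored-contacts/halloween-contacts/",
    "crazy-contacts": "/colored-contacts/halloween-contacts/",  # same products, same canonical
    "natural-contacts": "/colored-contacts/natural-contacts/",
}

def canonical_url(slug, default="/colored-contacts/"):
    """Return the canonical path for a requested category slug."""
    return CANONICAL_CATEGORY.get(slug, default)
```

Whether to serve the synonym slug with a canonical tag or 301 it outright is a judgment call; either way, only one URL per product set accumulates ranking signals.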
Schema Markup for property listings (estate agent)
Hello, I've been looking online for some help with this. An estate agent has a page of properties for sale. Is it possible to mark these individual properties up, and if so, would they appear as rich snippets in the SERPs? I've never seen anything like this for properties for sale, so I just wondered.
Technical SEO | | AL123al1 -
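For what it's worth, schema.org does have vocabulary that can describe a listing (e.g. an `Offer` for a `House`), though whether Google actually renders rich snippets for real estate listings is a separate question. A hypothetical JSON-LD sketch built in Python (all the property values here are made up for illustration):

```python
import json

def listing_jsonld(name, price, currency, url):
    """Build a minimal JSON-LD Offer for one property listing (illustrative only)."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Offer",
        "itemOffered": {"@type": "House", "name": name},
        "price": price,
        "priceCurrency": currency,
        "url": url,
    }, indent=2)
```

The resulting JSON would be embedded in the page inside a `<script type="application/ld+json">` element, one per listing.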
Creating a CSV file for uploading 301 redirect URL map
Hi, if I'm bulk uploading 301 redirects, what's needed to create a CSV file? Is it just a case of creating an Excel spreadsheet with the old URLs in column A and the new URLs in column B, and then converting to CSV and uploading? Or do I need to put in other details or parameters, etc.? Cheers, Dan
Technical SEO | | Dan-Lawrence0 -
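The two-column layout described in the question above (old URL, new URL) is usually all a bulk-upload tool needs; a sketch of generating such a file programmatically, with the caveat that the exact header names (or whether a header row is wanted at all) depend on the tool doing the import:

```python
import csv

def write_redirect_map(pairs, path):
    """Write (old_url, new_url) pairs to a two-column CSV redirect map."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["old_url", "new_url"])  # assumed header; some tools expect none
        writer.writerows(pairs)
```

Using the `csv` module rather than hand-joining strings also takes care of quoting URLs that contain commas.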
My site was not removed from Google, but my most visited page was. What does that mean?
Help. My most important page, http://hoodamath.com/games/, has disappeared from Google, while the rest of my site still remains. I can't find anything about this type of ban. Any help would be appreciated (I would like to sleep tonight).
Technical SEO | | hoodamath0 -
Campaign Issue: Rel Canonical - Does this mean it should be "on" or "off?"
Hello, somewhat new to the finer details of SEO - I know what canonical tags are, but I am confused by how SEOmoz identifies the issue in campaigns. I run a site on a WordPress foundation, and I have turned on the option for "canonical URLs" in the All in One SEO plugin. I did this because, in all cases, our content is original and not duplicated from elsewhere. SEOmoz has flagged every one of my pages with this issue, but the explanation of the status simply states that canonical tags "indicate to search engines which URL should be seen as the original." So, it seems to me that if I turn this OFF on my site, I turn off the notice from SEOmoz, but then I don't have canonical tags on my site. Which way should I be doing this? THANK YOU.
Technical SEO | | mrbradleyferguson0 -
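On the question above: a self-referencing canonical just declares each page as its own original, which is generally harmless and worth keeping on. As a sketch of what such a plugin option effectively emits (a hypothetical helper, not the All in One SEO code), the useful detail is that the canonical typically drops query strings so tracking parameters don't create "new" URLs:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link_tag(page_url):
    """Emit a self-referencing canonical element, dropping the query string
    and fragment so tracking parameters don't look like separate pages."""
    parts = urlsplit(page_url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return '<link rel="canonical" href="{}" />'.format(clean)
```

So `?utm_source=...` variants of a post all point back to the one clean URL.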
Do user metrics really mean anything?
This is a serious question; I'd also like some advice on my experience so far with the Panda. One of my websites, http://goo.gl/tFBA4, was hit on January 19th. It wasn't a massive hit, but it took us from 25,000 to 21,000 uniques per day. It had survived Panda completely prior. The only thing that had changed was an upgrade to the CMS, which caused a lot of duplicate content, i.e. 56 copies of the homepage under various URLs. These were all indexed in Google. I've heard varying views as to whether this could trigger Panda; I believe so, but I'd appreciate your thoughts on it. There was also the above-the-fold update on the 19th, but we have 1 ad MAX on each page, and most pages have none. I hate even having to have 1 ad. I think we can safely assume it was Panda that did the damage. Jan 18th was the first Panda refresh since we upgraded our CMS in mid-to-late December. As it was nothing more than a refresh, I feel it's safe to assume that the website was hit due to something that had changed on the website between the Jan 18th refresh and the one previous.

So, aside from fixing the bugs in the CMS, I felt now was a good time to put a massive focus on user metrics, and I've worked hard and continue to spend a lot of time improving them:

1. Reduced bounce rate from 50% to 30% (extremely low in the niche)
2. Increased average page views from 7 to 12
3. Increased average time on site from 5 to almost 8 minutes
4. Plus created a mobile-optimised version of the site
5. Page loading speeds slashed

Not only did the above improvements have no positive effect, traffic continued to slide, and we're now close to a massive 40% loss. Btw, I realise neither the mobile site nor page loading speeds are user metrics. I fully appreciate that my website is image-heavy and thin on text, but that is an industry-wide 'issue'. It's not an issue to my users, so it shouldn't be an issue to Google. Unlike our competitors, we actively encourage our users to add descriptions to their content and provide guidelines to assist them in doing so.
We have a strong relationship with our artists, as we listen to their needs and develop the website accordingly. Most of the results in the SERPs contain content taken from my website without my permission or the permission of the artist. Rarely do they give any credit. If user metrics are so important, why on earth has my traffic continued to slide? Do you have any advice for me on how I can further improve my chances of recovering from this? Fortunately, despite my artists' download numbers being slashed in half, they've stuck by me and the website, which speaks volumes.
Technical SEO | | seo-wanna-bs0 -
More products with the same description but different properties
I am a beginner in SEO. I have developed a website in PrestaShop, and I have a problem: several products have the same description but different properties. For example: http://www.icentrale.ro/45-centrale-termice-cu-tiraj-fortat-schimbator-secundar-ferroli-divatop-f-24.html and http://www.icentrale.ro/46-centrale-termice-cu-tiraj-fortat-schimbator-secundar-ferroli-divatop-f-32.html The only difference between these is the properties in the table. Is it possible to be penalized by Google for duplicate content? Is there a solution for this issue?
Technical SEO | | vilcu0