If Google Authorship is used for every page of your website, will it be penalized?
-
Hey all,
I've noticed that a lot of companies implement Google Authorship on every page of their website: landing pages, home pages, subpages. I'm wondering whether this gets penalized, since those aren't typical authored pieces of content like blog posts, articles, or press releases.
I'm curious because I'm about to set up Google Authorship and I don't want to set it up incorrectly. Is it okay to tie Authorship to every page (home page, subpages) rather than just actual authored content (blogs, articles, press releases), or will the site get penalized if that happens?
Thanks and much appreciated!
-
I actually don't think it's all right to use Authorship or Publisher on every page, and it's not what Google intends. Check out their blog post on this:
http://googlewebmastercentral.blogspot.co.uk/2013/08/relauthor-frequently-asked-advanced.html
Specifically they say "Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic. Since property listings and product pages are less perspective/analysis oriented, we discourage using authorship in these cases. However, an article about products that provides helpful commentary, such as, “Camera X vs. Camera Y: Faceoff in the Arizona Desert” could have authorship."
So while using Authorship on non-article pages (product pages, for example) is unlikely to get you slapped by Google right now, you are going against their direct advice, and that advice often ends up baked into the algorithm once Google notices a feature being abused.
You are right that many sites are using this on every page, and for now it can give you an advantage: if not higher rankings, then a more prominent results listing, which may improve CTR. However, as I said, once Google sees the markup being abused, they will likely move to stop the practice and ensure it is used only on rich content pages.
Publisher is different: ideally, Google wants the rel=publisher link to point from the homepage to the business's Google+ page. Authorship and Publisher are two different things, and Google treats them separately.
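For reference, the publisher link is usually a single tag in the homepage's head, something like this (the Google+ page URL below is just a placeholder):

```html
<!-- Homepage <head>: ties the whole site to the business's Google+ page -->
<!-- Placeholder profile URL; swap in your own Google+ business page -->
<link rel="publisher" href="https://plus.google.com/+ExampleBusiness" />
```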
Hope this helps. Basically, if what you are doing on your site doesn't benefit your site's users, then you are right to question it.
-
Yes, you are right on both counts. I think there will come a time when Google displays the brand icon in place of an author image for pages marked up with rel=publisher. I can see it pulling through for the sites I manage when I plug them into the Rich Snippets testing tool; Google, however, is not yet displaying those images.
Good luck! It sounds like you've got a good idea of what pages should use what type of authorship.
Dana
-
Hey Dana,
Thanks for the response. So basically what you're saying is that if a page isn't actual authored content, it should use rel=publisher (e.g., <link href="https://plus.google.com/104609087715575652977" rel="publisher" /> in the <head> section). However, rel=author should go on authored content like blogs, articles, and press releases, shown as "Authored by [Individual Name]" with the name linking to the writer's personal Google+ profile. Right?
Also, with rel=author and rel=publisher, is it correct that only rel=author translates into the Google+ headshot profile picture in search results, while rel=publisher lists the company's Google+ profile information?
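Just to double-check the author side, I'm picturing something like this in each post's byline (placeholder profile ID), with a matching "Contributor to" link back to the site on the writer's Google+ profile:

```html
<!-- Article byline: ties this post to the writer's personal Google+ profile -->
<!-- Placeholder profile ID; authorship only verifies if the profile's
     "Contributor to" section links back to this site -->
<p>Authored by <a href="https://plus.google.com/112233445566778899000" rel="author">Jane Smith</a></p>
```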
Thanks again for all the responses!
-
There are times when rel=publisher is more appropriate than rel=author; a product page on an e-commerce site, for example. Will a site be penalized for establishing authorship on every page? Absolutely not. In fact, I think that is what Google intends people to do.
The problem right now is that there is such mass confusion over rel=author and rel=publisher and how to use them properly that you see lots of sites using rel=author where they should be using rel=publisher. Because Google has done such a poor job of articulating how and where to implement these things, I can't imagine them penalizing sites for using one when they should be using the other. Although, I suppose, stranger things have happened.
I do think the intention with authorship, and with structured data markup generally, is that webmasters implement all the appropriate tags and markup on every page of their site.
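As a rough sketch, a product page might combine the site-level publisher link with schema.org Product markup along these lines (all URLs and values below are placeholders):

```html
<head>
  <!-- Site-level publisher link; placeholder Google+ page URL -->
  <link rel="publisher" href="https://plus.google.com/+ExampleStore" />
</head>
<body>
  <!-- schema.org Product microdata; all values are placeholders -->
  <div itemscope itemtype="http://schema.org/Product">
    <h1 itemprop="name">Camera X</h1>
    <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
      Price: $<span itemprop="price">499.00</span>
      <meta itemprop="priceCurrency" content="USD" />
    </div>
  </div>
</body>
```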
Hope that's helpful!
Dana