If Google Authorship is used for every page of your website, will it be penalized?
-
Hey all,
I've noticed a lot of companies implement Google Authorship on all pages of their website, i.e. landing pages, home pages, and sub-pages. I'm wondering if this will be penalized, since those aren't typical authored pieces of content like blogs, articles, press releases, etc.
I'm asking because I'm about to set up Google Authorship and I don't want it to be set up incorrectly for the future. Is it okay to tie authorship to every page (home page, sub-pages) and not just actual authored content (blogs, articles, press releases), or will the site get penalized if that occurs?
Thanks and much appreciated!
-
I actually don't think it is all right to use Authorship or Publisher on every page, and it's not what Google intends. Check out their blog post on this:
http://googlewebmastercentral.blogspot.co.uk/2013/08/relauthor-frequently-asked-advanced.html
Specifically they say "Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic. Since property listings and product pages are less perspective/analysis oriented, we discourage using authorship in these cases. However, an article about products that provides helpful commentary, such as, “Camera X vs. Camera Y: Faceoff in the Arizona Desert” could have authorship."
So while, at this time, using Authorship on non-article pages such as product pages is 'unlikely' to get you a Google penalty, you are going against their direct advice, and that advice often gets built into the algorithm once they notice something being abused.
You are right that many sites are using this on every page, and for now it will give you an advantage: if not higher rankings, then a more visible results listing, which may improve CTR. However, as I said, once Google sees this being abused they will attempt to stop the practice and make sure it is used for rich content pages only.
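For the rich content pages where authorship does belong, the markup itself is simple: a byline link back to the writer's Google+ profile, roughly like this (the profile ID and name below are just placeholders, swap in the author's real profile):

By <a href="https://plus.google.com/112233445566778899000?rel=author">Jane Smith</a>

The ?rel=author parameter on the profile URL (or a rel="author" attribute on the anchor) is what tells Google who wrote the piece, and the author also needs to list the site under "Contributor to" on their Google+ profile for the connection to be verified.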
Publisher is different: ideally Google wants it to link from the homepage to the business's Google+ page. Authorship and publisher are separate things, and Google treats them separately.
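Publisher is normally just a single link element on the homepage pointing at the business's Google+ page, something like this (again, the page ID is a placeholder):

<link rel="publisher" href="https://plus.google.com/998877665544332211000" />

That one tag on the homepage is enough to associate the site with the brand's page, so there is no need to tie an individual author to every ordinary page.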
Hope this helps. Basically, if what you are doing on your site doesn't benefit your site's users, then you are right to question it.
-
Yes, you are right on both counts. I think there will come a time when Google will display the brand icon in place of an image for pages that are marked up with rel=publisher. I can see that pulling through for the sites I manage when I plug them into the Rich Snippet testing tool. Google, however, is not yet displaying those images.
Good luck! It sounds like you've got a good idea of what pages should use what type of authorship.
Dana
-
Hey Dana,
Thanks for the response. So basically what you're saying is that if it isn't in the immediate authored content area, it should be rel=publisher (such as <link href="https://plus.google.com/104609087715575652977" rel="publisher" /> in the <head> section). However, rel=author should be on authored content like blogs, articles, and press releases, shown as "Authored by [Individual Name]" with the name linking to the personal Google+ profile, right?
Also, between rel=author and rel=publisher, only rel=author translates into the Google+ headshot profile picture in search results, while rel=publisher lists the company's Google+ profile information?
Thanks again for all the responses!
-
There are times when rel=publisher is more appropriate than rel=author, a product page on an e-commerce site, for example. Will a site be penalized for establishing authorship on every page? Absolutely not. In fact, I think that is what Google intends for people to do.
The problem is there is such mass confusion over rel=author and rel=publisher and how to use them properly that right now you see lots of sites that should be using rel=publisher using rel=author instead. Because Google has done such a poor job of articulating how and where to implement these things, I can't imagine them penalizing sites for using one when they should be using the other. Although, I suppose, stranger things have happened.
I do think the intention with authorship, and also with structured data markup, is that webmasters implement all the appropriate tags and markup on every page of their site.
Hope that's helpful!
Dana