If Google Authorship is used for every page of your website, will it be penalized?
-
Hey all,
I've noticed a lot of companies implement Google Authorship on every page of their website, i.e. landing pages, home pages, and subpages. I'm wondering if this will be penalized, since those aren't typical authored pieces of content like blogs, articles, or press releases.
I'm curious because I'm about to set up Google Authorship and I don't want it set up incorrectly for the future. Is it okay to tie Authorship to every page (home page, subpages) and not just to actual authored content (blogs, articles, press releases), or will the site get penalized if that happens?
Thanks and much appreciated!
-
I actually don't think it is alright to use Authorship or Publisher on every page, and this is not what Google intends. Check out their blog post on this:
http://googlewebmastercentral.blogspot.co.uk/2013/08/relauthor-frequently-asked-advanced.html
Specifically they say "Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic. Since property listings and product pages are less perspective/analysis oriented, we discourage using authorship in these cases. However, an article about products that provides helpful commentary, such as, “Camera X vs. Camera Y: Faceoff in the Arizona Desert” could have authorship."
So while at this time using Authorship on non-article pages (such as product pages) is unlikely to get you Google slapped, you are going against their direct advice - and that advice often gets built into the algorithm once they notice something being abused.
You are right that many sites are using this on every page, and for now it will give you an advantage - if not in higher rankings, then in a more visible results listing, which may have an improved CTR. However, as I said, once Google sees this being abused they will attempt to stop the practice and make sure it is used for rich content pages only.
Publisher is different: ideally the link runs from your homepage to the business's Google+ page. Author and Publisher are two different things, and Google treats them separately.
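To make the two concrete, here is a minimal sketch of each tag (the Google+ IDs and the name are placeholders, not real profiles):

```html
<!-- On authored content (a blog post, article, or press release):
     link the byline to the writer's personal Google+ profile -->
<a href="https://plus.google.com/[PERSONAL-PROFILE-ID]" rel="author">Jane Smith</a>

<!-- Publisher: a site-wide link (ideally from the homepage) to the
     business's Google+ page, usually placed in the <head> -->
<link href="https://plus.google.com/[BUSINESS-PAGE-ID]" rel="publisher" />
```

Note that rel=author also required the Google+ profile to link back to the site (via the "Contributor to" section) before Google would show the authorship snippet.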
Hope this helps - basically, if what you are doing on your site doesn't benefit your site user, then you are right to question it.
-
Yes, you are right on both counts. I think there will come a time when Google will display the brand icon in place of an image for pages that are marked up with rel=publisher. I can see that pulling through for the sites I manage when I plug them into the Rich Snippet testing tool. Google, however, is not yet displaying those images.
Good luck! It sounds like you've got a good idea of what pages should use what type of authorship.
Dana
-
Hey Dana,
Thanks for the response. So basically what you're saying is that if it isn't actual authored content, it should use rel=publisher - for example, <link href="https://plus.google.com/104609087715575652977" rel="publisher" /> in the <head> section. However, rel=author should go on authored content like blogs, articles, and press releases. That would be shown as "Authored by [Individual Name]" with the name linking to the personal Google+ profile, right?
Also, between rel=author and rel=publisher, only rel=author translates into the Google+ headshot profile picture in search results, while rel=publisher lists the company's Google+ profile information?
Thanks again for all the responses!
-
There are times when rel=publisher is more appropriate than rel=author - a product page on an e-commerce site, for example. Will a site be penalized for establishing authorship on every page? Absolutely not. In fact, I think that is what Google intends for people to do.
The problem right now is that there is such mass confusion over rel=author and rel=publisher and how to use them properly that you see lots of sites using rel=author where they should be using rel=publisher. Because Google has done such a poor job of articulating how and where to implement these things, I can't imagine them penalizing sites for using one when they should be using the other. Although, I suppose stranger things have happened.
I do think that the intention with authorship, and with structured data markup generally, is that webmasters implement all the appropriate tags and markup on every page of their site.
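For instance, under that approach a product page would carry only publisher markup and no author byline (a minimal sketch; the Google+ ID is a placeholder):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Camera X - Example Store</title>
  <!-- Product pages: publisher only. No rel="author", since the page
       isn't one person's perspective or analysis -->
  <link href="https://plus.google.com/[BUSINESS-PAGE-ID]" rel="publisher" />
</head>
<body>
  <h1>Camera X</h1>
  <!-- product details here -->
</body>
</html>
```

A blog post on the same site would then add the rel=author byline link on top of this site-wide publisher tag.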
Hope that's helpful!
Dana