For Google+ purposes, should the author's name appear in the meta description or title tag of my website, just as you would include your key search phrase?
-
Regarding Cyrus Shepard's article on January 4th about Google's Superior SEO strategy: if I'm the primary author of all blog articles and website content, and I have a link showing authorship pointing back to Google+, is a site-wide link from the home page enough, or should that link appear on all blog posts, editorial comment pages, etc.? Conversely, should the author's name appear in the meta description or title tag of my website, just as you would your key search phrase, since Google appears to be trying to make a solid connection between my name and all of my content?
-
Hi Lowell,
To add to what Lonnie said, not long ago Google changed the instructions for adding authorship markup to your content. You can find the instructions here:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1408986
Adding your name to the meta description or title tag isn't necessary for authorship markup. But doing so is fine if you feel it will help your personal branding.
Just a note - Google's use of authorship is still in the early stages. It's unclear whether following their instructions will have any impact on rankings or relevancy. That said, most SEOs I know are adding this markup because even if Google isn't using it now, they undoubtedly have plans to use it in the future.
-
Google recently introduced a new link attribute called "rel=author". This attribute allows you to tell Google who you are as an author and which articles you write. Google has indicated that the authority of an author may eventually be weighted more heavily than traditional on-page metrics like page or domain authority. As Matt Cutts stated at SMX West, "The concept is that if an author is trustworthy, why does it matter what site the article appears on?" Author authority also has implications for the impending Panda 2.2 update, which targets sites that steal content from other sites to post as their own. If Google sees the same article on 10 different sites, and 1 of those sites clearly identifies an author marked up with the "rel=author" attribute, that site should get all the link juice.
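As a rough sketch of what this looked like in practice (the profile URL and author name below are placeholders, not from the question), a byline on each article page would link to the author's Google+ profile with the rel="author" attribute:

```html
<!-- On each article page: a byline link pointing to the author's
     Google+ profile, marked with rel="author".
     Profile ID and name are placeholders. -->
<p>
  Written by
  <a href="https://plus.google.com/112345678901234567890" rel="author">Author Name</a>
</p>
```

Note that verification was two-way: the Google+ profile also had to list the site under its "Contributor to" section for Google to connect the author to the content.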
For a custom CMS, implement "rel=author" for blog posts and "rel=me" for author bio pages.
Watch for "rel=author" plugins for major CMSs like WordPress, Joomla, and Drupal.
Add "rel=me" to all external guest post links on your author bio page.
You can pick up a great plugin called Google Authorship Widget at wordpress.org, or visit the plugin maker at digitalfair.tk.
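For the author bio page itself, the guest-post tip above might look something like this (all URLs are placeholders):

```html
<!-- Author bio page: rel="me" links tie this page to the author's
     Google+ profile and to guest posts published elsewhere.
     All URLs are placeholders. -->
<a href="https://plus.google.com/112345678901234567890" rel="me">My Google+ profile</a>
<a href="https://example.com/guest-post-on-seo" rel="me">My guest post on Example.com</a>
```

The rel="me" links tell Google that these external pages belong to the same identity as the bio page, which helps it attribute guest posts back to you.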