How do you remove Authorship photos from your homepage?
-
Suppose you have a website with a blog on it, and you show a few recent blog posts on the homepage. Google sees the headline plus the "by Author Name" byline and associates that author's Google+ profile with the page.
This is great for the actual blog posts, but how do you prevent it from happening on the homepage or other blogroll pages?
-
I have a similar issue. For whatever reason, Google has decided our CEO (Glen Kelman) is the 'author' of some of our site pages. There is no author markup on the page anywhere. In fact, our CEO's name isn't anywhere on the page. Yet, in SERPs, he is the 'author' of our Seattle market page (you can likely see it by searching for 'seattle real estate' and looking for Redfin in the results).
Glen is a prolific blogger who not only posts to the Redfin blog but also guest blogs on high-profile sites around the web, so it stands to reason that Google is very 'familiar' with him as an author. Moreover, he lives in Seattle, so maybe Google is thinking, "Glen is from Seattle...he's the CEO of Redfin...he's a prolific author...Glen + Seattle + Redfin + Author = Glen is the author of the Seattle market page on Redfin!"
Any ideas on how to stop Google from making this mistake?
-
Hi Tom, thanks for the response, but that doesn't work.
There is no link to a Google+ profile anywhere on this page. The author, though, is verified through the domain, and the page includes a "by" byline, which seems to be what's causing this.
Any other thoughts?
-
Hi Stephen
Basically, all you need to do is make sure that the rel=author code is not in the <head> tag of that page.
The code will look something like <link rel="author" href="https://plus.google.com/112656687930780652496"/> but obviously with the G+ profile URL that you are talking about.
If that code isn't on the page, then Google will not verify the page as marked by an author.
If you've gone a different way and linked with a visible link on the page, like <a href="https://plus.google.com/112656687930780652496" rel="author">Name here</a> - again, all you need to do is make sure that this link isn't present on the page, and the authorship markup won't be attributed to that page.
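If you want to audit a page for either form of the markup, here's a minimal sketch using Python's standard-library HTML parser. The Google+ profile URL is just the example from this thread, and the page snippet is a made-up illustration of the two variants:

```python
from html.parser import HTMLParser

class RelAuthorFinder(HTMLParser):
    """Collects <link> and <a> elements that carry rel="author"."""
    def __init__(self):
        super().__init__()
        self.matches = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("link", "a") and "author" in attrs.get("rel", "").split():
            self.matches.append((tag, attrs.get("href")))

# Illustrative page containing both forms of authorship markup.
page = '''
<head><link rel="author" href="https://plus.google.com/112656687930780652496"/></head>
<body><a href="https://plus.google.com/112656687930780652496" rel="author">Name here</a></body>
'''
finder = RelAuthorFinder()
finder.feed(page)
for tag, href in finder.matches:
    print(tag, href)
```

If this prints nothing for a given page, neither form of authorship markup is present there.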
Hope this helps.
Related Questions
-
To remove or not remove a redirected page from index
We have a promotion landing page which earned some valuable inbound links. Now that the promotion is over, we have redirected this page to a current "evergreen" page. But on Google's search results page, the original promotion landing page is still showing as a top result. When clicked, it properly redirects to the newer evergreen page. But it's a bit problematic for the original promo page to show in the search results, because the snippet mentions specifics of the promo which is no longer active. So I'm wondering what the net impact would be of using the "removal request" tool for the original page in GSC. If we don't use that tool, what kind of timing might we expect before the original page drops out of the results in favor of the new redirected page? And if we do use the removal tool on the original page, will that negate what we are attempting to do by redirecting to the new page, with regard to preserving inbound link equity?
Intermediate & Advanced SEO | seoelevated
-
Website Indexing Issues - Search Bots will only crawl Homepage of Website, Help!
Hello Moz World, I am stuck on a problem and wanted to get some insight. When I attempt to use Screaming Frog's SEO Spider or SEO PowerSuite, the software is only crawling the homepage of my website. I have 17 pages associated with the main domain, i.e. example.com/home, example.com/services, etc. I've done a bit of investigating, and I have found that my client's website does not have a robots.txt file or a sitemap. However, under Google Search Console, all of my client's website pages have been indexed. My questions: Why is my software not crawling all of the pages associated with the website? If I add a robots.txt file & sitemap, will that resolve the issue? Thanks ahead of time for all of the great responses. B/R Will H.
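On the robots.txt question: it won't hurt, and some crawling tools behave oddly when robots.txt is missing or malformed. A minimal permissive robots.txt that also advertises a sitemap can be sanity-checked with Python's standard-library parser; the example.com domain and sitemap URL below are placeholders for the client's site:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt: allow every crawler and advertise the sitemap.
robots_txt = """\
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
print(parser.can_fetch("Googlebot", "https://example.com/services"))  # → True
print(parser.site_maps())  # → ['https://example.com/sitemap.xml']
```

An empty `Disallow:` line means nothing is blocked, so every crawler is allowed everywhere.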
Intermediate & Advanced SEO | MarketingChimp10
-
To recover from the Penguin update, shall I remove the links or disavow them?
Hi, One of our websites was hit by the Penguin update, and I now know where the links are coming from. I have the chance to remove those incoming links, but I am a little confused: should I just remove the links, or disavow them? Thanks
Intermediate & Advanced SEO | Rubix
-
Are disavowed links removed from the GWMT?
Hi, I am disavowing some links. Does anyone know if Google removes them from WMT? This is interesting for follow-up purposes. Thanks
Intermediate & Advanced SEO | BeytzNet
-
Removed Site-wide links
Hi there, I have recently removed quite a lot of site-wide links, leaving only a single link on the homepages of some websites. Since doing this I have seen a dramatic drop in my keywords, going from position 2-3 to nowhere. Has anyone else experienced anything like this? Should I expect these keywords to recover? Thanks
Intermediate & Advanced SEO | Paul78
-
Forwarding Empty URLs to Homepage for SEO & Old Backlink Salvaging - Is there any value or risk?
Our company owns about 30 domains that we aren't currently using. Is there any SEO value to be gained by forwarding these content-less domains to our homepage if they aren't currently indexed by Google? Some of these sites were previously in use at low traffic volumes by companies who licensed use of our brand and URL. After parting ways a year or longer in the past, no 301 redirection was done to save the link juice, so it's long gone at this point. However, there may be some sites on the net that still link to various pages on those domains. What would be the best course of action to salvage any value from these domains until they are in use again as full websites? Insights would be greatly appreciated! Cheers, Justin
Intermediate & Advanced SEO | grayline
-
301 a page and then remove the 301
I have a real estate website that has a city hub page. All the homes for sale within a city are linked to from this hub page. Certain small cities may have one home on the market for a month and then not have any homes on the market for months or years. I call them "Ghost Cities". This problem happens across many cities at any point in time. The resulting city hub pages are left with little to no content. We are throwing around the idea of 301 redirecting these "Ghost City" pages to a page higher up in the hierarchy (Think state or county) until we get new homes for sale in the city. At that point we would remove the 301. Any thoughts on this strategy? Is it bad to turn 301s on and off like that? Thanks!
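One way to frame the toggle: treat the empty-hub state as temporary and answer with a 302 rather than a 301, since a 301 signals a permanent move and can take much longer to unwind once listings return. A sketch of the routing decision under that assumption; the function name, threshold, and URLs are illustrative, not from the thread:

```python
def city_hub_response(active_listings: int, city_url: str, parent_url: str):
    """Decide how to serve a city hub page given its current inventory.

    Returns an (HTTP status, URL) pair. Cities with no listings are
    temporarily redirected up the hierarchy (county or state page).
    """
    if active_listings == 0:
        # 302, not 301: the hub page is expected to come back, and a
        # permanent redirect can take much longer to reverse in the index.
        return (302, parent_url)
    return (200, city_url)

print(city_hub_response(0, "/wa/duvall", "/county/king"))   # → (302, '/county/king')
print(city_hub_response(4, "/wa/seattle", "/county/king"))  # → (200, '/wa/seattle')
```

Because the decision runs on every request, the redirect lifts itself automatically the moment a new listing appears, with no manual on/off toggling.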
Intermediate & Advanced SEO | ChrisKolmar
-
Geo-targeted homepage for users vs crawlers
Hello there! This is my first post here on SEOmoz. I'll get right into it then... My website is housingblock.com, and the homepage runs entirely off of geo-targeting the user's IP address to display the most relevant results immediately to them. Can potentially save them a search or three. That works great. However, when crawlers frequent the site, they are obviously being geo-targeted for their IP address, too. Google has come to the site via several different IP addresses, resulting in several different locations being displayed for it on the homepage (Mountain View, CA or Clearwater, MI are a couple). Now, this poses an issue because I'm worried that crawlers will not be able to properly index the homepage because the location, and ultimately all the content, keeps changing. And/or, we will be indexed for a specific location when we are in fact a national website (I do not want to have my homepage indexed/ranked under Mountain View, CA, or even worse, Clearwater, MI [no offence to any Clearwaterians out there]). Of course, my initial instinct is to create a separate landing page for the crawlers, but for obvious reasons, I am not going to do that (I did at one point, but quickly reverted back because I figured that was definitely not the route to go, long-term). Any ideas on the best way to approach this, while maintaining the geo-targeted approach for my users? I mean, isn't that what we're supposed to do? Give our users the most relevant content in the least amount of time? Seems that in doing so, I am improperly ranking my website in the eyes of the search engines. Thanks everybody! Marc
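One defensible pattern here is to keep the logic identical for users and crawlers, but fall back to a neutral national page whenever the IP can't be resolved to a location with confidence, rather than branching on user-agent (which risks looking like cloaking). A toy sketch under that assumption; the prefix table is a made-up stand-in for a real geo-IP database:

```python
# Toy geo lookup: map known IP prefixes to a market page, and fall back to a
# national default for anything unrecognized, crawlers included.
GEO_PREFIXES = {
    "73.11.": "Seattle, WA",
    "98.154.": "Los Angeles, CA",
}

def homepage_market(ip: str, default: str = "United States") -> str:
    """Return the market to render for this visitor's homepage."""
    for prefix, market in GEO_PREFIXES.items():
        if ip.startswith(prefix):
            return market
    return default

print(homepage_market("73.11.4.2"))    # known prefix → "Seattle, WA"
print(homepage_market("66.249.66.1"))  # unknown (e.g. a crawler) → "United States"
```

Pairing this with a canonical URL on the homepage keeps the indexed version stable even while the rendered market varies per visitor.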
Intermediate & Advanced SEO | THB