Is it possible to use Google Authorship in an online shop?
-
Today I installed Google Authorship on my WordPress blog, and I would like to know if it's possible to implement it in my OpenCart online shop.
I am not interested in rich snippets because I have 9,000 products and 90% of them have no sales or reviews.
-
Just a point here, as I ran into this thread in the "Bounty" Q&A section: Google dropped authorship pictures right around the time this thread started, so being more visual will not have the same impact it once had.
Honestly, I agree with EGOL. Using authorship as a tactic on product pages, when there is really nothing you actually authored, sounds like an approach that will get you penalized down the road.
-
It wouldn't be worth doing, as it could have the opposite effect of what you want. Instead, install a markup plugin for OpenCart like this one.
I understand you don't have a lot of reviews, but if you can get images of your products showing up, or any kind of product data that makes your links stand out in the SERPs, it couldn't hurt.
-
I agree with Prestashop.
These will probably be skimpy pages. Matt Cutts recently said that authors with low-quality content could see that work to their disadvantage.
-
Yes, it can be done, but it would not have a good effect. Authorship was meant for articles and substantial written content, not for generic content like thousands of product descriptions. It is also simply not the correct use of authorship; this is a case where you would use publisher markup instead.
If you ever decided to blog or post articles anywhere else, you would be hurt as well. As best anyone understands right now, Google ranks authors by their content and the sites they write on. Tying your author profile to loads of low-quality content like product pages would drive your author rank into the ground and, I would bet, hurt the rankings of your blog posts too.
As for rich snippets, you can have product snippets without reviews. I would also point out that since OpenCart is template-based, all you have to do is add the markup to the product template and it will apply to all of your products.
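To illustrate, here is a minimal sketch of what schema.org Product microdata in a product template might look like. The values shown are placeholders, and any template variables you wire in will vary by OpenCart version and theme, so treat this as a starting point rather than exact markup:

<!-- A minimal sketch of schema.org Product microdata; the values are
     placeholders to be filled from your theme's template variables
     (names vary by OpenCart version), not production markup. -->
<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Product Name</h1>
  <img itemprop="image" src="product-image.jpg" alt="Product Name" />
  <div itemprop="description">Short product description.</div>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <meta itemprop="priceCurrency" content="USD" />
    <span itemprop="price" content="19.99">$19.99</span>
    <link itemprop="availability" href="http://schema.org/InStock" />In stock
  </div>
</div>

Because this lives in the shared product template, every product page picks it up automatically, and you can check the result with Google's Structured Data Testing Tool.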
-
Well, you said you didn't care about the rich snippets ;-). Have you also read this? searchengineland.com/google-plays-authorship-search-results-dropping-profile-image-google-circle-count-195163
-
Hi, it's an advantage because my listings will be more visual than my competitors' and I'll get some extra traffic.
How could I do it? Thank you very much for your reply.
-
Of course it would be possible, but how do you see it as an advantage for the shop? The products don't have an author, right? So you would be applying authorship markup without the thing it exists for: attributing written content to its author.
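For reference, here is a rough sketch of how classic authorship markup was implemented (the profile URL and author name are placeholders), which shows why it only makes sense on pages someone actually wrote:

<!-- Sketch of classic Google Authorship markup; the profile URL and author
     name are placeholders. The Google+ profile also had to link back to the
     site under "Contributor to" for the attribution to be recognized. -->
<link rel="author" href="https://plus.google.com/YOUR-PROFILE-ID" />
<!-- or as a visible byline within the content: -->
<p>Written by <a rel="author" href="https://plus.google.com/YOUR-PROFILE-ID">Author Name</a></p>

On a product page there is simply no author to point that link at.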