Does integration of external supplementary data help or hurt Google's perception of content quality? (e.g. weather info, climate tables, population info, currency exchange data via APIs or open-source databases)
-
We just lost over 20% of our traffic after the Google algorithm update on June 26.
In SEO forums, people guess that it was likely a Phantom update, or maybe a Panda update. The most common advice I found was to add more unique content. We already have unique proprietary content on all our pages and plan to add more, but I was also considering adding some content from external sources. Our site is travel-related, so I thought about adding external data to each city page, such as weather, climate data, and currency exchange rates via APIs, plus data such as population figures from open-source databases or statistical information we would find on the web.
I believe this data would be useful to visitors. I understand that purely original content would be ideal, and we will work on that as well.
Any thoughts? Do you think external data would help or hurt how Google perceives our content quality?
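For what it's worth, supplementary data like this is usually fetched server-side and cached, so the block renders as plain HTML in the page rather than depending on client-side JavaScript. A minimal sketch of that pattern, assuming a hypothetical weather endpoint (the URL and field names are placeholders, not a real service):

```python
import json
import time
from urllib.request import urlopen

# Hypothetical endpoint -- substitute your actual weather/currency provider.
WEATHER_API = "https://api.example.com/weather?city={city}"

_cache = {}        # city -> (timestamp, data)
CACHE_TTL = 3600   # refresh hourly so page views don't hammer the API

def get_weather(city, fetch=None):
    """Return weather data for a city, serving from cache when fresh.

    `fetch` is injectable for testing; by default it calls the API.
    """
    now = time.time()
    if city in _cache and now - _cache[city][0] < CACHE_TTL:
        return _cache[city][1]
    if fetch is None:
        fetch = lambda c: json.load(urlopen(WEATHER_API.format(city=c)))
    data = fetch(city)
    _cache[city] = (now, data)
    return data

def render_supplementary(city, weather):
    """Render the supplementary block as its own HTML section, kept
    clearly separate from the page's unique editorial content."""
    return (
        f'<section class="supplementary">'
        f"<h2>Weather in {city}</h2>"
        f"<p>{weather['temp_c']}&deg;C, {weather['conditions']}</p>"
        f"</section>"
    )
```

Keeping the supplementary block in its own section also makes it easy to reorder later so that unique content always comes first.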
-
Everett, thanks so much. Also, the link to the Quality Rater Guidelines was very interesting and useful.
-
iCourse,
It used to be that Google told its Quality Raters to look for "Supplementary Content". This was recently removed from the Quality Rater Guidelines, and you can learn more about it here: http://www.thesempost.com/updated-google-quality-rater-guidelines-eat/ .
That said, they probably removed it because people were adding unrelated supplementary content, or because raters were marking pages with lots of supplementary content and very little unique body content as "High Quality", which they are not.
In your case, all of the ideas you presented sounded like useful added information for someone on a local vacation or real estate page.
-
Hi Patrick, thanks; these are very useful links for an audit. The Barracuda tool is great, too.
In our case, we are already quite confident that our focus should be adding more content to our roughly 1,000 city category pages.
My core doubt right now is really: should I, as a quick first step, add the external-source data mentioned above to the city pages, or might it hurt us in Google's eyes? For visitors it would be useful.
-
Hi there
What I would do is take a look at the algorithm updates and line up your analytics with the dates. Barracuda actually has a great tool to make this easy on you. Note which pages dropped the most. From there, I would look at the following resources:
- How To Do a Content Audit (Moz)
- Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz)
I am not so much worried about tools and plugins (as long as they are credible and you're not abusing them) as I am about a common pattern: travel sites that have to cover a lot of cities often reuse the same content, simply switching city names out. I would review duplicate-content best practices and make sure you're not inadvertently relying on this tactic.
Let me know if this helps, happy to help where I can! Good luck!
Patrick