Does integrating external supplementary data help or hurt Google's perception of content quality? (e.g., weather info, climate tables, population info, currency exchange data via APIs or open-source databases)
-
We just lost over 20% of our traffic after the Google algorithm update on June 26.
In SEO forums, people guess that it was likely a Phantom update or maybe a Panda update. The most common advice I found was to add more unique content. We already have unique proprietary content on all our pages and plan to add more, but I was also considering adding some content from external sources. Our site is travel-related, so I thought about adding external data to each city page, such as weather, climate data, and currency exchange rates via APIs, plus data such as population figures from open-source databases or statistical information we would find on the web.
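To make this concrete, here is a minimal sketch of the kind of integration I have in mind. The endpoints shown (Open-Meteo for weather, Frankfurter for exchange rates) are just examples of free public APIs; we have not settled on actual providers yet:

```python
import requests

def fetch_city_data(lat, lon, base="USD", quote="EUR"):
    """Pull current weather and one exchange rate for a city page.
    Open-Meteo and Frankfurter are illustrative free public APIs;
    swap in whichever providers are actually chosen."""
    weather = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={"latitude": lat, "longitude": lon, "current_weather": "true"},
        timeout=10,
    ).json()
    rates = requests.get(
        "https://api.frankfurter.app/latest",
        params={"from": base, "to": quote},
        timeout=10,
    ).json()
    return {
        "temperature_c": weather["current_weather"]["temperature"],
        "exchange_rate": rates["rates"][quote],
    }

# Example: supplementary data for a Berlin city page
print(fetch_city_data(52.52, 13.405))
```

On the page itself this would render as a small data box alongside our own editorial content, not replace it.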
I believe this data would be useful to visitors. I understand that purely original content would be ideal, and we will work on that as well.
Any thoughts? Do you think external data is more likely to help or hurt how Google perceives our content quality?
-
Everett, thanks so much. Also, the link to the Quality Rater Guidelines was very interesting and useful.
-
iCourse,
It used to be that Google told their Quality Raters to look for "Supplementary Content". That instruction has recently been removed from their Handbook for Quality Raters, and you can learn more about it here: http://www.thesempost.com/updated-google-quality-rater-guidelines-eat/ .
That said, they probably removed it because people were showing unrelated supplementary content, or because QRs were marking pages with lots of supplementary content and very little unique body content as "High Quality", which they are not.
In your case, all of the ideas you presented sound like useful added information for someone on a local vacation or real estate page.
-
Hi Patrick, thanks, these are very useful links for an audit. Also, the Barracuda tool is great.
In our case, we are already quite confident that our focus should be adding more content to our roughly 1,000 city category pages.
My core doubt right now is really: as a quick first step, should I add the external data mentioned above to the city pages, or might it hurt us in the eyes of Google? For visitors it would be useful.
-
Hi there
What I would do is take a look at the algorithm updates and line up your analytics with the dates. Barracuda actually has a great tool to make this easy on you; there is also a quick sketch after the list below for doing the same thing yourself. Note which pages dropped the most. From there, I would look at the following resources:
- How To Do a Content Audit (Moz)
- Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz)
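Here's a minimal sketch of what I mean by lining things up, assuming you export a CSV of daily organic sessions from your analytics package. The filename, column names, and the update list are placeholders you'd fill in from an algorithm update history:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export: one row per day with columns "date" and "sessions"
traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])

# Known or suspected algorithm update dates (fill in from an update history)
updates = {"2015-06-26": "suspected quality ('Phantom') update"}

# Plot traffic and drop a vertical line on each update date
ax = traffic.plot(x="date", y="sessions", figsize=(10, 4), legend=False)
for day, label in updates.items():
    ax.axvline(pd.Timestamp(day), color="red", linestyle="--")
    ax.annotate(label, xy=(pd.Timestamp(day), traffic["sessions"].max()),
                rotation=90, va="top", fontsize=8)
ax.set_ylabel("organic sessions")
plt.tight_layout()
plt.show()
```

If a drop lines up cleanly with a known update, that tells you which part of the algorithm to investigate first.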
I am not so worried about tools and plugins (as long as they are credible and you're not abusing them) as I am about the fact that travel sites covering a lot of cities often reuse the same content and simply switch the city names out. I would review duplicate content best practices and make sure you're not inadvertently relying on this tactic.
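If you want a quick way to gauge that across your 1,000 city pages, here's a rough sketch (just an illustration, not a Moz tool) that scores how templated two pages are using overlapping word shingles; a score near 1.0 means the pages differ by little more than the swapped city names:

```python
import re

def shingles(text, n=5):
    """Lower-case word 5-grams ('shingles') of a page's body text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(page_a, page_b):
    """Jaccard similarity of the two pages' shingle sets; a value near
    1.0 means the same template with only a few words swapped out."""
    a, b = shingles(page_a), shingles(page_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Example: compare the body text of two city pages
paris = "Visit Paris, a city famous for its food, museums and nightlife..."
lyon = "Visit Lyon, a city famous for its food, museums and nightlife..."
print(f"{similarity(paris, lyon):.2f}")  # high score = templated duplication
```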
Let me know if this helps, happy to help where I can! Good luck!
Patrick