Is Automated Statistical Data Unique Enough?
-
I have many pages that will soon carry a lot of statistical data (real estate related). Each page represents a neighborhood, and the stats are unique to that neighborhood. However, the stats follow the same pattern on every page: number of sales year-to-date, median sales price, and so on. This is great value to users, but I wonder whether such a pattern of similar calculations (with unique results for each neighborhood) across many pages could be seen as lacking uniqueness, since all pages share the same structure and sentence patterns. Adding to this, these statistics will be the only truly unique content on these pages.
-
These pages' only unique content will be the dynamic statistical data. So I may have 100 different pages that all show:
Median Price
Number of Properties Sold in 2013, 2012, 2011
$ Volume of Sold Properties in 2013, 2012
and so on. Obviously the dollar figures and counts will be different for each neighborhood, but otherwise the writing and layout are identical across many pages. I wonder if that is good enough; just offering value to users does not always seem to be the whole story, as I see many websites perform very well taking a simple approach, writing a general blurb and nothing more.
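To make the pattern concrete, here is a minimal sketch (Python, with hypothetical field names and figures) of how one shared blurb template might render per-neighborhood stats: the sentence structure repeats on every page while only the values differ.

```python
# Sketch of a per-neighborhood stats blurb rendered from one shared
# template. Field names and figures below are hypothetical examples.

TEMPLATE = (
    "{name}: median sales price {median_price}; "
    "{sales_ytd} properties sold year-to-date."
)

def render_blurb(stats: dict) -> str:
    """Fill the shared template with one neighborhood's figures."""
    return TEMPLATE.format(**stats)

neighborhoods = [
    {"name": "Elm Park", "median_price": "$310,000", "sales_ytd": 42},
    {"name": "River West", "median_price": "$275,500", "sales_ytd": 17},
]

for n in neighborhoods:
    print(render_blurb(n))
```

Every page produced this way shares structure but not data, which is exactly the situation the question describes.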
-
I think you didn't realize that you already answered your own question ;-). "It is great value to users": then do it. Google won't object to a couple of elements you dynamically calculate. Reviews are a similar thing, and they (tend to) love those.
Related Questions
-
Structured data in Google Webmaster Tools
Hey, during the last year I have done everything in my power to please Google with my website. Instead of building links to the page I have focused on content, content and content. In addition I have worked on HTTPS and page speed. Today my site is faster than 98% of all sites tested in Pingdom Tools and scores 94/83 in Google PageSpeed Insights. Of course we have had to build some links as well, perhaps 50 links in 8 months. At the same time we have built 700 pages of text. The total number of links built is 180 over 20 months.
On Thursday last week it looks like the site was penalized by Google. I still believe we can do something about it and get the site back on track, so we have been looking at technical aspects of the site to see if there is anything Google doesn't like. One thing I have found is structured data: for some reason the count has dropped from 875 a month ago to 3 today, and I have no clue why. Does anyone know how the structured data report works and what could have caused this problem? Could our attempts to optimize the site have done something that affects the structured data? http://imgur.com/a/vurB1
In that case, what effect might this drop in structured data have on SEO? Could it be the reason for the total drop in rankings? (We have basically been wiped on all our keywords.) From what I can see in Google Webmaster Tools, about 975 pages are still indexed, which has been stable for a long time. Does anyone know more about the structured data report and what I can do about this?
Intermediate & Advanced SEO | Enigma123
Thanks in advance! /A
-
Should HTML be included in the structured data (schema) markup for the main body content?
Lately we have been applying structured data to the main content body of our clients' websites. Our lead developer had a good question about HTML, however: in JSON-LD, what is the proper way to embed content from a data field that contains HTML markup (i.e. p, ul, li, br tags) into mainContentOfPage? Should the HTML be stripped out or escaped somehow? I know that applying schema to the main body content is helpful for Googlebot, but should we keep the HTML? Any recommendations or best practices would be appreciated. Thanks!
Intermediate & Advanced SEO | RosemaryB
-
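On the escaping question above, a small sketch may help: an HTML fragment embedded in a JSON-LD field only needs ordinary JSON string escaping, which any JSON serializer does automatically (Python's json module here). Whether to keep or strip the tags remains a judgment call; this only shows that escaped HTML yields valid JSON-LD.

```python
import json

# An HTML fragment from a (hypothetical) content field.
html_body = '<p class="intro">Welcome to <b>our site</b></p>'

# json.dumps escapes the double quotes and backslashes for us;
# the HTML tags themselves do not need special treatment.
jsonld = json.dumps({
    "@context": "https://schema.org",
    "@type": "WebPage",
    "mainContentOfPage": html_body,
}, indent=2)

# Round-trip check: parsing the JSON-LD recovers the HTML intact.
assert json.loads(jsonld)["mainContentOfPage"] == html_body
print(jsonld)
```

The serialized string is safe to place inside a `<script type="application/ld+json">` tag.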
Have I set up my structured data correctly? The testing tool suggests not
Hi, I've recently marked up some Events for a client in the hope that they'll appear as rich snippets in their SERPs. I have access to their Google Search Console, so I used the Data Highlighter facility to mark them up rather than the Raven plugin available for WordPress sites like this. I completed this on 10th July and the snippets are yet to appear. I understand that this can take time and there are no guarantees, but as a novice it would be reassuring if someone could advise whether I have done this correctly. We did, incidentally, resubmit a sitemap after completing this task, but I'm not sure if that makes any difference. I've read that it's the Structured Data Testing Tool that I need to use to test my markup, but when I input the URLs below, the tool doesn't tell me a lot, which either suggests I've marked it up incorrectly, or I don't know how to read it!
http://www.ad-esse.com/events/19th-august-2015-reducing-costs-changing-culture-improving-services/
http://www.ad-esse.com/events/160915-reducing-costs-changing-culture-improving-services-london/
http://www.ad-esse.com/events/151015-reducing-costs-changing-culture-improving-services-london/
Any guidance welcomed! Many thanks,
Nathan
Intermediate & Advanced SEO | nathangdavidson
-
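As a point of comparison to the Data Highlighter approach above, Event markup can also be added explicitly to the page as a JSON-LD script tag, which the testing tool reads directly. A sketch, using the date suggested by the first event URL; the venue details are hypothetical placeholders:

```python
import json

# Explicit schema.org Event markup for one of the events above.
# The name/date come from the URL; the location is a placeholder.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Reducing Costs, Changing Culture, Improving Services",
    "startDate": "2015-08-19",
    "location": {
        "@type": "Place",
        "name": "London (venue TBC)",  # hypothetical
        "address": "London, UK",
    },
}

# Wrap the JSON-LD in the script tag that goes in the page <head> or <body>.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(event)
    + "</script>"
)
print(script_tag)
```

Unlike Data Highlighter annotations, markup embedded this way shows up in the Structured Data Testing Tool, which makes "did I do this correctly?" much easier to answer.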
Generating Rich Snippets without Structured Data
I noticed something in Google search results today that I can't explain. Any help would be appreciated. I performed a real-estate-based search and the top result featured a rich snippet showcasing the following: Address, Price, Bd/Ba.
912 Garden District Dr #17, Charlotte, NC 28202 | $179,990 | 3 / 2
222 S Caldwell St #1602, Charlotte, NC 28202 | $389,238 | 2 / 2½
However, when I visit the page associated with this information, there is no schema to be found. In fact, the page is, for the most part, just a large table listing homes on the market, with the table headings Address, Price, and Bd/Ba. Is it common for Google to use table-based data to generate rich snippets? What is the best way to influence this? In the absence of schema (as the page in question has no schema implementation), does Google default to table data? Has anyone seen this behavior before and, if so, can you point me to it? EDIT: I've now come across a few other examples where the information is not in a table but in divs. Why are such sites (you can find some by searching for "[ZIPCODE] real estate") getting this treatment?
Intermediate & Advanced SEO | RyanOD
-
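One way to influence the behavior described above, rather than hoping Google infers structure from an HTML table, is to state the rows explicitly in markup. A sketch using the two listings quoted above; the choice of schema.org ItemList/Offer types here is one plausible option, not a prescription:

```python
import json

# The two listing rows from the snippet above.
rows = [
    ("912 Garden District Dr #17, Charlotte, NC 28202", 179990),
    ("222 S Caldwell St #1602, Charlotte, NC 28202", 389238),
]

# Express each row explicitly as structured data instead of relying
# on Google's table inference.
listings = {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "itemListElement": [
        {
            "@type": "Offer",
            "name": address,
            "price": price,
            "priceCurrency": "USD",
        }
        for address, price in rows
    ],
}
print(json.dumps(listings, indent=2))
```

With explicit markup, the rich-snippet eligibility no longer depends on Google correctly reverse-engineering a table or a pile of divs.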
Using a 302 redirect from http://www to https://www to secure customer data
My website sends customers from an http://www.mysite.com/features page to an https://www.mysite.com/register page, an account sign-up form, using a 302 redirect. Any page that collects customer data has an authenticated SSL certificate to protect that data. Is a 302 the most appropriate way of doing this, given that the weekly crawl flags it as bad practice? Is there a better alternative?
Intermediate & Advanced SEO | Ubique
-
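A sketch of the alternative for the question above: if the https:// version is the permanent home of the sign-up page, a 301 rather than a 302 is what audit tools usually expect, since 302 declares the move temporary. Shown here as a minimal Python WSGI handler purely for illustration; in practice the rule would normally live in server configuration, and the hostname is the one from the question:

```python
# Minimal WSGI app: answer any http:// request with a 301 (permanent)
# redirect to the https:// equivalent of the same path.
def redirect_app(environ, start_response):
    target = "https://www.mysite.com" + environ.get("PATH_INFO", "/")
    start_response("301 Moved Permanently", [("Location", target)])
    return [b""]

# Simulate one request to show the response it produces.
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status
    captured["headers"] = dict(headers)

body = redirect_app({"PATH_INFO": "/register"}, fake_start_response)
print(captured["status"], "->", captured["headers"]["Location"])
```

The only functional difference from the current setup is the status line: "302 Found" says temporary, "301 Moved Permanently" says permanent, and crawlers treat the two differently.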
Should I redirect all my subdomains to a single unique subdomain to eliminate duplicate content?
Hi there! I've been working on http://duproprio.com for a couple of years now. In the early stages of the website, we put in place a subdomain wildcard that allowed us to create URLs like this on the fly: http://{some-city}.duproprio.com This instantly brought us a lot of success in terms of traffic, the cities being great search keywords. But now the business has grown, and as we all know duplicate content is the devil, so I've been playing with the idea of killing (redirecting) all those URLs to their equivalents on the root domain: http://some-city.duproprio.com/some-listing-1234 would redirect to the equivalent page at http://duproprio.com/some-listing-1234
Even if my redirects are 301 (permanent), some juice will be lost for each redirected link that currently points to my old subdomains. This would also mean redirecting http://www.duproprio.com to http://duproprio.com, which is probably the part I'm most anxious about, since the incoming links are split almost 50/50 between those two subdomains. Bringing everything back onto a single subdomain is the thing to do to get all my SEO juice together; that part is obvious. But what can I do to make sure I don't end up losing traffic instead of gaining authority? Can you help me get the confidence I need to make this move without risking tons of traffic? Thanks a lot!
Intermediate & Advanced SEO | DuProprio.com
-
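The rewrite described in the question above can be sketched as a small helper that maps any duproprio.com subdomain URL to its root-domain equivalent. This is the mapping a 301 rule would implement; the actual redirect would live in server configuration, this only makes the URL transformation explicit:

```python
from urllib.parse import urlsplit, urlunsplit

ROOT = "duproprio.com"

def to_root_domain(url: str) -> str:
    """Map a {city}.duproprio.com or www.duproprio.com URL to the
    equivalent URL on the bare root domain (the 301 target)."""
    parts = urlsplit(url)
    # Replace the host but keep scheme, path, query and fragment intact.
    if parts.hostname and parts.hostname.endswith(ROOT):
        parts = parts._replace(netloc=ROOT)
    return urlunsplit(parts)

# The example from the question:
print(to_root_domain("http://some-city.duproprio.com/some-listing-1234"))
print(to_root_domain("http://www.duproprio.com/"))
```

Because the path is preserved exactly, every old URL has a single, predictable 301 target, which is what keeps the consolidation from scattering link equity.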
Microdata and dynamic data
Hi, everybody! We're starting up a local services website in Brazil, something like redbeacon.com or thumbtack.com, but obviously different. We are developing the 2.0 version of the site, and I want to put microdata on every provider's page, to mark up people's ratings of that particular provider and geographic information about him. We want to use microdata on several pages, but the provider pages are the most important. These data (geo and rating) will be dynamically generated from our database. On Schema.org I only found information about using static data to build microdata. My doubt is: do Google, Bing, Yahoo, etc. index dynamically generated data? Is there something in sitemaps.xml or robots.txt that I can do to have my data indexed by search engines? Our front-end developer deals with the HTML and our codemaster uses pure PHP. Thanks!
Intermediate & Advanced SEO | ivan.precisodisso
-
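On the indexing doubt above: microdata rendered server-side from database fields arrives at the crawler as plain HTML, indistinguishable from hand-written static markup, so it is indexed the same way. A sketch of such rendering, in Python rather than the site's PHP, with hypothetical provider fields:

```python
# Render microdata HTML from a database row. Once this string is sent
# in the HTTP response, a crawler cannot tell it was generated
# dynamically. The provider record below is a hypothetical example.
provider = {"name": "Joao's Plumbing", "rating": 4.7, "reviews": 23}

html = (
    '<div itemscope itemtype="https://schema.org/LocalBusiness">'
    '<span itemprop="name">{name}</span>'
    '<div itemprop="aggregateRating" itemscope '
    'itemtype="https://schema.org/AggregateRating">'
    '<span itemprop="ratingValue">{rating}</span> / 5 '
    '(<span itemprop="reviewCount">{reviews}</span> reviews)'
    '</div></div>'
).format(**provider)

print(html)
```

No sitemap or robots.txt trick is needed for this: as long as the markup is present in the served HTML (not injected client-side after load), search engines see it on a normal crawl.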
Detailed Reviews Coexisting with Automated Description Articles
Hello all. Think, for instance, of a comparison site for cars, motorbikes, etc., where you have dozens of brands and types of vehicles: diesel vs. petrol, 4x4 vs. sport, and so on. In one part of your site you review them in detail, explaining everything. You also have a database with hundreds of models and several specs (top speed, length, engine, etc.), so you can automatically create an info page for each of these hundreds of models. How would you make both live together on your website? If you add the review to the automated articles, you get an inconsistency, as you cannot manually review all the products. On the other hand, keeping them separate will lead to very, very similar post titles and URLs (review vs. automated versions). In my particular case, I only had the reviews until now, and my site is built in WordPress. I had all the post URLs directly below the home (mysite.com/review-of-car-x-of-brand-y), and now I am going to add the automated ones, and am thinking of creating them as WP custom posts with URLs like mysite.com/cars/description-of-car-x-of-brand-y. But I still have the problem with categories, tags, etc. Well, it is a long question, but what do you think about this?
Intermediate & Advanced SEO | antorome1