Automated Statistical Data Unique Enough?
-
I have many pages that will soon carry a lot of statistical data (real estate related). Each page represents a neighborhood, and the stats are unique to each neighborhood. However, the stats follow the same pattern on every page: Nr of Sales year-to-date, Median Sales Price, etc. This is great value to users, but I wonder whether such a pattern of similar calculations (even with unique results per neighborhood) across many pages could be seen as lacking uniqueness, since all pages follow the same structure and sentence patterns (Nr of Sales year-to-date, Median Sales Price, etc.). Adding to this, these statistics will be the only truly unique content on these pages.
-
These pages' only unique content will be the dynamic statistical data. So I may have 100 different pages that all show:
Median Price
Nr of Properties Sold in 2013, 2012, 2011
$ Volume of Sold Properties in 2013, 2012
etc.
Obviously the dollar figures and counts will differ for each neighborhood, but otherwise the writing and layout are the same across many pages. I wonder if that is good enough; simply offering value to users doesn't always seem to be all we should keep in mind, as I see many websites perform very well taking a simple approach of writing a general blurb and nothing more.
-
I think you didn't realize that you already gave the answer to your own question ;-). "It is great value to users": then do it. Google won't object to a couple of elements you calculate dynamically. Reviews are a similar thing, and Google (tends to) love them.
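If the worry is that every page shares one sentence template, one mitigation is to rotate a handful of phrasings per neighborhood. Below is a minimal Python sketch of the idea; the template wording, field names, and sample figures are made up for illustration:

```python
import random

# Hypothetical sentence templates; rotating them keeps the stat
# blurbs from sharing identical wording across every neighborhood page.
TEMPLATES = [
    "{hood} recorded {sales} sales year-to-date at a median price of ${median:,}.",
    "So far this year, {sales} properties sold in {hood}; the median sale was ${median:,}.",
    "The median sales price in {hood} is ${median:,}, across {sales} sales year-to-date.",
]

def stat_blurb(hood: str, sales: int, median: int) -> str:
    # Seed on the neighborhood name so each page keeps a stable
    # (but different) template between page rebuilds.
    rng = random.Random(hood)
    return rng.choice(TEMPLATES).format(hood=hood, sales=sales, median=median)

print(stat_blurb("Maplewood", 42, 415000))
```

Seeding the generator on the neighborhood name means a given page always renders the same sentence, while different neighborhoods tend to get different phrasings.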
Related Questions
-
Mapping ALL search data for a broad topic
Hi All, As our company becomes a bigger and bigger entity I'm trying to figure out how I can create more autonomy. One of the key areas that needs fixing is briefing the writers on articles based on keywords. We're not just trying to go after the low-hanging fruit or the big-money keywords but actually comprehensively cover every topic, provide genuinely good-quality, up-to-date info (surprisingly rare in a competitive niche), and eventually cover pretty much every topic there is. We generally work on a 3-tier system at the folder level: topics and then sub-topics. The challenge is getting an agency to: a) be able to pull all of the data without being knowledgeable in our specific industry. We're specialists and thus target people that need specialist expertise as well as more mainstream stuff (the stuff that run-of-the-mill people wouldn't know about). b) know where it all fits topically, as we organise the content on a hierarchy basis, and we generally cover multiple smaller topics within articles. Am I asking for the impossible here? It's the one area of the business I feel most nervous about creating autonomy with. Can we be as extensive and comprehensive as a wiki-type website without somebody within the business who knows it providing the keyword research? I did a search for all data using the main two seed keywords for this subject on Ahrefs and it came up with 168,000 lines of spreadsheet data. Obviously this went way beyond the maximum I was allowed to export. Interested in feedback and, if any agencies are up for the challenge, do let me know! I've been using Moz Pro for a long time but have never posted, and apologise if what I'm describing is explained badly here. Requirements: keywords to cover all (broad niche) related queries in the UK, with no relevant UK (broad niche) keywords missed; organised in a way that can be interpreted as article briefs and folder-structure instructions.
Questions: How would you ensure you cover every single keyword? Assuming no specialist X knowledge, how will you be able to map content and know which search queries belong in which topics and in what order? Also (where there is keyword leakage from other regions), how will you know which are UK terms and which aren't? With minimal X knowledge, how will you know whether you've missed an opportunity or not (what you don't know you don't know)? What specific resources will you require from us in order for this to work? What format will the data be provided in, and how will you present the finished work so that it can be turned into article briefs?
Intermediate & Advanced SEO | d.bird
-
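As a rough illustration of how a 168,000-row export could be pre-sorted into a 3-tier folder structure before a specialist reviews it: bucket keywords by seed phrases and keep an "unmatched" pile for human triage. The topic map below is entirely hypothetical; a real taxonomy would come from someone who knows the niche.

```python
# Hypothetical topic map: each (tier-1 folder, sub-topic) pair owns a
# list of trigger phrases. A real map needs input from a niche specialist.
TOPIC_RULES = {
    ("mortgages", "first-time-buyers"): ["first time buyer", "ftb mortgage"],
    ("mortgages", "remortgaging"): ["remortgage", "switch mortgage"],
}

def bucket_keywords(keywords):
    """Assign each keyword to the first matching (folder, sub-topic);
    anything unmatched goes to a pile for manual review."""
    buckets = {key: [] for key in TOPIC_RULES}
    unmatched = []
    for kw in keywords:
        for key, triggers in TOPIC_RULES.items():
            if any(t in kw.lower() for t in triggers):
                buckets[key].append(kw)
                break
        else:
            unmatched.append(kw)
    return buckets, unmatched

buckets, unmatched = bucket_keywords([
    "best first time buyer mortgage",
    "how to remortgage",
    "bridging loan rates",
])
print(buckets)
print(unmatched)
```

Naive substring matching will misfile plenty of queries, but it shrinks the manual job from every row to just the unmatched pile and a spot-check of each bucket.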
How many images should I use in structured data for a product?
We have a basic printing website that offers business cards. Each type of business card has a few product images. Should we use structured data for all the images, or just the main image? What is your opinion about this? Thanks in advance.
Intermediate & Advanced SEO | Choice
-
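On the multiple-images question above: schema.org allows the Product `image` property to hold an array, so all of a card's photos can be listed rather than only the lead image. A sketch that builds such markup in Python; the product name, description, and URLs are invented for illustration:

```python
import json

# Sketch of Product structured data listing every product image.
# The "image" property accepts an array of URLs (all values are made up).
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Premium Business Cards",
    "image": [
        "https://example.com/cards-front.jpg",
        "https://example.com/cards-back.jpg",
        "https://example.com/cards-stack.jpg",
    ],
    "description": "Full-color business cards on 16pt stock.",
}

# This string would be embedded in a <script type="application/ld+json"> tag.
print(json.dumps(product_jsonld, indent=2))
```

Listing every image costs nothing in markup terms, and search engines can then pick whichever crop suits the result format.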
What would cause my structured data items to drop off?
I have about 20,000 items on my Magento site. I used a plugin to add structured data. If I use the Webmaster Tools Structured Data Tester, everything shows up perfectly; there are no errors on any page that I have spot-checked. My total items increased to about 2,500 but have now started dropping. The numbers have fallen to about 725 over the last few weeks. What can I check?
Intermediate & Advanced SEO | Tylerj
-
Building a product clients will integrate into their sites: What is the best way to utilize my clients' unique domain names?
I'm designing a hosted product my clients will integrate into their websites; their end users would access it via my clients' customer-facing sites. It is a product my clients pay for which provides a service to their end users, who would have to log in to my product via a link provided by my clients. Most clients would choose to incorporate this link prominently on their home page and site nav.
Intermediate & Advanced SEO | emzeegee
All clients will be in the same vertical market, so their sites will be keyword rich and related to my site.
Many may even be .orgs and .edus. The way I see it, there are three main ways I could set this up within the product.
I want to know which is most beneficial, or if I'm missing anything. 1: They set up a subdomain at their domain that serves content from my domain. product.theirdomain.com would render content from mydomain.com's database.
product.theirdomain.com could have footer and/or other nofollow links to mydomain.com with target keywords. The risk I see here is having hundreds of sites with the same target keyword linking back to my domain.
This may be the worst option, as I'm not sure the nofollow will help, because I know Google considers this kind of link to be a link scheme: https://support.google.com/webmasters/answer/66356?hl=en 2: They link to a subdomain on mydomain.com from their nav/site
Their nav would include an actual link to product.mydomain.com/theircompanyname
Each client would have a different "theircompanyname" link.
They would decide and/or create their link method (graphic, presence of alt tag, text, what text, etc).
I would have no control aside from requiring them to link to that URL on my server. 3: They link to a subdirectory on mydomain.com from their nav/site
Their nav would include an actual link to mydomain.com/product/theircompanyname
Each client would have a different "theircompanyname" link.
They would decide and/or create their link method (graphic, presence of alt tag, text, what text, etc).
I would have no control aside from requiring them to link to that URL on my server. In all scenarios, my marketing content would be set up around mydomain.com, both as static content and a blog directory, all with SEO-friendly URL slugs. I'm leaning towards option 3, but would like input!
-
Google webmaster tools showing "no data available" for links to site, why?
In my Google Webmaster Tools account I'm seeing data in every other category, but when I click "links to your site" I get a "no data available" message. Does anyone know why this is happening? And if so, what can I do to fix it? Thanks.
Intermediate & Advanced SEO | Nicktaylor1
-
Using unique content from "rel=canonical"ized page
Hey everyone, I have a question about the following scenario: Page 1: Text A, Text B, Text C. Page 2 (rel=canonical to Page 1): Text A, Text B, Text C, Text D. Much of the content on Page 2 is canonicalized to Page 1 to signal duplicate content. However, Page 2 also contains some unique text not found on Page 1. How safe is it to use the unique content from Page 2 on a new page (Page 3) if the intention is to rank Page 3? Does that make any sense? 🙂
Intermediate & Advanced SEO | ipancake
-
Is it possible to "undo" canonical tags as unique content is created?
We will soon be launching an education site that teaches people how to drive (not really the topic, but it will do). We plan on being content rich and have plans to expand into several "schools" of driving. Currently, content falls into a number of categories, for example rules of the road, shifting gears, safety, etc. We are going to group content into general categories that apply broadly, and then into "schools" where the content is meant to be consumed in a specific order. So, for example, some URLs in general categories may be: drivingschool.com/safety drivingschool.com/rules-of-the-road drivingschool.com/shifting-gears etc. Then, schools will be available for specific types of vehicles. For example, drivingschool.com/cars drivingschool.com/motorbikes etc. We will provide lessons at the school level, and in the general categories. This is where it gets tricky. If people are looking for general content, then we want them to find pages in the general categories (for example, drivingschool.com/rules-of-the-road/traffic-signs). However, we have very similar content within each of the schools (for example, drivingschool.com/motorbikes/rules-of-the-road/traffic-signs). As you could imagine, sometimes the content is very unique between the various schools and the general category (such as in shifting), but often it is very similar or even nearly duplicate (as in the example above). The problem is that in the schools we want to say at the end of the lesson, "after this lesson, take the next lesson about speed limits for motorcycles" so there is a very logical click-path through the school. Unfortunately this creates potential duplicate content issues. The best solution I've come up with is to include a canonical tag (pointing to the general version of the page) whenever there is content that is virtually identical. There will be cases though where we adjust the content "down the road" 🙂 to be more unique and more specific for the school. 
At that time we'd want to remove the canonical tag. So two questions: Does anyone have any better ideas of how to handle this duplicate content? If we implement canonical tags now, and in 6 months update content to be more school-specific, will "undoing" the canonical tag (and even adding a self-referential tag) work for SEO? I really hope someone has some insight into this! Many thanks (in advance).
Intermediate & Advanced SEO | JessicaB
-
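On the "undo" question above: if each lesson's canonical target is computed from a per-lesson uniqueness flag, removing the canonical later becomes a one-flag data change rather than a template rewrite. A rough Python sketch using the URLs from the question; the helper name and flag are hypothetical, not an established pattern:

```python
def canonical_link(page_url: str, general_url: str, is_unique: bool) -> str:
    # While a school page mirrors the general lesson, point its canonical
    # at the general version; once the content is rewritten to be
    # school-specific, flip the flag for a self-referential canonical.
    target = page_url if is_unique else general_url
    return f'<link rel="canonical" href="{target}">'

# Near-duplicate lesson: canonical points at the general category page.
print(canonical_link(
    "https://drivingschool.com/motorbikes/rules-of-the-road/traffic-signs",
    "https://drivingschool.com/rules-of-the-road/traffic-signs",
    is_unique=False,
))
```

With this setup, "undoing" the canonical is exactly the self-referential case the poster asks about, and no template code changes when content becomes unique.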
Multiple domain level redirects to unique sub-folder on one domain...
Hi, I have a restaurant menu directory listing website (for example www.menus.com). Restaurants can have their menu listed on this site along with other details such as opening hours, photos, etc. An example of a restaurant URL might be www.menus.com/london/bobs-pizza. A feature I would like to offer is the ability for Bob's Pizza to use the menus.com listing as his own website (let's assume he has no website currently). I would like to purchase www.bobspizza.com and 301 redirect it to www.menus.com/london/bobs-pizza. Why?
Intermediate & Advanced SEO | blackrails
So Bob can then list bobspizza.com on his advertising material (business cards etc., rather than www.menus.com/london/bobs-pizza). I was considering using a 301 redirect for this, though I have been told that too many domain-level redirects to one single domain can be flagged as spam by Google. Is there any other way to achieve this outcome without being penalised? A rel=canonical URL, or URL masking? Other things to note: it is fine if www.bobspizza.com is NOT listed in search results. I would ideally like any link juice pointing to www.bobspizza.com to pass to www.menus.com, though this is a nice-to-have. If it comes at the cost of being penalised I can live without the link juice. Thanks.
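For the vanity-domain idea, the redirect logic itself can stay tiny: one lookup from the request's Host header to the deep URL, issuing a single 301 hop per domain. A framework-agnostic Python sketch; the mapping table is illustrative, and real deployments would express the same table in web-server config:

```python
# One server answers for every vanity domain and maps each host to
# exactly one deep URL on menus.com (domain names are illustrative).
VANITY_HOSTS = {
    "www.bobspizza.com": "https://www.menus.com/london/bobs-pizza",
    "bobspizza.com": "https://www.menus.com/london/bobs-pizza",
}

def redirect_for(host: str):
    """Return (status, Location) for a request's Host header, or None."""
    target = VANITY_HOSTS.get(host.lower())
    return (301, target) if target else None

print(redirect_for("www.bobspizza.com"))
```

Keeping it to one hop (bare and www hosts both going straight to the final URL, never chaining through each other) keeps the redirect graph as clean as it can be, whatever Google makes of the pattern overall.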