Getting pages that load dynamically into the SEs
-
SEO'ers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows definitions of words, loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. But how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking up these pages automatically (I guess because they're dynamic), but when using a sitemap crawler the pages are detected.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the SEs? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
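For reference, since the plugin won't pick these up, I could probably generate the sitemap files myself from the word list. Here's a rough sketch of what I mean (the URL pattern and word list here are placeholders, not my actual setup):

```python
# Sketch: build a sitemap XML file for dynamically rendered definition pages.
# The base URL and word list are placeholders for illustration only.

BASE_URL = "https://www.example.com/dictionary/"
URLS_PER_FILE = 50000  # sitemaps.org limit per sitemap file; 200k+ words
                       # would need several files plus a sitemap index.

def build_sitemap(words):
    """Return one sitemap XML document for up to 50,000 word slugs."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for word in words:
        lines.append(f"  <url><loc>{BASE_URL}{word}/</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

words = ["example", "sample"]  # in practice, read the full word list
print(build_sitemap(words))
```

The full word list would get chunked into files of 50,000 URLs each, tied together with a sitemap index file that gets submitted in Search Console.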
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions, elaborating on the origin of the word, and so on. Beyond that you'd probably see a decline in ROI.
-
Everett, yes, that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to get these pages indexed, if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a "noindex, follow" robots meta tag to the HTML head of those pages.
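For clarity, that's just the standard robots meta tag, placed inside the head of each definition page:

```html
<head>
  <!-- Tell crawlers not to index this page, but still follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

That way the pages stay usable for your visitors, and link equity still flows through them, but the duplicate definitions are kept out of the index.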
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed, tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, did you think the site might get penalized because it would, in essence, be duplicate content from another site, even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party's. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.