Getting dynamically loaded pages indexed in the search engines
-
SEOs,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows definitions of words, loaded dynamically from an open source such as wiktionary.org.
When you visit a page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. But how can we get all of the definition pages indexed in the search engines? The WordPress sitemap plugin isn't picking these pages up automatically (presumably because they're dynamic), yet a sitemap crawler does detect them.
Can anybody advise on how to get these 200k+ pages indexed? If it helps, here's a reference site that appears to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
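If the sitemap plugin can't see the dynamic URLs, one workaround is to generate a sitemap directly from the word list you already have. A minimal sketch in Python, assuming the /dictionary/<word>/ URL pattern from the question (the domain and word list are placeholders):

```python
# Build a sitemap for dynamically generated definition pages.
# The base URL and the word list are illustrative placeholders.
import xml.etree.ElementTree as ET

def build_sitemap(words, base="https://www.example.com/dictionary/"):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for word in words:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = f"{base}{word}/"
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(["example", "sample"])
print(sitemap_xml)
```

Note that the sitemaps.org protocol caps a single sitemap file at 50,000 URLs, so 200k+ pages would need to be split across several files tied together with a sitemap index.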
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them and/or provided value in other ways besides just the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions, elaborating on the origin of the word, etc. Beyond that you'd probably see a decline in ROI.
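That triage can be as simple as sorting by whatever metric you trust most. A rough sketch, assuming a mapping of URLs to pageview counts (the pages and numbers here are made up for illustration):

```python
# Pick the top 20% of pages by traffic as candidates for hand-written,
# unique content. The traffic data below is hypothetical.
def top_pages(traffic, fraction=0.2):
    ranked = sorted(traffic, key=traffic.get, reverse=True)
    cutoff = max(1, int(len(ranked) * fraction))
    return ranked[:cutoff]

traffic = {
    "/dictionary/example/": 900,
    "/dictionary/sample/": 450,
    "/dictionary/word/": 120,
    "/dictionary/term/": 60,
    "/dictionary/test/": 30,
}
print(top_pages(traffic))  # ['/dictionary/example/']
```

The same function works with conversions or revenue per URL; only the input dict changes.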
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions come from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I'd say keep the pages indexable, but in this case I recommend adding a "noindex, follow" robots meta tag to the HTML head of those pages.
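The tag itself is a one-liner in the head of the page; a quick way to verify your template is actually rendering it is to parse the HTML. A sketch using Python's built-in html.parser (the sample markup is hypothetical):

```python
# Check rendered HTML for a <meta name="robots"> tag and report its content.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of every <meta name="robots"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name", "").lower() == "robots":
            self.robots.append(d.get("content", ""))

sample_html = (
    '<html><head>'
    '<meta name="robots" content="noindex, follow">'
    '</head><body>definition page</body></html>'
)
finder = RobotsMetaFinder()
finder.feed(sample_html)
print(finder.robots)  # ['noindex, follow']
```

In practice you'd feed it the HTML fetched from one of the live definition URLs rather than a hard-coded string.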
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. I think it's worth giving it a shot. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found it from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance.
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would, in essence, be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently don't have additional information on them. They're simply better organized on my pages than at the third party. The unique information is what drives visitors to the site, and from those pages it links to the definitions in case visitors are interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.