Getting pages that load dynamically into the SE's
-
SEO'ers,
Am dealing with an issue I cannot figure out the best way to handle. Working on a website that shows the definitions of words which are loaded dynamically from an open source. Source such as: wiktionary.org
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking these pages up automatically - I assume because they're dynamic - but a sitemap crawler does detect them.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the SEs? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
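For the mechanical part of the question - getting 200k+ URLs in front of the crawlers when the WordPress plugin can't see them - the usual route is to generate the sitemap files yourself from the word list. The sitemap protocol caps each file at 50,000 URLs, so a set that size needs several files tied together by a sitemap index. A minimal sketch, assuming a plain list of words and the URL pattern from the question (both hypothetical here):

```python
# Sketch: generate sitemap files for a large dynamic word list.
# Assumes a Python list of words and the URL pattern from the question;
# the BASE pattern and file names are illustrative, not a real site's.
from xml.sax.saxutils import escape

BASE = "https://www.example.com/dictionary/{}/"  # URL pattern from the question
CHUNK = 50000  # sitemap protocol limit: 50,000 URLs per file

def write_sitemaps(words, prefix="sitemap"):
    """Write sitemap-1.xml, sitemap-2.xml, ... and return a sitemap index document."""
    files = []
    for i in range(0, len(words), CHUNK):
        name = f"{prefix}-{i // CHUNK + 1}.xml"
        urls = "\n".join(
            f"  <url><loc>{escape(BASE.format(w))}</loc></url>"
            for w in words[i:i + CHUNK]
        )
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                    f"{urls}\n</urlset>\n")
        files.append(name)
    # The index file is what you'd submit in Search Console.
    index_entries = "\n".join(
        f"  <sitemap><loc>https://www.example.com/{n}</loc></sitemap>" for n in files
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{index_entries}\n</sitemapindex>\n")
```

You'd regenerate the files whenever the word list changes and submit only the index URL to the search engines. Note this only gets the pages discovered - as the answers below point out, discovery and staying indexed are different problems.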
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
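For reference, a minimal sketch of the tag this answer recommends. The page-type check is a hypothetical helper for illustration; on a WordPress site you'd typically emit this via the `wp_head` hook or an SEO plugin's per-template setting rather than hand-rolled code:

```python
# Sketch of the "noindex, follow" robots meta tag recommended above.
# is_scraped_definition is a hypothetical flag your templates would set.
def robots_meta_tag(is_scraped_definition: bool) -> str:
    """Return the robots meta tag to place in a page's <head> section."""
    if is_scraped_definition:
        # Keep the page out of the index, but let crawlers follow its links
        # so link equity still flows to the rest of the site.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

The same directive can also be sent as an `X-Robots-Tag` HTTP header if editing the templates isn't practical.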
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would in essence be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.