Getting pages that load dynamically indexed in the search engines
-
SEO'ers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows the definitions of words, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all the definition pages indexed in search engines? The WordPress sitemap plugin is not picking these pages up automatically - I guess because they're dynamic - but when using a sitemap crawler the pages are detected.
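For what it's worth, as I understand the sitemaps protocol it has no problem with dynamically generated URLs; a sitemap index along these lines (the file names are just placeholders I made up) could reference the definition pages in chunks of up to 50,000 URLs per file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap may list up to 50,000 URLs, so 200k+ pages need five or more files -->
  <sitemap>
    <loc>http://www.example.com/sitemap-definitions-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-definitions-2.xml</loc>
  </sitemap>
  <!-- ...continue for the remaining definition URLs -->
</sitemapindex>
```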
Can anybody give advice on how to go about getting the 200k+ pages indexed in the search engines? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated.
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
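For example, something along these lines in the head of each definition page (a minimal sketch - in WordPress this would typically be output by the theme's header template or an SEO plugin rather than edited by hand):

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```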
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would, in essence, be duplicate content from another site, even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the third party's. The unique information is what drives visitors to the site, and those pages link to the definitions just in case visitors are interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the third party are actually supplementary to the solution the site provides (the core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.
Related Questions
-
Does not having any hreflang tags for U.S. visitors lead to an increase in international visitors?
I have seen a massive increase in international visitors on our website, and visitors within the United States dropped off hard this month (by about 20%). Could it be possible that not having any hreflang tags leads to an increase in international customers visiting the site, even though the site is set to "Target users in United States" within Google Search Console?

In Google Search Console, I have International Targeting set to "Target users in United States." However, Search Console is saying our site doesn't have any hreflang tags: "Your site has no hreflang tags. Google uses hreflang tags to match the user's language preference to the right variation of your pages." I'm not sure when that was flagged, but recently we have seen a massive increase in international visitors from countries such as Russia, Vietnam, Indonesia, the United Kingdom, and so on. This poses a problem, since our chances of turning one of those visitors into a customer are extremely slim. Along with that, nearly every international visitor contributes to an extremely high bounce rate.

Attached is a screenshot of the error about hreflang tags: https://imgur.com/a/XZI45Pw. Here is a screenshot of the country we are targeting: https://imgur.com/a/ArpWe9Z. Lastly, here is a screenshot of all of the countries that visited our site today: https://imgur.com/a/d0tNwkI
International SEO | MichaelAtMSP
-
Geolocation issue: Google not displaying the correct URL in the SERPs
Hello, I'm running a multi-country domain with this structure:
domain.com/ar/
domain.com/mx/
domain.com/cl/
etc.

I also have domain.com/int/ for x-default. domain.com/category/ does a 301 redirect through IP geolocation to the corresponding URL; for example, if your IP is from Mexico, you get redirected to domain.com/mx/category/. The hreflang is correct, and the Webmaster Tools geolocation is correct.

Example of the issue I'm facing right now: when users from Chile do a keyword search in Google Chile, the domain ranks well, but the URL that appears in the SERP is the /mx/ version, or the /int/ version, or some other country version; other times it is the /cl/ version. The same happens for all users / countries / keywords. I need to understand what I'm doing wrong, because Google is not displaying in the SERPs the correct URL version for the country of the user doing the search. Thank you so much! I will appreciate your ideas.
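For reference, my hreflang annotations follow roughly this pattern on each page (the exact language codes here are illustrative - I'm assuming Spanish regional codes):

```html
<!-- Each page lists every country variant plus the x-default version -->
<link rel="alternate" hreflang="es-ar" href="http://domain.com/ar/category/" />
<link rel="alternate" hreflang="es-mx" href="http://domain.com/mx/category/" />
<link rel="alternate" hreflang="es-cl" href="http://domain.com/cl/category/" />
<link rel="alternate" hreflang="x-default" href="http://domain.com/int/category/" />
```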
domain.com/category/ does a 301 redirect through IP geo-location to the correspondent url, example if your IP is from Mexico, then you got redirected to domain.com/mx/category/ hreflang is correct. webmaster tool geo-location is correct. Example of the issue Im facing right now: When users from Chile do a keyword search in Google Chile, the domain ranks well but the URL that appears in the SERP is the /mx/ version, or the /int/ version or any other country version. Other times is the /cl/ version. The same happens for all the users / countries / keywords. I need to understand what Im doing wrong, because Google is not displaying in the SERP's the correct URL version for the country of the user who is doing the search. Thank you so much! I will appreciate your ideas. PS: I think I should try to change the 301 to a 302 redirect, or completely remove those redirects. Any ideas? Suggestions? Thanks!0 -
What's the Best Strategy for Multiregional Targeting for a Single Language?
I have a service-based client who is based in the US but wants to expand to audiences in Australia, Canada, and the United Kingdom. Currently, all the content is in American English, with International Targeting in Google Search Console set to the US. I know that is going to have to change, but I'm unsure of the best strategy. Right now there are a few basic strategies in my head:

1. Remove International Targeting in GSC and let her rip. This one is pretty simple; however, I am completely unsure of its effectiveness.
2. Remove International Targeting in GSC, install copies of the site on subfolders /au/, /ca/, and /uk/, add hreflang tags (see the sketch below), and add canonicals pointing back to the original. The point of the canonicals is to avoid duplicate content, but then my new subfolders do not get indexed, and I'm unsure what type of exposure those URLs would receive or how they would be valuable.
3. Remove International Targeting in GSC, install copies of the site on subfolders /au/, /ca/, and /uk/, add hreflang tags, and risk duplicate content. High risk of a penalty, but my targeting would be the most efficient.
4. Have independent writers overcharge for English translations into different dialects and add hreflang tags. This is probably the safest bet, takes the longest, and costs the most money. However, how different will the content actually be if I change truck to lorry, trunk to boot, and optimization to optimisation?

It's hard to come up with a perfect solution for content differentiation by region in order to implement hreflang tags with a region (en-au, en-ca, en-gb). Maybe I'm missing something, but this conundrum seems extremely difficult; weighing the cost, time, and possible result is challenging. Hit me with your best answer, and thanks for taking a look at someone else's problem.
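For options 2 and 3, my understanding is that the hreflang markup on each page would look something like this (example.com and the paths are placeholders):

```html
<!-- Each version of the page lists every variant, including itself -->
<link rel="alternate" hreflang="en-us" href="https://example.com/some-page/" />
<link rel="alternate" hreflang="en-au" href="https://example.com/au/some-page/" />
<link rel="alternate" hreflang="en-ca" href="https://example.com/ca/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/some-page/" />
```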
International SEO | ccox1
-
Is there any reason to get a massive decrease in indexed pages?
Hi, I'm helping with SEO for a big e-commerce site in LatAm, and one thing we've experienced during the last months is that our search traffic has dropped and the number of indexed pages has decreased terribly. The site had over 2 million indexed pages (which was way too much, since we believe around 10k would be more than enough to hold the 6k+ SKUs), but this number has decreased to less than 3k in under 2 months. I've also noticed that most of the results in which the site still appears are .pdf or .doc files, not actual content on the website. I've checked the following:
- Robots (there is no block)
- Webmaster Tools
- Penalties
- Duplicated content
I don't know where else to look. Can anyone help? Thanks in advance!
International SEO | mat-relevance
-
Geotarget subfolders with the same language, or get rid of duplicates but lose the option to geotarget?
Hi, we have a domain that is aimed at covering the LatAm region. Currently, the homepage contains a country selector for ~20 countries, and 95% of them hold content in Spanish. We only have homepages for each region as separate subfolders, i.e.:
www.maindomain.com/co
www.maindomain.com/cl
www.maindomain.com/br
etc., but once the user clicks on a menu item he is taken back to the main domain's subpages, i.e. www.maindomain.com/comprar.

My struggle is to decide whether it is better to:

A) copy all content into each subfolder, which will create a huge amount of duplicates (there are no resources to create unique content, and it is practically impossible given the nature of the product - mostly tech specs, etc.), and implement hreflang sitemaps (sketched below) and configure GWT to target each country with its own Spanish content (the same for each country), OR

B) remove all local subfolders and keep only the main domain in Spanish to serve all countries within the region. With this option we get rid of the duplicates, but we also lose the option to geotarget.

So, my question is which option will do less harm, or whether there is any other approach that comes to your minds. I consulted with two agencies but still haven't got a clear answer. Thanks a lot for your help!
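For option A, my understanding of the hreflang sitemap format is roughly the following (the language codes are my assumption - presumably pt for /br):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.maindomain.com/co/</loc>
    <!-- Each <url> entry lists every country variant, including itself -->
    <xhtml:link rel="alternate" hreflang="es-co" href="http://www.maindomain.com/co/" />
    <xhtml:link rel="alternate" hreflang="es-cl" href="http://www.maindomain.com/cl/" />
    <xhtml:link rel="alternate" hreflang="pt-br" href="http://www.maindomain.com/br/" />
  </url>
  <!-- ...and a matching <url> entry for /cl/, /br/, and each of the other subfolders -->
</urlset>
```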
International SEO | eset
-
Alternate tag: has anybody had success getting English-only websites with localized currency served via the alternate tag?
I have an English website with USD prices and a US phone number.
Via a currency dropdown, visitors in Ireland can choose EUR as the currency, visitors from Denmark the Danish crown, etc., and via GeoIP I also serve local contact phone numbers. So I thought it made sense to define this with the alternate tags, but after several months Google still does not pick up these pages in local searches. Did anybody have success with getting a website with just a currency parameter ranked locally using the alternate tag? Does it also help to have static links (not only dropdown links) to the currency versions on the page? Is there anything else that could help get Google to pick these up? Below is my code sample:
International SEO | lcourse
-
Geo-targeting a subfolder that's had URLs rewritten from a subdomain
I have a client that's setting up a section of his site in a different language, and we're planning to geo-target those pages to that country. I have suggested a subfolder solution, as it's the most cost-effective option and it will allow domain authority to flow into those pages. His developer is indicating that they can only set this up as a subdomain, for technical reasons, but they're suggesting they can rewrite the URLs to appear as subfolder pages. I'm wondering how this will work in terms of geo-targeting in Google Webmaster Tools. Do I geo-target the subdomain or the subfolder, i.e. does Google only see URLs, or does it physically see those pages on the subdomain? It seems like it might be a messy solution. Would it be a better idea just to forget about the rewrites and live with the site being a subdomain? Thanks,
International SEO | Leighm
-
Multilingual SEO: country-specific TLDs, or migration to a huge .com site?
Dear SEOmoz team, I'm an in-house SEO looking after a number of sites in a competitive vertical. Right now we have our core example.com site translated into over thirty different languages, with each one sitting on its own country-specific TLD (so example.de, example.jp, example.es, example.co.kr, etc.). Though we're using a template system so that changes to the .com domain propagate across all languages, over the years things have become more complex in quite a few areas. For example, the level of analytics script hacks and filters we have created in order to channel users through to each language profile is now bordering on the epic.

For a number of reasons we've recently been discussing the cost/benefit of migrating all of these languages into the single example.com domain. On first look this would appear to simplify things greatly; however, I'm nervous about what effect this would have on our organic search traffic. All these separate sites have cumulatively received years of on-site and off-site work, and even if we went through the process of setting up page-for-page redirects to their new home on example.com, I would hate to lose all this hard work (and business) if we saw our rankings tank as a result of the move.

So I guess the question is: for an international business such as ours, which is the optimal site structure in the eyes of the search engines - local sites on local TLDs, or one mammoth site with language identifiers in the URL path (or subdomains)? Is Google still so reliant on the TLD for geo-targeting search results, or is it less of a factor in today's search engine environment? Cheers!
International SEO | linklater