Can dynamically translated pages hurt a site?
-
Hi all, looking for some insight please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1,000 pages and climbing; roughly 50 of those are translated pages, served as static pages with unique URLs. I have had no problems with duplicate content or anything of that sort, and all of those pages were manually translated, so there are no translation-quality issues. We have been looking at software that can dynamically translate the complete site into a handful of languages, let's say about five. My concern is that these pages are produced dynamically, and that Google will take issue with this as well as with the huge sudden influx of new URLs: we could be looking at an increase of 5,000 new URLs, which usually triggers an alarm.
My feeling is that it could risk the stability of the site we have worked so hard for, and that perhaps we should just stick with the already translated static pages.
I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, and also the risk of triggering a review period.
These days it is hard to know what could get you in "trouble," and my gut says keep it simple and leave things as they are. Am I being overly concerned? I would love to hear from others who have tried similar changes, and also from those who have held off due to a similar "fear."
thanks
-
I stumbled upon some additional information and decided to update you.
According to Google's internationalization FAQ:
Q: Can I use automated translations?
A: Yes, but they must be blocked from indexing with the "noindex" robots meta tag. We consider automated translations to be auto-generated content, so allowing them to be indexed would be a violation of our Webmaster Guidelines. So if you decide to auto-translate the text, you should use a noindex tag instead of the hreflang tag.
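As a minimal sketch, the robots meta tag the FAQ describes would go in the `<head>` of each auto-translated page (the example path is hypothetical):

```html
<!-- In the <head> of each auto-translated page, e.g. /es/productos (hypothetical URL) -->
<!-- "noindex" keeps the page out of the index; "follow" still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```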
-
Considering they offer that service themselves, it would be hypocritical of them to penalize you for using it. The hreflang tag would also protect you from having those pages marked as spam, since you are telling Google "page A is the exact same as page Å, just in a different language," avoiding "duplicate" content.
-
Thanks, Oleg. If the site were to be reviewed manually, would there be any issue with having thousands of pages whose content is created dynamically?
thanks for your time
-
The problem with using software to translate your content is that it will never be perfect. There will be many grammatical and/or vocabulary errors that decrease the quality of the content. I'm not sure whether Google can assess content quality in other languages, but a worse user experience usually leads to worse rankings. Ideally, you would have those pages manually translated (though I know that can cost a fortune).
In case you decide to auto-translate, be sure to use the rel="alternate" hreflang="x" annotation in order to tell Google that you have multiple pages with the same content, except in different languages.
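As a sketch, the annotation goes in the `<head>` of each language version and lists every translation, including a self-reference (the domain and paths here are hypothetical):

```html
<!-- On https://example.com/page (English version; all URLs hypothetical) -->
<link rel="alternate" hreflang="en" href="https://example.com/page" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page" />
<!-- Each language version should carry the same full set of annotations -->
```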
I don't think you should worry about a sudden influx of pages. Ideally, you'd drip-feed them in to take advantage of the freshness factor, but you shouldn't be penalized simply for creating a lot of new pages.