Can Google crawl dynamically generated links?
-
Thanks in advance!
-
A few years ago I added a location finder which then auto-generated product content for ERIKS. This generated thousands of URLs overnight, and the problem was that Google thought I was spamming its indexes. Google does follow all the links on a web page or sitemap; however, it's what it does with them that counts. The ERIKS Hose Technology site (http://www.eriks-hose-technology.com) relies on this type of coding, using Classic ASP.
I have a number of other sites which also rank highly on Google. You'll find Revolvo by searching for 'Split Roller Bearings'. So dynamic generation is not as bad for ranking as some people tend to think. I would agree, however, that plain-English URLs are better than coded ones. The main thing is to make sure your main keyword is in the URI so that Google knows what your page is about.
-
For the most part, Google can crawl any links on the page, but I could do with a bit more information about which pages are being linked to and how the links are being generated.
If you are using JavaScript or even (shudder!) Flash to generate the links, then search engines will struggle to see them.
If the links are in plain HTML, then they should be fine, but you need to be careful about duplicate content. If a single page can be called up via multiple dynamically generated URLs, that can harm search rankings unless rel=canonical tags are used.
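To make the rel=canonical point concrete, here is a minimal sketch in PHP (the domain, file name, and parameters are placeholders, not from any site in this thread). However the page is reached, it declares one preferred URL for Google to index:

```php
<?php
// Minimal sketch: a page reachable via several dynamically generated
// URLs (e.g. /product.php?id=42&ref=nav and /product.php?id=42&sort=price)
// declares a single canonical URL so Google folds the variants into one.
// All names here are illustrative placeholders.
$productId = (int) ($_GET['id'] ?? 0);
$canonicalUrl = 'https://www.example.com/product.php?id=' . $productId;
?>
<!DOCTYPE html>
<html>
<head>
  <link rel="canonical" href="<?php echo htmlspecialchars($canonicalUrl); ?>">
  <title>Product <?php echo $productId; ?></title>
</head>
<body>
  <!-- page content rendered here -->
</body>
</html>
```

Whichever parameter combination a visitor (or crawler) arrives with, the canonical tag always points at the same clean URL, so the duplicates consolidate rather than compete.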
-
Hi,
For the most part, Google can find its way around dynamic pages with few problems. There are always going to be exceptions, but these tend to involve URL structures with lots of parameters.
As long as your URL structure is fairly simple, you shouldn't have any problems.
-Andy
-
Well, Google can crawl anything it finds on a web page. If you are referring to a page whose links are dynamically generated in the sense that you build them server-side before serving the page (in PHP, for example), then yes. If Googlebot reaches that page at all (i.e., it is not blocked by robots.txt or similar), then your links will be crawled as well.
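As an illustration of that server-side case (the data and URL pattern below are hypothetical, not from any site mentioned here), here is a minimal PHP sketch. Because the anchors are built before the response is sent, Googlebot receives ordinary HTML links, indistinguishable from hand-written ones:

```php
<?php
// Minimal sketch: links generated on the server before the page is
// served. In a real site the list would come from a database or CMS;
// these values are placeholders.
$locations = ['london', 'manchester', 'leeds'];

foreach ($locations as $location) {
    $url = '/stockists/' . rawurlencode($location) . '/';
    // By the time the page leaves the server, this is a plain <a href>
    // in the HTML source, so Googlebot can crawl it like a static link.
    echo '<a href="' . htmlspecialchars($url) . '">'
       . htmlspecialchars(ucfirst($location)) . "</a>\n";
}
```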
Related Questions
-
XML sitemap generator only crawling 20% of my site
Hi guys, I am trying to submit the most recent XML sitemap, but the sitemap generator tools are only crawling about 20% of my site. The site has around 150 pages, and only 37 show up in tools like xml-sitemaps.com. My goal is to get all the URLs we care about into the XML sitemap. How should I go about this? Thanks (see the sketch after this entry).
Intermediate & Advanced SEO | TyEl
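An aside on the question above (file names and URLs in the sketch are illustrative, not from the poster's site): crawler-based generators like xml-sitemaps.com can only find pages reachable through internal links, which is the usual reason most of a site gets missed. One common workaround is to build the sitemap directly from your own inventory of URLs, for example with a small PHP script:

```php
<?php
// Minimal sketch: generate a sitemap from your own list of important
// URLs (e.g. exported from a CMS or database) instead of relying on a
// crawler to rediscover them. The URLs here are placeholders.
$urls = [
    'https://www.example.com/',
    'https://www.example.com/products/',
    'https://www.example.com/about/',
];

header('Content-Type: application/xml; charset=utf-8');
echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
echo "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
foreach ($urls as $url) {
    echo '  <url><loc>' . htmlspecialchars($url) . "</loc></url>\n";
}
echo "</urlset>\n";
```

-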
Is it possible to submit an XML sitemap to Google without using Google Search Console?
We have a client that will not grant us access to their Google Search Console (don't ask us why). Is there any way to submit an XML sitemap to Google without using GSC? Thanks
Intermediate & Advanced SEO | RosemaryB
-
Does the navigation bar have an effect on link juice and the number of internal links?
Hi Moz community, I am getting the "Avoid Too Many Internal Links" error from Moz for most of my pages, and Google has declared 100 as the maximum number of internal links. However, most of my pages can't have fewer than 100 internal links, since it is a commercial website and there are many categories that I have to show to visitors in the drop-down navigation bar. Without counting the links in the navigation bar, the number of internal links is below 100. I am wondering whether the navigation bar links affect link juice and are counted as internal links by Google. The same question also applies to the links in the footer. Additionally, what about the products? I have hundreds of products in the category pages, and even though I use pagination I still have many links in the category pages (probably more than 100, without even counting the navigation bar links). Does Google count the product links as internal links, and what is the effect on link juice? Here is the website if you want to take a look: http://www.goldstore.com.tr Thank you for your answers.
Intermediate & Advanced SEO | onurcan-ikiz
-
Would you rate-control Googlebot? How much crawling is too much crawling?
One of our sites is very large: over 500M pages. Google has indexed 1/8th of the site, and they tend to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase their crawl rate, overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors. I've lobbied for an investment in / overhaul of the API configuration to allow for more Google bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity, whatever that capacity is. Questions for Enterprise SEOs:
* Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. This indicates there is some upper limit, which we perhaps haven't reached, but which would stabilize once reached.
* We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable Enterprise SEO would seek to throttle back?
* What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate. Thanks
Intermediate & Advanced SEO | lzhao
-
Google crawling different content--ever ok?
Here are a couple of scenarios I'm encountering where Google will crawl different content than my users see on an initial visit to the site, and which I think should be OK. Of course, it is normally NOT OK; I'm here to find out whether Google is flexible enough to allow these situations:
1. My mobile-friendly site has users select a city, and then it displays the location options div, which includes an explanation of why they may want to have the program use their GPS location. The user must choose GPS, the entire city, a zip code, or a suburb of the city, which then goes to the link chosen. On the other hand, it is programmed so that a Google bot doesn't get just a meaningless "choose further" page; rather, the crawler sees the page of results for the entire city (as you would expect from the URL). So basically the program defaults to the entire city's results for Googlebot, but first gives the user the ability to choose GPS.
2. A user comes to mysite.com/gps-loc/city/results. The site, seeing the literal words "gps-loc" in the URL, fetches the GPS coordinates for his location and returns results dependent on it. If Googlebot comes to that URL, there is no way the program will return the same results, because it wouldn't be able to get the same longitude and latitude as that user.
So, what do you think? Are these scenarios a concern for getting penalized by Google? Thanks, Ted
Intermediate & Advanced SEO | friendoffood
-
Disavowing a sitewide link that has thousands of subdomains. What do we tell Google?
Hello, I have a hosting company that partnered up with a blogger template developer and allowed users to download blog templates that placed my footer links sitewide on their websites. Sitewide links, I know, are frowned upon, and that's why I went through a rigorous link audit months ago and emailed every webmaster with a "WEBSITENAME.Blogspot.com" site three times each to remove the links. I'm at a point where I have 1,000 sub-users left that use the domain name "blogspot.com". I used to have 3,000! Question: when I disavow these links in Webmaster Tools for Google and Bing, should I upload all 1,000 subdomains of "blogspot.com" individually and show Google proof that I emailed all of them individually, or is it wise to include just one domain name (www.blogspot.com) so Google sees just ONE big mistake instead of 1,000? This has been on my mind for a year now, and I'm open to hearing your intelligent responses.
Intermediate & Advanced SEO | Shawn124
-
Can too many NoFollow links damage your Google rankings?
I've been trying to recover from a Google algorithm change since September 2012, so far without success. I'm now wondering whether the nofollow attributes on the external links in my blog posts are actually doing me damage. http://www.smartdatinguk.com/blog/ Does anyone have any experience of this?
Intermediate & Advanced SEO | benners
-
Can a competitor close your business on Google Places?
One of my listings says the business has been closed, but it is not closed. On Google+ / Google Places there is a field that allows users to report that a business is closed. Can they actually close it? This is the notice I received:

Your Google Places listing has been updated

Dear Google Places user,

Google has updated your listing data on our consumer properties such as Google and Google Maps to more accurately reflect the latest information we have about your business. We use many sources to determine the accuracy of our listing data and to provide the best possible experience for business owners and consumers who use Google and Google Maps to find local information. Based on our sources, the following listing has been marked as closed: Company info...

If you disagree with the changes we have made, please visit your Place Page to edit your listing. Note that if you are an AdWords or Boost customer, your ads will be unaffected by this change and will continue to display the listing information you have provided in Google Places. To manage your online advertisements, please sign into Google Places or Google AdWords. For more information about updates to claimed listings, please visit: http://www.google.com/support/places/bin/answer.py?hl=en&answer=1318197

Sincerely,
The Google Places Team

Intermediate & Advanced SEO | SEODinosaur