Hey guys, I have this issue on my crawling report. What should I do to exclude the pages?
-
Overly-Dynamic URL
Although search engines can crawl dynamic URLs, search engine representatives have warned against using over 2 parameters in a given URL. Search engines may also see dynamic versions of the same URL as unique URLs, creating duplicate content.
-
I don't fully understand this. Thanks.
-
So you use dynamic URLs with specific parameters, and this warning appears because you use more than 2 parameters. According to the moz.com data it is a warning (not an error), and you should reduce the number of parameters. You want to know how to exclude this page from your crawl report? Fix the issue instead. More importantly: is this page ranked in Google? Why use parameters at all? Do you have the ability to change the URLs to static versions? If so, use a 301 to redirect. If you have a duplicate content issue due to the parameters, perhaps rel="canonical" can solve this for you. It all depends on your exact problem with the parameters and what you want solved. If the pages are not ranked in Google and you don't want them to rank in Google, Bing or any other search engine, use noindex, nofollow or disallow them in robots.txt. If you do want them to rank, it would probably be wiser to change to a static URL with a nice keyword in it.
Hope this helps some. If not, please explain your situation as clearly as possible so we can assist you with this.
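To sketch those options (all paths and the domain below are hypothetical placeholders, so adjust them to your own site):

```txt
# robots.txt — keep crawlers away from parameterised URLs entirely
User-agent: *
Disallow: /*?*
```

```html
<!-- on the dynamic page itself, if it may be crawled but should not be indexed -->
<meta name="robots" content="noindex, nofollow">

<!-- or, if the parameterised versions duplicate one preferred page -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```

One caution: don't combine a robots.txt Disallow with a noindex tag on the same URL. If crawlers can't fetch the page, they never see the noindex.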
regards
Jarno
-
On my crawling report in my Moz campaign I have this issue: Overly-Dynamic URL. I have a page with a dynamic URL that generates different versions of itself; it is a lead form from QuinStreet. Should I use noindex for this page?
-
Adulter,
can you phrase this as one clear question? I don't see what you are asking here...
regards
Jarno
-
Related Questions
-
Crawl Depth improvements
Hi, I'm checking the crawl depth report in SEMrush and looking at pages which are 4+ clicks away. I have a lot of product pages which fall into this category. Does anyone know the impact of this? Will they never be found by Google? If there is anything in there I want to rank, I'm guessing the course of action is to move the page so it takes fewer clicks to get there? How important are crawl budget and depth for SEO? I'm just starting to look into this subject. Thank you
Intermediate & Advanced SEO | BeckyKey
-
301 Externally Linked, But Non-Producing Pages, To Productive Pages Needing Links?
I'm working on a site that has some non-productive pages without much of an upside potential, but that are linked-to externally. The site also has some productive pages, light in external links, in a somewhat related topic. What do you think of 301ing the non-productive pages with links to the productive pages without links in order to give them more external link love? Would it make much of a difference? Thanks... Darcy
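As a sketch of what such a 301 could look like in Apache's .htaccess (both paths here are made-up placeholders):

```txt
# .htaccess — send the externally linked, non-productive page to the related productive one
Redirect 301 /non-productive-page/ https://www.example.com/productive-page/
```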
Intermediate & Advanced SEO | 94501
-
Keywords going to Subdomain instead of targeted page(general landing page)
Why are some of my keywords going to subdomains instead of the more general/targeted landing page? For example, on my ecommerce website, the keyword 'tempurpedic' is directing to the subdomain URL of a specific tempurpedic product page instead of the general landing page. The product page has a page authority of 15, and the Tempurpedic landing page with all the products has an authority of 31. I have also noticed that my 'furniture stores in houston' keyword directs to my "occasional tables" URL instead of the much more targeted homepage. Is there something I am missing here?
Intermediate & Advanced SEO | nat88han
-
Should i redirect this page?
Hi, I have the following 2 pages: http://www.over50choices.co.uk/Funeral-Planning.aspx http://www.over50choices.co.uk/Funeral-Planning/Funeral-Plans.aspx My dilemma is that Google sees the words "funeral planning" & "funeral plans" as the same thing, which might explain why the "funeral plans" page is not ranked very well. My issue is that the "funeral planning" page is at category level and introduces the wider subject of funeral planning, which isn't just funeral plans, so if I 301 my "funeral plans" page I will have nowhere to talk about funeral plans. My question is: is the "funeral plans" page not ranked very well because of this, or do I just need better optimisation of the funeral plans page so Google is clear which is the key focus for each page? Thanks, Ash
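If canonicalisation turns out to be the right tool here, a minimal sketch (assuming, purely for illustration, that the category page should be the preferred version):

```html
<!-- placed in the <head> of Funeral-Plans.aspx -->
<link rel="canonical" href="http://www.over50choices.co.uk/Funeral-Planning.aspx">
```

Swap the direction if the Funeral-Plans page is the one that should rank.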
Intermediate & Advanced SEO | AshShep1
-
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple spots in the site navigation. Site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Would you think this is a workable/acceptable solution?
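A robots.txt sketch of that idea (the duplicate's path is hypothetical):

```txt
# robots.txt — block the secondary copy, leave the primary path crawlable
User-agent: *
Disallow: /alternate-section/duplicate-page/
```

Keep in mind a disallowed URL can still end up indexed (without a snippet) if other sites link to it; a rel="canonical" on the duplicate pointing at the primary is often the safer option.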
Intermediate & Advanced SEO | gregelwell
-
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry however is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low quality pages from being added to Google's main SERP results. This would be something like this: User-agent: googlebot Disallow: /community/photos/ Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
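Using the folder from the question, that robots.txt idea would look roughly like this:

```txt
# block the page crawler from the low-quality photo pages...
User-agent: Googlebot
Disallow: /community/photos/

# ...while explicitly allowing the image crawler
User-agent: Googlebot-Image
Allow: /community/photos/
```

Googlebot-Image only falls back to the generic Googlebot rules when it has no group of its own, so an explicit Googlebot-Image group like this should take precedence for image fetching.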
Intermediate & Advanced SEO | HD_Leona
-
Sitemap not indexing pages
My website has about 5000 pages submitted in the sitemap but only 900 being indexed. When I checked Google Webmaster Tools about a week ago 4500 pages were being indexed. Any suggestions about what happened or how to fix it? Thanks!
Intermediate & Advanced SEO | theLotter