Partial Match or RegEx in Search Console's URL Parameters Tool?
-
So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them.
Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789=
All the indexed URLs follow that same kind of format, but I only want to index the URLs whose par1 starts with ABC (that could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, maybe using regex?
Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
-
No problem
Hope you get it sorted!
-Andy
-
Thank you!
-
Haha, I think the train passed the station on that one. I would have realised eventually... XD
Thanks for your help!
-
Don't forget that . and ? have a specific meaning within regex: if you want to use them for pattern matching you will have to escape them. Also be aware that not all bots are capable of interpreting regex-style patterns in robots.txt, so you might want to be more explicit with the user-agent and only use those patterns for Googlebot.
User-agent: Googlebot
# disallowing page.php and any parameters after it
Disallow: /page.php
# but leaving anything that starts with par1=ABC
Allow: /page.php?par1=ABC
Dirk
-
Ah sorry I missed that bit!
-Andy
-
"Disallowing them would be my first priority really, before removing from index."
The trouble with this is that if you disallow first, Google won't be able to crawl the pages to act on the noindex. If you add a noindex flag, Google won't index them the next time it comes a-crawling, and then you will be good to disallow.
I'm not actually sure of the best way for you to get the noindex into the page header of those pages, though.
-Andy
-
Yep, have done. (Briefly mentioned in my previous response.) It doesn't pass.
-
I thought so too, but according to Google the trailing wildcard is completely unnecessary, and only needs to be used mid-URL.
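So if I've understood their documentation right, these two rules should match exactly the same URLs, since every rule is already treated as a prefix match:
Allow: /page.php?par1=ABC
Allow: /page.php?par1=ABC*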
-
Hi Andy,
Disallowing them would be my first priority really, before removing from index. I didn't want to remove them before I've blocked Google from crawling them, in case they get added back again the next time Google comes a-crawling, as has happened before when I've simply removed a URL here and there. Does that make sense, or am I getting myself mixed up here?
My other hack of a solution would be to check the URL in page.php, and if the URL doesn't include par1=ABC, insert a noindex meta tag, something like the sketch below. (Not sure if that would work well or not...)
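A rough sketch of what I mean, assuming page.php controls the <head> output (the parameter name and the ABC prefix are just from my example URLs):
<?php
// Hypothetical sketch for page.php: emit a noindex tag for every URL whose
// par1 value doesn't start with ABC, so only the ABC pages stay indexed.
$par1 = isset($_GET['par1']) ? $_GET['par1'] : '';
if (strpos($par1, 'ABC') !== 0) {
    // This must be printed inside the <head> of the generated page, and the
    // pages must stay crawlable until Google has actually seen the tag.
    echo '<meta name="robots" content="noindex, follow">' . "\n";
}
?>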
-
My guess would be that this line needs an * at the end.
Allow: /page.php?par1=ABC*
-
Sorry Martijn, just to jump in here for a second - Ria, you can test this via the robots.txt Tester in Search Console before going live, to make sure it works.
-Andy
-
Hi Martijn, thanks for your response!
I'm currently looking at something like this...
user-agent: *
# disallowing page.php and any parameters after it
disallow: /page.php
# but leaving anything that starts with par1=ABC
allow: /page.php?par1=ABC
I would have thought that you could disallow things broadly like that and give an exception, as you can with files in disallowed folders. But it's not passing Google's robots.txt Tester.
One thing that's probably worth mentioning is that there are only two values of the par1 parameter that I want to allow. For example's sake, ABC123 and ABC456. So it would need to be either a partial match or a "this or that" kind of deal, disallowing everything else.
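Something like this minimal sketch is what I'm picturing, assuming ABC123 and ABC456 really are the only two values to keep, and that par1 always comes first in the query string:
user-agent: *
# block every page.php URL by default
disallow: /page.php
# then allow only the two known par1 values (each rule is a prefix match,
# and the longer allow rule should win over the shorter disallow)
allow: /page.php?par1=ABC123
allow: /page.php?par1=ABC456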
-
Hi Ria,
I have never tried regular expressions in this way, so I can't tell you if this would work or not.
However, if all 1000 of these URLs are already indexed, just disallowing access won't remove them from Google. You would ideally place a noindex tag on those pages and let Google act on it; then you will be good to disallow. I am pretty sure there is no option to noindex under the URL Parameters tool.
I hope that makes sense?
-Andy
-
Hi Ria,
What you could do, though it also depends on the rest of your structure, is disallow these URLs based on their parameters. In a worst-case scenario you could disallow all of these URLs and then add an Allow exception as well, to make sure the right URLs can still be indexed.
Martijn.