Moz crawl flagging ?s=keyword search pages as errors
-
Hi all,
Hoping someone can shed some light on a fix relating to WordPress and its built-in search function, as Moz is crawling some pages that reference the search results (?s=keyword URLs).
The errors showing up are duplicate pages, duplicate descriptions, and duplicate titles.
The search function is not important on this site, and I have tried a plugin that disables the search page, which it does, but these errors still show up. Can anyone assist? This is the final piece of the puzzle, and then we're down to 0 issues on the site.
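As an alternative to a plugin, a small snippet in the active theme's functions.php can disable front-end search entirely, so /?s=keyword requests return a 404 instead of a results page. The hooks used below (parse_query, is_admin, the WP_Query flags) are core WordPress APIs, but the function name is made up and this is an untested sketch rather than a drop-in solution:

```php
// Sketch: hard-disable front-end search in WordPress (add to the active
// theme's functions.php). Any request with ?s= is served as a 404
// instead of a search results page.
function my_disable_frontend_search( $query ) {
    // Leave the admin dashboard and non-search queries untouched.
    if ( is_admin() || ! $query->is_search || ! $query->is_main_query() ) {
        return;
    }
    $query->is_search       = false;
    $query->query_vars['s'] = false;
    $query->query['s']      = false;
    $query->is_404          = true; // serve the theme's 404 template
}
add_action( 'parse_query', 'my_disable_frontend_search' );
```

Pair this with removing the search form from the theme template so visitors aren't offered a search box that leads to a 404.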
-
I have implemented the robots.txt method for the moment, and I will look at excluding the parameters too. Thanks; I will now wait for the next crawl from Moz.
-
Hi there,
I am not sure which fix will stop Moz from flagging this, but in case these pages are also being picked up by Google, are you able to serve a canonical tag on domain.com/?s=keyword and any other search query pages, pointing to a generic page?
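For reference, that would mean emitting something like the following in the head of the search results template (domain.com and the target URL here are placeholders; point href at whichever generic page you want these consolidated on):

```html
<!-- Sketch: canonical tag in the <head> of the search results template;
     the href target is a placeholder for the page you choose. -->
<link rel="canonical" href="https://domain.com/" />
```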
You can remove parameters from Google in Webmaster Tools as well: https://support.google.com/webmasters/answer/1235687?hl=en
This tells Google not to crawl pages with the parameter you specify, so they shouldn't cause issues in its index.
-
There are different ways to do this. The best would be to remove the search form from your template files in the theme editor. If you want Moz to stop crawling your internal search pages, add this to your robots.txt file:
User-agent: rogerbot
Disallow: /?s=
This should block Moz's crawler from any URL beginning with /?s= after the root domain. Note that robots.txt rules are simple prefix matches (with * as a wildcard in most crawlers), not full regular expressions, so double-check the rule against your internal search URL format with a robots.txt tester.
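If you also want other crawlers kept out of the search results (not just rogerbot), the same rule can be broadened to all user agents. A sketch, assuming the default WordPress ?s= parameter; the /search/ line only applies if your setup rewrites search URLs to pretty permalinks:

```
User-agent: *
Disallow: /?s=
Disallow: /search/
```

Keep in mind that robots.txt stops crawling, not indexing; if the URLs are already in Google's index, the canonical or parameter-handling route covers that side.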