Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Should I remove the ?replytocom variables in wordpress?
-
I'm using Yoast's WordPress plugin and there is an option to remove the replytocom variables. I'm curious what everyone's thoughts are on that, and if I should do it.
Here's the site if you need to see it.
Thanks!
-
Hey guys, this is a very old post, but because it might be useful to people in the future I thought I would update the URL that Ryan did a great job posting but which no longer goes to the correct place. No one can control when a third-party site changes their URL structure, right?
You can use a plugin for this: it replaces ?replytocom= with #replytocom=.
-
Mine is not indexed in Google Search, but Google Webmaster Tools and SEOmoz both show the error!
Should I remove those links via URL parameters?
Someone here told me some plugins can help. I have the All in One SEO plugin and removed the ComLuv plugin (it was bringing more spam to my blog); the other plugins are just social sharing ones. That's it.
Any help for me and my blog will be much appreciated!
-
Thanks Ryan! Have a great 4th of July!
-
What do people not using this plug in do? I'm assuming not many people do this, right?
I presume they accept the WP default options. Our practice and understanding of SEO is what allows us to analyze and make decisions regarding tidbits such as the one you mentioned.
You think there is any benefit to doing it, or just one of those "hey why not" sort of things?
I do think there is a benefit. You are impacting a LOT of links. Every comment on your site. It may be a tiny 1% benefit type of thing, but the change applies site wide and will presumably be in place for years.
-
I'll try to find the link where he talks about using pages instead of posts and share it. Curious to hear your thoughts on it.
I'll go ahead and select that option, thanks for your help. (On a side note, what do people not using this plugin do? I'm assuming not many people do this, right?)
You think there is any benefit to doing it, or just one of those "hey why not" sort of things?
-
Regarding the new pages instead of posts idea, do you have a link to share?
Regarding the comment url, the page with the comment should be fully indexed either way. By changing the link, you are helping search engines better understand your site. The comment links do not represent a new page or new information.
Google clearly understands WP sites exceptionally well. I am confident you can choose various options and they will still understand those links represent comments. With that said, I would still go with Yoast on this one.
Actually, SEOmoz does it too. Take a look at their blog comments.
-
Thanks for taking the time to check into it. One thing I'm concerned with is how this will affect long-tail SEO / indexing of the comments. How will this affect my organic traffic? (Will it hurt it?)
I don't see these sorts of pages coming up in Google now, so I'm not sure what selecting that option does (and how it affects the site).
Yoast does a few things differently with his site, and I don't always follow his lead. For example, he suggests making new pages instead of new posts for your blog posts. He's the only one I've ever heard say this, or do this.
-
I just took a look at Yoast's site and I now better understand the option to remove the variables. I recommend selecting that option. From the Yoast site:
method remove_reply_to_com [line 939]
string remove_reply_to_com( string $link)
Removes the ?replytocom variable from the link, replacing it with a #comment-<number> anchor. Tags: access: public. Parameters: string $link - the comment link as a string.
Example: http://yoast.com/user-contact-fields-wordpress/#comment-110294
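For reference, here is a minimal sketch of what that kind of rewrite can look like in a small custom plugin. This is not Yoast's actual code - it just hooks WordPress's comment_reply_link filter with a regex that assumes the default reply-link markup:

<?php
/*
 * Illustrative sketch only (not Yoast's implementation): rewrite each
 * comment-reply link so crawlers see a #comment- fragment instead of the
 * ?replytocom= query variable.
 */
add_filter( 'comment_reply_link', function ( $link, $args, $comment ) {
    // Turn href="...?replytocom=123#respond" into href="...#comment-123".
    return preg_replace(
        '/href=([\'"])([^\'"]*)\?replytocom=(\d+)#respond\1/',
        'href=$1$2#comment-$3$1',
        $link
    );
}, 10, 3 );

Either way, what search engines end up seeing is a fragment pointing at a page that is already indexed, rather than a separate crawlable ?replytocom= URL.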
-
Hmm... thanks for the feedback. So do you suggest not blocking those? (And I'll message Yoast also and see what his thoughts are.)
Thanks.
-
I understand the logic behind blocking or removing the variables. They add a lot of extra links to the page, which some webmasters might prefer to manage.
What I would prefer is to reform the link so it looks something like this: http://noahsdad.com/treadmill-training-progress#replytocom=22729
I am guessing the "respond" portion of the URL acts as if someone pressed the reply button, which seems unnecessary. If someone clicks the link, whether in search results or otherwise, and is taken directly to the comment, they should be quite happy. If they wish to reply they can hit the reply button.
Google ignores anything after the # character in a URL. Therefore Google would see these as simply a link to the page which should already be indexed.
Perhaps you can ask Yoast about his thoughts.
-
Thanks for the kind words, I agree, he is a cutie.
Will blocking those cause the comments not to be indexed though?
-
Yup - removing those will save you the trouble of duplicate content, since Google by default crawls those as different URLs. If you have comments enabled, there's a link at the bottom of posts with that parameter in the URL (the same as the blog post URL - see here ---> http://noahsdad.com/treadmill-training-progress/?replytocom=22729#respond ).
Noah is cute!
Related Questions
-
Looking to remove dates from URL permalink structure. What do you think of this idea?
I know most people who remove dates from their URL structure usually do so and then set up a 301 redirect. I believe that's typically the right way to go about this. My biggest fear with doing a global 301 redirect implementation like that across an entire site is that I've seen cases where this has sort of shocked Google and the site took a pretty bad hit in organic traffic. Here's what I'm thinking a safer approach would be, and I'd like to hear others' thoughts. What if: 1) we changed the permalink structure moving forward to remove the date from future posts; 2) all current URLs stay as-is with their dates; 3) moving forward, we go back and optimize past posts in waves (including proper 301 redirects and better URL structure)? This way we avoid potentially shocking Google with a global change across all URLs. Do you know of a way this is possible with a large WordPress website? Do you see any complications that could come about in this process? I'd like to hear any other thoughts about this please. Thanks!
Intermediate & Advanced SEO | HashtagJeff
-
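For the date-removal question above, here is a hedged sketch of the redirect leg in WordPress, assuming the permalink structure has already been switched to a dateless /%postname%/ form (the regex and helper calls are illustrative, not a tested migration):

<?php
/*
 * Hedged sketch: 301 requests that still use the old /YYYY/MM/slug/ form
 * to the post's current dateless permalink. Intended for a small plugin
 * or mu-plugin.
 */
add_action( 'template_redirect', function () {
    $path = wp_parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );

    // Match /2014/05/post-slug/ or /2014/05/27/post-slug/ style requests.
    if ( preg_match( '#^/\d{4}/\d{2}(?:/\d{2})?/([^/]+)/?$#', $path, $m ) ) {
        $post = get_page_by_path( $m[1], OBJECT, 'post' );
        if ( $post ) {
            wp_safe_redirect( get_permalink( $post ), 301 );
            exit;
        }
    }
} );

To phase this in "in waves" as the question describes, the redirect could additionally be gated on a maintained list of post IDs that have already been reviewed and optimized.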
Wordpress Blog in 2 languages. How to SEO or structure it?
Hi Moz community, I have a WordPress blog currently in Spanish. I want to create the same blog content in an English version (manually translated to English instead of using a translation service such as Google Translate). How should I structure the blog for SEO? How will it work? Are there any structural markups I should know about? Any examples? Thanks
Intermediate & Advanced SEO | WayneRooney
-
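On the two-language question above, the markup usually involved is hreflang. A rough sketch follows, where the _english_translation_url post-meta key is a made-up placeholder; in practice a multilingual plugin such as WPML or Polylang generates these tags for you:

<?php
/*
 * Rough sketch with placeholder names: print hreflang alternates so search
 * engines can pair each Spanish post with its English translation. The
 * '_english_translation_url' meta key is hypothetical.
 */
add_action( 'wp_head', function () {
    if ( ! is_singular( 'post' ) ) {
        return;
    }
    $english_url = get_post_meta( get_the_ID(), '_english_translation_url', true );
    if ( $english_url ) {
        printf( '<link rel="alternate" hreflang="es" href="%s" />' . "\n", esc_url( get_permalink() ) );
        printf( '<link rel="alternate" hreflang="en" href="%s" />' . "\n", esc_url( $english_url ) );
    }
} );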
Dev Subdomain Pages Indexed - How to Remove
I own a website (domain.com) and used the subdomain "dev.domain.com" while adding a new section to the site (as a development link). I forgot to block the dev.domain.com in my robots file, and google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots, and then proceeded to just delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed on Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and it's just taking time for them to recognize that I deleted it already?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
301 vs 410 redirect: What to use when removing a URL from the website
We are in the process of determining how to handle URLs that are completely removed from our website. Think of these as listings that have an expiration date (e.g. http://www.noodle.org/test-prep/tphU3/sat-group-course). What is the best practice for removing these listings (assuming not many people are linking to them externally)? 1) 301 to a general page (e.g. http://www.noodle.org/search/test-prep), 2) do nothing and leave them up but remove them from the sitemap (as they are no longer useful from a user perspective), or 3) return a 404 or 410?
Intermediate & Advanced SEO | abargmann
-
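If the site behind the expired-listings question above were running WordPress (an assumption - the question doesn't say), a 410 for retired listings could be served along these lines; the $expired_slugs list is a placeholder:

<?php
/*
 * Sketch only: answer expired listing URLs with a 410 Gone so search
 * engines drop them faster than a plain 404. $expired_slugs is hypothetical.
 */
add_action( 'template_redirect', function () {
    $expired_slugs = array( 'sat-group-course' ); // hypothetical expired listings

    if ( is_singular() && in_array( get_post_field( 'post_name', get_queried_object_id() ), $expired_slugs, true ) ) {
        status_header( 410 );
        nocache_headers();
        $template = get_404_template();
        if ( $template ) {
            include $template;
        }
        exit;
    }
} );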
How to 301 redirect old wordpress category?
Hi All, In order to avoid duplication errors we've decided to redirect old categories (merge some categories). In the past we have been very generous with the number of categories we assigned each post. One category needs to be redirected back to the blog home (removed completely), while a couple of others should be merged. Afterwards we will re-categorize some of the old posts. What is the proper way to do so? We are not technical - is there a plugin that can assist? Thanks
Intermediate & Advanced SEO | BeytzNet
-
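For the category-merge question above, a redirect plugin (e.g. Redirection) is the no-code route; for anyone comfortable pasting a few lines of PHP, here is a hedged sketch with placeholder category slugs:

<?php
/*
 * Illustrative sketch (slugs are placeholders): 301 a removed category
 * archive back to the blog home, and a merged category to its new home.
 */
add_action( 'template_redirect', function () {
    if ( is_category( 'removed-category' ) ) {
        wp_safe_redirect( home_url( '/' ), 301 );
        exit;
    }
    if ( is_category( 'old-category' ) ) {
        $target = get_category_by_slug( 'new-category' );
        if ( $target ) {
            wp_safe_redirect( get_category_link( $target ), 301 );
            exit;
        }
    }
} );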
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community’s advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content to indeed confirm the content was removed (as opposed to just recrawling the site and not finding the content anywhere). This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we would always remove from our site all links pointing to these pages (to make sure users didn’t stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way… I see that this is basically the exact opposite of Dr. Pete's advice and opposite what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers for the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there’s a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it’s best practice to block Google from indexing internal search pages, but what’s best practice when “the damage is done”? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because: 1) Google’s guidelines say “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines” (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769); 2) they are a bad user experience; 3) the search pages are (probably) stealing rankings from our real landing pages; 4) we got the Webmaster Tools notification “Googlebot found an extremely high number of URLs on your site” with links to our internal search results. I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I’m looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
-
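For the internal-search question above, the meta noindex,follow approach on a WordPress site can be as small as the sketch below; most SEO plugins, Yoast's included, can also handle this for you:

<?php
/*
 * Minimal sketch: mark internal search result pages noindex,follow so they
 * drop out of the index while still passing link equity.
 */
add_action( 'wp_head', function () {
    if ( is_search() ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
}, 1 );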
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404's). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | nicole.healthline