Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
New theme adds ?v=1d20b5ff1ee9 to all URLs as part of caching. How does this affect SEO?
-
The new theme I am working in adds ?v=1d20b5ff1ee9 to every URL. The theme developer says it's a server setting issue. GoDaddy support says it's part of caching and is becoming prevalent in new themes.
How does this impact SEO?
-
Thanks !
I turned off "Geolocate (with page caching support)", and as you said, it corrected the problem.
Thanks again.
Bob
-
Hi Bob,
I second Paul. His answer is a good one. Hope we helped you.
Sincerely,
Dana
-
Just FYI - the advice to remove query strings from static resources in that WordPress article is the proverbial Very Bad Idea. If you want a full explanation, let me know, but trust me - don't.
There's a world of difference between static files like CSS and JavaScript having variables, and having those variables on page URLs.
You should have self-referential canonical tags on every page on your site anyway, which would take care of the duplicate URL issue created by the variables added to each URL, but there are still many other reasons why they're bad for SEO and usability, as Dana points out.
Paul
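For anyone following along: the self-referential canonical tag Paul mentions points every URL variant back at the clean page URL. Here's a minimal sketch in Python of computing that clean URL, assuming (as in this thread) the only appended parameter is the theme's `v` cache-buster — adjust `strip_params` for your own setup:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, strip_params=("v",)):
    """Return the URL with cache-busting query parameters removed.

    `strip_params` is an assumption for this example: it lists the
    parameter names your theme appends, not a WordPress standard.
    """
    parts = urlsplit(url)
    kept = [(k, val) for k, val in parse_qsl(parts.query) if k not in strip_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonical_url("https://example.com/shop/widget/?v=1d20b5ff1ee9"))
# → https://example.com/shop/widget/
```

In the page itself, that clean URL ends up in the head as `<link rel="canonical" href="...">`; most WordPress SEO plugins (Yoast, for example) emit self-referential canonicals automatically.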
-
You have a configuration choice in your WooCommerce settings that is causing this, Bob.
You've got the default customer location in settings set to "Geolocate (with page caching support)". This causes the variable to be added to the URL in order to enable the geo-location for each customer. Turn it off and the variable will no longer be added.
And yes, this is a disaster for SEO, as Dana explains. It will also badly foul your Analytics, and it even borks your site's internal search.
Hope that makes sense?
Paul
-
Hi again Bob,
Take a look at this thread on how to remove query strings from static resources...I believe your answer is there.
https://wordpress.org/support/topic/how-to-remove-query-strings-from-static-resources
Dana
P.S. Why is this a problem for SEO? A couple of reasons:
1. It's highly likely your content will get shared without the query parameter AND with the query parameter. This will effectively split your link equity between two versions of the same page.
2. Google Search Console is very bad at understanding that the page without the query string is the same as the page with it...you'll likely get a lot of duplicate content notifications.
3. From an end-user standpoint, it's just plain ugly...and end-user experience matters to SEO, right? I understand that's somewhat facetious...but it's your business, right? You want it to look like a good, solid, high-quality, professional site. Ugly query parameters scream "I hired my 21-year-old nephew to build me a WordPress site."
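To make the link-equity split in point 1 concrete, here's a toy tally (with hypothetical example.com URLs, not taken from this thread) showing how shares of one page get credited to two separate, weaker URLs:

```python
from collections import Counter

# Hypothetical inbound links to ONE page; some sharers copied the
# address with the cache parameter attached, some without.
shared_links = [
    "https://example.com/blue-widgets/",
    "https://example.com/blue-widgets/?v=1d20b5ff1ee9",
    "https://example.com/blue-widgets/",
    "https://example.com/blue-widgets/?v=1d20b5ff1ee9",
    "https://example.com/blue-widgets/?v=1d20b5ff1ee9",
]

# A crawler that fails to consolidate sees two URLs, each holding a
# fraction of the links, instead of one page holding all five.
tally = Counter(shared_links)
for url, count in sorted(tally.items()):
    print(count, url)
```

Five links that should strengthen a single page are split 2/3 across two URL variants, which is exactly what the canonical tag (or removing the parameter) prevents.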
-
Hi Bob,
What CMS are you working with? Once you answer that I might be able to help a little more.
Dana