Search function rendering cached pages incorrectly
-
On a category page the products are listed via the site's search function. The page source and the rendered front end match as they should.
However, when viewing a browser-rendered version of a Google-cached page, the product URL has changed from, as an example:
https://www.example.com/products/some-product
to
https://www.example.com/search/products/some-product
The source contains a relative URL in the correct format, so the extra /search/ segment is being added when the browser resolves the link at render time.
The developer insists this is fine because the query string in the Google cache result URL is triggering the behaviour and confusing the search function, all locally. I can see that, but I just wanted feedback: will Google internally only ever see the true source, or could its own rendering mechanism possibly trigger similar behaviour?
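To make the mechanics concrete, here is a minimal sketch of how the same document-relative href resolves to different absolute URLs depending on the base path of the page rendering it. It assumes the product links are document-relative (no leading slash); the hostname and paths are just the placeholders from the example above:

```typescript
// Minimal sketch: document-relative hrefs resolve against the base URL of
// whatever page renders them, so a cached copy served under /search/ picks
// up the extra path segment. Hostname and paths are placeholders.
const href = "products/some-product"; // relative link, no leading slash

// Rendered on the live category page:
console.log(new URL(href, "https://www.example.com/").href);
// -> https://www.example.com/products/some-product

// Rendered on a cached copy whose base path includes /search/:
console.log(new URL(href, "https://www.example.com/search/").href);
// -> https://www.example.com/search/products/some-product
```

Root-relative or absolute hrefs, or a rel="canonical" link pointing at the true product URL, would avoid this kind of drift regardless of what base path the page happens to be rendered under.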
-
Hi Michael,
Can you post the site URL so I can review further?
Thanks!
John
Related Questions
-
Will Reducing the Number of Low Page Authority Pages Increase Domain Authority?
Our commercial real estate site (www.nyc-officespace-leader.com) contains about 800 URLs. Since 2012 the domain authority has dropped from 35 to about 20, and ranking and traffic have dropped significantly since then. The site has about 791 URLs, many of which are set to noindex. A large percentage of these pages have a Moz page authority of only "1". It is puzzling that some pages with content similar to the "1" pages rank much better, in some cases "15". If we remove or consolidate the poorly ranked pages, will the overall page authority and ranking of the site improve? Would taking the following steps help?
1. Remove or consolidate poorly ranking, unnecessary URLs?
2. Update content on poorly ranking URLs that are important?
3. Create internal text links (as opposed to links from menus) to critical pages?
A Moz crawl of our site's URLs is visible at the link below. I am wondering if the structure of the site is just not optimized for ranking and what can be done to improve it. https://www.dropbox.com/s/oqchfqveelm1q11/CRAWL www.nyc-officespace-leader.com (1).csv?dl=0
Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
We are switching our CMS local pages from a subdomain approach to a subfolder approach. What's the best way to handle this? Should we redirect every local subdomain page to its new subfolder page?
We are looking to create a new subfolder approach within our website versus our current subdomain approach. How should we go about handling this properly so as not to lose everything we've worked on up to this point using the subdomain approach? Do we need to redirect every subdomain URL to the new subfolder page? Our current local pages subdomain set-up: stores.websitename.com. How we plan on adding our new local subfolder set-up: websitename.com/stores/state/city/storelocation. Any and all help is appreciated.
Intermediate & Advanced SEO | SEO.CIC
-
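For what it's worth, here is a minimal sketch of the redirect mapping described in that question, written as Express-style middleware. The hostnames, the /stores prefix, and the assumption that paths map one-to-one are taken from the pattern in the question, not from the poster's actual stack:

```typescript
import express from "express";

const app = express();

// Minimal sketch (assumed setup): 301-redirect every old subdomain URL to the
// matching subfolder URL, preserving the rest of the path one-to-one.
app.use((req, res, next) => {
  if (req.hostname === "stores.websitename.com") {
    // e.g. stores.websitename.com/oregon/portland/main-st
    //   -> www.websitename.com/stores/oregon/portland/main-st
    return res.redirect(301, `https://www.websitename.com/stores${req.path}`);
  }
  next();
});

app.listen(3000);
```

Redirecting every old subdomain URL to its exact new subfolder counterpart (rather than to a generic landing page) is what preserves the equity built up under the subdomain approach.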
Page disappears from search results when Google geographic location is close to offline physical location
If you use Google to search georgefox.edu for "doctor of business administration", the first search result is http://www.georgefox.edu/business/dba/ (I'll refer to this page as the DBA homepage from here on). The second page is http://www.georgefox.edu/offices/sfs/grad/tuition/business/dba/ (I'll refer to this page as the DBA program costs page from here on). Search: https://www.google.com/search?q=doctor+of+business+administration+site%3Ageorgefox.edu This appears to hold true no matter what your geographic location is set to on Google. George Fox University is located in Newberg, Oregon. If you search for "doctor of business administration" with your geographic location set beyond a certain distance away from Newberg, Oregon, the first georgefox.edu result is the DBA homepage. Set your location on Google to Redmond, Oregon and search: https://www.google.com/search?q=doctor+of+business+administration But if you set your location a little closer to home, the DBA homepage disappears from the top 50 search results on Google. Set your location on Google to Newberg, Oregon and run the same search: now the first georgefox.edu page to appear in the search results is the DBA program costs page. Here are the locations I have tested so far.
First georgefox.edu search result is the DBA homepage: Redmond, OR; Eugene, OR; Boise, ID; New York, NY; Seattle, WA.
First georgefox.edu search result is the DBA program costs page: Newberg, OR; Portland, OR; Salem, OR; Gresham, OR; Corvallis, OR.
It appears that if your location is set to within a certain distance of Newberg, OR, the DBA homepage is being pushed out of the search results for some reason. Can anyone verify these results? Does anyone have any idea why this is happening?
Intermediate & Advanced SEO | RCF
-
Should I block temporary pages?
I need some SEO advice on an odd scenario: we are launching a new product line (party supplies) on its own domain (PartySuperCenter.com). Due to some internal/technical reasons we will not be able to launch the site until the summer. We already have the product in our warehouse, so the owners want to create a section on our current site (CostumeSuperCenter.com) for the new products. Once the new site is up, the product will be removed from our current site and moved to the new site. I am concerned about the effect this will have on our SEO - having thousands of product pages appear and then disappear after a few months. I was thinking about blocking the pages using the "noindex" tag. Is this how you would handle it? Thanks in advance for your help!
Intermediate & Advanced SEO | costume
-
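One way to implement the noindex idea from that question, sketched for an assumed Node/Express setup (the /party-supplies path is hypothetical): send an X-Robots-Tag header for everything under the temporary section, which keeps the pages usable for visitors while keeping them out of the index until they move to the new domain.

```typescript
import express from "express";

const app = express();

// Minimal sketch (assumed setup): mark every page under the temporary section
// as noindex via the X-Robots-Tag response header. This is equivalent to
// putting <meta name="robots" content="noindex"> on each page template.
app.use("/party-supplies", (req, res, next) => {
  res.set("X-Robots-Tag", "noindex");
  next();
});

app.listen(3000);
```

When the products later move to PartySuperCenter.com, 301 redirects from the old temporary URLs to the new ones would carry over whatever signals those pages did accumulate.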
Will pages irrelevant to a site's core content dilute SEO value of core pages?
We have a website with around 40 product pages. We also have around 300 pages with individual ingredients used in the products, and on top of that we have some 400 pages of individual retailers which stock the products. The ingredient pages have the same basic short info about the ingredients, and the retailer pages just have the retailer name, address and contact details. The question is: should I add noindex to all the ingredient and/or retailer pages so that the focus is entirely on the product pages? Thanks for your help!
Intermediate & Advanced SEO | ArchMedia
-
Category Pages up - Product Pages down... what would help?
Hi, I mentioned yesterday how one of our sites is losing rank on product pages. What steps do you take to improve the SERPs of product pages, in this case where home/category/product is the tree? There isn't really any internal linking, except one link from the category page to each product. Would setting up a host of internal links, perhaps a "similar products" block linking them together, be a place to start? How else can I improve the ranking of these more deeply nested pages, beyond internal links?
Intermediate & Advanced SEO | xoffie
-
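Along the lines of the "similar products" idea in that question, here is a minimal sketch of generating such a block; the Product shape, the URL pattern, and the selection rule are assumptions for illustration, not anyone's actual catalogue:

```typescript
// Minimal sketch (hypothetical data model): build a "similar products" block
// so each product page links to a few other products in the same category,
// giving deep product pages internal links beyond the single category link.
interface Product {
  slug: string;
  name: string;
  category: string;
}

function similarProductLinks(current: Product, all: Product[], limit = 4): string {
  const links = all
    .filter((p) => p.category === current.category && p.slug !== current.slug)
    .slice(0, limit)
    .map((p) => `<a href="/products/${p.slug}">${p.name}</a>`);
  return `<nav class="similar-products">${links.join("\n")}</nav>`;
}
```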
Deep Page is Ranking for Main Keyword, But I Want the Home Page to Rank
A deep page is ranking for a competitive and essential keyword, but I'd like the home page to rank instead. The main reasons are probably that this specific page is optimized for just that keyword and contains the keyword in its URL. I've optimized the home page for this keyword as much as possible without sacrificing the integrity of the home page and the other keywords I need to maintain. My main question is: if I use a 301 redirect on this deep page to the home page, am I risking my current ranking, or will my home page replace it on the SERPs? Thanks so much in advance!
Intermediate & Advanced SEO | ClarityVentures
-
Why duplicate content for same page?
Hi, my SEOmoz crawl diagnostic warns me about duplicate content. However, to me the content is not duplicated. For instance it gives me something like this (URL, then Internal Links / External Links / Page Authority / Linking Root Domains):
http://www.nuxeo.com/en/about/contact?utm_source=enews&utm_medium=email&utm_campaign=enews20110516 : 1 / 1 / 31 / 2
http://www.nuxeo.com/en/about/contact?utm_source=enews&utm_medium=email&utm_campaign=enews20110711 : 0 / 0 / 1 / 0
http://www.nuxeo.com/en/about/contact?utm_source=enews&utm_medium=email&utm_campaign=enews20110811 : 0 / 0 / 1 / 0
http://www.nuxeo.com/en/about/contact?utm_source=enews&utm_medium=email&utm_campaign=enews20110911 : 0 / 0 / 1 / 0
Why is this seen as duplicate content when it is only one URL with campaign tracking codes pointing to the same content? Do I need to clean this up? Thanks for the answer.
Intermediate & Advanced SEO | nuxeo
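Those are indeed the same page reached through different campaign-tagged URLs, which is why a crawler reports them as duplicates. The usual fix is a rel="canonical" link on the page pointing at the clean URL; here is a minimal sketch of that normalization, using one of the URLs from the question purely for illustration:

```typescript
// Minimal sketch: strip utm_* tracking parameters so campaign-tagged URLs
// normalize to one canonical URL (e.g. for a rel="canonical" link element).
function canonicalUrl(raw: string): string {
  const url = new URL(raw);
  for (const key of [...url.searchParams.keys()]) {
    if (key.startsWith("utm_")) url.searchParams.delete(key);
  }
  return url.toString();
}

console.log(
  canonicalUrl(
    "http://www.nuxeo.com/en/about/contact?utm_source=enews&utm_medium=email&utm_campaign=enews20110516"
  )
);
// -> http://www.nuxeo.com/en/about/contact
```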