Duplicate Version of Home Page Causing Problems?
-
Hello,
I have a .php-based site, and I'm wondering whether the way we split traffic is negatively affecting our rankings.
Currently, if you visit Lipozene.com you are split 50/50 between two pages, indexa.php and indexb.php.
These have identical content right now, and I'm curious whether this has hurt us.
We've dropped off the SERPs for our brand term "lipozene" even though we are the official site and own www.lipozene.com.
Any thoughts are greatly appreciated.
-
Irving,
What do you think would be more appropriate meta keywords?
I was thinking something like this:
"...content="lipozene, weight loss, lose pure body fat, lipozene review" as our keywords
Think this would fare better?
-
Using Google's Website Optimizer, you are telling Google that you are testing, and Google likes to know these things. And yes, this is affecting your rankings. Your site in general is pretty keyword-stuffed, as the other posters said, but handling duplicate content is still important. Once you add the Optimizer code to the pages, Google will know your intention isn't to spam the search engines; it is indeed an A/B test.
Also, when doing this, be careful not to let both versions of your page get indexed; that is going to cause issues down the road.
-
If you're only getting one page indexed, it's not an issue. In other words, if your A/B test is serving the duplicate B page to browsers only, and not to bots, it's fine.
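One common safeguard for this kind of split (a general pattern, not something from Google's Optimizer documentation specifically) is a canonical link on the variant page, so that even if a bot does fetch it, credit for the content flows to a single URL. A minimal sketch, assuming the root URL is the version you want indexed:

```html
<!-- Hypothetical sketch for the <head> of indexb.php: even if a bot
     crawls the variant, this tells Google which URL owns the content. -->
<link rel="canonical" href="http://www.lipozene.com/" />
```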
Keyword density for "lipozene" is too high across the site, as on this page.
You are keyword stuffing; I would delete this tag globally. Keyword stuffing on the site is a surefire way to make sure Google does not rank you for "lipozene":
<meta name="keywords" content="lipozene, lipozene pills, lipozene supplements, lipozene review, lipozene diet, weight loss lipozene, diet, buy lipozene, lipozene facts" />
-
Thanks, Igor.
We certainly have a number of negative links out there, and I'll try to focus on those.
-
Hi there,
Google does penalize for duplicate content. I assume you have two home pages (indexa.php and indexb.php) because you want to do some A/B testing, correct? If so, I would remove the second, duplicate home page until you have a unique home page to A/B test against. (I'm not 100% sure, as I've never done A/B testing myself, but I think you need to set it up in your GWT account so that Google is aware that you're doing A/B testing; I might be incorrect on this.)
Also, use SEOmoz's Open Site Explorer (http://www.opensiteexplorer.org/) to check whether you have any bad/negative links that might have reduced your rankings.
Hope this helps!
Related Questions
-
Home Page Deindexed overnight?
Hi, hope you guys can help. I run an e-commerce site, https://alloywheels.com. Last night our home page (and a few other pages, but not all) were de-indexed by Google. The site has been ranking (UK) at P1 for the "alloy wheels" keyword for years and on the whole has been running very successfully. Recently, however, I had noticed fluctuation on the "alloy wheels" keyword, dropping to P3, then P5, then back to P3, but this morning I noticed we were not even ranking on the first page. When I check inside Search Console there are no messages or warnings, but the "/" page was de-indexed, along with a few other key pages. I have requested reindexing and they have come back, P7 for the home page for "alloy wheels".

The only thing I had changed: I realised yesterday there was no robots.txt on the site, web.dev was recommending I add one, so I did. It was just an allow-all:

User-agent: *
Disallow:
Sitemap: https://alloywheels.com/sitemap.xml

I ran tests on the robots.txt before it was uploaded and it all came back green. I have removed the robots.txt for now. Has anybody seen anything like this before? With the recent ranking fluctuation I am not sure whether it is to do with that, the robots.txt, or something different altogether. Thanks in advance, James
Technical SEO | JamesDolden
Links On Out Of Stock Product Pages Causing 404
Hi Moz Community! We're doing an audit of our e-commerce site at the moment and have noticed a lot of 404 errors coming from out-of-stock/discontinued product pages that we've kept returning 200 in the past. We kept these pages and added links on them to categories or products similar to the discontinued items, but many other links on the page (images, blog posts, even breadcrumbs) have broken or are no longer valid, causing lots of additional 404s. If a product has been discontinued for a long time, gets no traffic, and has no link equity, would you recommend adding a noindex robots tag on these pages so we're not wasting time fixing all the broken links on them? Any thoughts? Thanks
Technical SEO | znotes
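For what it's worth, the noindex option asked about above comes down to a single tag per page; whether it is the right call is the judgment question in the post. A minimal sketch:

```html
<!-- Sketch for the <head> of a long-discontinued product page.
     noindex removes it from the index; follow still lets crawlers
     pass equity through whatever valid links remain on the page. -->
<meta name="robots" content="noindex, follow" />
```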
How can I get Google to forget an https version of one page on my site?
Google mysteriously decided to index the broken https version of one page on my company's site (we have a cert for the site, but this page is not designed to be served over https, and the CSS doesn't load). The page already has many incoming links to the http version, and it has a canonical URL pointing to http. I resubmitted it on http with Webmaster Tools. Is there anything else I could do?
Technical SEO | BostonWright
Duplicate pages
Hi, can anyone tell me why SEOmoz thinks these pages are duplicates when they're clearly not? Thanks very much, Kate
http://www.katetooncopywriter.com.au/how-to-be-a-freelance-copywriter/picture-1-58/
http://www.katetooncopywriter.com.au/portfolio/clients/other/
http://www.katetooncopywriter.com.au/portfolio/clients/travel/
http://www.katetooncopywriter.com.au/webservices/what-i-do/blog-copywriter/
Technical SEO | ToonyWoony
Tags causing Duplicate page content?
I was looking through the 'Duplicate Page Content' and 'Too Many On-Page Links' errors, and they all seem to be linked to the tags on my blog pages. Is this really a problem, and if so, how should I be using tags properly to get the best SEO rewards?
Technical SEO | zapprabbit
Home page URL disappears in Google after switching to WordPress
It was a 10-page static HTML website, 3 years old, PR2. Monday night, I copied a WordPress install into this website's public_html folder and activated it. The home page was index.html before switching to WordPress. That html file (index.html) has now been deleted so WordPress' home page can work. All 9 other static html pages are still there in Google's index. I just noticed today that the home page URL has disappeared from Google completely. Why? All the other 9 static html pages' URLs are still in Google. robots.txt is Allow: /. What may have gone wrong to remove the home domain URL from Google's index? Thank you for your help!
Technical SEO | johnzhel
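A likely factor in a case like this is that Google's indexed URL was /index.html, which now returns a 404, while the new home page lives at /. A hypothetical .htaccess sketch, assuming Apache with mod_rewrite (the rule is illustrative, not something from the thread):

```apache
# Hypothetical sketch: 301 the old static home page URL to the root,
# so Google can transfer the old URL's standing to the WordPress home.
# Assumes Apache with mod_rewrite enabled.
RewriteEngine On
RewriteRule ^index\.html$ / [R=301,L]
```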
Why are my pages getting duplicate content errors?
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content, because the crawler thinks there are two versions of the same page:
http://www.mapsalive.com/Features/audio.aspx
http://www.mapsalive.com/Features/Audio.aspx
The only difference is the capitalization. We don't have two versions of the page, so I don't understand what I'm missing or how to correct this. Does anyone have any thoughts on what to look for?
Technical SEO | jkenyon
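Since URL paths are case-sensitive, a crawler treats the two casings above as distinct pages even when the server returns the same content for both. One common fix (a general pattern, not confirmed by the thread) is to declare one casing canonical. A minimal sketch, assuming the lowercase URL is the preferred one:

```html
<!-- Hypothetical sketch for the <head> of the Features/audio page:
     both casings serve this markup, so the crawler folds them into
     the single canonical URL. -->
<link rel="canonical" href="http://www.mapsalive.com/Features/audio.aspx" />
```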