Curious, has anyone ever had over half of their indexed links drop on an e-commerce site?
-
In a year we went from around 300k indexed pages to under 100k according to GWT. Could this be a duplicate content issue, lost links, spam, aged links, or all of the above? Either way, an audit is in order. Thanks!
Chris
-
Awesome, and thanks! I love Nashville. Went to school there. :)
-
By phone it is 615-678-5464, by email it is lesley@dh42.com
-
What's the best way to reach you, L?
thx,
C
-
Sure. The platform I use is PrestaShop. It lets you put a short description about the manufacturer or the brand in a centralized area of the shop. I just create a new tab on the page and pull that content in programmatically. So you might type up a 300-word bio about the manufacturer, or use what is on their Wikipedia page, and then have that load on all of the pages for their products. You can also put it in a text box so it is not obtrusive.
I generally try to add another tab as well. It is kind of a pain, but I type up 5-10 different blurbs like "Our Return Policy," "Why Buy From Us," or "Our Price Guarantee," and have the page choose one randomly at render time. That way the content is always changing. Similar to this: http://screencast.com/t/schHrJjk. It is just content to water down the feed content and make the page more likely to rank.
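The render-time rotation described above can be sketched roughly like this. This is a minimal Python sketch of the idea only; PrestaShop itself would implement it in PHP/Smarty, and the blurb titles and text here are made up:

```python
import random

# Hypothetical pool of pre-written blurbs. In PrestaShop this content
# would live in the template/module layer, not a Python script.
BLURBS = {
    "Our Return Policy": "Returns are accepted within 30 days of purchase.",
    "Why Buy From Us": "Free shipping and responsive support on every order.",
    "Our Price Guarantee": "Found it cheaper elsewhere? We will match it.",
}

def pick_tab_content(blurbs=BLURBS):
    """Choose one blurb at render time so the tab content varies per page load."""
    title = random.choice(sorted(blurbs))
    return title, blurbs[title]

title, body = pick_tab_content()
```

Because the choice happens at render time, the same product URL serves slightly different supporting content across crawls, diluting the boilerplate feed description.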
-
OK, any chance you can extend a dummies' guide for that, lol? I kinda follow for the most part. Thanks, very, very helpful, L.
C
-
thank you!
C
-
There is another way too. One thing I have used to rank sites with content issues like this is to create a couple of tabs on the product pages and programmatically fill them out, say an "About {$manufacturer_name}" tab and an "Our Return Policy" tab.
What you are trying to do is water down the content that is creating the duplication. This will often work and bring the pages back into the index and ranking again.
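Filling an "About {$manufacturer_name}" tab programmatically boils down to merging product data into a shared template. A hedged Python sketch of that idea follows; the field names are illustrative, not PrestaShop's actual schema:

```python
# Hypothetical product record; field names are made up for illustration.
product = {"manufacturer_name": "Acme Tools", "name": "Cordless Drill"}

# One shared template, rendered per product so each page gets extra
# manufacturer-specific text alongside the feed description.
ABOUT_TEMPLATE = (
    "About {manufacturer_name}: {manufacturer_name} has been a trusted "
    "brand for years. Every {manufacturer_name} product we stock ships "
    "with a full manufacturer warranty."
)

def render_about_tab(product):
    """Fill the 'About {manufacturer}' tab from the shared template."""
    return ABOUT_TEMPLATE.format(**product)
```

The template stays constant, but because the manufacturer name varies by product line, the rendered tab adds page-specific text that helps offset a duplicated feed description.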
-
Christian,
Here are your choices:
1. Rewrite the content so it is unique to your site.
OR, if that is not scalable because you have so many pages then:
2. Noindex most of those pages and allow indexation of only the ones that you have time/budget to rewrite.
Yes, duplicate content is pretty rampant in eCommerce, which is precisely why Google has to handle it by choosing a canonical version and not ranking most of the others. They're not going to "ban" or "penalize" you, but ultimately the result is the same: no rankings = no traffic.
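Option 2 above amounts to emitting a robots noindex tag on the pages you haven't rewritten yet. A rough sketch of that decision follows; the `has_unique_description` flag is a hypothetical field, not something PrestaShop or Google provides:

```python
def robots_meta(page):
    """Return the robots meta tag for a product page.

    Pages with a rewritten, unique description stay indexable; the rest
    are noindexed until there is time/budget to rewrite them.
    (`has_unique_description` is an assumed flag on the page record.)
    """
    if page.get("has_unique_description"):
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'
```

Keeping `follow` in both cases lets crawlers continue traversing links on noindexed pages while they are out of the index.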
-
Well, it looks like dupe content is a big issue, which I am sure is pretty common in the e-commerce environment. I'm a bit fresh to e-commerce SEO, as my background is more with services. I assume a stopover at the Google Webmaster forum will provide some insight? Thanks, Lesley.
Christian
-
It could be due to any of those reasons, including others like content quality. Do you have unique product descriptions for all 300k+ pages?
-
I have seen it happen several times. Are you using a feed for your product description data? It could be an issue where a competitor has started to outrank you with the same description data, and you have been dropped from the index.