Time until duplicate penalty is lifted?
-
Hello,
I recently discovered that half of the pages on my site (about 3,500) were not being indexed, or were indexing very slowly and with a heavy weight on them. I discovered the problem in the "HTML Suggestions" section of Google's Webmaster Tools.
An example of my main issue: all three of these URLs were showing a 200 OK status in Google.
www.getrightmusic.com/mixtape/post/ludacris_1_21_gigawatts_back_to_the_first_time
www.getrightmusic.com/mixtape/post/ludacris_1_21_gigawatts_back_to_the_first_time/
www.getrightmusic.com/mixtape/ludacris_1_21_gigawatts_back_to_the_first_time
I added some code to the .htaccess file in order to remove the trailing slashes across the board.
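For reference, a trailing-slash removal rule in .htaccess typically looks something like the sketch below (this is a generic mod_rewrite example, not the site's actual code; the directory check keeps real folder URLs working):

```apache
# Minimal sketch: 301-redirect any URL ending in a trailing slash
# to the slash-less version, skipping requests for real directories.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

The 301 status tells Google the slashed variants have permanently moved, which helps consolidate the duplicate URLs.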
I also properly set up my 404 redirects, which were not set up correctly by my developer when the site "relaunched" six months ago.
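For the 404 handling, a minimal .htaccess sketch is shown below (the /404.html path is an assumed placeholder, not the site's actual error page):

```apache
# Hypothetical sketch: serve a custom error page for missing URLs,
# so dead URLs return a real 404 status instead of a 200.
ErrorDocument 404 /404.html
```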
I then added the Canonical link rel tags on the site posts/entries.
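A canonical tag of this form in each post's `<head>` tells Google which URL variant is the preferred one; this sketch uses one of the example URLs from above as the assumed canonical version:

```html
<!-- Sketch: point all duplicate variants at one preferred URL. -->
<link rel="canonical" href="http://www.getrightmusic.com/mixtape/post/ludacris_1_21_gigawatts_back_to_the_first_time" />
```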
I'm hoping I followed all the correct steps in fixing the issue, and now, I guess, I just have to wait until the penalty gets lifted? I'm also not 100% certain that I have been penalized; I'm just assuming, based on the SERP ceiling I feel and the super slow (or nonexistent) indexing of my content.
Any insight, help, or comments would be super helpful.
Thank you.
Jesse
-
I recently raised a reconsideration request through Google Webmaster Tools, and they said it would take two weeks but that someone would get back to us with a personal reply.
So, to be honest, it's best to go through a Webmaster Tools account to check.
Related Questions
-
Avg Page Load Time (sec) compared to site average - what does it mean?
Hi All, In Google Analytics, under Site Speed -> Page Timings, we have two columns: a) Page Views and b) Avg Load Time (sec) compared to site average. Now in column "b" I see percentages, one in green and another in brown, so what does that mean? Can anyone please explain? Image attached. Thanks!
-
Mozcon: Mike Arnesen - Dwell time tip
Hi Mozzers, I was just looking at the dwell time tip from Mike Arnesen and have some difficulty understanding it. This is the formula: (Words on page)/15*1000. Why is he using 15? What does that number stand for? What is it that I'm actually calculating? Thanks!
-
Tracking time spent on a section of a website in Google Analytics
Hi, I've been asked by a client to track time spent on, or number of pages visited in, a specific section of their website using Google Analytics, but I can't see how to do this. For example, they have a "golf" section within their site and want to measure how many people either visit 5 pages or more within the golf section or spend at least 6 minutes browsing the various golf section pages. Can anyone advise whether this can be done and, if so, how I go about it? Thanks
-
Pages with Duplicate Page Content
Hi, I just started using Moz and got an analytics report today! There are about 104 duplicate pages apparently; the problem is that they are not duplicates, just the way the pages have been listed with a description! The site runs OpenCart, and every page has the name of the site followed by the product name for that page. How do you correct this issue? Thanks for your help.
-
More than 1.8 lakh 404 errors, plus duplicate content, duplicate titles, and missing meta descriptions increasing, as site is based on regular ticket selling (CRM) - kindly help
Site errors are increasing: 404 errors now exceed 1.8 lakh (180,000), and duplicate content, duplicate titles, and missing meta descriptions grow day by day, as the site is based on regular ticket selling (CRM). We have checked the 404s in Webmaster Tools, but it is not easy to delete 1.8 lakh entries. How can we resolve this issue going forward? Kindly help and suggest a solution.
-
How do I fix apparent duplicates
I'm auditing a site and would appreciate your help with possible explanations and solutions as to why Google Analytics, on the Content Drilldown page, is showing what appear to be duplicate pages (refer to the image). I'm wondering if I have got my head around the rel=canonical tag, because the page I'd consider a duplicate, "page/", has a canonical tag pointing to "~/page.html". This is the tag from the page "Locations/": rel="canonical" href="http://www.domain.com/Locations.html" />, so I am unsure why both versions of the page are generating views. Shouldn't the canonical tag work like a 301 redirect? I'm also unsure how the pages using the "page/" path are generating so many views, because I have not been able to find them and they are not indexed by Google. Unfortunately the site is built on a proprietary CMS I'm not familiar with.
-
Confirmation page gets hit multiple times by some users. How can I segment out unique visits?
Hi All, I'm web marketing manager at http://www.evenues.com, which is like an Airbnb for meeting space. To calculate the number of bookings for our meeting spaces, I've set up a goal in Analytics with the confirmation page as the goal URL. The problem is that some users are looking at the same confirmation page several times. We have unique URLs for each confirmation page, but some users seem to be visiting these unique pages 2 to 5 times or more. This skews our numbers a bit and makes things problematic when it comes to segmenting visitors. Is there anything we can do so that each unique URL visited only counts once? Thanks, Kenji
-
Solving link and duplicate content errors created by Wordpress blog and tags?
SEOmoz tells me my site's blog (a WordPress site) has two big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legit the way they are, but obviously I need to fix the problem, so... Duplicate content error: this error is a result of being able to search the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Anyone know of a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped? Too many links error: SEOmoz tells me my main blog pages have too many links (both url.com/blog/ and url.com/blog-2/); these pages have excerpts of the 6 most recent blog posts. I feel like this should not be an error... anyone know of a solution that will keep the site from being penalized for these pages? Thanks!