How can I correct this massive duplicate content problem?
-
I just updated a client's website, which resulted in about 6,000 duplicate page content errors. The way I set up the new site: I created a subfolder called blog and installed WordPress in that folder. So when you go to suncoastlaw.com you're taken to an HTML website, but if you click the blog link in the nav, you're taken to the blog subfolder. The problem I'm having is that the URLs seem to be repeating themselves. So for example, if you type in
http://suncoastlaw.com/blog/aboutus.htm/aboutus.htm/aboutus.htm/aboutus.htm/
that somehow is a legitimate URL and is being considered duplicate content of http://suncoastlaw.com/aboutus.htm/. The repeating URL only seems to be a problem when blog/ is in the URL. Any ideas as to how I can fix this?
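When every path suffix serves the same page, each duplicate URL needs to collapse back to one canonical address. A server-side sketch, assuming the site runs on Apache with mod_rewrite enabled (this pattern is illustrative, not taken from the site's actual config): it 301-redirects any URL where the same path segment appears twice in a row down to a single copy, so repeated requests eventually resolve to the clean URL.

```apache
# Hypothetical .htaccess sketch for the site root (assumes Apache + mod_rewrite).
RewriteEngine On
# If the same path segment repeats back-to-back (e.g. /aboutus.htm/aboutus.htm),
# capture the prefix (%1) and one copy of the segment (%2) and 301 to them.
RewriteCond %{REQUEST_URI} ^(.*/)([^/]+)/\2(/.*)?$
RewriteRule . %1%2 [R=301,L]
```

Alongside (or instead of) a redirect, a `rel="canonical"` tag in each page's head tells search engines which URL is the real one, which addresses the duplicate content report even before the root cause of the looping paths is found.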
-
Anybody got a similar plugin for Joomla?
-
Hi Scott,
Take a look at this: http://wordpress.org/support/topic/duplicate-url-in-posts
All the best, Richard
Related Questions
-
Unsolved: Can I exclude a subdomain from a Campaign?
Also, I would like to track a subdomain without the root domain; is that possible?
Moz Pro | | AndreVa0 -
Can increasing website pages decrease domain authority?
Hello Mozzers! Say there is a website with 100 pages and a Domain Authority of 25. If the number of pages on this website increases to 10,000, can that decrease its Domain Authority or affect it in any way?
Moz Pro | | MozAddict0 -
Can anybody recommend software to create a sitemap?
I am trying to increase my website traffic, and I know that sitemaps are super important to Google and other search engines. So far I have spent money on software from several companies, and the results were terrible: money spent on nothing. I really need suggestions from you, because I'm fed up with spending money on nothing. NB: I am not a programmer or website developer, so my knowledge of this stuff is rather basic, but I like learning new things. THANKS A LOT!!!
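Before spending more on software: a sitemap is just a small XML file in the format defined at sitemaps.org, and for a modest site you can generate one with a short script. A minimal sketch in Python (the URLs below are placeholders; a real sitemap should list every indexable page and be submitted in Google Webmaster Tools):

```python
# Minimal sitemap.xml generator sketch using only the standard library.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemaps.org-format XML string for a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # one <loc> entry per page
    return ET.tostring(urlset, encoding="unicode")

# Placeholder page list; replace with the site's real indexable URLs.
pages = ["http://example.com/", "http://example.com/aboutus.htm"]
print(build_sitemap(pages))
```

Save the printed output as sitemap.xml at the site root and reference it from robots.txt so crawlers can find it.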
Moz Pro | | mihaelastam0 -
Tool for tracking actions taken on problem URLs
I am looking for tool suggestions to help keep track of problem URLs, the actions taken on them, and the tracking and testing of a large number of errors gathered from many sources. What I want is to export lists of URLs and their problems from my current tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and import them into a centralized DB that shows all of the actions that need to be taken on each URL, while removing duplicates, since each tool finds a significant number of the same issues.
Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.
I would also like to see historical information on each URL: whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed. Finally, I would like not to be bothered with the same issue twice. As Google is incredibly slow to update its issues summary, the tool should recognize that a URL is already in the DB with its issue resolved, and not import the duplicate.
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Double bonus for any tool smart enough to check incoming issues and mark them as resolved on import (for instance, if a URL had a 403 error, it would re-check on import whether it still returns a 403; if it does, it is added to the issue queue, and if not, it is marked as fixed). Does anything like this exist?
How do you deal with tracking and fixing thousands of URLs and their problems, and the duplicates created by using multiple tools? Thanks!
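The core of the centralized DB described above is de-duplication keyed on (URL, issue). A minimal sketch in Python, with hypothetical tool names and issue labels, showing how exports from several tools collapse into one pending-issue list while already-resolved issues are skipped on import:

```python
# Sketch of merging multi-tool issue exports into one pending-issue map.
from collections import defaultdict

def merge_reports(reports, resolved=frozenset()):
    """reports: iterable of (tool, url, issue) rows from each tool's export.
    resolved: set of (url, issue) pairs already fixed. Returns {url: set(issues)},
    dropping duplicates reported by multiple tools and anything already resolved."""
    pending = defaultdict(set)
    for tool, url, issue in reports:
        if (url, issue) in resolved:
            continue  # don't re-import an issue that was already fixed
        pending[url].add(issue)  # sets dedupe the same issue from two tools
    return dict(pending)

# Example rows mirroring the case above (tool/issue names are illustrative).
reports = [
    ("seomoz", "example.com/url1", "duplicate title"),
    ("google", "example.com/url1", "duplicate title"),   # same issue, 2nd tool
    ("screamingfrog", "example.com/url1", "broken link (404)"),
    ("seomoz", "example.com/url2", "duplicate title"),
]
print(merge_reports(reports))
```

A real version would persist this to a database and add the re-check-on-import step (fetch the URL, compare the live status code against the reported one) before queueing.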
Moz Pro | | prima-2535090 -
SEOmoz, can we trust you?
I was thinking today: what are the chances that SEOmoz can look into my campaigns, see what keywords I am targeting, my strategies, etc., and then sell that information to others? There has to be some type of clause about this when you sign up, correct? But who knows, because I'm not sure. Having Rand and Matt Cutts talking to each other via Twitter kind of concerns me. Not that I'm worried about Google knowing my strategies, because we don't do any black-hat linking or anything shady, but how much power does SEOmoz really have, and could they leak information to other companies or to Google? Scary stuff; that's why I don't put all my eggs into one basket, I spread them out. Thoughts? Thanks in advance - Brett Shaffer
Moz Pro | | EmpireSEO1 -
Can you set up a manual SEOmoz crawl?
I received a crawl report yesterday, made some site changes, and would like to see if those changes were done correctly. Rather than wait a week for my automatic crawl, is there any way to initiate a manual crawl of a single subdomain as a PRO member? From http://pro.seomoz.org/tools/crawl-test: "As a PRO member, you can schedule crawls for 2 subdomains every 24 hours, and you'll get up to 3,000 pages crawled per subdomain. When we've finished crawling, your reports will be sent to your PRO email address, which is currently ..."
Moz Pro | | ICM0 -
Rank Tracker not correct?
Over the last 6 weeks I've been link building for one of my customers. The search term is "bed and breakfast portugal". It is not easy, but we are making progress; the goal is page 1 (the top 10 results) in Google.nl. I've been keeping score with Rank Tracker and also created a campaign. Last week we were still ranked 11 (and had been for some time), but on Tuesday we jumped to number 7 according to Rank Tracker. Woohoo, you would say, but opening up the Chrome and Firefox browsers I still see www.bedandbreakfast-casaceedina.com on the 2nd page. Maybe somebody can give me an explanation? I've de-personalized my search, and Rank Tracker shows the query it is using: http://www.google.nl/search?q=bed+and+breakfast+portugal&pws=0. However, even using that, I'm not seeing my client at position 7. How can the Rank Tracker information be so wrong? I was hoping not, of course 🙂 Your help is appreciated very much.
Moz Pro | | newtraffic0 -
Is there a problem with the Weekly Ranking report this week?
I just received my weekly ranking report: 2 up, 0 down, 8 the same this week. However, last week my page was #2 for "innovation conference", and this week you report "not in top 50." My first thought was a penalty, but when I searched with personalization removed, we were still #2 for that query. So something must be wrong with the report: either we're 1 down, or "not in the top 50" is wrong.
Moz Pro | | KNect3650