Tool for tracking actions taken on problem URLs
-
I am looking for tool suggestions that assist in keeping track of problem URLs and the actions taken on them, and that help with tracking and testing a large number of errors gathered from many sources.
So, what I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google Webmaster Tools, Bing Webmaster Tools, Screaming Frog) and import them into a centralized database that shows all of the actions that need to be taken on each URL, while at the same time removing duplicates, since each tool finds a significant number of the same issues.
Example Case:
SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (i.e. it terminates in a 404).
When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.
I would also like to see historical information on each URL: whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed.
Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow at updating its issues summary, I would like to avoid importing duplicate issues (so the tool should recognize that the URL is already in the DB and that the issue has been resolved).
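For illustration, the deduplicating import I have in mind could be sketched roughly like this (the CSV column names, SQLite schema, and issue labels here are placeholder assumptions, not anything the tools actually export):

```python
import csv
import sqlite3

# In-memory DB for illustration; a real tool would use a file path.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS issues (
        url        TEXT NOT NULL,
        issue_type TEXT NOT NULL,
        source     TEXT,                      -- which tool first reported it
        status     TEXT DEFAULT 'pending',
        PRIMARY KEY (url, issue_type)         -- one row per (url, issue) pair
    )
""")

def import_report(path, source):
    """Import a CSV export assumed to have 'url' and 'issue' columns."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # INSERT OR IGNORE drops duplicates already reported by another tool
            conn.execute(
                "INSERT OR IGNORE INTO issues (url, issue_type, source) "
                "VALUES (?, ?, ?)",
                (row["url"], row["issue"], source),
            )
    conn.commit()
```

With this, importing the SEOmoz and Google reports in sequence would leave example.com/url1 with one "duplicate title" row rather than two, and a later Screaming Frog import would add its "broken link" row alongside it.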
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me.
Bonus bonus for any tool that is smart enough to check incoming issues and mark them as resolved (for instance, if a URL was reported with a 403 error, the tool would check on import whether it still resolves to a 403. If it did, it would be added to the issue queue; if not, it would be marked as fixed).
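That recheck-on-import behaviour could be sketched like this (a rough assumption of how it might work, using only a HEAD request and a status-code comparison, not a full implementation):

```python
import urllib.request
import urllib.error

def current_status(url, timeout=10):
    """Return the HTTP status code the URL responds with right now."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4XX/5XX responses still carry a status code

def triage(reported_code, live_code):
    """Decide whether a reported error still applies."""
    return "pending" if live_code == reported_code else "fixed"

def check_on_import(url, reported_code):
    # e.g. a URL reported as a 403: re-check it before queueing the issue
    return triage(reported_code, current_status(url))
```

So a 403 that still returns 403 lands in the issue queue as "pending", while one that now returns 200 is marked "fixed" without me ever seeing it.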
Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and the duplicates created by using multiple tools?
Thanks!
-
Maybe I don't fully appreciate the power of Excel
but what I am envisioning seems to require more than what Excel can provide.
Thanks for the suggestion though. I will think about it some more.
-
I think you're looking for Microsoft Excel
You might try an issue tracker like Redmine. Beyond that, no, it sounds like you would want a custom tool, since there isn't enough demand for this kind of thing.
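If you did go the Redmine route, a rough sketch of pushing imported URL problems into it via its REST API might look like the following (the base URL, API key, and project id are placeholders for your own install):

```python
import json
import urllib.request

def build_issue_payload(url, issue_type, project_id):
    """Build the JSON body that Redmine's POST /issues.json endpoint expects."""
    return {
        "issue": {
            "project_id": project_id,
            "subject": f"{issue_type}: {url}",
            "description": f"Imported issue '{issue_type}' for {url}",
        }
    }

def create_issue(base_url, api_key, url, issue_type, project_id=1):
    """Create one Redmine issue for a problem URL; returns the parsed response."""
    req = urllib.request.Request(
        f"{base_url}/issues.json",
        data=json.dumps(build_issue_payload(url, issue_type, project_id)).encode(),
        headers={
            "Content-Type": "application/json",
            "X-Redmine-API-Key": api_key,  # per-user key from your Redmine account page
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

You would still have to handle deduplication yourself (e.g. by searching existing issues for the URL before creating a new one), since Redmine has no notion of a URL as a first-class entity.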
Related Questions
-
Unsolved Is Moz Able to Track Internal Links Per Page?
I am trying to track internal links and identify orphan pages. What is the best way to do this?
Moz Pro | WebMarkets
-
What's the best way to keep track of keyword rankings
Here's the deal. I keep track of my keyword rankings with the help of Rank Tracker from seopowersuite.com for the most part. I ran it on a daily basis and my keyword was not in the top 100 for a few months. The Moz.com panel shows pretty much the same (not in the top 50) for the same months.
That said, if I check that keyword's ranking in my Google Webmaster Tools (avg. position), it says that its position (ranking?) was on average 49, 7, 7, 8 for the last four months. So I'm not sure how that's even possible. How come Rank Tracker and Moz don't see any rankings while Google gives me sorta decent avg. positions at the same time? I assume that avg. position means the same as avg. ranking, right? I'm not sure what I'm missing here.
Moz Pro | VinceWicks
-
Crawlers crawl weird long urls
I did a crawl start for the first time and I get many errors, but the weird fact is that the crawler tracks long, duplicated, non-existent URLs. For example (to be clear): there is a page www.website.com/dogs/dog.html, but then it continues crawling:
www.website.com/dogs/dog.html
www.website.com/dogs/dogs/dog.html
www.website.com/dogs/dogs/dogs/dog.html
www.website.com/dogs/dogs/dogs/dogs/dog.html
www.website.com/dogs/dogs/dogs/dogs/dogs/dog.html
What can I do about this? Screaming Frog gave me the same issue, so I know it's something with my website.
Moz Pro | r.nijkamp
-
Tracking keyword rankings on sub pages
Hello, what is the best way to track keywords on sub-pages of a website through SEOmoz? Do we need to create a separate campaign for each sub-page? Thanks for all the help!
Moz Pro | DerekDenholm
-
Overly Dynamic URL in vBulletin
I've got quite a few overly dynamic URLs reported, like this one: http://www.phplinkdirectory.com/forum/forumdisplay.php?s=4a07050d7e48e8bae86ef7880d9f91e8&f=13&order=desc&page=3 Anyone know the quick fix to this problem?
Moz Pro | dvduval
-
Find New Keywords Tool - Information is not accurate
When using the Find New Keywords tool, the information listed for Rank is often incorrect. I attached an image with two screenshots: one from the Find New Keywords tool and the other from a Google search results page. In Incognito mode, I searched for the third item on the list, "Speak Creative Memphis." The results put us at #1 for that term, but the Keyword Tool shows that we're #4. Can you help me understand why there is a difference? keywordtool-ss.jpg
Moz Pro | speakcreative
-
Do you track both plural and singular variations of your keywords?
Howdy! In trying to make the most of the keyword tracking slots we get with the SEOmoz tool, our discussion turned to the importance of tracking both variations of search terms that could be plural or singular. The example is that we run a local business search database, so we target search terms like "chicago pet stores" and "chicago pet store"; however, the language of our site almost always uses the plural version of the business category.
On one hand we want to know exactly how we rank for variations of search terms, but on the other, with the number of categories we have, we could be tracking thousands upon thousands if we included every variation ("pet store chicago", "pet stores in chicago", etc.).
So what say ye, fellow optimizers? Is it worth tracking variations of search terms, or do you find that Google is smart enough in coalescing the intent of similar search variants that tracking against the most commonly searched one is enough? Thank you all!
Moz Pro | qurve
-
Is there a tool to compare duplicate content for content that is not live on the web?
Is there a tool that can give me the % of duplicate content when comparing two pieces of content that are not live on the web? Like Copyscape, but for content that may not be indexed by Copyscape or is not live on the web? Does Word or any other program allow you to do this?
Moz Pro | bozzie311