Tool for tracking actions taken on problem URLs
-
I am looking for tool suggestions to help keep track of problem URLs, the actions taken on them, and the testing of a large number of errors gathered from many sources.
So, what I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google Webmaster Tools, Bing Webmaster Tools, Screaming Frog) and import them into a centralized database that shows all of the actions that need to be taken on each URL, while at the same time removing duplicates, since each tool finds a significant number of the same issues.
Example Case:
SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (i.e., it terminates in a 404).
When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending (a duplicate title and a broken link) without duplicating the entry that both SEOmoz and Google found.
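The merge step in that example case can be sketched in a few lines, assuming each tool's export is first reduced to (url, issue_type) pairs. The function and variable names here are illustrative, not from any existing tool:

```python
# Merge issue reports from several tools, deduplicating on (url, issue_type)
# so the same finding from SEOmoz and Google is stored only once.
from collections import defaultdict

def merge_reports(*reports):
    """Each report is an iterable of (url, issue_type) pairs from one tool."""
    issues = defaultdict(set)  # url -> set of distinct issue types
    for report in reports:
        for url, issue_type in report:
            issues[url].add(issue_type)
    return issues

seomoz = [("example.com/url1", "duplicate title"),
          ("example.com/url2", "duplicate title")]
google = [("example.com/url1", "duplicate title")]
frog = [("example.com/url1", "broken link")]

merged = merge_reports(seomoz, google, frog)
# example.com/url1 ends up with exactly two pending issues
```

Storing issue types in a set per URL is what makes the SEOmoz/Google overlap collapse into a single entry while still keeping the broken-link finding separate.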
I would also like to see historical information on each URL, such as whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e., a 4XX or 5XX error) and is now fixed.
Finally, I would like not to be bothered with the same issue twice. As Google is incredibly slow at updating its issues summary, I would like to avoid importing duplicate issues (the tool should recognize that the URL is already in the DB and that the issue has been resolved).
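As a rough illustration, even a small SQLite table with a uniqueness constraint gives that "don't bother me twice" behavior: re-importing an issue that is already in the database, open or resolved, is a no-op. The table and column names here are made up for the sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # a real tool would use a file-backed DB
conn.execute("""
    CREATE TABLE issues (
        url        TEXT NOT NULL,
        issue_type TEXT NOT NULL,
        status     TEXT NOT NULL DEFAULT 'open',
        first_seen TEXT DEFAULT CURRENT_TIMESTAMP,
        UNIQUE (url, issue_type)
    )
""")

def report(url, issue_type):
    # INSERT OR IGNORE silently drops a re-import of a known issue,
    # including one that has since been marked resolved.
    conn.execute(
        "INSERT OR IGNORE INTO issues (url, issue_type) VALUES (?, ?)",
        (url, issue_type),
    )

report("example.com/url1", "duplicate title")
report("example.com/url1", "duplicate title")  # Google re-reports it later
count = conn.execute("SELECT COUNT(*) FROM issues").fetchone()[0]
# the duplicate import did not create a second row
```

The `status` column is what would let the tool tell "already resolved, don't re-open" apart from "still open" when deciding what to show in the pending queue.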
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me.
Bonus bonus for any tool that is smart enough to check incoming issues and mark them as resolved (for instance, if a URL has a 403 error, the tool would check on import whether it still resolves as a 403; if it does, it would be added to the issue queue, and if not, it would be marked as fixed).
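That import-time check could look something like the sketch below, using only the Python standard library. `current_status` and `triage` are hypothetical names, and the fetch function is passed in as a parameter so the decision logic can be exercised without a live network:

```python
import urllib.error
import urllib.request

def current_status(url, timeout=10):
    """Return the HTTP status code the URL resolves to right now."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4XX/5XX responses arrive as HTTPError

def triage(url, reported_status, fetch=current_status):
    """Queue the issue only if the reported error still reproduces."""
    if fetch(url) == reported_status:
        return "queued"  # e.g. still a 403: add it to the issue queue
    return "fixed"       # error no longer reproduces: mark it resolved
```

In real use, `triage("http://example.com/page", 403)` would hit the live URL at import time; batching these checks and rate-limiting them would matter when importing thousands of rows.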
Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and the duplicates created by using multiple tools?
Thanks!
-
Maybe I don't fully appreciate the power of Excel, but what I am envisioning seems to require more than Excel can provide.
Thanks for the suggestion though. I will think about it some more.
-
I think you're looking for Microsoft Excel
You might try an issue tracker like Redmine. Beyond that, no; it sounds like you would want a custom tool, since there isn't enough demand for this kind of thing.