Does anyone have a good program they use for full site audits?
-
I'm looking to find a program that will do the following:
- Scan for page errors including code issues, hosting issues, redirect issues, etc.
- Pages missing Google Analytics
- Google+ Local audit to identify issues with NAP, citations, category selection, etc.
- Find pages with title issues, including missing or duplicate page titles and titles that are too short or too long, plus header tag issues such as missing H1 tags
- Meta description issues, including missing or duplicate meta descriptions and meta descriptions that are too short or too long
- Link issues, including broken internal or external links and missing anchor or ALT text
- Identify internal or external links using rel="nofollow"
- Image issues, such as missing ALT or title text and broken images
- Identify pages using Schema.org microdata
I know there are probably a couple of programs that will each do little bits here and there, so I'm open to suggestions.
Thank you.
-
I don't really want to add anything more; the one-word answer to your question is "Screaming Frog"!
Hope this helps!
-
Thank you Dirk for the advice. This is very helpful.
-
Just an add-on: Screaming Frog gives you a good overview of the health of your site if you go to the Reports link on the main menu and select "Crawl Overview".
Then, if you want to be sure you've covered everything, I highly recommend this Moz post: http://moz.com/blog/how-to-perform-the-worlds-greatest-seo-audit
-
As Dirk recommends, you're probably going to need more than one program to cover this list. The Moz tools (Moz Local for NAP issues, Moz Analytics for SEO issues) would also help. Another crawler option is Xenu Link Sleuth, while Google Webmaster Tools, Bing Webmaster Tools, and Yandex Webmaster can also help.
If your points were numbered, Moz Analytics would address: part of 1 (redirects), 4, 5, 6, and 7.
-
Hi
You should check out Screaming Frog - http://www.screamingfrog.co.uk/seo-spider/#download - which can do most of the tasks you list. It's free to try (crawling up to 500 URLs) and £99 for an annual licence. I'm not sure about the local stuff, but all the other points you mention can be done.
To check for Google Analytics, you will have to add two custom filters (Contains / Does Not Contain "ua-xxxxxxx") to see which pages do and do not have the tag on them.
The same goes for Schema.org markup: add the markup you want to check as a custom filter, and review the pages that do not have that markup in their HTML.
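If you'd rather script these checks yourself over a list of page sources, the custom-filter idea boils down to a couple of pattern matches per page. Here's a minimal Python sketch of that approach; the tracking-ID pattern targets classic "UA-" style Analytics IDs, and the sample HTML snippets are hypothetical, purely for illustration:

```python
import re

# Pattern for classic Google Analytics tracking IDs (analytics.js / ga.js era),
# e.g. "UA-1234567-1". Adjust to taste for other tag formats.
GA_PATTERN = re.compile(r"UA-\d{4,10}(?:-\d+)?")
# Any reference to schema.org in the markup counts as structured-data usage here.
SCHEMA_PATTERN = re.compile(r"schema\.org", re.IGNORECASE)

def audit_page(html: str) -> dict:
    """Mimic Screaming Frog's Contains / Does Not Contain custom filters:
    report whether a page's HTML carries an Analytics tag and Schema.org markup."""
    return {
        "has_analytics": bool(GA_PATTERN.search(html)),
        "has_schema_markup": bool(SCHEMA_PATTERN.search(html)),
    }

# Hypothetical page snippets to show the two checks firing independently:
tagged = '<script>ga("create", "UA-1234567-1", "auto");</script>'
marked_up = '<div itemscope itemtype="https://schema.org/LocalBusiness"></div>'

print(audit_page(tagged))     # Analytics present, no schema markup
print(audit_page(marked_up))  # schema markup present, no Analytics
```

You'd feed this the raw HTML of each crawled URL and collect the pages where either flag comes back False, which is exactly the report the two Screaming Frog filters give you.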
rgds,
Dirk