Recommended log file analysis software for OS X?
-
Due to some questions over direct traffic and Googlebot behavior, I want to do some log file analysis. The catch is this is a Mac shop, so all our systems are on OS X. I have Windows 8 running in an emulator, but for the sake of simplicity I'd rather run all my software in OS X.
This post by Tim Resnik recommended Web Log Explorer, but it's for Windows only. I did discover Sawmill, which claims to run on any platform.
Any other suggestions? Bear in mind our site is load balanced over three servers, so please take that into consideration.
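For what it's worth, logs from a load-balanced setup usually just need concatenating before analysis (e.g. `cat server1.log server2.log server3.log > combined.log`), and the stock tools on OS X can then answer the Googlebot question without any Windows software. Below is a minimal Python sketch; the sample lines and the "combined" Apache/nginx log format are assumptions, so substitute your servers' actual merged file:

```python
import re
from collections import Counter

# Stand-in for the merged access log of the three load-balanced servers.
# Assumes the common "combined" Apache/nginx log format.
sample_log = """\
66.249.66.1 - - [10/Oct/2013:13:55:36 -0700] "GET /products HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [10/Oct/2013:13:55:40 -0700] "GET /about HTTP/1.1" 200 1200 "http://example.com/" "Mozilla/5.0"
66.249.66.1 - - [10/Oct/2013:13:55:44 -0700] "GET /products HTTP/1.1" 304 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<url>\S+)[^"]*" '
    r'(?P<status>\d+) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

# Tally the URLs requested by anything claiming to be Googlebot.
googlebot_urls = Counter()
for line in sample_log.splitlines():
    m = LINE.match(line)
    if m and "Googlebot" in m.group("ua"):
        googlebot_urls[m.group("url")] += 1

print(googlebot_urls.most_common(10))  # -> [('/products', 2)]
```

On a real merged log, swap `sample_log.splitlines()` for iterating over the open file object.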
-
Hello! I just came across your question. Here is a recent Moz post of mine on how to use log files for technical SEO -- the example screenshots are from the SEO dashboard in Logz.io. Feel free to sign up at our site and test our public beta to see if it works for you.
Disclosure: Yes, I work for the company.
-
Hiya ufmedia,
Funny you should bring this up; I just spent a few hours pulling my hair out trying to get a log parser to work on my Mac. Ultimately, I went back to my PC and ran Web Log Explorer. I have heard good things about Splunk, but it seems too robust for SEO purposes alone. Their free version allows for 500 MB per day; if you are under that, it might be worth giving it a go. Sawmill looks like it could do the trick, but may require a decent amount of setup. Thanks for the tip! I will check it out.
Thanks!
Tim
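One wrinkle worth flagging for anyone doing this analysis: user-agent strings can be spoofed, so "Googlebot" lines shouldn't be trusted on their own. Google's documented verification method is a reverse DNS lookup confirming the host falls under googlebot.com or google.com (followed by a forward lookup to confirm the IP, omitted here for brevity). A small sketch; the function name and the injectable `resolver` parameter are mine, added so it can be exercised without network access:

```python
import socket

def is_verified_googlebot(ip, resolver=socket.gethostbyaddr):
    """Check a Googlebot claim via reverse DNS, per Google's documented
    procedure: the PTR hostname must be under googlebot.com or google.com.
    A forward lookup on that hostname should also confirm it resolves back
    to the same IP; that second step is omitted here."""
    try:
        host = resolver(ip)[0]
    except OSError:  # no PTR record, lookup failure, timeout, etc.
        return False
    return host.endswith(".googlebot.com") or host.endswith(".google.com")
```

Calling `is_verified_googlebot("66.249.66.1")` with the default resolver performs a live DNS lookup.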
Related Questions
-
Several Items in the Organization schema structured file
Hi Moz community! Could you please help me with this issue? I have implemented Organization schema on the website, but given the structure I cannot mark up the data just once, so now I have 2 items of the Organization schema type on a page. The questions are: 1. Does Google consider both of them? 2. Is it OK to have several items of one schema type on the page? Thank you
Technical SEO | juicefromtheraw1
-
Fetch as Google - stylesheets and js files are temporarily unreachable
Fetch as Google often says that some of my stylesheets and JS files are temporarily unreachable. Is that a problem for SEO? These stylesheets and scripts aren't blocked, and Search Console shows that a normal user would see the page just fine.
Technical SEO | WebGain
-
Any recommendations for specialist Magento dedicated hosting in UK/Ireland?
Anyone with experience with specialist Magento dedicated hosting in UK/Ireland? Just getting a few quotes and looking for recommendations (and any I should avoid). Thanks
Technical SEO | PaddyDisplays
-
WordPress WooCommerce recommended SEO URL structure
Hi Mozzers! Thanks for looking. I have a new shop in development (http://www.vintageheirloom.biz), built on WordPress & WooCommerce. I've asked WooCommerce whether it is possible to remove the 'shop' and 'product-category' categories. They say it is, but it isn't recommended; it can slow site speed and create possible duplicate pages. I'm wondering what seasoned SEO experts' opinions are on my particular structure? I've heard that a flat structure is recommended, but ecommerce shops, as I understand it, pose their own issues, so any feedback would be appreciated. Here are some URL examples:
http://vintageheirloom.biz/shop/bags/ - the bags category
http://vintageheirloom.biz/product-category/bags/shoulder-bags/ - shoulder bags, a child of the bags category
http://vintageheirloom.biz/shop/2-55-bags/vintage-chanel-caviar-skin-2-55-bag/ - a product
The last URL contains the category '2-55 bags', and the product's name also includes the phrase '2-55 bag'. Should this level of repetition be avoided, or is it best to keep the whole phrase 'vintage-chanel-caviar-skin-2-55-bag/' for SEO purposes? Thanks for any help you can give me around this issue! Kevin
Technical SEO | well-its-1-louder
-
.htaccess file
I need to return a 404 error for web pages which do not exist; this needs to be done in the .htaccess file. I am using a Linux server. The web pages I want to handle are my domain name followed by a question mark, e.g. www.mydomain.com/?dfdds. I am using the following snippet in my .htaccess file; it redirects to bing.com so far. Please tell me how to change the snippet so that it returns a 404 error page instead.
==========================
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
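One common way to answer the question above: mod_rewrite can return a 404 directly by using `-` as the substitution with a status code outside the 300-399 redirect range, which drops the substitution and stops rewriting, so the server's normal 404 page is served. An untested sketch, assuming Apache 2.x with mod_rewrite, placed in the docroot .htaccess (where a request for the site root matches the empty pattern `^$`):

```apache
RewriteEngine On

# Serve the normal 404 page for root requests that carry any query string,
# e.g. www.mydomain.com/?dfdds
RewriteCond %{QUERY_STRING} .
RewriteRule ^$ - [R=404,L]
```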
Technical SEO | semer
-
Broken links: shall I put them in my .htaccess file to regain link juice?
Hi, when I had to rebuild my website after my hosting company made an error, I lost over 10,000 pages and many thousands of links coming to my site. What I want to know is: instead of trying to recreate those pages, which would take me a long time, should I put redirects for them into my .htaccess file and have them point back into my site? For example, if I have a link coming to my site to an article such as 'holidays in Benidorm are not selling well', would it be a good idea to have that link pointed at the main Benidorm section, which is Benidorm news? And if I had an article which was 'people are finding it hard to lose weight', instead of writing a new article could I have the link pointing to my health section? If this is the correct way to grab back some link juice, would it slow my site down, and how many links could I put in my .htaccess file? So what I am trying to say is: if I put, say, 1,000 redirects into my .htaccess file, would it slow my site down, and is this a wise thing to do, or should I just let the links go?
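A pattern often suggested for situations like the one above is to map each lost URL (or each group sharing a prefix) to the closest surviving section with a 301 via mod_alias in .htaccess. A hedged sketch with made-up URLs:

```apache
# Individual lost articles point at the nearest relevant section.
Redirect 301 /articles/holidays-in-benidorm-not-selling-well /benidorm-news/
Redirect 301 /articles/hard-to-lose-weight /health/

# Whole groups of lost URLs that share a prefix can use one pattern.
RedirectMatch 301 ^/articles/benidorm- /benidorm-news/
```

On performance: .htaccess rules are parsed on every request, but even around a thousand plain Redirect lines is usually a small overhead; moving them into the server config avoids the per-request parse entirely. On value: redirects pass the most benefit when the target is genuinely relevant, and Google may treat redirects to unrelated pages as soft 404s, so pointing everything at one page is best avoided.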
Technical SEO | ClaireH-184886
-
What can I do if Google Webmaster Tools doesn't recognize the robots.txt file?
I'm working on a recently hacked site for a client, and in trying to identify how exactly the hack is running I need to use the Fetch as Googlebot feature in GWT. I'd love to use this, but it thinks the robots.txt is blocking its access, when the only thing in the robots.txt file is a link to the sitemap. Under the Blocked URLs section of GWT it shows that the robots.txt was last downloaded yesterday, but that's incorrect information. Is there a way to force Google to look again?
Technical SEO | DotCar
-
High-PR .doc files
I saw that the website www.comunicatedepresa.net outranks www.comunicatedepresa.ro for the term "comunicate de presa" in the google.ro SERP, even though the .ro site beats the .net in every SEO indicator (links, linking domains, Facebook likes, G+, on-page, etc.). I saw that site:www.comunicatedepresa.net returns a lot of *.doc files with a title that contains the keyword ("comunicate de presa"), e.g. www.comunicatedepresa.net/worddoc/1485/. It seems a little suspicious to me. Has anyone seen this before (Google giving higher importance to .doc files)? Does anyone know why the .net site is ranking better?
Technical SEO | seo.academy