James, can you elaborate a bit on your answer?
Posts made by JaredMumford
-
RE: Is there a way to track mobile rankings vs desktop rankings in Moz?
-
RE: In need of guidance on keyword targeting
It looks like these guys do a lot more than just concrete repair. For that reason alone, I wouldn't try to fully optimize the home page for just concrete-related keywords.
Since the site itself has very low metrics to begin with, you basically have a clean slate. I'd optimize all of the services landing pages for their respective keywords (i.e. concrete repair, concrete repairs, concrete repair contractors for the concrete page, though I would not triplicate any one keyword), making sure they had good titles, a good amount of unique copy and so on.
It's a natural approach to the taxonomy, which always works well for Google placement. It won't get you to rank on on-page alone (except maybe for some of the more obscure services), so some inbound links will probably be needed.
Good luck!
-
RE: Salvaging links from WMT “Crawl Errors” list?
Hi Gregory -
Yes, as Frederico mentions, you do not have to put the RewriteCond before every rewrite - since the .htaccess is in your root, it's implied. You might need to do this if you are creating multiple redirects (www to non-www, etc.).
Also, Frederico is right - this isn't the best way to deal with these links, but I use a different solution. First I get a flat file of my inbound links using other tools as well as WMT, and then I run them through a test to ensure that the linking pages still exist.
Then I go through the list and remove the scraper / stats sites like webstatsdomain, Alexa, etc. so that the list is more manageable. Then I decide which links are OK to keep (there's no real quick way to decide, and everyone has their own method). But the only links that are "bad" would be ones that may violate Google's Webmaster Guidelines.
Your list should be quite small at this point, unless you had a bunch of links to a page that you subsequently moved or whose URL you changed. In that case, add the rewrite to .htaccess. For the remaining list, you can simply contact the sites, notify them of the broken link and ask to have it fixed. This is the best-case scenario (instead of having the link go to a 404 or even a 301 redirect). If it's a good link, it's worth the effort.
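For reference, the "add the rewrite to .htaccess" part can be as simple as one line per moved page (a minimal sketch - the paths below are placeholders for your actual old and new URLs):
Redirect 301 /old-page-url/ http://www.yoursite.com/new-page-url/
That sends both visitors and link equity from the dead URL to its replacement.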
Hope that helps!
-
RE: Disavow Tool - WWW or Not?
To clear up any uncertainty, I think there are two questions being asked:
- Link to be disavowed: Do I disavow both the www and non-www versions of a bad link?
- Site you own: Which site in webmaster tools do I upload the disavow list to - www or non-www?
The link to be disavowed is an easy answer, because in most cases if you want a link disavowed, you probably don't want any link from that domain (because it's suspect, de-indexed, etc.). Therefore you can simply blanket it with domain:badwebsite.com. This will be sure to catch any link from that site to yours, regardless of the subdomain (i.e. www.badwebsite.com, ww2.badwebsite.com, forum.badwebsite.com, etc.).
Answer #2 isn't quite as easy. The safest (and arguably proper) way is to link mine both the www and non-www versions of your website and treat each as a separate site (as Google does). Even if you are using 301 redirects or canonicals, I still recommend this method. In many cases, one version will have a much smaller backlink volume. In any case, pick out the bad links and try to get them removed by emailing the websites. Once the attempt has been made, compile the remaining backlinks (still in separate lists for www and non-www) and upload them to their respective disavow tool areas.
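For anyone unsure what the upload itself looks like, a disavow file is just a plain .txt list with one entry per line, plus optional # comments - something like this (the domains and dates below are made-up examples):
# Contacted webmaster on 10/15, no response
domain:badwebsite.com
# Individual URLs we could not get removed
http://www.anotherbadsite.com/spammy-page.html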
-
RE: Google Adwords - trying to understand the figures...
Remember, Google uses LSI in their algorithms - so usually when you see strange discrepancies like this it means that the terms are being treated as semantically related. E.g. - if you search for "ready mix concrete" you'll see both terms (mix / mixed) in bold. Same for forklift truck / hire - you'll see both in bold.
I can't say that I know if this is the reason - it's just food for thought. I no longer use the Google Keyword Tool to estimate traffic as it can be really off, but what it still does well is measure relative traffic (keyword x has 2.5x the traffic of keyword y, and so on).
-
RE: Caching Problem !
Hi Shubham - I see this domain cached on Dec 29:
http://webcache.googleusercontent.com/search?q=cache%3Awww.glanceseo.com/
Was it a specific page you were inquiring about?
-
RE: Removing URL Parentheses in HTACCESS
I thought I'd come back and re-post the solution in case this shows up in SERPs or any other Moz members are looking for this answer (courtesy of Noah Wooden of Izoox). HTACCESS:
<IfModule mod_rewrite.c>
RewriteEngine On
# Strip a set of opening-closing parentheses from the URL and 301 redirect.
RewriteCond %{REQUEST_URI} [()]+
RewriteRule ^(.*)[(]+([^)]*)[)]+(.*)$ /$1$2$3 [R=301,L]
</IfModule>
Remember to put this in the proper 'order' in your htaccess file if you are doing any other redirecting. The code above 301 redirects URLs with parentheses to the exact same URL minus the parentheses.
-
RE: Removing URL Parentheses in HTACCESS
Thanks Merlin - I'll have their programmer try this.
-
RE: Removing URL Parentheses in HTACCESS
Hi Merlin - thank you.
Here is an example:
www.domain-name.com/category1/subcategory/product-name-details-(model-number)-length
needs to change to:
www.domain-name.com/category1/subcategory/product-name-details-model-number-length
Any suggestion would be great. Their programmer is having trouble creating a rule.
Thanks in advance
-
Removing URL Parentheses in HTACCESS
I'm reworking a website for a client, and their current URLs have parentheses. I'd like to get rid of these, but individual 301 redirects in htaccess are not practical, since the parentheses are located in many URLs.
Does anyone know an HTACCESS rule that will simply remove URL parentheses as a 301 redirect?
-
RE: Canonical Related question
Hi Manish - this is exactly how you use the canonical tag.
Your other option would be to use rel next / prev, but canonical works just as well and is what I use, unless the various "pages" (page 2, 3, 4, etc.) are actually also ranking.
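For anyone finding this later, the two options look like this in the <head> of a paginated page (page 2 in this made-up example):
<!-- Option 1: point the paginated pages at the main page with a canonical -->
<link rel="canonical" href="http://www.example.com/category/" />
<!-- Option 2: rel prev / next, telling search engines the pages are a series -->
<link rel="prev" href="http://www.example.com/category/" />
<link rel="next" href="http://www.example.com/category/page/3/" />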
Cheers!
-
RE: Choosing an SEO Company
I agree with all of the above. The reality is that SEO's now get the 'mechanic' rap - you don't know what they're doing under the hood and they charge an arm and a leg.
I think first and foremost you have to look at the client portfolio and do reference checks. Unfortunately, companies who provide these are never the cheapest (for good reason), but you get peace of mind.
Some SEO companies claim that they can't provide references because of their clients' privacy. This is true in some cases, but any SEO company that has been given permission to display a client's logo should have permission to give a reference for at least one of them, if not 50%.
Speak to the reference - and check the rankings they received. It's a surefire way to know if the company you are considering is worth their salt.
A good question, and one that many businesses don't ask - and they pay the price. I've heard it a thousand times, and almost every client we get has trust issues because they've already been burned.
Good luck!
-
RE: Local SEO-How to handle multiple business at same address
FYI: the unique suite # was requested simply so that the address was 'unique' from Google's perspective, assuming that Google understands suite #'s. The mail was always forwarded to the proper business even with the duplicate suite #, since this was part of the virtual office service.
-
RE: Landing Page URL Structure
From a strict A/B standpoint where the two variables are:
www.domain.com/state/california
You will see no discernible difference in SEO results. That being said, if you plan to expand the URLs later, i.e.
www.domain.com/state/california/santa-monica/our-service-keywords, etc., then you should probably consider how long they will be and factor that into your decision.
-
RE: Local SEO-How to handle multiple business at same address
Thumbs up to Miriam, who gives a lot of good advice here. And definitely, merging is the worst-case scenario. Since this discussion is still going, and since it seems your client can't simply go get a real business address (and the overhead that comes with it), I'll give a real case study as an example.
We had a client that was in a competitive niche and provided a service (not a product). She had a virtual address, and by that I mean they paid a monthly fee to the location; as part of that fee the location would forward any inbound mail, and the client could also use meeting rooms, offices or boardrooms a certain number of times per month. Other services were available, such as phone answering, etc.
When she became a client, the first thing we realized was that she had the same suite number as all the other businesses that used the same 'virtual office' service. The client's previous SEO had already started citation work, but didn't warn them about merging or any of the other associated problems.
Anyway, what we did first was request a different, unique suite number from the service, which they provided at no extra cost. Then we bought a local number and forwarded it to her home - a local transfer, since she was indeed in that city but worked from home unless she needed to meet clients, etc.
So now we had a unique local address with a unique local phone number. The last thing we had to do was simply mine for old citations and have them all changed.
This worked, and still does work, but we only did it after explaining to the client that it was not the best scenario for sustainability in local SEO. As per my first comment, at any time Google could simply omit that address and all businesses that claim it as a brick-and-mortar address.
Best of luck!
-
RE: Local SEO-How to handle multiple business at same address
I agree with bjgomer13 in saying that ranking locally with shared addresses does still work. The only caveat is that Local is always changing, and it's hard to know if this is something Google will target - not because of clients like yours, but because of businesses that abuse it, just as they did with PO boxes earlier. As always, the advice you get is confusing because no one can 'predict' what's going to change, and this was a pretty shaky year for algo changes, so everyone is being careful.
This response probably just confuses things more!
-
RE: Local SEO-How to handle multiple business at same address
This is a tough one. The worst thing to do is to start with citations, then find out they aren't working, and then fix the issue and start all over again (now having two NAPs floating around).
Some businesses use virtual offices in an attempt to rank in different cities or areas. If this is the case with your client, it never hurts to contact that service and explain that you must have a unique suite number. I've found in some cases they will be quite accommodating.
As for the effects - I've performed local optimization for clients in this scenario and it still worked fine (and the other businesses using the same virtual office were also in the maps for their keywords), but with constant changes in Local, it's risky (in my opinion) to continue without getting a unique address first.
Just my 2 cents!
-
RE: Client error 404
Hi Tobias - how are you checking 404s? Are you using SEOMoz crawl diagnostics?
If so, export the CSV! Many people don't do this, yet there is a plethora of info there.
- Open the CSV and sort by the "4xx" column so that all of the 'TRUE' cell values are at the top.
- Delete / Hide / Move all columns except: URL, 4xx, TIME CRAWLED and REFERRER
Now you can see which URLs are 404'ing, how the crawl found them (where the link is that sent the crawler there) and what time they were crawled (in case you've fixed it since).
It's easy and it's thorough.
-
RE: Moving our current homepage to a new URL
To reiterate Dana, we definitely need more info about the specifics, because yes, there could be disastrous effects if this is not done properly.
Besides the meta refresh, there are other things to consider. Probably the most important is the positioning of the home page in search results for the 'product'. If you are ranking top 5 for your product keywords, and the ranking page is your home page, you may find that when you rework your home page to reflect your brand, you lose that positioning.
If you were moving from one landing page to another (i.e. /products to, say, /products/buy), it would be a bit different, because you could 301 the /products page and officially tell Google and other SEs that you are simply moving the page. You cannot, obviously, do this with the root domain (301), or that would defeat the branding purpose.
I would definitely check positioning and revenue for your money keywords first. Also, once you move the product, I would at the very least have a link in the main navigation of the home page that links directly to the product with the appropriate anchor. If you only have one product, or one product set, I would also encourage you to optimize the URL (i.e. instead of /product it could be /
Still need more info to make solid recommendations.
-
RE: If I am changing my domain for my website and want to keep using the same Google Analytics account to keep the data from the old domain. How should I proceed?
Right - Analytics doesn't actually use the domain in the tracking code unless you are doing special types of tracking, so as long as you are doing regular Analytics tracking, you can simply port over the code.
Go take a look at your code, and see if there are any setDomain parameters or anything else where your domain is listed right in the tracking code. If not, you're OK.
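If you're not sure what to look for, here's roughly where it would appear in the classic asynchronous snippet (the account ID and domain below are placeholders) - if there's no line like _setDomainName, you're on the default setup and porting the code is safe:
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
// only present if tracking was customized for a specific domain / subdomains
_gaq.push(['_setDomainName', 'olddomain.com']);
_gaq.push(['_trackPageview']);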
Also, I would go to the Analytics profile and then ADMIN > PROPERTY SETTINGS and change the default URL (though this doesn't affect the tracking code, it's good for continuity).
-
RE: INTERNAL ANCHOR TEXT LINKS
Hi jdcline -
I'd probably need a bit more info to provide an answer. Obviously you are aware of the Penguin update last Friday (http://www.seomoz.org/google-algorithm-change).
My initial response would be that you should create pillar pages, where the home page links to 10 or so main categories, and then those pages link to the next-level pages, instead of linking to all 50 pages from the home page.
Definitely need more info though - does each page link to every other page (all 50)? Are the links to the 50 pages from the home page in the footer? Top navigation?
-
RE: If I am changing my domain for my website and want to keep using the same Google Analytics account to keep the data from the old domain. How should I proceed?
Hi Brian -
Simply use the same Analytics account - you can usually copy and paste the code, but check whether your tracking setup is different (maybe you are tracking a subdomain now, or maybe you are an ecomm site where variables need to be edited for ecomm tracking).
When you add it to the new site, make an annotation in Google Analytics showing that you made the domain change on that date (it's just handy when you're looking back at October a year from now).
Don't forget to 301 your old pages to your new ones!
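If the whole site is moving to the new domain, the blanket version of that 301 in .htaccess looks roughly like this (a sketch - olddomain.com and newdomain.com are placeholders, and every path is passed through to the same path on the new site):
RewriteEngine On
# Send every request on the old domain to the same URL on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]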
-
RE: Backlinks in client website footers - best strategy?
SEO's are wary, and for good reason - it wasn't long ago that 'footer' links were an over-used tactic that was quashed, resulting in the start of the 'blogroll' type links. Like I said, opinions are going to differ wildly between SEO's on this one. For us, when we work with a client, we know and control their link profile, so we know that their site(s) are very clean. As such, we feel that with a footer link to our business (just as web design companies do) we are not at risk.
Like you, we see this everywhere in our competitor link profiles, and in fact we see large variations in local rankings when links are added to or removed from their local clients' sites or ours.
So it's going to be your call in the end - opinions will differ, and even I might change my mind within the next few months depending on what happens in the SE Algo world. But for now this is a practice we are actively using.
-
RE: Should i remove sitemap from the mainsite at a webshop (footer link) and only submit .XML in Webmaster tools?
Hi Mickel -
The answer is both. Keep your HTML sitemap, and keep the link in the footer as is. Crawlers will look at these, but they are generally more for human visitors.
Then create your XML sitemap (www.url.com/sitemap.xml) and verify it in Google Webmaster Tools.
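If you end up building the XML file by hand rather than having the shop platform generate it, a minimal valid sitemap looks like this (the URL and date are just placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.url.com/some-product-page/</loc>
    <lastmod>2012-10-01</lastmod>
  </url>
</urlset>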
Hope this helps!
-
RE: Sorting Dupe Content Pages
CMC is correct - that's how I do it for larger sites.
- delete all columns except the URL column (col A) and the duplicate pages column (now Col B)
- in cell C2, enter this formula: =LEN(B2). It will count the characters in the dupe pages cell
- drag that formula down to the last row
- select all three columns and sort Col C from largest to smallest
Obviously this isn't going to give you an exact number of dupe pages since URL text strings can vary in length, but it does give you a pretty good idea of the worst offenders....
-
RE: Testing for duplicate content and title tags
You can always check by testing in your browser, but the best way is to check the header response to make sure the server is sending the proper response (a 301) - your landing pages look good (see below). I use Live HTTP Headers, which is a Firefox plugin - here's what it tells you:
http://pharmacy777.com.au/our-pharmacies/applecross-village/
GET /our-pharmacies/applecross-village/ HTTP/1.1
Host: pharmacy777.com.au
User-Agent: Mozilla/5.0 (Windows NT 6.0; rv:15.0) Gecko/20100101 Firefox/15.0.1
HTTP/1.1 301 Moved Permanently
Date: Thu, 04 Oct 2012 03:23:17 GMT
Server: Apache/2.2.22 (Ubuntu)
Location: http://www.pharmacy777.com.au/our-pharmacies/applecross-village/
So the redirect is working. The only thing I noticed was that the home page instantly switched to www and didn't even return a 301, so it appears you may have implemented a redirect there outside of htaccess.
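If you're not on Firefox, the same check can be run from a command line with curl, where -I asks for the headers only:
curl -I http://pharmacy777.com.au/our-pharmacies/applecross-village/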
If your report is still showing duplicates, make sure that it's not the trailing slash. Your URLs can be loaded as such:
http://www.pharmacy777.com.au/our-pharmacies/applecross/
http://www.pharmacy777.com.au/our-pharmacies/applecross
The best way to find out if the SEOMoz report is counting these as dupes is to export the crawl report to CSV (top right of the crawl report). Then go all the way to the far-right column called 'duplicate pages' and sort it alphabetically. This column will show you all of the duplicate URLs for each particular URL row. Lots of times you can find little 'surprises' here - that CSV report is priceless!
-
RE: Testing for duplicate content and title tags
Update: Reporting can be historic - so you are probably looking at a report from an older crawl.
-
RE: Testing for duplicate content and title tags
Hi Claire - we need the URL of your site to check the headers on the 301 redirect!
Definitely a good way to fix this is via htaccess, as you are suggesting you did. When I get a new client it's in the campaign startup list, and it works well. Make sure there aren't any other issues, like the infamous trailing slash, causing duplication. If you provide the URL, a quick check can be made.
-
RE: All Keywords Down.
I've never gotten used to using Allow, although I know many people do. I simply use:
User-agent: *
Disallow: /cgi-bin/
Sitemap: http://www.aa-rental.com/
Where User-agent: * means all crawlers;
Where Disallow: /cgi-bin/ means I don't want them crawling that folder (put any folder name there), but they can crawl anywhere else;
And Sitemap: points to my XML sitemap.
-
RE: ECommerc site redirect to external site when add to cart. Need HELP to track sales!!!
The way you have it set up, you'll only see referrer traffic from your old site. So you can see how many people came from your site and ended up converting, but you'll never know precisely what source those visits originated from using this method.
Technically in Analytics you can track cross-domain, but if you aren't an expert you would probably need to hire one. Basically, you tell Google to track both domains, and site A needs to be able to transfer its cookie data to site B. You would then set up a custom filter that shows the full URI so that you know which domain is which when looking at landing pages in Analytics. It's probably better explained in the Analytics dev guides here.
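As a rough sketch of what that setup involves with the classic ga.js tracker (the account ID and domains are placeholders, and your cart provider's own docs should take precedence):
// In the tracking snippet on both sites, using the same account ID:
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
_gaq.push(['_setDomainName', 'yourstore.com']);   // the site's own domain
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_trackPageview']);
// On the 'add to cart' link that leaves your site, pass the cookie data along:
// <a href="http://cart.example.com/checkout" onclick="_gaq.push(['_link', this.href]); return false;">Add to cart</a>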
It gets tough to troubleshoot cross-domain tracking, but when it works it's great for 3rd party carts, etc. Good luck!
-
RE: All Keywords Down.
Don't look for a question mark; go to the site's profile and look for an email - it will tell you that Google has found unnatural links. Keep in mind that sometimes these emails come 4 or 5 days after the penalty has been applied, so you may have to wait a few days (unless this happened some time ago).
If it's Penguin, you won't get a notification. Others may disagree, but the Penguin recoveries I've performed were almost exclusively due to a higher-than-'natural' proportion of keyword anchor text links in the link profile. If you check your backlinks in OSE and find that you have a high percentage of links that all use your money keyword, then you will need to have those backlinks diversified.
-
RE: Removing old versions of a page.
This is an age-old question (kind of). Your answer lies in weighing the pros and cons of redirection and consolidation. The questions you need to ask are:
1. Are there inbound links targeting any of the 'old' pages in question? To find out, use OSE.
2. Are the "old" pages that are still in the index, ranking for anything of value? Use Analytics CONTENT > LANDING PAGES to check and make sure "old" pages aren't getting visits.
If #1 is yes, then you should look at the value of those links - if there are some really good ones (old, from pages with high domain authority, etc.), then I would consider making a request to those webmasters that the link be changed to the new page URL. This is a courtesy most webmasters will acknowledge, since it saves them having a dead link on their website. If the answer is no and there are no inbound links, proceed to #2!
If #2 is yes, then you should definitely make sure that all inbound links have been changed to the new page before redirecting. However, in addition to redirecting, also make sure that the page you are redirecting to is similar in nature. If the ranking page has a lot of good content, then make sure the new page does as well, etc.
-
RE: Google Penguin Penalty
Sorry guys - must have posted while others were writing. Even if you didn't get the original email - if Google is saying you have unnatural links, then you need to deal with that problem. Submitting a reconsideration request makes no difference if you don't get rid of the bad links.
You will almost certainly need to take action by link mining your website, identifying 'unnatural' links or links from blatantly bad neighborhoods, submitting removal requests to the webmasters, and so on. Document all of your work and then include that information with your reconsideration request. If you do not put the work in, you will not get a positive response.
-
RE: Google Penguin Penalty
Hi Scott -
Eyepaq has it right in that you need to understand if you are under a manual penalty or an algorithmic penalty to know exactly what course of action to take.
Manual Penalty - You'll get an email in WMT if you are under a penalty; this will be due to bad links. You will need to clean up your links, then open that email and click on the 'reconsideration request' link.
Algorithmic - You'll usually see a sudden drop in positioning; in some cases it will affect a string of your best keyword variations while other low-hanging fruit, as it were, still ranks well. For this penalty, check your anchor text percentages first, then go through the other Penguin factors (you can find tons of info in the SEOMoz blog). There is a form online you can use if you "feel your website was unfairly demoted", but I am no longer sure of its efficacy.
Position 46 > 50 or vice versa really means nothing at all, so don't take it as an indicator. Google is always messing around, so 4 positions out of 50 is insignificant. For example, when your site used to be top 10 and then moved to 50, about 40 websites moved "up" a spot. Consistent movement from week to week is a better indicator.
-
RE: Backlinks in client website footers - best strategy?
Honestly, at first glance most SEO's would probably say this is a bad idea, particularly with Penguin - the problem is, it seems to work. I know as an SEO, when I do competitive analysis, some of our competitors have thousands of links that read:
SEO by
And they certainly rank for their respective terms. Opinions may differ among some of the SEO professionals here, but I would argue that unless you have seen a significant decline in positioning for the anchored text, I wouldn't be concerned about it, and I would venture to say that it's probably a positive strategy. The best part about links like these (we do this as well) is that, for the most part, the links are usually very clean.
Hope that helps.
-
RE: Best Anchor Text Practices Post Penguin Panda
Hi Ilya, I replied to another recent question, so I'm pretty sure I know your dilemma.
From what we are seeing, we're averaging thresholds where your keyword anchor is less than 20%, and then using brand and URL anchors, as well as others like "click here", to fill in the rest. Brand is good, but I still wouldn't exceed a threshold of 30% on any one keyword. So for example, for a company called Sparkling Flooring, we would use Sparkling Flooring no more than 30% of the time, www.sparklingflooring.com around 30% (you can go higher with URLs, we just choose not to), and no more than 20% on 'hardwood flooring', etc.
Combine that with excellent unique copy, a good information section like a blog or installation tips section, and other good optimization techniques, and you're golden.
-
RE: Does anyone else love SEOMOZ as much as me?
Official SEOMoz complaint: I never did receive my hug from Roger. Just saying...
Update: Hug received via Twitter!
-
RE: Too many on page links
The user content links are the ones that are putting you 'over the threshold' - just as would be the case if those were products on an ecomm site - in my opinion I would completely disregard that warning for the paginated pages.
-
RE: Penalization for Duplicate URLs with %29 or "/"
Canonical tags should drastically help with this. The % is being generated because the URL is being encoded and has a "(" in it. Have your product pages each contain their own canonical with the URL you want indexed. Not sure which URL to use? Check your internal links and see how your site is linking to your product pages. Presumably it's:
http://www.company.com/ProductX-(-etc/
or
http://www.company.com/ProductX-(-etc
Add this URL as your canonical and the SEs will understand which page is the 'real' page. This will solve both problems from an SEO standpoint. If you want to actually stop the site from doing this, you can remove trailing slashes and encoding using HTACCESS.
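If you do go the htaccess route for the trailing slash, a rough sketch of a strip rule looks like this (test carefully - it applies site-wide, and the condition keeps it from touching real directories):
RewriteEngine On
# 301 any URL ending in a slash to the same URL without it
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]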
-
RE: How does using a CMS (i.e. Wordpress/Drupal) affect backlinks and SEO?
This is probably the most well constructed, and humorous explanation on this that I have ever read. Bravo.
-
RE: Too many on page links
I wouldn't be too concerned about this error (or, more specifically, warning). Which pages are getting them - the separate user submission pages or the paginated pages?
I've had a lot of ecomm clients who have 30 products listed per page, along with collapsible menus that show all of the 'hidden' links in the source, and have not seen any negative effects from this.
Curious on other opinions here.
-
RE: Are meta tags useless?
View the page source (right-click + View Page Source) and look in the <head> section. You'll see it:
<meta name="keywords" content="keyword1, keyword2" /> Tip: if theres a lot of stuff crammed into the HEAD section, just CTRL-F and search for ``` (meta name="keywords") without the parenthesis
-
RE: Are meta tags useless?
You're opening a can of worms with this question!
The efficacy of meta tags is much debated. Most people believe that the keyword meta tag has no effect whatsoever on SEO, and some believe the same to be true for the meta description.
The original purpose of the meta keywords tag was to help Search Engines understand what your page was about. After years of unabashed over-optimization, the tag slowly became less and less of a signal.
The meta description tag is a brief description of the page, and is sometimes used as the description in the SERPs. There are varying arguments on the efficacy of this tag as well, although it can be useful from a clickthrough conversion standpoint.
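For reference, it looks like this (the content text is just an example):
<meta name="description" content="A one or two sentence summary of what this page is about." />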
I'm sure you'll get a lot of varying opinions on this one!
-
RE: If you have multiple schema types on a page, which Rich Snippet will display in Google?
I think there is no definite answer here. I will say, however, that when I add schema to a video that is embedded and has some views (Vimeo, YT), it seems to appear faster in SERPs. So that might be a weighting factor when Google determines whether to use an image or a video.
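As a rough example of the kind of video markup I mean (schema.org VideoObject in microdata - the URLs and text below are placeholders):
<div itemscope itemtype="http://schema.org/VideoObject">
  <meta itemprop="name" content="Example product walkthrough" />
  <meta itemprop="description" content="A short walkthrough of the product in use." />
  <meta itemprop="thumbnailUrl" content="http://www.example.com/thumbs/walkthrough.jpg" />
  <meta itemprop="embedUrl" content="http://player.vimeo.com/video/12345678" />
</div>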
-
RE: Can I write guest blogs on competitor's blogs?
Absolutely, it's fine. If they are local, you run the risk of your article helping your competitor outrank you (see EGOL's hilarious example). But if the competitor is not local, it's a relevant site and a great place for a blog post. Getting them to publish it may not be so easy, however...
-
RE: How does using a CMS (i.e. Wordpress/Drupal) affect backlinks and SEO?
"So I'm just wondering, since CMS pages are pretty much created on spot and not retrieved from a library, how this affects backlinks and anchor text? How exactly does the external website point to yours if the URL is dynamically generated?"
Firstly, different CMS's create pages differently. CMS just means content management system, which means the platform provides a GUI for you to add content or make changes. If you are using WP and creating pages, then these pages will be indexed like any other page, and links pointing to them would simply target the page's URL.
Wordpress uses permalinks and Drupal uses Pathauto to turn platform-generated URLs into SEO-friendly ones. They use an internal redirect, and the resulting URL is what gets indexed in Google. Therefore, you simply treat the resulting URL as the "real" URL, and external links to it work fine.
-
RE: URL Error "NODE"
For whatever reason, it seems like that node is not redirecting to the proper clean URL. It might be because the submission was improperly placed, or because the video doesn't exist. All other user content appears to be redirecting fine from NODE > Clean URL.
I think it's safe to just delete the user content on this one.