Yes, of course. You can book a second, twenty-minute session (though I'm sure you can have longer if needed) using the 'book a session' email link you received previously. I've used this twice before, and Steve was really keen to help.
Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
Hurf
@Hurf
Job Title: SEO Specialist
Company: Visible Search Marketing
Favorite Thing about SEO
On-page perfection!
Latest posts made by Hurf
-
RE: No data for most of my keywords
-
RE: No data for most of my keywords
Hi,
PM sent. If you contact Moz, you can schedule a free walkthrough of the Moz tools, having given them an outline of what you are looking for from the site. This would be a good opportunity to go over issues like this. They have a UK-based representative, Steve Dunn, who is really helpful.
I'd recommend contacting Moz via the blue chat icon for details.
-
RE: No data for most of my keywords
Could you PM me a link to your site so I can try a few things, please?
-
RE: No data for most of my keywords
Here's Moz's explanation of 'No data': "No data means we have not yet collected volume for the keyword" and the expanded answer from their FAQs (https://moz.com/help/guides/keyword-explorer):
What does it mean when a keyword has “No Data” for its volume?
“No data” indicates that we’ve not yet collected search volume information on this keyword. It may have very high or very low volume (more likely the latter than the former, but with many exceptions, especially recently trending keywords or very obscure ones). Over time, we attempt to gather volume data for keywords on which we’ve reported “No Data” so you may see us update these as we gather it (approximately monthly).
As a rule, I usually assume low volume - the fact that these keywords are often "longer-tail" ('large black leather handbags uk', for example) will often confirm that.
There are other keyword research tools you can use to cross-reference, such as SEMRush and http://keywordtool.io but, like most of the best tools, these are paid-for solutions.
And, of course, there's Google's own Keyword Planner: https://adwords.google.com/ko/KeywordPlanner/ This is free, but requires you to sign up for a free Google AdWords account (you don't need to create an AdWords campaign).
Don't forget that if you're not sure, you can always contact Moz for help: https://moz.com/help/contact
I hope that helps.
-
RE: What are best page titles for sub-domain pages?
A site can have multiple sub-domains with as many pages as you like and Google can crawl them, but that doesn't mean it will index them. If you are deliberately producing duplicate content (and that includes slightly re-worked content to include a keyword variation) across your site, Google are going to penalise you for it:
"Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results"
Source: Google Search Console Help - Duplicate Content: https://support.google.com/webmasters/answer/66359
You'd benefit far more from 100 pages filled with exceptional content than from 1,000,000 pages full of zero-value, duplicate content.
Don't just build content that focuses on one keyword, either; instead, build a page around a keyword "theme" and use synonyms you'd expect to use naturally (i.e. in conversation with a human being), referencing your main keyword a couple of times at most - near the top of the page and in the title - and don't plaster it everywhere. Google is smart enough to deal with synonyms - and with the arrival of Google's RankBrain (https://en.wikipedia.org/wiki/RankBrain) even more so. Readers hate keyword-stuffed pages as much as Google do, and both will punish you for using them.
Less is more. Invest your energy in the user's experience, instead.
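Where some duplication genuinely can't be avoided (near-identical product variants, tracking parameters and the like), a rel="canonical" tag tells Google which version to treat as the original rather than leaving it to guess. A minimal sketch, with a hypothetical URL:

```html
<!-- In the <head> of the duplicate/variant page, pointing at the preferred version -->
<link rel="canonical" href="https://www.example.com/products/large-black-leather-handbags/" />
```

That way the variants consolidate their signals into one page instead of competing with each other.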
-
RE: Should I create a menu link for sitemap?
Make sure you're doing this for the right reasons: don't do it in an effort to improve rankings; do it because it improves the user experience, particularly if the site is sizeable. Adding an HTML version of the sitemap can help users find what they want on your site as quickly as possible. This will reduce bounce rates and increase time on site, which can be a signal that your site delivers content that is relevant to the user's search (which is Google's primary objective - if they deliver relevant search results, people will continue to use their service). Everything you do to help your visitors find what they are looking for (be that a product or information) as quickly and as painlessly as possible will benefit you directly. Google will reward you for that.
"Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page)."
If it's a large site, you may want to break this down over several pages: "Limit the number of links on a page to a reasonable number (a few thousand at most)."
I'd be inclined to add a link to this in the footer of the site and/or in an on-site search page.
Source: Google Webmaster Guidelines (under the 'Help Google Find Your Pages' section): https://support.google.com/webmasters/answer/35769?hl=en
Good luck.
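For clarity, the "human-readable" page the quote refers to is just an ordinary page of links - nothing clever. A minimal sketch, with hypothetical URLs and section names:

```html
<!-- A hypothetical HTML sitemap page, linked from the site footer -->
<h1>Site Map</h1>
<h2>Products</h2>
<ul>
  <li><a href="/products/handbags/">Handbags</a></li>
  <li><a href="/products/purses/">Purses</a></li>
</ul>
<h2>Help</h2>
<ul>
  <li><a href="/help/delivery/">Delivery Information</a></li>
  <li><a href="/help/returns/">Returns</a></li>
</ul>
```

Keep it grouped the way your navigation is grouped, so it reads as a map of the site rather than a dump of links.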
-
RE: Would you disavow links that have a Moz Spam score of 5?
No problem. I've expanded upon the original answer to clarify what the Spam Score references. This may be useful. (Read from UPDATE: down).
I hope that helps.
-
RE: Would you disavow links that have a Moz Spam score of 5?
Nope. Moz suggest you don't get too excited about anything below an 8, and even then proceed with caution. Disavowing links is not usually something Google would expect you to need to get involved in, unless you are dealing with thousands of links.
"This [Disavow backlinks] is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool."
From: https://support.google.com/webmasters/answer/2648487
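For reference, if you do eventually decide to disavow, the file Google's tool accepts is plain text, one entry per line - full URLs, or whole domains with a `domain:` prefix, with `#` for comments. A sketch with hypothetical entries:

```text
# Contacted the site owner twice asking for removal - no response
domain:spammy-directory.example
# A single bad URL, rather than the whole domain
http://blog.example/pages/spammy-post.html
```

You upload that file once via the Disavow Links tool; it replaces any previous file for the property.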
EGOL (who is wise in such matters) references this in a recent Q&A post and says he has never disavowed a single backlink. I've only ever done so once, in a panic, and don't see myself doing so again. Much better to concentrate your efforts on countering the spammy links with great content.
UPDATE: Another thing to consider: if you click on Open Site Explorer > Spam Analysis, you'll see a number of these flags are issues you can fix without disavowing links. More often than not, you can fix a few of those. And remember, the Spam Score is not some hard and fast rule that Google follows; the flags are just a set of signals that Moz believe to have a correlation with Google penalties.
Here's the full list of Spam Flags from OSE > Spam Analysis:
- Low MozTrust or MozRank Score: The site link profile is not trustworthy.
- Large Site with Few Links: We found very few sites linking to this site, considering its size.
- Site Link Diversity is Low: The diversity of link sources to this subdomain is low.
- Ratio of Followed to Nofollowed Subdomains: The ratio of followed to nofollowed subdomains linking to this subdomain is outside the normal range of others in our index.
- Ratio of Followed to Nofollowed Domains: The ratio of followed to nofollowed domains linking to this subdomain is outside the normal range of others in our index.
- Small Proportion of Branded Links: Links to this subdomain have low amounts of branded anchor text.
- Thin Content: A subset of pages within this subdomain have little content.
- Site Mark-up is Abnormally Small: There's a high ratio of visible text compared to HTML, JavaScript, etc.
- Large Number of External Links: A subset of pages within this subdomain has a large number of external links.
- Low Number of Internal Links: Pages crawled on the subdomain have a small number of internal links.
- Anchor Text Heavy Page: There's a high ratio of anchor text compared to content text.
- External Links in Navigation: There's a large number of external links within sidebars and footers.
- No Contact Info: None of the pages crawled contain an email address or links to a social profile.
- Low Number of Pages Found: Crawl only gets a valid response for a small number of pages.
- TLD Correlated with Spam Domains: This subdomain is on a top-level domain (TLD) extension often found to be the source of spam links.
- Domain Name Length: This domain name's character count is higher than average.
- Domain Name Contains Numerals: Domain names including numbers are often found to be the source of spam links.

Good luck.
-
RE: 1000 Pages on old website. What to do with the 301 redirects for this domain?
Of course - you've acquired the domain and not the old site; that makes sense. If I were desperate, I would consider scraping what content I could from cached versions of the site (I'd outsource that) - if there are no legal implications in doing so. If that isn't possible or feasible, I'd direct what you can to the most relevant pages where possible and take the hit. I think your plan to create matching pages for the top 50 pages is sound. Whatever you do beyond that with 301s is of limited value if you can't match the content, so in that case I'd consider saving some time and redirecting everything else to your home page (or product overview page, for example, if this is of greater value and has higher engagement potential).
The best you can do in each case is match as closely as you can to the content on the new site. Where that isn't possible, consider the user's experience - can you deliver them to a page of interest where you can engage and potentially convert them into customers? You should always put the user's experience first, as this is what Google values most. After all, they want to do exactly the same for their customer - deliver relevant and engaging content.
Worst case, if you've captured the biggest chunk of the value with those top 50 pages, you're going to salvage some value, at least. Consider the rest a bonus.
Good luck
-
RE: 1000 Pages on old website. What to do with the 301 redirects for this domain?
I'd be extremely reluctant to let any of those old pages die.
I would suggest you move them across to an appropriate section of the site (possibly an archive section, for example, if the content doesn't fit in so well with your new site structure) and create 301s to all of them. (Bear in mind, you will get the best value keeping the content, URL structure, etc. as close to the original as possible to retain the highest value from the redirects - linking to loosely matched pages is less valuable, and matching to unrelated content has negligible value.) Remember, the purpose of the 301 is to indicate that the content you were looking for now lives somewhere else, and then seamlessly guide your visitor to it. Using it in any other way gives the visitor a poor experience, and your engagement statistics will show this. How engaged users are with your content is of significant value in SEO terms.
This assumes, as you state, that the old site was a good match to your new site and there's no detriment to having the old copy in place on your new site. There's no shame in letting links to irrelevant content die - technically, you could return a 410 (Gone) status to indicate that the content has been removed deliberately, but often you'd just 301 these, too, and take a hit on the PR. (https://moz.com/community/q/should-i-implement-301-redirects-vs-410-in-removing-product-pages)
Now that 301 redirects pass on 100% of PageRank, you've got even more reason to maintain the links from old to new. (Caveat: PR is not the only ranking factor, so you're still going to take a bit of a hit when you redirect, but not as much as you will if you let that content wither and die.)
Some useful reading: https://moz.com/learn/seo/redirection
https://moz.com/blog/301-redirection-rules-for-seo
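To make the above concrete: on an Apache server, the matched-page redirects plus a catch-all might look like this in .htaccess (mod_alias syntax; all paths and domains here are hypothetical):

```apache
# Old pages with a close match on the new site: redirect one-to-one
Redirect 301 /old-shop/blue-widgets.html /products/blue-widgets/
Redirect 301 /old-shop/red-widgets.html /products/red-widgets/

# Content that's gone for good: tell crawlers it was removed deliberately
Redirect gone /old-shop/discontinued-range.html

# Everything else: sweep to the home page (keep this after the specific rules)
RedirectMatch 301 ^/old-shop/ https://www.new-site.example/
```

The ordering matters: specific rules first, catch-all last, so the one-to-one matches win.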
I hope that helps and good luck!
Best posts made by Hurf
-
RE: Buying Twitter/Facebook Followers
To be clear - I am NOT a fan of trying to game the system; I'd rather take the long haul than the short cut every time. But it's difficult to see the results these fake fans bring and not be seduced by them (the results - not the fake fans!). It would be fairly difficult to categorically determine the validity of these fake accounts - they aren't overtly spammy, though they certainly lack some originality when they create names for the accounts. And, quite frankly, Facebook aren't likely to tackle this issue any time soon. They love to boast about their huge userbase (http://www.facebook.com/press/info.php?statistics), and as such they aren't likely to purge the untold MILLIONS of fake accounts, as this would suggest a dip in popularity or, perhaps as bad, that Facebook is riddled with fake accounts (gasps) - which is bad for business and terrible for shareholders. However, I wonder what they will do when their stats show that Facebook has more users than Earth has inhabitants?
-
RE: Buying Twitter/Facebook Followers
@EGOL - You are absolutely correct. Quality is always preferable to Quantity. Now, we all know that the BEST WAY to make gains in the SERPS is to stick to the straight and narrow and work hard and eventually you will obtain the results you want. However, in the real world, we are competing against huge numbers of SEO companies that all promise fast results - and deliver them - by using a variety of dark practices - and many of these companies are not getting penalised for it (and I have watched them prosper over the course of nearly 4 years using less than pure techniques.)
So as time passes I watch them continue to reap the rewards, thinking to myself 'Any minute now Google will see what they are doing and drop them off the face of the earth...' Well, I'm still waiting, and waiting, and waiting...
So what does the honest, wholesome, whitehat SEO do?
How realistic is it for an SME with say 400 inbound links and 50 facebook followers to HONESTLY outgun a competitor who has 320,000 inbound links and 3,000 fake facebook followers?
I have fought quite a number of battles for some time with clients and employers, all the time resisting the easy path in favour of the long haul, but I am getting rather worn down by it. The adage 'if you can't beat 'em, join 'em' springs to mind.
There are certainly scenarios where a quick boost to your fan-base numbers for credibility is essential - especially if part of your pitch to potential clients will be that you will be marketing product X using social networking sites. (Let's say we are selling a targeted property listings site, which will also use Facebook and Twitter to further promote its listings.) Now, your potential client will want to see the audience you plan to promote their product to BEFORE you win their business and therefore have a product to promote. So you are faced with a dilemma: spend a lengthy, non-revenue-generating period while you earnestly build a following, or spend $50 and build the numbers in a matter of days - show the customer your thousands of eager followers and fans, win the business, and THEN invest your time in gathering a following of people who are genuinely interested in the products you promote.
There is safety in numbers at play here - for all parties - as @Dunamis said 'if I see a page that has cool stuff and is followed by thousands of people, I am more likely to follow it. I'm not as likely to become a fan of a page with 100 members even if the content looks decent. '
We can't fight human nature with the argument 'Hey, I might only have a few followers, but one of them is the Pope!' Numbers are a very quick indicator of credibility; we are manipulated by the power of numbers dozens of times a day, and this will not change!
-
RE: How long is it safe to use a 302 redirect?
Okay. After re-reading the question (with my eyes open this time) I understand that the fact that no link juice will be passed to Site A (from Site B) is not an issue; rather, you don't want to lose the existing link equity when you switch Site B back on and then 301 redirect Site A to Site B?
So, with that in mind - there is no specified 'acceptable' time limit attached to a 302 redirect, so you should be able to redirect without fear of being penalised, regardless of duration.
This is mentioned elsewhere on SEOMoz here: http://www.seomoz.org/qa/view/9994/302-redirect-timeframe
This is an interesting read however: http://www.seroundtable.com/archives/007233.html - just to keep things edgy ;o)
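To make the two phases concrete, on an Apache server the whole arrangement is a one-line difference (hypothetical domains throughout):

```apache
# Phase 1, on Site B while it's offline: temporary, so Site B keeps its equity
Redirect 302 / https://www.site-a.example/

# Phase 2, on Site A once Site B is back on: permanent this time
# Redirect 301 / https://www.site-b.example/
```

The 302 says "this content hasn't moved, it's just elsewhere for now", which is exactly the signal you want while Site B is dormant.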
-
RE: Replacing keywords by synonyms. Will it increase risk of google keyword stuffing penalization?
Synonyms are part of our everyday language. We use a wide variety of synonyms in every conversation without any loss of meaning or confusion; using them in our written communication (specifically, web copy) ensures that it sounds natural and makes it less of a chore to read - not just something that has been hammered out solely to try to gain traction in the search engines. Google knows that its continued dominance relies on its ability to deliver relevant results, and understanding how synonyms work is an integral part of that; so, unsurprisingly, this is something they are very, very good at.
(Here's a 2010 article from the official Google blog which refers to synonyms: https://googleblog.blogspot.co.uk/2010/01/helping-computers-understand-language.html.)
If you look at Google's example of keyword stuffing (https://support.google.com/webmasters/answer/66358?hl=en) you can see how unreadable that is; much better to add a variety of synonyms throughout a piece of engaging copy.
To address your concern directly, editing your copy and replacing keywords with a sprinkling of synonyms is not going to result in you suddenly getting penalised for 'keyword stuffing'.
(Remember that the notion of a keyword density % is a myth. See: https://moz.com/beginners-guide-to-seo/myths-and-misconceptions-about-search-engines.) Don't forget to test your edits with On-Page Grader: https://moz.com/researchtools/on-page-grader.
Proceed with usability and readability in mind - write something a human being would enjoy reading - and you won't go far wrong.
-
RE: Subdomain hosted in a different country - what are the implications?
Host in your own country where possible (and practical), as server response time and server location are SEO factors - though not hugely weighted.
However, as .coms are considered global, hosting the US and UK sites on the same US server wouldn't be of huge detriment (in terms of server response time, again) if you only want to run one site and split the visitors between /us and /uk subfolders. Remember, you can use Google Webmaster Tools to set geographic targeting for each subfolder (one for the US subfolder/subdirectory and one for the UK, for example).
In a perfect world, however, I would want to see the sites hosted in the appropriate country, as this does help to indicate your target audience (along with the server response times I keep mentioning). Google does allow for those that host outside their own country, hence the geographic targeting option in Google Webmaster Tools.
I think UK users do like to see .co.uk at the end of the domain - because they know that generally means the site will sell (and ship) to UK customers - but we also use global .com stores if redirected to them (Ebuyer.com, for instance).
- You can always use a .co.uk domain (if you have that too) for marketing only and redirect all traffic to your xyzdomain.com/UK sub folder.
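One further way to declare the targeting of the two subfolders is in the pages themselves, with hreflang annotations; a minimal sketch, reusing the hypothetical xyzdomain.com with /us and /uk subfolders:

```html
<!-- In the <head> of each page; hypothetical domain and paths -->
<link rel="alternate" hreflang="en-us" href="https://xyzdomain.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://xyzdomain.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://xyzdomain.com/" />
```

Each language version should carry the full set of annotations, including one pointing at itself, so the relationships are reciprocal.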
In terms of the ecommerce software, I would (and did) opt for another ecommerce solution altogether:
Interspire Shopping Cart is very good in terms of SEO (the URL structure, H1 tags, meta title, meta description, etc. are dealt with very well). We rank number one for a variety of terms using this as the ecommerce solution. (Note: this is self-hosted.) There is a hosted version that Interspire are beginning to favour - see www.bigcommerce.com
Hope that helps!
-
RE: Sitemaps. When compressed do you use the .gz file format or the (untidy looking, IMHO) .xml.gz format?
Generally, the .xml.gz format is the one stated in examples; there are a few references to this here: http://www.sitemaps.org/protocol.php#index
Most sitemap generators that create both compressed and uncompressed sitemap files name them sitemap.xml and sitemap.xml.gz respectively. It also makes it clearer what the content of the zipped file is. I don't believe it is essential, however, as you will direct tools such as google.com/webmasters to your XML sitemap - rather than expect them to find it of their own accord.
I always use the .xml.gz format when compressing. I would argue that (if both formats work) neither one is 'BETTER' than the other, rather one is more ACCEPTED than the other.
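For what it's worth, producing the .xml.gz alongside the .xml is a one-liner on most systems; a sketch assuming GNU gzip (the -k flag, which keeps the original file, needs gzip 1.6 or later):

```shell
# Create a tiny placeholder sitemap, then compress a copy alongside it
printf '<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"></urlset>\n' > sitemap.xml
gzip -kf sitemap.xml   # -k keeps sitemap.xml; output is sitemap.xml.gz
ls sitemap.xml sitemap.xml.gz
```

On older gzip versions without -k, you'd compress a copy instead (gzip -c sitemap.xml > sitemap.xml.gz).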
-
RE: Which is the best wordpress sitemap plugin
Yoast works well alongside Genesis - StudioPress were gracious enough to allow a dedicated SEO plugin to take over from their built-in offering.
I miss the time I once had before SEO came along and ruined everything ;o)