Travis_Bailey
@Travis_Bailey
Favorite Thing about SEO
"We have purposely trained him wrong as a joke." - Master Tang
Latest posts made by Travis_Bailey
-
RE: What are best options for website built with navigation drop-down menus in JavaScript, to get those menus indexed by Google?
I would generally prefer CSS over JavaScript for navigational elements, but that probably isn't the problem here. Google can crawl JavaScript and attribute links fine. And per SEMrush, the site is enjoying a pretty sharp uptick in organic traffic recently, which seems at odds with a big indexation problem.
I'm not sure whether it's my network (I'm on a subpar connection right now), but I noticed that some CSS and JS files were timing out when I crawled the site. That could lead to a big problem. I would advise checking the server log files to see if those files are regularly timing out. Ideally, CSS and JS files should be combined/concatenated where possible, to reduce the possibility of any such rendering issues.
More on that from SE Roundtable
I checked the cache for the EN version of a few of those pages, and they appear to be cached fine.
cache:https://f5.com/products/security/distributed-denial-of-service-ddos-protection yields a properly rendered cached page, which is pretty much what we want.
But I do see some issues that could lead to problems with indexation/display. The site has a number of different languages/translations, yet the hreflang attribute is missing. It's strongly recommended that hreflang be implemented. You're good on the language meta tag Bing recommends, though.
That alone would cause some problems, especially on a site that large. I researched Radware, a competitor, years ago. F5 seems like the type of organization that would pay for a decent translation (my German and Spanish are too limited to judge the quality of the translations). But if the translations are automatically generated, that would more than likely lead to indexation problems as well.
Another thing I see is that each translation is marked as canonical. This could also cause problems with display and link equity.
Here's more on internationalization from Moz and Google.
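As a minimal sketch of what that hreflang implementation could look like (the EN URL is the one cached above; the /de/ and /es/ paths are hypothetical stand-ins, not pulled from the actual site), every language version would carry the same set of alternates in its head, itself included:

```html
<!-- Hypothetical example; the DE/ES paths are assumptions for illustration. -->
<link rel="alternate" hreflang="en" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="de" href="https://f5.com/de/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="es" href="https://f5.com/es/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="x-default" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
```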
I would also look for ways to build internal links to the important products (DDoS mitigation is supposed to be a huge money maker now) on the home page, in the body, not just in boilerplate areas (nav, footer, etc.).
Edit: Forgot to mention that the mobile menu doesn't appear to link directly to important products. I would make sure the experience is the same across devices.
-
RE: Different number of backlinks (Search console - Majestic)
There will likely always be a significant variance between backlink tools. They have to first discover the links, then later determine whether the links are still there. It's a pretty big job, covering the entire internet.
Google Search Console will show you a sampling of links. You'll seldom get anywhere near the whole story from it, unless the site is new. And I've still seen months-long delays in reporting totally legitimate backlinks.
Majestic is pretty good, though I've always found their metrics to be something of a gobbledygook, useful only for comparisons between sites within Majestic.
Ahrefs is another backlink tool, and they do a fine job. But they still have to update their database after crawls, the same as anyone else.
Open Site Explorer has always kind of lagged, but it's another source you should consider for backlink data.
So, no one tool has all of the information. I don't think that will happen for at least 10 years, if not more.
The reasons various tools lack certain information vary. Some tools, like Ahrefs, actually appear on 'bad bots' lists, and some webmasters use those lists to block the official bot from crawling their sites. No links will be discovered through a tool's bot on sites that block it.
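For what it's worth, that blocking is usually just a couple of lines of robots.txt. A sketch (hypothetical site, but AhrefsBot is the user-agent Ahrefs' crawler identifies as):

```
# Hypothetical robots.txt blocking a backlink crawler site-wide
User-agent: AhrefsBot
Disallow: /
```

Any links on a site doing this simply never make it into the Ahrefs index via their own crawl.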
Backlink research is as postmodern as it gets (read: literature). You're seldom privileged with all the information, and the sources you think should be authoritative aren't.
The answer has been, and will be for a while: seek out an array of backlink data sources. No one thing is going to do it for you.
-
RE: Dealing with links to your domain that the previous owner set up
It's my opinion that the Gary Illyes quote is a little out of context for this situation. Dead inbound links (404 errors) can be a bad thing if the links are of good quality. More than likely, Mr. Illyes was addressing on-page 404s, and in that context I would mostly agree.
Though to be pedantic, 404 errors slow page load time, and speed is a ranking factor. So while broken on-page links may not result in a direct penalty, they definitely don't do on-page SEO any favors.
-
RE: Dealing with links to your domain that the previous owner set up
Before we get to the links:
Apologies in advance for all of this, but I know it can be helpful for your current situation and in the future.
The first thing that would have helped is using SEMrush to possibly get an idea of the domain's ranking history. I say 'possibly' because it's not so great with domains/pages that geotarget smaller cities. A site could be going gangbusters for Paducah, Kentucky targeted queries, and SEMrush more than likely won't pick up on that. Major metros? Yea, verily.
SEMrush can also help you determine whether the site has been hit by various algorithm updates. Generally, if a sharp drop in organic traffic occurs within, or shortly after, the same month as a spam-related update, there's a good chance the site has been penalized. If that's the case, it will more than likely hurt your efforts for some time.
In more competitive niches, penalties aren't always the cause. Sometimes the competition is fierce and sites simply lose traffic to competitors at the time of algorithm updates. Use Moz's Google Algorithm Change History to help sort that out.
There's also the possibility that whoever owned the domain previously made some pretty bad mistakes with their front-end deployment. You can use the Wayback Machine to possibly figure some of that out (you may even be able to grab a sitemap). Sometimes people/companies had enough rope to hang themselves, no algorithm or competition necessary.
Now... to the links!
The short answer to your second question: it varies. You may have some really great links out there currently pointing to a dead page. On the other hand, you could have a ton of spam. So you can hurt your search engine optimization efforts through inaction or action alike. The rest of this is a general overview of what you should do.
It's always a good idea to get more than one source of link data. Always. Google Search Console, Bing Webmaster Tools and Open Site Explorer are all good 'free' sources of link data. I would also recommend Ahrefs and Majestic.
All of those sources will tell you which pages have received links, as well as the anchor text used. Ahrefs and Majestic in particular are pretty good at showing you which inbound links lead to a 404. From there, you can choose whether or not to 301 to a new page with comparable content.
Just make sure you're not bringing in a whole lot of spam links, and be especially judicious about links with exact match anchor text. A boilerplate example would be 'keyword city'. The rest of your decisions should be based on the Google Quality Guidelines, with special attention paid to the Link Schemes section.
And should some of those linking domains not pass your judgment call, add them to your disavow file to be safe. You can disavow entire domains, so you're not bogged down in individual link entries. Just make sure to note in the file that you had just purchased the domain and the links looked suspicious. Here's the official documentation for the disavow tool.
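For illustration, a disavow file is just a plain text file: one domain or URL per line, with # lines for comments, which is where that note goes. A minimal sketch with made-up domains:

```
# Domain purchased recently; the links below predate our ownership
# and the linking domains looked suspicious.
domain:spammy-directory.example
domain:keyword-city-links.example
# Individual URLs can be listed too:
http://random-blog.example/comments?page=3
```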
Best of luck, and I'm sure you'll have more questions. Feel free to post them here.
-
RE: Google Analytics - Average Position
There are a lot of factors that can influence where a page may rank for a given query. One of the largest differences would be a query that seems to have local intent. If I search 'pizza', there's a good chance I don't care about the history of pizza. I want a pizza place near me.
So if we skip over the map pack, I see Pizza Hut, Domino's and Andy's. There's no way Andy's should rank #3 organic for 'pizza' for everyone in the United States; it only has three locations, all in my home town. So it might rank, say, #403 for everyone outside my hometown (just for the sake of argument and ease of calculation). Perhaps it ranks a little higher for someone just outside the city limits, let's say #8 organic. But that #8 doesn't matter in this simplified model; only the highest and lowest ranks (#3 and #403) do.
(3 + 403) / 2 = 203, so the average position for 'pizza' is #203. If you drop that number in front of Andy after he's been paying you for months, he won't be happy. That's why you'll have to tell Andy that it's a high/low average based on a complicated algorithm, and that he can easily see he ranks #3 organic when Google knows his approximate location.
As for the average position you see above all of the queries in Google Analytics, that's just an X-bar-bar. X-bar-bar is the average of averages. You simply add up all the average positions and divide by the number of keywords. You'll see the number is pretty close.
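A made-up example: if three queries have average positions of 3, 8 and 203, the number shown above them is (3 + 8 + 203) / 3 ≈ 71.3.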
It helps if you have a little background in statistics or statistical process control. In case that was clear as mud, here's something on basic SPC that can help you better understand the calculations in GA. I was a machinist, prior to all this internet marketing nonsense. It helps.
Edit: One thing I forgot to mention: if an average position still seems off, set the secondary dimension to Country. I've found instances where sites show up for queries in foreign countries, despite explicit national targeting in Google Search Console.
-
RE: Direct traffic spam on Google Analytics: how can you identify and filter it?
Create a segment that only shows traffic to your actual hostname; ghost spam tends to carry fake hostnames (youtube.com, paypal.com, anything that isn't yoursite.com). That's it. It's pretty amazing how much of the traffic turns out to be spam.
-
RE: Will having two wordpress themes installed hurt seo?
I second (or third?) the notion that you should more than likely have only a single WordPress installation. A second one definitely increases the maintenance involved: take everything you should do to maintain one installation, then double it. I'm certain everyone in your organization could do without that.
But if your organization is willing to endure the duplication of effort, there are other things to be concerned about. Not every theme is created equal: some themes are faster than others, some are more secure, and most differ in every other way. So one theme could be a hindrance while the other at least pulls its weight.
In regard to the subdomain blog or subfolder blog question, there was a time in recent history when I would have said it didn't matter; supposedly the link equity/juice flows just fine either way. However, someone in Moz Q&A made a very good point. To paraphrase EGOL: "Algorithms change; if you install your blog in a subfolder, you will always be right."
I'm not sure when your company made the jump to WordPress, but WordPress has been able to display static pages for years. My first agency ran a combination of CMS Made Simple and WordPress, I think due to the page handling issue. That was over six years ago. They made the jump to full WordPress about five years ago.
So it sounds like the site isn't properly configured for your purposes. Here's how to handle that, straight from the WordPress Codex. From there, you can set up your site's page structure through parent/child relationships. So if you're selling widgets, your structure may look like:
**Page Hierarchy**
site.com/widgets/sweet-widget
**Posts Hierarchy**
site.com/blog-diggety/sweet-post
There are Pages and Posts. You bring the hierarchy. And speaking of which, should you change your site URL structure, you will definitely want to research 301 redirects.
There's no question in my mind: you should stick with one WordPress install. Hopefully that helps.
-
RE: Thumbtack Blatantly Violating Google TOS?
Ten. Years. Later. XD
It is pretty interesting to note that they specifically state they've removed the 'bonus' internet points from Thumbtack profiles. I would imagine they were told it might improve their case. It's definitely a bit of a SWAG on my part, but even the goofy internet points may have been considered material.
One could see how possibly having more 'internet points' may influence a purchase/contract decision. So that may be enough to support a materiality claim as well.
-
RE: Referencing links in Articles and Blogs
I'm just going to leave this here. ; ) It would seem that all of the typical means of citation can be recognized as such. Perhaps too readily?
Best posts made by Travis_Bailey
-
RE: How valuable is a link with a DA 82 but a PA of 1?
I would generally dispense with the concern over metrics, considering the source. It sounds like a great citation source regardless. Plus, it may do what links were intended to do in the first place: drive traffic.
OSE, Ahrefs, Majestic and the like are just keyhole views into what's really going on - important keyhole views, but still limited insights into the big picture.
I would argue that if one focuses less on granular metrics and puts more attention into traffic and general relevancy, one will be happier with the results - and have more time for generating similar results.
-
RE: Googles stance on Back Links via a Badge/Form
These would likely be deemed a link scheme unless the links are nofollowed. We can bring up a number of examples where sites appear to get away with these tactics, but followed widget and footer links are definitely high risk. If you genuinely want exposure for your site, nofollow the links and make sure they don't contain 'money terms' (e.g. Free Malware Scan).
Both of these tactics have been done to death. A lot of sites have been penalized due to dofollow links in themes, widgets, plugins and badges. If you still want to do this, I would strongly suggest using a nofollow attribute on the link.
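A minimal sketch of a badge embed done the safer way (hypothetical URLs and filenames), with the link nofollowed and the anchor kept free of money terms:

```html
<!-- Hypothetical badge embed code offered to members/customers. -->
<a href="https://example.com/" rel="nofollow">
  <img src="https://example.com/img/member-badge.png" alt="Example.com member" />
</a>
```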
-
RE: Backlinks from non-relevant site
It could be seen as a link of little value, for all of the reasons you stated. If your site had a vast preponderance of little but forum links passing juice/equity with exact match anchor text like 'buy nike shoes', you would be heading toward a possible penalty; that would show intent to manipulate.
So the short answer is: Removing the links isn't going to help your site, but if they're truly removed they aren't going to hurt your site.
There would be other things to consider if the forum related to your business - referrals, and conversions from those referrals. But since you stated that the forum is unrelated, I won't worry about that.
If you haven't read this yet, it's a good idea that you do. Staying within those guidelines, and avoiding anything in the grey will generally keep your site in good standing from a linking perspective.
-
RE: Removing poor domain authority backlinks worth it?
Setting a DA cut-off from the outset is a bit too arbitrary. What if a link comes from a site with a low DA and a low PA now, but the site later becomes the next New York Times? You don't want to disavow the next New York Times, but that's exactly what an arbitrary number would have you do.
Further, DA and PA can be gamed to a certain extent. I'm sure Rap Genius had a pretty solid DA, but they were penalized all the same. So using DA as a cut-off appears less than ideal.
There's no real easy way to do a disavow. You have to think about characteristics, context and intent. Links that pass juice but were obviously paid for may be candidates. A vast preponderance of links from seemingly low quality directories with exact match anchor text deserves closer scrutiny as well. The dead giveaways are usually 'sponsored' links that pass juice.
Low quality directories usually let everyone in. You will know them by their viagra and casino anchor text. They're usually a pretty safe disavow candidate.
Does the site have a lot of links from spam blog comments on obviously unrelated sites? Has there been guest blogging on free-for-all blogs? Those links require some review as well.
Definitely prioritize your exact match anchor text links for review.
I would suggest you start with gathering link data from numerous sources:
- Google Webmaster Tools
- Bing Webmaster Tools
- Ahrefs
- Majestic SEO
- Etc.
Then filter out the duplicates via spreadsheet voodoo. After that, drop the list into a service like Link Detox. But be careful: it still throws false positives and false negatives. So again, there's no real way of getting out of a manual review, but Link Detox will speed up the process.
Are there plenty of disavow services out there? Sure, but I've never used them. I'm far too paranoid. A disavow is a delicate and lengthy process.
Are there some great disavow pros/individuals out there? Definitely. I would be far more likely to trust them. In fact, a couple will likely chime in here. Though they may be a little bit outside the budget. I don't know.
One final, important point: a disavow is not a panacea. They take as long as they take. Though it is good that you appear to be proactive - you never know when the next Penguin filter will land. The site may be right with The Googles now, but it might not be later.
-
RE: Alternative Link Detox tools?
The difference in the number of links you see across various sources comes down to the sources themselves. Each backlink service only crawls so much - even Google can only crawl so much of the internet.
Your best bet is to use multiple sources. I would go with GWT, Majestic SEO and Ahrefs, then filter out duplicates. You'll have a much better understanding of where the site stands. Once you have that, you can upload the data to Cemper's Link Detox.
Be very careful: Link Detox still throws some false positives, though I expect it to get better every day. There's a machine learning element to it that's based on human feedback.
Finally, I would be very careful of fully automating anything like a disavow/removal process. Do you really want something so delicate taken out of your hands? It's still very necessary to manually check each link so you know that you're getting rid of the bad and keeping the good.
Link Detox is the closest thing I'm aware of that helps 'automate' the process in a safe-ish way. Link removal/disavowal is so sensitive a subject that I wouldn't outsource it. Then again, I hate the idea of outsourcing overflow blog writing to competent people. Call me a control freak.
-
RE: Has anyone used Fat Joe as an outsource solution for blog links?
That's probably your cue to run away screaming. Guest blogging for the sake of guest blogging is a very bad idea these days. You might also want to mention that 'Loved by Moz' bit to someone at Moz; they can confirm or deny. Whoops, they just did.
http://www.mattcutts.com/blog/guest-blogging/
That's not to say all guest blogging is dead per se, but guest blogging for ABC plumbing company on alllandscapersseolinks4all.com is dead.
Here's a semi-fictional scenario:
Internet Guy A walks up to Internet Guy B and says; "Hey, for some reason I don't want to post this on my own deal, but I think your audience would really benefit from my words. What do you say?"
And Internet Guy B says; "Cool bro, much trust. I know you'll behave yourself and be excellent - Wyld Stallyns 4 eva!"
Then Internet Guy B would be vouching for Internet Guy A. Everybody's p-toot is on the line and the honor system, with editorial control on Internet Guy B's part. So long as Internet Guy A or Internet Guy B aren't offering a material incentive for the post and everyone follows the bajillion other rules, some unwritten, everybody is cool.
But what you're describing seems scary, and you may want to think twice. Do you really know who you're getting into 'link bed' with? You don't until it happens. In the real world, that takes an inordinate amount of vodka. If you've had one of those nights, you know it isn't pretty.
-
RE: Duplicate content on Product pages for different product variations.
Yeah, if the URL changes, that's duplicate content. Handle the variations with a drop-down menu on a single URL. And you appear to be kind of borked on the category pages, if that's how you're going to approach it.
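If a drop-down isn't feasible and the variant URLs have to exist, a common fallback (standard practice, not something specific to this site) is pointing each variant's canonical back at the main product page. A sketch with hypothetical URLs:

```html
<!-- In the <head> of a hypothetical variant page like /widgets/sweet-widget?color=blue -->
<link rel="canonical" href="https://site.com/widgets/sweet-widget" />
```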
-
RE: What is the best way to remove a link that redirects to a spammy site?
If the links don't point to your client, there's little to no need to tell anyone anything, other than out of courtesy. You would do better telling the host's technical support, however. Hosts tend to hate this stuff more than site owners, for some odd reason.
But yes, the example given looks like the result of a hack. That doesn't mean the links weren't paid for, per se - but the hack is the end result. It wouldn't surprise me if whoever did the hacking later pulled the links and pointed them elsewhere.
You could follow that rabbit hole to the end of the internet. Instead, I would focus on links that actually point to your client's site - and only if it appears the links have harmed, or definitely will harm, the site.
If you feel the need to continue, gather as much link information as you can from Open Site Explorer, Google and Bing Webmaster Tools, Majestic SEO and/or Ahrefs. Once you have a mess of spreadsheets in hand, prune the duplicate links. Then feed the list into Cemper's Link Detox.
Link Detox should be able to flag the obvious stuff, but it still puts out some false positives/negatives, so you have to judge links individually. During the judging phase, I always use a Linux machine - you never know when you'll hit something that executes a script, and then your machine is totaled.
Once in an aeon there comes one so potent, so magnificent, their very existence changes the meaning of meaning. Travis isn't that guy.