Posts made by Travis_Bailey
-
RE: What are best options for website built with navigation drop-down menus in JavaScript, to get those menus indexed by Google?
I would generally prefer CSS over JS for navigational elements, but that probably isn't the problem here. Google can crawl JavaScript and attribute links fine. And per SEMrush, it looks like the site is enjoying a pretty sharp uptick in organic traffic recently. That would seem to be at odds with big indexation problems.
I'm not sure if it's just my network (I'm on a subpar connection right now), but I noticed that some CSS and JS files were timing out when I crawled the site. That could lead to a big problem. I would advise that someone check the server log files and see if those files are regularly timing out. Ideally, you'd want CSS and JS files combined/concatenated where possible, to reduce the possibility of any such rendering issues.
More on that from SE Roundtable
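If log access is a hassle, a quick client-side spot check can at least tell you whether those assets are responding right now. Here's a minimal sketch; the asset URLs and timeout are placeholders I made up, and it's a complement to reading the logs, not a replacement.

```python
# Rough spot check for slow or failing CSS/JS assets; the URLs below are
# hypothetical placeholders for whatever your crawl flagged.
import requests

ASSET_URLS = [
    "https://www.example.com/assets/site.min.css",
    "https://www.example.com/assets/site.min.js",
]

TIMEOUT_SECONDS = 10

for url in ASSET_URLS:
    try:
        response = requests.get(url, timeout=TIMEOUT_SECONDS)
        elapsed = response.elapsed.total_seconds()
        print(f"{url} -> {response.status_code} in {elapsed:.2f}s")
    except requests.exceptions.Timeout:
        print(f"{url} -> timed out after {TIMEOUT_SECONDS}s")
    except requests.exceptions.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```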
I checked the cache for the EN version of a few of those pages, and they appear to be cached fine.
cache:https://f5.com/products/security/distributed-denial-of-service-ddos-protection yields a cached copy of the page, which is pretty much what we want.
But I do see a few issues that could lead to problems with indexation and display. The site has a number of different languages/translations, yet the hreflang attribute is missing. It's strongly recommended that hreflang be implemented. You're good on the language meta tag Bing recommends, though.
That would cause some problems, especially on a site that large. I researched Radware, one of their competitors, years ago, and F5 seems like the type of organization that would pay for a decent translation (my German and Spanish are so limited, I couldn't discern the quality of the translations). But if the translation is automatically generated, that would more than likely lead to indexation problems as well.
Another thing I see is that each translation is marked as canonical. This could also cause problems with display and link equity.
Here's more on internationalization from Moz and Google.
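For illustration only, here's roughly what the hreflang annotations could look like if they were generated per page. The URL pattern, language codes, and x-default choice below are my assumptions, not F5's actual structure.

```python
# Sketch of the hreflang link tags each language version of a page would
# carry. The localized URLs and language codes are hypothetical.
LANGUAGE_VERSIONS = {
    "en": "https://f5.com/products/security/distributed-denial-of-service-ddos-protection",
    "de-de": "https://f5.com/de/products/security/ddos-protection",
    "es-es": "https://f5.com/es/products/security/ddos-protection",
}

def hreflang_tags(versions, x_default="en"):
    """Build the <link rel="alternate"> tags every version should include."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in versions.items()
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{versions[x_default]}" />'
    )
    return "\n".join(tags)

print(hreflang_tags(LANGUAGE_VERSIONS))
```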
I would also look for ways to build internal links to the important products (DDoS mitigation is supposed to be a huge money maker now) on the home page, in the body copy, and not just in boilerplate areas like the nav and footer.
Edit: Forgot to mention that the mobile menu doesn't appear to link directly to important products. I would make sure the experience is the same across devices.
-
RE: Different number of backlinks (Search console - Majestic)
There will likely always be a significant variance between backlink tools. They have to first discover the links, and later determine whether those links are still there. That's a pretty big job when the scope is the entire internet.
Google Search Console will show you a sampling of links. Unless the site is new, you'll seldom get anywhere near the whole story from them, and even then I've seen months-long delays in reporting totally legitimate backlinks.
Majestic is pretty good. I've always found their metrics to be something of a gobbledygook, which is only useful for comparisons between sites within Majestic.
Ahrefs is another backlink tool, and they do a fine job. But they still have to update their database after crawls, the same as anyone else.
Open Site Explorer has always kind of lagged, but it's another source you should consider for backlink data.
So, no one tool has all of the information. I don't think that will happen for at least 10 years, if not more.
The reasons various tools lack certain information vary. Some crawlers, like Ahrefs', end up on 'bad bots' lists, and some webmasters use those lists to block the tool's official bot from crawling their site. No links will be discovered through that bot on sites that block it.
Backlink research is as post-modern as it gets (read: literature). You're seldom privileged with all the information, and the sources you think should be authoritative aren't.
The answer has been, and will be for a while, to seek out an array of backlink data sources. No one tool is going to do it for you.
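In practice, 'an array of sources' mostly means deduplicating the linking domains each tool reports and seeing where they disagree. A rough sketch, assuming CSV exports; the filenames and column names are placeholders, since every tool labels its export differently.

```python
# Combine linking domains from several backlink exports and see what each
# tool missed. Filenames and column names are placeholders.
import csv
from urllib.parse import urlparse

EXPORTS = {
    "search_console": ("gsc_links.csv", "Linking page"),
    "majestic": ("majestic_backlinks.csv", "SourceURL"),
    "ahrefs": ("ahrefs_backlinks.csv", "Referring page URL"),
}

def linking_domains(path, column):
    """Return the set of linking domains found in one CSV export."""
    domains = set()
    with open(path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            url = row.get(column, "").strip()
            if url:
                domains.add(urlparse(url).netloc.lower() or url.lower())
    return domains

per_tool = {name: linking_domains(path, col) for name, (path, col) in EXPORTS.items()}
all_domains = set().union(*per_tool.values())

for name, domains in per_tool.items():
    print(f"{name}: {len(domains)} domains, {len(all_domains - domains)} reported only by other tools")
```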
-
RE: Dealing with links to your domain that the previous owner set up
It's my opinion that the Gary Illyes quote is a little out of context for the situation. Dead inbound links (404 errors) could be a bad thing, if the links are of good quality. It's more than likely Mr. Illyes was addressing on-page 404s, and in that context I would mostly agree.
Though to be pedantic, 404 errors slow page load time, and speed is a ranking factor. So while broken on-page links may not result in a direct penalty, they definitely don't do on-page SEO any favors.
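If you want to sweep a page for broken outbound links yourself, something like this rough sketch will do it. The page URL is a placeholder, and HEAD requests with a short timeout are just one way to go about it.

```python
# Flag on-page links that return 404 or fail outright. The page URL is
# hypothetical; requires the requests and beautifulsoup4 packages.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGE = "https://www.example.com/some-page/"

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
links = {urljoin(PAGE, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if not link.startswith("http"):
        continue  # skip mailto:, tel:, fragments, etc.
    try:
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
    except requests.exceptions.RequestException:
        status = None
    if status == 404 or status is None:
        print(f"{link} -> {status or 'request failed'}")
```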
-
RE: Dealing with links to your domain that the previous owner set up
Before we get to the links:
Apologies in advance for all of this, but I know it can be helpful for your current situation and in the future.
The first thing that would have helped is using SEMrush to possibly get an idea of the domain's ranking history. I say 'possibly' because it's not so great with domains/pages that geotarget smaller cities. A site could be going gangbusters for Paducah, Kentucky targeted queries, and SEMrush more than likely won't pick up on that. Major metros? Yea verily.
SEMrush can also help you determine whether the site has been hit by various algorithm updates. Generally, if a sharp drop in organic traffic occurs within, or shortly after, the same month as a spam-related update, there's a good chance the site has been penalized. If that's the case, it could hurt your efforts for some time.
In more competitive niches, penalties aren't always the case. Sometimes the competition is simply fierce, and sites lose traffic to competitors around the time of algorithm updates. Use Moz's Google Algorithm Change History to help with those efforts.
There's also the possibility that whoever owned the domain previously made some pretty bad mistakes with their front-end deployment. You can use the Wayback Machine to possibly figure some of that out (you may even be able to grab a sitemap). Sometimes people/companies had enough rope to hang themselves, no algorithm or competition necessary.
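If the domain has a lot of history to wade through, the Wayback Machine's CDX API can give you a capture timeline without clicking through snapshots one by one. A minimal sketch; the domain is a placeholder and the parameters reflect my reading of that API, so double-check against the Archive's documentation.

```python
# Pull a rough month-by-month capture history for a domain from the
# Wayback Machine's CDX API. The domain below is a placeholder.
import requests

DOMAIN = "example.com"

params = {
    "url": DOMAIN,
    "matchType": "domain",
    "output": "json",
    "fl": "timestamp,original,statuscode",
    "collapse": "timestamp:6",  # first six digits of the timestamp = YYYYMM
    "limit": "500",
}

rows = requests.get("http://web.archive.org/cdx/search/cdx", params=params, timeout=30).json()
header, captures = rows[0], rows[1:]  # the first row is the field names

for timestamp, original, statuscode in captures:
    print(timestamp[:6], statuscode, original)
```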
Now... to the links!
The short answer to your second question is that it varies. You may have some really great links out there that are currently pointing to a dead page. On the other hand, you could have a ton of spam. So you can hurt your search engine optimization efforts through inaction or through action. The rest of this is a general overview of what you should do.
It's always a good idea to get more than one source of link data. Always. Google Search Console, Bing Webmaster Tools and Open Site Explorer are all good 'free' sources of link data. I would also recommend Ahrefs and Majestic.
All of those sources will tell you which pages have received links, as well as the anchor text used. Ahrefs and Majestic in particular are pretty good at showing you which inbound links lead to a 404. From there, you can choose whether or not you want to 301 to a new page with comparable content.
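Once you've decided which dead URLs deserve a redirect, generating the rules is the easy part. Here's a hedged sketch that emits Apache-style Redirect 301 lines from a hypothetical old-to-new mapping; use whatever redirect mechanism your server actually supports.

```python
# Turn a mapping of dead paths to their closest live equivalents into
# Apache-style redirect lines. The paths and URLs are hypothetical.
REDIRECTS = {
    "/old-widget-guide": "https://www.example.com/widgets/widget-guide/",
    "/2013-promo": "https://www.example.com/widgets/",
}

for old_path, new_url in REDIRECTS.items():
    print(f"Redirect 301 {old_path} {new_url}")
```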
Just make sure that you're not bringing in a whole lot of spam links, and be especially judicious about links with exact match anchor text. A boilerplate example would be 'keyword city'. The rest of your decisions should be based on Google's Quality Guidelines, with special attention paid to the Link Schemes section.
And should some of those linking domains not pass your judgement call, add them to your disavow file to be safe. You can disavow entire domains, so you're not bogged down in individual link entries. Just make sure to note that you recently purchased the domain and that the linking sites looked suspicious. Here's the official documentation for the disavow tool.
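The disavow file itself is just plain text. Per Google's documentation it takes one URL or domain: entry per line, with # for comment lines. A minimal sketch, with made-up domains:

```python
# Write a disavow file in the documented format: "domain:" entries, one
# per line, plus "#" comment lines. The domains here are hypothetical.
SUSPICIOUS_DOMAINS = [
    "spammy-directory.example",
    "paid-link-network.example",
]

with open("disavow.txt", "w", encoding="utf-8") as handle:
    handle.write("# Domain recently purchased; these links predate our ownership\n")
    handle.write("# and the linking sites looked suspicious on review.\n")
    for domain in sorted(SUSPICIOUS_DOMAINS):
        handle.write(f"domain:{domain}\n")
```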
Best of luck, and I'm sure you'll have more questions. Feel free to post them here.
-
RE: Google Analytics - Average Position
There are a lot of factors that can influence where a page may rank for a given query. One of the largest differences would be a query that seems to have local intent. If I search 'pizza', there's a good chance I don't care about the history of pizza. I want a pizza place near me.
So if we skip over the map pack, I see Pizza Hut, Domino's and Andy's. There's no way Andy's should rank #3 organic for 'pizza' for everyone in the United States. It only has three locations, all in my hometown. So it might rank, say, #403 for everyone outside my hometown (just for the sake of argument and ease of calculation). Perhaps it ranks a little higher for someone just outside the city limits, let's say #8 organic. But that #8 doesn't factor into the average; only the highest and lowest ranks do.
(3 + 403) / 2 = 203, so the average position for 'pizza' is #203. If you drop that number in front of Andy after he's been paying you for months, he won't be happy. That's why you'll have to tell Andy that it's a high/low average based on a complicated algorithm, and that he can easily see he ranks #3 organic when Google knows his approximate location.
As for the average position you see above all of the queries in Google Analytics, that's just an X-bar-bar. X-bar-bar is the average of averages. You simply add up all the average positions and divide by the number of keywords. You'll see the number is pretty close.
It helps if you have a little background in statistics or statistical process control. In case that was clear as mud, here's something on basic SPC that can help you better understand the calculations in GA. I was a machinist, prior to all this internet marketing nonsense. It helps.
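In case numbers help more than prose, here's the same arithmetic as a quick sketch, using the toy positions from the Andy's example above; the per-query averages in the second half are made up for illustration.

```python
# High/low average for a single query, then an average of averages
# (x-bar-bar) across several queries, using toy numbers.
def high_low_average(best, worst):
    return (best + worst) / 2

print(high_low_average(3, 403))  # 203.0 -- the 'average position' for 'pizza'

query_averages = [203.0, 12.5, 41.0]  # hypothetical per-query averages
print(sum(query_averages) / len(query_averages))  # 85.5 -- the report-level figure
```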
Edit: One thing I forgot to mention: If an average position still seems off - set the secondary dimension to Country. I've found instances where sites show up for queries in foreign countries. This is despite explicit national targeting in Google Search Console.
-
RE: Direct traffic spam on Google Analytics: how can you identify and filter it?
Create a segment that only shows traffic to valid hostnames (yoursite.com, youtube.com, paypal.com). That's it. It's pretty amazing how much of the traffic turns out to be spam.
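If you'd rather audit the data outside the interface, the same idea works on an export. A rough sketch, assuming a CSV with a Hostname column; the filename, column name, and hostnames are placeholders.

```python
# Keep only rows whose hostname you actually control; the rest is most
# likely ghost/referral spam. Filename, column, and hostnames are placeholders.
import csv

VALID_HOSTNAMES = {"yoursite.com", "www.yoursite.com"}

with open("ga_export.csv", newline="", encoding="utf-8") as handle:
    rows = [row for row in csv.DictReader(handle) if row.get("Hostname") in VALID_HOSTNAMES]

print(f"Kept {len(rows)} rows with valid hostnames")
```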
-
RE: Will having two wordpress themes installed hurt seo?
I second, or third?, the notion that you should more than likely have only a single WordPress installation. Running two definitely increases the maintenance involved: take everything you should do to maintain one installation, then double it. I'm certain everyone in your organization could do without that.
But if your organization is willing to endure the duplication of effort, there are other things to be concerned about. Not every theme is created equal. Some themes are faster than others, some are more secure than others, and most themes will differ in every other way. So one theme could be a hindrance, while the other at least pulls its weight.
In regard to the subdomain blog or subfolder blog question, there was a time in recent history where I would have said it didn't matter. Supposedly the link equity/juice flows just fine either way. However, someone in Moz Q&A made a very good point. To paraphrase EGOL: "Algorithms change; if you install your blog in a subfolder, you will always be right."
I'm not sure when your company made the jump to WordPress, but WordPress has had the ability to display static pages for years. My first agency used to run a combination of CMS Made Simple and WordPress, which I think was due to the page-handling issue. That was over six years ago. They made the jump to full WordPress about five years ago.
So it sounds like the site isn't properly configured for your purposes. Here is how you should handle that, direct from the WordPress codex. From there you can set up your site's page structure through parent/child relationships. So if you're selling widgets, your structure may look like:
**Page Hierarchy**
site.com/widgets/sweet-widget
**Posts Hierarchy**
site.com/blog-diggety/sweet-post
There are Pages and Posts. You bring the hierarchy. And speaking of which, should you change your site URL structure, you will definitely want to research 301 redirects.
To me, there's no question: you should stick with one WordPress install. Hopefully that helps.
-
RE: Thumbtack Blatantly Violating Google TOS?
Ten. Years. Later. XD
It is pretty interesting to note that they specifically state they've removed the 'bonus' internet points from Thumbtack profiles. I would imagine they were told it might improve their case. It's definitely a bit of a SWAG on my part, but even the goofy internet points may have been considered material.
One could see how possibly having more 'internet points' may influence a purchase/contract decision. So that may be enough to support a materiality claim as well.
-
RE: Referencing links in Articles and Blogs
I'm just going to leave this here. ; ) It would seem that all of the typical means of citation can be recognized as such. Perhaps too readily?