Great question! We do often see a positive correlation between the number of followed outbound links and higher rankings (though I'm not sure we've scientifically measured this recently). Anecdotally, we hear this often as well. The most famous case is when the NYTimes made its external links followed and subsequently saw an increase in traffic/rankings.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Cyrus-Shepard
@Cyrus-Shepard
Job Title: Founder
Company: Zyppy.com
Favorite Thing about SEO
The SEO Community!
Latest posts made by Cyrus-Shepard
-
RE: Nofollow Outbound Links on Listings from Travel Sites?
It's an interesting perspective. Looking at the pages+links, they all look trustworthy and normally I wouldn't see a reason to nofollow them, especially since they are all editorially controlled by you and your team.
Link equity is a concern, but I honestly doubt you're saving anything by making them nofollow, especially since Google updated how they handle PageRank sculpting back in 2009.
Not that there aren't legitimate ways to preserve and flow link equity (such as including internal links within the main body of text instead of sidebar areas/navigation), but in this case I think leaving the links followed won't hurt at all.
-
RE: Positions dropping in SERPs after Title and Snippet change
There are a few possible reasons Google might adjust rankings after seeing a change in your titles and meta descriptions. Among them (keep in mind these are only possibilities):
1. The algorithm determines that the page is less relevant to the target query keywords
2. The title change deviates from earlier anchor text pointing at the page, meaning the page might not be as relevant to the query
3. After changing your title+description, you experience a lower CTR in search results. In theory this could lower your rankings. But because you describe the old title/description still showing in SERPs, this is less likely
4. The drop in rankings is temporary, or is unrelated to any changes you made.
If Google is still showing the old title/description, #4 is a strong possibility. You may want to check Google's cache of the page to see if it's picking up on the changes. Depending on the site this can take anywhere from a few hours to several weeks.
If nothing else, you can always change the title/description back to the original version and test what happens.
-
RE: Some Old date showing in SERP
I'm actually kinda stumped. For whatever reason, Google is ignoring the sitemap date. Here's what I would do:
1. Even though the sitemap is valid, I'm still unclear if Google is reading it. The only way to know for sure is by checking the Sitemaps report in Google Search Console and verifying indexation: https://www.google.com/webmasters/tools/sitemap-list
2. You could try to put a date on the page. Something like "Last Updated" at the bottom of the page.
3. A longshot, but you could add the lastReviewed Schema.org markup to the page, and see if Google honors that.
If you try any of these, let us know if any of them worked!
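For option 3, the markup could look like this minimal JSON-LD sketch (the date is a placeholder, and whether Google honors lastReviewed is exactly the open question):

```json
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "lastReviewed": "2016-03-01"
}
```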
-
RE: Some Old date showing in SERP
How odd. I'm not sure of the answer, but before we go any further I was hoping you could verify a couple of things:
1. In Google Search Console, can you verify that your sitemaps are submitted and that Google is indexing/reading them? I would think that since you have a lastmod date in your sitemap, it would signal to Google that the page is more up to date.
2. When looking at the cache of your page in Google, it doesn't look like all the resources are loading. http://webcache.googleusercontent.com/search?q=cache:example.com
Based on this, if you perform a Fetch and Render in Google Search Console, does it show that you are blocking any resources?
-
RE: Keyword Themes - What's in a theme?
Turns out I wrote a post that expanded on this idea of keyword themes: https://moz.com/blog/keywords-to-concepts
Hope that helps! Best of luck with your SEO.
-
RE: Does a non-canonical URL pass link juice?
Complex question
Caveat: I don't work for Google and the precise workings of the canonical element in Google's algorithm is mostly educated speculation.
The answer is somewhere in-between yes and no. That's because the canonical element means that URL B is treated as URL A. In that sense it really shouldn't pass any direct link authority.
But(!) now let's complicate things. Let's point some links at URL B. (and not at URL A) In theory, those links are then canonicalized to URL A, and that equity passes to your site (yeah!)
So it's not a direct influence, but you can in theory gain link equity from canonicalized versions of URLs that point to your site.
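To make the yes-and-no concrete, here's a minimal sketch (example.com and the URL slugs are placeholders): URL B declares URL A as canonical in its head.

```html
<!-- In the <head> of https://example.com/url-b (the non-canonical page) -->
<link rel="canonical" href="https://example.com/url-a">
```

Any links earned by /url-b should then, in theory, consolidate to /url-a.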
-
RE: How do you check what links there are to a specific page on a site?
Hi Leo,
Sounds like you were doing the right thing. Different tools will show different numbers, as all tools use different link indexes. In general, Moz is a bit more picky about the links it displays - we try to display the most important links that are likely to have an impact on your ranking. The downside of this is that our index can be smaller than some of the others (Ahrefs, Majestic) and you'll often find a bigger volume of links with those other indexes.
Also, for general help in using Open Site Explorer, here's an excellent resource: https://moz.com/help/guides/research-tools/open-site-explorer
-
RE: After Server Migration - Crawling Gets slow and Dynamic Pages wherein Content changes are not getting Updated
The good news is, this actually sounds pretty normal. 24 hours to reflect changes in content is better than many sites. I can't account for why it dropped from 4 to 24, but I'd say this is still in the range of "good."
-
RE: After Server Migration - Crawling Gets slow and Dynamic Pages wherein Content changes are not getting Updated
Howdy,
A couple of questions:
1. Are there certain pages that aren't getting updated, or is it your entire site?
2. How often are changes in the pages reflected in Google's cache? Is it a case where Google simply displays old/outdated information all the time? Finally, have you done a "Fetch and Render" check in Google Webmaster Tools?
Best posts made by Cyrus-Shepard
-
RE: Why "title missing or empty" when title tag exists?
Hi Loren,
I took a peek at your website, and checked some things behind the scenes using my super-awesome administrative powers here at SEOmoz. It looks like one of two things happened.
- Rogerbot encountered an error when crawling your site
- Your site had trouble with rogerbot.
In either case, you probably want to contact the help team (help@seomoz.org), especially if the problem persists in the next crawl report.
On another note...
Those extra-long title tags might cause some crawlers a little confusion. I'm not saying they're bad for you, but I doubt they're helping you much from a search engine point of view; I'd say with near certainty that Google is not indexing the entirety of your title tags. Paginated lists like this are tough to get indexed properly. If folks are actually searching for these obscure part numbers, perhaps this is the only way to scale it. That said, I would encourage you to experiment.
-
RE: Does anyone have an SEO strategic plan template for a beginning SEOmoz'r?
I'd also recommend performing a site audit. A good resource would be: http://moz.com/blog/how-to-perform-the-worlds-greatest-seo-audit
Also, our Learn SEO section is a good place to start.
-
RE: Does anyone have an SEO strategic plan template for a beginning SEOmoz'r?
Hi Matthew,
Thanks for the detailed answer. Some good advice here, but I can't say I agree with everything. In particular, it's a pretty broad statement to say that Social marketing has nothing to do with SEO. We know Google uses social media to discover new content, social shares are highly correlated with rankings, and Google has incorporated many SEO-like features into Google+ that have implications across the broader algorithm.
Also, social signals are greatly incorporated into personalized results and local SERPs.
There are a few other points. But I guess we both agree to learn as much as you can and question conventional wisdom. We agree more than we disagree.
-
RE: Whether or not to remove a link from a website with high spam score on Open Site Explorer
Anything with "Link Exchange" in the title should be bumped to a Spam Score of 100.
Those are really, really horrible links. I'd probably disavow them just to be safe. That said, unless you've received a manual action notice in Google Webmaster Tools, it's quite possible those links aren't hurting you at all and Google is simply ignoring them.
Either way, those links are just awful.
-
RE: Amazon CloudFront CDN
Hi Max,
As you know, SEOmoz uses a CDN (Content Delivery Network) to host our static content. This greatly improves the load time of our pages by distributing our content across a cloud network, and results in an improved experience for users.
If I understand your question correctly, you have set up a CDN and have created duplicate content issues.
To solve this, it's important to set up your CDN to serve only static content, like images, stylesheets, and JavaScript. That is what a CDN is designed for. Do not duplicate your entire site - your HTML - as this will cause duplicate content issues.
If for some reason you need to replicate your entire HTML, then there are some steps you can take to mitigate the damage, although it's going to depend on your exact circumstances.
For example, you can set full URL canonical tags so that all your mapped CNAMES point to your primary URL.
To revert back to one copy of your HTML, you might want to put 301 redirects in place on the duplicated content (pointing to the original) before removing them from the CDN.
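As an illustration of the 301 approach (the hostnames here are hypothetical), an Apache mod_rewrite sketch that sends requests hitting the CDN hostname back to the primary domain might look like this:

```apache
# Hypothetical: 301 every request on the CDN hostname back to the
# primary domain before removing the duplicate HTML from the CDN
RewriteEngine On
RewriteCond %{HTTP_HOST} ^cdn\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```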
But even these aren't ideal solutions. It's best just to serve your static content, and only one version of your HTML.
-
RE: How to remove the 4XX Client error,Too many links in a single page Warning and Cannonical Notices.
Hi Amit,
This is an important question, and how you address these errors and warnings depends on your experience level and the needs of your site. It's also a tremendous opportunity to further your SEO education.
For many folks like yourself, the best thing to do is to tackle each one of these issues one at a time, learn from online resources until you are a near expert, then move on to the next one.
Each site is different, so there's no "one size fits all" solution. The exact "fix" will always depend on too many variables to list here, but here's some tips to get you started.
1. 4xx Errors. The best thing to do is download the CSV of your crawl report and open it in a spreadsheet program. Find the URLs that cause the error, and in the last column find the "referrer". This referrer will tell you the URL that the bad link was found on. If you go to this page, you can usually find where the broken link originated and decide if it needs fixing.
2. Too Many Links - This is a warning, not an error, so you may choose not to fix this. To understand the warning further, I recommend reading this article by Dr. Pete:
http://www.seomoz.org/blog/how-many-links-is-too-many
If you decide that you should address the pages with too many links, you can then start to decide which links you should remove.
3. Canonical - Finally, these are notices, which aren't necessarily bad; we just want you to know they are there. For a little background, you might want to read the following:
http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
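The 4xx triage in step 1 can be sketched in a few lines of Python. The column names ("URL", "Status Code", "Referrer") are assumptions; match them to the headers in your actual crawl export.

```python
import csv
from collections import defaultdict

def broken_links_by_referrer(crawl_csv_path):
    """Group 4xx URLs from a crawl-report CSV by the page that links to them.

    Column names are illustrative -- adjust to your export's headers.
    """
    by_referrer = defaultdict(list)
    with open(crawl_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Keep only client-error rows (400-499)
            if row.get("Status Code", "").startswith("4"):
                by_referrer[row.get("Referrer", "")].append(row["URL"])
    return dict(by_referrer)
```

Each referrer then maps to the broken URLs found on it, so you can fix links page by page.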
-
RE: Allow or Disallow First in Robots.txt
Interesting question - I've had this discussion a couple of times with different SEOs. Here's my best understanding: There are actually 2 different answers - one if you are talking about Google, and one for every other search engine.
For most search engines, the "Allow" should come first. This is because the first matching pattern always wins, for the reasons Geoff stated.
But Google is different. They state:
"At a group-member level, in particular for allow and disallow directives, the most specific rule based on the length of the [path] entry will trump the less specific (shorter) rule. The order of precedence for rules with wildcards is undefined." (Robots.txt Specifications - Google Developers)
So for Google, order is not important, only the specificity of the rule based on the length of the entry. But the order of precedence for rules with wildcards is undefined.
This last part is important, because your directives contain wildcards. If I'm reading this right, your particular directives are:
Allow: /models/ford/*/*/page*
Disallow: /models/*/*/*/page*
So if it's "undefined," which directive will Google follow, if order isn't important? Fortunately, there's a simple way to find out. Google Webmaster Tools allows you to test any robots.txt file. I created a dummy file based on your rules. In this case, your directives worked perfectly no matter what order I put them in:
| URL | Result |
| http://cyrusshepard.com/models/ford/test/test/pages | Allowed by line 2: Allow: /models/ford/*/*/page* |
| http://cyrusshepard.com/models/chevy/test/test/pages | Blocked by line 3: Disallow: /models/*/*/*/page* |
So, to summarize:
1. Always put Allow directives first, as most search engines follow the "first rule counts" rule.
2. Google doesn't care about order, but rather the specificity based on the length of the entry.
3. The order of precedence for rules with wildcards is undefined.
4. When in doubt, check your robots.txt file in Google Webmaster Tools.
Hope this helps. (Sorry for the very long answer, which basically says you were right all along.)
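Google's longest-match precedence can be sketched in Python. The helper name and rule list are illustrative (not any real Google API), and ties or wildcard-order edge cases remain undefined per the spec:

```python
import re

def most_specific_rule(path, rules):
    """Return the winning directive for a path under Google-style
    longest-match precedence; defaults to "allow" if nothing matches."""
    best = None
    for directive, pattern in rules:
        # robots.txt wildcards: * matches any characters, $ anchors the end
        regex = "^" + re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
        if re.match(regex, path) and (best is None or len(pattern) > len(best[1])):
            best = (directive, pattern)
    return best[0] if best else "allow"

rules = [
    ("allow", "/models/ford/*/*/page*"),
    ("disallow", "/models/*/*/*/page*"),
]
print(most_specific_rule("/models/ford/test/test/pages", rules))   # allow
print(most_specific_rule("/models/chevy/test/test/pages", rules))  # disallow
```

The longer (more specific) Allow rule wins for the ford URLs, matching what the robots.txt tester showed.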
-
RE: Merging several sites into one - best practice
Hi Andreas,
Looks like you're doing everything right, but I want to make sure all the bases are covered. Depending on the size, link profile, link structure, and domain authority of your site, it can take several weeks for Google and other search engines to completely migrate a domain. Here are some important processes not to overlook.
1. Did you file a change of address within Google webmaster? http://support.google.com/webmasters/bin/answer.py?hl=en&answer=83106
2. When migrating domains, it's important to leave an old sitemap up, so that Google will try crawling the old URLs and register the 301s. If you neglect this step, it may take much longer for Google to crawl the old URLs to see that they've moved.
3. As Joel pointed out, make sure to update as many internal and external links as possible.
That should cover the basics, but there are a million more details you can explore to make the process go more smoothly. For a detailed approach, here are a couple of excellent guides written by some very smart folks.
- https://seogadget.co.uk/domain-migration/
- http://www.seomoz.org/blog/web-site-migration-guide-tips-for-seos
Hope this helps. Best of luck with the migration!
-
RE: Hover texts for hyperlinks
The link title attribute is not known to be a significant ranking factor. In fact, stuffing it with keywords might even have the opposite effect.
This is one of those cases where you ask "what's best for the user?"
-
RE: Is pointing multiple domains to a single website beneficial for SEO or not?
Lots of good answers here.
Generally, there's not much benefit to doing this, especially if the domains are new as Streamline Metrics pointed out.
The risk, however, is if the domains have a negative history associated with them. If bad links were pointed at the old domain, then those links will now point at your main site when you redirect them. If an algorithmic action like Penguin or an over-optimization filter was applied to the old site, you risk carrying that baggage to the new site.
Sometimes it makes sense to redirect domains.
- SEOmoz.com redirects to seomoz.org. Lots of folks type seomoz.com, so this makes sense.
- You migrate an old domain.
In general, however, there's virtually no SEO benefit to buying a previously unestablished keyword-rich domain and redirecting it for a traffic or rankings boost (except on those rare occasions when it's a very popular domain name to begin with).
Hi! Cyrus here. I worked as Lead SEO for Moz intermittently from 2012-15. Today I run my own SEO company known as Zyppy. I help teach people SEO, as well as connect folks who need SEO services with expert consultants. Oh, and I'm very active on Twitter. You should probably follow me there
Best of luck with your SEO!