Best posts made by Dr-Pete
-
RE: Tactics to Influence Keywords in Google's "Search Suggest" / Autocomplete in Instant?
Someone who will not be named (it rhymes with "Bomb Itch Slow") told me to Mechanical Turk the crap out of it.
-
RE: Hoping someone could take some time to give me some feedback / advice. Thanks!
Thanks for sharing your story, Rick. My wife and I lost our first pregnancy due to Turner's Syndrome, so I'm painfully familiar with how random the genetic lottery can be. I'm happy to say we have a healthy, happy 17-month-old girl now. I'm glad to hear Noah is doing well, and I'm heartened to hear how proactive the doctors are being.
First off, I'd just like to say that you're doing a lot right. You have a well-designed site with great content, a good core structure, and many of the important features of a modern site/blog. The wide world of SEO can be overwhelming, but it's rare that you need to tackle it all at once.
I think it's great to be thinking proactively about categorizing your content, and it's ok to let that evolve organically as your needs become clearer. Categorizing the videos certainly makes sense.
At this point, though, given that your basic structure is good and you've got a lot of content, the social and link-building aspects are probably equally or more important. You have one tremendous tool at your disposal - sincere passion that can connect you to an audience. Your own outreach efforts, interactions with other parents, discussion boards, communities, etc. will go a LONG way. As you build relationships, links will start building themselves.
One thing that wasn't clear to me until I fully read your post and dug into the site was that your wife is a pediatrician. The "Mom MD" just read like a cute category name to me (no offense intended - that was just my first impression). This fact, IMO, adds a lot of credibility to what you're doing, and makes this more than a personal blog. I'd make this clear, especially on the About page and at the top of the Mom MD section.
-
RE: Are press release sites useful?
I have some smaller clients who have had limited luck with it, but I think it's best to just stick to one of the sites and do periodic releases (maybe every couple of months). If nothing else, it'll give you a sense of what's working, and you can take some of the popular releases and push to get them in front of a broader market.
What I wouldn't do is go after multiple low-value PR services and plaster the same releases everywhere you can. At best, it's diminishing returns - at worst, they'll be devalued. I'm with Peter G. - the best press release opportunities come through relationships with the media. Obviously, though, that takes a lot more time.
-
RE: Edu links service
Rand has a great post on link valuation:
http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links
There's no magic to .edu links, frankly - the data over the past couple of years doesn't really support that they're inherently better than .com's, etc. It's true that many .edu sites are high-authority sites, of course, but that's just a correlation (it's not that Google prefers .edu or .gov inherently).
Within any site, though, you have to look at the Page Authority, the number of links on that page, the placement of the links, the anchor text, the relevance (to some degree), and a lot of other factors. Let's take a non-edu example - DMOZ. People kill themselves for DMOZ links, but lately I'm seeing DMOZ listings where the entire page isn't indexed because it's so deep. No indexation means ZERO link juice. So, even though it's DMOZ, the link is worthless.
-
RE: How highly do you value a link from the BBB?
This gets into the realm of opinion pretty fast - it can be shockingly difficult to measure the value of one link. Here are a few of my opinions:
(1) One link is one link. It's rarely the magic pill people want it to be, even from a very authoritative site. I've seen people get a link like this and then sit on their hands waiting for a sudden change in rankings, and it almost never comes. If you're just starting out and you have little or no link profile, a strong link can kick-start you, but I wouldn't pay $750 just to get a link if your site is established (I'm not sure I'd pay it even if your site is new).
(2) DA and PA both matter, and how much each matters can really vary with the situation. Your profile on a deep page of BBB is not an authority=96 link. It will carry weight, but the weight of any given profile could vary a lot.
(3) BBB has gotten a bit more aggressive, IMO, and I suspect Google will devalue these links over time. People tell me that they haven't yet, in this case, but it is, in essence, a paid link. Any day, Google could say "These BBB links are counting too much" and just lower the volume. So, don't put all your eggs in one basket, no matter what you do.
Now, to be fair, your BBB listing does have other value, like using it as a trust signal. The business case for spending the money goes beyond SEO, and that's a decision you have to make for yourself. If 100% of your interest in the listing is for a followed link, though, I personally would spend the money elsewhere.
-
RE: Removing Content 301 vs 410 question
Let me jump in and clarify one small detail. If you delete a page, which would naturally result in a 404, but then 301-redirect that page/URL, there is no 404. I understand the confusion, but ultimately you can only have one HTTP status code. So, if the page properly 301s, it will never return a 404, even if it's technically deleted.
If the page 301s to a page that looks like a "not found" sort of page (content-wise), Google could consider that a "soft 404". Typically, though, once the 301 is in place, the 404 is moot.
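If it helps to see that concretely, here's a minimal sketch - Python with the common requests library, and a hypothetical example.com URL - that checks the single status code the old URL returns and then, separately, the status of wherever it redirects:

import requests

# Hypothetical deleted-then-redirected URL, just for illustration.
url = "https://www.example.com/old-page"

# Don't follow redirects, so we see the one status code this URL returns.
resp = requests.head(url, allow_redirects=False, timeout=10)
print(resp.status_code)              # 301 if the redirect is in place - never a 404
print(resp.headers.get("Location"))  # where the 301 points

# Following redirects just moves the question to the destination URL.
final = requests.head(url, allow_redirects=True, timeout=10)
print(final.status_code)             # status of the target page, not the old URL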
For any change in status, the removal of crawl paths could slow Google's re-processing of those pages. Even if you delete a page, Google has to re-crawl it to see the 404. Now, if it's a high-authority page or has inbound (external) links, it could get re-crawled even if you cut the internal links. If it's a deep, low-value page, though, it may take Google a long time to get back and see those new signals. So, sometimes we recommend keeping the paths open.
There are other ways to kick Google to re-crawl, such as keeping an XML sitemap available with those pages in it (while removing the internal links). These signals aren't as powerful, but they can help the process along.
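If you go the sitemap route, here's a minimal sketch (hypothetical URLs, plain Python) of a small "recrawl" sitemap that lists the removed/redirected URLs even though no internal links point to them anymore:

# Hypothetical URLs whose internal links were cut, but which we still
# want Google to re-crawl so it can see the new status codes.
removed_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

entries = "\n".join("  <url><loc>" + u + "</loc></url>" for u in removed_urls)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n"
    "</urlset>"
)

with open("removed-urls-sitemap.xml", "w") as f:
    f.write(sitemap)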
As to your specific questions:
(1) It's very tricky, in practice, especially at large scale. I think step 1 is to dig into your index/cache (slice and dice with the site: operator) and see if Google has removed these pages. There are cases where massive 301s, etc. can look fishy to Google, but usually, once a page is gone, it's gone. If Google has redirected/removed these pages, and you're still penalized, then you may be fixing the wrong problem or possibly haven't gone far enough.
(2) It really depends on the issue. If you cut too deep and somehow cut off crawl paths or stranded inbound links, then you may need to re-establish some links/pages. If you 301'ed a lot of low-value content (and possibly bad links), you may actually need to cut some of those 301s and let those pages die off. I agree with @mememax that sometimes a healthy combination of 301s/404s is a better bet - pages go away, and 404s are normal if there's really no good alternative to the page that's gone.
-
RE: Edu links service
I appreciate your transparency, but to me that looks like low-quality article spinning. It's ok to a point, and it may get you a short-term boost, but those pages are going to be devalued over time. Plus, they have no other value (those links won't drive traffic).
As for the argument that, because Google can do whatever they want, anything we do is ok - I strongly disagree. There are link-building tactics that can create long-term problems. Should a client risk a full-on penalty for a low-quality link-building tactic that might get them a 5% boost for 3 months? For me to suggest that as an SEO would be grossly irresponsible. There are smart risks and there are bad risks.
-
RE: Do I need to add canonical link tags to pages that I promote & track w/ UTM tags?
I find Google is usually good about UTM parameters, but not always - for use in AdWords, they're almost never a problem, but when you use them for custom tracking, they can start to cause duplicates. Bing/Yahoo also don't handle them very well.
I'm not sure on the scope of your site/usage right now, so it's hard to give a definitive solution, but my gut reaction is that I would use canonical tags on the affected pages. If you want to double-check, you can test for the URLs in the Google index. Use something like:
site:example.com inurl:utm_source
If they're not being indexed, you're probably ok, and can just keep an eye on it. If it's just a few landing pages, though (and not a massive, site-wide issue), I'd be proactive and put a canonical tag in place, if it were me.
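If you do put canonical tags in place, the canonical URL is simply the landing page URL with the utm_* parameters stripped off. Here's a minimal sketch of that clean-up (standard-library Python, hypothetical URL) - the result is what goes in the rel="canonical" href:

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url):
    # Drop utm_* tracking parameters so every tagged variant maps to one canonical URL.
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith("utm_")]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

tagged = "https://www.example.com/landing?utm_source=newsletter&utm_medium=email"
print(canonical_url(tagged))
# https://www.example.com/landing - the href to use in the canonical tag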
-
RE: How to prevent duplicate content at a calendar page
Sadly, the short answer is that you can't have it all. Either you index the separate calendar pages, get more pages/content out there and risk some "thinning" of your index, or you focus on one page, maximize the SEO value, but then lose the individual pages.
I would not 301 or 302 to the individual calendar URLs - that kind of daily URL shifting is going to look suspicious, Google will not re-cache consistently, and you're going to end up with a long-term mess, I strongly suspect.
I actually tend to agree with Muhammed and Paragon that a viable option would be to let the individual days have their own content, but then canonical to the main calendar page to focus the search results. That way, users can still cycle through each individual day, but Google will focus on the core content. In a way, that's how a blog home page works - the content changes daily, but you're still keeping the bots focused on one URL.
Think of it in terms of usability, too. How valuable is old/outdated content to search users? They might find something relevant on an old page, but they still probably want to see the main calendar and view recent content.
Where are the links to the individual days, if "/calendar" always has today's content? I'm wondering if there's a hybrid approach, like letting the most recent 30 days all have their own URLs, but then redirecting or using rel-canonical to point to the main page after 30 days.
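To make that hybrid idea concrete, here's a minimal sketch (hypothetical URL structure, Python just to illustrate the logic) of how each day page could pick its canonical target - its own URL while it's recent, the main calendar page once it's stale:

from datetime import date, timedelta

RECENT_WINDOW_DAYS = 30  # assumption: 30 days counts as "recent"

def canonical_for_day(day, today):
    # Recent days keep their own canonical URL; older days point to /calendar.
    if (today - day).days <= RECENT_WINDOW_DAYS:
        return "https://www.example.com/calendar/" + day.isoformat()
    return "https://www.example.com/calendar"

today = date.today()
for offset in (1, 45):
    href = canonical_for_day(today - timedelta(days=offset), today)
    print('<link rel="canonical" href="' + href + '">')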
-
RE: Are pages with a canonical tag indexed?
I have to disagree on this one. If Google honors a canonical tag, the non-canonical page will generally disappear from the index, at least inasmuch as we can measure it (with "site:", getting it to rank, etc.). It's a strong signal in many cases.
This is part of the reason Google introduced rel=prev/next for paginated content. With canonical, pages in the series aren't usually able to rank. Rel=prev/next allows them to rank without clogging up the index (theoretically). For search pagination, it's generally a better solution.
If your paginated content is still showing in large quantities in the index, Google may not be honoring the canonical tag properly, and those pages could be causing duplicate content issues. It depends on the implementation, but Google recommends these days that you don't canonical to the first page of search results, and they may choose to ignore the tag in some cases.
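For reference, here's a minimal sketch (hypothetical URL pattern; Python just to render the markup) of the rel=prev/next link tags a page in a series would carry, instead of a canonical pointing at page 1:

def pagination_link_tags(base_url, page, last_page):
    # Render the rel=prev/next tags for one page in a paginated series.
    tags = []
    if page > 1:
        prev_url = base_url if page == 2 else base_url + "?page=" + str(page - 1)
        tags.append('<link rel="prev" href="' + prev_url + '">')
    if page < last_page:
        tags.append('<link rel="next" href="' + base_url + "?page=" + str(page + 1) + '">')
    return tags

for tag in pagination_link_tags("https://www.example.com/widgets", 3, 10):
    print(tag)
# <link rel="prev" href="https://www.example.com/widgets?page=2">
# <link rel="next" href="https://www.example.com/widgets?page=4">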
-
RE: Duplicate title-tags with pagination and canonical
Unfortunately, it can be really tough to tell if Google is honoring the rel=prev/next tags, but I've had gradually better luck with those tags this year. I honestly think the GWT issue is a mistake on Google's part, and probably isn't a big deal. They do technically index all of the pages in the series, but the rel=prev/next tags should mitigate any ranking issues that could occur from near-duplicate content. You could add the page # to the title, but I doubt it would have any noticeable impact (other than possibly killing the GWT warning).
I would not canonical to the top page - that's specifically not recommended by Google and has fallen into disfavor over the past couple of years. Technically, you can canonical to a "View All" page, but that has its own issues (practically speaking - such as speed and usability).
Do you have any search/sort filters that may be spinning out other copies, beyond just the paginated series? That could be clouding the issue, and these things do get complicated.
I've had luck in the past with using META NOINDEX, FOLLOW on pages 2+ of pagination, but I've gradually switched to rel=prev/next. Google seems to be getting pickier about NOINDEX, and doesn't always follow the cues consistently. Unfortunately, this is true for all of the cues/tags these days.
Sorry, that's a very long way of saying that I suspect you're ok in this case, as long as the tags are properly implemented. You could tell GWT to ignore the page= parameter in parameter handling, but I'm honestly not sure what impact that has in conjunction with rel=prev/next. It might kill the warning, but the warning's just a warning.
-
RE: Is SEO Certification.org Worth Having?
Unfortunately, I haven't heard of many certifications that the industry respects. I think the MarketMotive program is a good one, but that's more for the training than the piece of paper, IMO.
It really depends on your goal. If it's for clients or your employer, get the certifications they value. It's all about perception, in that case. If it's for selling your services directly, I'm not sure I'd bother. The training can be good, but you still have to pound the pavement.
I'm Google AdWords certified, for example. It's a decent program, and some of my existing clients like that I have it, but when I first got it, it did little or nothing to bring in new clients. The training program itself is good, but you can do that without ever paying them a dime or taking the test.
Is there a specific aspect of SEO you're trying to learn?
-
RE: Can PDF be seen as duplicate content? If so, how to prevent it?
If they duplicate your main content, I think the header-level canonical may be a good way to go. For the syndication scenario, it's tough, because then you're knocking those PDFs out of the rankings, potentially, in favor of someone else's content.
Honestly, I've seen very few people deal with canonicalization for PDFs, and even those cases were small or obvious (like a page with the exact same content being outranked by the duplicate PDF). It's kind of uncharted territory.
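For the header-level option, here's a minimal sketch of serving a PDF with an HTTP Link canonical header - this assumes a Flask app and hypothetical file/URL names, so adjust for however your PDFs are actually served:

from flask import Flask, send_file

app = Flask(__name__)

@app.route("/downloads/whitepaper.pdf")
def whitepaper_pdf():
    # PDFs have no <head> for a canonical tag, so send the hint as an HTTP header,
    # pointing at the HTML page that duplicates this content.
    response = send_file("static/whitepaper.pdf", mimetype="application/pdf")
    response.headers["Link"] = '<https://www.example.com/whitepaper>; rel="canonical"'
    return response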
-
RE: Footer Links And Link Juice
Just to add to the consensus (although credit goes to multiple people on the thread) - PR-sculpting with nofollow on internal links no longer works, and it can be counter-productive. If these links are needed for users, don't worry about them, and don't disrupt PR flow through your site. Ultimately, you're only talking about a few pages, and @sprynewmedia is right - Google probably discounts footer links even internally (although we may have no good way to measure this).
Be careful with links like "register", though, because sometimes they spin off URL variations, and you don't want those all indexed. In that case, you'd probably want to NOINDEX the target page - it just doesn't have any search value. I'm not seeing that link in your footer, though, so I'm not clear on what it does. I see this a lot with "login" links.
-
RE: DA vs PA when building links
EGOL is essentially correct - DA and PA are measures of ranking power (they both factor in multiple variables), but we don't currently model things like the likelihood that a link is spammy - although we're working on that. So, it is definitely possible for a site to have high authority in theory but be devalued by Google in practice.
It also depends on whether you mean "trust" in a broad sense or specifically something like TrustRank. Our MozTrust metric was intended to approximate TrustRank, which essentially measures how far a site is from a seed set of trusted sites. That's a way we believe Google has quantified "trust" in the past. I don't believe that PA/DA factor in MozTrust, but I'm not entirely sure on that one.
In terms of link value, DA and PA can both matter, and it depends a bit on the situation - though both metrics are only a small piece of the puzzle. If the numbers are similar and low (like 20/34 or 34/20), I wouldn't obsess about it. It's when they differ quite a bit that you might want to consider both. A weak page on a very strong domain or a very strong page on a weak domain both have potential value as link sources.
-
RE: Should I use rel=canonical on similar product pages.
So, here's the problem - if you follow the official uses of our options, then there is no answer. You can't have thin content or Google will slap you with Panda (or, at the very least, devalue your rankings), you can't use rel=canonical on pages that aren't 100% duplicates, and you're not supposed to (according to Google) just NOINDEX content. The official advice is: "Let us sort it out, but if we don't sort it out, we'll smack you down."
I don't mean that to be critical of your comment, but I'm very frustrated with the official party line from Google. Practically speaking, I've found index control to be extremely effective even before Panda, and critical for big sites post-Panda. Sometimes, that means embracing imperfect solutions. The right tool for any situation can be complex (and it may be a combination of tools), but rel=canonical is powerful and often effective, in my experience.