Doing away with taxonomies, so that articles enhance related pages' value
-
Hello,
I'm developing a website for a law firm, which offers a variety of services.
The site will also feature a blog with similarly named topics. As is customary, these topics were set up as taxonomies.
But I want the articles to enhance the value of the service pages themselves, and because the taxonomy URL /category/divorce has no relationship to the actual service page URL /practice-areas/divorce, I'm worried that, if anything, a redundantly titled taxonomy URL would dilute the value of the service page it relates to.
Sure, I could show some of the related posts on the service page, but if a visitor wants to view more, they're suddenly bounced over to a taxonomy page that steals thunder from the more important service page.
So I did away with these taxonomies altogether, and posts are now associated with pages directly via a custom database table.
Now, when I visit the blog page, instead of a list of category terms it is effectively a list of the service pages, and if a visitor clicks on a topic they are directed to /practice-areas/divorce/resources (the subpages are created dynamically), where the posts are shown.
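To give a rough idea of what I mean by a direct association, it's conceptually just a two-column join table. Here's a minimal sketch using SQLite via Python purely for illustration; the table, column names, and IDs are made up, not my actual schema:

```python
import sqlite3

# Hypothetical schema: each row ties a blog post directly to the service
# page it supports, replacing the taxonomy relationship.
conn = sqlite3.connect("site.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS post_page_associations (
        post_id  INTEGER NOT NULL,   -- the blog post
        page_id  INTEGER NOT NULL,   -- the service page, e.g. /practice-areas/divorce
        PRIMARY KEY (post_id, page_id)
    )
""")

# Associate post 42 with the divorce service page (id 7).
conn.execute(
    "INSERT OR IGNORE INTO post_page_associations (post_id, page_id) VALUES (?, ?)",
    (42, 7),
)
conn.commit()

# The dynamically generated /practice-areas/divorce/resources page would
# then query for every post tied to that service page.
posts = conn.execute(
    "SELECT post_id FROM post_page_associations WHERE page_id = ?", (7,)
).fetchall()
print(posts)
```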
I'll have to use custom breadcrumbs to make it all work. Just wondering if you guys had any thoughts on this. I really appreciate any you might have, and thanks for reading.
-
Thank you for taking the time to respond. Makes a lot of sense; I appreciate it.
-
It is true that having pages with the same "page-name" (the last part of a URL, following the final slash; e.g. the page-name of this question is "ridding-of-taxonomies-so-that-articles-enhance-related-page-s-value") which are also topically very similar can cause 'jumpy' SERPs.
Many feel that the dangers of what is termed 'keyword cannibalisation' are over-egged. That may be true, but I have certainly seen examples of it in action. It usually occurs most prominently when neither page strongly eclipses the other in terms of SEO authority (e.g. inbound signals like referring domains, citations across the web, and general 'buzz' associated with a given URL).
If both pages are new, with little authority (or 'popularity') bound to their unique addresses, then Google can certainly get confused. You can end up with problems like earning a decent ranking for a related keyword, only for it to hop from page to page every day or week as Google's algorithm bubbles away in the background. This can make it hard to drive traffic to the correct destination.
If both pages are very specific about the keywords they are targeting, you could turn references to those keywords on the page you don't want to rank into hyperlinks pointing to the URL which you do want to rank (sorry, that was a bit of a mouthful).
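To illustrate the idea, here is a rough Python sketch; the keyword and target URL are just examples, and real page content would need more careful handling than a blind regex (you would only run something like this over plain body copy, and sparingly):

```python
import re

def link_keyword(body_text: str, keyword: str, target_url: str) -> str:
    """Wrap bare occurrences of `keyword` in a link to `target_url`.

    A naive sketch: it ignores existing markup and links every match,
    so in practice you would apply it selectively.
    """
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    return pattern.sub(
        lambda m: f'<a href="{target_url}">{m.group(0)}</a>',
        body_text,
    )

# Hypothetical example: point mentions on the page you don't want to rank
# at the service page you do want to rank.
body = "Read more about divorce law and how divorce proceedings work."
print(link_keyword(body, "divorce", "/practice-areas/divorce"))
```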
Although TBPR (Toolbar PageRank) was done away with aeons ago, 'actual' PageRank is still at large within Google's ranking algorithm(s). When one page links to another page with anchor text that matches a keyword, it 'gives away' some of its ranking value to the page receiving the link (for the specific keyword, collection of keywords, or search entity in question). Think of links as 'votes' from one page to another. The difference between this and real voting is that, for Google, not all votes are equal: links from more authoritative pages boost the receiving page more than links from pages that nobody cares about. Not very progressive, but still...
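To make the 'votes' idea concrete, here is a toy version of the classic PageRank iteration in Python. This is a textbook simplification, not Google's actual algorithm, and the three-page site is made up: each page splits its value evenly across its outbound links, so votes from higher-value pages carry more weight.

```python
# Toy PageRank: a textbook simplification of the "links as votes" idea.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # Each outbound link "gives away" an equal share of this page's value.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page site: two pages vote for the service page,
# so it accumulates the most value.
site = {
    "/category/divorce": ["/practice-areas/divorce"],
    "/blog/custody-post": ["/practice-areas/divorce"],
    "/practice-areas/divorce": ["/blog/custody-post"],
}
print(pagerank(site))
```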
As an industry, SEOs abused this mechanic between different domains, resulting in Google's current clamp-down on EMA (exact-match anchor) linking. That being said, the risk of doing the same thing internally, within your own website, is extremely minimal, as you are just redistributing SEO authority from one page to another along a specific axis of relevance.
That's not like doing it from one domain to another, obviously to leech authority from an external site to your own, which in most cases is a violation of Google's Webmaster Guidelines.
Do be careful, though; don't overdo this. If the content of the page you don't want to rank ends up stuffed full of hyperlinks, that could make the page look spammy, hurt your conversion rate, or earn a Panda-related algorithmic devaluation.
Just don't go mental, and everything should be fine.