Stuffing keywords into URLs
-
The following site ranks #1 in Google for almost every key phrase in its URL paths, for almost every page on the site. Example: themarketinganalysts.com/en/pages/medical-translation-interpretation-pharmaceutical-equipment-specifications-medical-literature-hippa/ The last folder in this URL uses 9 keywords, and I've seen as many as 18 on the same site. Curious: every page is a "default.html" under one of these kinds of folders (why so much architecture?).
Question: How much does stuffing keywords into URL paths affect ranking? If it has an effect, will Google eventually ferret it out and penalize it?
-
This was a good answer and deserves to be labeled as such. I decided not to pursue this since I have been lucky to take the top spot for important key phrases. Thank you for such a well crafted answer.
-
Hi Paul, no problem at all. As Ryan says, we all like a mystery.
As for the canonicals, they can have a big effect if all variations of the domain are present, i.e. http://mydomain.com/, http://www.mydomain.com/, etc.
Not only are these duplicate pages, they will most likely split up any inbound link juice, as you can see from the PA of the pages you mention. Go to the http:// version and the http://www version and you'll see the problem!
Using <link rel="canonical" href="http://www.mydomain.com/" /> would probably be sufficient, and should be included, but I think it's best to have the canonicals redirected properly in the .htaccess.
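As a sketch, the .htaccess rules for forcing every variation onto the www host usually look something like this (mydomain.com is a placeholder, and the exact rules depend on the server setup):

```apache
RewriteEngine On
# If the request arrived on the bare domain, 301 it to the www host.
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]
```

With this in place, the duplicate non-www pages stop existing as separate URLs, so inbound link juice consolidates onto one version.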
Very best wishes
Trevor
-
Thanks for the kind words Paul.
If you are looking for outstanding SEOs to follow, I would recommend EGOL and Alan Bleiweiss. I merely ride in the wake of their excellence.
Your response jumped around a bit but a few replies I would offer:
-
You are right. The value of most directories has dropped significantly. There are very few that offer any real value nowadays.
-
MVC is the current best practice for web design, but friendly URLs are a separate item. You can achieve them with or without MVC.
-
Most people who complain about their site's ranking drop actually have on-site issues if you look closely. I can't begin to share how many people I have encountered who insisted their site was outstanding when it had numerous issues.
-
Likewise, I have worked with clients who were quite upset about other sites that ranked well, referring to them as "junk" sites, when those rankings were earned. Yes, there are exceptions and Google still has work to do, but they are doing a reasonable job. The truly bad sites usually disappear in 4-8 weeks.
-
I know nothing about "The Marketing Analysts" but they could have an offline presence or have undergone a name change, which may explain the "Since 1989" claim. Let's remember Al Gore didn't invent the internet until about 1996, and there have been tremendous changes since then.
-
Hi Egol,
Thank you for your reply. The long folder names are probably from using WordPress, as you pointed out. I found a blog on their subdomain using WordPress.
I have to say that I've enjoyed reading your responses throughout the QA forum because your responses are short and to the point, pithy and no-BS. So, I'm curious about your response to my question. Above you responded "I doubt it" to the question about Google ferreting out keyword stuffed URL paths. Instead of trying to read between the lines, let me ask you, how good of a job is Google doing? How are they falling short?
Kindest regards,
Paul
-
Hi Trevor!
Thank you for your response! I'm VERY new to the concept of canonical issues. As I noted in my other response, I'm just getting back into the game. How big a role do you think the canonical issue really plays?
Kind regards,
Paul
-
Hi Ryan!!
Man, I'm thrilled to see you responded, and that you responded so thoroughly. I've been reading threads in this QA forum for a few days, and I've come to think of you as a bit of an SEO celebrity! I have to figure out how to filter for questions you've answered! : )
Okay...the site. I've been away from SEO for about eight years and a lot has changed. In the past, I've enjoyed top positions in the SERPs under highly competitive key phrases, even recently (probably because good legacy websites seem to carry weight). Back then, I placed my primary site in directories thinking that people who visited the directories would see my listing and click on it and visit me (as opposed to getting a link to get "juice"). This is probably what has been giving my site good rankings for a while, and the fact that I've never used web-chicanery to outrank others. Over the years, I've seen spammy and trickster sites appear and disappear. I used to rip those sites (the only way to get a global vision of what's going on), and I studied what they did. I've got a curious little archive of living black hat tricks, all of which failed as Google caught on to them.
Now I turn my reflectors back on to what's going on in SEO and what companies and individuals are doing to position themselves in SERPs. I'm saddened to report this, but for all the overhauls, tweaking and tinkering that Google has done since 2001 when I started, spammy sites and sites with poor content, usability, usefulness, and design are still outranking truly useful, high-quality, high-integrity sites.
Very recently, I read complaints by people who felt like their sites had been unfairly affected by the Panda update (http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en). I followed the links to "offending" sites (sites people felt ranked higher than theirs for no good reasons), and I went through the code in the complainants' sites as well. Holy cow... many of the complainants have good reason to complain. Shallow, spammy, zero-effort sites are blowing away robust sites with truly useful content. In 10 years I've NEVER had a sinking feeling in my gut that ranking well was a crapshoot - but I got that feeling after studying those complaints.
Years ago I worked in Macromedia Dreamweaver (remember how cool "Macromedia" was?) with regular HTML and nowadays I work in Visual Studio, just recently creating my first MVC3 site. MVC allows you to manipulate every tiny aspect of your site, including the URL path. There is absolutely no relation between the path you see in your browser and the actual path to the files on the server. And you can change the path and name of any page instantly and almost effortlessly. It's GREAT for SEO. So, I've been paying special attention to directory names and page names out there on the Internet. That's when I came across "themarketinganalysts" site and their unusually high rankings for so many important key phrases. After combing through that site, studying the source code, checking their rankings across many key phrases - I have to say, regardless of PA of 53 and keyword variances, the code reminds me of some of the code from spammy trickster sites from the early 90s.
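The URL-to-handler decoupling described above can be sketched in miniature (a hypothetical stand-in, not the actual ASP.NET MVC routing API):

```python
# Minimal illustration of why MVC-style routing makes friendly URLs cheap:
# the public path is just a key in a route table, with no relation to any
# file on disk, and renaming a page is a table update plus a 301 entry.
# (Hypothetical sketch -- paths and page bodies are invented.)

ROUTES = {
    "/medical-translation/": lambda: "Medical translation page",
}
REDIRECTS = {}  # old path -> new path, served as HTTP 301

def rename(old_path, new_path):
    """Move a page to a new friendly URL and 301 the old one."""
    ROUTES[new_path] = ROUTES.pop(old_path)
    REDIRECTS[old_path] = new_path

def dispatch(path):
    """Resolve a request path to (status, body_or_location)."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    if path in ROUTES:
        return 200, ROUTES[path]()
    return 404, "Not Found"
```

Changing a page's path is a single `rename` call, and visitors and engines hitting the old URL get a clean 301 to the new one.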
If you hand-code HTML, you get a certain vision for what the page will look like as you type along, from the mind's eye of a visitor. When you go to a site and the code is packed with keywords and weird use of elements (like themarketinganalysts' textless use of the H1 tag to render the logo through CSS - an old trick to put keyword-bearing markup next to the logo), you get the feeling that whoever wrote that code is telling search engines one thing, and visitors something different. It's duplicitous. Oddly enough, I'm not fazed by a company that outranks me (there is enough work for ALL of us), but I want to see healthy optimization, not one story in the code and another on the rendered page.
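For reference, one common form of the textless-H1 trick mentioned above looks something like this (a hypothetical sketch - the class name, logo path, and keyword text are invented, not copied from the site in question):

```html
<!-- The H1 carries keyword text for engines; CSS swaps in the logo
     and pushes the text off-screen so visitors never see it. -->
<h1 class="logo">Medical Translation Services</h1>
<style>
  .logo {
    background: url(/images/logo.png) no-repeat;
    width: 220px;
    height: 60px;
    text-indent: -9999px; /* hides the keyword text from visitors */
    overflow: hidden;
  }
</style>
```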
I'm going to do a more in-depth review of the code, page by page, look for trends and track down the sources that provide PA coefficients (or try to!). I’ll use the Wayback Machine to study the evolution of the site. Off the bat:
Mar 21, 2009: "This website coming soon"
Mar 31, 2009: "PREDICTIVE WEB ANALYTICS" - nothing about translation
May 25, 2009: Starts taking its current form
Odd. The current site claims: "Since 1989, The MARKETING ANALYSTS has built its Language Translation Services business..." That claim is not supported by what the Wayback Machine shows. Geesh... Did I stumble across enterprise-wide shadiness? Hope not!
I'll come back to you and share my SEO findings.
-
Yep, those PAs are strong even without canonicalization. Let's hope for Paul's sake that the site doesn't get an SEO audit anytime soon!
-
Really great catch on the canonical issue Trevor! The entire time I just knew I was missing something, and that's it.
The www version of the URL has a PA of 53, which puts it even stronger than the wiki page. The links mostly use "medical translation" as the anchor text, with some "medical translator" and "medical translation service" variances thrown in. The link profile is varied enough to satisfy me that the page has earned its ranking.
-
Hi Ryan, I noticed that the site has a canonical issue with both an http:// and a www version too. Nice and thorough analysis; really interesting regarding the flag. Now I'm back home I might just have to take a look... although I really should think about getting some shut-eye here in Blighty.
-
I love a great SEO mystery and, for me at least, you have found one. I think this is a case for the famous SEO forensic analyst Alan "Sherlock" Bleiweiss.
I can confirm your overall findings and cannot explain the results. Specifically, on Google.com I searched for "medical translation". The results are listed below.
Result #19: http://en.wikipedia.org/wiki/Medical_translation
PA: 52, DA 98
Title: Medical translation - Wikipedia, the free encyclopedia
H1: Medical translation
First words of content: Medical translation is the translation of technical, regulatory....
Internal links (2): Anchor text on both links is "medical translation". Lowest PA of a linked page is 61. About 1000 links per page.
Result #1: the themarketinganalysts.com page from the question above
Title: Medical Translation Services: Pharmaceutical, Equipment, Specifications, Medical Literature, HIPPA, [99 chars in title so display is cut-off]
PA: 12, DA 60
H1: none. H2: Medical Translation: Medical Translation Services: Pharmaceutical, Equipment, Specifications, Medical Literature, HIPPA
First words of content: When it comes to the medical translation, you can trust THE MARKETING ANALYSTS.
Internal links (3): Anchor text on all three "Medical translation". The highest PA from a page is 15. One of the links is from the home page which has 220 links total.
As I try to reach for some other factor that would allow this site to rank so well compared to the wiki page I notice the following:
-
the site has "medical translation" in its navigation bar
-
the site has a link in the left sidebar of the home page directly to the page. The sidebar is a tad spammy with 43 links.
The above two items are factors, but not enough to do it for me.
I still couldn't explain the ranking, so I searched the page for the term "medical". It appeared only twice in the visible text, but a browser "find" indicated the term was being used many more times on the page without being visible. After searching the HTML and CSS I determined there was extra hidden content. I could not find anything suspicious in the CSS and was puzzled about how this content was being hidden; then I realized the "trick" involved.
Please notice the US/UK flag in the upper-right area of the page. Press it. Voila! The home page contains extra content directly related to Medical Transcription that no one will ever see. The content includes "Medical Transcription" as an H3 tag, a link to the target page, and a nice paragraph.
This technique is squarely black hat. The purpose of a language button is to offer a translation. There is only one button, for the language the page is already being presented in, so no one will ever press it. The content is additional text and links which have nothing to do with a translation.
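A rough way to automate the kind of check described above - comparing keyword occurrences in visible text against text buried in hidden containers - is sketched below. It is a heuristic only, assuming inline `display:none` styling; real pages hide content via external CSS, tabs, or scripts that a plain HTML parse cannot see:

```python
# Heuristic sketch: count term occurrences in visible text versus text
# inside inline display:none subtrees, using only the standard library.
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # >0 while inside a display:none subtree
        self.visible = []
        self.hidden = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style") or ""
        if self.hidden_depth or "display:none" in style.replace(" ", ""):
            self.hidden_depth += 1

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        (self.hidden if self.hidden_depth else self.visible).append(data)

def keyword_counts(html, term):
    """Return (visible_count, hidden_count) of `term`, case-insensitive."""
    parser = HiddenTextFinder()
    parser.feed(html)
    t = term.lower()
    return (" ".join(parser.visible).lower().count(t),
            " ".join(parser.hidden).lower().count(t))
```

On a page like the one described, `keyword_counts(source, "medical")` would report far more hidden occurrences than visible ones, which is the tell that triggered the manual "find" in the first place.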
Even so, I find it interesting this content is enough to yield the #1 ranking in the SERP. Either there is another factor remaining that I could not locate (I really don't think that is the case, but would love to hear from others) or Google is putting more weight on home page content. I have always felt home page content was very strong, but this page simply is not strong enough to blow the wiki page away like this, unless Google is weighing home page content quite heavily.
I like the Yahoo results MUCH better for this search. Wiki is #2 and this page is #13. Bing shows Wiki as #5 with this page as #13. I am OK with those rankings as well.
-
WordPress produces similar long URLs that match the post title.
-
How much will it help? Very little, except where competition is very low.
Will Google ferret it out? I doubt it.
-
Hi Paul
Wow! To me that just looks so spammy and over-optimised. I would think that the SEs would think the same, but as you say the URLs rank #1.
What are the other metrics like for the site, perhaps they may show the reasons for high rankings?
Update: Just taken a quick look and it does seem the domain is quite strong with a DA of 60. Having said that, they have a canonical issue which, if they sorted it, may make them even stronger... so keep that quiet!