Adding parameters in URLs and linking to a page
-
Hi,
Here's a fairly technical question:
We would like to implement a badge feature, where websites linking to us via a badge would use URLs such as:
domain.com/page?state=texas&city=houston
domain.com/page?state=nevada&city=lasvegas
Important note: the parameter will change the information and layout of the page: domain.com/page
Would those two URLs above, with their extra parameters, be considered the same page as domain.com/page by Google's crawler?
We're considering adding the parameters "state" and "city" to the Google WMT URL parameter tool to tell Google how to handle them.
Any feedback or comments are appreciated!
Thanks in advance.
Martin
-
Thanks Paul. You confirmed our understanding.
Another head is always better!
Thanks again.
Martin
-
Sorry for the misunderstanding, Martin. I was misled by this statement:
... the parameter will change the information and layout of the page
If the content of the page will truly be different depending on the parameter, the search engines are probably going to consider them separate URLs no matter what you do. That's their whole definition of separate pages.
If the parameter simply rearranges a page's content (e.g. creates a different sort order for the same products) then combining the pages can usually work.
If you want to try to combine these pages for ranking purposes anyway, the tool to use is the rel=canonical tag. You insert it into the header of the custom parameter pages pointing back to the primary page.
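Using the example URLs from earlier in the thread, the tag would look something like this in the `<head>` of each parameterized page:

```html
<!-- placed in the <head> of domain.com/page?state=texas&city=houston -->
<link rel="canonical" href="http://domain.com/page" />
```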
Google is clear though that canonical tags are taken as suggestions only, and if it thinks your pages should actually be indexed separately, it will ignore the canonical.
You could then back this process up by using Bing & Google WMT to specify that the state= and city= parameters are to be ignored. But the above caveat still applies.
Paul
-
Thank you for your feedback Paul.
Well, I guess I didn't explain myself correctly, because we do want to have all the parameter variations considered one page!
Our goal is twofold:
1- Have the URLs with parameters considered one page, so we end up with a single page with many links pointing to it and higher page authority (instead of 20+ pages, each with lower authority).
2- Keep the user experience relevant by displaying and organizing the page content based on the referring URL (in our case, the links from badges).
So, in other words, how do we (if possible) make sure Google and Bing treat all the parameter variations as one combined page?
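(Editor's note: as a sketch of goal 2, assuming a hypothetical Python backend — the helper name and URLs are illustrative, not from Martin's site — the server could read the badge parameters from the query string before choosing the page layout:)

```python
from urllib.parse import parse_qs, urlparse


def badge_params(url):
    """Extract the state/city badge parameters from a landing URL."""
    query = parse_qs(urlparse(url).query)
    # parse_qs returns lists of values; take the first, defaulting to None
    return {
        "state": query.get("state", [None])[0],
        "city": query.get("city", [None])[0],
    }


params = badge_params("http://domain.com/page?state=texas&city=houston")
print(params)  # {'state': 'texas', 'city': 'houston'}
```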
Thanks again!
Martin
-
You're actually in the reverse position of most people dealing with URL parameters, Martin. By default, the Search Engines consider different parameters to be different pages and most users are struggling with how to make the SEs understand they're all one page to avoid duplicate content issues.
In your case, you want the SEs to treat those pages the way they normally would, which means you shouldn't need to do anything extra to get them indexed separately. That's what's naturally going to happen.
I'm with you on the hinting in Google Webmaster Tools though. May as well use that capability to confirm for Google that you do want those parameters indexed. You should do the same thing in Bing Webmaster Tools as well. Kind of a "belt & suspenders" approach.
Paul