How does Google rank a website's search query pages?
-
Hello, I can't seem to find an answer anywhere. I was wondering how a website's internal search query URL (the keyword-string URL) can rank above other pages that have stronger backlinks. The domain is usually strong, but a URL like .php?search=keyword just doesn't seem to fit in. How does Google index those search-string pages? Is it based on traffic to that URL alone? Those URLs typically don't have backlinks, right? Has anyone ever tried to rank their website's search query URLs? I'm just a little curious about it. Thanks everyone. Jesse
-
If what you say is true, they have done some horrible SEO: since Google will index both their product pages and their search pages, there are serious duplicate content issues to worry about.
You should always tell Google not to index search result pages to avoid this problem. You don't want your search results indexed; you do, however, want your product category pages indexed.
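As a sketch of what "telling Google not to index the search result page" looks like in practice, a meta robots tag on the internal search results template keeps those pages out of the index while still letting crawlers follow the links on them (the template name here is hypothetical):

```html
<!-- In the <head> of the internal search results template,
     e.g. search.php (hypothetical for your site): -->
<meta name="robots" content="noindex, follow">
```

Note that a robots.txt Disallow is not a substitute here: it blocks crawling, so Google may still index the blocked URL from external links without ever seeing a noindex tag.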
-
Thanks, Ryan, for the fast response. The URL I was thinking of is http://www.datpiff.com/mixtapes-search.php?criteria=keyword: j cole For the keyword "J Cole Mixtapes", this page ranks well, around #4. I don't think it's a designed page, though; it seems to be the site's search results page. Thanks.
-
Hi Jesse.
Normally Google prefers not to index those pages, as "search results within search results" are not a positive user experience.
The thing to check is whether the site has designed specific pages that appear to be search results but really are not. In these cases they are dynamic or static web pages styled to partly look like search results, and they may even have "search" in the URL.
An example: http://www.gigmasters.com/Search/DJ-Tampa-FL.html
Related Questions
-
Do content copycats (plagiarism) hurt original website rankings?
Hi all, We found some websites that have stolen our content and are using the same sentences on their pages. Does this hurt our website's rankings? Their DA is low, but we are still worried about the damage from this plagiarism. Thanks
White Hat / Black Hat SEO | vtmoz
-
Penalized website has never been the same
In February 2015 I received an email from Google saying the site had been penalized for thin content with little or no added value. I resolved the situation with Google and the site passed reconsideration two months later. The problem is that since the penalty, the site has never been the same: it now gets only 10% of its former visits and has shown no growth since then. I'm inclined to give up on the site, and everyone who has had the same problem has told me to forget about it and work on something new. What do you think? Give up the site and start a new one? During this period I also rewrote the entire site, made it responsive and mobile-friendly, improved it overall, and migrated it to WordPress. www.acervoamador.com.br (Warning: adult content) I thank you for your attention, and have a nice day.
White Hat / Black Hat SEO | stroke
-
Separating syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a news website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as requested by Google, and I am not sure which approach is better:
A) Put all syndicated content under /syndicated/, then Disallow /syndicated/ in robots.txt and set a NOINDEX meta tag on every page. In this case, though, I am not sure what happens if we link to these articles from other parts of the website. We will waste our link juice, right? Also, Google will not crawl these pages, so it will not see the noindex tag. Is this OK for Google and Google News?
B) A NOINDEX meta tag on every page. Google will crawl these pages but will not show them in the results. We will still lose link juice from links pointing to these pages, right?
So... is there any difference? And should we put a nofollow attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of setup, so I am not exactly sure what to do or how to proceed. Thank you!
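For reference, option A's robots.txt rule would be a one-liner like this (a sketch using the /syndicated/ path proposed above); the caveat is exactly the one raised in the question: once crawling is blocked, Googlebot never fetches the pages, so any noindex meta tag on them is never seen:

```
# robots.txt (option A): block crawling of syndicated articles
User-agent: *
Disallow: /syndicated/
```

Because of that interaction, the Disallow of option A and the noindex tag of option B are best treated as alternatives rather than combined.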
White Hat / Black Hat SEO | Lukas_TheCurious
-
Forcing Google to Crawl a Backlink URL
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e., for Penguin recovery and reconsideration requests). My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page still hasn't been crawled after 4 months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
White Hat / Black Hat SEO | Choice
-
From keyword rankings to ......... what KPI?
Hi Folks, I have a customer whose Google keyword rankings have fluctuated rather wildly over the past two months, which has caused some consternation on their part despite our reassurance. This is caused in large part by their lack of understanding of SEO, little effort on their part in implementing SEO changes, and what I believe to be unrealistic expectations (having done no SEO work on their site, they want to see first-page rankings for competitive keywords like 'heart attack' within 4-8 weeks). At the moment we are using keyword rankings as the only KPI, and I wish to reframe the conversation with additional or alternative KPIs, so that as rankings fluctuate with future Google search algorithm changes, the customer isn't solely focused on them. I am still formulating this list, but so far I have decided to include the KPIs below, measured month-on-month / quarter-on-quarter:
- Organic search traffic volume (should be rising)
- Top landing pages, excluding branded keywords and the homepage (should correlate with content created to target specific keywords)
- Number of landing pages on the client site that rank
- List of landing pages and their bounce rates (are the 'gateway pages' holding visitors by meeting their search requirements?)
- Average number of keywords per landing page (possibly integrated with the landing page reports above as a dimension, to demonstrate the correlation of number of keywords to landing pages)
- Some visibility into top keyword search terms (from AdWords where possible, and GWT)
- Top organic keywords (from AdWords and GWT)
- Conversions from organic search (will vary from client to client, but will primarily be implemented using Google Tag Manager event tracking for things like enquiry forms)
- Referral traffic
- Delta/ranking trends over a large set of data points (depends on how often you poll/track rankings, but for example if you track rankings weekly, assess the trend over 3-6 months to smooth out the fluctuations)
Your thoughts and feedback on this would be greatly appreciated. Regards, Dave
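As an illustration of the first KPI above (organic traffic volume should be rising month on month), here is a minimal sketch of the underlying calculation; the function name and the session counts are hypothetical examples, not real client data:

```python
def mom_growth(sessions):
    """Month-over-month percentage change for a list of
    monthly organic-session counts (hypothetical sample data)."""
    changes = []
    for prev, curr in zip(sessions, sessions[1:]):
        # percentage change versus the previous month, one decimal place
        changes.append(round((curr - prev) / prev * 100, 1))
    return changes

# Four hypothetical months of organic sessions
print(mom_growth([1200, 1350, 1280, 1400]))  # → [12.5, -5.2, 9.4]
```

Reporting the trend this way (a series of percentage deltas) makes a single bad month look like noise rather than a crisis, which is exactly the reframing the client needs.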
White Hat / Black Hat SEO | icanseeu
-
Google disavow and penalty lifted, please help?
We disavowed 80% of our backlink profile because our last SEO built cheap, nasty links, and we filed a reconsideration request (we had had the Google Webmaster Tools "notice of detected unnatural links to http://www.xxx.co.uk" penalty for a year, since the 24th of March 2012, but thought it best to clean up before round 2, even though we had no real penalty and we did some decent link building that moved us up). We then received a successful penalty-lifted note (on the 22nd of May 2013), but our rankings dropped (because the bad links had been propping us up). Since then we have built a fair few high-quality links, but our rankings do not seem to be moving much, if at all (7 weeks clear now). Has anyone had any experience with the above (are we in a sandbox-type situation)? Thank you for your time. Thanks, Bob
White Hat / Black Hat SEO | BobAnderson
-
Geo-targeted Organic Search Traffic to a sub-domain
For a client of ours, we are likely to create a subdomain targeted at a specific country. Most of the content on this subdomain will come from the main site, although with some differentiation to suit that geographic market. We intend to tell Google through Webmaster Tools that the subdomain is targeted at a specific country. Some questions: a) Any idea how long it could take before Google gives precedence to the content on this subdomain for queries originating from that particular country? b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine perspective? Thanks.
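One mechanism worth considering alongside Webmaster Tools geo-targeting is hreflang annotations, which tell Google which country variant of a page to serve and help it treat the near-duplicate pages as alternates rather than duplicates. A sketch with hypothetical hostnames and locales:

```html
<!-- On both versions of each page: list all alternates,
     including the page itself (hostnames here are hypothetical) -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-au" href="https://au.example.com/page/" />
```

Each page in the pair should carry the same set of annotations, pointing at itself and at its counterpart.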
White Hat / Black Hat SEO | ontarget-media
-
Is Google stupid?
Why does buying links still work? I don't mean approaching an individual webmaster and cutting a deal, that seems to be nearly impossible to detect. But the huge link brokers, like Text Link Ads, Build my Rank or Linkvine, Google has to be aware of them, right? Can't they just create accounts to see the whole network, and ban the sites? Why wouldn't they just do that?
White Hat / Black Hat SEO | menachemp