Tool for extracting search queries
-
Hello,
Does anyone know of, or have, a tool that takes referrer URLs coming from Google and extracts the search query from the URL string?
Thank you
-
Thanks, but as I mentioned above I am in Germany so cannot use Analytics at all, hence my need to find another way to do this.
-
Actually, with Google Analytics you can create a filter to see the search queries that users typed to land on a specific page of your website. It's very simple:
1. With the tracked website selected, look at the left column in GA for "Custom Reports". Click on it, then click "Create New Report".
2. From the blue "Metrics" box select "Site Usage", scroll down to "Visits", and drag and drop it into the first empty "Metrics" box on the right.
3. Click the green "Dimensions" box, click "Content", select "Page", and drag and drop it into the empty "Dimension" box.
4. Click "Traffic Sources", scroll down to "Country/Territory", and drag this in as a sub-dimension so you can also see the data by region.
Hope this helps, good luck!
-
Hey thanks, yeah, I was just wondering if anyone had a ready-made tool, as it would make things a bit easier. The plan is to use the referrer URLs from logfiles to look at the search queries that are sending traffic to specific areas of the website (directories/sub-directories etc.).
I'm in Germany so cannot use Analytics, and Webmaster Tools does not allow you to filter search queries by different areas of a website.
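A minimal sketch of that logfile plan in Python, standard library only. The Apache combined-log format and the referrer carrying a `q` parameter are both assumptions about your setup; adjust the regex to your server's actual log format.

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Matches the request path and the quoted referrer in an Apache
# combined-log line (the exact log format is an assumption).
LINE_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "(?P<ref>[^"]*)"')

def queries_by_directory(log_lines):
    """Tally Google search queries per top-level site directory."""
    counts = {}
    for line in log_lines:
        m = LINE_RE.search(line)
        if not m:
            continue
        ref = urlparse(m.group("ref"))
        if "google" not in ref.netloc:
            continue  # only keep Google referrers
        query = parse_qs(ref.query).get("q", [None])[0]
        if not query:
            continue
        # Group by first path segment: /blog/post-1 -> /blog
        first = m.group("path").lstrip("/").split("/")[0]
        directory = "/" + first if first else "/"
        counts.setdefault(directory, Counter())[query] += 1
    return counts
```

You would run it over an open logfile, e.g. `queries_by_directory(open("access.log"))`. One caveat: since Google moved signed-in users to SSL search, many referrers no longer carry the query at all, so expect gaps in the data.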
-
Coding this yourself (or having your developer do it) is a pretty simple job. It requires little more than reading the 'q' parameter out of the referrer URL. In PHP, for example, you can pull the query string out of $_SERVER['HTTP_REFERER'] with parse_url() and then read the 'q' value with parse_str(); note that $_GET['q'] only reads the current request's URL, not the referrer.
For what purpose do you plan to use this parameter?
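If it helps, here is the same idea as a self-contained Python sketch (standard library only; it assumes the query sits in a 'q' parameter, as it does in classic Google referrer URLs):

```python
from urllib.parse import urlparse, parse_qs

def google_query(referrer):
    """Return the search query from a Google referrer URL, or None."""
    parsed = urlparse(referrer)
    if "google" not in parsed.netloc:
        return None  # not a Google referrer
    # parse_qs decodes %-escapes and turns '+' into spaces
    return parse_qs(parsed.query).get("q", [None])[0]

print(google_query("http://www.google.de/search?hl=de&q=blue+widgets"))  # blue widgets
```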
-
Maybe you could describe what you need that data for? Google Analytics normally records that data for natural search visits, but if you are looking to dynamically insert the keywords, that requires some code work.
Related Questions
-
Do you need a canonical tag for search and filter pages?
Hi Moz Community, We've been implementing new canonical tags for our category pages but I have a question about pages that are found via search and our filtering options. Would we still need a canonical tag for pages that show up in search + a filter option if it only lists one page of items? Example below. www.uncommongoods.com/search.html/find/?q=dog&exclusive=1 Thanks!
Technical SEO | znotes
-
Sitelink demotion not working after submitting in Google webmaster tool
Hello Friends, I have a question regarding demotion of sitelinks in Google Webmaster Tools. Scenario: I demoted one of the sitelinks for my website two months back, but the demoted sitelink has still not been removed from the Google search results. Does anyone know why this page is not being removed even after demoting it in GWT? If we resubmit the same link in the demotion tool one more time, will it work? Can anybody help me out with this? Note: Since a demotion is only valid for 3 months (90 days), I am concerned about this.
Technical SEO | zco_seo
-
Skip indexing the search pages
Hi, I want all such search pages skipped from indexing: www.somesite.com/search/node/. So I have this in robots.txt (Disallow: /search/). Now any posts that start with search are being blocked, and in Google I see this message: "A description for this result is not available because of this site's robots.txt". How can I handle this, and how can I find all the URLs that Google is blocking from showing? Thanks
Technical SEO | mtthompsons
-
Disallow: /search/ in robots but soft 404s are still showing in GWT and Google search?
Hi guys, I've already added the following syntax in robots.txt to prevent search engines from crawling the dynamic pages produced by my website's search feature: Disallow: /search/. But soft 404s are still showing in Google Webmaster Tools. Do I need to wait? (It's been almost a week since I added that line to my robots.txt.) Thanks, JC
Technical SEO | esiow2013
-
Google Webmaster Tool - Crawl Stats Query ?
Dear All, I have been looking at the GWT Crawl Stats and wondering how I should be interpreting the crawl stats charts. All I see is 3 charts showing a high, low, and average for the metrics below, but I am wondering: is there anything I really need to be looking for?
Pages crawled per day
Kilobytes downloaded per day
Time spent downloading a page (in milliseconds)
Thanks, Sarah
Technical SEO | SarahCollins
-
Which is The Best Way to Handle Query Parameters?
Hi mozzers, I would like to know the best way to handle query parameters. Say my site is example.com. Here are two scenarios.
Scenario #1: Duplicate content
example.com/category?page=1
example.com/category?order=updated_at+DESC
example.com/category
example.com/category?page=1&sr=blog-header
All have the same content.
Scenario #2: Pagination
example.com/category?page=1
example.com/category?page=2
and so on.
What is the best way to solve both? Do I need to use rel="next" and rel="prev", or is it better to use Google Webmaster Tools parameter handling? Right now I am concerned about Google traffic only. For solving the duplicate content issue, do we need to use canonical tags on each such URL? I am not using WordPress; my site is built on the Ruby on Rails platform. Thanks!
Technical SEO | jombay
-
Search for 404s on Sandbox
Can I verify an IP in Google Webmaster Tools to search for any 404s? Or maybe I could do it with SEOmoz tools? Thanks!
Technical SEO | tylerfraser
-
Broken Inner Links - Tool Recommendations?
Do you have any recommendations for tools that scan an entire website and report broken inner links? I run several UGC-centered websites, and broken inner (and external) links are an issue. Given that these websites are several hundred thousand pages large, I am not really all that excited about running software on my desktop (Xenu Link Sleuth, for example). Any online solutions you could recommend would be great!
Technical SEO | uderic