Confirmation Needed: Do any search engine crawlers accept cookies?
-
I'm looking for confirmation here. Do any search engine crawlers accept cookies? I thought that the answer was always no, but we're looking in our weblogs and seeing some odd behavior.
-
The answer for all major search engines is no. Some third-party crawlers have been built that do accept cookies, though. If you are using a site crawler to examine your own site, check whether it has a cookie-handling option; most internal site crawlers let you choose to accept or ignore cookies through a setting. Googlebot and Bingbot, however, do not accept them.
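One way to sanity-check the "odd behavior" in the weblogs is to group requests by user agent and see whether any given agent ever echoes back a cookie the server set. A minimal sketch in Python; the record fields (`user_agent`, `cookie_header`) are assumptions about how your logs have been parsed, not a real log format:

```python
from collections import defaultdict

def cookie_acceptance_by_agent(records):
    """Map each user agent to True if it ever sent a Cookie header back.

    records: iterable of dicts with 'user_agent' and 'cookie_header' keys.
    """
    seen = defaultdict(bool)
    for r in records:
        # An agent "accepts" cookies if at least one request returned one.
        seen[r["user_agent"]] |= bool(r.get("cookie_header"))
    return dict(seen)

# Illustrative log records, not real crawler traffic:
logs = [
    {"user_agent": "Googlebot/2.1", "cookie_header": ""},
    {"user_agent": "Googlebot/2.1", "cookie_header": ""},
    {"user_agent": "Mozilla/5.0", "cookie_header": "session=abc123"},
]
print(cookie_acceptance_by_agent(logs))
# {'Googlebot/2.1': False, 'Mozilla/5.0': True}
```

If a self-identified search engine bot shows up as accepting cookies in a report like this, the user-agent string may be spoofed, which would explain odd log entries.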
Related Questions
-
When doing a site search my homepage comes up second. Does that matter?
When I do a site: search the homepage comes up second. Does this matter?
Intermediate & Advanced SEO | EcommerceSite
Noindex search pages?
Is it best to noindex search results pages, exclude them using robots.txt, or both?
Intermediate & Advanced SEO | YairSpolter
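For the noindex-vs-robots.txt question above, the two mechanisms behave differently; sketched here for a hypothetical /search path (paths and snippets are illustrative, not from the original thread). A robots meta tag lets the page be crawled but asks engines not to index it:

```html
<!-- On each search results page: crawling allowed, indexing declined -->
<meta name="robots" content="noindex, follow">
```

A robots.txt rule instead blocks crawling entirely, which means the crawler can never see a noindex tag on the blocked page:

```
# robots.txt: blocks crawling of /search entirely.
# Caveat: a disallowed URL can still end up indexed from external
# links, because the noindex tag on it is never fetched.
User-agent: *
Disallow: /search
```

Given that interaction, combining both can backfire; noindex alone is usually the safer choice when the goal is keeping thin search pages out of the index.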
SEOMoz and Facebook Graph Search
Are SEOMoz looking to integrate Facebook Graph Search (the web search section) into the product? At the moment we can measure and track rankings for Google and Bing/Yahoo, but not Facebook Graph Search. What are the general thoughts among the community? Do you think it will be adopted as a real search engine? I'm not overly concerned; I reckon it will take a lot to change people's behaviour and move them away from the other search engines. It's throwing up some interesting results in searches, though!
Intermediate & Advanced SEO | littlesthobo
SEOMOZ crawler is still crawling a subdomain despite disallow
This is for our client with a subdomain. We only want to analyze their main website, as this is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors. We added the disallow code when we started and it was working fine: we only saw the errors for the main domain and we were able to fix them. However, just a month ago, the errors and warnings spiked, and the errors we saw were for the subdomain. As far as our web guys can tell, the disallow code is still there and was not touched:

User-agent: rogerbot
Disallow: /

We would like to know if there's anything we might have unintentionally changed, or something we need to do, so that the SEOMOZ crawler will stop going through the subdomain. Any help is greatly appreciated!
Intermediate & Advanced SEO | TheNorthernOffice79
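One detail worth checking for the rogerbot question above: robots.txt is scoped per hostname, so a Disallow in the main domain's robots.txt does not apply to the subdomain. The block has to live in the subdomain's own file (hostnames here are hypothetical):

```
# Served at https://sub.example.com/robots.txt
# Rules in https://example.com/robots.txt do NOT cover this host;
# each hostname needs its own robots.txt.
User-agent: rogerbot
Disallow: /
```

If the rule was only ever added to the main domain's file, a crawler following the standard would still crawl the subdomain, which would match the symptoms described.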
To index search results or not?
In its webmaster guidelines, Google says not to index search results "that don't add much value for users coming from search engines." I've noticed several big brands index search results, and am wondering if it is generally OK to index search results with high engagement metrics (high PVPV, time on site, etc.). We have a database of content, and it seems one of the best ways to get this content into search engines would be to allow indexing of search results (to capture the long tail) rather than build thousands of static URLs. Have any smaller brands had success with allowing indexing of search results? Any best practices or recommendations?
Intermediate & Advanced SEO | nicole.healthline
Penguin Rescue! A lead has been hit and I need to save them!
I had a meeting today with a prospective client who has been hit by Penguin. Their previous SEO company has obviously used some questionable techniques, which is great for me, bad for the client. Their leads have dropped from 10 per day to 1 or 2. Their analytics shows a drop after the 25th, and a backlink check shows a lot of low-quality links. Domain metrics are pretty good and they are still ranking OK for some keywords. I have 1 month to turn it around for them. How do you wise people think it can be done? First of all I will check the on-site optimisation and ensure that the site isn't over-optimised. Secondly, do I try to remove the bad links, or just hit the site with good content and good links to outweigh the bad ones? Also, do you think G is actually dropping rankings for the over-optimisation / bad links, or are the links just being discredited, resulting in the drop in rankings? Two very different things. Any advice is appreciated. Thanks
Intermediate & Advanced SEO | SimpsonGareth
We would like to know where googlebots search from - IP location?
We have a client who has two sites for different countries, 1 US and 1 UK, and redirects visitors based on IP. In order to make sure that the English site is crawlable, we need to know where the googlebot crawls from. Is this a US IP or a UK IP for a UK site / server?
Intermediate & Advanced SEO | AxonnMedia
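For the question above about Googlebot's IPs: rather than assuming a location, you can verify any IP claiming to be Googlebot with the two-step DNS check Google documents — reverse-resolve the IP, confirm the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and check it round-trips to the same IP. A minimal sketch; the injectable lookup parameters exist only so the logic can be exercised without live DNS:

```python
import socket

def is_verified_googlebot(ip,
                          reverse_lookup=socket.gethostbyaddr,
                          forward_lookup=socket.gethostbyname):
    # Step 1: reverse DNS on the visiting IP.
    try:
        host = reverse_lookup(ip)[0]
    except OSError:
        return False
    # Step 2: hostname must belong to Google's crawl infrastructure.
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False
    # Step 3: forward DNS must resolve back to the same IP,
    # otherwise the reverse record could be spoofed.
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False
```

Running this against the IPs in your logs tells you which "Googlebot" hits are genuine, regardless of which country the requests appear to come from.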
How do Google Site Search pages rank
We have started using Google Site Search (via an XML feed from Google) to power our search engines. So we have a whole load of pages we could link to of the format /search?q=keyword, and we are considering doing away with our more traditional category listing pages (e.g. /biology - not powered by GSS) which account for much of our current natural search landing pages. My question is would the GoogleBot treat these search pages any differently? My fear is it would somehow see them as duplicate search results and downgrade their links. However, since we are coding the XML from GSS into our own HTML format, it may not even be able to tell.
Intermediate & Advanced SEO | EdwardUpton61