Cookie-Free Domains
-
What is the advantage of having a cookie-free domain, and how does one set this up on WordPress?
-
Thanks, Klarke
-
Advantage? It reduces unnecessary overhead: browsers attach every cookie set on a domain to every request made to that domain, including requests for static files that never use them. It's one of many little tweaks you can make to ensure your site is as fast as it can be.
If you have WordPress set up on www.domain.com, then all your static files (images, CSS, JS) are served with the cookies set on that domain attached to every request.
So the basic solution is to create a subdomain of your main domain (or register a whole new domain just for it) and serve your static files from there.
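The manual version of that split is a couple of defines in wp-config.php. This is only a sketch, assuming a hypothetical static.example.com subdomain that points at the same document root (or a CDN in front of it):

```php
// wp-config.php (sketch; example.com / static.example.com are placeholders)

// Serve wp-content (themes, uploads, plugin assets) from the static host
define('WP_CONTENT_URL', 'http://static.example.com/wp-content');

// Keep WordPress's own cookies scoped to the www host so browsers
// never attach them to requests for static.example.com
define('COOKIE_DOMAIN', 'www.example.com');
```

With the subdomain resolving to the same files, requests to it carry no cookie headers, which is the whole point of the exercise.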
I use W3 Total Cache configured to serve all static files from a subdomain that points at Amazon CloudFront. W3 automatically rewrites all your theme and image paths to the new subdomain. It's really easy to set up.
Check these out:
http://www.allthingsdemocrat.com/w3-total-cache-cookieless-subdomain-godaddy-shared-hosting/
http://www.riyaz.net/blogging/setup-own-cdn/890/
In addition to that, I had to tweak my Google Analytics script to ensure it doesn't set cookies on my cookieless subdomain.
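For illustration, here's a sketch of that tweak assuming the classic async ga.js `_gaq` API (the property ID is a placeholder): pinning the `__utm*` cookies to the www host keeps them off the static subdomain.

```javascript
// Classic ga.js async snippet (sketch): scope Google Analytics cookies
// to the www host so they are never sent to the cookieless subdomain.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-1']);        // placeholder property ID
_gaq.push(['_setDomainName', 'www.example.com']); // not '.example.com',
                                                  // which would cover subdomains
_gaq.push(['_trackPageview']);
```

The key line is `_setDomainName`: a leading-dot value would set the cookies for every subdomain, static one included.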
Related Questions
-
Can robots.txt on a root domain override a robots.txt on a subdomain?
We currently have beta sites on subdomains of our own domain. We have had issues where people forget to change the robots.txt and these non-relevant beta sites get indexed by search engines (nightmare). We are going to move all of these beta sites to a new domain whose root robots.txt disallows everything. If we put fully configured robots.txt files on these subdomains (ready to go live and open for crawling by the search engines), is there a way for the robots.txt in the root domain to override the robots.txt in these subdomains? Apologies if this is unclear. I know we can handle this relatively easily by changing the robots.txt on the subdomain when going live, but due to a few instances where people have forgotten, I want to reduce the chance of human error! Cheers, Dave.
On-Page Optimization | davelane.verve
-
Is it possible that not having a cookie warning message penalizes our SEO?
Hi! I'm wondering if not having a cookie warning message on a website may be a reason for Google's algorithm to penalize its SEO (we're talking about a European website). Does anybody know if this is so? Thanks!
On-Page Optimization | Canexel
-
Domain authority decreased but rankings increased
Hello! I'm a bit confused. It seems the industry as a whole that my company operates in has taken a bashing from the update a few days ago, as all of the competitors have taken a hit to their domain authority. What is confusing is that my website's rankings have increased as well. There has been a change to the robots.txt file in an effort to stop SEMrush from crawling the site, which is messing with my site stats, but that is the only change. Thanks in advance, Ben
On-Page Optimization | Ben-Cleaver
-
SEO for a multilingual and multiregional site separated into subdirectories on the same domain
Hi all, I am working on a website that is multilingual and multiregional. The site is on a single TLD where the countries are separated into subdirectories. I am only working with SEO for one country, so just one of the subdirectories. Do you guys have any tips on important stuff to remember when doing SEO for this site? Any pros or cons to this? Ty
On-Page Optimization | helgeolaussen
-
Affecting Domain Authority
I've taken an old brochure website and done on-page SEO for 61 pages and 10 posts, and my Domain Authority is still a '1.' I've used low-difficulty keywords and gotten Grade A in on-page Moz reports. We've even keyworded 400 images with alt text. My competitors have done nothing and they have domain authority at least in the 40s... Am I doing anything wrong?
On-Page Optimization | Joseph.Lusso
-
How much weight does domain age really carry?
One of my client's competitors launched a new site in January 2014 (a totally new site on a domain that had previously never been used). The competitor has very few backlinks (only double digits), most of which are directory links (dofollow and nofollow). Their authority level is good but not as high as others who rank on top pages with them, and their on-page optimization is lacking in a few areas. For all intents and purposes, the site should not be ranking where it is from what I can see. However, it is literally skyrocketing up the ranks faster than I would have ever imagined. The only thing I found that this domain has going for it is age (roughly 4 years). Does this carry more weight than I think it does? When compared to my client's site, we have more backlinks (a similar mix), higher DA and PA, and better on-page optimization for the same keywords. However, our domain age is only a little over 1 year.
On-Page Optimization | mattylac
-
Duplicate content on domains we own
Hello! We are new to SEO and have a problem we have caused ourselves. We own two domains: GoCentrix.com (old domain) and CallRingTalk.com (new domain that we want to SEO). The content was updated on both domains at about the same time. Both are identical with a few exceptions. Now that we are getting into SEO, we understand this to be a big issue. Is this a resolvable matter? At this point, what is the best approach to handle it? So far we have considered a couple of options. 1. Change the copy, but on which site? Is one flagged as the original and the other as the duplicate? 2. Robots.txt noindex, nofollow on the old one. Any help is appreciated, thanks in advance!
On-Page Optimization | CallRingTalk
-
Multiple domains vs. single domain vs. subdomains?
I have a client who recently read an article that advised him to break up his website into various URLs that targeted specific products. It was supposed to be a solution to gain a footing in an already competitive industry. So rather than company.com with various pages targeting his products, he'd end up having multiple smaller sites: companyClothing.com, companyShoes.com, etc. The article stated that by structuring your website this way, you were more likely to gain ranking in Google by targeting these niche markets. I wanted to know if this article was based on any facts. Are there any benefits to creating a new website that targets a specific niche market versus a section of pages on a main website? I then began looking into structuring each of these product areas into subdomains, but the data out there is not definitive as to how subdomains are viewed by Google and other search engines, and more specifically how subdomains benefit (or don't!) the primary domain. So, in general, when a business targets many products and services covering a wide range, what is the best way to structure the delivery of this info: multiple domains, a single domain with folders/categories, or subdomains? If a single domain with folders/categories is not an option, how do subdomains stack up? Thanks in advance for your help/suggestions!
On-Page Optimization | dgalassi