gTLDs for SEO?
-
Hey guys,
Have there been any case studies, or does anyone know how the new gTLDs are doing in the SERPs? I never see them in the SERPs, but that of course doesn't mean anything. Google says it treats them the same as .com or .net, but does anyone have actual facts?
Cheers!
-
The only TLDs known to get any different treatment are ccTLDs (which send a geolocation signal).
The reason non-"standard" TLDs don't show up very often is that most people buy the common ones. People still associate ".com" with websites because it is the oldest TLD. I personally own a ".info" for my email (not a new TLD either), and I typically have to spell it out to people (I occasionally get strange looks handing it out verbally). I can only imagine the looks if I had a more exotic TLD ("Yes, my email is ralph@crazy.ninja... no, that's really it"). Other TLDs are simply less popular and less well understood, which explains why they aren't used very often.
The arguments I've seen over the years that Google is flat-out lying about TLDs being treated equally have never withstood scrutiny. No one has shown a site ranking higher solely because of its TLD; the cases I've seen involved other factors that could just as easily explain the ranking difference. And an objective test would be very difficult to design, since you'd need two otherwise identical sites, which from an SEO perspective is nearly impossible to pull off.
Ultimately, though, why would Google lie about this? What is there to gain in telling people it's not a ranking factor when it is?