How does an exact-match .me domain rate for SEO?
-
Does anyone have any idea how an exact-match keyword domain (registered under .me) will compare against an almost-matching keyword domain in Google's .ie search results? (Assuming on-page and off-page SEO are otherwise the same.)
e.g. www.widgets.me against www.mywidgets.ie
Thanks
-
Thanks Ryan
-
If your market is exclusive to Ireland, I would recommend going with the .ie domain.
It is my understanding that a keyword placed deeper in the URL path still carries value with Google. In the example below, all other things being equal, the second URL would rank better. You could make up for the "loss" of not having the keyword in the domain itself in this manner.
www.widget.com/home vs www.bigcompany.com/widget
I would like to add that while a keyword match is desirable, it is just one of over 100 factors that go into overall ranking. Many sites rank at the top without having a keyword in their root URL.
-
Hi Ryan
No, I am comparing against a .ie
eg, www.widgets.me against www.mywidgets.ie
This is for a widget that will be marketed solely in Ireland.
Thanks
Peter
-
Are you comparing against a .com? www.widgets.me vs www.mywidgets.com?
-
An exact-match domain name is always best; if you can't get one, an almost-matching domain is still good for SEO.