Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Outranking a crappy outdated site with domain age & keywords in URL.
-
I'm trying to outrank a website with the following:
Website with #1 ranking for a search query with "City & Brand"
- Domain Authority - 2
- Domain Age - 11 years & 9 months old
- Has both the City & brand in the URL name.
- The site is crap and outdated, probably last designed in the '90s: old layout, not a lot of content, and NO keywords in the titles & descriptions on any page.
My site ranks 5th for the same keyword, BEHIND 4 pages from the site described above.
- Domain Authority - 2
- Domain Age - 4 years & 2 months old
- Has only the CITY in the URL.
- Brand new site design this past year, new content & individual keywords in the titles, descriptions on each page.
My main question is: do you think it would be beneficial to buy a new domain name with both the BRAND and the CITY in the URL, and 301 redirect my 4-year-old domain to the new domain to pass along the authority it has gained?
Will having the brand in the URL make much of a difference?
Do you think that small step would even help to beat the crappy but old site out?
Thanks for any help & suggestions on how to beat this old site or at least show up second.
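For context on the 301 option being weighed here, a domain-level move is normally done at the server level so that every old URL redirects to its counterpart on the new domain. A minimal Apache `.htaccess` sketch is below; the domain names are placeholders, not the actual sites in this thread:

```apache
# Hypothetical .htaccess on the old (city-only) domain.
# The domain names are made-up examples, not the real sites discussed here.
# A 301 (permanent) redirect maps every old URL to the same path on the
# new domain, which is what passes along the accumulated authority.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?city-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.brand-city-example.com/$1 [R=301,L]
```

The path-preserving `$1` matters: redirecting every old page to the new homepage instead tends to be treated as a soft 404 and forfeits page-level equity.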
-
Thanks, all. This is what I had recommended to the client to begin with; I just needed some backup from all you smart SEOs out there.
Unfortunately the domain would not be for sale, as it's a brick-and-mortar business.
Thanks again!
-
I personally lean more towards EGOL's reaction. If you put enough effort into it, keep it strictly white hat, and follow the steps suggested in the SEO blog section (like link earning instead of link building), then over time you can outrank that site for sure. But keep working at it. Be social: share everything you can on Facebook, Twitter, and of course the big G+.
Tricks can help you in the short run but hurt you in the long run, so I wouldn't go for them straight away.
You could also try registering the other domain you mentioned, put up some content, and build it alongside your existing website (without copying text, etc.). It could play a supporting role for your primary website if you wish. But I would focus on the primary website first and on improving that one.
Regards,
Jarno
-
I like Irving's suggestion to see if the webmaster is willing to sell the site. What's the link profile like? Are there any particular high-authority links that might be giving it the advantage over your site?
-
**Will having the brand in the URL make much of a difference?**
The brand? Yes, if people know you.
Really, I would not change domains for the tiny advantage that you think a keyword in the domain might bring. It is very possible that you will lose more link juice in the redirect than you will gain from the keyword in the domain.
**Do you think that small step would even help to beat the crappy but old site out?**
heh... That crappy old site is beating you because they are beating you.

It is easier to beat an old crappy site with "work" than it is to beat them with "tricks".
-
Absolutely not; your site is aged. A new site is like starting all over, even if you 301 the old site to the new one.
a) Work on improving the on-page SEO on your site.
b) If that new domain is available, you could play around with setting it up as a stand-alone site and see if you can get it ranked #1. It could take 6-12 months before Google really trusts it enough.
c) If the competing site is that old and outdated, maybe the owner wants to sell it at a reasonable price, if it's worth that much to you?