Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Outranking a crappy outdated site with domain age & keywords in URL.
-
I'm trying to outrank a website with the following:
Website with #1 ranking for a search query with "City & Brand"
- Domain Authority - 2
- Domain Age - 11 years & 9 months old
- Has both the City & brand in the URL name.
- The site is crap and outdated... probably last designed in the '90s: old layout, not a lot of content, and no keywords in the titles & descriptions on any page.
My site ranks 5th for the same keyword, BEHIND 4 pages from the site described above.
- Domain Authority - 2
- Domain Age - 4 years & 2 months old
- Has only the CITY in the URL.
- Brand new site design this past year, new content, and individual keywords in the titles and descriptions on each page.
My main question is: do you think it would be beneficial to buy a new domain name with both the BRAND and the CITY in the URL, and 301 redirect my 4-year-old domain to the new domain to pass along the authority it has gained?
Will having the brand in the URL make much of a difference?
Do you think that small step would even help to beat the crappy but old site out?
Thanks for any help & suggestions on how to beat this old site or at least show up second.
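For what it's worth, the mechanics of the domain-level move are the easy part; whether it's a good idea is the real question. A minimal sketch, assuming the old site runs on Apache with mod_rewrite, and using hypothetical olddomain.com / newdomain.com names:

```apache
# Minimal sketch, assuming Apache + mod_rewrite; olddomain.com / newdomain.com are hypothetical.
# Placed in the old domain's .htaccess, this 301s every URL to the same path on the
# new domain so existing links carry over as well as a 301 allows.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```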
-
Thanks all. This is what I had recommended to the client to begin with; I just needed some backup from all you smart SEOs out there.
Unfortunately, the domain would not be for sale, as it's a brick-and-mortar business.
Thanks again!
-
I personally lean more towards EGOL's reaction. If you put enough effort into it, keep it strictly white hat and do all the steps suggested in the SEO blog section (like link earning instead of link building), then over time you can outrank that site for sure. But keep working on it. Be social: share everything you can on Facebook, Twitter and, of course, the big G+.
Tricks can help you in the short run but hurt you in the long run, so I wouldn't go for that straight away.
You could also try registering the other domain you mentioned, put up some content and build it alongside your existing website (without copying text etc.). You could give it a supporting role for your primary website if you wish. But I would focus on the primary website first and on improving that one.
Regards,
Jarno
-
I like Irving's suggestion to see if the webmaster is willing to sell the site. What's the link profile like? Any particular high-authority links that might be giving it the advantage over your site?
-
**Will having the brand in the URL make much of a difference?**
The brand? Yes, if people know you.
Really, I would not change domains for the tiny advantage that you think a keyword in the domain might bring. It is very possible that you will lose more link juice in the redirect than you will gain from the keyword in the domain.
**Do you think that small step would even help to beat the crappy but old site out?**
Heh... that crappy old site is beating you because, somewhere, it is still doing something better than you are.
It is easier to beat an old crappy site with "work" than it is to beat it with "tricks".
-
Absolutely not - your site is aged. A new site is like starting all over, even if you 301 the old site to the new one.
a) Work on improving the on-page SEO on your current site.
b) If that new domain is available, you could play around with setting it up as a stand-alone site and see if you can get it ranked #1; it could take 6-12 months before Google really trusts it enough.
c) If the competing site is that old and outdated, maybe the owner wants to sell it at a reasonable price, if it's worth that much to you?
Related Questions
-
How do I treat URLs with bookmarks when migrating a site?
I'm migrating an old website into a new one, and have several pages that have bookmarks on them. Do I need to redirect those, or how should they be treated? For example, both https://www.tnscanada.ca/our-expertise.html and https://www.tnscanada.ca/our-expertise.html#auto resolve.
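For illustration, the usual approach, sketched for an Apache host (the destination URL is hypothetical): the #auto part is a fragment that never reaches the server, so a single redirect on the base URL covers every bookmarked variant, and browsers generally re-attach the original fragment after following the redirect.

```apache
# Sketch, assuming the old site runs Apache with mod_alias; the destination is hypothetical.
# Fragments (#auto, #retail, ...) are never sent to the server, so this one rule
# handles every bookmarked variant of the old page.
Redirect 301 /our-expertise.html https://www.newsite.example/our-expertise/
```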
Intermediate & Advanced SEO | NatalieB_Kantar
-
Will I lose SEO value if I rename my URLs to be more keyword-friendly?
A good practice in SEO is to have your keywords in the links. I am thinking of doing some optimization and changing my URLs to more effective keywords. I am using Shopify, and there is an option (a tick box) that you can check while changing the URL (e.g. for a category, a product, a blog post). This will set up a redirect from the old URL to the new one. Is this good practice? Is it risky in terms of losing SEO, or will it help me rank higher because I will have better keywords in my links?
Intermediate & Advanced SEO | Spiros.im
-
Site-wide Canonical Rewrite Rule for Multiple Currency URL Parameters?
Hi Guys, I am currently working with an eCommerce site which has site-wide duplicate content caused by currency URL parameter variations. Example: https://www.marcb.com/ https://www.marcb.com/?setCurrencyId=3 https://www.marcb.com/?setCurrencyId=2 https://www.marcb.com/?setCurrencyId=1 My initial thought is to create a bunch of canonical tags which will pass link equity to the core URL version. However, I was wondering if there is a rule that could be implemented within the .htaccess file to make the canonical site-wide without it being so labour-intensive. I also noticed that these URLs are being indexed in Google, so would it be worth setting a site-wide noindex on these variations as well? Thanks
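One way to read the ".htaccess rule" idea is to send the canonical as an HTTP Link header instead of editing templates - Google accepts rel="canonical" in headers as well as in the HTML head. A rough, untested sketch assuming Apache 2.4 with mod_rewrite and mod_headers:

```apache
# Rough sketch, assuming Apache 2.4 with mod_rewrite + mod_headers (untested).
# When a setCurrencyId parameter is present, expose the parameter-free URL as the
# canonical via an HTTP Link header instead of a <link> tag in every template.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)setCurrencyId= [NC]
RewriteRule ^(.*)$ - [E=CANONICAL_PATH:/$1]
Header set Link "<https://www.marcb.com%{CANONICAL_PATH}e>; rel=\"canonical\"" env=CANONICAL_PATH
```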
Intermediate & Advanced SEO | NickG-123
-
How do you 301 redirect URLs with a hashbang (#!) format? We just lost a ton of PageRank because we thought a JavaScript redirect was the only way! But other sites have been able to do this - examples and details inside
Hi Moz, Here's more info on our problem, and thanks for reading! We're trying to create 301 redirects for 44 pages on site.com. We're having trouble 301 redirecting these pages, possibly because they are AJAX pages and have hashbangs in the URLs. These are locations pages. The old location URLs are in the following format: www.site.com/locations/#!new-york and the new URLs that we want to redirect to are in this format: www.site.com/locations/new-york We have not been able to create these redirects using the Yoast WordPress SEO plugin v1.5.3.2. The CMS is WordPress version 3.9.1. The reason we want to 301 redirect these pages is that we have created new pages to replace them, and we want to pass PageRank from the old pages to the new ones. A 301 redirect is the ideal way to pass PageRank. Examples of pages that are able to 301 redirect hashbang URLs include http://www.sherrilltree.com/Saddles#!Saddles and https://twitter.com/#!RobOusbey.
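For background on why a normal redirect (or the Yoast plugin) can't do this: everything after the # is a fragment that is never sent to the server, so a server-side 301 cannot see it. What the sites above rely on is Google's AJAX crawling scheme (since deprecated), in which the crawler requests ?_escaped_fragment_= URLs instead, and those can be redirected server-side; human visitors still need a small client-side redirect on /locations/ that reads the hash and forwards them. A rough, untested sketch assuming Apache with mod_rewrite:

```apache
# Rough sketch, assuming Apache + mod_rewrite (untested).
# Googlebot translates www.site.com/locations/#!new-york into
# www.site.com/locations/?_escaped_fragment_=new-york, and that request CAN be 301'd.
# The trailing "?" in the target drops the query string from the redirect.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$ [NC]
RewriteRule ^locations/?$ /locations/%1? [R=301,L]
```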
Intermediate & Advanced SEO | DA2013
-
Weird 404 URL Problem - domain name being placed at end of URLs
Hey there. For some reason, when doing crawl tests I'm finding pages with the domain name tacked onto the end, causing 404 errors. For example: http://domainname.com/page-name/http://domainname.com This is happening to all pages, posts and even category pages.
1. Site is in WordPress
2. Using the Yoast SEO plugin
Any suggestions? Thanks!
Intermediate & Advanced SEO | Jay328
-
Using both dofollow & nofollow links within the same blog site (but different posts).
Hi all, I have been actively pursuing bloggers for my site in order to build PageRank. My website sells women's undergarments that are more on the exotic end. I noticed that a large number of prospective bloggers demand product samples. As already confirmed, bloggers who are given "free" samples should use a rel=nofollow attribute in their links. Unfortunately, this does not build my PageRank or transfer link juice. My question is this: is it advisable for them to also write additional posts that include dofollow links? The idea is for the blogger to use a nofollow link when posting about the sample and a regular link in a secondary post at a later time. What are your thoughts on this?
Intermediate & Advanced SEO | 90miLLA
-
Google Ranking Generally in Germany - Keywords & Umlauts
Hi Mozzers, I was hoping I could get some advice/opinions on a website ranking problem I have been working on, in particular one of the pages. This is our German-language website, which is hosted in Germany, and a fluent German-speaking member of staff from our German office moderates the text content of the website for us. Our website seems to get good traffic, visitor navigation and conversions. One of the keywords I focus on is Schallpegelmessgerät, which is one way of saying sound level meter in German. The keyword uses an umlaut, which I cannot use in the URL, but Google is picking it up and putting it into the snippets; apart from that, our on-page optimization is good according to the Moz tool. I have been trying to improve our content and we post many blog articles around the topic/keyword, but google.de seems to choose not to display this even on the first couple of pages, and sometimes ranks our blog articles around the third page. We are even being outranked by some low-quality, cheap online shop websites, some of which have low-quality content and low page and domain authorities. I had accepted this, but after looking at bing.de and doing a search I find our page in the top 5 results. I understand that Google's and Bing's algorithms are different, but I am struggling to get my head around it all. Here is our website & page - http://www.cirrusresearch.de/produkte/schallpegelmessgerat/ Any advice on this situation would be greatly appreciated. Thank you very much for reading this. James
Intermediate & Advanced SEO | Antony_Towle
-
Robots.txt & URL removal vs. noindex, follow?
When de-indexing pages from Google, what are the pros & cons of each of the two options below?
1. Robots.txt & requesting URL removal from Google Webmaster Tools
2. Use the noindex, follow meta tag on all doctor profile pages; keep the URLs in the sitemap file so that Google will recrawl them and find the noindex meta tag; make sure that they're not disallowed by the robots.txt file
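A side note on option 2: the noindex only works if Google can still crawl the pages, so they must not be disallowed in robots.txt. If editing every profile template is impractical, the same directive can be sent as an HTTP header - a minimal sketch, assuming an Apache 2.4 host and a hypothetical /doctors/ URL path:

```apache
# Minimal sketch, assuming Apache 2.4 + mod_headers; the /doctors/ path is hypothetical.
# Sends "noindex, follow" for every doctor profile URL without touching the templates.
# Do NOT also block these URLs in robots.txt, or Google will never see the directive.
<If "%{REQUEST_URI} =~ m#^/doctors/#">
  Header set X-Robots-Tag "noindex, follow"
</If>
```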
Intermediate & Advanced SEO | nicole.healthline