Many pages with small unique content vs. one page with big content
-
Dear all,
I am redesigning some areas of our website, eurasmus.com, and we are not clear on the best option to follow. Our site has a city area, e.g. www.eurasmus.com/en/erasmus-sevilla, which we are going to redesign, and a guide area where we explain the city: http://eurasmus.com/en/erasmus-sevilla/guide/, all with unique content.
The thing is that, due to a lack of resources, our guide is not really deep, and we believe that creating a page with only 500 characters of text for every topic (transport, etc.) does not add extra value for users. It is not very user friendly either.
On the other hand, these pages are getting some long-tail results, though not for our target keyword: they rank for terms like "transport in sevilla", while our target keyword would be "erasmus sevilla".
When redesigning the city area, we have to choose between:
a) www.eurasmus.com/en/erasmus-sevilla -> all the content on one page, about 2,500 unique characters.
b) www.eurasmus.com/en/erasmus-sevilla -> a nice redesign with a better amount of content, but keeping the guide pages.
What would you choose?
Let me know what you think.
Thanks!
-
Wow, Jose, you got a whole audit from Luis.
1. Luis makes a good point about Seville vs Sevilla. When you're trying to target a region other than your own, make sure that you change the location in Google Keyword Planner. Seville is the English version of Sevilla (which I know sounds strange, but we also call your country Spain rather than España).
2. Both subdomains and subfolders can effectively designate different languages. If you've made the call to use subfolders, that's fine. It's probably what I would have done, too, since that means the Domain Authority will transfer easily.
4 & 5. Keyword repetition in URLs isn't necessarily bad in your case, because it's caused by a lot of subfolders.
It seems like there's been some debate here about more subfolders vs. fewer: there isn't a hard-and-fast rule. With more subfolders, the pages higher up in the structure tend to get more link equity out of the deal and rank better. That takes away from deeper pages, though, which are presumably targeting the most important words. With fewer subfolders, the link equity is distributed more evenly, which means higher-level pages will be weaker and deeper pages stronger. In your case, I don't know the answer, since I don't know how competitive the different keywords are at each level. If I were your SEO, I'd tell you to stick with your current URL structure, because moving pages to new URLs tends to cause a big knock in rankings for a while.
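For completeness: if Jose does later decide to consolidate the guide pages anyway, the usual way to keep most of that long-tail equity is a 301 redirect from each old guide URL to the consolidated city page. A minimal Apache sketch, assuming the paths discussed in this thread (mod_alias enabled; the paths are illustrative, not a definitive implementation):

```apache
# Permanently redirect the guide section, and anything under it,
# to the consolidated city page. Paths are illustrative examples
# taken from the URLs discussed in this thread.
RedirectMatch 301 ^/en/erasmus-sevilla/guide(/.*)?$ /en/erasmus-sevilla
```

`RedirectMatch` is used instead of a plain `Redirect` so that deeper guide sub-pages (e.g. `/guide/transport`) all land on the one consolidated page rather than being mapped to paths that no longer exist.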
-
Hi Jose,
I like your current setup, with more pages at 500 words each. 500 words doesn't make for thin content from a search engine's perspective, and it means you're delivering a more targeted result for searchers; if you don't have separate pages and someone searches for "restaurants in Seville," they're not going to be thrilled if they land on your mega guide page and have to hunt for what they're looking for.
That said, you may want to change the language on the main Seville page so you don't call these "detailed" city guides.
Good luck!
Kristina
-
Hello again,
1. "Seville" has 22,000 searches in the UK, but very few people search for "Sevilla".
2. It depends; I prefer subdomain.domain.com to subfolders. I only found English on your site. Even if you use /en/, you need a main language (which could be English), and that doesn't need a subfolder: www.domain.com (for English), then www.domain.com/es (for Spanish), and so on. But well, it's a personal decision.
3. OK
4. You didn't get my point. Please read my message and my example carefully (I checked your site carefully). It's very, very important that you don't repeat the same or similar keywords in the same URL. In my earlier example it was "Seville + Sevilla" and "university + universities" in one single URL.
5. Again, it's best to have as few subfolders as possible! A URL like www.eurasmus.com/erasmus-seville-city-guide is much nicer for Google than www.eurasmus.com/erasmus-sevilla/city-guide.
You can keep only one page if that is your strategy, or both, since they have different content and context. It's up to you, and if you apply a good SEO strategy I don't see any problem with having two pages.
About the long tail, I already explained this before. You may currently be ranking for non-competitive keywords on those pages (study the keyword difficulty for your pages/keywords). I recommend focusing on why you are not ranking well for the pages/keywords you do want and optimizing your strategy accordingly.
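One note on point 2 above: whichever structure is chosen, subfolders or subdomains, the more important thing is that each language version declares its alternates via hreflang annotations, so Google can match users to the right version. A hedged sketch of what the `<head>` of the English Seville page might contain, assuming the /en/ subfolder scheme the site already uses (the URLs come from this thread; the /es/ and x-default entries are illustrative assumptions):

```html
<!-- Illustrative hreflang annotations for the subfolder scheme discussed above. -->
<link rel="alternate" hreflang="en" href="http://eurasmus.com/en/erasmus-sevilla" />
<link rel="alternate" hreflang="es" href="http://eurasmus.com/es/erasmus-sevilla" />
<link rel="alternate" hreflang="x-default" href="http://eurasmus.com/en/erasmus-sevilla" />
```

Each language version must carry the full set of alternates, including a self-referencing entry; the annotations only count when they are reciprocal.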
Hope this helps!
Luis
-
Hi Luis!
Thank you for your message.
I will try to answer all your comments.
1. We have already done the keyword research for all the cities and used the variant with more searches.
I will recheck it to make sure it is applied everywhere.
2. We now publish in Spanish and 7 more languages; that is why we have the /en.
We decided to go for /fr, /it, etc. As far as I know there is not a relevant difference, we believe.
Am I right?
3. I agree. That is why we are redesigning (it is also not user friendly at all).
4. It is eurasmus.com, our brand name, which is not "erasmus". They are different words. Whether it was a good brand choice for SEO is another chapter; that has been a long-running discussion in our company.
5. We will study how to make it better!
Concerning my direct question: would you recommend moving all the guide content onto the erasmus-sevilla home page and deleting the guide area, or would you leave the guide and just add more content to the home page?
The main thing is that we get results for long-tail keywords, but those keywords do not really generate conversions. What do you think?
-
Hi Jose,
Some advice and questions:
- Have you done a keyword analysis before? How many searches do you have for your supposed "focus keywords"? After checking a little, I see the word "Seville" performs much better than "Sevilla"; foreign users call it that.
- Don't abuse URL sublevels: /en/erasmus-sevilla/guide/... (You don't need the /en/ since your site is only in English. If you plan to translate into new languages, you can use subdomains for that: es.eurasmus.com, fr.eurasmus.com, ...)
- Add much more content to your landing page (/erasmus-sevilla is quite thin on content).
- Don't repeat keywords in the URL: http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville (here you have two repeated keywords, man!)
- Make things simpler! Some ideas:
- www.eurasmus.com/erasmus-spain/seville-city-guide
- www.eurasmus.com/erasmus-spain/seville-city-transport
- www.eurasmus.com/erasmus-spain/seville-universities
- www.eurasmus.com/erasmus-spain/madrid-city-guide
- www.eurasmus.com/erasmus-belgium/brussels-city-guide
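Luis's "don't repeat keywords" rule is easy to audit mechanically across a site's URL list. A rough sketch, with deliberately crude singular/plural folding, so it catches pairs like "university"/"universities" but not spelling variants like "Seville"/"Sevilla"; the function name and stemming rules here are my own illustration, not from any Moz tool:

```python
import re

def repeated_url_keywords(url: str) -> set[str]:
    """Return word stems that occur more than once in a URL's path.

    Folding is deliberately crude: 'ies' -> 'y' (universities -> university)
    and a trailing 's' is stripped. A real audit would use a proper stemmer.
    """
    path = re.sub(r"^https?://[^/]+", "", url)            # drop scheme + host
    tokens = [t.lower() for t in re.split(r"[/\-_.]", path) if t]
    stems = [re.sub(r"s$", "", re.sub(r"ies$", "y", t)) for t in tokens]
    return {s for s in stems if stems.count(s) > 1}

# The URL Luis flags: 'universities' and 'university' share a stem.
print(repeated_url_keywords(
    "http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville"))
# -> {'university'}

# A flatter URL in the style of his suggestions has no repeats.
print(repeated_url_keywords("http://www.eurasmus.com/erasmus-spain/seville-city-guide"))
# -> set()
```

Running a check like this over a crawl export is a quick way to find every URL that trips the repetition rule, rather than spot-checking by hand.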
Long-tail results for different keywords are normal; that happens. Have you tested with the Moz Grade tool whether your pages need some improvements for the related keywords? That would be necessary too.
By the way, I'm Spanish, so don't hesitate to send me a PM if you need more help, man.
Luis