You're going to be fine without one. You only need one if you want to manage that subdomain: add specific XML sitemap links to its robots.txt, or cut off access to specific folders on that subdomain.
If you don't need any of that - just move forward without one.
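If you do end up wanting one, here is a minimal sketch of a subdomain robots.txt (dev.example.com and the folder name are placeholders):

    # served at http://dev.example.com/robots.txt
    User-agent: *
    Disallow: /private-folder/

    Sitemap: http://dev.example.com/sitemap.xml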
Hi Chase,
Removing dev via Webmaster Tools should do the trick for now. Since Google won't get to dev anymore, you should be safe.
Adding both noindex and password protection is not needed: since it's password protected, Google won't get to see the noindex on the pages, so you should only do one of the two. No need to change anything now - the password protection is safe.
As expected 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right?
*** Yes, that's not possible so you are good.
Only 301 redirects tell Google to pass equity - so all good.
noindex would be the easiest way.
I've seen some people with the same issue fix it by adding a rel canonical on dev pointing to the live site, and the main site came back step by step with no interruptions...
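As a rough sketch of that approach (dev.example.com and the page path are placeholders), each dev page would point at its live counterpart:

    <!-- in the <head> of http://dev.example.com/some-page/ -->
    <link rel="canonical" href="http://www.example.com/some-page/" />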
Cheers.
No, the geo targeting in Webmaster Tools is not for boosting the geo keyword (in your case "India"), so it's not going to boost "SEO services India" for your site just because you have the geo setting pointed at India.
The geo setting in WMT simply states clearly who your target audience is - in this case India - so your site should do better for all searches performed from India.
It might actually decrease your chances of ranking in the US, so you should only choose a geo location setting if you are really targeting mainly one specific location...
Hope it makes sense.
Are you sure eyepaq?
** Yes. I have the same format implemented across several projects - big and small - and all is fine. I have a few cases where the domains are helping each other out, so when a new country is deployed it gets a small boost in that geo location thanks to the others. The approach has also been confirmed by several trend analyses in Google, in the Google forum, in at least one Google hangout, and in different articles across the web.
If I had 5 domains, say .uk, .fr, .de, .ie and .es, and pasted the same 1,000 words on each, I would assume it would be duplicate content and wouldn't have equal rankings across all 5 domains - but I may be wrong?
** It won't be duplicate if the content on .de is in German and the content on .uk is in English. It will carry the same message, but it is not duplicate. Of course you won't have the same rankings, since the competition is different in Germany and the UK, for example, and the signals - mainly links - are counted differently for each country. A link from x.de will count towards the .de domain in a different way than y.co.uk linking to your .uk domain.
I don't think Cole is talking about recreating the same article in different languages - then I would understand the use of the hreflang tag - but rather the exact same article on separate domains. Could be wrong here as well.
*** If I understand correctly, he is mainly concerned about English content on different English-speaking geo domains (uk, com, Canada, co.nz, com.au, let's say). For that, if it's the same content, he needs hreflang set on those and he is safe. Google will then rank the .co.uk domain and content in the UK and not the Canadian domain. He will also be safe from any "duplicate content issues" - although even without hreflang there won't be any.
Yes, that's it
The use of hreflang has a lot of benefits and overall it's very straightforward - Google will understand how the structure is set up and you are safe.
Cheers.
Hi,
In this case the use of hreflang is needed:
https://support.google.com/webmasters/answer/189077?hl=en
In summary, each version will have a rel alternate hreflang tag set, with hreflang="en-ca" for Canada, for example, hreflang="en-us" for the US and so on (the first part is the language, the second the geo location). So even if the language is the same, each version targets a particular region, as in some cases you might have small differences between the UK and AU or CA, etc.
When you have a domain like example.ch, the hreflang will be hreflang="de-ch".
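A minimal sketch of what that looks like in the <head> of each version (the example.* domains are placeholders; the same set of tags goes on every version):

    <link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
    <link rel="alternate" hreflang="en-ca" href="http://www.example.ca/" />
    <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
    <link rel="alternate" hreflang="de-ch" href="http://www.example.ch/" />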
Hope it helps.
Hi,
URL format and length is a ranking signal - meaning shorter URLs are "more important", but only if there is a taxonomy in place. If all your URLs sit directly on the root, then this doesn't play any role. So if your domain has a lot of content and some or most of it is structured as domain/folder/ and domain/folder/folder/ etc., then moving from /blog to the root will send some signal that this content might be more important.
But note that having short URLs directly on the root is not, by itself, a roadmap to good rankings.
On the user side, if the content is "blog related" - posts, articles, opinions, etc. - then I would keep it under /blog/, as that can influence the CTR in the SERPs; it also depends on the queries, of course, and whether it matches what users are expecting - a blog post or some other resource...
Hope it makes sense.
Cheers.
If it's six months old, then yes, it might be a reason for concern, as users might still be sent to the old domain. Can you check and see if you are somehow blocking the old domain with robots.txt? If that's the case, the bot can't reach the old pages and see the redirect, and if those pages are already in the index they will stay that way.
Alternatively, check the logs and see if Googlebot did hit those pages in the last 6 months - although I doubt it didn't, it's worth checking.
Cheers !
Hi,
Google's crawler fetches the source code. If the content in the slider is visible in the source code, then the content is visible to Google. There are a few "extra" factors related to the "real estate" of the content that come into play, but the bottom line is: if it's in the source code, Google can see it.
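A quick sketch of the difference (the markup is hypothetical): slides present in the initial HTML are in the fetched source, while content injected only by JavaScript after load is not.

    <!-- slides present in the source code: visible to the crawler -->
    <div class="slider">
      <div class="slide">First slide text.</div>
      <div class="slide">Second slide text.</div>
    </div>

    <!-- content injected only by JavaScript after load: not in the fetched source -->
    <div class="slider" id="js-slider"></div>
    <script>
      document.getElementById('js-slider').innerHTML = '<div class="slide">Loaded later.</div>';
    </script>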
Hope it helps.
Hi,
Since it said that search engines treat subdomains as different stand-alone sites,
** Search engines could treat subdomains as different stand-alone sites. Usually they don't. A good example of when they do is Blogspot subdomains - and that is because those are in fact separate, stand-alone sites.
In your case however, since it's about user profiles - somewhat similar to Blogspot - Google could at some point treat them as separate websites.
What's best for the main site? To show multiple search results with profiles in subdomains or in subdirectories?
** My personal opinion is that from a user point of view you should go with the subdomains as you do right now - it makes sense, and it's easy for them to use those URLs, link to them and so on.
You could lose some link equity for the main domain if some or all subdomains are at some point treated as separate domains, but if you put everything on a scale, it still balances out in favour of the subdomain approach.
What if I use both? Meaning in the search results I use the directory URL for each profile, while at the same time each profile owns a subdomain as well? And if so, which one should be the canonical?
** To be honest, I don't really see the point / advantages in doing that.
One other advantage is that if you go with subdomains and Google counts them as separate websites, then if one of your users does something stupid (tries to rank with it and starts building gambling and porn links to that subdomain), the root domain will be safe and the other users won't be affected.
Hope it helps.
Edited to underline the word could
Hi,
I would assume you are referring to Site Search - seeing what people are looking for once they've landed on your site?
If that is the case, you need to set up Site Search in Google Analytics - it's not enabled by default, and you may also need to add a piece of code to the backend of the site, but that's a copy-and-paste job.
See more here: http://support.google.com/analytics/bin/answer.py?hl=en&answer=1012264
(it's easy to follow)
You can also search for "setting up site search" if you need more resources.
On the other hand, if this is not what you meant and you are referring to queries performed in Google where your site shows up, you either need to check Webmaster Tools under Traffic -> Search Queries or, if you have Webmaster Tools connected with your Google Analytics account, you can see the same data directly in Analytics under Traffic Sources -> Search Engine Optimisation -> Queries.
Hope it helps !
Hello,
Please let me know the exact right steps to get rid of the duplicate content issues related to:
www.domain.com/index.html being the same as www.domain.com, without creating an infinite loop.
Do you have a step-by-step guide posted on SEOmoz covering a 301 redirect from non-www to www for all URLs, and from index.whatever to the main domain name, without going into an infinite loop?
By the way, how do you spot the loop? Is it obvious, like a never-ending refresh of the home page?
thanks a lot !
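(For reference, a minimal .htaccess sketch of how those two redirects are usually combined without a loop - assuming Apache with mod_rewrite, and example.com as a placeholder domain. The THE_REQUEST condition is what prevents the loop: it only fires when the visitor explicitly requested index.html, not when Apache serves it internally for the folder URL. A loop usually shows up in the browser as a "too many redirects" error rather than a visible never-ending refresh.)

    RewriteEngine On

    # index.html (in any folder) -> folder root, only for explicit requests
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(.*)index\.html\ HTTP/ [NC]
    RewriteRule ^(.*)index\.html$ /$1 [R=301,L]

    # non-www -> www for all URLs
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]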
Hi,
In order to deindex, you should use noindex, as content="none" also means nofollow. You do need "follow" for now, so the bot can reach all the other pages, see the noindex tag and drop them from the index.
Once you have all of them out of the index, you can set "none" back on.
This is the main reason "none" is not widely used as a value - it's easy to shoot yourself in the foot with it.
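As a rough sketch, the two tags during the cleanup would be:

    <!-- use this while deindexing: pages drop out, but the links on them are still followed -->
    <meta name="robots" content="noindex, follow">

    <!-- "none" is shorthand for "noindex, nofollow" - switch back to it once everything is out -->
    <meta name="robots" content="none">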
On the other hand, you need to see whether Googlebot is actually reaching those pages:
check first that you don't have any robots.txt restrictions
see when Googlebot last hit any of the pages - that will give you a good idea and you can make a prediction.
If those pages are in the supplemental index, you may need to wait some time for Googlebot to revisit.
One last note: build XML sitemaps with all of those pages and submit them via WMT - that will definitely help get them in front of the firing squad and also let you monitor them better.
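A minimal sketch of such a sitemap (the URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://www.example.com/old-page-1.html</loc></url>
      <url><loc>http://www.example.com/old-page-2.html</loc></url>
    </urlset>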
Hope it helps.
We work with load balancing across multiple IPs a lot - there are no issues SEO-wise. However, you must be certain that those IPs didn't get associated with spam in the past and aren't on any blacklists from previous users, as in that case you can get a red flag for a bad neighbourhood.
In my personal experience this is the only downside - if that checks out, there is no reason for concern; only good things can happen.
Hope it helps.
I think it's worth developing the proposal in a custom manner for each client, based on their needs, KPIs, current status and so on.
We usually touch on the points below in each proposal:
general status of the domain and general guidelines on what can be improved
on-page optimisation improvements (URL structure, code, UX, internal linking, meta tags and content)
link profile analysis, improvement channels
social media in correlation with SEO.
Depending on the client and the stage of the proposal (initial, strategy, etc.) you can go into detail on all or some of the points.
Hope it helps.
If the articles are good, then there just might be value for the user. Depending on the niche/industry, those old articles could be very important.
Google doesn't like those, as you probably have a lot of impressions but no clicks (so basically no traffic), or maybe the "score" is bad (bounce rate - not the Google Analytics bounce rate, but Google's own bounce rate, i.e. if users bounce back to the SERPs).
Since you got hit by Panda, in my opinion there are two options:
1. Noindex those old pages. Users can still get to them via navigation, site search, etc., but Google won't see them. Google is fine with you having content (old, poor, thin, etc.) as long as it's not in the index. I work with a site that has several million pages and 80% is noindexed - everything is fine now (they also got hit by Panda).
2. Merge those pages into rich, cool, fresh topic pages (see the New York Times topic pages as an example - search for it; I think there is also an SEOmoz post, a Whiteboard Friday, about it). This is a good approach, and if you manage to merge those old pages with some new content you will be fine. Topic pages are great as an anti-Panda tool!
If you merge the pages into topic pages, do it based on a simple flow:
1. Identify a group of pages that cover the same topic.
2. Identify the page with the highest authority of them all.
3. Turn that page into the topic page - keep the URL.
4. Merge the others into this page (based on your new topic page structure and flow).
5. 301 redirect the others to this one (see the sketch after this list).
6. Build a separate XML sitemap with all those pages and load it up to WMT. Monitor it.
7. Build some links to some of those landing pages and get some minimal social signals to a few of them (depending on the number). Build an index type of page with those topic pages, or some of them (a user-friendly one or ones), and use it as a target to build some links to send the 'love'.
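For step 5, a rough sketch of those redirects (assuming an Apache .htaccess; the article and topic page URLs are placeholders):

    # each merged article 301s to the topic page that absorbed it
    Redirect 301 /old-article-1.html http://www.example.com/topics/blue-widgets/
    Redirect 301 /old-article-2.html http://www.example.com/topics/blue-widgets/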
Hope it helps - just some ideas.
Normally, for this type of setup, you won't get any value from that link.
That is because the iframe is in fact a separate page displayed within another page, used for this purpose alone - to link to you, that is. That page doesn't have any authority because no one is linking to it - not even the site it's part of; the page won't be in any navigation, sitemap or any other linked source, so it will have almost no value. Maybe a little value as part of an authority site, but I doubt it will count.
Just to underline the above: you will have a link from an obscure page, a satellite that to the user looks like part of an important page, but for Google and friends (since they see beyond the smoke screen) this page won't pass any link juice.
This is the reason why, in link building, one of the most important requirements is that the link shouldn't be within an iframe or placed via JavaScript.
My personal opinion is to move away from this concept, as it can eat up a lot of resources with little to no benefit.
Hope it helps as outside advice.
Looks like Google doesn't really like this one: http://screencast.com/t/1nwsdABxQv
So I wouldn't put any hope in getting some love from links from there...
However, you will for sure get some 'love' phone calls from a lot of companies trying to push services and products.