Targeting City via Web Server
-
Here's a question I can't seem to find an answer to.
Does web hosting within a targeted city make a difference in the engines?
For example, a site targeting the Denver area, with web hosting in Denver. Will this boost the ranking, or is targeting limited to countries?
Thanks!
-
Thanks, that's what I assumed (re: city targeting)
-
Agree with Adam. A lot of people in the UK use US web hosting and it makes no difference (from what I have seen) to rankings.
-
I've never seen any evidence or indication that it matters what city your web hosting is in.
Related Questions
-
Web Page Dropped Out of Google?
One of our web pages seems to have completely dropped out of Google after featuring on page 1 for a number of years. It can't be a site-wide issue, as all other web pages are performing as normal. The page is http://www.contractormoney.com/income-protection/ and the key phrase it was performing well for was 'contractor income protection'. Any ideas??
Technical SEO | | Pete40 -
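A quick first-pass check for a page that has vanished from the index is to confirm the URL still returns a 200 and is not carrying a noindex directive. A minimal sketch of that check, assuming Node 18+ where fetch is available globally, using the URL from the question above; the regex checks are deliberately simplistic:

```typescript
// Hypothetical diagnostic sketch: fetch the page and look for the most common
// technical reasons a long-standing page drops out of the index.
async function checkPage(url: string): Promise<void> {
  const res = await fetch(url);
  const html = await res.text();

  // Simplistic checks; attribute order inside the meta tag may vary in practice.
  const metaNoindex = /<meta[^>]+name=["']robots["'][^>]*noindex/i.test(html);
  const headerNoindex = (res.headers.get("x-robots-tag") ?? "").includes("noindex");

  console.log(`HTTP status: ${res.status}`);
  console.log(`meta robots noindex: ${metaNoindex}`);
  console.log(`X-Robots-Tag noindex: ${headerNoindex}`);
}

checkPage("http://www.contractormoney.com/income-protection/").catch(console.error);
```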
Blocking Affiliate Links via robots.txt
Hi, I work with a client who has a large affiliate network pointing to their domain, which is a large part of their inbound marketing strategy. All of these links point to a subdomain at affiliates.example.com, which then redirects the links through a 301 redirect to the relevant target page for the link. These links have been showing up in Webmaster Tools as top linking domains and also in the latest downloaded links reports.

To follow guidelines and ensure that these links aren't counted by Google for either positive or negative impact on the site, we have added a block on the robots.txt of the affiliates.example.com subdomain, blocking search engines from crawling the full subdomain. The robots.txt file contains the following code: User-agent: * Disallow: / We have authenticated the subdomain with Google Webmaster Tools and made certain that Google can reach and read the robots.txt file, so we know crawlers are being blocked from reading the affiliates subdomain.

However, we added this subdomain block to the robots.txt a few weeks ago, and links are still showing up in the latest downloads report as first being discovered after we added the block. It's been a few weeks already, and we want to make sure that the block was implemented properly and that these links aren't being used to negatively impact the site.

Any suggestions or clarification would be helpful: if the subdomain is being blocked for the search engines, why are the search engines following the links and reporting them in the www.example.com GWMT account as latest links? And if the block is implemented properly, will the total number of links pointing to our site, as reported in the "links to your site" section, be reduced, or does this not have an impact on that figure?

From a development standpoint, it's a much easier fix for us to adjust the robots.txt file than to change the affiliate linking connection from a 301 to a 302, which is why we decided to go with this option.

Any help you can offer will be greatly appreciated. Thanks, Mark
Technical SEO | | Mark_Ginsberg0 -
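One way to sanity-check the setup described in the question above is to fetch the subdomain's robots.txt directly and confirm the blanket block is actually being served. A minimal sketch, assuming Node 18+ with the global fetch API and the affiliates.example.com placeholder from the question; the parsing is deliberately naive:

```typescript
// Fetch the affiliate subdomain's robots.txt and look for the blanket block
// quoted in the question (User-agent: * / Disallow: /).
async function checkRobots(): Promise<void> {
  const res = await fetch("https://affiliates.example.com/robots.txt");
  const body = await res.text();

  // Very simplistic parse: a wildcard user-agent group plus a bare "Disallow: /".
  const blocksAll =
    /user-agent:\s*\*/i.test(body) && /^disallow:\s*\/\s*$/im.test(body);

  console.log(`HTTP ${res.status}`);
  console.log(blocksAll ? "Blanket Disallow found." : "No blanket Disallow found.");
}

checkRobots().catch(console.error);
```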
Would using javascript onclick functions to override href target be ok?
Hi all, I am currently working on a new search facility for my ecommerce site. It has very quickly dawned on me that this new facility is far better than my standard product pages from a user point of view - i.e. lots of product attributes for customers to find what they need faster, the ability to compare products, etc. All in all just better. BUT NO SEO VALUE!!!

I want to use this search facility instead of my category/product pages. However, as they are search pages I have applied a robots noindex to them, and I don't think it's wise to change that. I have spoken to the developers of this software and they suggested I could use some JavaScript in the navigation to change the onclick function to take the user to the search equivalent of the page. They said this way my normal pages are the ones that are still indexed by Google etc., but the user has the benefit of using the improved search pages.

This sounds perfect, however it also sounds a little deceptive, and I know Google has loads of rules about these kinds of things. The last thing I want is to get any kind of penalty or any negative reaction from an SEO point of view. I am only considering this as it will improve the user experience on my website.

Can anyone advise if this is OK, or a "no no"?

P.S. For those wondering, I use an "off the shelf" cart system and it would cost me an arm and a leg to have these features built into my actual category/product pages.
Technical SEO | | isntworkdull0 -
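For reference, the pattern the developers in the question above are describing usually looks something like the sketch below: the anchor keeps its normal, indexable href, while a click handler sends human visitors to the search version instead. The data-search-url attribute is a hypothetical way of storing the search-equivalent URL; whether the swap is acceptable to Google is exactly the open question here.

```typescript
// Hypothetical sketch: anchors keep their real href (the indexed category/product
// page); clicks are rerouted to the richer search-facility equivalent.
document.querySelectorAll<HTMLAnchorElement>("a[data-search-url]").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();                          // skip the indexed href
    window.location.href = link.dataset.searchUrl!;  // go to the search version
  });
});
```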
How to Delete a Page on the Web?
Google reports, and I have confirmed, that the following old page is still present on the Web. http://www.audiobooksonline.com/The_Great_American_Baseball_Box_Greatest_Moments_from_the_Last_80_Years_original_audio_collection_compact_discs.html This page hasn't been in our site's directory for some time and is no longer needed by us. What is the best way to fix this Google-reported crawl error?
Technical SEO | | lbohen0 -
What sitemap generator for Mac for a PHP website?
I would like to generate a sitemap for my website, and I have a Mac computer. Can you advise me on which sitemap generator to use?
Technical SEO | | maestrosonrisas0 -
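Whichever tool is chosen, the output is the same XML sitemap format, so a tiny script can also do the job on a Mac. A minimal sketch, assuming Node.js is installed and using hypothetical example.com URLs in place of the real page list (a real generator would crawl the site or read the URLs from the CMS):

```typescript
import { writeFileSync } from "node:fs";

// Hypothetical URL list; replace with the site's real pages.
const urls = [
  "https://www.example.com/",
  "https://www.example.com/about",
];

// Build one <url><loc> entry per page in the standard sitemap namespace.
const entries = urls
  .map((loc) => `  <url>\n    <loc>${loc}</loc>\n  </url>`)
  .join("\n");

const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  `${entries}\n` +
  `</urlset>\n`;

writeFileSync("sitemap.xml", sitemap);
```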
I have found a website on my dedicated server which is not mine
Hi, I have found a website on my dedicated server that is not mine, by using a number of tools including http://www.yougetsignal.com/tools/web-sites-on-web-server/. The site I found that was not mine is buycostumes.com. I contacted my hosting company and told them, and this is the reply I got back:

"We have checked the issue and it appears there is a bug/glitch in this site, because after we checked the mentioned domain we found that this website responds on an IP which is not ours and is not assigned to your dedicated server, as you may preview below:

Found 3 domains hosted on the same web server as buycostumes.com (66.45.3.44): buycostumes.com (linkback), target.com (linkback), www.certifigroup.com (linkback)

Thus if you find any issues on your web server, please mention them in this ticket and we will be glad to provide you with further assistance on this matter. Please feel free to contact us if you have any further technical difficulties. Best regards."

Now they have not said if they are going to do anything about this, and to be honest I am getting fed up with the hosting company, because I am being told that the slow speed I was getting for my website www.in2town.co.uk was down to it taking a long time to reach my server before reaching my site. I'm not sure if this is correct or not, but with all the help I have received off SEOmoz I have managed to increase the speed of my site - and now I have found this problem.

Can anyone tell me if I am being played, and can anyone recommend a professional UK hosting company? Also, is this site affecting my speed and my site performance?
Technical SEO | | ClaireH-1848860 -
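One thing that is easy to verify independently of the host's reply above is which IP address buycostumes.com actually resolves to, and whether that matches the dedicated server. A minimal sketch, assuming Node.js; 203.0.113.10 is a hypothetical placeholder for the server's real IP:

```typescript
import { resolve4 } from "node:dns/promises";

// Hypothetical placeholder; replace with the dedicated server's real IP address.
const myServerIp = "203.0.113.10";

async function checkDomain(domain: string): Promise<void> {
  const addresses = await resolve4(domain); // current A records for the domain
  const onMyServer = addresses.includes(myServerIp);
  console.log(`${domain} resolves to: ${addresses.join(", ")}`);
  console.log(onMyServer ? "-> points at this server" : "-> hosted elsewhere");
}

checkDomain("buycostumes.com").catch(console.error);
```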
Should you worry about adding geo-targeted pages to your site?
Post-Panda, should I worry about adding a bunch of geo-targeted landing pages at once? It's a community site; people have added their location on their profile pages. I'm worried that if we decide to make all the locations into hyperlinks that point to new geo-targeted pages, it could get us extra traffic for those geo-specific keyword phrases but penalize the site as a whole for having so many low-quality pages. What I'm thinking is maybe to start small and turn, say, United States into a hyperlink that points to a page (that would house our community members who reside in the United States) and add extra unique content to the page. And only add a new location page when we know we'll be adding unique content to it, so it's not basically just page sorting. Thoughts? Hope that makes sense. Thanks!
Technical SEO | | poolguy0 -
rel=canonical and web app
I started a web app campaign for a site that I recently finished. It had no errors or warnings, but issued rel=canonical notices for every page on the site. What does this mean?
Technical SEO | | waynekolenchuk0
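In Moz's crawl reports a rel=canonical notice is generally informational rather than an error: it flags that a page declares a canonical URL. A minimal sketch of what that declaration looks like from the browser's point of view (run in the console of any page on the site; the example.com URL in the comment is hypothetical):

```typescript
// Read the page's canonical declaration, i.e. the tag the crawl notice refers to:
// <link rel="canonical" href="https://www.example.com/preferred-url/">
const canonical = document.querySelector<HTMLLinkElement>('link[rel="canonical"]');

if (canonical) {
  console.log(`This page declares ${canonical.href} as its canonical URL.`);
} else {
  console.log("No rel=canonical tag found on this page.");
}
```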