External URLs - ones you can't reach out to
-
My fellow Mozzers,
I have been reviewing our Google Webmaster error reports and noticed a high number of URL errors. These URLs are generated from international sites, mainly in China. Upon further inspection, they look to be links to dynamic URLs (search pages) that are no longer active on our site.
These Chinese sites are linking to old URLs that simply spew out a 'Bad Request' page now. The problems I face are:
- I can't contact these Chinese sites to have them remove or edit the URLs.
- I could work with my developers to identify the URLs and redirect them all to the homepage, but is that a good idea? The inbound links would still be present.
- Some of these look like pages that haven't been updated in a while, so I now have links from sites that are archived, or "dead".
Have you tackled anything like this before? Thoughts are welcome.
Thanks
-
Agreed. Great answer, Highland.
-
I'm actually looking into creating a dynamic block on a static 404 page. The block would feed in products/data based on the URL string, so the page recognizes the URL and displays relevant content.
The thing with the Chinese URLs is that they introduce a unique character into the URL, which is why I get a 'Bad Request' page rather than a 404 - the server doesn't recognize the character. I'm in talks with the developer to see if we can direct those requests to a 404 instead, something along the lines of the sketch below.
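A minimal sketch of the idea, assuming a Python/Flask stack purely for illustration (our real platform, catalog, and template may differ, and a 400 that the web server rejects before the request ever reaches the application would have to be remapped at the server level instead):

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

# Hypothetical in-memory catalog standing in for the real product data.
CATALOG = {"widgets": "/products/widgets", "gadgets": "/products/gadgets"}

PAGE = """<h1>Page not found</h1>
<p>You might be looking for:</p>
<ul>{% for name, url in products %}<li><a href="{{ url }}">{{ name }}</a></li>{% endfor %}</ul>"""

def suggest_products(path):
    # Match URL segments against catalog keys to build the dynamic block.
    segments = {s.lower() for s in path.strip("/").replace("-", "/").split("/")}
    return [(n, u) for n, u in CATALOG.items() if n in segments] or list(CATALOG.items())

@app.errorhandler(404)
def not_found(error):
    # Keep the 404 status so search engines drop the dead URL, but show
    # visitors something useful instead of a bare error page.
    return render_template_string(PAGE, products=suggest_products(request.path)), 404

@app.errorhandler(400)
def bad_request(error):
    # Requests whose URL contains the unrecognized character arrive as
    # 'Bad Request'; route them through the same helpful 404 response.
    return not_found(error)
```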
Thanks
-
I would just 404 the pages (provided they aren't already) and move on. There's nothing wrong with having dead links pointing to your site. Some people would recreate the pages to try to capture some of that link juice, but if the links are from abandoned Chinese sites, I don't think they'd provide the quality you need.
If you want the best of both worlds, have the page return a 404 status but serve some content that directs anyone who happens to click through to another part of your site.
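One thing worth verifying once that's in place: the helpful page must still return a real 404 status, not a 200, or Google will treat it as a soft 404. A quick check, purely illustrative (the requests library is assumed installed; the URL is a placeholder):

```python
import requests

# Placeholder URL - swap in one of the dead links from the error report.
resp = requests.head("https://www.example.com/old-search-url",
                     allow_redirects=False, timeout=10)
print(resp.status_code)  # want 404 (or another 4xx), not 200 or 301/302
```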
Related Questions
-
Can anyone tell me how these are pulled?
Hi guys, I look after websites for multiple automotive brands, and one of my sites is getting these images (see screenshot) pulled and placed into search results. When you click on them, you're taken to the model page for that specific vehicle. You can also scroll to the left, and more vehicle images will appear. I'm trying to figure out how these are pulled in and what I can do to get them into the search results for my other websites - any ideas? Thanks, Gareth
SERP Trends | InspireGlobalNetworks
-
Houston Company Needs Help (Will Our SEO Work Be Destroyed While the Site Is Down? Can Anything Be Done?)
I'm a Moz member, mostly just a lurker, and I love the Moz community. I work at a nonprofit that produces benchmarking data and helps school districts improve their processes in education. Our building flooded and our site is currently offline - is there anything we can do to stop or lessen any drop in SEO rankings between now and when we are back up? We have worked very hard to get these rankings. I know it is minor compared to all the tragedy in Houston, but we have worked hard for these SEO gains (YEAH MOZ!) and I'd hate to lose them because of Harvey. Any suggestions or assistance appreciated! Ralph
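For reference, the widely documented stopgap for temporary outages is to answer crawlers with HTTP 503 plus a Retry-After header, which signals that the downtime is temporary so indexed pages are revisited later rather than dropped. A minimal sketch of a stand-in responder (Flask is assumed purely for illustration):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def maintenance(path):
    # 503 = temporary outage: crawlers keep the indexed pages and
    # retry later; Retry-After hints when (in seconds).
    return ("We're temporarily offline - back soon.", 503,
            {"Retry-After": "86400"})
```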
SERP Trends | inhouseninja
-
URL Parameter for Limiting Results
We have a category page that lists products. We have parameters, and the default is to limit the page to displaying 9 products. If users wish, they can view 15 or 30 products on the same page. The parameter is ?limit=9, ?limit=15, and so on. Google is flagging this as duplicate meta tags and meta descriptions via HTML Suggestions. I have a couple of questions.

1. What should my goal be? Is it to have Google crawl the page with 9 items, or the page with all items in the category? In Search Console, the first part of setting up a URL parameter asks, "Does this parameter change page content seen by the user?" In my opinion, the answer is yes. Then, when I select how the parameter affects page content, I assume I'd choose Narrows, because it's either narrowing or expanding the number of items displayed on the page.

2. When setting up my URL parameters in Search Console, do I want to select Every URL or just let Googlebot decide? I'm torn, because the description of Every URL says the setting could result in Googlebot unnecessarily crawling duplicate content on your site (it's already doing that). Reading further, I begin to second-guess the Narrows option. Now I'm at a loss on what to do. Any advice or suggestions will be helpful! Thanks.
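Whatever the parameter setting ends up being, a common complementary fix for the duplicate-meta warnings is a rel=canonical that points every limit variant at a single URL. A sketch of deriving that canonical target (pure illustration; the domain and helper are made up, and only the ?limit parameter comes from the question):

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def canonical_url(url, drop=("limit",)):
    # Strip display-only parameters so the ?limit=9/15/30 variants all
    # declare the same page in their <link rel="canonical"> tag.
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop]
    return urlunparse(parts._replace(query=urlencode(query)))

print(canonical_url("https://www.example.com/category?limit=15&page=2"))
# -> https://www.example.com/category?page=2
```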
SERP Trends | dkeipper
-
Google News results... can it be SEOed?
Hello everyone. I simply wanted to know if anyone has some useful insight on what it takes for a legitimate website to appear within the Google News results. I have rarely, if ever, had to dabble in this kind of SEO, but after coming across this situation with a perfectly legitimate website, I'm now scratching my head. The site in question is a very well-established website with zero "SEO" done to it. All links are organic, all traffic is legit, and they have a VERY strong social media presence. The site's current DA is 50. It's a 3-letter domain. Some of the points I believe are important:
- quantity and quality of content (% of aggregated vs. actually original content)
- overall % of "news" content vs. the rest of the site's content
- authors'/writers' credentials (how would Google evaluate the authority of a writer, so that his/her content is newsworthy?)
- overall site authority
- rich snippets and code needed to be indexed? I think rel=publisher or rel=author tags have something to do with it
- making sure basic SEO is in place: canonical tags, unique headers, etc.

What am I missing? They have one particular competitor that seems to rank for almost everything news-related, while being a similar site in content and authority, yet they are nowhere. They have submitted to Google News before (I'm not even sure what that entails) but failed to be included - does this put a "stain" on them for any reason, or impede the possibility of being indexed in the Google News results in the future? ANY input is appreciated.
SERP Trends | 1stOTLJK
-
Why does Google show sites in the SERP with a Wikipedia link for the brand name?
Hi Mozzers & Moz Team, can anyone explain how this actually happens - a site appearing in Google's SERP with a Wikipedia link shown alongside its brand name? It's great to have, but I really want to know how this brand-associated Wikipedia link comes about. Some results also come with a DMOZ link if the brand isn't found in Wikipedia. Please see the snapshot, where I've bordered it in red, which should show it clearly. Awaiting your responses.
SERP Trends | Futura
-
How to count the number of app installations by users who install the app from https://play.google.com?
The task is to find out which banner on my site drives more installations of the game. The banner URLs link directly (with the referral source specified in the URL) to the game's page on https://play.google.com. Google Analytics and Google Play are connected, but I've run into the problem that Google Analytics only counts visitors who install the app via the Google Play app, so visitors going from the banners to play.google.com aren't counted at all. So the question is: is there any way to count visitors to the app's page on play.google.com, and the number of installations by those visitors?
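For what it's worth, the documented mechanism for this is Google Play's install referrer: campaign parameters are packed into a URL-encoded referrer value on the store link, which a linked Analytics property can then attribute installs to. A sketch of building such banner URLs (the package name and campaign labels are placeholders):

```python
from urllib.parse import quote, urlencode

def play_store_url(package_id, source, medium="banner"):
    # Pack the campaign labels into the 'referrer' parameter, which
    # Google Play forwards to the app on install for attribution.
    campaign = urlencode({"utm_source": source, "utm_medium": medium})
    return ("https://play.google.com/store/apps/details?id=%s&referrer=%s"
            % (package_id, quote(campaign, safe="")))

# One distinct URL per banner makes the installs comparable.
print(play_store_url("com.example.game", "homepage_banner_a"))
print(play_store_url("com.example.game", "homepage_banner_b"))
```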
SERP Trends | seoMob
-
Should URLs Follow the Navigation of a Site?
This follows an SEOmoz webinar the other day, where the presenter made a case for eliminating folders in URLs, as these could confuse a bot crawling a site. I love the idea. However, there are still a lot of best-practice guidelines out there suggesting there should be a logic to the URL, just as there is in your navigation. My question in that regard: is there any value for a crawling bot in a URL that follows the navigation, "stuffing" the URL with folders identical to the navigation present on the site, and even a secondary navigation present on all pages? Example: the navigation of a site goes [Domain > Folder > Sub-folder > Sub-sub-folder > Product]. What is the benefit of using a URL such as [www.domain.com/folder/sub-folder/sub-sub-folder/product] vs. [www.domain.com/product]? Thank you for your insights! PS: this is a WP site we are talking about.
SERP Trends | Discountvc
-
My customer owns about 50 domain names - what can I suggest they do with them to increase SEO?
This customer has purchased about 50 domains over the years and has them all redirecting to their main website. What could I suggest they do with all of these domain names to increase SEO? Any ideas are greatly appreciated... Thanks!
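A first step worth checking in a case like this: confirm the existing redirects are 301s (the conventional signal for a permanent move) rather than 302s, and see where each one lands. A quick audit sketch (the domain list is a placeholder standing in for the customer's ~50 names):

```python
import requests

# Placeholder list standing in for the customer's ~50 domains.
domains = ["old-brand.example", "campaign-site.example"]

for d in domains:
    # allow_redirects=False shows the first hop's own status code.
    r = requests.get(f"http://{d}", allow_redirects=False, timeout=10)
    print(d, r.status_code, r.headers.get("Location"))
```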
SERP Trends | jboddiford