Problem with indexed files before domain was purchased
-
Hello everybody,
We bought this domain a few months back and we're trying to figure out how to get rid of indexed pages that (I assume) existed before we bought it - the domain was registered in 2001 and has had a few owners.
I attached 3 files from my Webmaster Tools. Can anyone tell me how to get rid of those "pages"? And more importantly: aren't these kinds of "pages" the result of some kind of "sabotage"?
Looking forward to hearing your thoughts on this.
Thank you,
Alex
-
Thank you Ryan!
-
You can file a Reconsideration Request and explain you have recently acquired the domain and suspect it may be under a penalty. If the site is under a manual penalty, it would likely be lifted. Otherwise, the site may be under an algorithmic penalty. Build your site out, establish new links and you should be fine over time.
-
Thanks Ryan,
Yes, the domain is ours, and I was hoping to see those links disappear from our Webmaster Tools - I guess I need to be more patient.
It appears that in the past this domain was a paid directory - I wonder if there's a penalty on it as well. It's kind of strange for it not to rank for its own name...
-
Is stocktips.com your domain? If so, it seems you have removed the old content which means the links on those pages will be gone as well. All the internal links and their 404 errors will naturally disappear in a month or two. You should see the number of those errors decreasing each week as they drop off.
You could configure the server to return a 410 (Gone) status for those pages, but it has already been over 2 1/2 weeks since the Oct 8th date in your screenshot, so I would simply allow them to fall off naturally.
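If you did want to serve 410s rather than wait, a minimal .htaccess sketch looks like the following (assuming Apache with mod_alias; the /old-directory/ path is a hypothetical placeholder for the stale URLs reported in Webmaster Tools):

```apache
# Return 410 Gone for any URL under the old, removed section.
# /old-directory/ is a placeholder - substitute the actual paths
# showing up as errors in Webmaster Tools.
RedirectMatch gone ^/old-directory/.*$

# A single stale page can be handled the same way:
# RedirectMatch gone ^/old-page\.html$
```

A 410 signals the page is gone deliberately, which can prompt slightly faster de-indexing than a 404, though Google treats both much the same over time.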
Related Questions
-
Google not indexing main keyword on homepage in 2 countries with same language; rest of pages no problem
Hello, two of the same websites, two countries, same language: http://www.lavistarelatiegeschenken.nl / http://www.lavistarelatiegeschenken.be. The main keyword "relatiegeschenken" was in the top 10 in the Netherlands (steady position for 2 years), but in Belgium it wasn't even in the top 150, although other keywords had good positions. That was strange enough, and now everything has suddenly turned around: the main keyword "relatiegeschenken" is no longer in the top 10 in the Netherlands - it's gone, while other keywords still hold good positions - and the main keyword is suddenly in the top 10 in Belgium, where it wasn't for 2 years; other pages are still OK. They are exactly the same websites in the same language, so duplicate content. But my programmer told me the settings in Google Webmaster Tools are right, so no problem with duplicate content? I really don't understand: first the main keyword was in the top 10 in the Netherlands and not in Belgium, and now it's reversed - top 10 in Belgium and not findable in the Netherlands for the main keyword. Maybe a problem in the code? Maybe problems because the websites are identical and active in two different countries with the same language? There is no penalty message in WMT and no spam links; last week I deleted two strong but, according to Link Detox, bad links. I can't find a solution, but it's a really important keyword that my customer wants back in the top 10 in the Netherlands, like it was. All other positions and visitor numbers are the same. I had this before with the Belgian site - Google also didn't index the homepage for the main keyword, and then it suddenly showed in the Belgian top 10. It has turned around. Kind regards, Marcel
Technical SEO | | Bossie720 -
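The classic fix for identical sites targeting two countries that share a language is hreflang annotations, which tell Google which country each version targets. A hedged sketch using HTTP Link headers in .htaccess (assuming Apache 2.4+ with mod_headers; the more common form is equivalent link elements in each page's head, and hreflang must ultimately be set per page, pointing at the matching page on the other site - this covers only the homepage):

```apache
# Hypothetical .htaccess on the .nl site: declare the NL and BE
# homepages as country-targeted alternates of each other.
<If "%{REQUEST_URI} == '/'">
  Header add Link '<http://www.lavistarelatiegeschenken.nl/>; rel="alternate"; hreflang="nl-nl"'
  Header add Link '<http://www.lavistarelatiegeschenken.be/>; rel="alternate"; hreflang="nl-be"'
</If>
```

With correct reciprocal hreflang, Google can show the .nl site to Dutch searchers and the .be site to Belgian searchers instead of arbitrarily picking one and filtering the other as a duplicate.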
Problems with too many indexed pages
A client of ours has not been able to rank very well the last few years. They are a big brand in our country, have more than 100 offline stores, and have plenty of inbound links. Our main issue has been that they have too many indexed pages. Before we started, they had around 750,000 pages in the Google index. After a bit of work we got it down to 400,000-450,000. During our latest push we used the robots meta tag with "noindex, nofollow" on all pages we wanted out of the index, along with a canonical to the correct URL - nothing was done in robots.txt to block crawlers from the pages we wanted out. Our aim is to get it down to roughly 5,000+ pages; they just passed 5,000 products + 100 categories. I added this about 10 days ago, but nothing has happened yet. Is there anything I can do to speed up the process of getting all the pages out of the index? The site is vita.no if you want to have a look!
Technical SEO | | Inevo0 -
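The meta-tag approach described above can also be expressed as an X-Robots-Tag response header, which is handy when editing templates is awkward. A sketch in .htaccess (assuming Apache 2.4+ with mod_headers; the /filter/ pattern is a hypothetical stand-in for whatever URL pattern the unwanted pages share):

```apache
# Hypothetical: send noindex,nofollow as an HTTP header for faceted
# /filter/ URLs instead of (or alongside) the meta robots tag.
<If "%{REQUEST_URI} =~ m#^/filter/#">
  Header set X-Robots-Tag "noindex, nofollow"
</If>

# Important: do NOT also block these URLs in robots.txt - crawlers
# must be able to fetch a page to see the noindex directive at all.
```

Either way, de-indexing hundreds of thousands of URLs happens only as fast as Google recrawls them, so weeks to months is normal; the robots.txt caveat in the comment is the main thing that can silently stall it.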
Problem with Google SERPS
I am running the Yoast SEO plugin in WP. I just noticed that when I google the client, none of their metadata is showing. I see that I had Facebook OG enabled, which looks like it made duplicates of all the titles, etc. Would that be the problem? I have since turned it off and am hoping that was it. Also, when the client searches, the meta description says "you've viewed this site many times" - what is that?
Technical SEO | | netviper0 -
Preferred domain
In GWT it gives the following options, but which is best, and why? "If you specify your preferred domain as http://www.example.com and we find a link to http://example.com, we'll consider both links the same."
- Don't set a preferred domain
- Display URLs as www.example.com
- Display URLs as example.com
Technical SEO | | jwdl0 -
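Whichever version you pick in GWT, it's common to enforce the same preference at the server so only one hostname ever resolves. A hedged .htaccess sketch (Apache with mod_rewrite assumed) that 301s the bare domain to www:

```apache
RewriteEngine On
# Redirect example.com to www.example.com, keeping the full path.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The reverse (www to bare) is the same pattern with the hosts swapped; what matters most is that the server redirect and the GWT setting agree.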
We have set up 301 redirects for pages from an old domain, but they aren't working and we are having duplicate content problems - Can you help?
We have several old domains. One is http://www.ccisound.com - our "real" site is http://www.ccisolutions.com. The 301 redirect from the old domain to the new domain works. However, the 301 redirects for interior pages, like http://www.ccisound.com/StoreFront/category/cd-duplicators, do not work. This URL should redirect to http://www.ccisolutions.com/StoreFront/category/cd-duplicators, but as you can see it does not. Our IT director supplied me with this code from the .htaccess file in hopes that someone can help point us in the right direction and suggest how we might fix the problem:
RewriteCond %{HTTP_HOST} ccisound.com$ [NC]
RewriteRule ^(.*)$ http://www.ccisolutions.com/$1 [R=301,L]
Any ideas on why the 301 redirect isn't happening? Thanks all!
Technical SEO | | danatanseo0 -
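For what it's worth, the two rewrite lines quoted in that question are plausible syntax once the missing spaces are restored; when a rule like this matches the root but not interior URLs, the usual suspects are rule ordering and missing directives. A hedged sketch of a complete block (assuming Apache mod_rewrite):

```apache
# Hypothetical full .htaccess block for the old domain.
RewriteEngine On

# Place this redirect BEFORE any CMS/storefront rewrite rules, which
# often end with [L] and would otherwise swallow interior URLs.
RewriteCond %{HTTP_HOST} ccisound\.com$ [NC]
RewriteRule ^(.*)$ http://www.ccisolutions.com/$1 [R=301,L]
```

Also worth checking: whether the old domain is served from its own document root with its own .htaccess, and whether the vhost's AllowOverride setting actually permits mod_rewrite directives there.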
301 for old domain to new domain - Joomla plugin or cpanel?
A client changed domains and both are being indexed. There are thousands of content pages. I can install a 301-redirect Joomla plugin and configure it so that each page redirects to the new domain, but I have a feeling I would need to manually set every page. OR I can create a domain-level redirect in cPanel using wildcards, which I think will automatically pass every old URL to the new URL. Which is the better approach? The cPanel option sounds like less work.
Technical SEO | | designquotes0 -
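A cPanel "wildcard" domain redirect typically just writes a mod_rewrite block like the following into .htaccess, so either route ends in the same place - but the server-level rule is far less work than per-page plugin entries. A sketch (Apache assumed; olddomain.com/newdomain.com are placeholders):

```apache
RewriteEngine On
# Match the old domain (with or without www) and forward every URL
# to the same path on the new domain with a permanent redirect.
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

The $1 carries the original path, so /some/article on the old domain lands on /some/article on the new one - no per-page mapping needed as long as the URL structure is unchanged.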
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from getting indexed by all search engines? One item I cannot use is the meta "nofollow" tag. Thanks! - Kyle
Technical SEO | | kchandler0 -
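Since the meta-tag route is off the table there, two server-side options are a robots.txt served only on the subdomain, or a blanket X-Robots-Tag header. The header version, as a hedged .htaccess sketch placed at the subdomain's document root (Apache with mod_headers assumed):

```apache
# Hypothetical .htaccess at the subdomain's document root:
# tell all search engines not to index anything served from here.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Unlike a robots.txt Disallow (which blocks crawling but can leave already-known URLs lingering in the index), the noindex header removes pages from the index while still letting crawlers fetch them and see the directive.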
Will password protecting my test sub-domain help keep the SEs from indexing it?
Hi, all. I'm working in an unfamiliar area here, so I hope someone can tell me if I'm out in left field. I am building a sub-domain called http://test.mysite.com, so that I can upload a client's still-under-construction site while working on it. When completed, it'll go up on his server, replacing his old site. Obviously, I want to ensure that it doesn't get indexed while it's on my test platform. A friend suggested that I password it with htaccess and htpasswd, since we can never be certain the SEs will obey site directives. My question is, what do you think would be the best (and hopefully, simplest) way to accomplish this? I'm no code-monkey, so "simple" is a big plus! Doc By the way, the platform will be Wordpress CMS.
Technical SEO | | Doc_Sheldon0
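Password protection via .htaccess/.htpasswd is indeed the most reliable way to keep a test subdomain out of the index, since crawlers simply cannot fetch pages they can't authenticate to. A minimal sketch (Apache assumed; the file path and realm name are placeholders):

```apache
# Hypothetical .htaccess for http://test.mysite.com - require a login
# for everything. Keep the .htpasswd file OUTSIDE the web root.
AuthType Basic
AuthName "Development site - authorized users only"
AuthUserFile /home/youruser/.htpasswd
Require valid-user
```

The password file itself can be created with `htpasswd -c /home/youruser/.htpasswd username`. Because search engines receive a 401 for every URL, nothing gets indexed, with no reliance on robots directives being honored.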