SEO trending down after adding content to website
-
Hi
Looking for some guidance. I added about 14 pages of unique content and did all of the on-page SEO work using Yoast (all pages show a 'good' status). Some of the website architecture was also changed, mainly on one page. That said, we got a significant bump the day I implemented the changes, but every day since then results have been very bad, worse than we had before, for about three days now.
I did resubmit the updated sitemap to GWT and I'm showing no crawl errors.
Also, curious if my robots.txt file could be the issue. All it contains is:
User-agent: *
Disallow: /wp-admin/
Any insight or advice is greatly appreciated!
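For comparison, a fairly typical WordPress robots.txt looks something like this (a sketch: the sitemap URL is a placeholder, and the admin-ajax Allow line is optional depending on your theme):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap_index.xml
```

A file as minimal as yours only blocks /wp-admin/, so it would not explain a ranking drop on content pages.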
Thanks for your time -
-
I think Moosa and Andy both make great points which I agree with. I would definitely give it more time, and it would be easier to give input if we could see your domain. As for the content you have added in the form of new pages: are these targeting new keywords, or ones you were already ranking for? I ask because I have seen people cause themselves issues by creating new pages targeting terms that existing pages were already ranking for, which led to keyword cannibalization. Jon Earnshaw gave a brilliant talk on this at Brighton SEO earlier this year - https://www.youtube.com/watch?v=ASsxh8ZwseQ
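On the off chance it's useful, one quick way to spot potential cannibalization is to group a Search Console performance export by query and flag any query where more than one of your own URLs appears. A minimal sketch, assuming a CSV export with "query" and "page" columns (the exact column names in your export may differ):

```python
# Sketch: flag possible keyword cannibalization from a Search Console
# performance export. Assumes a CSV with "query" and "page" columns;
# adjust the column names to match your actual export.
import csv
from collections import defaultdict

def queries_with_multiple_pages(csv_path):
    pages_for_query = defaultdict(set)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            pages_for_query[row["query"]].add(row["page"])
    # Queries where two or more of your own URLs compete with each other.
    return {q: sorted(p) for q, p in pages_for_query.items() if len(p) > 1}
```

Any query it returns is worth a manual look: two competing URLs isn't always a problem, but it often is when the new page was meant to replace the old one.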
I just thought I would mention it on the off chance.
Best
Matt
-
Hi,
It's always awkward to speculate on issues like this, because there could be a problem we simply can't see.
However, as Moosa said, I would also give it a little time, as these things are often just part of the 'Google Dance' while Google decides what to do with the site and the new content. Your robots.txt file certainly isn't going to be the issue here.
If you wish to share the site and URLs, then we can take more of a look.
-Andy
-
Personally I think you have to give it some more time. If you have added new pages of unique content, just make sure that the quality of the content is good.
Also check that the keywords are mapped correctly. It is important to add some internal links too, as this will help Google crawl the new pages and pass link equity to them.
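As a rough sketch of auditing that internal linking, assuming you've exported a crawl of your site as a CSV of source/target link pairs (e.g. a crawler's inlinks report; the "source"/"target" column names are assumptions):

```python
# Sketch: find "orphan" pages that have no internal links pointing at them.
# Assumes a CSV of (source, target) internal link pairs exported from a
# site crawl; column names are assumptions, adjust to your export.
import csv
from collections import defaultdict

def find_orphans(link_csv, all_pages):
    inlinks = defaultdict(int)
    with open(link_csv, newline="") as f:
        for row in csv.DictReader(f):
            inlinks[row["target"]] += 1
    # Pages in your sitemap/crawl that nothing internal links to.
    return [p for p in all_pages if inlinks[p] == 0]
```

New pages that show up as orphans are exactly the ones that will struggle to get crawled and ranked.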
If I were in your place, I would also boost the new pages on social media and build some links to them.
Do the activities mentioned above, give it some time, and see how Google reacts.
Hope this helps!
Related Questions
-
Content change within the same URL/Page (UX vs SEO)
Context: I'm asking my client to create city pages so he can present all of his apartments in a specific sector, giving me a page that can rank for "apartment for rent in + sector". The page will present a map of all the sectors so that, after landing on the page, the user can navigate and choose the sector he wants.
Question: The UX team is asking whether we absolutely need to reload the sector page when the user clicks a location on the map, or whether they can switch the content within the same page/URL once the user is on the landing page.
My concerns: 1. Could this be treated as duplicate content if Google can crawl within the JavaScript app, or does Google only analyse its "first view" of the page? 2. Do you consider it preferable to keep the page change, so that the number of pages viewed increases?
Technical SEO | | alexrbrg0 -
Angular seo
Hi, how do you do SEO with Angular? Is there an easy way to get Google to crawl your website properly?
Technical SEO | | bigrat950 -
Development Website Duplicate Content Issue
Hi, we launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk) which was active for around 6-8 months before we migrated dev --> live (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again).
In late Jan 2013 I changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.
Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this:
Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt - learn more.
This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the sub domain. When I visit remove URLs, I enter dev.rollerbannerscheap.co.uk but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a sub domain, not a page. Can anyone help please?
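That "blocked by robots.txt" snippet is the clue: because the dev robots.txt blocks crawling, Google can never re-crawl those URLs to see the 301s or a noindex, so they linger as URL-only results. A common approach (a sketch, assuming Apache with mod_headers enabled, applied only on the dev subdomain) is to allow crawling again and serve a noindex header instead:

```apache
# Serve this only from the dev subdomain's vhost or .htaccess.
# Unblock the dev site in robots.txt so Googlebot can fetch pages,
# then let it see this header on every response:
Header set X-Robots-Tag "noindex, nofollow"
```

Once the dev URLs drop out of the index, you can re-block or password-protect the subdomain.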
Technical SEO | | SO_UK0 -
What I doing wrong when trying to search for links from external websites to my website
This is just a little frustrating question, nothing important, but I'm sure somebody will know the answer. In this week's Whiteboard Friday, Rand suggested at one point that when you're searching for links to your website, you can put a - followed by site: followed by your URL (like -site:yourwebsite.com) to get results from other websites while excluding your own pages. But it just doesn't work: I get no results, just an error message. Any idea why? If I remove the - I get tons of results, but they're on my own web pages.
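For what it's worth, the usual cause of that error is the dash character itself: copying the operator from a slide or document often pastes an en-dash (–) instead of a plain ASCII hyphen (-), and there must be no space between the hyphen and site:. For example:

```text
"yourwebsite.com" -site:yourwebsite.com    <- plain hyphen, no space: works
yourwebsite.com –site:yourwebsite.com      <- en-dash pasted from a doc: fails
```

The query as typed in the post above contains an en-dash, which would explain the error message.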
Technical SEO | | whitbycottages0 -
Website Redirects
Background information: We have a website (devicelock.com) which is currently our corporate website. The company used to operate under ntutility.com, which is now being redirected to devicelock.com via a DNS forward - a 302 redirect. The IT admin (a founder of the company) is reluctant to change it to a 301. The current flow is: ntutility.com redirects to protect-me.com, which then redirects again to devicelock.com. When I search for DeviceLock on Google, it shows up as ntutility.com; there is no devicelock.com homepage in Google search. Question: Are there any negative implications of this? Is this hurting our SEO in any way? When I do link building, will this have any negative effects? Will my links for DeviceLock be attributed to devicelock.com?
Technical SEO | | Devicelock0 -
Does adding tooltips to a site hurt its SEO?
I'm wanting to add tooltips to my site as it's intended for non-technical people who want high-tech equipment and services. I thought that by adding tooltips, I could clear up any confusion they may have about a particular word right there, rather than them having to search for what it means. I did some research online and saw that it may hurt SEO ratings, but wanted to verify here first before deciding.
Technical SEO | | sDevik0 -
Duplicate content
I have just run a report in SEOmoz on my domain and noticed that there are duplicate content issues. The duplicates are: www.domainname/directory-name/ and www.domainname/directory-name/index.php. All my internal links and external links point to the first URL, as I prefer this style since it looks clear and concise. However, doing this has created duplicate content, as within the site itself I have an index.php page inside /directory-name/ to show the page. Could anyone give me some advice on what I should do please? Kind Regards
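One common fix for this pattern (a sketch, assuming Apache with mod_rewrite enabled; test on a staging copy first) is to 301 redirect every .../index.php request to its bare directory URL, so both search engines and visitors consolidate on the version you prefer:

```apache
RewriteEngine On
# Redirect any .../index.php request to the bare directory URL.
# The THE_REQUEST condition matches only the client's original request,
# which guards against loops from DirectoryIndex internal subrequests.
RewriteCond %{THE_REQUEST} /index\.php [NC]
RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]
```

A rel=canonical tag pointing at the directory URL achieves a similar consolidation if you can't touch the server config.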
Technical SEO | | Paul780 -
Duplicate content on my home
Hello, I have duplication on my home page. It comes in two language versions, French and English: http://www.numeridanse.tv/fr/ http://www.numeridanse.tv/en/ You should know that the home page itself is not in a language directory: http://www.numeridanse.tv/ Google indexes all three versions: http://bit.ly/oqKT0H What is the best solution to avoid the duplication?
Have a version of the default language? Thanks a lot for your answers. Take care. A.
Technical SEO | | android_lyon0
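One option (a sketch, using the URLs from the question) is to keep all three versions but annotate them with hreflang tags, using x-default for the root, so Google treats them as language alternates rather than duplicates:

```html
<!-- In the <head> of all three versions of the home page -->
<link rel="alternate" hreflang="fr" href="http://www.numeridanse.tv/fr/" />
<link rel="alternate" hreflang="en" href="http://www.numeridanse.tv/en/" />
<link rel="alternate" hreflang="x-default" href="http://www.numeridanse.tv/" />
```

The x-default line tells Google the root URL is the fallback for users who match neither language, which fits a root page that redirects or language-selects.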