How can I stop serious traffic loss on my website?
-
I need help resolving technical SEO issues on my website, CamRojud. I have tried all the SEO tactics I know, but there has been no improvement yet. Can someone in the forum guide me through this, please?
-
Hi Dawodus,
Can you get a bit more specific as to what the issue is?
Thanks,
Zack -
Can you elaborate on which kinds of tactics you have applied? One of my clients' websites went down in the past few months; I took some simple steps and the site was back up within a few weeks. It takes time to restore your rankings.
-
Related Questions
-
Linking Websites/ Plagiarized Content Ranking Above Original Content
Hey friends! Sooo this article was originally published in December 2016: https://www.realwealthnetwork.com/learn/best-places-to-buy-rental-property-2017/ It has been consistently ranking in positions 2-3 for the long-tail keyword "best places to buy rental property 2017" (and related keywords) since January-ish. It had been getting about 2,000-2,500 unique views per week, until last week when it completely dropped off the internet (it's now ranking 51+).

We just did a site redesign and changed some URL structures, but I created a redirect, so I don't understand why that would affect our ranking so much. Plus, all of our other top pages have held their rankings -- in fact, our top organic article actually moved up from position 3 to 2 for much more competitive keywords (1031 exchange).

What's even weirder is that when I copy sections of my article and paste them into Google with quotes, our website doesn't show up anywhere. Other websites that have plagiarized my article (some have included links back to the article, and some haven't) are ranking, but mine is nowhere to be found. Here are some examples: https://www.dawgsinc.com/rental-property-the-best-places-to-buy-in-the-year-2017/ http://b2blabs.com/2017/08/rental-property-the-best-places-to-buy-in-the-year-2017/ https://www.linkedin.com/pulse/best-places-buy-rental-property-year-2017-missy-lawwill/?trk=mp-reader-card http://news.sys-con.com/node/4136506

Is it possible that Google thinks my article is newer than the copycat articles, because of the new URL, and that I'm now being flagged as spam? Does it think these are spam websites we've created to link back to our own content? Also, my article is clearly higher quality than the ranking articles. Why are they showing up? I double checked the redirect. It's good. The page is indexed... Ahhh what is going on?! Thanks for your help in advance!
White Hat / Black Hat SEO | | Jessica7110 -
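One quick, low-effort check for the redirect mentioned in the question above, assuming command-line access and that curl is installed (the old URL below is a placeholder, since the pre-redesign path isn't given in the post):

```
# Request only the response headers for the old (pre-redesign) URL.
curl -I "https://www.realwealthnetwork.com/old-pre-redesign-path/"

# A healthy result is a single hop:
#   HTTP/1.1 301 Moved Permanently
#   Location: https://www.realwealthnetwork.com/learn/best-places-to-buy-rental-property-2017/
# A 302, or a chain of several redirects, would be worth cleaning up.
```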
Googlebot crawling an AJAX website does not always use _escaped_fragment_
Hi, I started to investigate the Googlebot crawl log of our website, and it appears that there is no 1:1 correlation between a URL being crawled with _escaped_fragment_ and without it. My expectation is that each time Google crawls a URL, it is supposed to crawl the same URL using _escaped_fragment_ a minute or so later. For example, for https://my_web_site/some_slug, the crawl log shows Googlebot crawled this URL 17 times in July (http://i.imgur.com/sA141O0.jpg) but only 3 additional times using _escaped_fragment_ (http://i.imgur.com/sOQjyPU.jpg). Do you have any idea if this behavior is normal? Thanks, Yohay
White Hat / Black Hat SEO | | yohayg -
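For context on the question above: under the old AJAX crawling scheme (since deprecated by Google), a page that opts in with <meta name="fragment" content="!"> is re-requested by Googlebot with ?_escaped_fragment_= appended, and the server is expected to return a pre-rendered HTML snapshot for that variant. Below is a minimal, hypothetical Apache sketch of that server-side mapping; the snapshots.php handler name is an assumption for illustration, not something from the original question.

```
# Hypothetical .htaccess sketch: route _escaped_fragment_ requests to a
# server-side snapshot renderer instead of the JavaScript shell page.
RewriteEngine On

# Googlebot requests /some_slug?_escaped_fragment_= for pages that opt in
# via <meta name="fragment" content="!">.
RewriteCond %{QUERY_STRING} (^|&)_escaped_fragment_=
RewriteRule ^(.*)$ /snapshots.php?path=$1 [L]
```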
How Can I Safely Establish Homepage Relevancy With Internal Keyword Links?
My website has roughly 1,000-2,000 pages. However, our homepage lacks relevancy signals for what it is about. One way I'd like to tackle this problem is by updating many of our pages with internal linking. I often hear "use exact-keyword links with caution," but I have assumed this mainly referred to external backlinks. Would it be a disaster to use our single most relevant keyword as the anchor text on about 300 pages and point those links to our homepage? There are breadcrumbs on our site, but the home link uses an image (it's a picture of a house, if you're curious). Am I better off just changing that to our most relevant keyword? I could use any advice on internal links for establishing better homepage relevancy. Thank you!
White Hat / Black Hat SEO | | osaka730 -
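On the breadcrumb point in the question above, one conservative option is to keep the house icon but give it descriptive alt text, rather than pointing an exact-match keyword link at the homepage from hundreds of pages. A rough HTML sketch; the class names, paths, and the "Example Widgets" phrase are placeholders, not details from the original question:

```
<!-- Hypothetical breadcrumb sketch: the home link uses descriptive
     alt text on the icon instead of a bare, unlabeled image. -->
<nav class="breadcrumbs">
  <a href="/"><img src="/images/home-icon.png" alt="Example Widgets Home"></a> &rsaquo;
  <a href="/widgets/">Widgets</a> &rsaquo;
  <span>Blue Widgets</span>
</nav>
```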
Website starts ranking on Google then always drops - Targeted for Australia but most traffic from U.S - Bounce Rate at 94.49% - HELP!
Hi everyone, thank you for your time. During the past 8 months I have been working on this website, which is a .com.au. I have fully optimised the website, which targets Brisbane in Australia, and I have set up everything (sitemaps, geo-location targeting in Webmaster Tools, Fetch as Google, etc.). However, the website just does not want to rank at all. I know that the previous SEO company was not too good, but since then I have disavowed all unnatural links, we have moved the hosting to a new company, and the website content has been updated. Only recently has the website started ranking for its brand name (and not even at the top of Google), and whenever a keyword starts ranking above the top 50 of Google it suddenly drops again.

The other issue is that even though I have set up the website to target Australia, the majority of traffic comes from the U.S. Last month, out of 127 sessions: 85 came from the United States, 29 from Australia, 3 from Brazil, 2 from India, 2 from Italy, 1 from Canada, etc. Because of this the website has a bounce rate of 95%.

If you have any advice, tips, or recommendations that I could use to try and fix this, it would be much appreciated. I suppose we can consider this some kind of penalisation, potentially due to the past work and issues that occurred before the business became our client, but I am not sure what more I can do to stop the wrong traffic and improve the rankings. Thanks for your help. Lyam
White Hat / Black Hat SEO | | AlphaDigital20 -
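One thing worth verifying for the geo-targeting issue above, alongside the Search Console country setting, is whether the pages declare their intended language and region with hreflang. A minimal sketch, assuming a single English-for-Australia version of each page; the URL is a placeholder:

```
<!-- Hypothetical <head> snippet declaring the page as English for Australia. -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/brisbane-service/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com.au/brisbane-service/" />
```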
Website malware attacks
I keep getting attacks on my website that are being blocked by the OSE firewall. Is there any way to stop this? I am afraid because they actually managed to get into my website in the past, and I don't know whether they can get in again in the future, or whether having all the plugins and WordPress updated keeps me safe enough. I am also not sure if there is any type of virus on my MacBook, as the attacked pages were recently updated from my computer. Is there any malware scan for Mac? Thank you.

```
== Attack Details ==
TYPE: Found Basic DoS Attacks
DETECTED ATTACK VALUE: dDos Attack
ACTION: Blocked
LOGTIME: 2013-02-25 11:48:18
FROM IP: http://whois.domaintools.com/75.126.24.81
URI: http://www.propdental.es/
METHOD: HEAD
USERAGENT: N/A
REFERRER: N/A

== Attack Details ==
TYPE: Found Basic DoS Attacks
DETECTED ATTACK VALUE: dDos Attack
ACTION: Blocked
LOGTIME: 2013-02-25 10:13:17
FROM IP: http://whois.domaintools.com/107.21.150.82
URI: http://www.propdental.es/blanqueamiento-dental/
METHOD: HEAD
USERAGENT: N/A
REFERRER: N/A

== Attack Details ==
TYPE: Found Malicious User Agent
DETECTED ATTACK VALUE: curl/7.15.5 (x86_64-redhat-linux-gnu) libcurl/7.15.5 OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
ACTION: Blocked
LOGTIME: 2013-02-25 03:13:52
FROM IP: http://whois.domaintools.com/119.245.226.74
URI: http://www.propdental.es/sonrisas/los-martinez/
METHOD: HEAD
USERAGENT: curl/7.15.5 (x86_64-redhat-linux-gnu) libcurl/7.15.5 OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
REFERRER: N/A
```
White Hat / Black Hat SEO | | maestrosonrisas0 -
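Beyond the plugin firewall, one common server-level measure, offered here only as a hedged sketch rather than a complete security setup, is to refuse requests that identify themselves with an automation User-Agent such as the curl/libcurl signature in the log above. Apache .htaccess example:

```
# Hypothetical .htaccess sketch: reject requests whose User-Agent matches
# curl/libcurl, like the blocked entries shown in the log above.
# Note: this does nothing against attackers who spoof a browser User-Agent.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (curl|libcurl) [NC]
RewriteRule .* - [F,L]
```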
Domain Structure For A Network of Websites
To achieve this, we need to set up a new architecture of domains and sub-websites to effectively build this network. We want to make sure we follow the right protocols for setting up the domain structures to achieve good SEO for the primary domain and the local websites. Today our core website is at www.doctorsvisioncenter.com, which will ultimately become dvceyecarenetwork.com. That website will serve as the core web presence that can be custom branded for hundreds of practices. For example, today you can go to www.doctorsvisioncenter.com/pinehurst; note that when you start there, you can click around and it is still branded for Pinehurst, or Spectrum Eye Care.

So, the burning question(s): if I am an independent doc at www.newyorkeye.com, I could do domain forwarding, but Google does not index forwarded domains, so that is out. I could do a 301 permanent redirect to my page, www.doctorsvisioncenter.com/newyorkeye. I could then put a rule in the .htaccess file that says: if the request is for newyorkeye.com, redirect to www.doctorsvisioncenter.com/newyorkeye, but have the domain still show up as www.newyorkeye.com. Another way to do that is to point the newyorkeye.com DNS to doctorsvisioncenter.com rather than using a 301 redirect, with the same basic rule in the .htaccess file. That means that, theoretically, every sub-page would show up as, for example, www.newyorkeye.com/contact-lens-center, which is actually www.doctorsvisioncenter.com/contact-lens-center. It also means, theoretically, that it will be seen as an individual domain, but pointing to all the same content under that individual domain, just like potentially hundreds of others. The goal is: build once, manage once, benefit many.

If we do something like the above, each domain will essentially be a separate domain, but will Google see it that way, or as duplicative content? While it is easy to answer "yes, it would be duplicative," that is not necessarily the case if the content is on separate domains. Is this a good way to proceed, or does anyone have another recommendation for us?
White Hat / Black Hat SEO | | JessTopps0 -
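To make the .htaccess idea in the question above concrete, here is a hedged sketch of the host-based rule being described, assuming both domains already resolve to the same server and document root; the rule and flags are illustrative, not a tested configuration:

```
# Hypothetical .htaccess sketch: when a request arrives on the partner
# domain, internally serve the matching branded section of the core site
# while the visitor continues to see www.newyorkeye.com in the address bar.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?newyorkeye\.com$ [NC]
RewriteRule ^$ /newyorkeye/ [L]
```

Deeper paths (e.g. /contact-lens-center) would fall through this rule and be served from the shared content, which is exactly where the duplicate-content concern raised in the post comes in.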
Can I just delete pages to get rid of bad back-links to those pages?
I just picked up a client who had built a large set of landing pages (1,000+) and built a huge number of spammy links to them (too many to even consider manually requesting removal from the respective webmasters). We now think that Google may also be seeing the 'landing pages' as 'doorway pages', as there are so many of them (1,000+) and they are all optimized for specific keywords and generally pretty low quality. Also, the client received an "unnatural links found" email from Google. I'm going to download the links discovered by Google around the date of that email and check whether any look specifically bad, but I'm sure it will just be one of the several thousand bad links they built.

Anyway, they now want to clean up their act and are considering deleting the landing/doorway pages in the hope of (a) ranking better for the other, non-landing/doorway pages (i.e. category and sub-category pages), but, more to the crux of my question, (b) essentially getting rid of all the thousands of bad links that were built to those landing/doorway pages. Will this work? If we just remove those pages and serve 404 or 410 codes, will Google treat any inbound (external) links to those pages as basically no longer being links to the site? Or is the root domain still likely to be penalized for all the bad links pointing to the no-longer-existing URLs on it? Also, any thoughts on whether a 404 or 410 would be better are appreciated. Some info on that here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=64033

I guess another option is the disavow feature with Google, but the Matt Cutts video here, http://www.youtube.com/watch?v=393nmCYFRtA&feature=em-, kind of makes it sound like this should only be used for a few links, not thousands... Thanks so much!
White Hat / Black Hat SEO | | zingseo0 -
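If the disavow route mentioned above is used, the file Google Search Console accepts is plain UTF-8 text with one entry per line; the domain: prefix disavows a whole referring domain, which scales much better than listing thousands of individual URLs. A minimal sketch with placeholder domains:

```
# Hypothetical disavow.txt sketch -- lines starting with # are comments.
# Disavow entire referring domains:
domain:spammy-directory-example.com
domain:lowquality-linkfarm-example.net
# Or disavow individual URLs:
http://another-example.org/spam-page.html
```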
Can a "Trusted Retailer" badge scheme affect us in the SERPs?
Hi guys, in the last week our website saw a drop on some of our biggest and best-converting keywords, and we think it might be down to us rolling out a "Trusted Retailer" badge scheme. We sell our products directly to consumers via our website, but we also sell our products to other online resellers. We think badges are a good way to show the consumer that we trust a site.

On the 17th of September we sent out badges to about 39 of our best retailers, two of whom have already put them on their sites. Instead of sending them a flat JPEG, we sent them HTML files containing code that pulled in the image from our servers. We wanted to host the image to make sure that we always had some leverage, so if a company stopped selling our products, or the quality of their site went down, we could just remove the badge. Whilst at it, we stuck a link in there pointing to an FAQ on our website all about trusted retailers and what people need to look out for. We chose the anchor text "(brand name) Trusted Retailer", because that seemed to be the most relevant. The code looks like this: (our brand) Trusted Retailer. You might notice that there is a div just before the link. This is there to stop the user from clicking on the top 65% of the badge (because this contains the shop name and ID number), and we also used a negative text-indent to move the anchor text out of the way. But right underneath this is our logo, so it's almost a hidden link, but you can still click it.

So far the badge has been put on two sites, one of which isn't so great and maybe looks a tiny bit spammy (they sell mostly through eBay as opposed to on their main site). Also, these sites seem to have put it on most of their pages!

So my questions are: Is this seen as black or grey hat? Is it the fact we put in anchor text with our brand? Or is it the fact that the URL is transparent in the coding? Or is it the fact that the sites are using sitewide links? In any case, would Google react so quickly as to penalise us in two days? If this is the issue, do you think there's anything we can do to stop getting penalised (other than having to e-mail 39 retailers back and getting them to take the badges down)? Thoughts much appreciated – we do our SEO in-house and are still learning every day… Thank you, James
White Hat / Black Hat SEO | | OptiBacUK0
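The badge markup itself did not survive in the question above, so the following is purely a hypothetical reconstruction of the pattern being described (an overlay div blocking clicks on the top ~65% of the badge, plus an anchor whose text is pushed off-screen with a negative text-indent); none of the class names, URLs, or the retailer ID are from the original post:

```
<!-- Hypothetical reconstruction of the badge embed described in the question. -->
<div class="trusted-retailer-badge" style="position: relative; width: 150px;">
  <img src="https://www.example-brand.com/badges/retailer-1234.png"
       alt="Example Brand Trusted Retailer badge">
  <!-- Overlay that blocks clicks on the top ~65% (shop name and ID number). -->
  <div style="position: absolute; top: 0; left: 0; width: 100%; height: 65%;"></div>
  <!-- Link over the lower part (the logo), with its anchor text
       moved out of view via a negative text-indent. -->
  <a href="https://www.example-brand.com/trusted-retailer-faq"
     style="position: absolute; bottom: 0; left: 0; width: 100%; height: 35%;
            display: block; text-indent: -9999px; overflow: hidden;">
    Example Brand Trusted Retailer
  </a>
</div>
```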