Site getting hacked
-
Hi there,
One of my websites gets hacked again and again. I have reset it many times, but it keeps happening, and the hack is also generating unnecessary URLs for my website in Webmaster Tools. Can anyone help me solve this problem?
Please help. Thanks in advance.
-
Hey,
When you are fixing the website, do you roll back to a saved version of the site or just change the password?
Most likely there is malware hidden away which, if you do not remove it, will allow the attackers to keep doing the same thing. Backups generally go back about two weeks, but if your server provider has backed up a hacked version of the site, you will need to remove the malware manually, which is not fun.
I would also Google the plugins you are using and see whether anyone else has been hacked through one of them.
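"Removing the malware manually" usually starts with hunting for injected code. As a rough sketch (the patterns and directory name are illustrative, not a complete malware signature list), a script can flag PHP files containing constructs commonly used by injected backdoors:

```python
import re
from pathlib import Path

# Patterns often found in injected PHP malware. These are heuristics,
# not proof of infection -- some legitimate code uses eval/base64 too,
# so every hit still needs a manual look.
SUSPICIOUS = re.compile(rb"eval\s*\(\s*(base64_decode|gzinflate|str_rot13)")

def scan(root):
    """Return sorted paths of PHP files under root matching a suspicious pattern."""
    hits = []
    for path in Path(root).rglob("*.php"):
        try:
            if SUSPICIOUS.search(path.read_bytes()):
                hits.append(str(path))
        except OSError:
            pass  # unreadable file; skip it
    return sorted(hits)
```

Running something like `scan("wp-content")` narrows thousands of files down to a short list worth inspecting by hand.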
-
Hi donford,
Thanks for your valuable reply.
But I have not given access to anyone other than me.
-
You should also evaluate who at your company has access to passwords. Did you hire out any part of this site?
People who have access to the site could knowingly or unknowingly be the problem.
-
Hello,
The one thing popular CMS sites have in common is that they are primary targets. This is especially true of open-source or free systems like WordPress, Joomla, Drupal, and osCommerce, because hackers can download and review the source code. In addition, many people contribute to these projects, which creates additional opportunities for security holes.
There are whole websites out there that share techniques for hacking these high-profile CMSs, complete with scripts and how-to guides. It's unfortunate, but some people "enjoy" other people's pain.
The one thing you must do when you choose a CMS like this is keep upgrading versions and stay on top of the latest security vulnerabilities and fixes. If you hired out your build, you should factor in an ongoing support package, because you will always need to upgrade and patch.
My suggestion is to visit your CMS's help forums; chances are there are answers there for you. There are many precautionary steps I could suggest, but the primary things you need to do are take your site offline, identify the security hole, restore from a clean backup, and patch it.
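Restoring from a backup only helps if you can tell which files the attacker touched. As a rough illustration (in Python; the directory layout and function names are hypothetical, not a WordPress tool), comparing file hashes between the live site and a known-good backup will surface files that were added or modified:

```python
import hashlib
from pathlib import Path

def file_hashes(root):
    """Map each relative file path under root to its SHA-256 digest."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

def diff_against_backup(live_dir, backup_dir):
    """Return (added, changed): files new on the live site, and files
    whose contents differ from the backup copy."""
    live, backup = file_hashes(live_dir), file_hashes(backup_dir)
    added = sorted(set(live) - set(backup))
    changed = sorted(p for p in live if p in backup and live[p] != backup[p])
    return added, changed
```

Anything in `added` or `changed` that you didn't change yourself is a good candidate for where the hole (or the backdoor) lives.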
Good luck,
Don
-
Hi Chris,
Thanks for the reply.
We always keep everything updated, and the theme is self-designed, not boilerplate code; the plugins and WordPress are all up to date.
Yet it still gets hacked again and again.
-
This type of thing is generally better addressed by someone with more dev knowledge, but one thing you may want to check is your versions of WordPress and its plugins.
Try to keep everything up to date, since updates are often released to patch security flaws. If you're running any plugins that aren't really necessary, consider removing them as well; doing so can often improve your site speed, and it also removes a potential vulnerability.
The only time we've seen the same site hacked several times in a row, it was through vulnerabilities in a particular form plugin.
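A quick way to audit this is to compare each plugin's installed version against the latest release listed on wordpress.org. The fiddly part is comparing version strings correctly ("4.9" vs "4.9.1"); a minimal sketch of that comparison (the function names are illustrative) looks like:

```python
def parse_version(v):
    """Turn a dotted version string like '4.9.1' into a comparable
    tuple of ints; chunks with no digits count as 0."""
    parts = []
    for chunk in v.split("."):
        digits = "".join(c for c in chunk if c.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def outdated(installed, latest):
    """True when the installed version is behind the latest release."""
    return parse_version(installed) < parse_version(latest)
```

For example, `outdated("4.9", "4.9.1")` is true, so that plugin would be flagged for an update.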