Search Console Incorrectly Identifies WordPress Version and Recommends Update
-
Howdy, Moz fans,
Today I received four emails from Google Search Console recommending I update WordPress. The message reads, "Google has detected that your site is currently running WordPress 3.3.1, an older version of WordPress. Outdated or unpatched software can be vulnerable to hacking and malware exploits that harm potential visitors to your site. Therefore, we suggest you update the software on your site as soon as possible."
This is incorrect, however, since I've been on 4.3.1 for a while. 3.3.1 was never even installed, since this site was created in September 2015, so the initial WP Engine install was likely 4.3.
What's interesting is that it doesn't list the root URL as the problem source. The email states that it found that issue on a URL that is set up via WP Engine to 301 to a different site, which doesn't use WordPress. I also have other redirects set up to different pages on the second site that aren't listed in the Search Console email.
Anyone have any ideas as to what's causing this misidentification of WP versions? I am afraid that Google sees this as a vulnerability and is penalizing my site accordingly.
Thanks in advance!
-
I saw this for a client as well, who I know for sure isn't running WordPress at all. Personally, I think it's a Google mistake.
-
Thanks for that info, but I actually don't see a trace of 3.3.1 anywhere in my source code, so I'm still confused as to how Google came up with that version number. I do have a meta generator tag, but it just contains a credit to Visual Composer.
The site is http://foam-roller.com.
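For what it's worth, the generator tag is the most common place automated scanners look for a CMS version. Here's a minimal sketch in Python of how such a check could read that tag; `detect_generator` is a hypothetical helper, not anything Google has documented:

```python
import re

def detect_generator(html: str):
    """Return the content of a <meta name="generator"> tag, if present."""
    match = re.search(
        r'<meta\s+name=["\']generator["\']\s+content=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None

# A page crediting Visual Composer reports that, not a WordPress version:
print(detect_generator('<meta name="generator" content="Visual Composer" />'))
# But a stray third-party file on the same host could still carry an old tag:
print(detect_generator('<meta name="generator" content="WordPress 3.3.1" />'))
```

If a scanner picked up a file like the second example anywhere under your domain, it could attribute that version to the whole site.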
-
Thanks for the response. It's interesting to me that Google doesn't penalize for vulnerabilities - you'd think it'd have some effect since it'd be in Google's best interest not to serve potentially insecure/malicious websites, just as SSL has a positive effect on rankings.
-
Peter is right, and I also wouldn't worry about getting a penalty because of this. Google is very concerned about the security issues that websites might have, and that's why they alert webmasters through Search Console when this is the case.
-
I also get these notifications.
On the first site, there was an HTML file in wp-content/uploads with this in its header:
so the check works almost perfectly. The flagged file had simply been downloaded from somewhere else, from other authors.
On the second site, Joomla was identified as version 1.5 or lower:
and this is correct. But it hasn't been hacked yet, despite being created some 5-6 years ago.
I think this is part of their notification effort around updates, pushing CMSes across the internet toward the latest versions. This isn't their first email, nor will it be their last. Do you remember the wp-timthumb notification? The Fancybox notification? The Revolution Slider notification? What do all these cases have in common? I know - one vulnerability and over 100k sites are at risk. And the bad guys know this and use such vulnerabilities for black-hat SEO.
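To picture what such a sweep might look like, here's a rough sketch of the kind of version check that could sit behind these notifications. The parsing, the CMS list, and the "outdated" cutoffs are my own assumptions for illustration, not anything Google has published:

```python
import re

# Hypothetical cutoffs: versions below these would trigger an "update" notice.
MINIMUM_VERSIONS = {"WordPress": (4, 0), "Joomla": (2, 5)}

def looks_outdated(generator: str) -> bool:
    """Parse a generator string like 'WordPress 3.3.1' and compare it
    against the assumed minimum version for that CMS."""
    match = re.match(r"(\w+)\s+([\d.]+)", generator)
    if not match:
        return False  # unknown format: nothing to flag
    cms, version = match.group(1), match.group(2)
    if cms not in MINIMUM_VERSIONS:
        return False
    parsed = tuple(int(part) for part in version.split(".") if part)
    return parsed < MINIMUM_VERSIONS[cms]

print(looks_outdated("WordPress 3.3.1"))  # True: below the assumed 4.0 cutoff
print(looks_outdated("WordPress 4.3.1"))  # False
print(looks_outdated("Joomla 1.5"))       # True, matching the 1.5-or-less case above
```

A check this crude would flag any old version string it found anywhere on a host, which is consistent with a stray third-party file triggering a notice for the whole site.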