Search Console Incorrectly Identifies WordPress Version and Recommends Update
-
Howdy, Moz fans,
Today I received four emails from Google Search Console recommending I update WordPress. The message reads, "Google has detected that your site is currently running WordPress 3.3.1, an older version of WordPress. Outdated or unpatched software can be vulnerable to hacking and malware exploits that harm potential visitors to your site. Therefore, we suggest you update the software on your site as soon as possible."
This is incorrect, however, since I've been on 4.3.1 for a while. Version 3.3.1 was never even installed; this site was created in September 2015, so the initial WP Engine install was likely 4.3.
What's interesting is that it doesn't list the root URL as the problem source. The email states that it found that issue on a URL that is set up via WP Engine to 301 to a different site, which doesn't use WordPress. I also have other redirects set up to different pages on the second site that aren't listed in the Search Console email.
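One way to reason about what might be happening is a toy simulation (my own guess at the mechanism, not Google's actual crawler logic): a scanner that follows the 301 and reads the destination page's generator tag could end up attributing that version to the original URL. The domains and markup below are hypothetical.

```python
import re

# Toy stand-ins for real HTTP responses: a 301 map and a page store.
REDIRECTS = {
    "http://site-a.example/old-page": "http://site-b.example/landing",
}
PAGES = {
    "http://site-b.example/landing":
        '<html><head><meta name="generator" content="WordPress 3.3.1">'
        "</head><body>Landing page</body></html>",
}

def resolve(url, max_hops=5):
    """Follow the toy redirect map until we reach a non-redirecting URL."""
    for _ in range(max_hops):
        if url not in REDIRECTS:
            break
        url = REDIRECTS[url]
    return url

def scan(start_url):
    """Report (final_url, generator) the way a naive scanner might,
    attributing the destination's markup to the URL it started from."""
    final_url = resolve(start_url)
    html = PAGES.get(final_url, "")
    match = re.search(r'<meta name="generator" content="([^"]+)"', html)
    return final_url, (match.group(1) if match else None)

print(scan("http://site-a.example/old-page"))
```

In this sketch the scanner reports "WordPress 3.3.1" for site-a even though that domain never ran WordPress, which matches the symptom described above.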
Anyone have any ideas as to what's causing this misidentification of WP versions? I am afraid that Google sees this as a vulnerability and is penalizing my site accordingly.
Thanks in advance!
-
I saw this for a client as well, whose site I know for sure isn't running WordPress at all. Personally, I think it's a Google mistake.
-
Thanks for that info, but I don't see a trace of 3.3.1 anywhere in my source code, so I'm still confused as to how Google came up with that version. I do have a meta generator tag, but it just contains a credit to Visual Composer.
The site is http://foam-roller.com.
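For anyone who wants to audit their own markup for version leaks, here is a minimal sketch that scans saved page source for the two most common WordPress leaks: the generator meta tag and the `?ver=` query strings WordPress appends to core CSS/JS assets. The sample markup is hypothetical.

```python
import re

def wp_version_hints(html):
    """Collect strings that commonly leak a WordPress version number."""
    hints = []
    # 1. The generator meta tag WordPress emits by default.
    meta = re.search(r'<meta name="generator" content="(WordPress [\d.]+)"', html)
    if meta:
        hints.append(meta.group(1))
    # 2. ?ver= query strings appended to core CSS/JS assets.
    hints.extend(re.findall(r'wp-(?:content|includes)/[^"\']*\?ver=([\d.]+)', html))
    return hints

# Hypothetical saved page source: no generator tag, but a versioned asset.
sample = ('<link rel="stylesheet" href="https://example.com/'
          'wp-includes/css/style.min.css?ver=4.3.1">')
print(wp_version_hints(sample))  # ['4.3.1']
```

If this returns nothing for your saved pages, the version number most likely isn't coming from your own markup.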
-
Thanks for the response. It's interesting to me that Google doesn't penalize for vulnerabilities - you'd think it'd have some effect since it'd be in Google's best interest not to serve potentially insecure/malicious websites, just as SSL has a positive effect on rankings.
-
Peter is right; I also wouldn't worry about getting a penalty because of this. Google is very concerned about the security issues that websites might have, and that's why they alert webmasters through Search Console.
-
I also get these notifications.
On the first site, there was an HTML file in wp-content/uploads with this in its header:
so the detection works almost perfectly; the flagged file had simply been downloaded from another author somewhere.
On the second site, Joomla was identified as version 1.5 or lower:
and this is correct. But the site still hasn't been hacked, despite being created some 5-6 years ago.
I think this is part of their broader push, via update notifications, to get CMSes across the internet onto the latest versions. This isn't their first such email, and it won't be the last. Do you remember the wp-timthumb notification? The Fancybox notification? The Revolution Slider notification? What do all these cases have in common? One vulnerability puts over 100k sites at risk, and the bad guys know this and use such vulnerabilities for black-hat SEO.
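To illustrate the fingerprint-and-notify pattern described above, here is a toy classifier over saved page source. The generator strings are the well-known defaults that WordPress and Joomla emit; the matching logic itself is my own guess at the general approach, not Google's documented heuristic.

```python
import re

# Default generator strings emitted by each CMS; the matching logic
# is an illustrative guess, not Google's actual detection code.
FINGERPRINTS = [
    ("WordPress", re.compile(r'content="WordPress ([\d.]+)"')),
    ("Joomla", re.compile(r'content="Joomla! ([\d.]+)')),
]

def classify(html):
    """Return (cms_name, version) for the first fingerprint that matches."""
    for cms, pattern in FINGERPRINTS:
        match = pattern.search(html)
        if match:
            return cms, match.group(1)
    return None, None

page = '<meta name="generator" content="Joomla! 1.5 - Open Source Content Management">'
print(classify(page))  # ('Joomla', '1.5')
```

A single pass like this over a crawl is enough to flag "1.5 or lower" sites in bulk, which would explain why these notification waves arrive for so many sites at once.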