Search Console Incorrectly Identifies WordPress Version and Recommends Update
-
Howdy, Moz fans,
Today I received four emails from Google Search Console recommending I update WordPress. The message reads, "Google has detected that your site is currently running WordPress 3.3.1, an older version of WordPress. Outdated or unpatched software can be vulnerable to hacking and malware exploits that harm potential visitors to your site. Therefore, we suggest you update the software on your site as soon as possible."
This is incorrect, however, since I've been on 4.3.1 for a while. 3.3.1 was never even installed: this site was created in September 2015, so the initial WP Engine install was likely 4.3.
What's interesting is that it doesn't list the root URL as the problem source. The email states that it found that issue on a URL that is set up via WP Engine to 301 to a different site, which doesn't use WordPress. I also have other redirects set up to different pages on the second site that aren't listed in the Search Console email.
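One way to see what a crawler might be associating with the flagged URL is to capture its redirect chain (for example with `curl -sIL http://your-flagged-url`) and inspect each hop. Below is a minimal sketch that parses such a captured transcript into (status, location) pairs; the sample transcript and the `second-site.example` domain are hypothetical, not the actual URLs from this thread:

```python
# Parse a saved `curl -sIL` transcript into (status, location) hops so you can
# see exactly where a flagged URL redirects and what finally answers.
def redirect_chain(transcript):
    hops = []
    status, location = None, None
    for line in transcript.splitlines():
        line = line.strip()
        if line.startswith("HTTP/"):
            # A new response in the chain; store the previous hop first.
            if status is not None:
                hops.append((status, location))
            status, location = int(line.split()[1]), None
        elif line.lower().startswith("location:"):
            location = line.split(":", 1)[1].strip()
    if status is not None:
        hops.append((status, location))
    return hops

# Hypothetical transcript of a 301 to a second site that then serves the page.
sample = """\
HTTP/1.1 301 Moved Permanently
Location: http://second-site.example/landing/

HTTP/1.1 200 OK
Content-Type: text/html
"""
print(redirect_chain(sample))
# [(301, 'http://second-site.example/landing/'), (200, None)]
```

If the final hop serves content that carries an old version fingerprint, that would explain Search Console attributing it to the redirecting property.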
Anyone have any ideas as to what's causing this misidentification of WP versions? I am afraid that Google sees this as a vulnerability and is penalizing my site accordingly.
Thanks in advance!
-
I saw this for a client as well, who I know for sure isn't running WordPress at all. Personally, I think it's a Google mistake.
-
Thanks for that info, but I actually don't see a trace of 3.3.1 anywhere in my source code, so I'm still confused as to how it came up with that info. I do have a meta generator tag but it just contains a credit to Visual Composer.
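For anyone wanting to double-check their own source, Google is widely assumed to rely on fingerprints like the `<meta name="generator">` tag that WordPress emits by default (this is an assumption about their detection, not documented behavior). A quick sketch for listing any generator tags in a page's HTML; the sample HTML and its version strings are hypothetical, and the regex assumes the common `name`-before-`content` attribute order:

```python
import re

def find_generator_tags(html):
    """Return the content of any <meta name="generator" ...> tags found."""
    pattern = re.compile(
        r'<meta[^>]*name=["\']generator["\'][^>]*content=["\']([^"\']+)["\']',
        re.IGNORECASE,
    )
    return pattern.findall(html)

# Hypothetical page head with two generator tags, one from WordPress core
# and one from a page-builder plugin.
sample = """
<head>
<meta name="generator" content="WordPress 4.3.1" />
<meta name="generator" content="Visual Composer" />
</head>
"""
print(find_generator_tags(sample))
# ['WordPress 4.3.1', 'Visual Composer']
```

Running something like this against every template on the site (and against any stray HTML files in uploads) would rule out a leftover version string you can't see by eyeballing one page.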
The site is http://foam-roller.com.
-
Thanks for the response. It's interesting to me that Google doesn't penalize for vulnerabilities - you'd think it'd have some effect since it'd be in Google's best interest not to serve potentially insecure/malicious websites, just as SSL has a positive effect on rankings.
-
Peter is right; I also wouldn't worry about getting a penalty because of this. Google is very concerned about the security issues that websites might have, and that's why they're alerting webmasters through Search Console.
-
I also get notifications.
On the first site, there was an HTML file in wp-content/uploads with an old version string in its header, so the check works almost perfectly. The file had just been downloaded from another author.
On the second site, Joomla was identified as 1.5 or lower, and that is correct. But the site hasn't been hacked since it was created some 5-6 years ago.
I think this is part of their notifications about updates, pushing CMSes across the internet to the latest versions. This isn't their first such email, nor will it be the last. Do you remember the wp-timthumb notification? The Fancybox notification? The Revolution Slider notification? What do all these cases have in common? One vulnerability, and over 100k sites are at risk. The bad guys know this and use such vulnerabilities for black hat SEO.