GWT Message - CMS Update Available
-
Howdy Moz,
Just received a message in Google Webmaster Tools about a CMS update:
"Joomla Update Available
As of the last crawl of your website, you appear to be running Joomla 1.5. One or more of the URLs found were:
http://www.website/custom-url/article5034
Google recommends that you update to the latest release. Older or unpatched software may be vulnerable to hacking or malware that can hurt your users. To download the latest release, visit the Joomla download page. If you have already updated to the latest version of Joomla, please disregard this message.
If you have any additional questions about why you are receiving this message, Google has provided more background information in a blog post about this subject."
I read through the associated blog post. According to the post, Joomla creates a generator meta tag that notes the CMS version. Here's the oddity:
The site was on Joomla 1.5 over two years ago. A year ago it was updated to Joomla 2.5, and about a week ago it was converted completely to WordPress. According to GWT, the last date Googlebot accessed the site was the day before the email (5/1/14).
I went through the code, CSS/HTML, and the database and found no reference to Joomla 1.5.
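One more check worth running is against the live HTML that Googlebot actually receives, not just the source files and database. Here's a minimal sketch (assuming Python with the requests and beautifulsoup4 packages installed; the URL is a placeholder, not the real one from the message) that reports any generator meta tag on a page:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL; substitute the page Google flagged.
url = "http://www.example.com/custom-url/article5034"

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Joomla (and WordPress) emit a <meta name="generator"> tag by default.
tag = soup.find("meta", attrs={"name": "generator"})
if tag:
    print("Generator tag found:", tag.get("content"))
else:
    print("No generator meta tag in the rendered HTML.")
```

If this prints a WordPress generator string, Google's warning is almost certainly based on a stale crawl rather than anything still on the server.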
Has anyone seen this message? If so, how did you rectify it? Were there any adverse effects on rankings?
-
Just wanted to add: no, I don't think there would be any adverse effects on rankings unless the site was compromised somehow. Since you are on a completely different system, you should be fine with a resubmission.
On a side note, since you are on WordPress now, make sure you have the right file permissions and shell access turned off. Hackers love an unprepared WordPress site.
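On the permissions point, the widely recommended WordPress baseline is 755 for directories, 644 for files, and something stricter for wp-config.php. A rough one-off fixer sketch, assuming Python on the server and a hypothetical install path:

```python
import os

WP_ROOT = "/var/www/html"  # hypothetical WordPress install path

for dirpath, dirnames, filenames in os.walk(WP_ROOT):
    os.chmod(dirpath, 0o755)  # directories: rwxr-xr-x
    for name in filenames:
        path = os.path.join(dirpath, name)
        # wp-config.php holds database credentials; keep it owner-only.
        mode = 0o600 if name == "wp-config.php" else 0o644
        os.chmod(path, mode)

print("Permissions normalized under", WP_ROOT)
```

Run it as the user that owns the files, and defer to anything your host's documentation says differently.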
Best of luck with the new site!
-
Sounds like they have a super old page cached in their system. Do you have caching turned off in the new WordPress site? Most people turn it on, but I have found that if you have gzip compression and CSS compiling turned on, it's not needed. To me, caching can create more headaches than solutions (personal opinion).
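If you do lean on gzip instead of caching, it's easy to verify compression is actually being served. A tiny sketch (assuming Python with requests; the URL is a placeholder):

```python
import requests

url = "http://www.example.com/"  # placeholder for your homepage

resp = requests.get(url, headers={"Accept-Encoding": "gzip"}, timeout=10)

# requests decompresses the body automatically, but the response
# header still shows whether the server compressed it.
print("Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
```

"gzip" in the output means compression is on; "none" means it isn't.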
You could also go into your webmaster account and do a fresh "Fetch as Google" for the root domain and all linked pages. That way Google will have the latest version of your site in its database. Google downloads your site so it can reference the content quickly for the search queries entered. It could be that they are still working from an old copy. Have you resubmitted the site since you rebuilt it?
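Alongside Fetch as Google, you can nudge a recrawl by pinging your sitemap. A minimal sketch (assuming Python with requests; the sitemap URL is a placeholder, and note the ping endpoint Google documented back then has since been retired, so treat this as illustrative):

```python
import requests
from urllib.parse import quote

sitemap = "http://www.example.com/sitemap.xml"  # placeholder

# Sitemap ping endpoint as Google documented it at the time.
ping_url = "http://www.google.com/ping?sitemap=" + quote(sitemap, safe="")
resp = requests.get(ping_url, timeout=10)
print(resp.status_code)  # 200 meant the ping was accepted
```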
-
You can check the cached version of that page to see what Google has in its cache: is it a recent version of the page, or something pretty old that hasn't been reindexed in a while?
Either way, I think these notices are more recommendations and helpful tidbits that Google thinks will assist webmasters than crucial information that will influence rankings. So if the message isn't relevant anymore, I would ignore it and move on to building a great website for your visitors!
Related Questions
-
What's up with the last Google update?
I have numerous clients who were at the top of the page, in the top 3 spots. They all dropped to page 2, 3, or 4, and now they are number 1 in maps or in the top 3. Content is great on all these sites. Backlinks are high quality; we never build for quantity, we always focus on quality. The sites have authorship information and trust. We have excellent content written by professionals in the industry for each of the websites. The sites load super fast and are very mobile friendly. We have a CDN installed. Content is organized per topic. All of our citations are set up properly, with no duplicates or missing citations. The code on the websites is good. We do not have anchor-text links pointing to the sites from guest posts or anything like that. We have plenty of content. Our DA/PA is great. Audits of the websites are great. I've been doing this a long time and I've never been so dumbfounded by what Google has done this time. Or better yet, what exactly is wrong with our clients' websites today that was working perfectly for the last 5 years? I really am getting frustrated. I'm comparing my sites to competitors and everything is better. Please, someone guide me here and tell me what I'm missing, or tell me what you have done to recover from this nonsense.
Intermediate & Advanced SEO | | waqid0 -
URL change - Sitemap update / redirect
Hi everyone. Recently we performed a massive, hybrid site migration (CMS, URL, and site-structure change) without losing any traffic (yay!). Today I found out that our developers and copywriters decided to change some URLs (the pages are the same) without notifying anyone (I'm not going into details why). Anyhow, some URLs in the sitemap changed, so the old URLs don't exist anymore. Here is an example:
OLD (in sitemap, indexed): https://www.domain.com/destinations/massachusetts/dennis-port
NEW: https://www.domain.com/destinations/massachusetts/cape-cod
Also, you should know that a number of redirects have happened in the past (whole site). Over the last couple of years: HTTP to HTTPS, non-www to www, and trailing slash to no trailing slash. Most recently (a month ago): the site-migration redirects (URL / site-structure change).
So I could add the new URLs to the sitemap and resubmit in GSC. My dilemma is what to do with the old URLs. We already have a ton of redirects, and adding another hop is not something I'm in favor of, because of redirect loops and chains that can affect our SEO efforts (a quick chain-checker sketch follows below). I would suggest changing the original, most recent 301 redirects to point to the new URLs (pre-migration 301 redirects to the newly created URLs). The goal is not to send mixed signals to search engines and not to lose visibility. Any advice? Please let me know if you need more clarification. Thank you
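To see exactly how many hops each old URL takes right now, a quick sketch like this (assuming Python with the requests library; the URL is the placeholder from the example above) prints the full redirect chain:

```python
import requests

# Placeholder URL from the example; add any others you want to audit.
urls = [
    "https://www.domain.com/destinations/massachusetts/dennis-port",
]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds each intermediate redirect response, in order.
    print(f"{url} -> {len(resp.history)} redirect hop(s)")
    for hop in resp.history:
        print(f"  {hop.status_code}  {hop.url}")
    print(f"  {resp.status_code}  {resp.url}  (final)")
```

Chains longer than one or two hops are the ones worth collapsing by editing the earlier 301s, as suggested.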
Intermediate & Advanced SEO | | bgvsiteadmin0 -
I'm noticing that URLs that were once indexed by Google are suddenly getting dropped without any error messages in Webmaster Tools. Has anyone seen issues like this before?
Here's an example:
Intermediate & Advanced SEO | | nystromandy
http://www.thefader.com/2017/01/11/the-carter-documentary-lil-wayne-black-lives-matter0 -
Site recovery after manual penalty, disavow, SSL, Mobile update - but dropped again in May
I have a site that has had a few problems over the last year. We had a manual penalty in late 2013 for bad links, some from guest blogs and some from spammy sites. Reconsideration requests had me disavow almost all of the incoming links. Later in 2014, the site was hit with link-injection malware and received another manual penalty. That was cleared up and the manual penalty removed in Jan 2015. During this time the site was moved to SSL, but there were some redirect problems. By Feb 2015 everything was cleared up and an updated disavow list was added. The site recovered in March and did great. A mobile version was added in April. Around May 1st, rankings dropped again; traffic is about 40% off its March levels.
Recently I read that a new disavow file will supersede an old one, and that if all of the original domains and URLs aren't included in the new disavow file, they will no longer be disavowed. Is this true? If so, is it possible that a smaller disavow file uploaded in Feb caused rankings to drop after the May 3 Quality update? Can I correct this by disavowing all the previously disavowed domains and URLs? Any advice for determining why the site is performing poorly again? We have well-written content and regular blogs, and nothing that seems like it should violate Google's guidelines.
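On the superseding question: Google's documentation does say an uploaded disavow file replaces the previous one entirely, so anything you still want disavowed must appear in the newest file. A minimal merge sketch, assuming Python and that both files use the standard disavow format (one URL or domain:example.com entry per line, with # comments; the filenames are hypothetical):

```python
def read_entries(path):
    """Read a disavow file, skipping blank lines and # comments."""
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f
                if line.strip() and not line.lstrip().startswith("#")}

old = read_entries("disavow_2013.txt")  # hypothetical filenames
new = read_entries("disavow_2015.txt")

merged = sorted(old | new)  # set union keeps every entry from both files
with open("disavow_merged.txt", "w", encoding="utf-8") as f:
    f.write("# Combined disavow list\n" + "\n".join(merged) + "\n")

print(f"{len(old)} old + {len(new)} new -> {len(merged)} merged entries")
```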
Intermediate & Advanced SEO | | Robertjw0 -
Short-term product pages and fast indexing - should XML sitemaps be updated daily, weekly, etc.?
Hi everyone, I am currently working on a website whose XML sitemap is set to update weekly. Our client has requested that this be changed to daily. The real issue is that the website creates short-term product pages (live for 10-20 days), after which the product-page URLs go 404. So the real problem is quick indexing, not daily vs. weekly sitemaps. I suspect that a daily sitemap may help indexing time but does not completely solve the problem. So my question for you is: how can I improve indexing time on this project? The real problem is how to get the product pages indexed and ranking before the 404 page shows up.
Here are some of my initial thoughts and background on the project. Product pages are only available for 10 to 20 days (it's an auction site). Once the auction on a product ends, the URL goes 404. If the pages only exist for 10 to 20 days (a 404 shows up when the auction is over), this is bad for SEO for several reasons (BTW, I was called onto the project as the SEO specialist after the project and site were completed).
Reason 1: it is highly unlikely that the product pages will rank (positions 1-5), since the site has a very low Domain Authority, and by the time Google indexes the link the auction is over, so the user sees a 404.
Possible solution 1: all products have authorship from a "trustworthy" author, so indexing time improves.
Possible solution 2: incorporate G+ posts for each product to improve indexing time. There is still a ranking issue here since the site has a low DA. The product might appear, but at the bottom of page 1 or 2, etc. Any other ideas?
From what I understand, even though sitemaps are fed to Google on a weekly or daily basis, this does not mean that Google indexes them right away (please confirm). Best-case scenario: Google indexes the links every day (totally unrealistic in my opinion), the URL shows up on page 1 or 2 of Google and slowly starts to move up; by the time the product ranks in the first 5 positions, the auction is over and the user sees a 404. I do think that a sitemap updated daily is better for this project than weekly, but I would like to hear the community's opinion. Thanks
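On the automation side, the sitemap piece is straightforward to regenerate daily with only the currently live auctions and accurate lastmod dates. A rough sketch, assuming Python; get_live_auctions() is a hypothetical stand-in for a real database query:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def get_live_auctions():
    # Hypothetical data source; replace with a query that returns
    # only auctions that have not yet ended.
    return [
        ("https://www.example.com/auction/12345", datetime.now(timezone.utc)),
    ]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url, last_mod in get_live_auctions():
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "lastmod").text = last_mod.strftime("%Y-%m-%d")
    # Hint that these pages change (and expire) quickly.
    ET.SubElement(entry, "changefreq").text = "daily"

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Regenerating and resubmitting this daily at least removes expired URLs promptly; it won't by itself force Google to crawl faster.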
Intermediate & Advanced SEO | | Carla_Dawson0 -
Better to publish regular new pricelist articles or update the existing ones?
Hello Moooooooooooooz! I could not sleep yesterday because of an SEO nightmare! So I came up with the following question: is it better to release regular new articles or to update the existing ones? Let me explain. Our company releases regular pricelists (every month, new pricelists are available for a month, for the same brands; e.g., a January pricelist for brand A). Right now those pricelists are ranking well on Google. So I wondered, would it be better to:
1. Make the pricelist articles stronger: "Our company - Brand A pricelist" (title), blog/offer/brand-A-pricelist.html (URL). Every month I update the text, so I have just one article/link to work on.
2. Make more content on the pricelist: "Our company - Brand A pricelist - January 2014" (title), blog/offer/brand-A-pricelist-january.html (URL). Google keeps indexing fresh new content.
3. Work on an extra category: "Our company - Brand A pricelist - January 2014" (title), blog/offer/brand-A/pricelist-january.html (URL). I work on one link, blog/offer/brand-A, where Google finds lots of new relevant content.
I know Matt Cutts said it's good to update an old article, but this case is a bit different. Has anyone experimented with the same? Thanks a lot!
Intermediate & Advanced SEO | | AymanH0 -
Was anyone hit by BOTH the 'Phantom' update as well as Penguin 2.0?
I'm interested to know if Phantom was just a "pre-Penguin" 2.0 or if it was a completely different update. Thoughts?
Intermediate & Advanced SEO | | nicole.healthline0 -
Penguin Update Issues. What would you recommend?
Hi, we've been pretty badly hit by this Penguin update. Site traffic is down 40-50%. We suspect it's for a couple of reasons:
1) Google is saying we have duplicate content. For a given category we will have 4-5 pages of content (products), so it's saying pagenum=2, pagenum=3, etc. are duplicate pages. We've implemented rel=canonical so that pagenum=2 points to the original category, e.g. http://mydomain/widgets.aspx. We've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes. How long does it take? It's been about a week.
2) They're saying we have soft 404 errors. When we remove a category or product, we point users to a category- or page-not-found page. Is it best to block Googlebot from crawling these pages by specifying them in robots.txt, since we really don't care about these categories or product pages? How best to handle this? (A quick soft-404 checker sketch follows below.)
3) There are some bad directories and crawlers that have crawled our website but have published incorrect links, so we've got about 1,700 product-not-found errors. I'm sure that's taking up a lot of crawl time. How do we tell Google not to bother with these links coming from specific sources, e.g. ignore all links coming from xxx.com?
Any help will be much appreciated, as this is killing our business. Jay
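On point 2, it helps to confirm which removed-product URLs return a real 404 status versus a 200 with "not found" content (the classic soft 404). A minimal checker sketch, assuming Python with requests; the URL list and the phrases matched are hypothetical:

```python
import requests

# Hypothetical sample of URLs for removed products or categories.
removed_urls = [
    "http://mydomain/widgets.aspx?pagenum=99",
]

for url in removed_urls:
    resp = requests.get(url, timeout=10)
    body = resp.text.lower()
    # A 200 response whose body says "not found" is a soft 404:
    # the crawler sees a "successful" page carrying error content.
    if resp.status_code == 200 and ("not found" in body or "no longer available" in body):
        print(f"SOFT 404: {url}")
    else:
        print(f"{resp.status_code}: {url}")
```

Returning a genuine 404 (or 410) status for gone pages is generally cleaner than blocking them in robots.txt, since a blocked URL's status can never be seen at all.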
Intermediate & Advanced SEO | | ConservationM0