What problems could arise from updating the PHP version?
-
I haven't really gotten a straight answer to this question yet. My client says:
"The developers are skeptical about the possibility to update PHP on our server as this could seriously damage the entire RV site functionality."
Since I know nothing about PHP and its potential hazards, I have to ask the community whether there is any validity to these concerns.
We can't update our version of WordPress unless PHP is first upgraded from 5.1.6 to 5.2.4.
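(For reference, a server's current PHP version can be checked with `php -v` on the command line. The "requires 5.2.4" part is just a numeric comparison of dotted version strings; here's a minimal sketch of that comparison in Python. This helper is purely illustrative and not part of WordPress or PHP itself:)

```python
# Minimal sketch: compare dotted version strings numerically,
# the way a "requires PHP >= 5.2.4" check effectively works.
# Illustrative helper only, not anything from WordPress.

def version_tuple(v):
    """Turn '5.1.6' into (5, 1, 6) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def meets_requirement(current, required):
    """True if the current version is at least the required one."""
    return version_tuple(current) >= version_tuple(required)

print(meets_requirement("5.1.6", "5.2.4"))  # False: must upgrade first
print(meets_requirement("5.2.4", "5.2.4"))  # True
```

Note the tuple comparison: a plain string comparison would get "5.10.0" vs "5.2.4" wrong, since "1" sorts before "2" character by character.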
The client won't do this because the developers say it's a potential nightmare.
I, as the SEO, want a current, up-to-date version of WordPress for many obvious reasons.
Can anyone please tell me what problems, if any, could arise from upgrading the site's PHP? Or is it just a lot of work, and the developers are making excuses because they don't want to do it?
Thanks very much to whoever answers.
-
Alan,
I can't think of an answer I've seen recently with this much clarity of thought, or one that refutes a very bad practice so well (especially on the security level). One thing we do with clients and upgrades (we don't handle clients on other people's platforms) is take a new update, give it a few weeks for bugs to be discovered, and then do the upgrade on our end. We have clients sign off on us handling the upgrades, etc., from the beginning of the relationship.
For Erik, I would suggest showing the client what has been said here by someone with a lot of savvy experience. If the devs are worth their salt, they will change.
This was a good question and Alan delivered a great answer.
Robert
-
I've got very limited bandwidth for training (90% of my work comes from audits), and it's typically limited to in-person, on-site sessions for clients in the LA area, because I find the in-person experience to be much more effective. Pricing depends on level and extent, and starts at $250 an hour, so it's ideal for groups (one fee regardless of participant count). Audits range from $3,500 upward of $7,500 or more, depending on scale.
-
What do you charge for individual personalized training? And for site audits?
-
You can thank my combined 11 years of SEO after 7 years of web dev project management, with a background in information security and business ownership.
-
I want to have your baby. Brilliant answer!
I just copy/pasted the entire thing to my client; I even got your pic and bio in there for added cred.
I have been asking this question in one way or another since early February, and you just nailed it.
Thank you very much.
-
Any time you upgrade a server solution, the potential exists for things that are currently working to suddenly break. That is just the nature of technology. In an ideal world this wouldn't happen, but unfortunately it's quite possible, for many reasons.
Just one reason: developers cannot possibly test every single unique server configuration on earth when working on an upgrade. They have time, resource, and fiscal constraints.
In one example of how an upgrade from PHP 5.1.6 to 5.2.4 caused a WP site to collapse, the problem was with neither PHP nor WP. It was with a separate firewall-related server solution that then had to be dealt with.
That example validates the concern expressed by the developers you're dealing with.
HOWEVER
Regardless of potential problems of this nature, it is irresponsible and deplorable for developers to refuse to upgrade servers out of fear that something might break. Could you imagine 90% of the world still operating on IBM mainframe computers out of a fear of upgrading? The security implications alone are appalling, let alone the business-case reasons.
Developers and systems administrators "should" be required to implement upgrades on a regular, consistent basis, with the understanding that it is their responsibility to deal with problems that may arise, and during upgrades they "should" also use intelligent best-practice precautions and methods to minimize the chance of a critical failure. THAT is the only proper path for a business to remain successful long-term.
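One of those precautions can be as simple as a scripted smoke test run against a staging copy of the site after the upgrade, before production is touched. Here's a rough sketch in Python; the staging host, page list, and error markers are all hypothetical, and this only catches the most obvious PHP breakage (error status codes, fatal-error text leaking into the HTML):

```python
# Minimal post-upgrade smoke test sketch. Assumption: a staging copy
# of the site exists at STAGING; the host and page list are hypothetical.
from urllib.request import urlopen

STAGING = "http://staging.example.com"  # hypothetical staging host
PAGES = ["/", "/about/", "/blog/"]      # key pages worth spot-checking

# Strings that commonly leak into output when PHP breaks after an upgrade.
ERROR_MARKERS = ("Fatal error", "Parse error", "Call to undefined function")

def looks_healthy(status, body):
    """A page passes if it returned 200 and no PHP error text leaked in."""
    if status != 200:
        return False
    return not any(marker in body for marker in ERROR_MARKERS)

def run_smoke_test():
    """Fetch each key page on staging; return the list of broken ones."""
    failures = []
    for page in PAGES:
        resp = urlopen(STAGING + page)
        body = resp.read().decode("utf-8", "replace")
        if not looks_healthy(resp.status, body):
            failures.append(page)
    return failures

# Calling run_smoke_test() after the PHP upgrade on staging gives a quick
# pass/fail signal before anyone touches the production server.
```

An empty result from `run_smoke_test()` isn't proof that nothing broke, but a non-empty one is an immediate, cheap red flag, which is exactly what you want before rolling the same upgrade out to production.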
Hiding under the guise of "it's too dangerous" is a terrible pitiful excuse for laziness.