SEO implications of changing Date/Time format on website
-
Looking for some advice on an area that I can't seem to find much research about online.
Since launching our website, it has always been hosted in the UK and targeted UK visitors, so we've always used the UK date/time format (DD.MM.YY, for example).
We've now changed business focus and are targeting US visitors. We recently moved the site to US hosting, and our web developers have advised us to switch to the US date/time format (MM.DD.YY).
My question is: are there any SEO implications of doing this? Obviously, all our historic blog posts will need their dates updated from, for example, 9 July to July 9. Does this make any difference at all?
Anyone got any insights as to what best practice with this is?
Cheers.
-
Changing your date/time format will not affect your SEO in any way.
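To expand on why the display format doesn't matter: search engines read the machine-readable date, not the human-readable one. A minimal sketch (in Python, purely as an illustration; the post date and formats are hypothetical) of storing one canonical ISO 8601 date and rendering it for either audience, with a `<time>` element that keeps both forms:

```python
from datetime import date

post_date = date(2019, 7, 9)  # hypothetical publish date

# Machine-readable ISO 8601 form; this is what crawlers parse,
# regardless of how the date is displayed to visitors.
iso = post_date.isoformat()  # "2019-07-09"

# Human-readable forms for each audience (default C locale assumed).
uk_display = post_date.strftime("%d %B %Y")   # "09 July 2019"
us_display = post_date.strftime("%B %d, %Y")  # "July 09, 2019"

# A <time> element carries both: display text for users,
# an unambiguous datetime attribute for search engines.
html = f'<time datetime="{iso}">{us_display}</time>'
print(html)
```

With this approach, switching the visible format from UK to US style is a template change only; the underlying date the crawler sees never changes.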
-
Best practice is to do what is best for your consumers, so if you're targeting US consumers, I'd recommend adopting the language and UX that best suits their needs. From an SEO perspective, "July 5, 2019" is the US way. For clarity, I always head to the New York Times or Washington Post and look at how they structure dates for a US audience.
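The same logic applies if the blog posts carry structured data. A hedged sketch (hypothetical headline; Article markup assumed) showing that `datePublished` in schema.org JSON-LD is always ISO 8601, so the on-page display format has no effect on it:

```python
import json
from datetime import date

post_date = date(2019, 7, 5)  # hypothetical publish date

# Article structured data: datePublished uses ISO 8601,
# so it is identical whether the page shows "5 July 2019"
# or "July 5, 2019" to visitors.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example post",  # hypothetical title
    "datePublished": post_date.isoformat(),
}

print(json.dumps(article, indent=2))
```

In other words, the date format question is a UX decision, not a markup one: the structured data stays constant while the visible text changes.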