Should We Wait To Launch a Redesigned Site Until After Google's Core Web Vitals & Page Experience Algo Update?
-
We are redesigning our WordPress site (over 1,300 posts and pages) and are on schedule to launch in May of 2021. Should we wait until after Google's Core Web Vitals & Page Experience algorithm update?
-
I echo what miguelsantico said. Google's changes are constant, and waiting on them will probably only frustrate you. (Not every update Google implements works out for them, either.) Here is another blog post with insights:
http://www.canirank.com/blog/on-page-optimization-strategies-that-still-work-in-2021/
-
I don't believe there is any reason to wait for an algo update, especially if your new site has improvements which could help your CWV scores. Google has stated that it will use "field data" (gathered from real users, not bots) over a trailing 28-day period to assess CWV. So if your new site is going to score better, you would want to start building up those scores now. Conversely, if your new site is going to score worse than your current one, you would do well to fix it prior to launch. There are plenty of tools (both lab-data-based and field-data-based) to assess your old and new pages. PageSpeed Insights is helpful for public-facing pages, whereas for not-yet-public pages you might need to resort to the Audits (now Lighthouse) panel in Chrome DevTools, or other tools which allow for authentication.
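To make the "plenty of tools" point concrete, here is a minimal sketch against the PageSpeed Insights v5 API, which returns both lab data and the CrUX field data mentioned above. The endpoint and response keys follow Google's published v5 reference; the sample values below are made up for illustration:

```python
import urllib.parse

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights v5 API request URL for a public page."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urllib.parse.urlencode(params)

def field_cwv(psi_response):
    """Pull the field (CrUX) Core Web Vitals out of a PSI v5 response dict."""
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    out = {}
    for key in ("LARGEST_CONTENTFUL_PAINT_MS",
                "FIRST_INPUT_DELAY_MS",
                "CUMULATIVE_LAYOUT_SHIFT_SCORE"):
        m = metrics.get(key)
        if m:
            out[key] = (m["percentile"], m["category"])
    return out

# Stubbed response for illustration; a live call would fetch
# psi_request_url("https://example.com") over HTTPS instead.
sample = {"loadingExperience": {"metrics": {
    "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2100, "category": "FAST"},
    "CUMULATIVE_LAYOUT_SHIFT_SCORE": {"percentile": 12, "category": "FAST"},
}}}
print(field_cwv(sample))
```

Note that pages with too little real-user traffic return no `loadingExperience` data at all, which is when the lab-based tools become your fallback.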
Related Questions
-
Unsolved: Have we been penalised?
Hey Community, we need help! Have we been penalised, or is there some technical SEO issue that is stopping our service pages from being properly read? Website: www.digitalnext.com.au

In July 2021, we suffered a huge drop in coverage for both short- and long-tail keywords. We thought this could have been because of the link spam update, Core Web Vitals, or the core update around that time period. SEMrush: https://gyazo.com/d85bd2541abd7c5ed2e33edecc62854c GSC: https://gyazo.com/c1d689aff3506d5d4194848e625af6ec

There is no manual action within GSC, and we have historically ranked page 1 for super-competitive keywords. After waiting some time thinking it was an error, we have since taken the following actions:

1. Launched a new website.
2. Rewrote all page content (except blog posts).
3. Ensured each page passes Core Web Vitals.
4. Submitted a backlink detox.
5. Removed a website that was spoofing our old one.
6. Introduced a strong pillar-and-cluster internal link structure.

Three months after launching the new website, none of our core terms has come back and we are struggling for visibility. We still rank for some super-long-tail keywords, but this is the lowest visibility we have had in over 5 years. Every time we launch a blog post it ranks for competitive keywords, yet the old keywords are still completely missing. It almost feels like any URLs that used to rank for core terms are being penalised. So, I am wondering whether this is a penalisation (and by what algorithm), or whether there is something wrong with the structure of our service pages that keeps them from ranking. Look forward to hearing from you.

Technical SEO | | StevenLord0 -
Reducing cumulative layout shift for responsive images - core web vitals
In preparation for Core Web Vitals becoming a ranking factor in May 2021, we are making efforts to reduce our Cumulative Layout Shift (CLS) on pages where the shift is caused by images loading. The general recommendation is to specify both height and width attributes in the HTML, in addition to the CSS formatting applied when the images load. However, this is problematic where responsive images are used with different aspect ratios for mobile vs. desktop, and where a CMS is used to manage the pages, so that width and height may change each time new images are used, along with the aspect ratios of their mobile and desktop versions. So, I'm posting this inquiry here to see what approaches others are taking to reduce CLS in these situations (responsive images, differing aspect ratios for desktop and mobile, and a CMS that allows business users to use images of any dimensions they desire).
Web Design | | seoelevated3 -
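One pattern that comes up for the scenario above (different crops per breakpoint, CMS-driven dimensions) is to emit the stored width/height attributes per image variant so the browser can reserve space, and let CSS override the reserved box per breakpoint. A minimal sketch only; the class name, breakpoint, and file names are invented for illustration, and `aspect-ratio` browser support was still rolling out around 2021, so a `padding-top` fallback may be needed:

```html
<!-- Hypothetical example: 16:9 crop on desktop, 1:1 crop on mobile. -->
<style>
  .hero-img { width: 100%; height: auto; aspect-ratio: 16 / 9; object-fit: cover; }
  @media (max-width: 600px) {
    .hero-img { aspect-ratio: 1 / 1; }
  }
</style>
<picture>
  <!-- A CMS template can emit width/height from stored image metadata;
       newer browsers also honor these attributes on <source> elements. -->
  <source media="(max-width: 600px)" srcset="hero-square.jpg" width="600" height="600">
  <img class="hero-img" src="hero-wide.jpg" width="1200" height="675" alt="Hero image">
</picture>
```

The idea is that the width/height attributes give the browser an intrinsic aspect ratio before the bytes arrive, and the per-breakpoint CSS keeps the reserved box matching whichever crop will actually render, so nothing shifts on load.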
Google's Search Algorithm update to 'Local Snack Pack'
Hi there - I was wondering if anyone else has noticed a big shift in Google's Local 'snack pack' in the past 48 hours? We have noticed a big change in clients' results - specifically today. Has anyone else noticed any changes, or perhaps seen data on possible changes? I am aware of this update: https://www.seroundtable.com/big-google-search-algorithm-ranking-update-29953.html but perhaps there may have been another update since. Any input would be much appreciated! Phil.
Algorithm Updates | | Globalgraphics0 -
Why are some of my website's pages not indexed in Google?
Hi, I have submitted my pages in Google's fetch tool for indexing consideration, but they are not yet indexed in Google search. Additionally, there are no errors shown by Google.
On-Page Optimization | | seo.kishore890 -
Google Search Console issue: "This is how Googlebot saw the page" showing part of page being covered up
Hi everyone! Kind of a weird question here, but I'll ask and see if anyone else has seen this: in Google Search Console, when I do a fetch-and-render request for a specific site, the fetch and blocked resources all look A-OK. However, in the render there's a large grey box (the background of the navigation) that covers up a significant amount of what is on the page. Attaching a screenshot. You can see the text start peeking out below (had to trim for confidentiality reasons). But behind that block of grey IS text - text that, in the fetch part, Googlebot apparently does see and can crawl. My question: is this an issue? Should I be concerned about this visual look? Or no? I've never experienced an issue like this. I will say - I'm trying to make a play at a featured snippet and can't seem to have Google display this page's information, despite it being the first result, while the query shows a featured snippet from result #4. I know a snippet isn't guaranteed for the #1 result, but I wonder if this has anything to do with why it isn't showing one. VmIqgFB.png
On-Page Optimization | | ChristianMKG0 -
How can you activate the 'Results From' internal search bar on Google SERP?
Hi There, I am hoping someone can advise me on getting the 'Results From' sitelink to display for my site on the Google SERP? I have searched far and wide for the answer with no luck. I'd really appreciate your advice. Thanks! Internal_Search_Google_SERP_zps75a5383e.jpg
On-Page Optimization | | tmg.seo0 -
Not using H1s with keywords to simulate natural, non-SEO'd content?
There has been a lot of talk lately about making a website seem like it is not SEO'd, to avoid over-optimization penalties from the recent Google algorithm updates. Has anyone come across the practice of deliberately not using headings (H1s, H2s, etc.) properly, to signal that the current webpage isn't over-optimized? I've come across a site that used to use multiple keywords within its headings and now uses none. In fact, they are marking their company name and logo as an H1, and using non-keyworded H2s such as "Our Work" or "Contact". Is anyone holding back on their old SEO tactics so as not to seem over-optimized to Google? Thanks!
On-Page Optimization | | DCochrane0 -
Remove internal site SERPs from the Google index?
1. Internal SERP pages did not have a robots meta tag.
2. As a result, the client site has thousands (~4,400) of internal site SERP pages in the Google index.
3. We added the noindex, follow attribute to all internal SERPs.
4. We added Disallow: domain.com/internal-search-operator in robots.txt.
5. No new SERP pages are being indexed, but the 4,000-something that were already there are still in the index weeks later.
6. The pages are dynamically created and still work, so I can't use Google's Remove Content tool, because the pages don't 404.

Is there any way to get these pages out of the index besides just waiting and hoping Google eventually drops them? Thanks
On-Page Optimization | | delegator.com0
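For reference, a sketch of the two directives described in steps 3 and 4 above (the robots.txt lines are shown inside a comment for context; the path is taken from the question):

```html
<!-- Step 3: emitted in the <head> of every internal-search results page -->
<meta name="robots" content="noindex, follow">

<!-- Step 4 (robots.txt), as described in the question:
       User-agent: *
       Disallow: /internal-search-operator
     One caveat documented by Google: while the Disallow is in place,
     Googlebot cannot re-crawl these URLs, so it may never see the
     noindex tag above, which can leave already-indexed pages lingering
     in the index. -->
```

This interaction between robots.txt and the robots meta tag is one commonly cited reason already-indexed pages take a long time to drop out after a noindex is added.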