Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
NO Meta description pulling through in SERP with react website - Requesting Indexing & Submitting to Google with no luck
-
Hi there,
A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render, and index for "this URL and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description.
I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are:
-
If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap?
-
Is submitting each URL manually bad, and if so, why?
-
Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs?
-
Any other suggestions?
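(For reference on the sitemap question: a minimal valid sitemap, per the sitemaps.org protocol, looks like the fragment below. The URL and date are placeholders, not my actual pages.)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-1</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>
```

The file itself has to be reachable, return a 200, and list canonical, indexable URLs; Search Console will report parse errors when you submit it.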
-
-
Hi David,
The Fetch and Render looked blank, but I know Google can still read the code, since it picked up on the schema we added less than a week after we added it. I sent the JavaScript guides over to our developers, but I would still really appreciate you looking at the URL if possible. I can't find a way to DM you on here, so I've sent you a LinkedIn request. Feel free to ignore it if there's a better way to communicate.
- JW
-
That is an interesting question.
-
Hi,
I would mostly look into the site itself. From what you've mentioned here, I don't think the problem is in your sitemap but more on the side of React. Are you using server-side or client-side rendering for the pages in React? That can have a big impact on how Google is able to see the different pages and pick up on content (including meta tags).
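To make that difference concrete: with client-side rendering, the initial HTML response often contains no meta description at all; it only appears after JavaScript runs, which the crawler may or may not wait for. A rough sketch of that check (the HTML snippets below are invented examples, not the poster's actual pages):

```javascript
// Hypothetical initial HTML payloads: what a crawler receives *before*
// any JavaScript executes. Markup here is illustrative only.
const clientRenderedHtml = `
  <html><head><title>My Page</title></head>
  <body><div id="root"></div><script src="/bundle.js"></script></body></html>`;

const serverRenderedHtml = `
  <html><head><title>My Page</title>
  <meta name="description" content="A server-rendered page description.">
  </head><body><div id="root"><h1>My Page</h1></div></body></html>`;

// Returns the meta description present in the raw HTML, or null if absent.
function getMetaDescription(html) {
  const match = html.match(/<meta\s+name="description"\s+content="([^"]*)"/i);
  return match ? match[1] : null;
}

console.log(getMetaDescription(clientRenderedHtml));  // → null
console.log(getMetaDescription(serverRenderedHtml));  // → "A server-rendered page description."
```

You can run the same kind of check against your own site with "view source" (not the DOM inspector): if the description tag isn't in the raw response, only server-side rendering or prerendering will put it there.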
Martijn.
-
Hi DigitalMarketingSEO,
This sounds like it's Google having some issues with your React website.
There are plenty of good SEO for JavaScript guides out there that I would recommend reading through:
https://www.elephate.com/blog/ultimate-guide-javascript-seo/
https://builtvisible.com/javascript-framework-seo/
https://www.briggsby.com/dealing-with-javascript-for-seo
How did the "Fetch and Render" look? Was Googlebot able to see your page exactly as a human user would?
Can you share the URL here (or PM me)? I've done a lot of work on JS sites and I'd be happy to take a quick look to see if I can give some more specific advice.
Cheers,
David
Related Questions
-
Do things like using labels on an element that is not a form input affect how google sees us in regards to accessibility?
Do things like using labels on an element that is not a form input affect how google sees us? It's an accessibility error that our devs have made - using a label element because it looks good, not because it's an actual label on a form field. Just wondering how that affects accessibility in Google's eyes.
Web Design | | GregLB0 -
I am using <noscript> in every webpage and Google does not crawl my site automatically. Any solution?
My pages include: <noscript><meta http-equiv="refresh" content="0;url=errorPages/content-blocked.jsp?reason=js"></noscript> Please tell me whether this affects SEO or not.
Web Design | | ahtisham2018
Website Redesign and Migration to Squarespace killed my Ranking
My old website was dated, ugly, impossible to update, and a mess between hard-coded pages and WP, but we were ranking #1 in the organic searches for our keywords. I just redesigned my website using Squarespace. I kept most of the same text on the pages (for keywords) and kept the same meta tags and title tags for each page as much as possible. Once I was satisfied that I had done as much on-page optimization as I could, I changed the IP in our domain name registry so that it would point to our new website on the Squarespace host. And our new website was live! ...Then I watched in dismay as our ranking fell into oblivion. I think this might have something to do with not doing any 301 redirects from the old website and losing all of my link juice. Is this the case? And, if so, how do I fix it? Our website url is www.kanataskinclinic.ca Thanks
Web Design | | StillLearning1 -
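On the missing 301s in the question above: the fix is to map each old URL to its new equivalent and issue a 301 for each (Squarespace exposes this through its URL Mappings settings, if I recall correctly). A minimal sketch of the mapping logic, with made-up paths:

```javascript
// Illustrative only: old paths from the retired site mapped to their
// new equivalents. The real map has one entry per ranking page.
const redirectMap = {
  '/old-services.html': '/services',
  '/old-contact.html': '/contact',
};

// Given a requested path, return the 301 target, or null if no redirect applies.
function resolveRedirect(path) {
  return redirectMap[path] || null;
}

console.log(resolveRedirect('/old-services.html'));  // → "/services"
console.log(resolveRedirect('/unknown'));            // → null
```

The point of the 301 (rather than a 404 or a 302) is that it tells Google the move is permanent, so link equity pointing at the old URLs is passed to the new ones.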
Hiding content until user scrolls - Will Google penalize me?
I've used: "opacity:0;" to hide sections of my content, which are triggered to show (using Javascript) once the user scrolls over these sections. I remember reading a while back that Google essentially ignores content which is hidden from your page (it mentioned they don't index it, so it's close to impossible to rank for it). Is this still the case? Thanks, Sam
Web Design | | Sam.at.Moz0 -
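A key detail in the pattern described above: `opacity:0` keeps the content in the DOM and in the rendered layout (unlike `display:none`), so it is still present in the HTML a crawler fetches. A stripped-down sketch of the reveal logic, with the browser wiring (scroll listener, element lookup) omitted:

```javascript
// Pure sketch of the reveal rule: a section becomes visible once the
// bottom of the viewport reaches the element's top offset.
// Parameter names are illustrative, not from the poster's code.
function revealOpacity(scrollY, elementTop, viewportHeight) {
  return scrollY + viewportHeight >= elementTop ? 1 : 0;
}

console.log(revealOpacity(0, 1000, 800));    // → 0 (not yet scrolled into view)
console.log(revealOpacity(300, 1000, 800));  // → 1 (viewport now reaches the section)
```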
Website Redesign - What to do with old 301 URLs?
My current site is on WordPress. We are currently designing a new WordPress site with the same URLs. Our current approach is to go into the server, delete the current website files, and add the new website files. My current site has old URLs which are 301 redirected to current URLs. Here is my question: in the current redesign process, do I need to create pages for the old 301-redirected URLs so that we do not lose them in the launch of the new site? Or does the 301 rule currently exist outside of our server files, so this does not matter? Thank you in advance.
Web Design | | CamiloSC0 -
Do I need to 301 redirect www.domain.com/index.html to www.domain.com/ ?
So, interestingly enough, the Moz crawler picked up my index.html file (homepage) and reported duplicate content, of course. But, Google hasn't seemed to index the www.domain.com/index.html version of my homepage, just the www.domain.com version. However, it looks like I do have links going specifically to www.domain.com/index.html and I want to make sure those are getting counted towards my overall domain strength. Is it necessary to 301 redirect in the scenario described above?
Web Design | | Small_Business_SEO0 -
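If a 301 is added for the index.html case above, the rewrite it needs to implement is just normalizing `/index.html` back to the directory root. A sketch of that logic (how it gets deployed depends on the server, e.g. an Apache rewrite rule; the function here is only illustrative):

```javascript
// Canonicalize any path ending in /index.html to its directory root,
// so /index.html -> / and /foo/index.html -> /foo/ .
function canonicalPath(path) {
  return path.replace(/\/index\.html$/, '/');
}

console.log(canonicalPath('/index.html'));      // → "/"
console.log(canonicalPath('/foo/index.html'));  // → "/foo/"
console.log(canonicalPath('/about'));           // → "/about"
```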
Site-wide footer links or single "website credits" page?
I see that you have already answered this question before back in 2007 (http://www.seomoz.org/qa/view/2163), but wanted to ask your current opinion on the same question: should I add a site-wide footer link to my client websites pointing to my website, or should I create a "website credits" page on my client's site, add this to the footer, and then link from within this page out to my website?
Web Design | | eseyo0 -
How do you visualize website structure
How do you visualize a website structure in terms of (categories of) pages and interlinking? I use such visuals for discussing what you are actually doing now and what can be improved. I have made visuals a few times myself (basically making boxes representing categories of pages and lines representing internal links), but I found that I soon ran into a scheme of huge proportions and needed more paper and more time. Appreciate your thoughts!
Web Design | | NewBuilder2
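One lightweight alternative to drawing boxes by hand for the question above: dump the page-to-page link structure into Graphviz DOT format and let a tool do the layout. A sketch, with an invented page list:

```javascript
// Hypothetical crawl output: each page mapped to the internal links it contains.
const pages = {
  '/': ['/blog', '/products'],
  '/blog': ['/blog/post-1'],
  '/products': ['/'],
  '/blog/post-1': ['/products'],
};

// Emit Graphviz DOT text, which `dot -Tpng site.dot -o site.png` can render.
function toDot(pages) {
  const lines = ['digraph site {'];
  for (const [from, links] of Object.entries(pages)) {
    for (const to of links) lines.push(`  "${from}" -> "${to}";`);
  }
  lines.push('}');
  return lines.join('\n');
}

console.log(toDot(pages));
```

Because the layout is computed for you, the diagram scales to hundreds of pages without needing more paper; you can also group pages into DOT subgraphs to show categories.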