No meta description pulling through in SERP with React website - requesting indexing & submitting to Google with no luck
-
Hi there,
A year ago I launched a website using React, and since then Google has not been reading my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render, and index for "this URL and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description.
I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are:
-
If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap? (A minimal sitemap sketch follows this list.)
-
Is submitting each URL manually bad, and if so, why?
-
Am I simply jumping the gun since it's only been a week since I requested indexing for the main url and all the linked urls?
-
Any other suggestions?
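For reference on the first question, here's the rough shape of what I mean by a minimal sitemap. This is only a sketch of generating one from a route list; the domain and routes below are placeholders, not my real site:

```javascript
// Minimal sketch: generate a sitemap.xml from a list of routes.
// Domain and routes are placeholders, not the actual site.
const fs = require('fs');

const routes = ['/', '/about', '/products/example-widget'];

// Each page becomes one <url><loc>...</loc></url> entry.
const urls = routes
  .map((route) => `  <url><loc>https://www.example.com${route}</loc></url>`)
  .join('\n');

fs.writeFileSync(
  'sitemap.xml',
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>\n`
);
```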
-
Hi David,
The Fetch and Render looked blank, but I know Google can still read the code, since it picked up on the schema we added less than a week after we added it. I sent the JavaScript guides over to our developers, but I would still really appreciate you looking at the URL if possible. I can't find a way to DM you on here, so I've sent you a LinkedIn request. Feel free to ignore it if there's a better way to communicate.

- JW
-
That is an interesting question.
-
Hi,
I would mostly look into the site itself. From what you've mentioned here, I don't think the problem is in your sitemap but more on the side of React. Are you using server-side or client-side rendering for the pages in React? That can have a big impact on how Google is able to see the different pages and pick up on content (including meta tags).
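To illustrate the difference, here's a minimal sketch (the component name and description text are made up) of how a client-side-only React app typically sets the meta description with react-helmet, and the server-side step that makes the same tag visible in the initial HTML:

```javascript
// Minimal sketch; component name and description text are hypothetical.
import React from 'react';
import ReactDOMServer from 'react-dom/server';
import { Helmet } from 'react-helmet';

// Client-side rendering: react-helmet injects this tag only after the
// JS bundle executes, so crawlers that don't run JS never see it.
function ProductPage() {
  return (
    <div>
      <Helmet>
        <meta name="description" content="Hypothetical description for this page." />
      </Helmet>
      <h1>Product</h1>
    </div>
  );
}

// Server-side rendering: the same tag gets baked into the HTML that is
// sent over the wire, visible before any JavaScript runs.
const appHtml = ReactDOMServer.renderToString(<ProductPage />);
const helmet = Helmet.renderStatic(); // collects tags set via <Helmet>

const html = `<!DOCTYPE html>
<html>
  <head>${helmet.meta.toString()}</head>
  <body><div id="root">${appHtml}</div></body>
</html>`;
```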
Martijn.
-
Hi DigitalMarketingSEO,
This sounds like Google having some issues rendering your React website.
There are plenty of good JavaScript SEO guides out there that I would recommend reading through:
https://www.elephate.com/blog/ultimate-guide-javascript-seo/
https://builtvisible.com/javascript-framework-seo/
https://www.briggsby.com/dealing-with-javascript-for-seo
How did the "Fetch and Render" look? Was Googlebot able to see your page exactly as a human user would?
Can you share the URL here (or PM me)? I've done a lot of work on JS sites and I'd be happy to take a quick look to see if I can give some more specific advice.
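As a quick first check, you can also fetch the raw HTML your server returns (which is what a crawler sees before any JavaScript runs) and look for the tag yourself. A rough Node sketch, with the URL as a placeholder:

```javascript
// Rough sketch: fetch the raw server HTML (no JavaScript executed)
// and check whether a meta description tag is already present.
// The URL is a placeholder, not the actual site.
const https = require('https');

https.get('https://www.example.com/', (res) => {
  let html = '';
  res.on('data', (chunk) => (html += chunk));
  res.on('end', () => {
    const match = html.match(/<meta[^>]+name=["']description["'][^>]*>/i);
    console.log(match
      ? `Found in raw HTML: ${match[0]}`
      : 'Not in raw HTML (likely injected client-side by JS)');
  });
});
```

If the tag only appears after JavaScript runs, that would explain why Google's cache updates but the SERP snippet lags behind.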
Cheers,
David
Related Questions
-
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Hi MozCommunity, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version. This new site is a fully AngularJS-driven site. There are currently over 2 million pages, and as we're developing the new site in the backend, we would like to test out the tech with Google and Bing. We're looking at a pre-render option to be able to create static HTML snapshots of the pages that we care about the most, which will be available in the sitemap.xml.gz. We set up 3 completely static HTML control pages: one with no robots metatag on the page, one with the robots NOINDEX metatag in the head section, and a third with a dynamic header (X-ROBOTS) carrying the NOINDEX directive. We expected the one without the meta tag to at least get indexed along with the homepage of the test site. In addition to those 3 control pages, we had 3 more: an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header. With Google, the correct indexation occurred, with only 3 pages being indexed: the homepage, the listing page, and the control page without the metatag. However, with Bing, there's nothing. No page indexed at all. Not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt directive open to all engines across all pages, yet nothing. I used the Fetch as Bingbot tool, the SEO Analyzer tool, and the Preview Page tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking it not to index those pages. I'm stumped. I don't know what to do next to understand if Bing can accurately process dynamic headers or AngularJS content. Upon checking BWT, there's definitely been crawl activity, since it marked the XML sitemap as successful and put a 4 next to the number of crawled pages. Still no result when running a site: command, though. Google responded perfectly and understood exactly which pages to index and crawl. Anyone else used dynamic headers or AngularJS who might be able to chime in, perhaps with running similar tests? Thanks in advance for your assistance.
Web Design | AU-SEO
-
Spanish website indexed in English: redirect to Spanish or English version if I do a new website design?
Hi Moz users, I have this problem. We have a website in Spanish, but Google crawls it in English (the reasons are not important). We remade the entire website and now we are planning the move. The new website will have different language versions: English, Spanish, and Portuguese. Somebody told me that we have to redirect the old URLs (crawled in English) to the new English versions, not to the Spanish ones (the real language of the originals). Example: URL1 Language: Spanish - Crawled in English --> redirect to the English version. The other option would be to redirect to the new Spanish version, which is what the visitor is expecting to find: URL1 Language: Spanish - Crawled in English --> redirect to the Spanish version. What do you think? Which is the better option?
Web Design | NachoRetta
-
E-Commerce Website Architecture - Cannibalization between Product Categories and Blog Categories?
Hi, I have an e-commerce site that sells laptops. My main landing pages and category pages are as follows: "Toshiba Laptops", "Samsung Laptops", etc. We also run a WP blog with industry news. The posts are divided into categories which are basically the same as our landing pages. The posts themselves usually link to the appropriate e-commerce landing page. For example: a post about a new Samsung laptop which is categorized in the blog under "Samsung Laptops" will naturally link somewhere inside to the "Samsung Laptops" e-commerce landing page. Is that good, or do the categories on the blog cannibalize my more important e-commerce section landing pages? Thanks
Web Design | BeytzNet
-
Subdomains For Real Estate Website
I am currently working on a proposal for a client's WordPress website development which includes ongoing SEO after the website is developed. I have looked into a number of options, and the one that seems the most cost effective involves using subdomains for the individual listings pages. What I want: clientsdomain.com/listings/idxnumber/ What I can get for a decent price: listings.clientsdomain.com/idxnumber/ So the majority of the website will actually exist on a subdomain, because the IDX API will automatically populate pages for all of the MLS listings in the area (hundreds or thousands). Meanwhile, the domain itself will have all the neighborhood pages and other optimized content, blogs and whatnot. My concern is that dividing the website like this will have negative effects on SEO. There won't be duplicate content across subdomain and main domain, but they will share a lot of links back and forth. I haven't found any recent sources on the topic. Almost everything I have found says that dividing a website in this manner is bad for SEO, but these articles are often many years old. Does anyone know of a WordPress plugin/IDX company that can provide a solution that doesn't use a subdomain and actually just lists each MLS page within a directory? I am open to using another platform; I am just most familiar with WordPress. Will using a subdomain in the ways mentioned above have a profound negative effect on SEO? Thank you for your time in responding; I greatly appreciate it.
Web Design | TotalMarketExposure
-
Does Google count the domain name in its 115-character "ideal" URL length?
I've been following various threads having to do with URL length and Google's happiness therewith and have yet to find an answer to the question posed in the title. Some answers and discussions have come close, but none I've found have addressed this with any specificity. Here are four hypothetical URLs of varying lengths and configurations:
EXAMPLE ONE: my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (115 characters)
EXAMPLE TWO: sample.com/my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (126 characters)
EXAMPLE THREE: www.sample.com/my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (130 characters)
EXAMPLE FOUR: http://www.sample.com/my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (137 characters)
Assuming the examples contain appropriate keywords and are linked to appropriate anchor text (etc.), how would Google look upon each? All I've been able to garner thus far is that URLs should be as short as possible while still containing and contextualizing keywords. I have 500+ URLs to review for the company I work for and could use some guidance; yes, I know I should test, but testing is problematic in the extreme; I look to the collective/accumulated wisdom of the MOZVerse for help. Thanks.
Web Design | RScime25
-
Does Google penalize duplicate website design?
Hello, We are very close to launching five new websites, all in the same business sector. Because we would like to keep our brand intact, we are looking to use the same design on all five websites. My question is: will Google penalize the sites if they have the same design? Thank you! Best regards, Tiberiu
Web Design | Tiberiu
-
How to put 'Link to this article' HTML code at bottom of article & is it helpful?
Hello, I was thinking about putting a box down at the bottom of my client's main articles that lets the reader easily copy the HTML code it takes to link to the article they're reading. Maybe I'd put it after the author bio. Do any of you do this? If so, what format do you use? It has to look nice, of course. This is a non-techie industry. Thanks.
Web Design | BobGW
-
Need to rebuild client's Flash website
I am working with their web designer and need to figure out a way to rebuild their site, which is currently all in Flash. I was wondering if there was a way to do this without spending a ton of time completely redoing the site from scratch.
Web Design | awalker84