Index Problem
-
Hi guys
I have a critical problem with the Google crawler.
This is my website: https://1stquest.com
I can't create a sitemap with online sitemap creator tools such as XML-sitemaps.org.
The Fetch as Google tool usually marks the result as "partial".
The Moz crawler test found both HTTP and HTTPS versions of the site!
And Google can't index several pages on the site.
Is the problem related to "unsafe URLs", or something else?
-
Hi Peter,
Just curious: did you have to do anything specific on the SEO side of things? I'm working with a developer who assures me that Angular will not impact SEO, but looking at their past efforts I'm not so sure.
-
Hello all,
Thought I'd drop in my $0.02 about Google and Angular JS. We switched over to it two weeks ago at FixedPriceCarService.com.au - existing pages were fine, with no transition issues, and same goes for all the new pages we added. All were indexed quite quickly on Google.
Google Search Console has no issue with finding things like Page Title and Meta Description that have their home now in the DOM.
I'm curious about when Moz might also be able to crawl the DOM, and stop reporting page titles as missing when they're not truly missing; they've just been relocated.
Scott - Fixed Price Car Service
-
Hi Hamid!
Did Peter or Martijn answer your question? If so, please mark one or both as a Good Answer.
If not, what are you still looking for?
-
Peter is indeed correct: it doesn't seem you have to worry about unsafe URLs, but rather about the technical way your site is built. AngularJS is relatively new, and as it's JavaScript-based it's rather hard for Google to retrieve all the content on these pages and 'get' the whole page. His suggestion is also spot on and will make it easier for search engines to crawl, read, and index your site.
-
If you can't create an XML sitemap, you can build a plugin for your site with your devs to generate one. Google can't render your website because of the technology it's built on: I've seen that your site uses AngularJS, so the initial HTML response is tiny.
However, there is a solution for you, described here:
https://www.distilled.net/resources/prerender-and-you-a-case-study-in-ajax-crawlability/
If you provide fully rendered HTML to search engine bots, your site can finally be indexed properly.
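The prerendering idea from that article can be sketched roughly like this. This is a minimal illustration, not a specific service's API: the bot list and the snapshot lookup are assumptions for the example.

```javascript
// Sketch: serve a prerendered HTML snapshot to search engine bots,
// and the normal JS-driven AngularJS shell to everyone else.
// The user-agent list is illustrative, not exhaustive.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /rogerbot/i, /baiduspider/i];

function isSearchBot(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// snapshots maps URL paths to fully rendered HTML strings,
// e.g. produced ahead of time by a headless browser.
function htmlFor(userAgent, snapshots, url, appShell) {
  if (isSearchBot(userAgent) && snapshots[url]) {
    return snapshots[url]; // fully rendered HTML for the bot
  }
  return appShell; // tiny HTML + JS bundle, rendered client-side
}
```

In a real setup this decision would live in server middleware or at the CDN, but the core logic is just this user-agent branch.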
Related Questions
-
Sitemap indexing
Hi everyone,
Here's a duplicate content challenge I'm facing. Let's assume that we sell brown, blue, white and black 'Nike Shoes model 2017'. For technical reasons, we really need four URLs to properly show these variations on our website. We find substantial search volume on 'Nike Shoes model 2017', but none on any of the colour variants. Would it be theoretically possible to show pages A, B, C and D on the website and:
- Give each page a canonical to page X, which is the 'default' page that we want to rank in Google (a product page that has a colour selector) but is not directly linked from the site
- Mention page X in the sitemap.xml (and not A, B, C or D)
So the 'clean' URL gets indexed and the colour variations do not? In other words: is it possible to rank a page that is only discovered via sitemap and canonicals?
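The two-part setup described in this question could be sketched like this. All URLs here are made up for illustration; the point is that the sitemap lists only the canonical "page X" while each colour variant carries a canonical tag pointing at it.

```javascript
// Hypothetical sketch: a sitemap.xml that lists only the canonical
// "page X" URLs, not the colour-variant pages A-D.
function buildSitemap(canonicalUrls) {
  const entries = canonicalUrls
    .map((u) => `  <url><loc>${u}</loc></url>`)
    .join("\n");
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries +
    "\n</urlset>"
  );
}

// Each variant page (A, B, C, D) would carry this tag in its <head>,
// pointing at page X:
function canonicalTag(canonicalUrl) {
  return `<link rel="canonical" href="${canonicalUrl}" />`;
}
```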
Intermediate & Advanced SEO | Adriaan.Multiply -
Schema.org problems (still)
Hey Mozzers,
I've been working at this for a while now, and I can't figure out why the rich snippet data is not getting pulled for our reviews and product rating. I've included a sample URL where we have reduced the schema.org markup: http://www.tripcentral.ca/vacations-packages_00_03_JN_gran-bahia-principe-coba.html
Any thoughts? I was told not to list multiple reviews, so I took them out. But it's still not being picked up in the SERPs, and we would really like the star rating data to appear. Any useful advice would be appreciated!
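For reference, the kind of markup star-rating snippets rely on can be expressed as schema.org JSON-LD. This is a generic sketch with placeholder values, not the actual markup on the page in question:

```javascript
// Sketch of schema.org Product + AggregateRating markup as JSON-LD.
// Name, rating and count are placeholders; a real page would use
// its own data and embed the output in a
// <script type="application/ld+json"> tag.
function buildProductJsonLd(name, ratingValue, reviewCount) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Product",
    name: name,
    aggregateRating: {
      "@type": "AggregateRating",
      ratingValue: ratingValue,
      reviewCount: reviewCount,
    },
  });
}
```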
Intermediate & Advanced SEO | tripcentral -
Website not being indexed after relocation
I have a scenario where a 'draft' website was built using Google Sites, and published on a Google Sites subdomain. Subsequently, the 'same' website was rebuilt and published on its own domain. So effectively there were two sites, both more or less identical, with identical content.
The first website was thoroughly indexed by Google. The second website has not been indexed at all; I am assuming for the obvious reasons, i.e. that Google is viewing it as an obvious rip-off of the first site / duplicate content, etc.
I was reluctant to take down the first website until I had found an effective way to resolve this issue long-term, ensuring that in future Google would index the second 'proper' site. A permanent 301 redirect was put forward as a solution; however, believe it or not, the Google Sites platform has no facility for implementing this. For lack of an alternative solution I have gone ahead and taken down the first site. I understand that it may take some time for it to drop out of Google's index, and I am merely hoping that eventually the second site will be picked up.
I would sincerely appreciate any advice or recommendations on the best course of action, if any, I can take from here. Many thanks! Matt.
Intermediate & Advanced SEO | collectedrunning -
How do you de-index and prevent indexation of a whole domain?
I have parts of an online portal displaying in SERPs which definitely shouldn't be. It's due to thoughtless developers, but I need to have the whole portal's domain de-indexed and prevented from future indexing. I'm not too tech-savvy, so how is this achieved? Noindex? Robots.txt? Thanks.
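The standard lever for de-indexing a whole host is a `noindex` directive on every response, typically via the `X-Robots-Tag` HTTP header (a robots.txt block alone stops crawling but won't remove already-indexed URLs). A minimal sketch, with a hypothetical portal hostname:

```javascript
// Sketch: send "X-Robots-Tag: noindex, nofollow" on every response
// from the portal host only. The header name and directive values
// are the standard robots ones; the hostnames are made up.
function noindexHeader() {
  return { "X-Robots-Tag": "noindex, nofollow" };
}

// Apply the header only on the portal, never on the main site.
function shouldNoindex(requestHost, portalHost) {
  return requestHost === portalHost;
}

function responseHeaders(requestHost, portalHost) {
  return shouldNoindex(requestHost, portalHost) ? noindexHeader() : {};
}
```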
Intermediate & Advanced SEO | Martin_S -
Freshness Index?
Hi, I've been a member for a few months but this is my first entry. I typically build small portal websites to help attract more customers for small businesses, approx. 5-7 pages and very tightly optimized around one primary keyword and two secondaries. These are typically very low competition. I do no link building to speak of. I don't keyword-stuff or use poorly written content. I know that may be subjective, but I believe the content I am using is genuinely useful to the reader.
What I have noticed recently is that the sites get ranked quite well to begin with, e.g. anywhere from the bottom half of the first page to pages 2-3, and they stick for maybe 2-3 weeks, and the client is very happy; then they just vanish. It's not just the Google dance either; these sites don't typically come back at all, or when they do they are 100+. I was advised this was due to the freshness index, but honestly these sites are hardly newsworthy... just wondering if anyone had any ideas? Many thanks in advance.
Intermediate & Advanced SEO | nichemarkettools -
Thousands of 404 Pages Indexed - Recommendations?
Background: I have a newly acquired client who has had a lot of issues over the past few months. What happened is he had a major issue with broken dynamic URLs, where they would start infinite loops due to redirects and relative links. His previous SEO didn't pay attention to the sitemaps created by a backend generator, and it caused hundreds of thousands of pages to be indexed. Useless pages. These useless pages were all bringing up a 404 page that didn't have a 404 server response (it had a 200 response), which created a ton of duplicate content and bad links (relative linking).
Now here I am, cleaning up this mess. I've fixed the 404 page so it returns a 404 server response. Google Webmaster Tools is now returning thousands of "not found" errors; a great start. I fixed all site errors that caused infinite redirects, cleaned up the sitemap, and submitted it. When I search site:www.(domainname).com I am still getting an insane number of pages that no longer exist.
My question: How does Google handle all of these 404s? My client wants all the bad pages removed now, but I don't have that much control over that. It's a slow process getting Google to remove pages that return a 404, and he is continuously dropping in rankings. Is there a way of speeding up the process? It's not reasonable to enter tens of thousands of pages into the URL Removal Tool. I want to clean house and have Google index just the pages in the sitemap.
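The soft-404 fix described here boils down to one rule: a missing page must answer with a real 404 status code, not a 200 with "not found" content. A minimal sketch, with made-up routes:

```javascript
// Sketch of the soft-404 fix. knownPages maps valid paths to their
// content; anything else gets a genuine 404 status, so crawlers drop
// the URL instead of indexing thousands of identical "not found"
// pages as duplicate content.
function respond(knownPages, path) {
  if (knownPages.has(path)) {
    return { status: 200, body: knownPages.get(path) };
  }
  return { status: 404, body: "<h1>Page not found</h1>" };
}
```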
Intermediate & Advanced SEO | BeTheBoss -
Canonical Problem
Hello all. Could someone have a look at my page at www.ashley-wedding-cars.co.uk and tell me why I have a canonical problem?
Intermediate & Advanced SEO | AshJez -
Should I Allow Blog Tag Pages to be Indexed?
I have a wordpress blog with settings currently set so that Google does not index tag pages. Is this a best practice that avoids duplicate content or am I hurting the site by taking eligible pages out of the index?
Intermediate & Advanced SEO | JSOC