Crawl Test Report only shows home page and no inner site pages?
-
Hi,
My site is [removed]
When I first tried to set up a new campaign for the site, I received the error:
Roger has detected a problem:
We have detected that the root domain [removed] does not respond to web requests. Using this domain, we will be unable to crawl your site or present accurate SERP information.
I then ran a Crawl Test per the FAQ. The SEOmoz crawl report only shows my home page URL and does not have any inner site pages.
This is a Joomla site. What is the problem?
Thanks!
Dave
-
You're welcome!
-
OK, no problem. Thanks for your time Stephanie!
-
Weird. I would contact the help desk for support. I'm sure they can help. Sorry I couldn't be of much assistance.
-
Nope, that doesn't work. I am trying to set up the campaign for the root domain level.
-
Try it with www in front of it.
-
I still can't create a new campaign. I don't understand why you can submit it, but I can't? Please see the attached image. Thanks!
-
Try again, I submitted it and it worked fine. The website may have been temporarily down when you tried the first time. Try again and see if it works.
-
Thanks for the reply.
Yes, I have submitted sitemaps to Google Webmaster Tools as well as Bing about one week ago.
Please advise, thanks!
-
Did you create a sitemap?
I would create a sitemap and submit to Google Webmaster Central.
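For reference, a minimal XML sitemap looks like this (the domain, paths, and date below are placeholders; swap in your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
  </url>
</urlset>
```

Save it as sitemap.xml in the site root, submit it in Webmaster Tools, and you can also advertise it to all crawlers by adding a line like `Sitemap: http://www.example.com/sitemap.xml` to robots.txt.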
Related Questions
-
When rogerbot tries to crawl my site it gets a 404. Why?
When rogerbot tries to crawl my site it requests http://website.com. My website then tries to redirect to http://www.website.com, but the redirect throws a 404 and the site ends up not getting crawled. It also returns a 404 when rogerbot tries to read my robots.txt file, for some reason. We allow the rogerbot user agent, so I'm unsure what's happening here. Is there something weird going on when my site is accessed without the 'www' that is causing the 404? Any insight is helpful here. Thanks,
Technical SEO | | BlakeBooth0 -
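One common fix for this class of problem, assuming an Apache server with mod_rewrite (the domain below is a placeholder), is to 301 the bare domain to www in a single hop, before any other rule can interfere:

```apache
RewriteEngine On

# Send non-www requests to www with one 301, preserving the path,
# so http://website.com/robots.txt lands on http://www.website.com/robots.txt
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
```

After that, it's worth fetching robots.txt on both hostnames yourself to confirm neither returns a 404.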
Will my site get devalued if I add the same company schema to all the pages of my website?
If I add the exact same schema markup to every page on my website - is it considered duplicate content? Our CMS is telling me that if I want schema mark-up on our site that it has to be the same on every page on the website. This limitation is frustrating but I am trying to figure out the best way to work within their boundaries. Your help is appreciated.
Technical SEO | | Annette_Wetzel0 -
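If the CMS really does force identical markup site-wide, one low-risk pattern is a single organization-level block, which is meaningful on every page. A JSON-LD sketch (all values below are placeholders, not from the question):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "http://www.example.com/",
  "logo": "http://www.example.com/logo.png"
}
</script>
```

Repeating an organization block like this is generally treated as boilerplate rather than duplicate content; the real cost of the CMS limitation is that you can't add page-specific types like Product or Article.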
Redirecting a single page on a separate domain to a new site?
My client started a subdivision of their company, along with a new website. There was already an individual page about the new product/topic on the main site, but recognizing a growth area they wanted to devote an entire site to the product/topic. Can we/should we redirect that page on the old corporate/main site to the new domain, or just place a link or two? Thoughts?
Technical SEO | | VTDesignWorks0 -
New Page Showing Up On My Reports w/o Page Title, Words, etc - However, I didn't create it
I have a WordPress site and I was doing a crawl for errors, and it is now showing as of today that this page: https://thinkbiglearnsmart.com/event-registration/?event_id=551&name_of_event=HTML5 CSS3 is new and has no page title, words, etc. I am not even sure where this page or URL came from. I was messing with the robots.txt file to allow some /category/ posts that were being hidden, but I didn't re-allow anything with the above appendages. I just want to make sure that I didn't screw something up that is now going to impact my rankings. This was just a really odd message to come up, as I didn't create this page recently, and that shouldn't even be a page accessible to the public. When I edit the page, it is using an Event Espresso (WordPress plugin) shortcode, and I don't want to noindex this page as it is all of my events. Sorry this post is confusing; any help or insight would be appreciated! I am also interested in hiring someone for some hourly consulting work on SEO-type issues if anyone has any references. Thank you!
Technical SEO | | webbmason0 -
Why are pages still showing in SERPs, despite being NOINDEXed for months?
We have thousands of pages we're trying to have de-indexed in Google for months now. They've all got the noindex directive in place. But they simply will not go away in the SERPs. Here is just one example: http://bitly.com/VutCFi If you search this URL in Google, you will see that it is indexed, yet it has carried the directive for many months. This is just one example of thousands of pages that will not get de-indexed. Am I missing something here? Does it have to do with using content="none" instead of content="noindex, follow"? Any help is very much appreciated.
Technical SEO | | MadeLoud0 -
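For comparison, the two directives the question mentions would sit in each page's head like this:

```html
<!-- Blocks indexing but lets crawlers follow links on the page -->
<meta name="robots" content="noindex, follow">

<!-- "none" is shorthand for "noindex, nofollow" -->
<meta name="robots" content="none">
```

Either one should get the page dropped. A common culprit when that doesn't happen is a robots.txt Disallow covering the same URLs: Google then can't recrawl the pages, so it never sees the noindex tag and keeps the stale listing.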
Is there any value to a home page URL adding /index.html?
For proper SEO, which version would you prefer? A. www.abccompany.com B. www.abccompany.com/index.html Is there any value or difference with either home page URL?
Technical SEO | | theideapeople0 -
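If both versions resolve, one way to consolidate on the bare root (assuming Apache; example.com is a placeholder) is to 301 the /index.html request back to /:

```apache
RewriteEngine On

# Match the raw request line so internal DirectoryIndex rewrites
# don't trigger a redirect loop
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```

A `<link rel="canonical" href="http://www.example.com/">` on the home page achieves a similar consolidation if you can't touch the server config.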
International Site, flow of page rank?
OK. I'm working on an international site. The site is set up with folders for UK, US, and AU, e.g. www.site.com/UK/index.aspx. The root (non-folder-based) version is the international version of the site, e.g. www.site.com/index.aspx. www.site.com/index.aspx has the lion's share of links. Therefore, the pages immediately linked from www.site.com/index.aspx have PageRank distributed between them. My UK, US, and AU home pages are linked via a country selector from the www.site.com/index.aspx page, via an .aspx redirect page that 301s to the appropriate country home page. Therefore the home pages of UK, US, and AU are receiving some of the 'juice' that is coming in to www.site.com/index.aspx (but only a fraction, via the redirect links). Am I right in thinking that pages on the international version of the site will have much more potential to rank (because of their 'juice') than the pages on the UK, US, and AU versions of the site? If so, am I right in thinking that these will tend to rank over the equivalent UK, US, and AU versions of the pages in each country's version of Google, despite having set directory-level geotargeting in GWT?
Technical SEO | | QubaSEO1 -
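Ranking flow aside, one way to tell Google explicitly which folder targets which country, on top of the GWT geotargeting, is rel-alternate-hreflang annotations on each version of a page. A sketch using the folder layout from the question (the language-region codes are assumptions about the targeting):

```html
<link rel="alternate" hreflang="en-GB" href="http://www.site.com/UK/index.aspx" />
<link rel="alternate" hreflang="en-US" href="http://www.site.com/US/index.aspx" />
<link rel="alternate" hreflang="en-AU" href="http://www.site.com/AU/index.aspx" />
<!-- x-default marks the international version for everyone else -->
<link rel="alternate" hreflang="x-default" href="http://www.site.com/index.aspx" />
```

The full set goes on every one of the four versions, each pointing at all the others, which helps the right country page surface in the right country's Google regardless of which version holds most of the links.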
Why do I have one page showing as two URLs?
My SEOmoz stats show that I have duplicate titles for the following two URLs: http://www.rmtracking.com/products.php and http://www.rmtracking.com/products I have checked my server files, and I don't see a live page without the .php. A while back, we converted our site from html to php, and the html pages have 301s; as you can see, the page without the .php is properly redirecting to the .php page. Any ideas why this would show as two separate URLs?
Technical SEO | | BradBorst0
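If the extensionless URL resolves even though no such file exists, a likely cause on Apache is MultiViews (content negotiation serving products.php for /products). A hedged fix, reusing the domain from the question, is to switch that off and 301 the extensionless URL to the canonical one:

```apache
# Stop Apache from transparently serving /products.php for /products
Options -MultiViews

# 301 the extensionless URL to the canonical .php version
RewriteEngine On
RewriteRule ^products$ http://www.rmtracking.com/products.php [R=301,L]
```

Once only one URL returns a 200, the duplicate-title warning should clear on the next crawl.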