How to get multiple pages to appear under the main URL in search - photo attached
-
How do you get a site to show an organized map of its pages under the main URL when it is searched, as in the example photo?
-
In the Entrepreneur article (http://www.entrepreneur.com/article/235102), search for "Step" on the page and you should find it.
-
Matt - you mentioned that lower down the page there are some great tips under "Getting Google Sitelinks: A Step-by-Step Guide", but I was unable to find the section you were referring to.
Thanks
-
Fully agree with Matt-POP.
I would like to quote Google on this:
"We only show sitelinks for results when we think they'll be useful to the user. If the structure of your site doesn't allow our algorithms to find good sitelinks, or we don't think that the sitelinks for your site are relevant for the user's query, we won't show them."
"At the moment, sitelinks are automated. We're always working to improve our sitelinks algorithms, and we may incorporate webmaster input in the future. There are best practices you can follow, however, to improve the quality of your sitelinks. For example, for your site's internal links, make sure you use anchor text and
alt
text that's informative, compact, and avoids repetition.Thanks
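To make that last tip concrete, here is a quick, made-up sketch of internal links with informative, compact anchor and alt text (the paths and page names are hypothetical, not taken from anyone's actual site):

    <!-- Descriptive anchor text tells Google what the target page is about -->
    <a href="/services/roof-repair">Roof repair services</a>

    <!-- Avoid generic, repetitive anchors like "click here" on every internal link -->
    <a href="/pricing">Roof repair pricing</a>

    <!-- For image links, the alt text effectively acts as the anchor text, so keep it informative too -->
    <a href="/contact"><img src="/img/phone-icon.png" alt="Contact the roofing team"></a>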
-
The links you're talking about are called sitelinks. I found a really good article that explains them here:
http://www.entrepreneur.com/article/235102
The most important bit (for your question) is:
- Sitelinks are automated. There is no Google-given process for creating sitelinks. You don’t get to stipulate what links are featured and when. You can, however, indicate that a sitelink is not important or relevant by demoting it.
That said, lower down the page are some great tips under "Getting Google Sitelinks: A Step-by-Step Guide"
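One small clarification, since the question mentions a "site map": sitelinks aren't the same thing as an XML sitemap, but a clear site structure (and a sitemap Google can crawl) is generally part of the same housekeeping that helps Google pick good sitelinks. Purely as an illustration - these URLs are hypothetical - a minimal sitemap file looks something like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per important page; submit the file in Google Webmaster Tools -->
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/about</loc>
      </url>
    </urlset>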
Hope that helps!
Related Questions
-
Will changing the property from http to https in Google Analytics affect main unfiltered view?
I set my client up with an unfiltered view in Google Analytics. This is the one with historical data going back for years, so I don't want to do anything that will affect this view. Recently, the website moved from HTTP to HTTPS. There's a setting for the property that will allow me to change the property name to https://EXAMPLE.com and change the default URL to https://EXAMPLE.com. Questions: 1. If I change the property name and the default URL, will this somehow affect my unfiltered view in a way that I'll lose historical data or data moving forward? 2. I have heard that changing the default URL to HTTPS will help me avoid a common problem others have experienced (where they lose the referrer in Google Analytics and a bunch of their sessions go to direct / other). Is this true?
Reporting & Analytics | | Kevin_P3 -
We have a client that wants to apply UTM URL tagging to track local organic traffic in Google Analytics. Is there any benefit in doing this?
One of our clients requested that we apply UTM URL tagging to better track organic traffic in Google Analytics. We found this to be an odd request because we are most familiar with UTM tracking for special campaigns (referral tracking, PPC, email tracking, etc). Is there any benefit of applying UTM tags to urls to analyze local organic traffic in Google Analytics? Are there any resources out there about this? Thanks!
Reporting & Analytics | | RosemaryB0 -
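For context on the question above: UTM parameters are query-string tags that Google Analytics reads to attribute a session to a source, medium and campaign. A hypothetical tagged URL (domain and values invented purely for illustration) looks like:

    https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale

Whether this makes sense for "local organic" traffic depends on whether you actually control the link being clicked: a Google My Business website URL can be tagged this way, but a regular organic listing cannot, and tagged clicks will be attributed to the tagged source/medium rather than to google / organic.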
Help Blocking Crawlers. Huge Spike in "Direct Visits" with 96% Bounce Rate & Low Pages/Visit.
Hello, I'm hoping one of you search geniuses can help me. We have a successful client who started seeing a HUGE spike in direct visits as reported by Google Analytics. This traffic now represents approximately 70% of all website traffic. These "direct visits" have a bounce rate of 96%+ and only 1-2 pages/visit. This is skewing our analytics in a big way and rendering them pretty much useless. I suspect this is some sort of crawler activity but we have no access to the server log files to verify this or identify the culprit. The client's site is on a GoDaddy Managed WordPress hosting account. The way I see it, there are a couple of possibilities.
1.) Our client's competitors are scraping the site on a regular basis to stay on top of site modifications, keyword emphasis, etc. It seems like whenever we make meaningful changes to the site, one of their competitors does a knock-off a few days later. Hmmm.
2.) Our client's competitors have this crawler hitting the site thousands of times a day to raise bounce rates and decrease the average time on site, which could likely have a negative impact on SEO. Correct me if I'm wrong but I don't believe Google is going to reward sites with 90% bounce rates, 1-2 pages/visit and an 18 second average time on site.
The bottom line is that we need to identify these bogus "direct visits" and find a way to block them. I've seen several WordPress plugins that claim to help with this but I certainly don't want to block valid crawlers, especially Google, from accessing the site. If someone out there could please weigh in on this and help us resolve the issue, I'd really appreciate it. Heck, I'll even name my third-born after you. Thanks for your help. Eric
Reporting & Analytics | | EricFish0 -
www.googleadservices.com/pagead/conversion_async.js - what is this URL doing on my site?
Hello Guys, I am using Google Tag Manager and I have configured AdWords in Tag Manager. What I find is that this link - www.googleadservices.com/pagead/conversion_async.js - shows up on my homepage: it isn't in the view-source output, but it does appear when I inspect the element. So do you think that, after using Google Tag Manager, I still need to use the given link? Thanks, Raghu
Reporting & Analytics | | raghuvinder0 -
Why would page views per visitor suddenly increase?
My website traffic is growing by about 1% a week. It has a fairly stable page views/visitor figure of about 1.69, and there's normally very little variability in this, as we sell an industrial product. Today page views jumped by 50% and so did page views/visitor, but visitor numbers stayed the same. I don't have a useful hypothesis to explain this. Analytics shows me that the traffic source, country of origin and pages viewed are pretty much the same as normal. There's been no substantive change to the site (today we changed the text in a widget to link to a new page - and no one visited it). It doesn't look like one person has gone through the whole site, as that would skew the distribution of page views by country. So why would user behaviour suddenly change? I'll look at it for the rest of the week, but in 7 years of looking after this website I haven't seen anything like this before.
Reporting & Analytics | | Zippy-Bungle0 -
Why does Google stubbornly keep indexing my http URLs instead of the https ones?
I moved everything to https in November, but there are plenty of pages which are still indexed by Google as http instead of https, and I am wondering why. Example: http://www.gomme-auto.it/pneumatici/barum correctly redirects permanently to https://www.gomme-auto.it/pneumatici/barum. Nevertheless, if you search for pneumatici barum: https://www.google.it/search?q=pneumatici+barum&oq=pneumatici+barum the third organic result listed is still http. Since we moved to https, Google's crawler has visited that page tens of times, most recently two days ago, but it doesn't seem to care to update the protocol in its index. Does anyone know why? My concern is that when I use APIs like SEMrush and Ahrefs I have to query everything twice to try both http and https, so for a total of around 65k URLs I waste a lot of my quota.
Reporting & Analytics | | max.favilli0 -
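For reference on the question above, the kind of permanent redirect the poster describes is commonly implemented with an Apache rule like the one below; the actual server setup for gomme-auto.it isn't stated, so this is only an illustrative sketch:

    # Force HTTPS with a permanent (301) redirect for every request
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Even with correct 301s in place, Google can take a while to swap http for https in its index, particularly if internal links, canonical tags or sitemaps still reference the http URLs.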
How to safely exclude search result pages from Google's index?
Hello everyone,
I'm wondering what's the best way to prevent/block search result pages from being indexed by Google. The way search works on my site is that the search form generates URLs like:
/index.php?blah-blah-search-results-blah
I wanted to block everything of that sort, but how do I do it without blocking /index.php? Thanks in advance and have a great day everyone!
Reporting & Analytics | | llamb0 -
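For background on the question above: robots.txt rules are prefix matches, so a query-string prefix can be disallowed without touching the bare path. The real search-URL pattern isn't shown in the question, so the sketch below is only illustrative:

    # Blocks anything starting with /index.php? (index.php with a query string)
    # while /index.php itself stays crawlable.
    User-agent: *
    Disallow: /index.php?

    # Caveat: robots.txt stops crawling, not indexing. To keep already-known
    # search-result URLs out of the index, the usual approach is to leave them
    # crawlable and serve <meta name="robots" content="noindex"> on those pages.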
Why are Seemingly Randomly Generated URLs Appearing as Errors in Google Webmaster Tools?
I've been confused by some URLs that are showing up as errors in our GWT account. They seem to just be randomly generated alphanumeric strings that Google is reporting as 404 errors. The pages do 404 because nothing ever existed there or was linked to. Here are some examples that are just off of our root domain: /JEzjLs2wBR0D6wILPy0RCkM/WFRnUK9JrDyRoVCnR8= /MevaBpcKoXnbHJpoTI5P42QPmQpjEPBlYffwY8Mc5I= /YAKM15iU846X/ymikGEPsdq 26PUoIYSwfb8 FBh34= I haven't been able to track down these character strings in any internet index or anywhere in our source code so I have no idea why Google is reporting them. We've been pretty vigilant lately about duplicate content and thin content issues and my concern is that there are an unspecified number of urls like this that Google thinks exist but don't really. Has anyone else seen GWT reporting errors like this for their site? Does anyone have any clue why Google would report them as errors?
Reporting & Analytics | | kimwetter0