Best way to get pages indexed fast?
-
Any suggestions on the best ways to get a new site's pages indexed?
I was thinking of buying high-PR inbound links on Fiverr, but that's always a little risky, right?
Thanks for your opinions.
-
Posting your website's new pages on Google+ gets them indexed quite quickly.
-
Getting crawled and indexed is the easy part: add the site to Google Webmaster Tools and/or Bing Webmaster Tools, submit a sitemap, request a crawl of each page once it goes live, and try some easy social bookmarking like StumbleUpon. That should be a good start, at least.
Now ranking well... that's the hard (fun) part.
-
Submit a sitemap to the search engines, and make sure to interlink the pages within your website so that it's easier for bots to crawl.
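To make the sitemap step concrete, here's a minimal Python sketch (standard library only) that builds a sitemaps.org-format sitemap from a list of page URLs; the example.com URLs are placeholders, so swap in your own pages:

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org-format XML string listing the given page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        url_el = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url_el, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    pages = [
        "http://www.example.com/",
        "http://www.example.com/new-page",
    ]
    print(build_sitemap(pages))
```

Save the output as sitemap.xml at the site root, then submit it through Webmaster Tools so the bots have a complete list of pages to crawl.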
-
I think we all know by now that buying links never ends well. It's not sustainable, and if Google catches on, it could hurt you far more than being patient would have.
Definitely submit a sitemap, and work on your long term social media strategy.
-
While likely effective in the short term, I think buying links from Fiverr is definitely risky, and I would advise against it.
Instead, I would suggest the following:
1. Add the site to Google Webmaster Tools and submit a sitemap (if you haven't already done so).
2. Post links to the site/pages on Twitter, Facebook and other social media sites.
3. Try pinging services such as http://pingler.com/ and http://freebacklinktool.com/
Usually this will be all it takes for a new site to start being indexed in Google.
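On the pinging point: besides third-party services, Google has accepted a simple sitemap "ping" GET request. Here's a hedged Python sketch that builds that request URL (the endpoint is an assumption based on Google's documented behavior at the time, and the sitemap URL is a placeholder):

```python
from urllib.parse import urlencode

# Google's sitemap ping endpoint (an assumption; verify it is still
# supported before relying on it in production).
GOOGLE_PING = "http://www.google.com/ping"

def sitemap_ping_url(sitemap_url):
    """Build the GET URL that asks Google to (re)fetch a sitemap."""
    return GOOGLE_PING + "?" + urlencode({"sitemap": sitemap_url})

if __name__ == "__main__":
    print(sitemap_ping_url("http://www.example.com/sitemap.xml"))
```

Fetching that URL (e.g. with urllib.request.urlopen) tells Google the sitemap has changed; it's a crawl hint, not a guarantee of indexing.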