Indexing techniques
-
Hi,
I'd just like some feedback on my indexing technique: whether it's good as-is or whether it can be improved. The technique is entirely white hat and can be done by one person. Any suggestions or improvements are welcome.
- First, of course, I create the backlinks.
- I make a list of them in a public Google Doc.
- Each doc contains only ten links.
- Then I submit the doc to Digg and add 5-6 more social bookmarks.
- I tweet the Digg submission and each doc (my 2 Twitter accounts have page authority 98).
- I like them on Facebook.
- I ping them through ping services (a sketch of what that call does follows this list).
- That's it. It works OK for the moment.
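Under the hood, a "ping" is usually an XML-RPC weblogUpdates.ping call to an update service. Below is a minimal sketch in Python against Ping-O-Matic's public endpoint; the page title and doc URL are placeholders, and this illustrates the mechanism rather than whichever specific service the poster uses.

```python
import xmlrpc.client

# Ping-O-Matic relays pings to many update services via one XML-RPC endpoint.
server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")

# Standard weblogUpdates.ping call: (page title, page URL).
# Both values below are placeholders for illustration.
result = server.weblogUpdates.ping(
    "My link list",
    "https://docs.google.com/document/d/EXAMPLE_DOC_ID/edit",
)

# The service replies with a dict; 'flerror' False means the ping was accepted.
print(result)
```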
Is there anything I can do to improve my technique?
Thanks a lot.
-
No, it's not gaming; it's adult. But I am also thinking about developing a gaming site, about turning mine into a gaming site, because in Cy there are no SEO jobs; people there are more gamblers. And online, I don't think I would do well... Also, I make more money from affiliate work than I would working for somebody else... Maybe I wasn't so lucky, I guess... But it's OK, I'm still happy. :)
-
Based on your profile, I'm guessing this is a gaming-related site?
-
My goal is to get the old pages, the ones that contain my links, crawled fast. It's not about my own pages.
-
Many of them have authority of 10, 20, 30, or 40; some others are zero. All are indexed pages, because I am taking the links from a competitor. Yes, some are low-quality links, but he is ranking number 1 out of about 2,500,000 results for the exact-match query. I'm only making this effort to speed up indexing, because many of the links were not getting indexed quickly. I saw some of them only start to show up in Webmaster Tools after a month; after this process, all of them get indexed within a day at most.

As for the quality links you suggest getting, that is almost impossible due to the nature of the niche. Nobody wants to give them, because this specific keyword is extremely profitable and has millions of searches. The hardest part is getting the already good ones and then building authority for the new ones I create... OHHHH... Also, there are only 2 of us working here... Of the 1,000 links I have visited so far, only 60 were possible to get. Another 9,000 links remain to be checked... If I get up to 600 of his links, that will be good, I guess.

My site is already ranking for his keyword, but at around position 50 (with on-page optimization only). It's old, PR 2, with 150 likes and some tweets, all real. The new links were built in the last 2 days, so I don't know where the site will go. The other bad part is that there are around 45 exact-match domains ranking below him with the same keyword... Mine doesn't even have the keyword in the URL.
-
I believe you are referring to getting backlinks indexed. The only reason you would need to go to all that effort is if you were building low-quality links on deep pages, or pages with thin content that Google would not value in its index (e.g. forum profile links, blog comments). I'm sure you are doing more than enough to get your links indexed, but they will quickly be deindexed if Google no longer values the page content. If you are going to all this effort to index a batch of low-quality links, why not put that same effort into building links on pages with more trust and better-quality content that Google will want in its index?
-
If your goal is to get your web pages indexed, why not create a sitemap and submit it in Google Webmaster Tools (GWT)? I don't understand why you would go through all that other trouble just to get your web pages indexed.
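For reference, a minimal sitemap.xml in the standard sitemaps.org format looks like the snippet below; the URL and date are placeholders. Once it's uploaded to the site root, it can be submitted in GWT.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled. -->
  <url>
    <loc>http://www.example.com/some-page/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```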
Related Questions
-
Shopify Website Page Indexing issue
Hi, I am working on an eCommerce website on Shopify. When I tried indexing my newly created service pages, the pages were not getting indexed on Google. I also tried manually indexing each page and submitted a sitemap, but the issue still doesn't seem to be resolved. Thanks.
Intermediate & Advanced SEO | Bhisshaun
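A common culprit on Shopify is a stray noindex robots tag or a canonical pointing at a different URL. Below is a rough sketch for checking a page for both; it assumes the third-party `requests` library, the URL is a placeholder, and the regexes are a quick heuristic rather than a full HTML parser.

```python
import re
import requests

# Placeholder URL; substitute one of the service pages that is not indexing.
url = "https://example-store.myshopify.com/pages/example-service"

html = requests.get(url, timeout=10).text

# Quick heuristic checks (they assume typical attribute order):
# a robots meta tag containing "noindex" will keep the page out of Google.
noindex = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I)
# a canonical pointing at a different URL tells Google to index that URL instead.
canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)

print("noindex present: ", bool(noindex))
print("canonical target:", canonical.group(1) if canonical else "none found")
```
-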
Old URLs that have 301s to 404s not being de-indexed.
We have a scenario on a domain that recently moved to enforcing SSL. If a page is requested over non-SSL (http), the server automatically redirects to the SSL (https) URL using a good old-fashioned 301. This is great except for any page that no longer exists, in which case you get a 301 going to a 404. Here's what I mean:

Case 1 - Good page: http://domain.com/goodpage -> 301 -> https://domain.com/goodpage -> 200
Case 2 - Bad page that no longer exists: http://domain.com/badpage -> 301 -> https://domain.com/badpage -> 404

Google is correctly re-indexing all the "good" pages and just displaying search results going directly to the https version. Google is stubbornly hanging on to all the "bad" pages and serving up the original URL (http://domain.com/badpage) unless we submit a removal request. But there are hundreds of these pages, and this is starting to suck. Note: the load balancer does the SSL enforcement, not the CMS, so we can't detect a 404 and serve it up first; the CMS does the 404'ing. Any ideas on the best way to approach this problem? Or any idea why Google is holding on to all the old "bad" pages that no longer exist, given that we've clearly indicated with 301s that no one is home at the old address?
Intermediate & Advanced SEO | boxclever
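One way to scope this is to crawl the old http URLs and flag every 301 chain that ends in a 404; that list can then drive removal requests in bulk. A minimal sketch, assuming the third-party `requests` library and placeholder URLs:

```python
import requests

# Placeholder list; in practice, pull the old URLs from server logs or GWT.
old_urls = [
    "http://domain.com/goodpage",
    "http://domain.com/badpage",
]

for url in old_urls:
    # Follow redirects and record each hop's status code.
    response = requests.get(url, allow_redirects=True, timeout=10)
    chain = " -> ".join(str(hop.status_code) for hop in response.history) or "no redirect"
    if response.status_code == 404:
        print(f"DEAD: {url} ({chain} -> 404)")   # a 301 chain ending at a 404
    else:
        print(f"OK:   {url} ({chain} -> {response.status_code})")
```
-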
Robots.txt, Disallow & Indexed-Pages..
Hi guys, hope you're well. I have a problem with my new website. I have 3 pages with the same content: http://example.examples.com/brand/brand1 (the good page), http://example.examples.com/brand/brand1?show=false, and http://example.examples.com/brand/brand1?show=true. The good page has rel=canonical and is the only page that should appear in search results, but Google has indexed all 3 pages. I'm not sure what to do now, but I am considering 2 possibilities: 1) Remove the filters (true, false), leave only the good page, and serve a 404 for the other pages. 2) Update robots.txt with a disallow for these parameters and remove those URLs manually. Thank you so much!
Intermediate & Advanced SEO | thekiller99
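If you do go the robots.txt route, the disallow for those parameters could look like the snippet below (Google supports the `*` wildcard in robots.txt). One caveat: once the parameter URLs are blocked, Google can no longer crawl them to see the rel=canonical, so removing the already-indexed URLs manually first, as you suggest, is the right order.

```
User-agent: *
# Block every URL carrying the show parameter (both true and false).
Disallow: /*?show=
```
-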
Removing Parameterized URLs from Google Index
We have duplicate eCommerce websites, and we are in the process of implementing cross-domain canonicals. (We can't 301; both sites are major brands.) So far, this is working well: rankings are improving dramatically in most cases. However, in some cases Google has indexed a parameterized page on the site being canonicaled (the site that is getting the canonical tag, i.e. the "from" page). When this happens, both sites are ranked, and the parameterized page appears to be blocking the canonical.

The question is: how do I remove canonicaled pages from Google's index? If Google doesn't crawl the page in question, it never sees the canonical tag, and we still have duplicate content. Example:

A. www.domain2.com/productname.cfm%3FclickSource%3DXSELL_PR is ranked at #35, and
B. www.domain1.com/productname.cfm is ranked at #12.

(Yes, I know that upper case is bad. We fixed that too.) Page A has the canonical tag, but page B's rank didn't improve. I know that there are no guarantees that it will improve, but I am seeing a pattern: page A appears to be preventing Google from passing link juice via the canonical, because if Google doesn't crawl page A, it can't see the rel=canonical tag. We likely have thousands of pages like this. Any ideas? Does it make sense to block the "clickSource" parameter in GWT? That kind of scares me.
Intermediate & Advanced SEO | AMHC
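For anyone following along, a cross-domain canonical is the standard tag with an absolute URL pointing at the other domain; on page A it would look like the snippet below (the URLs are the example ones from the question). The catch the poster identified is real: Google has to recrawl page A to see this tag, which is why blocking the clickSource parameter in GWT could backfire; if Google is told to skip those parameterized URLs, it may never recrawl page A and so never see the canonical.

```html
<!-- In the <head> of the parameterized page on domain2 (the "from" page): -->
<link rel="canonical" href="http://www.domain1.com/productname.cfm" />
```
-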
Defining Canonical First and Later No Indexing
We found some repetitive pages on the site, mostly sort or filter parameter pages. We have tried a lot to remove them, but without much improvement. Is this the correct approach: a) We create new pages for that section altogether and put a rel=canonical tag from the old pages to the new ones. b) Then, once the canonical is declared, we noindex the old pages. Is this a correct way to let the new pages supersede the old pages?
Intermediate & Advanced SEO | Modi
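One caution on step (b): a bare noindex tag on the old pages would look like the snippet below, but combining noindex with rel=canonical on the same page sends Google mixed signals (noindex says "drop this page," while canonical says "this page is equivalent to that one"), so the safer pattern is usually to let the canonical do the work on its own, and reserve noindex for pages that carry no canonical.

```html
<!-- On an old sort/filter page, if you choose noindex instead of a canonical: -->
<meta name="robots" content="noindex,follow" />
```
-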
Is it better to not allow Google to index my Tumblr Blog?
I'm currently using a subdomain for my blog via Tumblr. In my SEO reports I see a lot of errors, mostly from the Tumblr blog. I made changes so there are unique titles and tags, but there are still too many errors. I am wondering if it is best to just not allow it to be indexed via the Tumblr control panel. It is certainly doing a great job with engagement and social network follows, but I'm starting to wonder if, and how much, it is penalizing my domain. I'd appreciate your input. By the way, the theme is not Flash; it's a very basic, simple theme for the content...
Intermediate & Advanced SEO | wickerparadise
-
Google Site Extended Listing Not Indexed
I am trying to get the new sitemap picked up by Google for the extended listing, as it is currently pulling from the old links and returning 404 errors. How can I get the site listing indexed quickly and have the extended listing updated to point to the right places? This is the site: http://epaperflip.com/Default.aspx. This is the search with the extended listing and some 404s: a broad match search for "epaperflip".
Intermediate & Advanced SEO | Intergen
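Alongside resubmitting the sitemap in GWT, you can also ping Google with the sitemap URL directly to nudge a re-fetch. A minimal sketch using only the standard library; the sitemap location is an assumption about where the file lives, and the ping endpoint shown is the one Google documented at the time of writing.

```python
import urllib.parse
import urllib.request

# Assumed sitemap location; substitute the real path if it differs.
sitemap_url = "http://epaperflip.com/sitemap.xml"

# Google's documented sitemap ping endpoint.
ping_url = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping_url) as response:
    print(response.getcode())  # 200 means the ping was received
```
-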
Freshness Index?
Hi, I've been a member for a few months, but this is my first post. I typically build small portal websites to help attract more customers for small businesses: approximately 5-7 pages, very tightly optimized around one primary keyword and 2 secondary ones. These are typically very low competition. I do no link building to speak of, and I don't keyword stuff or use poorly written content. I know that may be subjective, but I believe the content I am using is genuinely useful to the reader. What I have noticed recently is that the sites rank quite well to begin with (anywhere from the bottom half of the first page to pages 2-3) and stick for maybe 2-3 weeks, and the client is very happy. Then they just vanish. It's not just the Google dance, either; these sites typically don't come back at all, or when they do, they are at 100+. I was advised this was due to the freshness index, but honestly these sites are hardly newsworthy... Just wondering if anyone has any ideas? Many thanks in advance.
Intermediate & Advanced SEO | nichemarkettools