Getting a citation page indexed
-
Howdy mozzers,
I have a citation on a .govt domain with 2 links pointing to my site. The page is not indexed by Google, Bing, or Yahoo.
URL: http://www.familyservices.govt.nz/directory/viewprovider.htm?id=17077
I have tried getting the page indexed by building bookmark links to it. I have tweeted the URL and gotten a few retweets, but no luck.
The page has no noindex/nofollow meta tag, and other listings in the directory have been indexed by Google.
Could someone please advise on means to help me get the page indexed?
A strategy that I have not yet tried is submitting a sitemap that includes the external URL, though I am not sure it is possible to include URLs that are not part of my domain.
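For what it's worth, a sitemap is just an XML file of `<loc>` entries, so nothing technically stops you from generating one that contains the external URL. A minimal sketch in Python (the caveat being that search engines generally ignore sitemap entries for domains other than the submitting one unless ownership of both sites is verified, so treat this as an experiment):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal urlset sitemap containing the given URLs."""
    urlset = Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = url
    return tostring(urlset, encoding="unicode")

xml = build_sitemap([
    "http://www.familyservices.govt.nz/directory/viewprovider.htm?id=17077",
])
```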
Any advice or help would be greatly appreciated.
viva le SEOmoz
Thanks
-
Agreed. There are also a ton of other local directories out there that will help build your citation authority & credibility (if you're going for local).
-
Getting a web developer to go into a server and change a robots.txt for a totally random business is something I would not advise. They have obviously blocked the directory for a reason; possibly they do not want it indexed because of the content it holds.
Most web devs need good reasons to do things. I know, I have dealt with hundreds. The bigger companies, and especially govt devs, are hard to move in a hurry; they usually have a bunch of other jobs to do, or they will put it on the requests list.
That said, I agree with your point: you could contact this website via email or phone and touch base to see if they want to include your link, as it is highly relevant to them.
In my experience, though, most government websites are very funny about small businesses asking for links. Even if you are a huge business, they ask a million and one questions before you can acquire a successful link. But like the old saying goes, if you don't ask, you don't get, so email/call away.
-
If the link means that much to you (judging by the steps you've already taken, I'd say it does), you could try contacting the webmaster and pitching the importance of their site being indexed, so that people looking for their family services can find them more easily. Offer James's solution, which any webmaster would be able to correct quickly.
This would be best done over the phone, though.
Good luck
-
If you are looking for better Google Places rankings, you can also build citations on social media profiles.
-
Hi,
this specific page is blocked in robots.txt:
http://www.familyservices.govt.nz/robots.txt

User-agent: *
Disallow: /images/
Disallow: /media/
Disallow: /webadmin/
Disallow: /directory/

So I would not waste any time trying to get it ranked.
Kind regards,
James.
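If you want to verify this kind of block for yourself, Python's standard-library robots.txt parser can replay the rules (a minimal sketch; the rules below are copied from the live file rather than fetched over the network):

```python
from urllib.robotparser import RobotFileParser

# Rules as published at http://www.familyservices.govt.nz/robots.txt
rules = """\
User-agent: *
Disallow: /images/
Disallow: /media/
Disallow: /webadmin/
Disallow: /directory/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

url = "http://www.familyservices.govt.nz/directory/viewprovider.htm?id=17077"
print(parser.can_fetch("Googlebot", url))  # False: /directory/ is disallowed
```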
Related Questions
-
Should search pages be indexed?
Hey guys, I've always believed that search pages should be no-indexed but now I'm wondering if there is an argument to index them? Appreciate any thoughts!
Technical SEO | RebekahVP
-
Indexed pages
Just started a site audit, trying to determine the number of pages on a client site and whether more pages are being indexed than actually exist. I've used four tools and got four very different answers:

Google Search Console: 237 indexed pages
Google search using the site: command: 468 results
Moz site crawl: 1,013 unique URLs
Screaming Frog: 183 page titles, 187 URIs (note: this is a free licence, but it should cut off at 500)

Can anyone shed any light on why they differ so much? And where lies the truth?
Technical SEO | muzzmoz
-
When I type site:jamalon.com to discover the number of pages indexed, it gives me a different result from Google Webmaster Tools
Technical SEO | Jamalon
-
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and is of high quality, but isn't seeing the desired results. It is an ecommerce site that has a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops.

The website has many individual products which have the same fabric and size options but different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each of the different designs has its own product page. We have a dilemma, because our options are:

A. Combine similar designs into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate that content on other pages or try to find more to say about something there really isn't anything else to say about. However, this process will remove between 50% and 70% of the pages on the website. We know the number of indexed pages is important to search engines, and if they suddenly see that half of our pages are gone, we may cause more negative effects, despite the fact that we are aiming to provide more value to the user, not less.

B. Leave the product pages alone and try to write more valuable content for each product page, which will be difficult because there really isn't that much more to say, or more valuable ways to say it. This is the "safe" option, as it means our potential negative impact is reduced, but we won't necessarily see much positive trending either.

C. Test solution A on a small percentage of the product categories to see any impact over the next several months, before making sitewide updates to the product pages if we see positive impact, or reverting to the old way if we see negative impact.

Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effects and we are seeing consistently dropping rankings. Any information would be greatly appreciated. Thank you,
Technical SEO | Ed-iOVA
-
Page not indexed but still has a PageRank, how?
http://www.optiproerp.com/products.aspx page is not indexed in Google but still has a PageRank of 1. How? Regards
Technical SEO | IM_Learner
-
De-indexing millions of pages - would this work?
Hi all,

We run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content's ability to rank; Googlebot does not bother crawling our real content (product pages specifically) and hammers the life out of our servers.

Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years, my plan is this:

1. 301 redirect all old SERP URLs to a new SERP URL.
2. If the new URL should not be indexed, add a meta robots noindex tag on the new URL.
3. When it is evident that Google has indexed most "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt.
4. Then remove all old SERP URLs directory-style in the GWT URL Removal Tool.

This would be an example of an old URL:
www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2

This would be an example of a new URL:
www.site.com/search?q=bmw&category=cars&color=blue

I have two specific questions:

1. Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above?
2. What risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some 50,000 useless "add to cart" URLs. Google itself says that you should not remove duplicate/thin content this way, and that using the tool this way "may cause problems for your site".

And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose for all too long. And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301s. By then we would be out of business.

Best regards,
TalkInThePark

Technical SEO | TalkInThePark
-
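For illustration, steps 1 and 2 of TalkInThePark's plan can be sketched as request-handling logic. The paths, the fixed redirect target, and the use of an X-Robots-Tag response header (an alternative to the meta robots tag) are all hypothetical placeholders, not a description of the actual stack:

```python
# A minimal sketch: legacy SERP URLs get a 301 to the new search URL, and
# new search URLs that should stay out of the index are served with an
# X-Robots-Tag: noindex response header.

def respond(path: str):
    """Return (status, headers) for an incoming request path."""
    if path.startswith("/cgi-bin/"):
        # Step 1: permanent redirect from the old SERP URL. A real
        # implementation would translate the legacy query parameters
        # into the new scheme instead of using a fixed target.
        return 301, {"Location": "/search"}
    if path.startswith("/search"):
        # Step 2: keep the new (still thin) search results out of the
        # index while Googlebot discovers the redirects.
        return 200, {"X-Robots-Tag": "noindex"}
    return 200, {}
```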
Campaign shows 5,000 warnings from shared database feed, made pages no-follow and no-index, are we OK now?
One of our campaigns shows 5,000 warnings for duplicate content, meta descriptions, and URLs. These come from an XML database feed that is shared throughout the industry. We made the pages nofollow and noindex, but the Moz crawl still shows the warnings. There are no warnings in Webmaster Tools. Should we ignore these warnings and assume we are OK now, or is there more work to do?
Technical SEO | geilt
-
Diagnostic says too many links on a page and most of the pages are from blog entries. Are tags considered links? How do I decrease links?
I just ran my first diagnostic on my site and the results came back negative in the area of too many links on a page. There were also quite a few 404 errors. What is the best way to fix these problems? Most of the pages with too many links are blog posts; are the tags counted as well, and is this the reason for too many links?
Technical SEO | Newport10300