Website Page Structuring and URL Rewriting - need helpful resources
-
Hello,
I am not very technically sound, and I need some good articles that teach me how to think about and approach website page structuring and SEO-friendly URL rewriting.
I will be most obliged if some of you great SEOmoz-ers can pitch in with help.
Regards,
Talha
-
A web crawler will start at the top of a page and work its way to the bottom, discovering links and content on the way down. The first links it encounters will be used for anchor text association and weighted more heavily than later links to the same page. Links in content are given greater weight than links in navigation, so it is preferable to have those links discovered first.
HTML code can be adjusted to position the navigation after the content. You can look at XenForo, a forum software developer that designs its software in this manner. If you view a page's source code, you will notice the navigation is at the bottom of the HTML.
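A minimal sketch of that idea (the element IDs and styles here are illustrative, not from any particular site): the navigation comes last in the source order, and CSS repositions it visually at the top of the page.

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    /* Reserve space at the top of the viewport for the nav bar. */
    body { margin-top: 3em; }
    /* The nav is last in the source, but CSS pins it to the top visually. */
    #nav { position: absolute; top: 0; left: 0; width: 100%; }
  </style>
</head>
<body>
  <!-- Content appears first in the source, so crawlers reach it before the navigation. -->
  <div id="content">
    <h1>Page Title</h1>
    <p>Main content and in-content links go here.</p>
  </div>
  <!-- Navigation sits at the bottom of the HTML. -->
  <div id="nav">
    <a href="/">Home</a>
    <a href="/jobs">Jobs</a>
  </div>
</body>
</html>
```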
With respect to your question on content structuring, there are many pages covering this topic. In general I have two pieces of advice.
1. Have your content reviewed by an English teacher. I am not looking for grammatical corrections (although those help too) but more along the lines of properly presenting a topic: a good header, an intro sentence and paragraph, etc.
2. Examine the Wikipedia approach. Most wiki pages perform the basics of SEO content presentation very effectively.
A WBF article which may be of interest: http://www.seomoz.org/blog/improving-content-shareability-whiteboard-friday
-
Hello Ryan...
thank you very much for your suggestion. Had that in mind already.
About website page structuring... I was actually asking about content structuring...
BUT I would surely, surely appreciate it if you could also give me a bit of insight (or links) into HTML structuring.... the suggestion about placing the navigation code is extremely interesting... can I know more please, sir???
-
Thanks a lot for your response, Steven... I'm most obliged.
I am on my way to understanding url re-writing.
Can you also help me with the remaining part of my question - I want to understand HOW TO STRUCTURE A WEBSITE FOR MAXIMUM NOTICE BY SEARCH ENGINE SPIDERS.... OR we can rephrase it as: WHAT IS A CRAWL-FRIENDLY WEBSITE STRUCTURE???
I am sure your answer will be very helpful for me.
Thanks bro
Peace!
-
Hi Talha.
Since you are an IT company, I am guessing you are asking about the general structure of URLs, as opposed to the technical details of how to implement the change.
The primary change I recommend with your present URLs is to remove the technology suffix: change http://www.zigzagsolutions.co.uk/jobs.html to http://www.zigzagsolutions.co.uk/jobs. Also, for the home page, drop the index.html.
These changes make your URLs shorter and friendlier in appearance. You also benefit from more stable URLs: if you later decide to move your pages to PHP or another technology, your existing URLs can continue to work without redirecting your whole site.
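One common way to do this on Apache is with mod_rewrite. The sketch below assumes the .html files stay where they are on disk and that mod_rewrite is enabled; treat it as a starting point to test, not a drop-in rule set.

```apache
RewriteEngine On

# Redirect old /jobs.html requests to the clean /jobs URL,
# so existing inbound links keep working (301 = permanent).
RewriteCond %{THE_REQUEST} \s/([^.\s]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]

# Internally map the clean URL back to the .html file on disk,
# but only if that file actually exists.
RewriteCond %{DOCUMENT_ROOT}/$1.html -f
RewriteRule ^(.+?)/?$ /$1.html [L]
```

With these rules, visitors and search engines see /jobs in the address bar while the server quietly serves jobs.html.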
For web page structure, can you offer a bit more detail as to what you are seeking? Are you looking for technical details, such as how to structure your HTML? An example would be placing your navigation code at the bottom of your HTML so search engines see your content first. Or are you asking about how to present your content?
-
I assume you are talking about mod_rewrite with Apache.
Apache's rewrite guide would be a good start: http://httpd.apache.org/docs/2.0/misc/rewriteguide.html. You say you are not technically strong, but you should be able to copy details from it and just change things around. The matching patterns use regular expressions, so if you have ever used those, the syntax will make sense. If you haven't, it can look daunting at first, but it's really not that bad, and through trial and error you can pick it up quickly.
This is the full reference for mod_rewrite http://httpd.apache.org/docs/current/mod/mod_rewrite.html.
Alternatively, once you have decided on your friendly-URL-to-actual-file mapping, post it in here and I am sure people will be able to help you create the rewrite rules.
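As a concrete starting point, here is a sketch of the kind of rule people would help you refine. The product.php script and the /product/name URL scheme are hypothetical, just to show the shape of a mapping.

```apache
RewriteEngine On

# Map a friendly URL to a script with a query string; the browser's
# address bar keeps the friendly form because this is an internal rewrite.
#   /product/blue-widget  ->  /product.php?name=blue-widget
RewriteRule ^product/([a-z0-9-]+)/?$ /product.php?name=$1 [L,QSA]
```

The parenthesised part of the pattern is a regular expression capture; $1 reuses whatever it matched, and QSA appends any extra query-string parameters the visitor supplied.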
-
What web server are you using?