Job Title: Digital Media Executive
Company: ATP Instrumentation Ltd
Favorite Thing about SEO: Seeing those rankings climb and the traffic spike
Back to the old-fashioned way it is, then.
Thanks for the reply.
Rand,
I am a UK user.
I just went onto the Keyword Difficulty page and it said it is being removed from June 2nd, to be replaced by the Keyword Explorer tool. Whilst I welcome this change, as I love the Keyword Explorer, as a UK user I rely on the Keyword Difficulty tool for volume metrics, because the Keyword Explorer doesn't yet show these for the UK.
Why the haste to remove this tool? It's a vital part of the puzzle for me until the Explorer is fully operational. Could it not stick around until the Explorer is fully functional?
Thanks
I thought I had, but I had just hit the thumbs-up button instead.
Hey Mozzers,
I am optimising a chaotic section of the site containing many similar products, writing unique content etc. The titles and URLs were all over the place, so my first job was to tidy them up so I could make some sense of the situation, especially as sometimes they didn't even match!
I should point out we're on Magento, so the product name is both the heading and the title of the page; the meta title can be set separately. When I refer to the title I mean both the <title> and the <h1>.

Before, they existed as such:
URL: domain.com/200-x-0-5-g-rs-232-balance.html
Title: PC-1234 200 x 0.5g x 0.3 RS-232 Balance

This format was (Product Code, Capacities, Resolutions, Accuracy, Product Title).

The issue was that all 60 products on a page followed this format. Navigating through the page was a nightmare, just a jumble of numbers and highly confusing even to me, who learnt what they all mean, especially when you had 8 products from the same range and were presented with:

APC-1234 200 x 0.5g x 0.3 RS-232 Balance
APC-1235 500 x 1g x 0.3 RS-232 Balance
APC-1236 1000 x 2g x 0.3 RS-232 Balance
APC-1238 5000 x 10g x 0.3 RS-232 Balance
APC-1239 10000 x 15g x 0.3 RS-232 Balance
APC-1210 20000 x 25g x 0.3 RS-232 Balance
APC-1211 50000 x 50g x 0.3 RS-232 Balance

I changed them to something more user friendly:

URL: domain.com/200g-precision-balance.html
Title: 200g Precision Balance

This has seen the following benefits:
- The URL is now clear and means something to the user
- Product titles are easy to navigate and the page is more pleasing to the eye
- The jumble of numbers in the title is now labelled and shown below each product listing in bullet points, so the user can see the basic spec of a product without having to decipher any titles

Upon reflection I have a couple of concerns I was hoping you could discuss; I am wondering if I have made the titles too simple.

1) I have no product code in the title
We have our own products manufactured and also sell existing brands with their own product codes. Some of these can be lengthy, and adding them makes the titles hard on the eye and the page look cramped. The codes are listed beneath each product title on category pages and in a list on the actual product page, but nowhere in the titles.

2) None of our products have a brand listed in the title
None of the products on the site had brand names anywhere but the images when I started, so it snuck under my radar. But should I prefix all titles with a brand name? Should

URL: domain.com/200g-precision-balance.html
Title: 200g Precision Balance

become

URL: domain.com/BRAND1-200g-precision-balance.html
Title: BRAND1 200g Precision Balance

My instinct tells me to include brands, as it's useful to the customer and should have an SEO benefit, but to leave out product codes, as they are accessible to the customer where they are now and don't make things messy and unreadable.

As always, thanks for the input!
I asked a similar question back in September and got a different response. Would you mind taking a look and commenting, as I value your opinion? https://moz.com/community/q/display-none-read-more-implimentation
Short answer: No, it's never too late.
People rescue lost links in this way all the time. The old pages may not have been de-indexed yet, especially if they are being linked to from another website.
Ideal solution: Locate all the links pointing to the old pages and get them updated to point to the new pages. Put the 301s in place anyway to save any you miss.
Nearly ideal solution: Slap a 301 redirect on it, BUT make sure that the 301 points to a direct replacement / relevant page.
There are no negative implications for doing the 301s this late (as long as the pages are relevant). But not doing them... well, as you have seen, rankings will suffer.
I've seen links that are months to years old get rescued this way, so get those redirects on!
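For example, if the site is on Apache, a single 301 in the .htaccess could look something like the sketch below; the paths are made up for illustration, so swap in your real old and new URLs:
# Point one retired URL at its direct replacement (example paths only)
Redirect 301 /old-product.html /new-product.html
# Or, with mod_rewrite enabled, move a whole old section to a new one
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]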
I agree with Andy.
I use it as a guidance tool on any website I build. It serves a purpose: to check that things are understood how they should be by a predetermined standard. But like any other automated tool, it compares against set requirements that cannot always be met, and it cannot identify and OK these exceptions.
As long as you understand the error it's pointing out and why it's pointing it out, and you know that despite this the code is rendering correctly and all outcomes are working as expected, then there is no problem.
From an SEO standpoint, as long as Google sees your site how you want it to, I think it is a very, very minor factor. Hell, all of Google returns errors of some variety.
Hi John, sorry, I've been on leave so haven't checked back on the forums.
Glad it looks like it's working for you. I don't think the comments do anything except signify where WordPress has begun writing to the .htaccess (I don't run WordPress so can't be sure). Normally comments do nothing but signify something useful to the user.
I can try to break down the code a little for you, but my .htaccess knowledge isn't fantastic, so it's by no means complete.
First line: RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond = this says only apply the rule below if...
%{REQUEST_FILENAME} !-d = ...the requested path is NOT an existing directory
Second line: RewriteRule ^(.*)/$ /$1 [L,R=301]
I believe this bit captures the URL up to the final / and then 301 redirects to that captured version.
The combination of these must be why it doesn't affect your WordPress admin directory. I know this code can break if your install is within a directory (as is discussed in the Stack Overflow link), but they have provided a solution for that in that topic. I would test it on your live website to make completely sure it will work, as it may behave slightly differently to your local install. Have a back-up ready just in case it doesn't.
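Putting the two lines back together with comments, this is the same snippet as above, just annotated; I'm assuming it sits in the root .htaccess with rewriting already switched on:
# Only act when the request is not for an existing directory
RewriteCond %{REQUEST_FILENAME} !-d
# Capture everything before the trailing slash and 301 to the slash-less version
RewriteRule ^(.*)/$ /$1 [L,R=301]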
Make sure you check every URL, including:
Homepage
Pages
Posts
Category Pages
Sub Category Pages
Post Pages
Any images or files
To make sure it is working as expected on all of them.
Hi John,
I asked something similar myself, but I'm on the Magento platform. This shouldn't matter, as the solution wasn't platform specific; it just involved editing the .htaccess file. If you're up for editing your .htaccess file then it could be of some use. The topic URL is below, and it contains multiple solutions for editing and removing the / and the debugging process we went through along the way (courtesy of Andy and Dirk). Hopefully it's of some use to you.
https://moz.com/community/q/cms-pages-multiple-urls
SUMMARY:
If you know how to edit your .htaccess and you're ready to dive straight in, this code should do it:
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [L,R=301]
If you want the explanations and walk-through, please see the original topic, as editing your .htaccess badly can cause all sorts of errors.
Edit: I realised I was probably a tiny bit lazy and should probably have included this link, which is the original Stack Overflow link I was sent, with instructions on how to edit your .htaccess file.
http://stackoverflow.com/questions/21417263/htaccess-add-remove-trailing-slash-from-url
Dirk's answer later in the post offers guidance on applying it to certain parameters, which should prove helpful if you're still having loop problems with the admin page.
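As a rough illustration of that kind of exclusion, the block below adds an extra condition so the rule skips the admin area; /wp-admin/ is just an assumed example path, so adjust it to whatever your admin URL actually is and treat this as a sketch rather than a drop-in fix:
RewriteEngine On
# Skip the admin area entirely (example path, adjust to your install)
RewriteCond %{REQUEST_URI} !^/wp-admin/
# Only act when the request is not for an existing directory
RewriteCond %{REQUEST_FILENAME} !-d
# Strip the trailing slash with a 301 redirect
RewriteRule ^(.*)/$ /$1 [L,R=301]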
Sorry, I should have been more clear: this PHP solution is not a solution, at least in any practical sense.
Basically you would have, for example, the descriptions stored in variables (a meta description for the UK, a meta description for the USA), and the page HTML would then output whichever one matched the visitor's location. But the user is already on the website; they have already passed any point at which the meta description would have been displayed to them.
The problem with this is that Google displays the meta description that it has indexed from its most recent crawl. When Google crawls your page, it loads the page as if it were a user from wherever the bot is located, so it wouldn't be able to see the other meta descriptions for other countries and couldn't display them even if it wanted to.
Even if it crawled your site from various places around the globe, it would just be constantly trying to update. The description can change, but each time it does it takes a while to be reflected on Google, and once it's changed the old one isn't stored; it's just replaced. It's essentially a one-description-per-page rule.
The only solution I could think of is a different landing page for each country (which you have), but there isn't a way to have a dynamic meta description on your main .com (as far as I know), sorry.
Hi Brad,
I'm not an "expert" but I do a fair bit of developing in my spare time. My understanding is that this shouldn't affect you from an SEO standpoint. It's only the databases and administration files behind the scenes that are run from the same place, and if set up correctly Google's bots don't crawl these anyway. Each site is still technically self-contained and a website in its own right. You are not trying to do anything naughty like linking between the sites to boost rankings etc., and you are treating them as separate entities. Also, I believe the "store" system in Magento is designed with exactly what you are doing in mind.
In terms of servers, lots of websites bar the absolutely huge ones are hosted on shared servers anyway; you just don't see it, and the bots know it's not an important factor (except in extreme cases where dodgy stuff is happening, I believe). The personal sites I develop are technically subdomains and I still get them ranking completely independently and treated as separate entities.
Please don't take just my opinion as the answer; wait for someone with a little more experience. I don't want to just spam the obvious, but it might also be worth a post on the Magento help forums, as the community there may be able to offer some Magento SEO expertise that this forum can't.
Hi John,
I asked something similar myself something myself but im on the Magento platform. This should matter as the solution wasn't platform specific. It just involved editing htaccess file. If your up for editing your .htacccess file then it could be of some use. The topic URL is below and it contains multiple solutions for editing and removing the / and the debugging process we went through along the way. (Courtesy of Andy and Dirk) Hopefully its of some use to you
https://moz.com/community/q/cms-pages-multiple-urls
SUMMARY:
If you know how to edit your .htaccess and your ready to dive straight in this code should do it.
RewriteCond%{REQUEST_FILENAME}!-d
RewriteRule^(.*)/$ /$1 [L,R=301]
If you want the page with explanations and walk-through please see the original topic as editing your htaccess badly can cause all sorts of errors.
Edit: I realised i was probably a tiny bit lazy and should of probably included this link which is the original link i got sent from stackoverflow with instructions on how to to edit your .htaccess file.
http://stackoverflow.com/questions/21417263/htaccess-add-remove-trailing-slash-from-url
Dirks answer later in the post offer guidance on applying it to certain parameters which should prove helpful if your still having loop problems with the admin page.
Harry, could you provide a little more info / clarify a few things?
You stated that your direct traffic on the spike day followed the same overall pattern as a normal day etc., but you haven't clarified whether the spike was all direct traffic.
Was this spike definitely direct traffic?
Also, I think it's beneficial for us to know (so we can look at factors that might influence direct traffic):
What Industry / Type of website is it?
Why are Mondays normally big traffic days?
With it being over a week since your email was sent, I don't think the two events are related; the pattern we see from our emails is an initial spike and a much smaller spike the Monday of the following week due to out-of-offices etc.
If you're certain it's direct traffic, then I would investigate further with analytics. The spike is at a time you normally experience spikes; is this coincidence or a pattern?
In our marketing department we try to paint a picture of our direct traffic users.
Which page are they landing on, what are they doing, where is the user located, and are the visits resulting in more bookings/services being ordered? Then consider external factors which may cause people to go looking more.
For example, one company I work with here in the UK sees a jump in direct traffic correlating with the end of the financial year and tax refunds.
As SMG said, the Behaviour and Acquisition tabs are your friend. Sorry it's not more of an "answer", but direct traffic can be vague.
To my knowledge, there isn't.
Are there ways with PHP to dynamically load in different meta descriptions? Sure.
The problem is that this wouldn't be reflected in Google. Google takes days, weeks, or even months to update a meta description within its index, so even though you could make the HTML change on the actual page, it wouldn't change in the search listing, nullifying the whole point.
I don't believe an option exists around this, but somebody may know something I don't.
Just a simple guy doing some simple SEO