Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
AlanBleiweiss
@AlanBleiweiss
Job Title: Forensic SEO Audit and Consulting Specialist, Blogger, Speaker, Author
Company: Alan Bleiweiss Consulting
Favorite Thing about SEO
The challenge. And the ability to help clients succeed.
Latest posts made by AlanBleiweiss
-
RE: Old school SEO tools / software / websites (posted in Algorithm Updates)
Wordtracker for keyword volume and Overture PPC for keyword value were my two go-to resources. And WebTrends for the painful process of attempting to figure out what was happening on-site.
-
RE: In the U.S., how can I stop the European version of my site from outranking the U.S. version? (posted in International SEO)
Gianluca
Thanks for jumping in on this one. So if I'm reading your answer correctly, the bottom line here is that there really should be one site per country, regardless of language spoken, correct?
-
RE: In the U.S., how can I stop the European version of my site from outranking the U.S. version? (posted in International SEO)
Yeah inheriting previous work can be a challenge.
Since you are already planning on rolling out content in different languages, you will have not only the opportunity to set the hreflang tags for each, but also it will be important to ensure all of the content within each section is actually in that section's primary language for consistency. That too will help address the confusion Google has.
-
RE: Should I use "strong" tags or h1/h2 tags for article titles on my homepage? (posted in Intermediate & Advanced SEO)
To clear up the concept of having multiple "titles" on a single page (an H1 headline is the in-content "title" for that page): David is correct. While HTML5 allows multiple H1 tags on a single page, using more than one is bad practice, because the H1 communicates "this is the primary topical focus of this unique page".
Because of that, if you have headlines within the content area for content that lives elsewhere on the site, and you link to that other content, those headlines are absolutely best served with H2 tags - or, if not, then at the very least "strong" tags - if the topic of each target page is significantly different than the primary topic of the page they're all listed on.
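As a sketch of that structure (the page names and URLs are hypothetical), the single H1 carries the page's own topic, while titles of articles that live elsewhere use H2 and link out:

```html
<!-- The single H1 states this page's primary topical focus -->
<h1>Acme Widgets Home</h1>

<!-- Headlines for content that lives on other pages use H2, each linking out -->
<h2><a href="/blog/widget-maintenance/">Widget Maintenance Tips</a></h2>
<p>Teaser text for the article...</p>

<h2><a href="/blog/widget-history/">A Short History of Widgets</a></h2>
<p>Teaser text for the article...</p>
```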
-
RE: In the U.S., how can I stop the European version of my site from outranking the U.S. version? (posted in International SEO)
Have you set the different hreflang tags appropriately across your content?
You said "US" and "European" - so does that mean you have just one set of content for all of Europe? If so, that can be more difficult to deal with, however if you set all of the US pages with an hreflang of "en-us" and the European pages with an hreflang of en-gb, you can at least help Google understand "this set is for the U.S. and this set is not".
What I always recommend, if you're not targeting individual countries with your content (the "Europe" reference you made says you are not for that content), is to at the very least split the content out onto two different domains. Have a .com domain for US content, and a separate .eu or .co.uk or .de or whatever other domain for your European content. That, combined with hreflang tagging, goes further in communicating which content should show up higher in which country's search results.
You'll also need to accumulate inbound geo-relevant links to point to the appropriate content set to help reinforce this.
And if you split out domains, you can set country targeting more readily in Google Search Console.
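As a sketch, the hreflang annotations described above (URLs are hypothetical) would sit in the head of each page:

```html
<!-- In the head of the U.S. page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />

<!-- The same pair of tags must also appear on the U.K./European page:
     hreflang annotations are reciprocal, and one-sided tags are ignored. -->
```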
-
RE: What's the best possible URL structure for a local search engine? (posted in Intermediate & Advanced SEO)
In regard to shorter URLs:
The goal is to find a proper balance for your needs. You want to group things into sub-groups based on proper hierarchy, however you also don't want to go too deep if you don't have enough pages/individual listings deep down the chain.
So the Moz post you point to refers to that - at a certain point, having too many layers can be a problem. However, there is no one single correct answer.
The most important thing to be aware of and consider is your own research and evaluation process for your situation in your market.
However, as far as what you found most people search for, be aware that with location based search, many people don't actually type in a location when they are doing a search. Except Google DOES factor in the location when deciding what to present in results. So the location matters even though people don't always include it themselves.
The issue is not to become completely lost in making a decision either though - consider all the factors, make a business decision to move forward with what you come up with, and be consistent in applying that plan across the board.
What I mean in regard to URLs and Breadcrumbs:
If the URL is www.askme.com/delhi/saket/pizza/pizza-hut/ the breadcrumb should be:
Home > Delhi > Saket > Pizza > Pizza Hut
If the URL is www.askme.com/pizza-huts/saket-delhi/ the breadcrumb should be
Home > Pizza Hut > Saket-Delhi
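Either hierarchy can also be marked up so search engines read the same sequence the visitor sees. A minimal schema.org BreadcrumbList sketch for the first example (using the question's own domain and paths):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Delhi",
      "item": "https://www.askme.com/delhi/" },
    { "@type": "ListItem", "position": 2, "name": "Saket",
      "item": "https://www.askme.com/delhi/saket/" },
    { "@type": "ListItem", "position": 3, "name": "Pizza",
      "item": "https://www.askme.com/delhi/saket/pizza/" },
    { "@type": "ListItem", "position": 4, "name": "Pizza Hut",
      "item": "https://www.askme.com/delhi/saket/pizza/pizza-hut/" }
  ]
}
</script>
```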
-
RE: What's the best possible URL structure for a local search engine? (posted in Intermediate & Advanced SEO)
Proximity to root is not a valid best practice, especially in this instance.
Here's why:
More people search based on geo-location than actual business name when looking for location based businesses. So by putting "Pizza Hut" first, that contradicts this notion. It implies "more people look for Pizza Hut than the number of people looking for all the different businesses in this geo-location".
Also, by using the URL you suggest, that's blatant over-optimization - attempting to stuff exact match keywords into the URL. In reality, people use a very wide range of keyword variations, so that's another conflict that harms your overall focus needs.
All of the individual factors need to reinforce each other as much as is reasonable for human readability. So the URL and the breadcrumb should both follow the same sequence. If one has one sequence and the other has a different sequence, that confuses search algorithms.
-
RE: What's the best possible URL structure for a local search engine? (posted in Intermediate & Advanced SEO)
Local pack exists, yet is far from complete or consistently helpful. Business directories thrive even in an age of local packs. It's all about finding the best way to provide value, and the internet is large enough that many players can play in the game.
-
RE: What's the best possible URL structure for a local search engine? (posted in Intermediate & Advanced SEO)
Business listing directory environments have a big challenge when it comes to URL structure / information architecture and content organization because:
- Many businesses are searched for based on geo-location
- Many of those require hyper-local referencing, while many others can be "in the general vicinity"
- Many other businesses are not as relevant to geo-location
So what is a site to do?
The best path is to recognize that as mobile becomes more and more critical to searcher needs, hyper-local optimization becomes more critical. It becomes the most important focus for SEO.
As a result, URL structure needs to reflect hyper-local first and foremost. So:
- www.askme.com/delhi/
- www.askme.com/delhi/saket/
- www.askme.com/delhi/saket/pizza/
- www.askme.com/delhi/saket/pizza/pizza-hut/
This way, if someone searches for "Pizza Hut Delhi", all of the Delhi Pizza Huts will show up, regardless of neighborhood, while anyone searching for "Pizza Hut Saket" will get more micro-locally relevant results.
And for those businesses that serve a wider geo-area, even though they too will be assigned a hyper-local final destination page, they will still be related to their broader geo-area as well. So someone searching "plumbers in Delhi" will get the right results and can then choose any of the plumbers in Delhi regardless of what neighborhood they are in.
Note how I removed /search/ from the URL structure as well. It's an irrelevant level.
-
RE: How authentic is a dynamic footer from bots' perspective? (posted in White Hat / Black Hat SEO)
Nitin
You're dealing with multiple considerations and multiple issues in this setup.
First, it's a matter of link distribution. When you link to x pages from page 1, this informs search engines "we think these are important destination pages". If you change those links every day, or on every refresh, and if crawlers also encounter those changes, it's going to strain that communication.
This is something that happens naturally on news sites - news changes on a regular basis. So it's not completely invalid and alien to search algorithms to see or deal with. And thus it's not likely their systems would consider this black hat.
The scale and frequency of the changes is more of a concern because of that constantly changing link value distribution issue.
Either X cities are really "top" cities, or they are not.
Next, that link value distribution is further weakened by the sheer volume of links. 25 links per section, three sections - that's 75 links. Added to the links at the top of the page, the "scrolling" links in the main content area of the home page, and the actual "footer" links (black background), it dilutes link equity even further. (Think "going too thin" with too many links.)
On category pages it's "only" 50 links in two sub-footer sections. Yet the total number of links even on a category page is a concern.
And on category pages, all those links dilute the primary focus of any main category page. If a category page is "Cell Phone Accessories in Bangalore", then all of those links in the "Top Cities" section dilute the location. All the links in the "Trending Searches" section dilute the non-geo focus.
What we end up with here then is an attempt to "link to all the things". This is never a best practice strategy.
Best practice strategies require a refined experience across the board. Consistency of signals, combined with not over-straining link equity distribution, and combined with refined, non-diluted topical focus are the best path to the most success long-term.
So in the example of where I said initially that news sites change the actual links shown when new news comes along, the best news sites do that while not constantly changing the primary categories featured, and where the overwhelming majority of links on a single category page are not diluted with lots of links to other categories. Consistency is critical.
SO - while any one or a handful of these issues might not, on its own, be a critical flaw, the cumulative negative impact harms the site's ability to communicate a quality, consistent message.
The combined problem here then needs to be recognized as exponentially more problematic because of the scale of what you are doing across the entire site.
Best posts made by AlanBleiweiss
-
RE: Site Architecture: Cross Linking vs. Siloing (posted in Intermediate & Advanced SEO)
There's never one perfect solution, however here's the bigger issue. Some people hear "flat" and they take it to the extreme. Which is a terrible concept in 2011.
If you go too flat, you muddy up the proper group relationships. This is where Siloing comes in.
In my presentation at SMX Advanced this week, one of the many methods I recommend for "sustainable SEO" is to group your content, and reinforce that group relationship in URL structure, then with breadcrumbs, and finally with section-level navigation, where all the pages in that section have a link to all the other pages in that section, but where that specific sub-navigation is replaced or disappears as appropriate when you leave that section.
If you've got more than a handful of pages in a section, you should definitely go deeper.
The trick is knowing how wide, how deep to go. It's an art as much as a process studying site data over time.
Another factor is the competitive landscape for a particular niche market. The more competitive, the more important this concept becomes.
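A sketch of the section-level navigation described above (section and page names are hypothetical): the sub-navigation is rendered only on pages within that section, and disappears once the visitor leaves it.

```html
<!-- Shown on every page inside /luxury-cars/ and nowhere else,
     so all pages in the section link to all the others -->
<nav class="section-nav">
  <a href="/luxury-cars/bmw/">BMW</a>
  <a href="/luxury-cars/mercedes/">Mercedes</a>
  <a href="/luxury-cars/audi/">Audi</a>
</nav>
```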
-
RE: Meta Robot Tag: Index, Follow, Noodp, Noydir (posted in Intermediate & Advanced SEO)
I'll just add that Yahoo closed the Yahoo Directory down last year.
-
RE: Will multiple domains from the same company rank for the same keyword search? (posted in Intermediate & Advanced SEO)
Actually the correct deeper answer based on both Google policies and SEO best practices is as follows:
- It is directly against Google's terms of service to attempt to rank multiple sites for the same phrases.
- When you have more than one site that contains content that directly competes against any other site - whether it's a site you own or someone else owns, or even other content on your own site - Google's multi-algorithm system attempts to determine which site deserves the higher ranking for a particular phrase or search query. In that process, their system attempts to then determine whether any of those shouldn't even be indexed, let alone show up in search results.
- Based on these considerations, any of your content could quite possibly suffer from either a loss of position it should otherwise deserve, or even have some or all of its content deindexed. And in a worst case scenario, you could be penalized as well.
SO - the only issue then is this - WHY would you want multiple sites? Do any of the following reasons match your vision? If so, then you CAN have multiple sites IF they are done properly.
A) If you've got a big active brand, with a lot of customers/clients, it can help to create multiple sites often including:
- Corporate Site
- eCommerce Site
- Careers Site
- Community Site
- Charitable Giving Site
- Customer Support Site
B) If you have specific separate and quite distinctly different service or product offerings, you can create multiple sites so that the very different topical intent of each site is kept uniquely refined in that specific funnel and doesn't "pollute" or "dilute" the umbrella topical focus of each niche.
C) If you have an eCommerce site (where intent is online sales) you may have a desire to have a separate community or blog site (where intent is informational) as another way to keep the "intent" funnels cleanly separated.
NOTE:
It is VITAL that you understand the concept that when executed properly, multiple sites are very useful. However, these need to factor in the following:
1. Every site needs to be able to pass the "5 Super-Signals" test:
- Quality
- Uniqueness
- Authority
- Relevance
- Trust
In regard to the above, content needs to be truly unique across each site. While you can have similar content specific to your brand identity, and even some similarity around the umbrella topic of your product or service offerings, this needs to be done in a way that does not run afoul of the "multiple sites for ranking domination" prohibition, except as it relates to your brand (as opposed to generic non-brand product or service offerings).
2. Each site needs to have a LEGITIMATE business case reason for its existence not considering SEO - the "why this site exists" question needs to pass muster.
3. Every additional site you create requires its own consistent quality effort, as well as trustworthy off-site reinforcement. If a proper concerted effort cannot be maintained over the long-haul on multiple sites, it is much wiser to go with one single unified site.
-
RE: Where to put Schema On Page (posted in Technical SEO)
Always place schema markup directly in the position on the page where you want the content to appear if it's content specific - wrapping it around that content. So if your business name and address are in the main content area, that's where you place the schema code. It's literally a wrapper just like a CSS div would be, or an old-school HTML table, but not for display purposes on-site.
EDITED 11/14/2013 based on a question from Oliver (below) regarding situations where markup is located in the "head" area of the page:
Exceptions to "in-body" markup:
As is the case with any structured markup solution, there will, from time to time, be cases where certain, specific elements go in the "head" section of the code. Anything that applies to an individual page in its entirety, and does not limit itself to an element of content within the page does, in fact, belong in the "head" area of the page code.
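As a sketch of the in-body approach (the business details are hypothetical), schema.org microdata literally wraps the visible content where it appears on the page:

```html
<!-- The markup wraps the content in place; nothing changes visually -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Acme Consulting</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
</div>
```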
-
RE: Should I include a "|" for better page title SEO results? (posted in On-Page Optimization)
There's some disagreement in the industry as to whether the pipe symbol or hyphens are best - either way, one of these two would be recommended for readability purposes. This is especially valid when you've got more than one keyword phrase.
Product Name | Alternate Product Name | Company Name
-
RE: Numbers in URL (posted in Technical SEO)
Just offering my opinion. There is no such thing as "concrete proof" that can't be disproven in this case due to the complexity of SEO.
Every factor is just one among many. So a site that has "proper" URL syntax can easily and readily outrank and outperform a site that doesn't if enough individual factors across the whole spectrum are strong enough.
Conversely, a site that has numeric URL structure and "non-ideal" syntax can also easily outrank / outperform a site that has "proper" URL syntax, if it has enough strength from other factors to outweigh the "properly" structured URL site.
Anyone who has a case study claiming otherwise is not acknowledging how complex the reality of what we do is, and how any sub-group of signals can be so strong as to far outweigh any other sub-group of signals.
-
RE: Javascript, PhP and SEO Impact? (posted in Web Design)
JavaScript is one of several technologies that poses severe limitations on search engines' ability to properly see content and - just as important but often overlooked - to properly and cleanly evaluate that content from an SEO perspective.
Specific considerations:
- Google does a "fair" job at discovering content passed through JavaScript (either on-page or at the code level)
- A "fair" job means it's hit and miss as to whether their system can actually find that content
- Whatever content the Google system CAN find via JavaScript is NOT necessarily able to be used to properly evaluate content intent, focus or relationship to other content
So - the best practices recommendation is if you want/need content to be found and properly evaluated by Google (or Bing) do NOT pass it through JavaScript.
And also, if you want to HIDE content from Google, don't assume you can successfully do so via JavaScript either.
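For illustration, the kind of JavaScript-passed content the warning covers (element IDs and text are hypothetical) is anything that only exists in the page after a script runs:

```html
<!-- The crawler's initial HTML fetch sees only an empty div -->
<div id="product-details"></div>
<script>
  // This paragraph exists only after script execution; whether a
  // search engine ever discovers and evaluates it is hit and miss.
  document.getElementById('product-details').innerHTML =
    '<p>Hand-made widgets, shipped worldwide.</p>';
</script>
```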
As for PHP, it's the most widely adopted and utilized web programming language out there. The language by itself is essentially SEO-neutral. It's all in how a programmer utilizes PHP that matters. In the hands of a programmer who either truly understands SEO, or collaborates closely with an SEO expert (one who also understands the limitations/pitfalls that can arise with "bad", SEO-unfriendly PHP coding), it's a great language.
-
RE: How do you limit the number of keywords that will be researched (posted in Keyword Research)
Eric,
Unfortunately you're in a very difficult position. Personally, I would never proceed with a client who can't even define their own audience. They're quite likely going to change their mind often, and if you get involved this early on, will just as likely be overly demanding and play the "needy victim" role. It's a mess.
Having said that, if you insist that you absolutely must work with them, the best approach might be to choose four, five or six topics you think are appropriate, run them through the Google Keyword tool, export each result set and pass them an Excel spreadsheet with each result set being in a separate tab. Then, help them understand what the columns mean (competitive, search volume, etc) and let them chew on that data. Explain that you need them to decide which phrases to go with, but that you can help them refine it down a bit.
Be very careful though to not get sucked into an endless hours vortex!
-
RE: Text to HTML Ratio (posted in Content Development)
I'd caution against assuming it's meaningless - though the issue isn't the text-to-HTML ratio specifically. Instead, if a page is bloated with code at the source / crawler-bot level, that can definitely have a negative impact, because it can cause topical confusion - whether the code is essentially gibberish to the human eye, or JavaScript with a lot of text describing functions, calls and parameters.
I see such problems quite often on large sites that try to do a lot at the code level.
-
RE: Site Architecture: Cross Linking vs. Siloing (posted in Intermediate & Advanced SEO)
Well it depends. Is there only one BMW or are there several? If there is only one, then yes - cross link all the luxury detail pages. If there are several, then that's the level for cross linking detail pages, even though it's so deep. If that's the case though, you'd better get inbound links pointing to the parent luxury category page.
And in any regard, don't just have a bunch of links on those category pages - have descriptive paragraph content focused on that category's primary topical focus.
Going back to January of 1995, Alan Bleiweiss was a pioneer in the Internet as a marketing medium. Some of his most notable early clients include Publishers ClearingHouse, Weight Watchers International, Starkist Tuna, Writers Guild of America, Hill and Knowlton, Porter Novelli, Mechanic's Bank, and Princess Cruises. Alan pioneered online display advertising with the nation's first Business To Business Yellowbook site where he was the architect of that site's content management system and display ad business model.