Emergency Help...
-
Hello All,
I'm trying to get a better handle on this, but any help would be hugely appreciated. Per my Pro account, I just found out that the keyword I was trying so hard to rank for, "Boston Wedding Photographer", just declined by over 40 positions. Just last week I was in the #3 position.
Needless to say, this is extremely bad. I feel sick about it; this is my livelihood. I recently hired a 'so-called' SEO expert to look at it, but I'm having my doubts. I'm using a PHP-based site with a WordPress blog.
He added a bunch of 301 redirects to my .htaccess file for pages that the crawler was complaining about.
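For context, 301 redirect rules in an .htaccess file generally look something like the sketch below. These paths are made-up placeholders, not the actual rules the consultant added:

```apache
# Hypothetical examples -- replace the paths with your real old/new URLs.
# Simple one-to-one permanent redirect:
Redirect 301 /old-gallery.html /blog/wedding-galleries/

# Pattern-based redirect via mod_rewrite:
RewriteEngine On
RewriteRule ^portfolio/(.*)$ /blog/portfolio/$1 [R=301,L]
```

It's worth checking each rule with a header checker to confirm it returns a single 301 rather than a chain of redirects, since chains can leak link equity and slow crawling.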
He also installed the following plugins:
Link Juice Keeper
NoFollow Free
The SEO Rich Snippets
Udinra All Image Sitemap
WP Robots Txt
WP-PageNavi
Add Meta Tags
These are essentially the only changes made. Does anyone see anything glaring and/or obvious? I could really, really use some help.
My blog link is : http://www.symbolphoto.com/blog/
I'm assuming it's the blog because that's where most of my site content is located.
Any advice is hugely appreciated. TIA.
-
I agree with everything inhouseseo says above. Sound advice.
I've recently been looking at the wedding niche with a view to finding guest post opportunities. One thing that came to light is that there are a ton of blogs out there that love publishing series of photos from weddings.
I'm not sure where you stand with ownership of the photos and whether you have the right to publish without the consent of the 'models'. There's a great opportunity to assemble 10-15 shots, write some commentary, and then create a series of posts using 6-7 different shots from the collection and differing comments, and place them around various wedding blogs.
Myblogguest.com is a great place to start. I think this could be the cheapest/quickest way to get the links you're looking for.
Best of luck and I sincerely hope you get your old positions back.
Regards
Aran
PS: Oh, and in the meantime you might wanna hit Facebook/Twitter/Pinterest and post some photos. Connect with your target audience; it could drive some valuable traffic in, and it just might help you rank!
-
You don't have a lot of backlinks, and the ones you do have don't seem spammy: no overuse of keywords or spammy sites. I didn't go through all the top sites, but the very first result (which used to be only two spots above you) is much better optimized. They have a much higher domain authority, plus they have more than 10 times as many links (again, I didn't spend much time on the links, but the anchor text didn't seem spammy).
It seems that there were just some fluctuations in ranking factors, and in Google's opinion your site doesn't deserve the ranking it once had. You likely did not set off any sort of spam filter.
At this point, invest in some good content and try to make the site more useful.
BTW, when I searched for "boston wedding photographer" I didn't see any sponsored results. You might be able to drive some traffic without breaking the bank via PPC.
Related Questions
-
Can anyone help me figure out these sitelinks?
My company is Squatty Potty (yes, of magic unicorn fame) and I recently redid our website's navigation. We're currently overhauling it to rebuild the whole thing, but what is there should give Google a good idea of the site hierarchy, I would think. The funny thing is, when you Google [squatty potty website] we do have sitelinks, but when you Google just [squatty potty] we don't. Any ideas on why sitelinks would appear for one search but not the other? I see they appear with [squatty potty logo] as well. I can't figure out how to get them to appear for my brand-name search; any help appreciated!
Intermediate & Advanced SEO | DanDeceuster
-
URL Errors Help - 350K Page Not Founds in 22 days
Got a good one for you all this time... For our site, Google Search Console is reporting 436,758 "Page Not Found" errors within the Crawl Error report. This is an increase of 350,000 errors in just 22 days (on Sept 21 we had 87,000 errors, which had held essentially steady at that number for the previous 4 months or more). Then on August 22nd the errors jumped to 140,000, then climbed steadily from the 26th until the 31st, reaching 326,000 errors, and then climbed again slowly from Sept 2nd until today's 436K. Unfortunately I can only see the top 1,000 erroneous URLs in the console, which seem to be custom Google tracking URLs my team uses to track our pages. A few questions:
1. Is there any way to see the full list of 400K URLs Google is reporting it cannot find?
2. Should we be concerned at all about these?
3. Any other advice? Thanks in advance! C
Intermediate & Advanced SEO | usnseomoz
-
Help in Internal Links
Which link attribute should be given to a website's internal links: dofollow or nofollow, and why?
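For illustration, internal links pass link equity by default ("dofollow" is simply the absence of a nofollow attribute), so the distinction looks like this in markup (placeholder URLs):

```html
<!-- Default internal link: crawled and passes equity -->
<a href="/services/">Our Services</a>

<!-- Nofollowed internal link: hints that crawlers should not pass equity -->
<a href="/login/" rel="nofollow">Log in</a>
```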
Intermediate & Advanced SEO | Obbserv
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)
Noindex advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement: vehicle details pages are served using Ajax, so they have no <head> tag to hold a meta robots tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex directive based on querystring variables, similar to this Stack Overflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since they can't crawl/follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these (). Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
-
Help With This Page
This page - http://www.kempruge.com/location/tampa/tampa-personal-injury-legal-attorneys/ - is the most important one to my business, and I can't seem to get it to rank higher. It has the second-highest authority and links, second only to my homepage (though none are all that impressive), but it is just buried in the SERPs. Granted, I know Tampa Personal Injury Attorney is the hardest keyword for us to rank for, but there must be some way to improve this. I know getting high-quality links is an appropriate answer, but I'm looking for anything I can do solely on my end to improve it. However, if anyone has some ways to make the page more linkable, I'm all ears! Please, if you have a second to take a look, I'd appreciate any and all feedback. Thanks, Ruben
Intermediate & Advanced SEO | KempRugeLawGroup
-
I'm having an extremely hard time with SERPs despite my best efforts. Can someone help?
My site is www.drupalgeeks.org. Our traffic is going up but our SERPs are not. We simply don't rank for any of our targeted keywords. I have covered nearly every white hat SEO strategy possible. Our site has a great social presence (Facebook, Twitter, LinkedIn, Pinterest), we write blogs regularly, and we even guest blog. We have a YouTube channel and an RSS feed. We've cleaned up page speed times, set 301 redirects, and checked for duplicate content. We use Bing and Google webmaster tools and have submitted a sitemap. We are indexed, and webmaster tools see our keywords as relevant in our content. We have a robots.txt file configured properly.
The only thing I can think of is that our services pages also display (as a truncated summary) on our homepage. Could this be considered duplicate content, and is this causing a problem? Is there anything else we can do? Or are we missing something vital? We thank you in advance for your help! Candice
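On the homepage question specifically: if the homepage shows only truncated summaries that link through to the full service pages, a canonical tag on each full page makes the preferred URL unambiguous to search engines. A generic sketch with a placeholder URL, not the site's actual path:

```html
<!-- In the <head> of the full services page (placeholder URL): -->
<link rel="canonical" href="http://www.example.com/services/drupal-development" />
```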
Intermediate & Advanced SEO | candylotus
-
Splitting one Website into 2 Different New Websites with 301 redirects, help?
Here's the deal. My website stbands.com does fairly well. The only issue is that it is facing a long-term branding crisis. It sells custom products and sporting goods. We decided that we want to make a sporting goods website for the retail stuff and a custom site focusing only on the custom stuff: one website transformed and broken into two new ones, with two new brand names. The way we are thinking about doing this is with a lot of 301 redirects, but what do we do with the homepage (stbands.com), and what is the best practice to make sure we don't lose traffic to the categories, etc.? Which new website do we 301 the homepage to? It's rough because for some keywords we rank 3 or 4 times on the first page. Scary times, but something must be done for the long term. Any advice is greatly appreciated. Thank you in advance. We are set for a busy next few months 🙂
Intermediate & Advanced SEO | Hyrule
-
My ranking spiked then died in the butt? why? Please help!
Hi guys, I have been persevering with this ranking for some time now and thought you might be able to help, or direct me to where I can get help. I am learning a lot through SEOmoz but I am still very green. Basically, on the 20th of the 12th we jumped up to a 2nd place listing, and then dropped back down on the 17th of the 1st, 2011. The site is http://mlb.broomeaccommodation.com.au and the search term is 'Broome Accommodation'. As you can see it is a considerable drop, and it is really affecting our bookings and sales figures. I have attached a link to a screen capture of the problem: http://exitforward.com/kimberleyaccomm/seomoz.png Interested to hear your thoughts and get some help on this frustrating matter. Kind regards, Bodie Czeladka
Intermediate & Advanced SEO | Bodie