Updated site with new URL structure - what should I expect to happen? Also it's showing PR 1 for my URLs in Open Site Explorer
-
Hi All,
We updated our website with a new URL structure. Apart from the root domain, every other page is showing up in Open Site Explorer with a page rank of 1. Although we only went live with this yesterday, I would have thought that the 301s etc. from the old URLs would be coming through and the PR would show?
I am not familiar with what to expect or what alarm bells I need to watch out for when doing this type of thing, although I would probably expect a small drop in traffic. I don't know what the norm is though, so any advice is greatly appreciated.
thanks
Pete
-
When you say page rank, do you mean Google's PageRank, or do you mean Moz's Page Authority from Open Site Explorer? If you mean Moz's Page Authority, that measure should go up, though I don't know how long that would take. You might need to wait until there is another crawl.
-
Many thanks, Gazzerman1.
Pete
-
Your 301s, if set up correctly, will work fairly quickly in Google, and you should see results for your main pages within days, sometimes hours, depending on the site's popularity/crawl rate.
PageRank has not been updated since last November, according to John Mueller at Google, and will not ever be updated again. It is still an internal metric used by Google, but the latest data is no longer available to the public. So your pages will likely never have visible PageRank again.
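If you want to sanity-check the redirects yourself rather than waiting on crawlers, a quick script can confirm that each old URL returns a 301 pointing at its new counterpart. Below is a minimal sketch in Python, with a hypothetical old-to-new URL mapping standing in for your real one.

```python
# Minimal 301-redirect checker. The redirect_map below is hypothetical;
# substitute your own old -> new URL mapping.
import requests

redirect_map = {
    "https://www.example.com/old-page": "https://www.example.com/new/old-page",
    "https://www.example.com/old-cat/widget": "https://www.example.com/widgets/widget",
}

for old_url, expected_new_url in redirect_map.items():
    # allow_redirects=False lets us inspect the redirect response itself
    # instead of following it to the final destination.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    if status == 301 and location == expected_new_url:
        print(f"OK   {old_url} -> {location}")
    else:
        print(f"FAIL {old_url}: status {status}, Location {location!r}")
```

Watch out for 302s in the output, since you want permanent (301) redirects for a permanent URL change.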
Related Questions
-
What's your proudest accomplishment in regards to SEO?
After many years in the industry, you come to realize a few things. One of the biggest pain points for us at Web Daytona was being able to give clients a quick keyword ranking cost estimation. After much trial and error, and relying on API data from one of the most reliable SEO software platforms in our industry, we were able to develop an SEO tool that allows us to quickly and accurately get the estimated cost for a given keyword(s) using multiple variables.

Most agencies can relate to that story. It's something my colleagues and I at Web Daytona have been through before. Finding the cost and amount of time needed to rank for a keyword is a time-consuming process. That's why it's a common practice to sell SEO packages of 5-10 keywords for about $1000-2000/month. The problem is not all keywords are equally valuable, and most clients know this. We constantly get questions from clients asking: "How much to rank for this specific keyword?" It's difficult to answer that question with a pricing model that treats the cost of ranking every keyword equally.

So is the answer to spend a lot more time doing tedious in-depth keyword research? If we did, we could give our clients more precise estimates. But given that a decent proposal can take as long as 2-5 hours to make, and agency life isn't exactly full of free time, that wouldn't be ideal.

That's when we asked a question: what if we could automate the research needed to find the cost of ranking keywords? We looked around for a tool that did, but we couldn't find it. Then we decided to make it ourselves. It wasn't going to be easy. But after running an SEO agency for over a decade, we knew we had the expertise to create a tool that wouldn't just be fast and reliable, it would also be precise.

Fast forward to today and we're proud to announce that the Keyword Cost Estimator is finally done. Now we're releasing it to the public so other agencies and businesses can use it too. You can see it for yourself here.
Local Website Optimization | | WebDaytona0 -
Should I avoid duplicate URL keywords?
I'm curious to know: can having a keyword repeat in the URL cause any penalties? For example: xyzroofing.com, xyzroofing.com/commercial-roofing, xyzroofing.com/roofing-repairs. My competitors with the highest rankings seem to be doing it without any trouble, but I'm wondering if there is a better way. Also, one of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search inquiries. How can this be straightened out?
Local Website Optimization | | Lyontups0 -
Remove URLs from App
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide (and an associated dedicated URL schema). Ex/ https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview

Mistake - The app, in its entirety, was indexed by Google in July 2018, which basically resulted in duplicate content penalties because the unique on-page content wasn't readable.

Partial Solution - We noindexed all app pages until we were able to implement a "pre-render" / HTML-readable solution with associated dynamic meta data for the Overview page in each market. We are now selectively reindexing only the free "Overview" pages that have unique data (with a nofollow on all other page links), but want to persist a noindex on all other pages because the data is not uniquely "readable" before subscribing. We have the technical server-side rules in place and working to ensure this selective indexing.

Question - How can we force Google to abandon the >300k cached URLs from the summer's failed deploy? Ex/ https://screencast.com/t/xPLR78IbOEao would lead you to a live URL such as this, which has limited value to the user: https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue (Note Google's cached SERPs also have an old URL structure, which we have since 301ed, because we also updated the page structure in October). Those pages are currently and will remain noindexed for the foreseeable future. Our sitemap and robots.txt file are up-to-date, but the old Search Console only has a temporary removal on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow to those links from the Overview page so a bot can get to them, and then it would see that there's a noindex on them and remove them from the SERPs? Thanks for your help and advice!
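One way to keep tabs on this while Google recrawls: audit the pages to confirm they actually serve a noindex directive, either as an X-Robots-Tag response header or as a meta robots tag in the rendered HTML. Below is a minimal sketch in Python; the URLs are placeholders modeled on the examples above, not the real app's.

```python
# Minimal noindex audit. The URLs are hypothetical placeholders;
# substitute the real app URLs to be checked.
import requests
from bs4 import BeautifulSoup

urls_to_audit = [
    "https://www.example.com/app/us/arizona/phoenix/revenue",
    "https://www.example.com/app/us/california/santa-monica/overview",
]

for url in urls_to_audit:
    response = requests.get(url, timeout=10)
    # The header form applies even to non-HTML resources.
    header_directive = response.headers.get("X-Robots-Tag", "")
    # The meta form only counts if it appears in HTML the bot can read.
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_directive = meta.get("content", "") if meta else ""
    noindexed = ("noindex" in header_directive.lower()
                 or "noindex" in meta_directive.lower())
    print(f"{url}: noindex={'yes' if noindexed else 'NO'} "
          f"(header={header_directive!r}, meta={meta_directive!r})")
```

Note that a page blocked by robots.txt can't be recrawled, so Google would never see the noindex; pages need to stay crawlable for the directive to take effect.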
Local Website Optimization | | Airbnb_data_geek1 -
If I am starting a new business, similar to my existing business...
Howdy Moz community, I hope you are enjoying the last days of summer as much as we are here in Toronto, Canada. I own an air duct cleaning business. I have done the web design as well as the SEO, and my website is currently ranking for quite a few keywords (some of them at the top of the SERPs), special thanks to Moz for their awesome tools and blog posts.

I am starting a mobile car detailing business. My duct cleaning domain is 5 years old with a DA of 42 and a PA of 40 (main page). Would it be better for me to just add pages to my existing website (despite the fact that both businesses are in a cleaning niche), or would it be better for me to start another website from scratch? Would it be a bonus for me in terms of my current DA to add pages to my existing website, like, for example: www.mywebsite.ca/Mobile-Auto-Detailing, or would I get penalized for it? I thank you all for answering my question. Alex
Local Website Optimization | | DustChasersToronto0 -
I have 5 sites, each targeting a different service my company offers. Should I keep them separate or consolidate them into one?
I run a photo booth company and have a site for each service I offer. Are smaller sites that are SEO'd for each service stronger than just having pages for each service on one mother site? Thanks,
Local Website Optimization | | hashtagltd0 -
How to approach SEO for a national website that has multiple chapter/location websites all under different URLs
We are currently working with a client who has one national site - let's call it CompanyName.net - and multiple, independent chapter sites listed under different URLs that are structured, for example, as CompanyNamechicago.org, and sometimes specific to neighborhoods, as in CompanyNamechicago.org/lakeview.org. The national umbrella site is .net, while all others are .orgs. These are not subdomains or subfolders, as far as we can tell. You can use a search function on the .net site to find a location near you and click through to that specific local website. They are looking for help optimizing and increasing traffic to certain landing pages on the .net site...but similar landing pages also exist on a local level, which appear to be competing with the national site. (Example: there is a landing page on the national .net umbrella site for a "dog safety" campaign they are doing, but that campaign has also led to a landing page created independently on the local CompanyNameChicago.org website, which seems to rank higher for a user looking for this info while located in Chicago.) We are wondering if our hands are tied here since they appear to be competing for traffic with all their localized sites, or if there are best practices to handle a situation like this. Thanks!
Local Website Optimization | | timfrick0 -
SEO for local business directory type site
I am thinking about creating a local business directory-type website that lists all local tattoo shops. I am familiar with both local and global SEO and how to differentiate between them; however, I am not sure how I should approach this type of website. It isn't an actual business, but I want to target local searches that are looking for tattoo shops. In other words, when someone types in "tattoo shops", "tattoo shops near me", or "tattoo parlors", I want the website to appear. Is this something that is manageable, or will the individual tattoo shop websites always show before mine since they are real local businesses with Google+ pages?
Local Website Optimization | | brfieger0 -
Best practices for 301 redirect to a new location website.
We just opened a new location in a nearby city. We were already servicing this location from our main base, and as such we had a special page for this location which ranked fairly well. The new location will have its own website. Would it be better to 301 redirect the current location page to the new location's website? Or should we simply link from the old page to the new location's website? Any best practices?
Local Website Optimization | | Vspeed0
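For what it's worth, if the 301 option is chosen, the redirect itself is straightforward. Here is a minimal sketch in Flask, with a hypothetical path and destination domain; the equivalent rule could just as easily live in the web server configuration.

```python
# Minimal single-page 301. The route and destination are hypothetical;
# substitute the old location page's path and the new site's URL.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/locations/nearby-city")
def old_location_page():
    # code=301 marks the move as permanent, which tells search engines
    # to transfer the old page's signals to the new destination.
    return redirect("https://www.new-location-site.example/", code=301)

if __name__ == "__main__":
    app.run()
```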