What will happen if all our website content has the date created amended to the migration date?
-
Hi,
We will be migrating all our website content soon to a new CMS and at the moment the
-
In my experience, the date created refers to the page itself as opposed to the entire site. I have migrated about 20 sites and this has never caused an issue. As long as your domain doesn't change, you should be OK. Domain age is based partly on when the domain was first registered, how long it is registered for, and whether or not it is an active site. New pages get added to sites all the time. The "date created" is simply another signal Google can use to know that there has been an update and there is new content to crawl.
-
Here's a Q&A that Matt Cutts published in October 2010. The specific question asked was "How does Google determine domain age?" In it, Matt explains that Google uses a variety of signals to determine domain (and page) age, such as when Google first saw a link to a domain or page, or when the domain or page was first crawled.
https://www.youtube.com/watch?v=-pnpg00FWJY
I think the point he's making is that Google doesn't rely on a single variable when making ranking decisions. A change to the DC.date.created will likely be noted, but the fact that your site was previously found and indexed will carry more weight.
I don't know that definitively. I did search Bill Slawski's site (seobythesea.com) to see if he had a patent-based opinion on this, but didn't find anything.
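One practical thing you can do before and after the migration is check exactly what date metadata your pages expose. Here's a minimal sketch using Python's standard-library HTML parser; the sample HTML and date values are illustrative, not from any real site:

```python
from html.parser import HTMLParser

class DCDateParser(HTMLParser):
    """Collect Dublin Core date metadata (e.g. DC.date.created) from <meta> tags."""
    def __init__(self):
        super().__init__()
        self.dates = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = attrs.get("name", "")
        # Match DC.date.created, DC.date.modified, etc.
        if name.lower().startswith("dc.date"):
            self.dates[name] = attrs.get("content", "")

# Illustrative page source; in practice you'd fetch your own page's HTML
html = """
<html><head>
<meta name="DC.date.created" content="2012-10-09">
<meta name="DC.date.modified" content="2014-03-01">
</head><body></body></html>
"""

parser = DCDateParser()
parser.feed(html)
print(parser.dates)
# {'DC.date.created': '2012-10-09', 'DC.date.modified': '2014-03-01'}
```

Running this against a page from the old CMS and the same page on the new CMS would show you exactly which date tags the migration changed.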
-
I have the very same issue. I'd like to hear from the Moz experts...
Related Questions
-
My WP website got attacked by malware & now site:www.example.ca shows about 43,000 indexed pages in Google.
Hi All, my WordPress website was attacked by malware last week, and it badly affected my indexed pages in Google. A typical site:example.ca search used to show about 130 indexed pages; now it shows about 43,000. I had my server company's tech support scan my site and clean the malware yesterday, but it still shows the same number of indexed pages in Google.
Has anybody ever experienced such a situation, and how did you fix it? Looking for help. Thanks. FILE HIT LIST:
{YARA}Spam_PHP_WPVCD_ContentInjection : /home/example/public_html/wp-includes/wp-tmp.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-includes/wp-vcd.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-content/themes/oceanwp.zip
{YARA}webshell_webshell_cnseay02_1 : /home/example2/public_html/content.php
{YARA}eval_post : /home/example2/public_html/wp-includes/63292236.php
{YARA}webshell_webshell_cnseay02_1 : /home/example3/public_html/content.php
{YARA}eval_post : /home/example4/public_html/wp-admin/28855846.php
{HEX}php.generic.malware.442 : /home/example5/public_html/wp-22.php
{HEX}php.generic.cav7.421 : /home/example5/public_html/SEUN.php
{HEX}php.generic.malware.442 : /home/example5/public_html/Webhook.php
Technical SEO | | Chophel0 -
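After a cleanup like the one in the hit list above, it's worth re-checking for reinfection, since wp-vcd-style malware commonly reinstalls itself. A rough sketch of a recheck that walks a web root for the eval/base64 patterns these payloads typically use; the patterns are illustrative assumptions and this is no substitute for a proper scanner:

```python
import os
import re

# Signatures commonly seen in injected PHP (illustrative, not exhaustive)
SUSPICIOUS = [
    re.compile(rb"eval\s*\(\s*base64_decode"),
    re.compile(rb"eval\s*\(\s*\$_(POST|GET|REQUEST)"),
]

def scan(root):
    """Return (path, pattern) pairs for PHP files matching a suspicious signature."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith(".php"):
                continue
            path = os.path.join(dirpath, name)
            try:
                data = open(path, "rb").read()
            except OSError:
                continue  # unreadable file; skip
            for pat in SUSPICIOUS:
                if pat.search(data):
                    hits.append((path, pat.pattern.decode()))
                    break  # one hit per file is enough
    return hits
```

On the Google side, once the spam pages return 404/410, the inflated index count should shrink on its own over subsequent crawls; you can speed it up with URL removal requests in Search Console.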
Setting Up A Website For Redirects
I've got an old defunct domain with a lot of backlinks to individual pages. I'd like to use these backlinks for link juice by redirecting them to individual pages on the new domain (both sites belong to the same company). What is the best way to set this up? I presume I need some kind of hosting & site, even if it's just a default WordPress install, which I can then use to set up the redirects? Would it be best done using an .htaccess file for 301 redirects, or some other way?
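On an Apache host this can usually be done entirely in the old domain's .htaccess, with no CMS needed at all: per-URL 301s for the pages that have backlinks, plus a catch-all. The domain and paths below are placeholders:

```apache
# .htaccess on the old (defunct) domain
RewriteEngine On

# Map specific old URLs that have backlinks to their new equivalents
RewriteRule ^old-page-1/?$ https://www.newdomain.com/new-page-1 [R=301,L]
RewriteRule ^old-page-2/?$ https://www.newdomain.com/new-page-2 [R=301,L]

# Anything unmapped falls through to the new site's homepage
RewriteRule ^ https://www.newdomain.com/ [R=301,L]
```

The `L` flag stops processing at the first matching rule, so the specific mappings must come before the catch-all.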
Technical SEO | | abisti20 -
Duplicate content
Hello mozzers, I have an unusual question. I've created a page that I'm fully aware is near-100% duplicate content. It quotes the law, so it can't be changed. The page is very linkable in my niche. Is there a way I can build quality links to it that benefit my overall website's DA (I'm not bothered about the linkable page itself ranking) without risking Panda/duplicate content issues? Thanks, Peter
Technical SEO | | peterm21 -
When will all of Google Maps be the same again?
As many of you are aware, the Pigeon update was applied only to the new Google Maps, resulting in very different search results for Google local businesses. When you search for a business on the old Google Maps, you get totally different results vs. the new Google Maps; some businesses have disappeared from the results completely. I have done my research and found that this is because the new algorithm was applied only to the new Maps. The new algorithm also doesn't apply to other countries. The reason I posted this topic is that I have noticed all the new Google Business listings I am verifying for my clients are being put on the old Google Maps and not the new one. They come up fine when searching from the old Maps, but not the new one. I understand Google has not rolled out Pigeon to all data centers, but why? Will Google eventually roll out the update to the old Maps? And since Google is adding businesses to the old Google Maps, what's the point of even adding new listings?
Technical SEO | | bajaseo0 -
Website not ranking but the blog is!
I am hoping someone might be able to help me; I am doing some work on a website. A new version of the site was recently launched, and since then rankings have plummeted and the new blog pages are ranking better! When the new version of the site went live, the domain changed to the non-www version, plus an incorrect robots.txt file was deployed, and we have never really been able to fully recover (both of these things were beyond my control!). The robots.txt file was corrected and some of the external links were changed to the non-www, but there is a 301 redirect in place, so changing to the non-www shouldn't have been the reason the site dropped out completely. Before the launch of the new website, the site was ranking on the front page of Google for a lot of relevant keywords such as outdoor blinds, outdoor blinds Perth, cafe blinds, patio blinds, etc. The quality of the links is pretty bad and I am attempting to remove them before doing a disavow of all the really bad quality links, but unless we were really unlucky I don't think it's the links right now that are causing the problem. I have run the site through numerous crawl tests, checked the robots.txt, and there are no messages in GWMT; the pages are indexed, but I have a feeling there is something wrong with the site that is stopping it from ranking well. If anyone could give me any insights I would be really grateful. I know the site could be better structured from a keyword/structure perspective, but the site was ranking fine before!
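For anyone verifying the same setup: the www-to-non-www 301 described above is typically a two-line mod_rewrite block in .htaccess on Apache (the host pattern below is generic, not this site's actual config):

```apache
RewriteEngine On
# Permanently redirect www.example.com/* to example.com/*, preserving the path
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [R=301,L]
```

It's worth confirming with a header checker that the old www URLs return a single 301 hop (not a chain or a 302), since a misconfigured redirect here can look exactly like the ranking drop described.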
Technical SEO | | Karen_Dauncey0 -
Is Google caching date same as crawling/indexing date?
If a site is cached on, say, 9 Oct 2012, doesn't that also mean Google crawled it on the same date? And indexed it on the same date?
Technical SEO | | Personnel_Concept0 -
Duplicate content handling.
Hi all, I have a site that has a great deal of duplicate content because my clients list the same content on a few of my competitors' sites. You can see an example of the page here: http://tinyurl.com/62wghs5 As you can see, the search results are on the right. A majority of these results will also appear on my competitors' sites. My homepage does not seem to want to pass link juice to these pages. Is it because of the high level of duplicate content, or because of the large number of links on the page? Would it be better to hide the content from the results in a nofollowed iframe to reduce the duplicate content's visibility, while at the same time increasing unique content with articles, guides, etc.? Or can the two exist together on a page and still allow link juice to be passed to the site? My PR is 3, but I can't seem to get any of my internal pages (except a couple of pages that appear in my navigation menu) to budge off the PR0 mark, even if they are only one click from the homepage.
Technical SEO | | Mulith0 -
Internal website search
Hi, I'd like to get dynamically generated internal website search pages indexed by Google. A mod_rewrite to make the URL strings friendlier might be one way, but as these pages are created on the fly and effectively don't exist until the search keywords are entered, is it even possible to index them? thanks
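The mod_rewrite half of this is straightforward if you do go that route: a friendly path is rewritten internally to the real search script, so crawlers only ever see the clean URL. A sketch, assuming an Apache host and a search script that takes a `q` parameter (both assumptions):

```apache
RewriteEngine On
# Internally map /search/blue-widgets to /search.php?q=blue-widgets
# (no R flag, so the URL in the browser doesn't change)
RewriteRule ^search/([^/]+)/?$ /search.php?q=$1 [L,QSA]
```

Note this only makes the URLs crawlable; Google still needs to discover them, which usually means linking to a curated set of search pages (or listing them in a sitemap) rather than expecting arbitrary on-the-fly queries to be indexed.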
Technical SEO | | richcowley0