Just curious: if you buy Google AdWords, would that work as a trigger to reindex your site? It shouldn't, but then you never know.
-Vivek
 


From: MSU Network Administrators Group [mailto:[log in to unmask]] On Behalf Of Bill Park
Sent: Tuesday, July 03, 2007 10:31 AM
To: [log in to unmask]
Subject: Re: [MSUNAG] Googlebot & SEO

I'm typically giving Google a day to see new page changes. The crawl rate report shows at least 1 page a day, 5 on average, and 18 max. I know it will take them a while to get things indexed and that it's based on PageRank.

It indexed the old site fine. It was the switch to the WordPress backend that screwed things up: same URL, different content/structure.

MSN appears to be OK so long as you are searching for "MSU Plant Biology" or very similar.

Yahoo does OK as well; plantbiology.msu.edu will come up, but with the title "Michigan State University", which we'd rather be "MSU Plant Biology" or something similar.
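
If the title is coming from our own <title> tag, a quick Python check along these lines would confirm what the homepage actually sends (crude parsing, just a sketch; the URL is the one from this thread):

import re
import urllib.request

# Search engines usually display the page's own <title> tag,
# so print whatever the homepage currently serves.
with urllib.request.urlopen("http://plantbiology.msu.edu/") as resp:
    html = resp.read().decode("utf-8", errors="replace")

m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
print("Homepage <title>:", m.group(1).strip() if m else "(none found)")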

Google is my main concern, however.

Bill from Plant Biology.



On 7/3/07, Richard Wiggins <[log in to unmask]> wrote:
How much time are you giving Google between iterations?  For a site with a relatively low PageRank, it can take a while for Google to take note of your new site or changes to it.
 
Did Google index your old site OK?
 
How are MSN, Yahoo, and Ask treating you?
 
/rich

 
On 7/3/07, Bill Park <[log in to unmask]> wrote:
Hello All,

Recently we implemented a new site for the Plant Biology department, using WordPress as our CMS.  I installed the Google Sitemap Generator plugin for WordPress so that the site would get properly indexed by Google.  After getting many complaints about the site not being found in Google search results, I had the plugin rebuild the sitemap.  That's when I realized the permissions on the two .xml files were wrong and the files were not writable.  So I 777'd the two sitemap files the plugin uses, it rebuilt the sitemap, but I still had no luck with Plant Biology showing up for a search query.
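
For anyone curious, a rough Python sketch along these lines reproduces the permissions check (the document root and file names are placeholders, not necessarily what the plugin actually writes):

import os

DOCROOT = "/var/www/html"                          # placeholder document root
SITEMAP_FILES = ["sitemap.xml", "sitemap.xml.gz"]  # typical plugin output names

# Report each sitemap file's mode bits and whether the current user can write it.
for name in SITEMAP_FILES:
    path = os.path.join(DOCROOT, name)
    if not os.path.exists(path):
        print(name, "is missing")
        continue
    mode = oct(os.stat(path).st_mode & 0o777)
    print(name, "mode", mode, "writable by this user:", os.access(path, os.W_OK))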

Next I got on Google Webmaster Tools to verify the site.  I uploaded the HTML file as instructed by Google and the site verified without a problem.  I also manually submitted the sitemap to Google and let it verify that; it verified and reported roughly 278 submitted URLs (which seems about right).
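
To double-check that count, a small Python sketch like this (the sitemap URL is an assumption based on the plugin's usual location) fetches the sitemap we submitted and counts the URLs it lists:

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://plantbiology.msu.edu/sitemap.xml"  # assumed location

# Fetch the live sitemap and count its <url><loc> entries.
with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = root.findall(".//sm:url/sm:loc", ns)
print(len(locs), "URLs listed in", SITEMAP_URL)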

The next day I checked Webmaster Tools again to see what Googlebot was crawling, and it was reporting "robots.txt unreachable" and then postponing the crawl of our site.  Assuming robots.txt files were only for exclusions, I hadn't created one, so after seeing that error I made one.  The robots.txt file I created permits all agents to crawl every part of our site.  After creating it and checking back with Webmaster Tools multiple times, it looks as though Googlebot has made no further progress.
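
Since "unreachable" generally points at a fetch problem (server error, timeout, DNS) rather than anything in the file itself, a quick Python sketch like this (the robots.txt URL is an assumption) checks both that the file is actually being served and that it really does allow Googlebot:

import urllib.error
import urllib.request
import urllib.robotparser

ROBOTS_URL = "http://plantbiology.msu.edu/robots.txt"  # assumed location

# 1. Is the file served at all, and with what HTTP status?
try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        print("HTTP status:", resp.status)
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as err:
    print("HTTP error:", err.code)
except urllib.error.URLError as err:
    print("Could not reach robots.txt at all:", err.reason)

# 2. Does the file, as served, actually let Googlebot fetch the front page?
rp = urllib.robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()
print("Googlebot allowed on /:", rp.can_fetch("Googlebot", "http://plantbiology.msu.edu/"))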

I was curious whether anyone else has experienced anything like this, or could offer any insight into how I can get us on the map quickly with Google and the other search engines, given that our robots.txt file grants everything full access to the site.

Thanks.

Bill Park from Plant Biology.