Hello All,

Recently we implemented a new site for the Plant Biology department.  We
used WordPress as our CMS.  I installed the Google Sitemap Generator plugin
for WordPress so that the site would get properly indexed by Google.  After
getting many complaints that the site was not showing up in Google search
results, I had the plugin rebuild the sitemap.  Once I did that I realized
the permissions on the 2 .xml files were not correct: they were not
writable.  So I 777'd those 2 sitemap files used by the plugin, and it
rebuilt the sitemap, but I still had no luck getting Plant Biology to show
up for a search query.
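For anyone wanting to reproduce the permissions step, here is a minimal
sketch of what was done, using Python's standard library.  The filenames and
the temporary directory are stand-ins for the plugin's actual sitemap files
in the WordPress root:

```python
# Sketch of the permissions change described above, applied to
# hypothetical sitemap filenames in a temporary directory.
import os
import stat
import tempfile

workdir = tempfile.mkdtemp()
for name in ("sitemap.xml", "sitemap.xml.gz"):  # stand-ins for the plugin's 2 files
    path = os.path.join(workdir, name)
    open(path, "w").close()                     # create an empty file
    os.chmod(path, 0o777)                       # world-writable, as in the post
    print(name, oct(stat.S_IMODE(os.stat(path).st_mode)))
```

Note that 777 is broader than necessary; making the files group-writable by
the web server's user (e.g. 664 with the right group owner) achieves the
same thing with less exposure.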

Next I got on Google Webmaster Tools to verify the site.  I uploaded the
HTML file as instructed by Google and the site verified flawlessly.  I then
manually submitted the sitemap to Google as well; it verified and reported
roughly 278 URLs submitted (which seems about right).

The next day I checked Webmaster Tools again to see what Googlebot was
crawling.  It was returning "robots.txt unreachable" and then postponing the
crawl of our site.  Assuming robots.txt files were for exclusions only, I
hadn't created one.  After seeing that error message I decided to make one.
The robots.txt file I created permits all agents to crawl all parts of the
site.  After creating the robots.txt file and checking back with Webmaster
Tools multiple times, it looks as though Googlebot has made no further
progress.
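For reference, an allow-everything robots.txt like the one described is just
an empty Disallow rule for all agents, placed in the site's document root.
Something like the following (the hostname in the Sitemap line is a
placeholder, not our actual URL):

```
User-agent: *
Disallow:

Sitemap: https://plantbiology.example.edu/sitemap.xml
```

The Sitemap line is optional but lets crawlers find the sitemap without a
manual submission.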

I was curious whether anyone else has experienced anything like this, or
could offer any insight into how I can get us on the map quickly with
Google and other search engines, given that the robots.txt file already
grants everything full access to the site.
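One sanity check that may help others in the same spot: Python's standard
`urllib.robotparser` can confirm that an allow-all robots.txt really does
permit Googlebot, which separates a syntax problem from a reachability
problem.  A minimal sketch, assuming the allow-all file described above:

```python
# Verify that an allow-everything robots.txt permits Googlebot, using
# the standard-library parser. The rules below mirror the allow-all
# file described in the post.
from urllib.robotparser import RobotFileParser

rules = "User-agent: *\nDisallow:\n"

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/any/page"))  # → True
```

If this prints True but Webmaster Tools still reports "robots.txt
unreachable", the problem is likely server-side (DNS, firewall, or the file
not actually being served at /robots.txt) rather than the file's contents.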

Thanks.

Bill Park from Plant Biology.