>How do you protect your site from search engines that come to your server?
The best way is to keep the pages on your local drive until they are finished.
Second best: put a robots.txt file in the root directory of the site to exclude them (the filename must be lowercase).
The "Pragma" meta tags refer to caching by a visitor's browser or proxy, not search robots.
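For completeness, a Pragma tag looks like the sketch below. It goes in the page's HEAD and only tells HTTP/1.0-era caches not to store the page; robots ignore it entirely:

```html
<!-- In the page's HEAD: asks HTTP/1.0 caches not to store the page.
     This has no effect on search robots. -->
<meta http-equiv="Pragma" content="no-cache">
```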
Here's my robots.txt file; it keeps robots from indexing places I don't want them:
# = comment
User-agent: * = the directives that follow apply to ALL robots
Disallow = where they shouldn't go
# robots.txt for http://www.site.com/
User-agent: *
Disallow: /html/  # under construction
Disallow: /temp/  # temporary features
Disallow: /users/ # it's really wpoison, to catch address harvesters
                  # because they don't obey robots.txt
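If you want to sanity-check your robots.txt before uploading it, Python's standard library can parse it for you. A minimal sketch, using the example file above (the site URL is a placeholder):

```python
# Check which URLs a robots.txt blocks, using Python's stdlib parser.
from urllib.robotparser import RobotFileParser

# The rules above, with the placeholder site name.
rules = """\
# robots.txt for http://www.site.com/
User-agent: *
Disallow: /html/
Disallow: /temp/
Disallow: /users/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved robot may not fetch anything under /temp/ ...
print(rp.can_fetch("*", "http://www.site.com/temp/page.html"))  # False
# ... but the rest of the site is fair game.
print(rp.can_fetch("*", "http://www.site.com/index.html"))      # True
```

Note this only tells you what a *polite* robot will do; as the comment about wpoison says, address harvesters ignore robots.txt entirely.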