Two things, actually.
1) Is there a particular rule in the Admin Tools .htaccess options that would prevent verifying ownership of a site with Google Search Console? When I request the verification file via its URL I get a 403 error; with the .htaccess disabled, it works fine.
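For reference, would adding an exception along these lines be the right approach? This is just a sketch assuming Apache 2.4 syntax; "googleXXXXXXXXXXXXXXXX.html" is a placeholder for the actual verification filename Google assigns.

```apache
# Hypothetical exception for the Search Console verification file.
# Replace the placeholder with the real filename from Google.
<Files "googleXXXXXXXXXXXXXXXX.html">
    Require all granted
</Files>
```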
2) Is there any conflict between the user agents listed in the "User agents to block, one per line" option and using Allow/Disallow directives for Googlebot in a robots.txt file? If I put in something like:
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /
Will it bypass what's in the current user agent list in the .htaccess file and allow those bots to crawl the site?