I have received the following email:
I am contacting you on behalf of the Bing Search engine (http://www.bing.com/) in regards to your site [Your Site]
Our customers have alerted us that your website was partially absent from our results, and we have discovered that you are blocking our crawler, named BingBot, via a disallow directive in your robots.txt file:
We would be pleased if you could edit your robots.txt file to allow our crawler to fetch and index your content properly, which will in turn increase traffic to your site via our search results.
If this block were implemented as a response to an issue caused by our crawler, we would be happy to hear your feedback and ensure the issue was fixed.
I also invite you to register your site on Bing Webmaster Tools where you can configure your own settings, including hourly crawl control.
Please let me know should you have any questions.
Note: In case you are not the relevant person to receive this email, kindly forward it to the relevant person in your company, such as the website admin, webmaster, CTO, SEO manager, or tech support. Thank you.
A possible solution:
What Bing is saying is that all of our sites publish a robots.txt file with a disallow directive (the mechanism their email calls out) telling their crawler not to index them. BingBot sees that directive and doesn't try to index the site. This is a courtesy on both sides.
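For reference, a block of the kind Bing's email describes is only a couple of lines of robots.txt. Assuming a site-wide block of Bing's crawler (whose documented user-agent token is `bingbot`), it would look something like:

```
User-agent: bingbot
Disallow: /
```

Removing these lines, or narrowing the `Disallow` path, is all the email is asking for.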
As the majority of your site is behind a login screen, the robot couldn't reach most of it anyway, so the debate about whether or not to allow crawling is mostly moot. From our perspective, the block saves both sides the load of automated scripts repeatedly trying to access the site and failing.
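If you want to confirm locally what such a directive does before deciding, Python's standard `urllib.robotparser` can evaluate a robots.txt without any network access. This is a minimal sketch; the robots.txt content and the example.com URLs are assumptions for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the kind of block Bing's email describes.
robots_txt = """\
User-agent: bingbot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# BingBot is refused everywhere; crawlers without a matching rule are allowed.
print(parser.can_fetch("bingbot", "https://example.com/members/"))
print(parser.can_fetch("Googlebot", "https://example.com/members/"))
```

This prints `False` for `bingbot` and `True` for other crawlers, which is exactly the behaviour Bing is reporting.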
As such, this message can be ignored.