Aravind Joshi, Bonnie Webber, Ralph M. Weischedel
In cooperative man-machine interaction, it is necessary but not sufficient for a system to respond truthfully and informatively to a user's question. In particular, if the system has reason to believe that its planned response might mislead the user, then it must block that false conclusion by modifying its response. This paper focuses on identifying and avoiding potentially misleading responses by acknowledging the types of "informing behavior" usually expected of an expert. We attempt to give a formal account of several types of assertions that should be included, in addition to the simple answer, in a response to questions concerning the achievement of some goal, lest the questioner be misled.