What do we mean by Calibrated Trust?
Calibrated trust becomes necessary when older, simpler forms of trust stop working. In AI and automation, the problem is not whether people trust a system. It is whether they trust it appropriately: neither surrendering judgment to the machine nor treating every output as poison. Trust has to match the system's actual capability.
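To make "trust matching capability" concrete, here is a minimal sketch, not taken from any particular system: one common way to check calibration is to bucket a system's outputs by its stated confidence and compare each bucket against observed accuracy. The names here (Prediction, calibration_report) and the synthetic data are purely illustrative assumptions.

```python
from dataclasses import dataclass
import random

@dataclass
class Prediction:
    confidence: float  # the system's stated confidence, in [0, 1]
    correct: bool      # whether the output turned out to be right

def calibration_report(preds: list[Prediction], bins: int = 5) -> None:
    """Bucket outputs by stated confidence and print observed accuracy.

    If the system is well calibrated, accuracy in each bucket tracks its
    confidence range, and trusting it at face value is appropriate. Large
    gaps signal that trust should be adjusted up or down in that range.
    """
    for b in range(bins):
        lo, hi = b / bins, (b + 1) / bins
        bucket = [p for p in preds if lo <= p.confidence < hi]
        if not bucket:
            continue
        accuracy = sum(p.correct for p in bucket) / len(bucket)
        print(f"claimed {lo:.1f}-{hi:.1f}: {len(bucket):4d} outputs, "
              f"observed accuracy {accuracy:.2f}")

random.seed(0)
# Hypothetical data for a systematically overconfident system: it is
# right only about 80% as often as its stated confidence would suggest.
preds = [Prediction(c := random.random(), random.random() < 0.8 * c)
         for _ in range(1000)]
calibration_report(preds)
```

Run on this synthetic data, the report shows observed accuracy falling below every confidence bucket: the signature of a system that asks for more trust than it has earned, and exactly the mismatch that calibrated trust is meant to correct.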