We characterize the Bayesian transfer learning problem as one of conditioning on external stochastic knowledge, typically a partially or completely specified distribution. The knowledge is ‘external’ in that a joint probability model specifying the stochastic dependence on this knowledge is not available. Consequently, there is no unique distributional design via standard Bayesian conditioning for transferring this knowledge. In this presentation, we adopt normative Bayesian decision making for this distributional design problem, modelling the unknown distribution hierarchically. This leads to Boltzmann-type relaxations of the classical maximum-entropy and minimum-cross-entropy designs. Among the consequences are (i) randomized designs in place of deterministic distributional estimates, and (ii) a mean-field-type relaxation of Bayes’ rule for transferring external distributions. Applications in coupled Kalman filters, and in centralized deliberation for sensor networks, are briefly reviewed.
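To make the classical minimum-cross-entropy design mentioned above concrete, the following is a minimal illustrative sketch (not the Boltzmann-type relaxation developed in the talk): given a base pmf and external knowledge in the form of a moment constraint (a partially specified distribution), the minimizer of cross-entropy relative to the base is the exponentially tilted family, with the tilting parameter found here by bisection. All names and the discrete setup are illustrative assumptions, not from the presentation.

```python
import math

def min_cross_entropy_design(q, x, target_mean, tol=1e-10):
    """I-projection of the base pmf q (over support points x) onto the set
    of pmfs with mean equal to target_mean.  The solution has the tilted
    form p_i ∝ q_i * exp(lam * x_i); the mean is monotone increasing in
    lam, so lam can be found by bisection."""
    def mean_at(lam):
        w = [qi * math.exp(lam * xi) for qi, xi in zip(q, x)]
        z = sum(w)
        return sum(wi * xi for wi, xi in zip(w, x)) / z

    lo, hi = -50.0, 50.0  # bracket for the tilting parameter
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [qi * math.exp(lam * xi) for qi, xi in zip(q, x)]
    z = sum(w)
    return [wi / z for wi in w]

# Base (uniform) pmf on {0,1,2,3,4}; external knowledge: the mean is 3.0.
q = [0.2] * 5
x = [0.0, 1.0, 2.0, 3.0, 4.0]
p = min_cross_entropy_design(q, x, 3.0)
```

The resulting `p` sums to one and satisfies the external mean constraint exactly; the talk's Boltzmann-type relaxation instead treats such a design as uncertain and randomizes over it.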
Discovery Building, Orchard View Room