Multiple kernel learning algorithms have been proposed to combine kernels in order to obtain a better similarity measure or to integrate feature representations coming from different data sources. Most previous research on such methods has focused on classification formulations, with few attempts addressing regression. We propose a fully conjugate Bayesian formulation and derive a deterministic variational approximation for single output regression. We then show that the proposed formulation can be extended to multiple output regression. We illustrate the effectiveness of our approach on a single output benchmark data set. Our framework outperforms previously reported results, achieving better generalization performance on two image recognition data sets with both single and multiple output formulations.
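To illustrate the basic idea of combining kernels for regression, a minimal sketch follows. This is not the Bayesian variational formulation proposed above; it uses fixed, hand-picked combination weights and plain kernel ridge regression, whereas a multiple kernel learning method would infer the weights (here, from the data). The kernels, weights, and regularization value are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=0.5):
    # Gaussian (RBF) kernel from squared Euclidean distances.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def linear_kernel(X1, X2):
    return X1 @ X2.T

def combined_kernel(X1, X2, weights=(0.5, 0.5)):
    # Convex combination of base kernels; in multiple kernel learning
    # these weights would themselves be learned from the data.
    return weights[0] * linear_kernel(X1, X2) + weights[1] * rbf_kernel(X1, X2)

def kernel_ridge_fit(K, y, lam=1e-2):
    # Dual coefficients alpha of kernel ridge regression:
    # alpha = (K + lam * I)^{-1} y
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

# Synthetic single output regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.5 * X[:, 0] + 0.1 * rng.standard_normal(60)

K = combined_kernel(X, X)
alpha = kernel_ridge_fit(K, y)
y_hat = K @ alpha
print(np.abs(y - y_hat).mean())  # mean absolute training error
```

With the two base kernels combined, the fit captures both the linear trend and the sinusoidal component, which neither kernel models well alone.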