Traditional air-cooling and its associated heat sinks are approaching performance limits, demanding lower air-supply temperatures and higher air-supply flowrates to meet the rising thermal management requirements of high-power-density electronics. Switching from air-cooling to single-phase immersion cooling provides significant thermal performance and reliability benefits. When hardware designed for air-cooling is deployed in a single-phase immersion cooling regime, optimizing the heat sinks yields additional thermal performance improvements. In this study, we investigate a machine learning (ML) approach to building a predictive model for the multi-objective, multi-variable optimization of an air-cooled heat sink in single-phase immersion-cooled servers. Parametric studies are conducted via high-fidelity CFD simulations over design variables spanning both geometric and material properties, under both forced and natural convection: fin height, fin thickness, number of fins, and heat sink thermal conductivity. A databank of 864 points generated from these CFD simulations is used to train and evaluate ML algorithms that predict heat sink thermal resistance and the pressure drop across the heat sink. Three regression models (polynomial regression, random forest, and neural network) are compared on their accuracy in predicting thermal resistance and pressure drop as functions of the design inputs. This approach of using numerical simulations to build a databank for ML predictive models can be extended to thermal performance prediction and parameter optimization in other electronic thermal management applications, significantly reducing design lead time.
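The surrogate-modeling workflow described above (train three regressors on a CFD-generated databank, then compare predictive accuracy) can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the 864-point CFD databank is not reproduced here, so a synthetic placeholder function of the four design variables stands in for the simulated thermal resistance, and the variable ranges and model hyperparameters are assumptions.

```python
# Illustrative sketch: train and compare the three regression models named
# in the study (polynomial regression, random forest, neural network) as
# surrogates for CFD output. The dataset below is synthetic; the real
# study would load the 864-point CFD databank instead.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 864  # matches the size of the CFD-generated databank

# Four design variables (ranges are illustrative assumptions):
# fin height [mm], fin thickness [mm], fin count, conductivity [W/m-K]
X = np.column_stack([
    rng.uniform(10, 50, n),     # fin height
    rng.uniform(0.5, 3.0, n),   # fin thickness
    rng.integers(10, 40, n),    # number of fins
    rng.uniform(150, 400, n),   # heat sink thermal conductivity
])

# Placeholder "thermal resistance" target: decreases with fin area and
# conductivity, plus mild noise. Purely a stand-in for CFD results.
y = 1.0 / (1e-3 * X[:, 0] * X[:, 2] + 5e-3 * X[:, 3]) \
    + rng.normal(0.0, 0.01, n)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0)

models = {
    "polynomial": make_pipeline(
        PolynomialFeatures(degree=2), LinearRegression()),
    "random forest": RandomForestRegressor(
        n_estimators=200, random_state=0),
    "neural network": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 64),
                     max_iter=2000, random_state=0)),
}

scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = r2_score(y_te, model.predict(X_te))
    print(f"{name:>14s}  R^2 = {scores[name]:.3f}")
```

In the actual study, a second target (pressure drop across the heat sink) would be fitted the same way, and held-out accuracy on the CFD databank would drive the model comparison.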