Conformal prediction is a statistical technique that transforms the point predictions of a regression algorithm into prediction regions that contain the true target value with a chosen confidence. Normalization allows a conformal algorithm to account for the difficulty of each prediction by adjusting the size of the prediction regions: easy cases receive smaller regions, while difficult cases receive larger ones. This paper focuses on normalization with a residual model. We investigate how suitable different regression models are as residual models and how well they generate prediction regions, using two evaluation criteria for the assessment. The empirical study covers nine datasets and three confidence levels and examines the algorithms Random Forest, XGBoost, K-nearest neighbors, Linear Regression, and AdaBoost. The experiments show that residual models must not produce predictions lower than the lowest value in the training dataset. Furthermore, AdaBoost is the best-suited residual model in our experiments.
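The setup described above can be sketched as a minimal normalized split conformal regressor. This is an illustrative sketch, not the paper's exact experimental pipeline: it assumes scikit-learn, synthetic heteroscedastic data, the common nonconformity score |y − ŷ| / (σ̂ + β) with a small hypothetical smoothing constant β, and it clips negative residual-model outputs to zero in the spirit of the finding above.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, AdaBoostRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic heteroscedastic data: noise grows with x, so difficulty varies.
X = rng.uniform(0, 10, size=(2000, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1 + 0.05 * X[:, 0])

# Proper training set / calibration set split (split conformal prediction).
X_train, X_cal, y_train, y_cal = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Underlying point predictor.
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Residual model: trained to predict the absolute error of the point
# predictor.  (In practice the residuals are often estimated out-of-sample
# to avoid the optimism of in-sample errors; this sketch keeps it simple.)
train_resid = np.abs(y_train - model.predict(X_train))
resid_model = AdaBoostRegressor(random_state=0).fit(X_train, train_resid)

beta = 0.01  # small constant keeping the denominator away from zero

# Normalized nonconformity scores on the calibration set; negative
# difficulty estimates are clipped to zero.
sigma_cal = np.maximum(resid_model.predict(X_cal), 0)
scores = np.abs(y_cal - model.predict(X_cal)) / (sigma_cal + beta)

# Conformal quantile for 90% confidence.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction regions for new points: the half-width scales with the
# estimated difficulty, so easy cases get narrower regions.
X_new = np.array([[1.0], [9.0]])
sigma_new = np.maximum(resid_model.predict(X_new), 0)
half_width = q * (sigma_new + beta)
lower = model.predict(X_new) - half_width
upper = model.predict(X_new) + half_width
```

Under this construction the coverage guarantee holds marginally for any choice of residual model; what the residual model changes is how the total interval width is distributed across easy and hard cases.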