Read Sections 6.3 - 6.6
1. Ensemble systems are generally employed to increase the robustness of the recommendation system: combining several recommenders tends to reduce the impact of any single model's errors.
Section 6.3 2. Weighted Ensemble: We build various recommendation systems in parallel and combine their outputs by weighting and adding them. The weights are learned using Linear Regression or some form of Gradient Descent that minimises the error of the combined prediction.
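The weight-learning step can be sketched as follows. This is a minimal illustration, not the book's implementation: the held-out ratings and the two component predictors' outputs are made-up values, and the weights are fit by ordinary least squares (the linear-regression variant mentioned above).

```python
import numpy as np

# Hypothetical predictions from two component recommenders on a held-out set.
preds_a = np.array([3.5, 4.0, 2.0, 5.0, 3.0])
preds_b = np.array([3.0, 4.5, 2.5, 4.0, 3.5])
true_ratings = np.array([3.2, 4.2, 2.2, 4.6, 3.2])

# Stack the component outputs as columns and learn the combination weights
# by least squares, minimising the squared error of the weighted sum.
X = np.column_stack([preds_a, preds_b])
weights, *_ = np.linalg.lstsq(X, true_ratings, rcond=None)

# The ensemble prediction is the weighted sum of the component outputs.
ensemble = X @ weights
```

Because the weight vector (1, 0) is one feasible combination, the fitted ensemble can never do worse than either component on the data it was fit on.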
Section 6.4 3. Switching Ensemble: The idea here is to use a model M1 initially that addresses the cold-start problem, and later switch to a more accurate model as we gather more data. For example, one can start with a knowledge-based recommendation system and later switch to a collaborative recommender.
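The switching logic itself is simple. A minimal sketch, where the threshold, the two placeholder recommenders, and the data formats are all assumptions for illustration (a real system would plug in actual knowledge-based and collaborative models):

```python
MIN_RATINGS_FOR_CF = 5  # assumed switch-over threshold

def knowledge_based_recommend(user_profile):
    # Placeholder: recommend from explicitly stated preferences,
    # which works even with zero rating history.
    return sorted(user_profile.get("preferred_genres", []))

def collaborative_recommend(user_ratings):
    # Placeholder: return highly rated items, standing in for real
    # collaborative filtering over the rating matrix.
    return [item for item, rating in user_ratings.items() if rating >= 4]

def recommend(user_profile, user_ratings):
    # Switch based on how much rating data the user has accumulated.
    if len(user_ratings) < MIN_RATINGS_FOR_CF:
        return knowledge_based_recommend(user_profile)
    return collaborative_recommend(user_ratings)
```

A cold-start user with no ratings is routed to the knowledge-based model; once the user crosses the threshold, the collaborative model takes over.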
Section 6.5 4. Cascade Ensemble: The basic idea is successive refinement: each model improves upon the output of the previous one. Boosting (https://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html) is an example of such a method, where the error in one model's predictions is used to build a refined model that reduces that error.
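The residual-fitting idea behind boosting can be sketched as a tiny gradient-boosting loop. Everything here is illustrative: the 1-D data is made up, and each stage is a depth-1 regression stump fit to the error left by the stages before it, not any production boosting library.

```python
import numpy as np

def fit_stump(x, residual):
    # Fit a depth-1 stump: one threshold, one mean prediction per side.
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        sse = np.sum((residual - pred) ** 2)
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_mean, right_mean = best
    return lambda xs: np.where(xs <= t, left_mean, right_mean)

# Toy 1-D regression data (hypothetical feature and ratings).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.0, 1.2, 0.9, 3.0, 3.1, 2.9])

# Cascade: stage 0 predicts the global mean; each later stage fits the
# residual error of the ensemble so far and adds its correction.
prediction = np.full_like(y, y.mean())
for _ in range(3):
    residual = y - prediction          # error of the cascade so far
    stump = fit_stump(x, residual)
    prediction = prediction + stump(x)  # refine by correcting that error
```

Each pass reduces the squared error on the training data, which is exactly the "successive improvement" the cascade describes.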
Section 6.6 5. Feature Augmentation Ensemble: This is similar to stacking (https://machinelearningmastery.com/stacking-ensemble-machine-learning-with-python/), where the output of one model is used as an input feature for a subsequent model. Note that, unlike cascade models, the error in prediction is not considered: the second model consumes the first model's raw prediction, not its residual.
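The augmentation step can be sketched as follows. All of the data is hypothetical, and the level-1 model is an arbitrary threshold rule standing in for any recommender whose output we want to reuse; the point is only the mechanics of appending one model's prediction as a new feature column for the next model.

```python
import numpy as np

# Toy data: two raw features and a target rating (hypothetical values).
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0], [5.0, 5.0]])
y = np.array([2.0, 2.5, 6.0, 6.5, 9.0])

# Level-1 model: a simple threshold rule on the first feature (a stand-in
# for any recommender; its output is a per-row prediction).
low = X[:, 0] <= 2.0
level1_pred = np.where(low, y[low].mean(), y[~low].mean())

# Feature augmentation: append the level-1 prediction as an extra column
# and fit the level-2 model on the augmented matrix. Unlike the cascade,
# the level-2 model sees the prediction itself, not the residual error.
X_aug = np.column_stack([X, level1_pred])
w, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
final_pred = X_aug @ w
```

Since the augmented feature matrix contains all the original columns, the level-2 fit can only match or improve on a least-squares fit over the raw features alone.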