The Compact Muon Solenoid (CMS) detector at the CERN Large Hadron Collider (LHC) is undergoing an extensive Phase II upgrade program to prepare for the challenging conditions of the High-Luminosity LHC (HL-LHC). To sustain the harsh conditions foreseen in Phase II, the CMS experiment has designed a novel endcap calorimeter that uses approximately 5.8 million radiation-tolerant silicon sensors. These sensors will sample the electromagnetic and hadronic particle showers across 47 layers with fine lateral granularity, providing 5D measurements of energy, position, and timing. In regions of sufficiently low radiation, small scintillator tiles with individual SiPM readout are employed instead. Developing a reconstruction sequence that fully exploits this granularity to achieve optimal electromagnetic and hadronic particle identification, as well as good energy resolution in the presence of pileup, is a challenging task. A modular iterative clustering framework (TICL) is being developed for this purpose. The framework starts from input clusters of energy deposited in individual calorimeter layers, delivered by an "imaging", density-based algorithm. In view of the expected pressure on computing capacity in the HL-LHC era, the algorithms and their data structures are being designed with GPUs in mind. In addition, machine learning techniques are being investigated and integrated into the reconstruction framework. This talk will describe the approaches being considered and discuss the first results.
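To illustrate the flavor of the "imaging", density-based layer clustering mentioned above, the following is a minimal toy sketch — not the CMS implementation — of density-peak clustering on one layer. All names, parameters (`dc`, `rho_seed`, `delta_seed`), and thresholds are illustrative assumptions; the real algorithm and its data layout differ, particularly in how it is structured for GPU execution.

```python
import math

def layer_clusters(hits, dc=1.3, rho_seed=2.0, delta_seed=2.0):
    """Toy density-based clustering of calorimeter hits on a single layer.

    hits: list of (x, y, energy) tuples. Returns a cluster label per hit
    (-1 for unassigned outliers). Parameter values are illustrative only.
    """
    n = len(hits)
    # 1) Local density: sum of energy within a cutoff radius dc of each hit.
    rho = [sum(e2 for (x2, y2, e2) in hits
               if math.hypot(x - x2, y - y2) <= dc)
           for (x, y, _e) in hits]
    # 2) For each hit, distance to the nearest hit with strictly higher density.
    nearest_higher = [-1] * n
    delta = [float("inf")] * n
    for i in range(n):
        for j in range(n):
            if rho[j] > rho[i]:
                d = math.hypot(hits[i][0] - hits[j][0],
                               hits[i][1] - hits[j][1])
                if d < delta[i]:
                    delta[i], nearest_higher[i] = d, j
    # 3) Seeds are dense hits far from any denser hit; other hits follow
    #    their nearest higher. Processing in decreasing density guarantees
    #    a follower's nearest higher is already labeled.
    label = [-1] * n
    next_label = 0
    for i in sorted(range(n), key=lambda k: -rho[k]):
        if rho[i] >= rho_seed and delta[i] >= delta_seed:
            label[i] = next_label
            next_label += 1
        elif nearest_higher[i] != -1:
            label[i] = label[nearest_higher[i]]
    return label
```

Because every hit is either a seed or chases its nearest denser neighbor, clusters grow outward from local density maxima without any iterative centroid updates — a property that also makes this style of algorithm amenable to parallelization.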