The LHCb detector is undergoing a comprehensive upgrade for data taking in the LHC's Run 3, scheduled to begin in 2022. The increased data rate in Run 3 poses significant data-processing and data-handling challenges for the LHCb experiment. The offline computing and dataflow model is consequently also being upgraded to cope with a factor of 30 increase in data volume and the associated demand for user-data samples of ever-increasing size. Coordinating these efforts is the charge of the newly created Data Processing and Analysis (DPA) project. The DPA project is responsible for ensuring that the LHCb experiment can efficiently exploit the Run 3 data: it takes the data from the online system, performs central skimming and slimming (a process known as "Sprucing"), and subsequently produces analyst-level ntuples through a centrally managed production system (known as "Analysis Productions"), utilising improved analysis tools and infrastructure for continuous integration and validation. It is a multi-disciplinary project involving collaboration between computing experts, trigger experts, and physics-analysis experts. This talk will present the evolution of the data-processing model, followed by a review of the various activities of the DPA project. The associated computing, storage, and network requirements are also discussed.