Is model X a viable candidate for physics "Beyond the Standard Model", given the combined weight of all currently known experimental data? If so, what limits exist on its parameter space? These simple questions are the first one would like to answer when examining a particular theoretical model. Answering them comprehensively, however, is a huge task: it requires a wide array of computational tools to produce theoretical predictions for a wide array of experiments, as well as smart algorithms to sample these predictions across high-dimensional theory parameter spaces in a feasible amount of CPU time while remaining statistically valid. Coordinating all these tasks in a flexible way, so that new data and new models can be analysed with new sampling and statistical techniques while making as few changes to the code base as possible, demands an extremely flexible and modular tool. GAMBIT is such a tool, as well as the name of the collaboration developing it, which comprises nearly 30 theorists and experimentalists from 15 institutions, 9 countries, and 8 experiments. In this talk I give an overview of the GAMBIT project, from its goals to the code and its present status.