ATLAS G4 pre-production tests

Just after Christmas we ran several pre-production tests (in view of our Data Challenge, due to start in April) in order to:
- demonstrate that we can comfortably run with G4 in a DC
- prove that a G4-based simulation is affordable (in terms of CPU/memory/disk)
- verify that the program is robust enough
- provide data samples that can be used for exploiting the ATLAS SW chain :)

The tests consisted of a few million single-particle events (over 6.4 units of rapidity) and ~35K full events (different physics channels) over 12 units of rapidity.

After initial setting-up problems due to:
- a couple of bugs in hadronic physics, which Hans-Peter & team promptly took care of
- a problem with the ATLAS geometry
the whole production went through smoothly and successfully (100% of the jobs completed). As of today we cannot quote a failure rate (not that we want to complain, mind you :)

The CPU needed was within a factor of 1.5 wrt G3, and so was the event size (which does not depend on G4 anyway). It goes without saying that we are extremely satisfied with G4 and how it performs.

For the immediate future, ATLAS priorities are:
- stable performance and robustness, especially as far as physics is concerned
- performance improvements, in particular for EM physics, which (we believe) would deserve a thorough optimization

ATLAS would like to thank the Geant4 collaboration for the support and the commitment that we could appreciate in the last few (for us hectic) months. Thanks folks!