WEBVTT

00:00:15.000 --> 00:00:18.000
Everyone, welcome back.

00:00:18.000 --> 00:00:21.000
Hopefully you were able to get your coffee, or whatever it is that you needed.

00:00:21.000 --> 00:00:38.000
And so next we have, as Jim said before, some presentations and then discussion on the recent results of the dE/dx and multi-charged particle groups, where we even thought maybe a little bit of an excess, maybe not, we can talk about it.

00:00:38.000 --> 00:00:47.000
It is an excess. But what do we want to make of it? And so, the first speaker is up.

00:00:47.000 --> 00:00:48.000
Yeah.

00:00:48.000 --> 00:00:49.000
Yeah.

00:00:49.000 --> 00:00:54.000
Okay, let me just share.

00:00:54.000 --> 00:00:59.000
Okay, yeah, I can see your screen.

00:00:59.000 --> 00:01:01.000
Okay, can you see full screen now?

00:01:01.000 --> 00:01:07.000
I can see full screen, hopefully that will move forward.

00:01:07.000 --> 00:01:12.000
Okay, so I'll just start, if that's okay.

00:01:12.000 --> 00:01:31.000
Great, so hi everyone, I'll be presenting highlights from the pixel dE/dx analysis. This is with the full Run 2 data set. So this is a search for heavy long-lived charged particles with large ionization energy loss, using the ATLAS experiment, and excitingly

00:01:31.000 --> 00:01:41.000
our paper was recently submitted to JHEP, and here is the arXiv link.

00:01:41.000 --> 00:01:55.000
OK, so the basic strategy of this analysis is that it relies on the fact that the ionization energy loss, or dE/dx, of a charged particle traversing any material depends on its Lorentz beta gamma.

00:01:55.000 --> 00:02:08.000
So we can use this property to look for massive long-lived particles, and this is because they will travel more slowly than light Standard Model particles through our detector, if we fix the momentum.

00:02:08.000 --> 00:02:12.000
So if we have a momentum requirement.

00:02:12.000 --> 00:02:30.000
So, according to the Bethe-Bloch relationship, which is depicted in this plot on the left here, so this is dE/dx on the y axis and beta gamma on the x axis, slower particles will lie to the left of the curve, and thus have larger measured dE/dx.

00:02:30.000 --> 00:02:46.000
We exploit this property to find evidence for new physics, by looking for a track with large dE/dx, which also has high pT and is high quality.

00:02:46.000 --> 00:02:55.000
Okay, so because the Bethe-Bloch relationship that I just showed you governs all charged particles, this analysis is relatively model independent.

00:02:55.000 --> 00:03:03.000
We have broad sensitivity to charged, long-lived, massive particles with lifetimes from order a nanosecond up to stable.

00:03:03.000 --> 00:03:13.000
But we are specifically, in this analysis, interpreting on supersymmetry, and with the full Run 2 data set we're targeting long-lived gluinos, charginos, and sleptons.

00:03:13.000 --> 00:03:23.000
Note that for the gluinos here, what we're actually looking for are R-hadrons; they hadronize as they travel through the detector into these composite particles.

00:03:23.000 --> 00:03:34.000
And then we have gluinos, sleptons, and charginos, which sort of span a really broad mass range, and a variety of different cross sections.

00:03:34.000 --> 00:03:49.000
So when we're designing this analysis we have the philosophy, again, to be as model independent as possible, and we have to remain flexible and sensitive to a large range of masses and lifetimes.

00:03:49.000 --> 00:03:51.000
Okay, so how do we actually do this?

00:03:51.000 --> 00:04:01.000
Well, in order to look for a highly ionizing, high-pT track, we rely largely on the ATLAS inner tracker. This is a cartoon of the inner tracker here.

00:04:01.000 --> 00:04:13.000
And we use the full inner detector to measure the momentum, the transverse momentum, and the four innermost layers here are called the pixel system.

00:04:13.000 --> 00:04:25.000
And we use these specifically to measure the dE/dx per layer. One special thing about this is that the innermost layer of the pixel system is called the Insertable B-Layer, or the IBL.

00:04:25.000 --> 00:04:30.000
And this has different front-end electronics with respect to the rest of the pixel system.

00:04:30.000 --> 00:04:40.000
It has an overflow bit in the event of sufficient charge deposition, which helps us further tag highly ionizing particles.

00:04:40.000 --> 00:04:45.000
One extra thing also is that we can get a dE/dx

00:04:45.000 --> 00:04:46.000
hit per layer.

00:04:46.000 --> 00:05:01.000
And these hits, these clusters, actually sample a Landau distribution. So in order to convert this per-layer dE/dx into a track dE/dx, what we do is we apply a truncated mean algorithm.

00:05:01.000 --> 00:05:20.000
So in order to do this, what we do is, for each track, we order the dE/dx clusters by charge, we throw out a subset of the largest hits, and then we average the rest. And what we're trying to extract here is the MPV, or the most probable value, of the dE/dx measurement.

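The truncated-mean step just described can be sketched in a few lines of Python. This is an illustrative toy only, not the actual ATLAS implementation; the real truncation fraction and corrections are in the paper.

```python
# Toy sketch of the truncated-mean track dE/dx: order the per-cluster
# dE/dx values, drop the largest (the Landau tail), average the rest.
def truncated_mean_dedx(cluster_dedx, n_drop=1):
    if len(cluster_dedx) <= n_drop:
        raise ValueError("need more clusters than the number dropped")
    kept = sorted(cluster_dedx)[:len(cluster_dedx) - n_drop]
    return sum(kept) / len(kept)

# A track with one Landau-tail fluctuation: the large hit is discarded,
# so the result stays close to the most probable value.
print(truncated_mean_dedx([1.1, 1.2, 1.0, 4.9]))
```

Dropping only the largest hits (rather than symmetrically trimming) is what suppresses the one-sided Landau tail.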
00:05:20.000 --> 00:05:25.000
Okay, so now we have our track dE/dx and our track pT.

00:05:25.000 --> 00:05:33.000
And then one particularly nice aspect of this analysis is that, with those things, we can reconstruct the mass of any track that we see.

00:05:33.000 --> 00:05:46.000
So, m equals p over beta gamma, and dE/dx probes beta gamma, so we get that, sort of, for free.

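The mass reconstruction above is simple enough to sketch. Here the dE/dx-to-beta-gamma conversion is a hypothetical stand-in function, since the real calibration is derived by the analysis itself (next slides).

```python
# m = p / (betagamma): sketch only; dedx_to_betagamma stands in for the
# analysis-specific calibration, it is not the ATLAS parametrization.
def track_mass(p_gev, dedx, dedx_to_betagamma):
    return p_gev / dedx_to_betagamma(dedx)

# Toy calibration with the right qualitative shape (dE/dx rises as
# betagamma falls): larger dE/dx maps to smaller betagamma.
toy_calib = lambda dedx: (1.0 / dedx) ** 0.5

m = track_mass(1000.0, 4.0, toy_calib)  # betagamma = 0.5 -> m = 2000.0 GeV
```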
00:05:46.000 --> 00:06:03.000
Right. So, using the dE/dx measurement, however, requires a ton of custom work done by the team, specific to this analysis. So we have a unique set of calibrations and treatments of the dE/dx variable.

00:06:03.000 --> 00:06:16.000
So, for example, if you look at this plot on the upper left, you see dE/dx as a function of delivered integrated luminosity. And you also see three slices of eta, so probing different regions of the detector.

00:06:16.000 --> 00:06:31.000
And you see a very significant decrease of dE/dx as time passes, and you also see that through different eta slices the measured dE/dx is different. So we correct for these things on a run-by-run basis.

00:06:31.000 --> 00:06:35.000
And for these detector effects.

00:06:35.000 --> 00:06:45.000
We also see a discrepancy of dE/dx modeling in ATLAS Monte Carlo compared to data. So this is illustrated in the plot on the lower left.

00:06:45.000 --> 00:06:56.000
This is the dE/dx here, split into tracks with and without an overflow in the IBL, so that's represented by overflow zero and overflow one.

00:06:56.000 --> 00:07:02.000
And we see, for example, one discrepancy is very apparent in the tails.

00:07:02.000 --> 00:07:10.000
So for our signal Monte Carlo we actually use a data-driven dE/dx template to replace the dE/dx values here.

00:07:10.000 --> 00:07:21.000
And then finally, maybe the crux of our analysis relies on this dE/dx to beta gamma calibration. So instead of using anything analytic, like the Bethe-Bloch formula, we derive our own calibration.

00:07:21.000 --> 00:07:42.000
And the example plot from a calibration is this plot here. We have dE/dx as a function of beta gamma, and these points all come from dE/dx as measured by Standard Model particles in slices of momentum, using a very special data set.

00:07:42.000 --> 00:07:57.000
Okay, so we've done our dE/dx corrections and calibrations. So now we can move on to our event selection. So this is our event selection and track selection in a snapshot: we trigger on high missing ET, and we require that our events pass an offline missing

00:07:57.000 --> 00:08:01.000
ET cut of 170 GeV.

00:08:01.000 --> 00:08:09.000
Then after that we look for a high-momentum track that is central in the tracker and has a dE/dx greater than 1.8.

00:08:09.000 --> 00:08:17.000
We also impose a series of very important track quality requirements and vetoes to reduce Standard Model background.

00:08:17.000 --> 00:08:23.000
And these details can be found in the paper or in the backup.

00:08:23.000 --> 00:08:32.000
Finally, after our track passes our selection, we put it into a group called SR-Inclusive.

00:08:32.000 --> 00:08:40.000
And then we can categorize the track into six exclusive signal bins, according to hit pattern, dE/dx value, and muon information.

00:08:40.000 --> 00:08:52.000
And so we call these regions our exclusion regions, and they're designed to be more powerful than the regions I'll talk about next for excluding specific models.

00:08:52.000 --> 00:09:06.000
So, we also have a set of more inclusive regions, where instead we just divide by dE/dx value, so dE/dx between 1.8 and 2.4, or 2.4 and greater, for Inclusive-Low and Inclusive-High.

00:09:06.000 --> 00:09:27.000
These are less model dependent, they're easier for reinterpretation, and we call these our discovery regions. The sum of the discovery regions is exactly equivalent to the sum of the exclusion regions, and that's the SR-Inclusive.

00:09:27.000 --> 00:09:40.000
So, after applying our event selection, our final signal region plot that we make is the mass distribution of any candidate tracks. So what we need to do when we estimate our background is accurately predict the mass of our expected background in the

00:09:40.000 --> 00:09:56.000
signal region. Our background can consist of any Standard Model particle which will leave an isolated track, combined with statistical fluctuations in dE/dx following that Landau distribution that I showed you earlier.

00:09:56.000 --> 00:10:13.000
And then these tail events can pass, and we can get a track in our signal region. So this is extremely difficult to model in Monte Carlo, so we use a completely data-driven technique, designed to predict the expected mass distribution in the signal region.

00:10:13.000 --> 00:10:31.000
And in order to do this, what we do is we define two control regions adjacent to the signal region in phase space. So the kinematic control region, which is at low dE/dx, and the dE/dx control region, which is at low missing ET. We randomly draw

00:10:31.000 --> 00:10:38.000
momentum and dE/dx from the distributions in these regions, which we expect are representative of the distributions in the signal region.

00:10:38.000 --> 00:10:54.000
And we combine this p, or momentum, and dE/dx sample to define a toy track. And because we have p and dE/dx, we can calculate the mass of this toy track, and then repeat this order 10 million times.

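The toy-track procedure described here can be summarized in schematic Python. Only the sampling logic is sketched; the control-region samples and the dE/dx-to-beta-gamma conversion below are invented placeholders, not the ATLAS inputs.

```python
import random

# Schematic of the data-driven toy method: momentum drawn from the
# kinematic (low-dE/dx) control region, dE/dx drawn from the dE/dx
# (low-missing-ET) control region, each pair turned into a toy mass.
def toy_mass_template(momenta_cr, dedx_cr, dedx_to_betagamma,
                      n_toys=100_000, dedx_cut=1.8, seed=42):
    rng = random.Random(seed)
    masses = []
    for _ in range(n_toys):
        p = rng.choice(momenta_cr)   # momentum from kinematic CR
        dedx = rng.choice(dedx_cr)   # dE/dx from dE/dx CR
        if dedx > dedx_cut:          # apply the signal-region dE/dx cut
            masses.append(p / dedx_to_betagamma(dedx))
    return masses                    # histogram -> background template
```

In the real analysis this template is then normalized to a low-mass region in data, as described in the next cue.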
00:10:54.000 --> 00:11:07.000
Finally, after creating the expected mass distribution through this method, we normalize the distribution to a low-mass region in data and we apply our dE/dx cut, and now we have a

00:11:07.000 --> 00:11:14.000
background mass distribution in our signal region.

00:11:14.000 --> 00:11:27.000
Okay, so to give us confidence in the background estimation, we've checked the background estimation in two sets of validation regions. So one is defined at low pT relative to the signal region, and one is defined at high eta.

00:11:27.000 --> 00:11:33.000
So, we check the predicted and observed mass distributions.

00:11:33.000 --> 00:11:43.000
You can find this in the backup; these plots are just comparing the observed yield and the expected yield in these regions.

00:11:43.000 --> 00:12:01.000
So each region is subdivided using track and IBL information, and also dE/dx information, in a manner analogous to the signal region. And you see here the green represents the systematic uncertainty, and we see a good agreement between observed and predicted

00:12:01.000 --> 00:12:12.000
yields within the background uncertainty. So this gives us confidence in our background estimation method.

00:12:12.000 --> 00:12:16.000
Okay. So, finally, here are the results.

00:12:16.000 --> 00:12:32.000
These are the predicted and observed mass distributions in the two discovery regions, Inclusive-Low on the left and Inclusive-High on the right. So the predicted distribution is in this dark blue with uncertainty in the shaded purple, and the observed

00:12:32.000 --> 00:12:43.000
data is represented by the black points. And then there are a few signal samples overlaid, which gives you a sort of sense of our mass resolution as a function of mass.

00:12:43.000 --> 00:12:51.000
And as you can see in SR-Inclusive-Low, the data and the predicted distributions agree extremely well.

00:12:51.000 --> 00:13:04.000
In the Inclusive-High region, however, you can see some excess.

00:13:04.000 --> 00:13:24.000
Okay, so how do we quantify this excess? Before unblinding, what we actually do is define a set of mass windows in which we cut and count the events, so mini signal bins basically. These mass windows are optimized to each target particle mass

00:13:24.000 --> 00:13:28.000
and lifetime.

00:13:28.000 --> 00:13:33.000
The mass window definitions for a long lifetime are shown in this upper right plot.

00:13:33.000 --> 00:13:39.000
You can see the extent of the mass windows, which is represented by this purple line.

00:13:39.000 --> 00:13:44.000
And after unblinding we calculate the p-value in each mass window.

00:13:44.000 --> 00:14:04.000
So the p-values for SR-Inclusive-High for these different mass windows, subdivided between short and long lifetimes, are shown here. And you see that the region with the smallest p-value spans from 1.8 to 2.8 TeV in SR-Inclusive-

00:14:04.000 --> 00:14:13.000
High, that's the highest deviation, and we see seven events and expect 0.7 plus or minus 0.4 events here.

00:14:13.000 --> 00:14:24.000
The local significance of this is 3.6 sigma, and when you account for the look-elsewhere effect it's 3.3 sigma.

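As a rough cross-check of the quoted numbers (not the collaboration's statistical machinery), one can marginalize the Poisson tail probability for seeing at least 7 events over the 0.7 plus or minus 0.4 background uncertainty:

```python
import math, random, statistics

def poisson_tail(n_obs, b):
    """P(N >= n_obs) for a Poisson mean b."""
    return 1.0 - sum(math.exp(-b) * b**k / math.factorial(k)
                     for k in range(n_obs))

def local_p_value(n_obs=7, b=0.7, sigma_b=0.4, n_toys=200_000, seed=0):
    # Treat the background uncertainty as a Gaussian nuisance,
    # truncated at zero, and average the Poisson tail over it.
    rng = random.Random(seed)
    tails = []
    while len(tails) < n_toys:
        bt = rng.gauss(b, sigma_b)
        if bt > 0:
            tails.append(poisson_tail(n_obs, bt))
    return sum(tails) / len(tails)

p = local_p_value()
z = statistics.NormalDist().inv_cdf(1.0 - p)  # one-sided Gaussian sigma
# z comes out in the ballpark of the quoted 3.6 sigma (local)
```

This back-of-the-envelope model reproduces the quoted local significance only approximately; the analysis itself uses a full toy-based procedure.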
00:14:24.000 --> 00:14:42.000
Okay, so let's examine these excess events a little bit more. So here's a plot of the data and the expected background in our SR-Inclusive region, so including low and high. dE/dx is on the y axis, momentum is on the x axis, the blue represents our expected

00:14:42.000 --> 00:14:56.000
background distribution, and the red points are data; here in the box are the seven excess tracks. And just to give you a sense of the topology of these events, six out of seven have a jet back-to-back with the signal track.

00:14:56.000 --> 00:15:03.000
Five out of seven are matched to muons, and three out of seven have two muons in the event.

00:15:03.000 --> 00:15:10.000
So these tracks were also systematically examined for evidence of detector effects, such as anomalous pixel clusters,

00:15:10.000 --> 00:15:17.000
poor isolation, things like pileup, other systematic effects, and none were found.

00:15:17.000 --> 00:15:20.000
We did do an extra cross check.

00:15:20.000 --> 00:15:34.000
So we looked at the available calorimeter and muon timing information, so time-of-flight measurements here, and evidence of these tracks coming from slower particles, as suggested by their large dE/dx, was not confirmed.

00:15:34.000 --> 00:15:44.000
So these time-of-flight measurements were very consistent with beta of one, the speed of light.

00:15:44.000 --> 00:15:59.000
Okay, so with the observed data, even with the excess, we can still set limits on the SUSY models that I previously mentioned. So here are the limits for gluinos, charginos, and staus, in sort of clockwise fashion.

00:15:59.000 --> 00:16:11.000
We calculate the limits here using toy experiments, and we use the exclusion regions, not the discovery regions, so those finer-binned regions that I mentioned, and we do a multi-bin fit over all of them.

00:16:11.000 --> 00:16:34.000
So we end up excluding, for example, stable gluinos up to around 2.1 TeV and metastable ones to a maximum around 2.3 TeV. Charginos are excluded up to around 1.05 TeV for the middle lifetime, the 30-nanosecond lifetime, and staus are very difficult

00:16:34.000 --> 00:16:50.000
to search for, but we do manage to exclude the parameter space between around 200 and 360 GeV at a lifetime of 10 nanoseconds.

00:16:50.000 --> 00:17:01.000
Okay, that's it. So, in summary, a new search was conducted, the pixel dE/dx analysis, to look for heavy long-lived charged particles in proton-proton collisions.

00:17:01.000 --> 00:17:06.000
Many analysis improvements were made from the previous search with 36 inverse femtobarns.

00:17:06.000 --> 00:17:24.000
These are just a few, but we made improvements to the dE/dx calibration and modeling, the data-driven template we employ at high dE/dx especially. The IBL overflow categorization information is new, so that really helps us discriminate signal versus background.

00:17:24.000 --> 00:17:29.000
We do a multi-bin fit this time, and we also improved our track quality cuts.

00:17:29.000 --> 00:17:38.000
We added a new validation region and a more complete systematic uncertainty estimation to give us more confidence in our background estimation.

00:17:38.000 --> 00:17:43.000
We set competitive limits on gluino, chargino, and stau models.

00:17:43.000 --> 00:17:58.000
And we did see an excess, and cross checks were conducted, but the existence of a slow particle suggested by the excess was not confirmed using time-of-flight measurements from the muon and calorimeter systems.

00:17:58.000 --> 00:18:00.000
Yeah, that's it.

00:18:00.000 --> 00:18:02.000
Are there any questions?

00:18:02.000 --> 00:18:19.000
Um, thanks a lot. And in fact, we will have two discussion sessions kind of coming up soon. Um, one on the experimental aspects of this search and the multi-charged particle search, which is the next presentation, and then another one after the presentation

00:18:19.000 --> 00:18:23.000
by Daniele on the phenomenological aspects of this.

00:18:23.000 --> 00:18:31.000
So, if you have a question that you want to ask before we then talk about multi-charged particles and the overlap of those two searches,

00:18:31.000 --> 00:18:47.000
then can you keep your hand up now, and if you think it will be more appropriate for the discussion sessions that we have later, can you put it down for the moment, and then we can maybe just start to see which is the right time.

00:18:47.000 --> 00:19:08.000
Okay, so Michael. Thank you, nice talk. It seems you assume that the particles have charge one. And as you know, there are also light nuclei produced in pp collisions, even helium-3 and helium-4 maybe, we have had a few.

00:19:08.000 --> 00:19:10.000
Have you considered this?

00:19:10.000 --> 00:19:20.000
Well, one question is whether this analysis could actually show light nuclei being produced, or show that this is not an issue here.

00:19:20.000 --> 00:19:41.000
Actually, Michael, this is in fact the point I was going to make before, that right after this we have a presentation on multi-charged particles, which looks at heavy charged particles with charges from two to seven, which is relevant here.

00:19:41.000 --> 00:19:42.000
Now, if people want to ask questions that are just specifically about the dE/dx, fine, but I think this one would want to involve also the presentation that will be in a few minutes.

00:19:42.000 --> 00:19:52.000
We structured things in this way so that afterwards we'll have a discussion session on kind of the overlap of these two analyses.

00:19:52.000 --> 00:19:53.000
Okay, thank you.

00:19:53.000 --> 00:19:58.000
Thanks. Matt, do you think your question is better for now?

00:19:58.000 --> 00:20:15.000
Oh yes, I think so. This has to do with the issue of the cross checks that you made on the tracks. You mentioned that you checked for certain quality things, and maybe you could put that slide back up quickly, but those particular tests, do they usually fail

00:20:15.000 --> 00:20:24.000
when you're looking at control region tracks that have an exceptionally high value of dE/dx, way out on the tail?

00:20:24.000 --> 00:20:33.000
Um, so when you say cross checks, do you mean something specific? I don't mean the beta test, I mean all the others.

00:20:33.000 --> 00:20:48.000
Go back to that slide. That's it, yeah: the pixel clusters, the poor isolation, and all those things. Do those tests tend to fail when you have a control region track with a large amount of dE/dx?

00:20:48.000 --> 00:20:50.000
Yeah.

00:20:50.000 --> 00:20:58.000
I'm not sure if we have specifically

00:20:58.000 --> 00:21:21.000
checked for that. So I do know that, for example, when we invert a lot of our track quality cuts, so things like split and shared hits and isolation, we actually get very, very few control region events per cut which pass, or additionally pass in

00:21:21.000 --> 00:21:29.000
to our control region, so I think that our quality cuts here are doing a really, really good job.

00:21:29.000 --> 00:21:34.000
Okay. Thanks.

00:21:34.000 --> 00:21:36.000
Okay. Thanks, Matt.

00:21:36.000 --> 00:21:51.000
And then, I think we can have one or two more questions now and maybe save the rest for after the next talk, considering the time. So, um, do you think your question is for now? Be honest.

00:21:51.000 --> 00:22:10.000
Okay, I think so. So my question is just on slide 12, you show these limits, and for the gluino in particular, they are much stronger, or a significant amount stronger, for the metastable case than for the detector-stable case, and I've

00:22:10.000 --> 00:22:28.000
seen this in various analyses, so this seems to be a common thing. I wonder why, because from just the dE/dx I wouldn't think that the detector-stable case would be less sensitive. Could you comment on that?

00:22:28.000 --> 00:22:45.000
Yeah, it has to do with our other cuts, so mainly our trigger and our missing ET cuts. So for example, if you have a metastable signal, you'll be able to see its decay products in the missing ET, the online missing ET, which is what we trigger on.

00:22:45.000 --> 00:22:55.000
Oh, okay. Yeah, whereas for detector-stable tracks, you don't see those decay products, so they leave very little missing energy, or energy in the calorimeter, so it's difficult to trigger.

00:22:55.000 --> 00:23:01.000
So this really comes from the trigger. Yeah. Okay, thank you. Thank you.

00:23:01.000 --> 00:23:04.000
Okay, thank you.

00:23:04.000 --> 00:23:06.000
So, Christopher.

00:23:06.000 --> 00:23:21.000
Hi. My question is about the use of the average quantity to make a dE/dx metric; you actually lose some information.

00:23:21.000 --> 00:23:40.000
Because the probability to have a large dE/dx deposit is not uniform radially, in general. It might be for an ideal track, but, you know, you might have a much higher probability of having a large dE/dx on maybe the first layer or something.

00:23:40.000 --> 00:23:47.000
So do you look at, for those seven tracks,

00:23:47.000 --> 00:23:57.000
Have you unfolded that average, and can you show the charge depositions per layer?

00:23:57.000 --> 00:24:06.000
We have looked at the charge deposition per layer, and it doesn't look to be problematic, if that's what you're asking.

00:24:06.000 --> 00:24:11.000
And actually, the truncated mean algorithm is a really robust algorithm.

00:24:11.000 --> 00:24:24.000
It's very powerful, and I know there have been studies using, for example, a fit instead, without throwing out information, but it's pretty comparable to methods such as that.

00:24:24.000 --> 00:24:38.000
But it does make the assumption that each layer is equally likely to produce a high hit, you know, that all layers are following the same Landau; it's sort of implicit in that. But okay, if you looked at it, it would be helpful to publish that

00:24:38.000 --> 00:24:53.000
information, or if ATLAS has looked at it. Because for all the layers, I mean, it depends on the thickness of course, but it would be a similar measurement, because the beta gamma doesn't change as it passes through the layers.

00:24:53.000 --> 00:25:09.000
Yeah, that's what I meant by an ideal track. But when you have tracks that are not, really, you know, for example they pick up a hit from pileup in the first layer and falsely attach that, but not so much that it pulls

00:25:09.000 --> 00:25:25.000
the chi-square such that the track is rejected by your track quality cuts. I mean, this is something we've seen in other experiments before, where you can pick up a hit; it's wrong just to assume it's a perfect track, and

00:25:25.000 --> 00:25:29.000
then everything is

00:25:29.000 --> 00:25:31.000
equally.

00:25:31.000 --> 00:25:46.000
Or if you could compensate for this with the track quality cuts. So it's similar to the question from Matt, but I didn't think that you really answered that either, and maybe you haven't answered my question either. But the first layer is a worrying

00:25:46.000 --> 00:25:48.000
one.

00:25:48.000 --> 00:25:58.000
So the probability of having another track nearby is not higher for the first layer than for the other layers.

00:25:58.000 --> 00:26:14.000
But the first layer is this IBL layer, right? Yeah. So, yeah, it would be interesting to hear a little bit more how that information is used, but maybe my comment would apply to the second instead of the first layer, but it

00:26:14.000 --> 00:26:22.000
because you don't have, what kind of readout do you have out of the IBL? It's a limited range, right?

00:26:22.000 --> 00:26:24.000
Yeah.

00:26:24.000 --> 00:26:37.000
So you just rely mostly on this overflow bit for that one, so that's not going to give you too much information. But for example, you said you looked at this distribution and it wasn't anomalous, like they all have high hits on the,

00:26:37.000 --> 00:26:47.000
on the IBL or... Is that how you, I mean, I guess you don't have the information handy, but they all follow expected distributions?

00:26:47.000 --> 00:26:55.000
Yeah, I would say they don't deviate from expected distributions, and they're not all tracks with IBL hits, if that's what you're asking, as well.

00:26:55.000 --> 00:27:08.000
With regards to track quality, I mean, we check things like pileup and isolation, so they don't look like what you're suggesting with pileup. Okay. Thank you.

00:27:08.000 --> 00:27:18.000
Okay, thanks Christopher. I think, for the remaining hands, maybe we can save these questions for the discussion session that we have coming up pretty soon.

00:27:18.000 --> 00:27:26.000
And maybe for the moment, we can move on to Gary's presentation on the very recent results on heavy multi-charged particles.

00:27:26.000 --> 00:27:32.000
Yes, hi, so I'm going to share the slides now.

00:27:32.000 --> 00:27:35.000
I think you can see them, right?

00:27:35.000 --> 00:27:48.000
Thank you. Okay, great. So, this is a search for heavy long-lived multi-charged particles in the full Run 2 data with the ATLAS detector.

00:27:48.000 --> 00:28:00.000
We are searching for muon-like particles with charges from two to seven, based on the ionization losses in three, actually three, ATLAS sub-detectors.

00:28:00.000 --> 00:28:15.000
This is generally a blue-sky search, but there are some models that in fact predict new particles with such charges: the almost-commutative geometry model, the walking technicolor model, and the left-right symmetric model. And any observation of such

00:28:15.000 --> 00:28:21.000
particles would be evidence of physics beyond the Standard Model.

00:28:21.000 --> 00:28:42.000
And for the signal samples, the Monte Carlo samples, we use particle pairs with mass from 500 GeV to 2000 GeV with a step of 300 GeV, and charges 2, 3, 4, 5, 6, and 7, produced via the Drell-Yan and photon-fusion mechanisms.

00:28:42.000 --> 00:28:55.000
We do not have a paper yet, at least not a public one, but we have a CONF note, and the plots and tables from the CONF note are public.

00:28:55.000 --> 00:29:10.000
First, the production modes: we use these two production modes. The photon-fusion mode was never used in previous MCP searches in ATLAS.

00:29:10.000 --> 00:29:23.000
And for the Drell-Yan mode, only the photon-mediated production was used in previous MCP searches, so the Z mediator is new here too.

00:29:23.000 --> 00:29:36.000
Now for the selection, we use a derivation selection which keeps all events from the main stream with at least one offline combined muon with pT greater than 50 GeV.

00:29:36.000 --> 00:29:54.000
Then we use three trigger requirements, or rather three OR-ed trigger requirements: the single-muon trigger, the missing ET trigger, and the so-called late-muon trigger. This is a new one, and it is used in this kind of search for the first time.

00:29:54.000 --> 00:30:13.000
The missing ET trigger is a natural trigger here because the MCPs are muon-like, and the large missing ET originates from the initial state radiation jet recoiling off of the pair. And the late-muon trigger, also known as the out-of-time muon

00:30:13.000 --> 00:30:25.000
trigger, fires in events with a missing ET greater than 50 GeV in the current bunch crossing and a soft muon in the next bunch crossing.

00:30:25.000 --> 00:30:36.000
And it is unprescaled, and it was brought into service in the 2017 data taking as one of the algorithms of the L1Topo trigger.

00:30:36.000 --> 00:30:48.000
Now for the preselection, we are selecting events with at least one combined muon of at least medium quality, with the transverse momentum measured only by the muon system greater than 50 GeV.

00:30:48.000 --> 00:30:59.000
The overall transverse momentum, measured by the combination of the inner detector and muon system, is greater than 10 GeV.

00:30:59.000 --> 00:31:15.000
And we also require a reliable dE/dx estimation from the MDT, TRT, and the pixel; the pixel is only for charge two because of its charge saturation, and I'm going to talk about this a bit later. Now, for the muon reconstruction:

00:31:15.000 --> 00:31:35.000
it has to provide the transverse momentum of a particle after it has lost its energy in the calorimeter. So, we rely on the standard reconstruction algorithm, and we do not do anything fancy like an algorithm reconstructing slow particles.

00:31:35.000 --> 00:31:50.000
And we also require that the corresponding ID track segments should be isolated from other ID tracks, in order to limit the background contribution from two or more tracks firing the same TRT straws or MDT tubes.

00:31:50.000 --> 00:31:58.000
This table features the entire set of preselection criteria.

00:31:58.000 --> 00:32:12.000
Now, for the discriminating variables. We have the pixel dE/dx; as already mentioned, it's based on the measurements of an output signal width from the discriminator of every pixel.

00:32:12.000 --> 00:32:22.000
Then we have the TRT dE/dx, which is based on measurements of the signal width exceeding the lower threshold, divided by the track segment length in the TRT.

00:32:22.000 --> 00:32:34.000
Then there is the fraction of high-threshold hits in the TRT, with a signal amplitude over the high threshold, on a track segment in the TRT again.

00:32:34.000 --> 00:32:52.000
And finally, from the muon system, there is the MDT dE/dx. This is based on the measurement of the time interval during which the signal amplitude from the amplifier-shaper-discriminator in the MDT exceeds a certain threshold within the first nanoseconds

00:32:52.000 --> 00:33:11.000
of that signal. The dE/dx variables from all three subdetectors have their own arbitrary units, and so we define a significance: the difference between the observed dE/dx of a particle and the one expected

00:33:11.000 --> 00:33:18.000
for a muon from Z decays,

00:33:18.000 --> 00:33:30.000
expressed in units of the resolution measured in data. Here's the formula.

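As a reading aid, the significance definition just described can be written out. This is a reconstruction from the spoken description (the symbols are assumed, not taken from the slides), with a Gaussian resolution sigma, as confirmed later in the Q&A:

```latex
% dE/dx significance in subdetector i (pixel, TRT, or MDT):
% observed value minus the value expected for a muon (calibrated
% on Z -> mu mu decays in data), in units of the measured resolution.
\[
  S_i = \frac{(dE/dx)_i^{\mathrm{obs}} - \langle dE/dx \rangle_i^{\mu}}
             {\sigma_i^{\mu}}
\]
```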
00:33:30.000 --> 00:33:36.000
For the tight selection, we treat the charges separately; charge 2 is on the left.

00:33:36.000 --> 00:33:46.000
This is the pixel dE/dx significance: on the left you see muons from Z decays in data and Monte Carlo, and on the right, this is the signal.

00:33:46.000 --> 00:34:07.000
And this is for the charge-2 search; for the higher charges the pixel dE/dx saturates, and we can hardly discriminate between background and signal, so we do not have a tight selection criterion on it for that part of the search.

00:34:07.000 --> 00:34:16.000
Now, for the final selection criteria. We use the TRT dE/dx significance,

00:34:16.000 --> 00:34:30.000
on the top left; the TRT high-threshold hits fraction, on the top right; and the MDT dE/dx significance, on the bottom. Everywhere you see muons and some signal.

00:34:30.000 --> 00:34:42.000
Now for the final selection we use the ABCD planes, as shown here: on the left is the charge-2 search, and on the right the charges from 3 to 7.

00:34:42.000 --> 00:35:00.000
So for the first category we use the MDT dE/dx significance versus the TRT dE/dx significance. In grey is data, the D region is the signal region, and here blue and red are signal Monte Carlo.

00:35:00.000 --> 00:35:08.000
For the other charges, this is the MDT dE/dx significance versus the TRT high-threshold hits fraction.

00:35:08.000 --> 00:35:32.000
Again, grey is data, D is the signal region, and yellow and blue are signal Monte Carlo for different charges. The ABCD estimate, N_D = B times C over A, gives an expectation of 1.5 plus or minus 0.5 events in the D region here, and 0.03 events in this region.

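The background prediction quoted here follows the standard ABCD relation N_D = N_B x N_C / N_A. A minimal sketch, with made-up placeholder counts rather than the analysis numbers:

```python
def abcd_prediction(n_a, n_b, n_c):
    """Predict the background in signal region D from the three control
    regions of an ABCD plane, assuming the two discriminating variables
    are uncorrelated for background: N_D = N_B * N_C / N_A."""
    n_d = n_b * n_c / n_a
    # Poisson-only relative uncertainties, added in quadrature
    rel_err = (1.0 / n_a + 1.0 / n_b + 1.0 / n_c) ** 0.5
    return n_d, n_d * rel_err

# Placeholder counts, illustrative only:
pred, err = abcd_prediction(n_a=4000.0, n_b=60.0, n_c=100.0)  # pred = 1.5
```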
00:35:32.000 --> 00:35:36.000
Now for the signal efficiency.

00:35:36.000 --> 00:35:55.000
So, these plots feature the signal efficiency versus mass for each charge, and versus charge for each mass. It is defined as the fraction of Monte Carlo events with at least one MCP in the D region, that is, after passing all the selections, among all generated

00:35:55.000 --> 00:36:06.000
events. There are several reasons for the low efficiency; the largest value you see here is about 47%, for the low masses.

00:36:06.000 --> 00:36:20.000
The efficiency drops at low masses basically because of the pT and especially the pT-over-charge requirements; for high masses it drops because of the limited reconstruction efficiency of the muons.

00:36:20.000 --> 00:36:39.000
This is especially visible here, in the plot below: we have a decrease in efficiency because the larger ionization loss slows particles down, so they may not make it into the timing window anymore, or may lose all their kinetic energy before reaching

00:36:39.000 --> 00:36:41.000
the muon system.

00:36:41.000 --> 00:36:59.000
Also, obviously there is a stricter effective pT-over-charge requirement, and the large delta-electron yield distorts the timing parameters of MDT hits from MCPs, leading to a smaller number of reconstructed combined muons.

00:36:59.000 --> 00:37:20.000
For the uncertainty on the background estimation we use a so-called masked-region method: we introduce masked regions between the A and C and the B and D regions, shown as the shaded areas in this picture, and also between

00:37:20.000 --> 00:37:33.000
the A plus B and C plus D regions, but this is not shown. The background estimates are then recalculated without counting the entries inside these masked regions.

00:37:33.000 --> 00:37:50.000
And the systematic uncertainty is the relative difference between the nominal background estimate and the new one. I repeat this for more than 20 different definitions of the masked regions and take the maximum relative difference as the final

00:37:50.000 --> 00:37:52.000
systematic uncertainty.

00:37:52.000 --> 00:37:58.000
This is 33% for charge 2 and 12% for the greater charges.

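A rough sketch of the masked-region procedure described above: recompute the background estimate for each alternative masked-band definition, and keep the worst relative shift as the systematic. The numbers below are invented for illustration:

```python
def masked_region_systematic(nominal, alternative_estimates):
    """Systematic uncertainty as the maximum relative difference between
    the nominal background estimate and the estimates recomputed with
    the entries in the masked bands excluded."""
    return max(abs(est - nominal) / nominal for est in alternative_estimates)

# In practice more than 20 masked-region definitions are tried;
# these four alternative estimates are placeholders:
syst = masked_region_systematic(1.5, [1.4, 1.7, 1.9, 1.2])
```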
00:37:58.000 --> 00:38:01.000
Now, what have we observed?

00:38:01.000 --> 00:38:11.000
In the signal region for the charge-2 search, we expected 1.5 plus or minus 0.5 (stat) plus or minus 0.5 (syst) events, and observed 4.

00:38:11.000 --> 00:38:17.000
This is a small excess of 1.5 sigma; the p-value is 6%.

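The quoted numbers are consistent with a plain Poisson counting check (ignoring the plus-or-minus 0.5 uncertainties on the prediction): the probability of seeing 4 or more events when 1.5 are expected is about 6.6%, roughly 1.5 sigma.

```python
from math import exp, factorial

def poisson_p_value(n_obs, b):
    """One-sided p-value: P(N >= n_obs) for a Poisson mean b."""
    return 1.0 - sum(exp(-b) * b**k / factorial(k) for k in range(n_obs))

p = poisson_p_value(4, 1.5)  # ~0.066
```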
00:38:17.000 --> 00:38:28.000
And for the charges greater than 2, we expected 0.03 plus or minus something small, and did not observe anything.

00:38:28.000 --> 00:38:42.000
And this plot shows the unblinded ABCD plane for charge 2; you can see here, and in the zoomed inset here, where these four events sit.

00:38:42.000 --> 00:38:51.000
So they sit very, very close to the boundaries between the signal region and the non-signal ones.

00:38:51.000 --> 00:38:56.000
Now for the limit setting, we make use of the CLs method.

00:38:56.000 --> 00:39:11.000
It takes the statistical and systematic uncertainties of the expected background estimate, and the uncertainties on the signal efficiencies and signal yields, as nuisance parameters.

00:39:11.000 --> 00:39:25.000
This plot shows the cross-section limits versus mass, together with the theoretical cross sections. And this is the mass limits versus charge, both expected and observed.

00:39:25.000 --> 00:39:43.000
Now, a very interesting question is whether these four events are the same as those observed in the paper the previous speaker just talked about, or whether they are actually completely different events.

00:39:43.000 --> 00:39:46.000
The answer is that they are completely different.

00:39:46.000 --> 00:39:58.000
And on this slide, I show the seven candidates from the previous talk, and the exact reasons why we did not observe them in our signal region.

00:39:58.000 --> 00:40:03.000
So I'm going to go through them from the beginning.

00:40:03.000 --> 00:40:25.000
These seven candidates can be divided into three groups. The first two candidates are indeed combined muons with high pixel dE/dx values, as shown here; they both pass our pixel dE/dx significance cut, but

00:40:25.000 --> 00:40:43.000
they feature very low MDT dE/dx and TRT dE/dx values, as shown here and here. I plot where these two candidates sit on our charge-2 ABCD plane.

00:40:43.000 --> 00:41:00.000
So they are both in a region which is completely background dominated. The next two candidates, numbers three and four, are not combined muons, but they do feature high TRT dE/dx significance.

00:41:00.000 --> 00:41:04.000
That is, they would pass our TRT dE/dx requirement

00:41:04.000 --> 00:41:12.000
if they were muons, of course. They also have a really high TRT high-threshold hits fraction,

00:41:12.000 --> 00:41:16.000
and also a fairly high pixel dE/dx.

00:41:16.000 --> 00:41:27.000
Our cut sits here; the first one does not pass it, while the second one does.

00:41:27.000 --> 00:41:49.000
And lastly, the remaining three candidates: they are combined muons, but neither the pixel dE/dx significance, nor the TRT dE/dx significance, nor the MDT dE/dx significance is high enough to pass our selection criteria, as can be seen in these

00:41:49.000 --> 00:41:51.000
table cells.

00:41:51.000 --> 00:42:09.000
So, the conclusion: we performed a search for long-lived multi-charged particles in the full Run 2 data in ATLAS. The changes with respect to the previous search on the 2015-2016 data are related to improvements in the production model and to the use

00:42:09.000 --> 00:42:12.000
of an additional trigger.

00:42:12.000 --> 00:42:27.000
We use four ionization estimators from three subsystems to separate signal and background. The signal efficiency is up to 47%, at the lowest mass and charge 3.

00:42:27.000 --> 00:42:42.000
We expected 1.5 plus or minus 0.5 (stat) plus or minus 0.5 (syst) events for the charge-2 category and observed 4, and did not observe anything for the charge-3-to-7 category.

00:42:42.000 --> 00:42:53.000
All four observed events here sit very close to the boundaries between the signal and non-signal regions, and the excess is within two standard deviations of the expectation.

00:42:53.000 --> 00:42:58.000
So we do not call it a discovery.

00:42:58.000 --> 00:43:18.000
And the observed mass limits range from 500 GeV up to 1050 or 1600 GeV, depending on the charge; the largest increase in the mass limits with respect to the previous search is 450 GeV, for charge 7. And the manual check of the seven candidates

00:43:18.000 --> 00:43:25.000
that the pixel dE/dx analysis observed

00:43:25.000 --> 00:43:32.000
It explains the exact reasons for their absence in our signal region.

00:43:32.000 --> 00:43:37.000
Thank you.

00:43:37.000 --> 00:44:01.000
Thanks so much for that presentation. So, does anyone have any clarification questions, or questions very specific to Yuri's talk, before we have the discussion of the experimental aspects of these two dE/dx and MCP searches combined?

00:44:01.000 --> 00:44:18.000
Leonardo, go ahead. Yes, I have one question about the muons: you showed that, out of the seven events, the two events which are not muons have high TRT dE/dx, and also the fraction of

00:44:18.000 --> 00:44:23.000
high-threshold hits is high?

00:44:23.000 --> 00:44:26.000
Yes, yes, Yes.

00:44:26.000 --> 00:44:33.000
So do you know how likely this is for a normal track?

00:44:33.000 --> 00:44:46.000
This is very unlikely, especially the second one, candidate number four: a 4.4 TRT dE/dx significance together with this high-threshold hits fraction is very unlikely.

00:44:46.000 --> 00:44:48.000
Okay.

00:44:48.000 --> 00:45:04.000
So, what you did not mention is that the threshold at which we observe the excess is larger than 2.4,

00:45:04.000 --> 00:45:19.000
and the threshold that you reach here is larger than 3.1. So, of the seven events, it is certain that some of them just cannot be there. Yes. Okay, so yeah, like number three, for example. Yes.

00:45:19.000 --> 00:45:24.000
Number three, and the last three events. Yes.

00:45:24.000 --> 00:45:44.000
Yes. Okay. No, that's all; I just find the likelihood of this a little bit surprising for the two tracks which are not muons.

00:45:44.000 --> 00:46:01.000
Thank you, Leonardo. Todd, do you have a specific question for Yuri? Yes, I did: on slide six you give the significance formula, and you have a sigma in the denominator. Are you assuming a Gaussian sigma, or how...? Yeah, we are

00:46:01.000 --> 00:46:16.000
assuming a Gaussian. Is that a good assumption? Yeah, it's a pretty good assumption, because we actually fit the cores of the distributions to get it, and

00:46:16.000 --> 00:46:25.000
the cores follow the Gaussian distribution perfectly.

00:46:25.000 --> 00:46:37.000
The core does, but there are tails to it. Right, right, but we don't really care about the tails here.

00:46:37.000 --> 00:46:57.000
Because all we need is some normalization of the significance, and this is all we need.

00:46:57.000 --> 00:47:01.000
All right, thank you so much for that.

00:47:01.000 --> 00:47:14.000
Thanks, Todd. And Leonardo, you have another question? I wanted to ask you about the dynamic range of the TRT dE/dx and the MDT dE/dx: can you really measure

00:47:14.000 --> 00:47:22.000
values that large?

00:47:22.000 --> 00:47:24.000
You should have this.

00:47:24.000 --> 00:47:36.000
There were no indications that we cannot, not even when these detectors were tested in test beams, with different particles.

00:47:36.000 --> 00:47:54.000
But even if we cannot measure it exactly, the TRT and the MDT will not saturate like the pixel does. What matters for us is the saturation, and they do not saturate the way the pixel does.

00:47:54.000 --> 00:48:03.000
Because there is only a limited range over which you can measure.

00:48:03.000 --> 00:48:14.000
Yeah, I mean, it would all be shifted to the right, but not to the left. And since we cut, for example here, at 2,

00:48:14.000 --> 00:48:28.000
we will not be very affected by that: everything will be to the right of the cut at 2 anyway, unlike the pixel dE/dx, where we can get to the left.

00:48:28.000 --> 00:48:49.000
And that's exactly why we can use the TRT dE/dx for both charge categories, but the pixel only for the first one. You mean that if you have a large ionization, of 10 MIPs or 20 MIPs, you will not read 20 MIPs, but you

00:48:49.000 --> 00:48:57.000
will read well above 2. Yes, exactly. Okay.

00:48:57.000 --> 00:49:03.000
Thanks.

00:49:03.000 --> 00:49:12.000
Okay, so maybe we can move to the more general discussion about these two analyses.

00:49:12.000 --> 00:49:21.000
And I think we should try and structure it a little bit. So, I think, breaking it up into a couple of distinct topics will be helpful.

00:49:21.000 --> 00:49:39.000
So first, coming back to the pixel dE/dx analysis and any questions about its background estimation, or things like that; then following up any questions on the multi-charge analysis; and then coming to the overlap or comparison

00:49:39.000 --> 00:49:42.000
between the two.

00:49:42.000 --> 00:49:51.000
And then maybe finally any forward-looking questions that we might have. So, thank you again to both speakers for the really excellent presentations.

00:49:51.000 --> 00:50:01.000
If there are any hands or questions about the pixel dE/dx analysis, specifically with respect to the background estimation, it would be really great to get those questions.

00:50:01.000 --> 00:50:06.000
So, Daniella.

00:50:06.000 --> 00:50:15.000
Hey, can you hear me? Yes. Yeah, my question is not directly about the background estimation; it's just a technical question.

00:50:15.000 --> 00:50:30.000
You mentioned, when you did all your checks, that some of your events, like three of them, had two muons associated with the track. What exactly do you mean: a muon on the same side as the track, or muons that come to balance the

00:50:30.000 --> 00:50:35.000
track or.

00:50:35.000 --> 00:50:41.000
I just mean that there are two muons in the event, one of which is the signal track.

00:50:41.000 --> 00:50:43.000
Okay, so, yeah.

00:50:43.000 --> 00:50:49.000
Okay. Okay, thanks.

00:50:49.000 --> 00:51:14.000
Meanwhile, while we wait for other questions, I had one: in the paper you have this plot of uncertainties as a function of the mass, and there's an uncertainty that accounts for the correlation between the dE/dx and the pT of a track as a function

00:51:14.000 --> 00:51:18.000
of mass, and it jumps up at about one TeV.

00:51:18.000 --> 00:51:35.000
And my understanding is that this was measured in a validation region with low MET. I'm curious: if that uncertainty were to become larger when extrapolating to larger masses, how much of the excess would it account for, and what sort of

00:51:35.000 --> 00:51:40.000
studies were done to understand this a little bit better.

00:51:40.000 --> 00:51:53.000
Yeah, thanks, Carrie, for that question. It's an interesting one. So this uncertainty that Carrie is talking about is our largest systematic uncertainty.

00:51:53.000 --> 00:52:06.000
It's designed to assess any non-negligible correlations when we're deriving the dE/dx and the momentum for our toy tracks, and we do this in our low-MET control region.

00:52:06.000 --> 00:52:17.000
So the first thing I'll say is that we're really limited by statistics for this systematic, and there's some evidence that

00:52:17.000 --> 00:52:30.000
its large size, especially in our inclusive high-mass regions, is really driven by statistics. But we're very conservative, so we kind of just go with it and apply the most conservative uncertainty possible.

00:52:30.000 --> 00:52:44.000
One other thing I'll say about this uncertainty is that any discrepancy that we see from these non-negligible correlations is really due to track candidates that are not muon candidates.

00:52:44.000 --> 00:52:52.000
So we don't expect that this will really affect our muon selection, but again we do it inclusively because we want to be as conservative as possible.

00:52:52.000 --> 00:53:07.000
And if I recall correctly, we actually did do a quick check, though very limited by statistics, on whether it changed if we restricted our MET requirement to a subset of the region instead of the entire region. Of course, we couldn't do

00:53:07.000 --> 00:53:15.000
this in the signal region, but we could do it in the control region, and there wasn't evidence of a strong dependence.

00:53:15.000 --> 00:53:17.000
There wasn't. Yeah.

00:53:17.000 --> 00:53:19.000
Interesting.

00:53:19.000 --> 00:53:26.000
But again, we're statistically limited for this uncertainty. So that's the caveat.

00:53:26.000 --> 00:53:27.000
Okay.

00:53:27.000 --> 00:53:31.000
Chris, do you want to go ahead?

00:53:31.000 --> 00:53:48.000
Yeah, sure. My question is about slide 14 of the previous speaker's talk, but I suppose either speaker might want to answer it. So I'll give someone a second to throw that slide up.

00:53:48.000 --> 00:53:51.000
Okay, yeah.

00:53:51.000 --> 00:54:00.000
So, I just want to make sure I understand these column headings, because some of it seems to be leading me to a possibly strange conclusion, so I want to check.

00:54:00.000 --> 00:54:16.000
So, if I understand correctly, when you talk about significance, this is the significance you defined a few slides back, where you divide by this resolution that Todd was asking about, right?

00:54:16.000 --> 00:54:29.000
Exactly, yes. Okay, and then "pixel" means measured just in the pixel system, and "TRT" just measured in the TRT system. Exactly. Okay, so then these values are so different;

00:54:29.000 --> 00:54:34.000
this, to me, suggests a problem.

00:54:34.000 --> 00:54:45.000
Do you have.

00:54:45.000 --> 00:55:04.000
But the same track, which has passed the quality cuts that we were talking about in the last talk, has a 0.16 TRT significance. Or the 17-sigma one, the first one, actually has a low dE/dx deposition in the TRT.

00:55:04.000 --> 00:55:12.000
I mean, what would be the likelihood for a track to have that kind of distribution? I would think it would be extremely unlikely.

00:55:12.000 --> 00:55:20.000
Yes, this is very unlikely, so let me find the pixel dE/dx plot.

00:55:20.000 --> 00:55:26.000
So here's the pixel dE/dx plot; we had 15 there.

00:55:26.000 --> 00:55:40.000
So, this is like here. Uh huh. And for the TRT we had 0.1, right?

00:55:40.000 --> 00:55:44.000
Some of them are even negative. Yeah, yeah. Yeah.

00:55:44.000 --> 00:55:51.000
So 0.1 is, like, the peak of the distribution here. Uh huh.

00:55:51.000 --> 00:55:57.000
And negative just means it's on the other side, mirrored. Yeah, that's basically it.

00:55:57.000 --> 00:56:13.000
A negative value just means it is a bit lower than the most probable value. But if you take this probability for each point, if you go back to slide 14 and do this kind of mental calculation

00:56:13.000 --> 00:56:27.000
of the probability that you just showed for each one, and then multiply it over all seven, because they all have this property of having a high dE/dx in the pixels and a low-probability one in the TRT, I think you would get an extremely

00:56:27.000 --> 00:56:29.000
unlikely.

00:56:29.000 --> 00:56:32.000
number, and it would be interesting to know what that number is.

00:56:32.000 --> 00:56:46.000
Okay, but these, for example these two, are not lower; it just means that you cannot really compare exactly... Well, you know, you could do just what you just did.

00:56:46.000 --> 00:57:01.000
I mean, they would be closer, so this would be a more probable configuration. But yes, you could get the combined probability for all seven, and to me this looks like whatever this is, it is more likely to happen in

00:57:01.000 --> 00:57:13.000
the pixel detector. And I don't understand that, because when I asked about whether there was a pattern in the radial distribution, the speaker said there wasn't, but it looks like there is, from this.

00:57:13.000 --> 00:57:16.000
Well, you say so, but...

00:57:16.000 --> 00:57:17.000
Well, you basically have high pixel dE/dx.

00:57:17.000 --> 00:57:35.000
This I understand, but I don't understand the reasoning why you go from this to independence. The TRT is further out, right? Yeah, but it has a granularity which is a thousand times less than the pixel's,

00:57:35.000 --> 00:57:50.000
a space resolution a thousand times worse than the pixel's. Yeah, I know. Speaking of resolutions...

00:57:50.000 --> 00:57:54.000
Yeah, but that's spatial resolution; this is an energy measurement, so I don't know.

00:57:54.000 --> 00:58:16.000
Do you claim that there is a problem of a mix-up of tracks? Are you suggesting there was a mix-up of tracks? That's not what I say; that's one possibility. But yeah, I agree, in the pixel detector that's quite unlikely. What

00:58:16.000 --> 00:58:25.000
I find the most striking is the fact that there are two events which are in the far tail of the distribution,

00:58:25.000 --> 00:58:32.000
numbers three and four, which are in the far tail of both distributions where there is a measurement.

00:58:32.000 --> 00:58:49.000
So, yes, I agree that it is very unlikely to have a fluctuation in the pixel and not have a fluctuation in the TRT. So maybe it also goes back to the thing that Todd said: maybe this sigma... I mean, if we

00:58:49.000 --> 00:59:03.000
read this correctly, those are not fluctuations: 17 sigma is not a fluctuation that happens. What is 17 sigma? Well, yeah.

00:59:03.000 --> 00:59:06.000
It's in units of the core sigma. Yeah.

00:59:06.000 --> 00:59:25.000
Yeah. Okay, I see, so that's if you assumed it was Gaussian. So again, maybe that doesn't reflect the actual probability; it's not really... That's my conclusion, Chris.

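To put numbers on this exchange: if the quoted significances really were Gaussian z-scores, a 17 sigma value would have a vanishing tail probability, which is exactly why the non-Gaussian tails matter. A quick check under the (admittedly bad, as the discussion notes) Gaussian assumption:

```python
from math import erfc, sqrt

def gaussian_tail(z):
    """One-sided upper-tail probability of a unit Gaussian at z."""
    return 0.5 * erfc(z / sqrt(2.0))

p17 = gaussian_tail(17.0)    # vanishingly small: never a statistical fluctuation
p016 = gaussian_tail(0.16)   # ~0.44: right at the peak, perfectly common
```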
00:59:25.000 --> 00:59:26.000
Yes.

00:59:26.000 --> 00:59:39.000
Okay. Along this line, I would also be very curious to understand exactly what dE/dx values the TRT and MDT measurements point to,

00:59:39.000 --> 00:59:51.000
and also to see what the underlying ADC or time-over-threshold distributions look like for signal and background.

00:59:51.000 --> 01:00:01.000
Because I think someone was asking a question about whether the TRT and the MDT can actually measure a dE/dx that goes this high.

01:00:01.000 --> 01:00:11.000
And it would be convincing to see those low-level distributions. Okay, the next hand that I see is from Thomas.

01:00:11.000 --> 01:00:18.000
I was wondering how this deals with merged clusters, I guess for both of these talks.

01:00:18.000 --> 01:00:25.000
So when you have two tracks very close to each other, and instead of having two clusters you get one merged cluster.

01:00:25.000 --> 01:00:33.000
So, you mean the pixel hits which are shared between at least two tracks, right?

01:00:33.000 --> 01:00:37.000
Is that the question?

01:00:37.000 --> 01:00:55.000
Yeah, I guess. Yes. So, we reject such tracks: we require exactly zero such shared hits on an ID track, which is what the current cut corresponds to.

01:00:55.000 --> 01:01:08.000
So that's kind of my question: how do you know that the cluster is actually coming from two tracks instead of one?

01:01:08.000 --> 01:01:20.000
I guess this is decided by the neural network that does the pixel clustering in the track reconstruction.

01:01:20.000 --> 01:01:25.000
Somebody with pixel expertise will know better. Yes.

01:01:25.000 --> 01:01:29.000
And it's the same, nevertheless, as in the previous talk.

01:01:29.000 --> 01:01:32.000
Just the.

01:01:32.000 --> 01:01:42.000
Yes, yes.

01:01:42.000 --> 01:01:44.000
Okay.

01:01:44.000 --> 01:01:46.000
Are you comfortable with that answer?

01:01:46.000 --> 01:01:52.000
Well, I would like to hear more about it, if you have some pointers to the relevant studies.

01:01:52.000 --> 01:02:06.000
The answer is that in the machine learning it somehow happens. So, yeah, I believe there is a paper on the neural networks used for pixel clustering, among others.

01:02:06.000 --> 01:02:12.000
I don't know how old it is but I believe it's out there.

01:02:12.000 --> 01:02:16.000
Okay then. Moving to Eric.

01:02:16.000 --> 01:02:20.000
Thanks for that. So, it's a question for the first speaker.

01:02:20.000 --> 01:02:26.000
About the seven tracks that you isolated: you said several things about them,

01:02:26.000 --> 01:02:41.000
how many of them were matched in the muon system, the number of events where you had two muons. But you didn't say anything about the MET distribution. Did you check things like the delta-phi between the candidates and the missing ET, or these

01:02:41.000 --> 01:02:42.000
kinds of things?

01:02:42.000 --> 01:03:02.000
And the second question is about the events with two muons: did you check whether they were close to each other, whether the two muons were compatible with a resonance, like a J/psi or something?

01:03:02.000 --> 01:03:09.000
Um, yeah so give me one second,

01:03:09.000 --> 01:03:17.000
I'm looking to see if I put the MET distribution in the slides, and I didn't.

01:03:17.000 --> 01:03:24.000
So I guess my question for you is: what would the check on the MET distribution be?

01:03:24.000 --> 01:03:39.000
I mean, in some events, if the particle is metastable, you expect it to account for the MET. So I would just like to know if the MET is aligned with the track, or opposite to it, or if there is no correlation at all

01:03:39.000 --> 01:03:46.000
between the candidates and the MET. That's the question.

01:03:46.000 --> 01:04:00.000
Right. So for some events the MET is along the signal track, and for some events the MET is opposite. So this isn't really a consistent thing across all the signal-region

01:04:00.000 --> 01:04:04.000
excess candidates.

01:04:04.000 --> 01:04:18.000
Okay. And then, sorry, what was your second question? The second question was about the angle between the muon pairs: in three events out of seven you had two muons. Did you check the angles between the muons, whether they are close

01:04:18.000 --> 01:04:23.000
to each other or not?

01:04:23.000 --> 01:04:32.000
I don't remember this information; maybe someone else remembers.

01:04:32.000 --> 01:04:39.000
They're at relatively wide angles, the muons, but I don't remember exactly.

01:04:39.000 --> 01:04:53.000
And these are high-momentum muons, so I would expect them not to be a J/psi, but I cannot confirm.

01:04:53.000 --> 01:04:55.000
Okay.

01:04:55.000 --> 01:05:01.000
That's

01:05:01.000 --> 01:05:20.000
Okay, I'm trying to put together slide 37 from the first talk, which shows the track pT distributions, and Yuri's list of events, and trying to understand what's going on with the pTs of these tracks. If I understand correctly, if we look at the pT

01:05:20.000 --> 01:05:23.000
distribution on slide 37 of the first talk,

01:05:23.000 --> 01:05:32.000
the seven events that we're talking about are separated from the rest of the distribution. Is that correct?

01:05:32.000 --> 01:05:43.000
And then the question which would follow, whether the answer is yes or no, I suppose, is: what do we make of the five events with a muon system measurement of the track pT?

01:05:43.000 --> 01:05:59.000
The difference between the inner detector track pT and the muon system pT is large, and as a theorist I don't have a sense for whether it's spectacularly large, or just a little unusually large, or what.

01:05:59.000 --> 01:06:05.000
I'm so sorry, did someone say something? Go ahead then.

01:06:05.000 --> 01:06:15.000
So I know that for the muon candidates we did check the muon momentum, and it's consistent with the track pT.

01:06:15.000 --> 01:06:27.000
and then let me share this slide

01:06:27.000 --> 01:06:42.000
here, and I don't recall if all of the excess events exactly match up with this; maybe someone else does, if that's what you're asking. If you can put up his slide again, let me just look. Just note that there's a gap there

01:06:42.000 --> 01:06:51.000
around 500 GeV. Yeah, we can put up his slide with the list of events.

01:06:51.000 --> 01:06:59.000
Yeah, you're talking about these, right? So they're all above 500 GeV, except for number four.

01:06:59.000 --> 01:07:05.000
And for all the ones where there's a muon system measurement and an inner detector measurement,

01:07:05.000 --> 01:07:13.000
there's a substantial discrepancy; although, again, those measurements are quite different.

01:07:13.000 --> 01:07:25.000
I just wonder: do you see this in typical muons, or is this way out on a tail? No, we do not see this, I think, for typical muons.

01:07:25.000 --> 01:07:42.000
Yes, this requires a detailed study. It may mean that these particles lose some non-negligible, non-muon-like amount of energy in the calorimeters.

01:07:42.000 --> 01:07:57.000
So that should be observable in the calorimeters. Isn't it possible that this effect would be consistent with the pT resolution for very-high-pT inner detector tracks? I would imagine, because you only have a few silicon hits, it could be

01:07:57.000 --> 01:07:59.000
skewed upwards.

01:07:59.000 --> 01:08:13.000
And even for a TeV muon, the pT resolution is on the order of 10% in the barrel. So for the inner detector alone it must be a much larger uncertainty.

01:08:13.000 --> 01:08:18.000
Right, which then in turn folds into the mass uncertainty and so forth.

01:08:18.000 --> 01:08:21.000
But one extract.

01:08:21.000 --> 01:08:35.000
Yeah, I was going to go back to the plot I wanted to show before, the pT distribution of the tracks. I think this is the plot that shows it.

01:08:35.000 --> 01:08:41.000
Oh,

01:08:41.000 --> 01:08:42.000
right.

01:08:42.000 --> 01:08:48.000
So, for example, the highest-pT track.

01:08:48.000 --> 01:09:00.000
The uncertainty on the pT will not be Gaussian; it'll be a very, very large horizontal error band on that particular event, right?

01:09:00.000 --> 01:09:07.000
Yeah, so I think I actually have some simulation.

01:09:07.000 --> 01:09:20.000
Here. Right, so it's not Gaussian. So here's the relative 1/pT resolution, where this is quantified as the full width at half maximum divided by the factor to convert it, sort of, to a sigma.

01:09:20.000 --> 01:09:42.000
And here, maybe ignore that tail, but look at the different slices of pT to get a sense of what the resolution looks like, and it's almost 100% at this high momentum.
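The FWHM-to-sigma conversion described here can be sketched in Python; the divisor 2*sqrt(2*ln 2) ≈ 2.355 assumes a Gaussian shape, so for a non-Gaussian resolution like this one it only defines an effective sigma:

```python
import math

def fwhm_to_sigma(fwhm: float) -> float:
    """Convert a full width at half maximum to a Gaussian-equivalent sigma.

    For a Gaussian, FWHM = 2*sqrt(2*ln 2)*sigma ~= 2.355*sigma; applied to
    a non-Gaussian resolution this only gives an effective sigma.
    """
    return fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

# Illustrative: a relative 1/pT FWHM of 2.0 corresponds to an effective
# relative sigma of about 0.85, i.e. close to 100% resolution.
sigma = fwhm_to_sigma(2.0)
```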

01:09:42.000 --> 01:09:44.000
Yeah.

01:09:44.000 --> 01:09:55.000
I think this is, yeah, consistent with what I would expect. Okay, so I see two more hands, and I think after that we should move to the next presentation. So, Chris, you have one more question.

01:09:55.000 --> 01:10:01.000
This is a quick clarification for the multiply charged particle analysis.

01:10:01.000 --> 01:10:09.000
Of course, if they were really multiply charged, the pT measurement that you would normally make is wrong by that factor, right?

01:10:09.000 --> 01:10:15.000
So all the measurements you were showing in these tables and whatnot are just assuming the charge is one?

01:10:15.000 --> 01:10:20.000
Exactly, yes, because in data we always assume the charge is one.

01:10:20.000 --> 01:10:25.000
That's what I thought. OK, OK.

01:10:25.000 --> 01:10:28.000
Ok. And then one last one.

01:10:28.000 --> 01:10:53.000
I would like to ask about how well you could make the calibration of the dE/dx. What is the final result of the plot on page five, which is dE/dx versus luminosity after all corrections?

01:10:53.000 --> 01:11:00.000
It's flat, so we measure it for each run.

01:11:00.000 --> 01:11:06.000
So each ATLAS run, and we correct it to 1.4.

01:11:06.000 --> 01:11:08.000
Flat from IPS.

01:11:08.000 --> 01:11:19.000
Okay. And, yeah, the other question is how the candidate events are distributed in time.

01:11:19.000 --> 01:11:31.000
We have the run numbers, which were shown in the other presentation, but I have no idea what the time difference is between different parts of the data taking.

01:11:31.000 --> 01:11:44.000
So, are they from 2017, or 18, or 16?

01:11:44.000 --> 01:11:57.000
The excess events, how they're distributed across Run 2. Is that what you're asking? Yes, yes, where they are in time.

01:11:57.000 --> 01:12:13.000
Because in the other presentation we have the run numbers, between 304 thousand up to 364 thousand. And I wonder if it is coming from a certain period of time, or if it is,

01:12:13.000 --> 01:12:22.000
say, whether you have two events in 2017 and five in 2018, or something like that.

01:12:22.000 --> 01:12:32.000
Yeah, they're well distributed across the Run 2 data set. I don't have the exact numbers, but they are not clustered in any place.

01:12:32.000 --> 01:12:36.000
Thanks.

01:12:36.000 --> 01:12:40.000
Okay, thank you everyone for the lively discussion.

01:12:40.000 --> 01:12:51.000
I think we should move to the next talk, from Daniela, on high-dE/dx events from boosted multiply charged

01:12:51.000 --> 01:12:53.000
particles.

01:12:54.000 --> 01:13:03.000
Oh, thank you very much. Can you see my screen. Yes we can hear and see your pointer. Perfect, thanks a lot. Good afternoon, everybody.

01:13:03.000 --> 01:13:04.000
Thanks for the invitation.

01:13:04.000 --> 01:13:18.000
I'm going to talk about this recent work of mine with my collaborators, and we're going to build on the experimental results that we've just seen, in particular on the pixel dE/dx excess, assuming that it's real,

01:13:18.000 --> 01:13:30.000
of course. As we just heard, ATLAS sees seven events in a region which is basically signal dominated, in which the background is supposed to be very small.

01:13:30.000 --> 01:13:55.000
And the reason why the background here is so small is simple to understand: particles only have large dE/dx if they are relatively slow, but at the same time they have to pass the high-pT requirement.

01:13:55.000 --> 01:14:13.000
In terms of BSM, we heard that ATLAS interprets them in terms of particles with beta of the order of 0.5 or 0.6, and this excess of about six events survives all checks apart from one, which is that all these excess events seem to have a large beta from the

01:14:13.000 --> 01:14:15.000
time-of-flight measurements.

01:14:15.000 --> 01:14:28.000
And this is precisely what caught our attention; it's our starting point, our main motivation: to see if indeed it's possible to reconcile this excess with the information from the time of flight, which essentially tells you that

01:14:28.000 --> 01:14:29.000
beta is close to one.

01:14:29.000 --> 01:14:44.000
And we believe that it's possible, and the reason is the following: the Bethe-Bloch curve, which gives the most probable value of the dE/dx, depends on a bunch of quantities; some of them are material dependent, so depending

01:14:44.000 --> 01:14:47.000
on the detector.

01:14:47.000 --> 01:14:59.000
Just two of them depend on the particle traversing your detector. One is of course the speed, the velocity beta, or beta gamma, and the other one is the charge of the particle, as we heard in the second talk.

01:14:59.000 --> 01:15:15.000
And in particular, the average energy deposition, the average ionization, depends on the square of the charge of the particle. Then, if you just do the simple exercise of plotting the dE/dx as a function of beta for the charge

01:15:15.000 --> 01:15:18.000
equal to one and charge equal to two hypotheses,
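The q-squared scaling argument can be sketched with a toy formula that keeps only the q²/β² dependence and drops the slowly varying Bethe-Bloch logarithm; this is an illustrative simplification, not the calibrated detector curve:

```python
def dedx_toy(beta: float, q: int, k: float = 1.0) -> float:
    """Toy mean ionization: proportional to q**2 / beta**2, ignoring the
    slowly varying Bethe-Bloch logarithm. k is an arbitrary unit, not a
    detector calibration."""
    return k * q * q / (beta * beta)

# The charge enters squared: at fixed velocity, a |q| = 2 particle
# deposits four times the ionization of a |q| = 1 particle.
ratio = dedx_toy(0.9, 2) / dedx_toy(0.9, 1)

# The same large dE/dx can therefore come either from a slow q = 1
# particle (beta ~ 0.5) or from a fast, boosted q = 2 one (beta ~ 1).
slow_q1 = dedx_toy(0.5, 1)
fast_q2 = dedx_toy(1.0, 2)
```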

01:15:18.000 --> 01:15:36.000
you see that it is true that you can interpret the excess in terms of beta of the order of 0.5 (the grey band is where ATLAS sees the excess), but it's also true that you can interpret it as charge equal to two, as long as beta is

01:15:36.000 --> 01:15:52.000
of the order of one. And this, on the other hand, is perfectly what the time-of-flight measurements seem to suggest. So if you want to fit both sets of data, which are the large ionization and the time-of-flight measurement,

01:15:52.000 --> 01:16:09.000
you're immediately led to consider hypotheses in which the signal originates from Q equal to two, but boosted, and so not slow particles. The previous analysis that we heard covers the case in which the charge-two particle has beta of

01:16:09.000 --> 01:16:21.000
the order of 0.5, 0.6 or whatever, and we saw that there is no significant excess there. What we're trying to propose here is a different regime, in which we have Q equal to two, but boosted.

01:16:21.000 --> 01:16:27.000
And this is basically what is suggested by the combination of these two sets of measurements.

01:16:27.000 --> 01:16:40.000
Now, the problem here is that the ATLAS analysis is performed in terms of Q equal to one, and we want to interpret it as Q equal to two. I'm talking about the pixel analysis; let me recap very briefly.

01:16:40.000 --> 01:16:57.000
So, from a theorist's point of view, the ATLAS protocol to measure these quantities is the following: ATLAS measures the momentum and the dE/dx, and then, as we heard, you can invert the Bethe-Bloch curve to find the beta gamma of

01:16:57.000 --> 01:17:15.000
the track. And once you have that, from beta gamma and the measured momentum p you can obtain an estimate for the mass, which I'm going to call here an effective mass, m_dEdx, which is an approximation to the physical mass of

01:17:15.000 --> 01:17:25.000
your particle. And I'm saying that it's an approximation because, as we heard, if you look at the histograms, this is far from being monochromatic.
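The protocol just described (measure p and dE/dx, invert the curve for beta gamma, form m = p divided by beta gamma) can be sketched with an invertible toy stand-in for the calibrated curve; the constants A and B below are illustrative numbers, not the ATLAS pixel calibration:

```python
import math

# Toy, invertible stand-in for the calibrated most-probable dE/dx curve:
# f(bg) = A / bg**2 + B, valid on its falling (low beta*gamma) branch.
# A and B are illustrative, in arbitrary units.
A, B = 1.5, 1.0

def betagamma_from_dedx(dedx: float) -> float:
    """Invert the toy dE/dx curve (requires dedx > B)."""
    return math.sqrt(A / (dedx - B))

def mass_estimate(p: float, dedx: float) -> float:
    """Effective mass m_dEdx = p / (beta*gamma inferred from dE/dx)."""
    return p / betagamma_from_dedx(dedx)

# Closure check: a particle with beta*gamma = 1 has toy dE/dx = A + B = 2.5;
# with p = m * beta*gamma = 1400, the protocol returns the input mass.
m = mass_estimate(1400.0, 2.5)
```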

01:17:25.000 --> 01:17:32.000
So even if you assume that the signal here is monochromatic, what you see in your detector, what you see in the histogram, is not.

01:17:32.000 --> 01:17:48.000
It's a large, a very large distribution, especially if you consider masses around a TeV, which are the ones of interest for explaining the excess. And this is due basically to two effects. The first and most important effect is that the relative momentum

01:17:48.000 --> 01:17:56.000
resolution of the tracker is basically of the order of 100%, almost.

01:17:56.000 --> 01:18:06.000
And the second effect is the natural spread of the dE/dx distribution around the most probable value, which is given by the Landau distribution that was discussed before.
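The Landau-like spread around the most probable value can be mimicked with the Moyal distribution, a standard analytic approximation to the Landau. A convenient numpy-only sampler uses the fact that if Z is standard normal, then -log(Z²) is Moyal-distributed; the MPV and scale below are illustrative numbers, not a pixel-detector calibration:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_moyal(n: int) -> np.ndarray:
    """Sample the Moyal distribution (a common Landau approximation):
    if Z ~ N(0,1), then -log(Z**2) is Moyal-distributed."""
    z = rng.standard_normal(n)
    return -np.log(z * z)

# Smear a most-probable dE/dx value (arbitrary units) with Moyal noise.
mpv, scale = 2.5, 0.1
dedx = mpv + scale * sample_moyal(100_000)

# The distribution is right-skewed: a long tail above the MPV, so the
# sample mean sits above the median.
```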

01:18:06.000 --> 01:18:11.000
Once you take into account these two effects, we were able to reproduce basically

01:18:11.000 --> 01:18:19.000
the histograms given by the collaboration. And so we were confident that we could simulate the physics that we wanted.

01:18:19.000 --> 01:18:29.000
So I'm going to call this just m_dEdx, an effective mass and not the physical mass, because this is going to be important for what I'm going to say immediately after.

01:18:29.000 --> 01:18:43.000
And in particular, since we want to reinterpret the Q equal to one search in terms of Q equal to two physics, we have to do a trick. We cannot reinterpret the data directly, just because, for instance, we don't have

01:18:43.000 --> 01:19:00.000
the detailed information on the time of flight. What we can do is assume Q equal to two in the underlying physics hypothesis and run it through the Q equals one ATLAS protocol that was described

01:19:00.000 --> 01:19:06.000
before, and see how this Q equals one protocol would see this Q equals two physics.

01:19:06.000 --> 01:19:20.000
In particular, we can simulate this Q equal to two physics because we can approximate the average ionization of Q equal to two physics as just four times the one that has been calibrated and discussed before. But now it is clear that there is

01:19:20.000 --> 01:19:35.000
a mismatch between the charge of the physical particle and the charge assumed in the reconstruction protocol. Clearly, the effective mass that is the output of this protocol will not be an indicator of the physical

01:19:35.000 --> 01:19:50.000
mass of the particle; it will be somewhere else. Notice also that something similar is true for the momentum: the measurement reconstructed assuming Q equal to one is not the true momentum of your particle, but there's a mismatch by a factor of two.

01:19:50.000 --> 01:20:06.000
So a few of these factors conspire in such a way that the effective mass that is the output of the ATLAS protocol is quite different from the physical mass of your particle. But this doesn't matter, because this is a

01:20:06.000 --> 01:20:16.000
perfectly legitimate protocol that we can follow to build histograms, to build signal models, and use the signal models to fit the ATLAS data, according to the hypothesis that we are proposing.
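The reinterpretation trick can be sketched numerically. The dE/dx curve below is the same illustrative toy as before (f(bg) = A/bg² + B, not the ATLAS calibration), and the sketch assumes tracking measures rigidity p/q, so a charge-2 particle reconstructed as charge 1 is assigned half its true momentum:

```python
import math

A, B = 1.5, 1.0  # toy dE/dx curve f(bg) = A/bg**2 + B, arbitrary units

def q1_protocol(p_true: float, bg_true: float):
    """Run a Q = 2 particle through a Q = 1 reconstruction protocol (sketch).

    Tracking measures rigidity p/q, so assuming q = 1 halves the momentum;
    the deposited ionization is four times the q = 1 curve, so inverting
    the q = 1 curve returns a beta*gamma that is too small.
    """
    p_reco = p_true / 2.0                    # q = 2 mistaken for q = 1
    dedx_meas = 4.0 * (A / bg_true**2 + B)   # ionization scales as q**2
    bg_inferred = math.sqrt(A / (dedx_meas - B))
    m_eff = p_reco / bg_inferred             # effective mass, not physical
    return p_reco, bg_inferred, m_eff

# Benchmark-like daughter: m = 800 (GeV), beta*gamma ~ 2.96.
p_reco, bg_inf, m_eff = q1_protocol(800.0 * 2.96, 2.96)
```

The point of the sketch is only that the effective mass differs systematically from the physical mass, as stated in the talk; the size of the shift depends on the (here purely illustrative) curve.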

01:20:16.000 --> 01:20:29.000
Now, I remind you that the physics we're proposing is Q equal to two, boosted. What we assume is that there is a parent resonance which is produced, for instance, by Drell-Yan or gluon fusion.

01:20:29.000 --> 01:20:45.000
And this parent resonance, which is heavy, decays into two Q equal to two daughter particles, and these daughter particles will naturally be boosted if the parent is heavy enough, much heavier than the daughters in particular.
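The kinematics is simple to check: for a parent at rest decaying to two equal-mass daughters, each daughter carries energy M/2, so gamma = M/(2m). With the benchmark numbers quoted later in the talk (a hypothetical 5 TeV parent and 800 GeV daughters):

```python
import math

def daughter_boost(m_parent: float, m_daughter: float):
    """Two-body decay of a parent at rest into equal-mass daughters:
    each daughter has gamma = M / (2 m) and beta = sqrt(1 - 1/gamma**2)."""
    gamma = m_parent / (2.0 * m_daughter)
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return gamma, beta

# A 5 TeV parent decaying to 800 GeV daughters: gamma ~ 3.1, beta ~ 0.95,
# i.e. fast daughters, consistent with the time-of-flight information.
gamma, beta = daughter_boost(5000.0, 800.0)
```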

01:20:45.000 --> 01:20:55.000
So we can run this, and we can build signal models, we can build histograms, and we can fit the data, which are these black points here,

01:20:55.000 --> 01:21:03.000
using the signal models. In particular, this is just a benchmark model in which the parent resonance is at five TeV and the daughters are at 800 GeV.

01:21:03.000 --> 01:21:18.000
In red, this is the background, taken from the analysis, whereas green is the background plus our signal model, after it goes through the Q equal to one reconstruction algorithm.

01:21:18.000 --> 01:21:34.000
And you see that the excess can be fit basically perfectly; this is the m_dEdx histogram. But as a check, and not just as a check, we also looked at the pT histograms and the dE/dx versus pT distributions that were discussed before. There were some discussions

01:21:34.000 --> 01:21:42.000
about this gap, but, well, this gap is perfectly compatible, you know, with the background plus the signal model.

01:21:42.000 --> 01:21:58.000
And again, you see that here basically all the signal is at large pT, larger than 700 GeV or so. And the reason why you can isolate it from the background is, I remind you, because there is a large dE/dx cut assumed in this kind of

01:21:58.000 --> 01:21:59.000
histograms.

01:21:59.000 --> 01:22:11.000
For the dE/dx itself one cannot distinguish the excess from the background, but at least our signal model does not contradict the observation, basically.

01:22:11.000 --> 01:22:26.000
So, all in all, you see that you can fit all the available histograms with this simple physical hypothesis. That means that you can perform a parameter fit, and in particular, to do things in a standard way, here we do a profile likelihood fit with

01:22:26.000 --> 01:22:28.000
Poissonian likelihoods.

01:22:28.000 --> 01:22:42.000
In a slightly different way than ATLAS, what we do is fit all these three histograms, just because we want to use all the available information; otherwise the results would be slightly misleading.

01:22:42.000 --> 01:22:49.000
But to do so we have to build toy pseudo-experiments, estimate confidence intervals, and so on and so forth.
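A minimal sketch of such a binned Poisson likelihood scan over a signal strength mu; the bin contents are toy numbers, not the values of the analysis, and real toys would add Poisson fluctuations and nuisance parameters:

```python
import numpy as np

# Toy binned-Poisson likelihood scan for a signal strength mu.
bkg = np.array([5.0, 4.0, 3.0])   # expected background per bin (toy)
sig = np.array([1.0, 2.0, 3.0])   # expected signal per bin at mu = 1 (toy)
obs = bkg + 2.0 * sig             # Asimov-like observation built with mu = 2

def nll(mu: float) -> float:
    """Negative log of the binned Poisson likelihood (constants dropped):
    sum over bins of (expected - observed * log(expected))."""
    exp = bkg + mu * sig
    return float(np.sum(exp - obs * np.log(exp)))

# Scan mu on a grid and take the minimum as the best-fit signal strength.
mus = np.linspace(0.0, 4.0, 401)
mu_hat = mus[int(np.argmin([nll(m) for m in mus]))]
# Since the observation was built with mu = 2, the fit recovers mu_hat ~ 2.
```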

01:22:49.000 --> 01:23:07.000
And most importantly, in the plot that I'm going to show, what is taken into account is just the boosted production: proton-proton collisions produce a resonance, which decays into the particles of charge equal to two, considering just this

01:23:07.000 --> 01:23:08.000
production.

01:23:08.000 --> 01:23:22.000
What you see is that, in the parameter space of the model, of the scenario, with daughters up to 1.5 TeV, let's say, and parent resonances from 3.5 TeV or heavier,

01:23:22.000 --> 01:23:25.000
you can fit the excess in all of these regions.

01:23:25.000 --> 01:23:44.000
Most importantly, by the very construction, the beta that you have here is large, close to one, because these guys are boosted: if you have a five-TeV resonance decaying into two one-TeV particles, these will have a beta of the order of 0.9 or larger. So

01:23:44.000 --> 01:23:51.000
by construction we fit also the time-of-flight information.

01:23:51.000 --> 01:24:07.000
Additional predictions of the scenario: well, there are at least two irreducible, or almost irreducible, additional predictions. The first one is the fact that, okay, these guys have charge equal to two, so they can also be produced electromagnetically.

01:24:07.000 --> 01:24:13.000
And this is precisely the search whose results we saw in the second talk.

01:24:13.000 --> 01:24:28.000
And this is precisely the production mechanism considered in that analysis. And as we heard, this would give an even larger dE/dx. So the irreducible prediction here is that, in association with these events, you should

01:24:28.000 --> 01:24:48.000
also have events with a much larger dE/dx, maybe closer to the overflow range of the detector. And in particular, the limits that we heard about from last week cut a good portion of this parameter space; essentially they cut all of this below one of

01:24:48.000 --> 01:24:54.000
these lines, with a tension of 1.5 sigma at most, which is clearly not significant

01:24:54.000 --> 01:25:06.000
at this moment, but we should keep an eye on it, of course. And cutting away this good chunk of the parameter space, we see that the parent resonance has to be pretty heavy, let's say five TeV or heavier.

01:25:06.000 --> 01:25:21.000
And the second almost irreducible prediction of the scenario is that, since the parent resonance is produced by proton-proton collisions, you can just flip the diagram and see that this parent resonance should be able to decay into dijets, and so it should

01:25:21.000 --> 01:25:39.000
be seen also as a dijet resonance at the corresponding mass. And this is true up to the relative branching ratio into these daughter particles and into jets; given the dE/dx excess, the magnitude of the dijet signal is basically determined up to that ratio.

01:25:39.000 --> 01:25:52.000
And again, here you can consider the decay into light quarks or into gluons, and you see these are the relevant bounds; even here you are pushed to masses of around six TeV.

01:25:52.000 --> 01:26:05.000
Microscopic models: so, independently of this, we want a physical model with a parent resonance decaying into two charge-two particles, and from the model-building point of view, the most important,

01:26:05.000 --> 01:26:20.000
the most interesting aspect of this story is the parent resonance. In particular, there are a few options for the parent resonance, and the simplest options are the ones where the parent resonance is neutral under the Standard Model.

01:26:20.000 --> 01:26:35.000
So it could be either a scalar coupled to gluons, or a Z prime coupled to quarks. If it's a scalar coupled to gluons, essentially what you're assuming is that this parent couples to gluons with some energy scale, with some interaction

01:26:35.000 --> 01:26:49.000
scale, and that it of course couples to these daughter particles. But if you just look at the numbers, if you want to reproduce the best-fit cross section for the signal, for the dE/dx excess, it turns out that this has to be a strongly coupled scenario.

01:26:49.000 --> 01:27:01.000
So the prediction here is that there should be some strongly coupled sector not far away in mass from the five TeV, or whatever, mass scale of interest here.

01:27:01.000 --> 01:27:18.000
The other option is the Z prime, which can be coupled to quarks. And if you take into account that this has to be produced by quarks, by light quarks, so it has to be coupled to light quarks for sure, then the dijet bounds will

01:27:18.000 --> 01:27:35.000
give you a strong constraint. So in this case the Z prime has to be moderately leptophobic; I don't remember the exact numbers here, but the ratio of the couplings to the light quarks and to the daughter particles has to be a number which is in favor

01:27:35.000 --> 01:27:38.000
of the daughter particles, of the order of one or larger.

01:27:38.000 --> 01:27:47.000
Then the last option is that the parent resonance is charged, either under SU(2) weak or under colour.

01:27:47.000 --> 01:28:02.000
This is a bit more difficult to accommodate, although it could be useful for model-building the decay into daughters which are not the same. If the daughters are not the same, then clearly the parent cannot decay into a pair of the same particle, and this could be a good

01:28:02.000 --> 01:28:13.000
option if future data were to show that you always have just one track, and not two tracks, giving you a large dE/dx. Let me conclude.

01:28:13.000 --> 01:28:28.000
Well, the dE/dx excess could of course be some physical background, but this, I mean, from a theorist's point of view, seems unlikely, because, you know, I cannot imagine how some other particle could at the same time be slow and give you a beta

01:28:28.000 --> 01:28:39.000
close to one. It could be a statistical fluctuation, and again, from a theorist's point of view it's very unlikely, because these things are four sigma or more.

01:28:39.000 --> 01:28:54.000
It could be an experimental issue, and on this of course I have no idea. Or it could be new physics. And this is easy, because it's just simple: we need to wait, we need to wait for CMS, to see if they see the same thing.

01:28:54.000 --> 01:29:03.000
And we won't need to wait for long, because the Run 3 data should be enough to bring this excess to more than five sigma, if we are at the best-fit cross sections.

01:29:03.000 --> 01:29:06.000
And this is pretty much it; thank you.

01:29:06.000 --> 01:29:11.000
Thanks.

01:29:11.000 --> 01:29:25.000
so much for this talk. So, right, it's really interesting to hear the theory perspective, and also, I guess, you already mentioned and incorporated a bit the recent MCP results from the earlier talks; that's awesome.

01:29:25.000 --> 01:29:31.000
And I see we already have a couple of questions, so, Leonardo, do you want to go ahead? Yes.

01:29:31.000 --> 01:29:44.000
I mean, is this theory not broken by the table which has been shown in the multiply charged talk?

01:29:44.000 --> 01:29:55.000
Namely, on the seven events: I mean, those seven events do not show high ionization anywhere else, apart from the pixels.

01:29:55.000 --> 01:30:08.000
Well, of course, at least in the ATLAS search for the highest charges, the analysis takes into account different production mechanisms; they're not talking about boosted particles, they're

01:30:08.000 --> 01:30:19.000
talking about particles produced by photon fusion or Drell-Yan, which would be slow, and would have a much larger ionization than the one that we expect.

01:30:19.000 --> 01:30:22.000
So, this is the first point.

01:30:22.000 --> 01:30:38.000
Yeah, sure, but I mean, they have taken the seven events, and for the seven events they've looked at the ionization in the TRT and at the ionization in the MDT, and there are only two events which show high ionization there; this, in a

01:30:38.000 --> 01:30:54.000
sense, disconfirms the other five. So if this particle is a charge two, then it is not any more a charge two, at least for four or five out of the seven.

01:30:54.000 --> 01:31:07.000
So it can decay. Of course, I mean, these guys do not need to be stable, right? They can decay, and after the decay there are charge-one objects. Yeah, but the charged object should continue on the same

01:31:07.000 --> 01:31:23.000
path, because the tracks are reconstructed all the way out to the muon system. But then a charge-one particle would just be background, without the large dE/dx, right? Because, I mean, the charge-two object is charged, and the muon system sees the charge-one one.

01:31:23.000 --> 01:31:34.000
If it decays quickly enough, the decay happens essentially within the pixels; I mean, with a decay length of the order of the pixel detector size.

01:31:34.000 --> 01:31:41.000
And at that point it would just be background; it would be indistinguishable from background. And this is one option.

01:31:41.000 --> 01:31:57.000
Yeah, but, I mean, since we reconstruct the track throughout the detector, it must continue on the same path from being charge two to becoming charge one, which is very unlikely in a decay, unless the decay length is just right.

01:31:57.000 --> 01:32:09.000
Yeah, if it decays, the daughter keeps essentially the same direction as the parent. For us the charge-two particle is boosted, so, you know, the angle is really small.

01:32:09.000 --> 01:32:13.000
I mean, it may happen.

01:32:13.000 --> 01:32:24.000
In principle, I mean, for us, by construction, our particles have beta close to one, so the decay angle will be small. I mean, I'm just telling you this now because I learned about these results

01:32:24.000 --> 01:32:32.000
last week, like everybody else, right? So when we wrote the paper we did not think about this possibility; otherwise we would have written it down.

01:32:32.000 --> 01:32:41.000
But, you know, my understanding of that table is that this could point towards an experimental issue, and of course I do not have anything to say about that.

01:32:41.000 --> 01:32:43.000
Right.

01:32:43.000 --> 01:32:46.000
Thank you.

01:32:46.000 --> 01:32:48.000
I thank you.

01:32:48.000 --> 01:33:02.000
Go ahead. Daniela, I have the same complaint as Leonardo, and I don't think your answer works, because the boost you're talking about for your benchmark model is about three, and what determines the angles is gamma, not beta.

01:33:02.000 --> 01:33:15.000
So with a gamma of three this particle is going to come out at a different angle, it's going to come out with a different pT, it's going to have a different curvature; this thing is very likely not going to be reconstructed as a single track all the way

01:33:15.000 --> 01:33:30.000
through. And if instead you try to say that, okay, it decays to a particle of charge one and spits off something soft and somehow continues with the same momentum, it's still going to curve differently.

01:33:30.000 --> 01:33:37.000
Going from charge two to charge one, is this thing continuing in a straight line, or, should I say, on the same curve?

01:33:37.000 --> 01:33:39.000
This is just not very plausible.

01:33:39.000 --> 01:33:51.000
You know, you could try to do something by saying these things are hadronic and maybe they're charge-exchanging, but then the problem is that, again, I don't see how that's going to be consistent, for these seven events, with what happens, first of

01:33:51.000 --> 01:34:02.000
all, in the TRT and, second of all, in the muon system. It just does not seem to line up with a model like this, and on top of that, why don't we have events with two of these guys? You're producing two of them.

01:34:02.000 --> 01:34:07.000
So let me answer both questions. On the first question: we're talking about momenta of the order of a TeV.

01:34:07.000 --> 01:34:21.000
And what the experiment is measuring is the radius of curvature; basically, at these momenta, the uncertainty on the momentum is 100% or so. We saw that, and these things have actually been measured.

01:34:21.000 --> 01:34:34.000
So if you look at the momentum reconstructed in the muon spectrometer and the momentum in the tracker, they are in fact different, even for these seven events; this was shown in the big tables in the second talk.

01:34:34.000 --> 01:34:50.000
So, well, I do not have anything to add about what's right or not. I mean, there are discrepancies in the measurements of the momentum; at least to my eye it does not look like enough to distinguish the two hypotheses, but I cannot say more, because, again,

01:34:50.000 --> 01:35:06.000
I learned about it last week. And to answer your second question, about what happens to the other daughter: we should take into account that all these analyses have an efficiency of about, of the order of, 10%.

01:35:06.000 --> 01:35:19.000
Now, clearly, the efficiencies to reconstruct the two tracks are not totally independent, but part of the efficiency is independent, in particular the part about the quality of the tracks and so on.

01:35:19.000 --> 01:35:32.000
That means that, if you take into account that you already have this suppression, which is 10% or so, the probability of reconstructing two tracks is suppressed compared to the probability of reconstructing just one track.
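The counting argument can be made explicit, under the simplifying (and, as said in the talk, not exact) assumption that the two track reconstructions are independent:

```python
# Illustrative per-track reconstruction efficiency of ~10%, as quoted in
# the discussion; independence of the two tracks is a simplification.
eps = 0.10
p_both = eps * eps              # both daughters reconstructed
p_one = 2 * eps * (1 - eps)     # exactly one daughter reconstructed
ratio = p_one / p_both          # one-track events dominate by a factor 18
```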

01:35:32.000 --> 01:35:43.000
Now, since the excess is seven events, five or six or whatever, we did not worry about this issue. Of course, if the excess events were to become, say, thirty, and you always see one track,

01:35:43.000 --> 01:35:52.000
you would either have to invoke that this particle decays in flight, so that you have an additional suppression from the requirement that it live long enough,

01:35:52.000 --> 01:36:08.000
or you would have to consider the model that I was mentioning in my talk, in which the parent resonance has some other charge, so that it decays into two different particles, one of which has charge equal to two, and is the first one that

01:36:08.000 --> 01:36:21.000
we would see in the experiment, and the other one with, for instance, a different charge, or charge equal to zero, it doesn't matter. So you would see systematically only one of them. But this we cannot tell with five events, to be honest, because we have to take

01:36:21.000 --> 01:36:31.000
into account the fact that there is an additional suppression, that there is an efficiency of the order of 10% in these searches. No, no, that is clear.

01:36:31.000 --> 01:36:38.000
And I think the experimentalists would have seen signs of this, but that's up to them to say.

01:36:38.000 --> 01:36:39.000
No, I'm.

01:36:39.000 --> 01:36:50.000
Do you want to chime in on that? Or, Leonardo, sorry, I see your hand. Yeah. No, I just want to comment that the most interesting thing I have seen in this

01:36:50.000 --> 01:37:05.000
series of presentations is that there is a matching between two of the seven events, which in my opinion has a very low probability of happening. It is clear it's only two events,

01:37:05.000 --> 01:37:11.000
but that is really something that should be followed up more and more.

01:37:11.000 --> 01:37:31.000
So the charge-two interpretation, I think, is not to be thrown away, in my opinion, if only because of these two events, which certainly have a very, very low probability of happening.

01:37:31.000 --> 01:37:36.000
That's my comment.

01:37:36.000 --> 01:37:40.000
Okay, thank you, Stephanie I see your hand, go ahead.

01:37:40.000 --> 01:37:57.000
Yes, I agree with what Leonardo just said, and I was wondering if the model that has been presented would be easily extendable to charges which are not two but fractional, for example above one, between one and two.

01:37:57.000 --> 01:38:14.000
Yes, for us it's not a big deal. If you ask me, the theoretical motivation for charge two is much deeper than for having, say, charge three-halves, but of course, you know, you look and you discover whatever you find.

01:38:14.000 --> 01:38:28.000
So there is nothing forbidding that. But to be honest, let me show you again my slide; I mean, if I look at this, you want to call it the propaganda plot or whatever,

01:38:28.000 --> 01:38:33.000
If beta is one, then essentially it is the one that you see from the dE/dx.

01:38:33.000 --> 01:38:43.000
This plot is telling you that charge equal to two is the way to go. Basically, you can draw other lines for charge 1.5, but then you would still have the beta problem.

01:38:43.000 --> 01:38:58.000
The problem is beta not being equal to one. So that's why we focused on charge equal to two, basically, because we want to reconcile the ionization measurement, the dE/dx that you see in ATLAS, with the information on the time of flight, which tells you that beta has

01:38:58.000 --> 01:38:59.000
to be close to one, up to 0.1 or so, basically. So that's the real motivation for that.

01:39:07.000 --> 01:39:22.000
Yeah, actually, concerning this plot I have an additional question, which is less relevant, about the spread. The band that you're showing here, it is not a Landau, is it?

01:39:22.000 --> 01:39:45.000
Yeah, we just repeat the calibration from the data that were provided by ATLAS, somehow, but maybe we just fit it with this double function. We fit, you know, the low-energy data, which is the same procedure as ATLAS. That's basically it.

01:39:45.000 --> 01:39:54.000
Okay, thank you. In your case, this grey band: if it were charge two, would these grey bands be centered around... Sure.

01:39:54.000 --> 01:40:00.000
No, this is not beta gamma; I am speaking about the dE/dx, the band.

01:40:00.000 --> 01:40:17.000
The band, the size of the vertical band, the size of the band. Since the ionization in the pixels, instead of being normalized to one, should be normalized to four,

01:40:17.000 --> 01:40:28.000
and this is the blue line. The blue line is the dE/dx Bethe-Bloch curve. The band is what the experiment sees, okay, where the excess events are. I mean, I'm...

01:40:28.000 --> 01:40:31.000
Okay, thank you.

01:40:31.000 --> 01:40:38.000
Okay, thank you so much, and I think we have time for one more question. I see one hand is raised.

01:40:38.000 --> 01:40:59.000
Go ahead. Yeah, thank you. So I wanted to come back to the lifetime issue. You just entertained the idea that the thing could actually decay and then proceed as a charge-one particle, and someone said that

01:40:59.000 --> 01:41:16.000
this would probably not be possible because, I mean, the track would probably not match. But is it really true that you are that sensitive to it? I mean, if the charge-one particle were not super

01:41:16.000 --> 01:41:31.000
light but, say, just half the mass (I guess with half the mass we would even have the same bending), but even if it's not exactly half the mass, just considerably lighter, would you be able to tell whether

01:41:31.000 --> 01:41:45.000
the two tracks match? Or would that be... because this pT uncertainty is quite large, right? That's what I think. No, because this is what you reconstruct. I mean, these curves assume that you start from a particle at, for instance, 2.2 TeV, which

01:41:45.000 --> 01:41:58.000
is the yellow line. So the physics here is just a single particle at 2.2 TeV. What you reconstruct is then this broad curve; it goes from 500 GeV to 4 or 5 TeV, and more.

01:41:58.000 --> 01:42:11.000
I mean, I don't think there is enough information, you know, to investigate this possibility, but maybe the experiment knows better than me.

01:42:11.000 --> 01:42:24.000
Yeah, it would be interesting to hear from the experimentalists about this issue.

01:42:24.000 --> 01:42:42.000
Well, if I may, I mean, you know, if it was the events that didn't show up in the muon system which were also the ones that have low TRT, then that would be possible. That's exactly where...

01:42:42.000 --> 01:42:56.000
The events with low TRT show up in the muon system; events with high TRT don't.

01:42:56.000 --> 01:43:07.000
They can show up anywhere. I mean, charge equal to one is enough to be seen in the muon system; I don't understand the point.

01:43:07.000 --> 01:43:15.000
The events that don't show up in the muon system are the ones that seem consistent with not having had the decay.

01:43:15.000 --> 01:43:23.000
They're the ones that traveled far enough that you still got high TRT; the events that have low TRT, therefore, would have to be the ones that had the decay.

01:43:23.000 --> 01:43:31.000
Some of those might show up in the muon system; I don't disagree.

01:43:31.000 --> 01:43:40.000
I mean, it depends on whether the decay is to charge equal to one or equal to zero, right? This is a highly model-dependent statement, right?

01:43:40.000 --> 01:43:53.000
Again, you know, with this small number of events, I don't know how much we can... I mean, we just want to fit the physics, right?

01:43:53.000 --> 01:44:02.000
I think I've said... Okay, um, did you have something very quick? Yes, a very last one, very quick.

01:44:02.000 --> 01:44:30.000
So, can you eventually look at whether there is any mechanism such that, after a particle of, say, charge equal to two, there is a possibility that the decay product is a charge-one particle, such that it is, say, more or less collinear with the charged particle?

01:44:30.000 --> 01:44:39.000
But this is... beta, it is like a kinematics question, and the expectation is that it is sufficiently collinear just because beta is larger than 0.9 for us.

01:44:39.000 --> 01:44:49.000
It is not going to be exactly collinear; it is going to be a cone that can be calculated. But again, given the momentum resolution of 100% at this high momentum,

01:44:49.000 --> 01:45:01.000
you know, my intuition tells me this should not be a problem, but I do not have the answer, because, you know, these results came out after our work.

01:45:01.000 --> 01:45:15.000
Yes, but to be honest, I mean, from my point of view these are just a matter of phase space, right? The error of the fit is large, right? We have an excess, that's it; let's not try to do too much.

01:45:15.000 --> 01:45:25.000
Don't do too much, but it was just a suggestion.

01:45:25.000 --> 01:45:40.000
Maybe we can... if you have more questions, please put them in the chat if you want to continue this discussion. It's been very lively, so thank you to everyone who contributed, and especially to our speakers

01:45:40.000 --> 01:45:44.000
for making their presentations, and of course everyone who made the presentations earlier in the day.

01:45:44.000 --> 01:46:05.000
And then, Noah, if you could maybe post your slides for people, that would be great. Thank you. And thanks everyone for today, and hopefully see you tomorrow.

