Summer Student Opportunities 2025

The Particle Physics Experiment (PPE) group, in conjunction with the School of Physics & Astronomy, operates an annual summer student scheme for undergraduates to work on a research topic in experimental particle physics. This year we are pleased to once again offer several such studentships.

Eligibility: Please note that only students who will be enrolled for a further year of study after the summer are eligible for funding (for example, students finishing their third year, or students finishing their fourth year who are already accepted onto a 5-year MSci degree). Priority will be given to students who are passionate about experimental particle physics and would like to continue to a PhD in this area.
Students from outside the University can apply but please note no additional funding for accommodation will be provided.

Requirement: Students accepted onto the summer student programme must write a report at the end of their 6-week project, which provides an opportunity to develop their communication skills.

Application: Applications should be sent by email to Kenneth Wraight with title “PPE summer project application”.

Your application must contain the documents below. Please use the naming convention: last name, first name, followed by the document type (e.g. Last_First_ApplicationForm.xlsx, Last_First_PersonalStatement.doc).

Application Form: Please fill out a copy of the Summer Student Projects application form; include your grades there.

Personal Statement: a brief statement of research interest (why you want to do particle physics research) and any relevant prior experience. In addition, please include information on any presentations, posters or reports you have produced for research or academic projects, to help us judge your scientific communication skills.

APPLICATIONS ARE NOW OPEN – closing date: 15/03/2025

Process: Applications will be ranked by merit. Using the matrix of ranked students and each student's ranked project preferences, students will be suggested to supervisors, who will review the applications and may then set up an informal meeting to see whether the student has the skills and interest needed for that particular project. For this reason, students are encouraged to contact supervisors directly during the application period to learn more about the projects, so that they are well informed when submitting an ordered list of preferred projects.

Funding: Each project (formed by a supervisor and student pair) will compete for funding with projects from other physics groups, with the final decision resting with the Head of School. On average, 4-5 projects from the Particle Physics Experiment (PPE) group are funded each year. Successful applicants will receive £300/week for the duration of the project.

Previous projects: Please see the list on the left for details of previous years’ summer student projects.

PROJECT PROPOSALS (2025)

Exploring New Physics in LHC data using simplified analysis frameworks

Each year, the ATLAS Collaboration at the Large Hadron Collider (LHC) publishes dozens of searches for New Physics, such as supersymmetry. These searches rigorously test theoretical predictions of New Physics against experimental LHC data. However, the analysis techniques involved are complex and time-intensive, often taking years to complete and still leaving many fascinating New-Physics scenarios unexplored.

In this project, we will test previously unexplored New-Physics predictions against LHC data much faster: we will adapt ATLAS searches by creating simplified versions of particle-physics data analyses. We will compute event observables, apply machine-learning classifiers, and compare the output to Monte Carlo simulations. After the project, particle physicists worldwide will be able to use our adapted searches to perform fast and simple studies of their own.
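As a flavour of what a simplified search looks like, here is a minimal sketch; all observables, numbers and cuts are hypothetical, not taken from any real ATLAS search. A single signal-region cut replaces the full analysis chain, the selected yield is compared to scaled background simulation, and the significance of any excess is estimated.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy observables: missing transverse energy [GeV] for "data"
# and for a background Monte Carlo sample with 10x the statistics.
data_met = rng.exponential(scale=60.0, size=10_000)
mc_met = rng.exponential(scale=60.0, size=100_000)
mc_weight = 0.1  # scales the MC yield to the data luminosity

# A simplified search: one signal-region cut instead of a full analysis chain.
cut = 200.0  # GeV, hypothetical signal-region threshold
n_data = int(np.sum(data_met > cut))
n_bkg = mc_weight * np.sum(mc_met > cut)

# Rough significance of any excess (valid for large counts).
significance = (n_data - n_bkg) / np.sqrt(n_bkg)
print(f"observed: {n_data}, expected: {n_bkg:.1f}, excess: {significance:.2f} sigma")
```

A real simplified analysis would replace the toy arrays with reconstructed event observables and the single cut with a trained classifier, but the comparison logic stays the same.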

Project type: Data analysis/ simulation
Prerequisites: Python/C++ and Linux preferable
Preferred dates: TBC
Primary supervisor: Dr Martin Habedank
Secondary supervisor: Prof Andy Buckley

 

Multi-node federated machine learning for Grid deployment for ATLAS

Machine learning methods, particularly deep learning techniques, play an increasingly important role in physics analyses, with current leading-edge models being trained on terabyte-scale datasets. As both the models and the training datasets grow, the community is fast approaching the point where training large models at a single site is not feasible without a large financial investment. The Grid is a longstanding high-throughput computing (HTC) infrastructure designed to serve the needs of particle physicists, and several GPUs are already available through it. This project aims to serve as a proof of concept for federated learning models (using either TensorFlow Federated or PySyft), in which the entire dataset does not have to fit at one site, potentially enabling training on petabyte- and exabyte-scale datasets. The initial work would be done on one node (with GB- to TB-scale datasets) in containerised environments, with artificial latency used to model the delay between sites. Training would initially simulate the Glasgow and Manchester Tier-2 sites, as they have relatively low latency between them and both have similar-tier GPUs readily available via the Grid. There are a number of cutting-edge neural network topologies to use as a starting point, such as the novel transformer used in the recent ttH(bb) measurement (https://arxiv.org/pdf/2407.10904) or GN3, the current leading flavour-tagging model developed by ATLAS.
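To illustrate the core idea of federated learning independently of any particular framework, here is a minimal federated-averaging (FedAvg) sketch in plain NumPy. The two "sites", the linear model and all numbers are invented for illustration: each site trains locally on a data shard that never leaves the site, and only model weights are exchanged and averaged.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two simulated "sites" (think Glasgow and Manchester), each holding a data
# shard that never leaves the site -- only model weights are exchanged.
def make_shard(n):
    x = rng.normal(size=(n, 3))
    true_w = np.array([1.5, -2.0, 0.5])  # assumed "true" model weights
    y = x @ true_w + rng.normal(scale=0.1, size=n)
    return x, y

shards = [make_shard(500), make_shard(500)]
w = np.zeros(3)  # global model held by the "server"

# Federated averaging: a few local gradient steps, then weight averaging.
for round_ in range(50):
    local_weights = []
    for x, y in shards:
        w_local = w.copy()
        for _ in range(5):  # local epochs at each site
            grad = 2 * x.T @ (x @ w_local - y) / len(y)
            w_local -= 0.05 * grad
        local_weights.append(w_local)
    w = np.mean(local_weights, axis=0)  # server aggregates weights only

print(w)  # approaches the assumed true weights [1.5, -2.0, 0.5]
```

A framework such as TensorFlow Federated wraps exactly this pattern (local training, secure aggregation) around real models and real network transport.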

Project type: Data analysis/ simulation
Prerequisites: experience with Linux and Python, experience with machine learning desirable but not required.
Preferred dates: TBC
Primary supervisor: Dr Albert Borbely
Secondary Supervisor: Prof David Britton

 

Characterisation of a CMOS sensor

Specially designed CMOS sensors are being developed as tracking detectors for future particle physics experiments.

Glasgow is a member of the MALTA consortium and has characterised the performance, noise and radiation hardness of the MALTA CMOS sensors. This project will focus on the characterisation of the new MALTA2 sensor. Initial work will be electrical characterisation, testing the readout across the pixel matrix and the noise level that can be achieved. There will then be a study of the response to radioactive sources and X-rays. Students should have lab experience and be able to work with Python and Linux.

Project type: Instrumentation and data analysis
Prerequisites: lab experience and the ability to work with Python and Linux
Preferred dates: TBC
Primary supervisor: Prof Craig Buttar
Secondary supervisor: TBC

 

Studies of ATLAS pixel modules

Glasgow is currently producing pixel modules for the new silicon tracker for the ATLAS upgrade, and the modules are being thoroughly tested and characterised. The project will cover measuring the response of the pixel matrix, the noise, and how the modules operate at low temperatures, and will analyse the data from all the modules tested to study their performance. Students should have lab experience and be able to work with Python and Linux.

Project type: Instrumentation and data analysis
Prerequisites: lab experience and the ability to work with Python and Linux
Preferred dates: TBC
Primary supervisor: Prof Craig Buttar
Secondary supervisor: TBC

 

4D detectors for advanced science applications

Silicon detectors are the eyes of modern science experiments. Their deployment is ubiquitous: for example, they are used to reconstruct interactions in particle and nuclear physics experiments, to reconstruct diffraction patterns produced at synchrotrons to understand all manner of matter from quantum materials to viruses, and to perform medical imaging and dosimetry in medical therapy.

The most advanced silicon pixel detectors have a spatial resolution of order a micrometre, but a temporal resolution of only nanoseconds and a minimum detectable energy of 500 keV.

The next generation of silicon detectors, based on the Low Gain Avalanche Detector (LGAD), aims to improve the temporal resolution to 10 picoseconds and lower the minimum detectable energy to 100 keV. Such advances in performance will revolutionise a wide range of fundamental and applied science.

This project will characterise the latest iteration of LGADs developed within the UK in the laboratories of the Glasgow Experimental Particle Physics (PPE) group. The project will gather data using advanced instrumentation and analyse it using Python scripts. An understanding of semiconductor theory is required to interpret the results.
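As an illustration of one standard timing analysis (not necessarily the exact procedure used in the lab), the single-sensor time resolution can be estimated from the spread of time differences between two identical sensors operated in coincidence; if both contribute equally, the single-sensor resolution is the spread divided by the square root of two. The resolution value below is assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

sigma_true = 30e-12  # s, assumed single-sensor time resolution (illustrative)

# Toy timestamps from two identical sensors in coincidence.
t1 = rng.normal(0.0, sigma_true, size=20_000)
t2 = rng.normal(0.0, sigma_true, size=20_000)

# The spread of the time difference combines both sensors in quadrature,
# so the single-sensor resolution is sigma(dt) / sqrt(2).
sigma_dt = np.std(t1 - t2)
sigma_single = sigma_dt / np.sqrt(2)
print(f"single-sensor resolution ~ {sigma_single * 1e12:.1f} ps")
```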

Project type: Experimental/analysis
Prerequisites: Semiconductor physics, data analysis skills, basic coding, pleasant and hardworking character.
Preferred dates: July 21st onwards
Primary Supervisor: Dr Richard Bates
Secondary Supervisor: Dr Andy Blue

 

CP violation studies in multi-body decays with Run 3 LHCb data

The Large Hadron Collider (LHC) in Geneva, Switzerland, is the largest and most powerful particle collider ever built. The LHCb experiment is designed primarily to study a phenomenon known as CP violation (differences between matter and antimatter), which may provide clues as to how our universe evolved into the matter dominated state we observe today, from an initial state of equal matter and antimatter.

In preparation for Run 3 of the LHC, LHCb recently underwent a significant upgrade in which the detector was almost entirely rebuilt to operate in much harsher radiation conditions. After an initial commissioning period, the LHCb experiment has steadily collected high-quality data during 2024, already reaching almost the same dataset size as was collected in Runs 1 and 2 (2010-2018). This project will consist of analysing data collected in 2024 to develop a CP violation measurement using four-body decays of D^0 mesons (bound states of a charm quark and an anti-up quark) to kaons and pions.
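As a flavour of the simplest version of such a measurement, a raw CP asymmetry is just a normalised difference of yields between a decay and its CP conjugate. The sketch below uses hypothetical yields; a real analysis extracts them from mass fits and corrects for production and detection asymmetries.

```python
import numpy as np

# Hypothetical signal yields for a D^0 decay and its CP-conjugate D^0bar
# decay (invented numbers; real yields come from fits to mass spectra).
n_d0 = 101_250
n_d0bar = 98_750

# Raw asymmetry and its binomial uncertainty.
a_raw = (n_d0 - n_d0bar) / (n_d0 + n_d0bar)
n_tot = n_d0 + n_d0bar
sigma = np.sqrt((1 - a_raw**2) / n_tot)
print(f"A_raw = {a_raw:.4f} +/- {sigma:.4f}")
```

The statistical precision scales as one over the square root of the total yield, which is why the much larger Run 3 dataset matters.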

Project type: Data analysis
Prerequisites: Experience in python and/or C++. Experience in Linux and/or ROOT would be a big advantage.
Preferred dates: TBC
Primary supervisor: Dr Niall McHugh
Secondary supervisor: Prof Paul Soler

 

How bright is the LHC? Measuring the luminosity at LHCb with beam-gas imaging

The Large Hadron Collider (LHC) in Geneva, Switzerland, is the largest and most powerful particle collider ever built. The LHCb experiment is designed primarily to study differences between matter and antimatter, which may shed light on the early universe.

The quantity "luminosity" is a measure of how intense the collisions at a particle collider are. Typically, this is measured at a proton-proton collider using the "van der Meer" method, where the rate of collisions is measured as the two beams are scanned across one another. At LHCb, we have the unique ability to inject gas around the beams, and measure the luminosity by directly reconstructing the beam profiles, which is known as "beam-gas imaging". In 2014, the LHCb experiment published (at the time) the most precise determination of the luminosity at a hadron collider, with a precision of ~1% combining both methods.
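For intuition: for head-on Gaussian beams, the luminosity per colliding bunch pair follows directly from the convolved beam widths, which is exactly what beam-gas imaging reconstructs. The sketch below uses assumed, purely illustrative beam parameters.

```python
import numpy as np

# For head-on Gaussian beams, the per-bunch-pair luminosity is
#   L = f_rev * N1 * N2 / (2*pi * Sigma_x * Sigma_y),
# where Sigma_i = sqrt(sigma_{1,i}^2 + sigma_{2,i}^2) are the convolved widths.

f_rev = 11_245.0          # LHC revolution frequency [Hz]
N1, N2 = 1.1e11, 1.1e11   # protons per bunch (typical values, assumed here)

# Hypothetical fitted beam widths from reconstructed beam-gas vertices [m].
sigma1 = np.array([50e-6, 50e-6])  # beam 1 (x, y)
sigma2 = np.array([50e-6, 50e-6])  # beam 2 (x, y)

Sigma = np.sqrt(sigma1**2 + sigma2**2)
lumi = f_rev * N1 * N2 / (2 * np.pi * Sigma[0] * Sigma[1])  # [m^-2 s^-1]
print(f"L per bunch pair = {lumi * 1e-4:.3e} cm^-2 s^-1")
```

The measurement precision is then driven by how well the bunch populations and the reconstructed widths are known, which is where most of the project's analysis effort would go.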

In preparation for Run 3 of the LHC, LHCb recently underwent a significant upgrade in which the detector was almost entirely rebuilt to operate in much harsher radiation conditions. After an initial commissioning period, the LHCb experiment has steadily collected high-quality data during 2024. This project will consist of applying the beam-gas imaging method to directly reconstruct the profile of the LHC beams at the LHCb interaction point, and making a preliminary measurement of the luminosity using 2024 data with a precision of a few percent, which could form a useful input to some of the first LHCb publications using Run 3 data.

Project type: Data analysis
Prerequisites: Experience in python and/or C++. Experience in Linux and/or ROOT would be a big advantage.
Preferred dates: TBC
Primary supervisor: Dr Niall McHugh
Secondary supervisor: Prof Paul Soler

 

Machine learning tools for Quantum Information measurements at Hadron Colliders

Quantum entanglement was recently observed for the first time in pairs of top quarks at the Large Hadron Collider (LHC) by the ATLAS experiment, the first time that entanglement has been observed in fundamental 'unbound' quarks. The limiting factor in the measurement is that the top quarks decay almost instantly and must be reconstructed from other, more stable physics objects in the detector, such as charged leptons and hadronic jets, and the resolution of this reconstruction is currently quite poor. This project will seek to improve the existing reconstruction algorithms developed by the Glasgow team that led the discovery, with the end goal of using advanced machine-learning techniques not only to study quantum entanglement but also to test Bell inequalities in top-quark events at the LHC. There is also scope to work on transferring these tools to a GPU-based architecture.
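For context, the entanglement marker used in such measurements can be written as D = -3⟨cos φ⟩, where φ is the angle between the two charged-lepton directions in their respective parent-top rest frames, and D < -1/3 signals entanglement. The toy sketch below (assumed true value of D, accept-reject sampling, no detector effects) shows how D is estimated from a sample of events.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: cos(phi) follows the distribution (1 - D*cos(phi))/2,
# so <cos(phi)> = -D/3 and the estimator is D = -3*<cos(phi)>.
d_true = -0.55  # assumed true value, chosen for illustration

def sample_cosphi(n):
    """Accept-reject sampling from (1 - d_true*c)/2 on [-1, 1]."""
    out = []
    while len(out) < n:
        c = rng.uniform(-1.0, 1.0, size=n)
        u = rng.uniform(0.0, 1.0, size=n)
        keep = u < (1 - d_true * c) / (1 + abs(d_true))
        out.extend(c[keep][: n - len(out)])
    return np.array(out)

cosphi = sample_cosphi(100_000)
d_meas = -3 * cosphi.mean()
print(f"measured D = {d_meas:.3f} (entangled if D < -1/3)")
```

The real measurement's difficulty lies in reconstructing the top rest frames from detector-level objects; poor reconstruction smears cos φ and dilutes D, which is what improved algorithms aim to fix.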

Project type: Data analysis/simulation
Prerequisites: Python or C++ essential, some Linux experience preferable
Preferred dates: July onwards
Primary supervisor: Dr James Howarth
Secondary supervisor: Prof Mark Owen

 

Searching for new pentaquark decays at LHCb with Run 3 data

The LHCb experiment is one of the detectors at the LHC, built to investigate b-hadron decays and to probe for physics beyond the Standard Model through mechanisms such as charge-parity violation and lepton-flavour-universality violation.

The LHCb detector has recently observed a number of tetraquark and pentaquark candidates, corresponding to four- and five-quark states respectively. The nature of these states is not yet clear. For example, in the case of the pentaquarks, the state could be a molecule of a baryon (three-quark state) and a meson (two-quark state), or all five quarks could be bound together in a compact state. None of these hypotheses has yet been confirmed. The observation of new exotic states, or learning more about known states, will help to discriminate between these theories and ultimately to understand their nature.

In this project you will analyse LHCb data samples from the current LHC run to search for a pentaquark decay that has not previously been observed. The main challenge of the analysis will be to correctly model the backgrounds present in the sample so that the signal can be seen more clearly. Furthermore, the software used to process the data is new and in active development, so time will be spent understanding the algorithms used. Fits to the data will also be necessary to evaluate the yield of the signal, if one is seen.
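As a flavour of the fitting step, here is a minimal binned fit of a Gaussian signal on top of a flat background to a toy mass spectrum, using SciPy rather than ROOT/RooFit; the mass, width and yields are invented purely for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Toy mass spectrum: a Gaussian "signal" peak on a flat background
# (hypothetical mass and width, just to show the fit machinery).
sig = rng.normal(loc=4.45, scale=0.01, size=2_000)      # GeV
bkg = rng.uniform(4.30, 4.60, size=20_000)
counts, edges = np.histogram(np.concatenate([sig, bkg]),
                             bins=60, range=(4.30, 4.60))
centres = 0.5 * (edges[:-1] + edges[1:])
width = edges[1] - edges[0]

def model(m, n_sig, mu, sig_w, b0):
    """Expected counts per bin: Gaussian signal yield n_sig plus flat b0."""
    gauss = (n_sig * width / (sig_w * np.sqrt(2 * np.pi))
             * np.exp(-0.5 * ((m - mu) / sig_w) ** 2))
    return gauss + b0

p0 = [1_000.0, 4.45, 0.02, counts.mean()]
popt, pcov = curve_fit(model, centres, counts, p0=p0,
                       sigma=np.sqrt(np.maximum(counts, 1)))
print(f"fitted yield = {popt[0]:.0f}, mass = {popt[1]:.4f} GeV")
```

RooFit does the same job with unbinned likelihoods and composable PDFs, which is why it is the tool of choice for the real analysis.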

Project type: 10% Experiment, 90% Modelling/Data Analysis
Prerequisites: This project makes use of the ROOT data analysis framework commonly used at CERN. This framework uses C++ and python, so previous knowledge of these or of ROOT itself would be advantageous.
Preferred dates: TBD
Primary Supervisor: Dr Gary Robertson
Secondary Supervisor: Dr Mark Whitehead

 

Searching for pentaquark production in semi-leptonic states at LHCb

The LHCb experiment is one of the detectors at the LHC, built to investigate b-hadron decays and to probe for physics beyond the Standard Model through mechanisms such as charge-parity violation and lepton-flavour-universality violation.

The LHCb detector has recently observed a number of tetraquark and pentaquark candidates, corresponding to four- and five-quark states respectively. The nature of these states is not yet clear. For example, in the case of the pentaquarks, the state could be a molecule of a baryon (three-quark state) and a meson (two-quark state), or all five quarks could be bound together in a compact state. None of these hypotheses has yet been confirmed. The observation of new exotic states, or learning more about known states, will help to discriminate between these theories and ultimately to understand their nature.

In this project you will analyse LHCb data samples from the current LHC run to search for pentaquark contributions to the semi-leptonic decay of a b-hadron. Semi-leptonic decays are challenging to reconstruct at LHCb because they involve the production of a neutrino, which is not detected. Within this decay a pentaquark is also predicted to be produced, which then decays further. The main challenge of the analysis will be to correctly model the backgrounds present in the sample so that the signal can be seen more clearly. Furthermore, the software used to process the data is new and in active development, so time will be spent understanding the algorithms used. Fits to the data will also be necessary to evaluate the yield of the signal, if one is seen.

Project type: 10% Experiment, 90% Modelling/Data Analysis
Prerequisites: This project makes use of the ROOT data analysis framework commonly used at CERN. This framework uses C++ and python, so previous knowledge of these or of ROOT itself would be advantageous.
Preferred dates: TBD
Primary Supervisor: Dr Gary Robertson
Secondary Supervisor: Dr Mark Whitehead

 

Improving Higgs event reconstruction in ATLAS using neural-network based neutrino prediction

Since the discovery of the Higgs boson in 2012 by the ATLAS and CMS experiments, its properties have been measured to very high precision, furthering our understanding of the Standard Model and hinting at possible avenues for new physics. Recently there has been particular interest in probing the polarisation properties of Higgs boson decays in order to make measurements of quantum entanglement at very high energies. Such measurements are possible using events in which a Higgs boson decays into a W+ and W- boson pair, each of which subsequently decays leptonically into a charged lepton and its associated neutrino.

LHC collisions are dominated by quark backgrounds, and the charged leptons leave a very recognisable signature within the detector that can be used to easily isolate candidate Higgs events. However, the associated neutrinos leave no direct signal in the detector, making it difficult to accurately reconstruct the kinematics of the full decay chain. By leveraging a neural network based on fast conditional normalising flows, it is possible to restrict the phase space of available neutrino kinematics, allowing more accurate neutrino reconstruction. In this project you will train and optimise such a network to reconstruct H→WW events using simulated data matching the ATLAS detector response. If improved and validated, such methods could enable state-of-the-art quantum entanglement measurements at collider experiments.
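For context, the classic analytic baseline that such networks aim to improve on imposes a W-mass constraint and solves the resulting quadratic for the unknown neutrino longitudinal momentum, taking the neutrino transverse momentum from the missing transverse momentum. A sketch in the massless-lepton approximation, with hypothetical kinematics:

```python
import numpy as np

M_W = 80.4  # GeV, W boson mass used in the constraint

def neutrino_pz(lep, met_x, met_y):
    """Solve m_W^2 = (p_lep + p_nu)^2 for the neutrino p_z.

    lep = (px, py, pz, E) of the charged lepton (assumed massless);
    (met_x, met_y) is the missing transverse momentum, taken as pT(nu).
    Returns the candidate p_z values (one value if the discriminant
    is negative, a common convention).
    """
    lpx, lpy, lpz, le = lep
    met2 = met_x**2 + met_y**2
    mu = 0.5 * M_W**2 + lpx * met_x + lpy * met_y
    pt2 = le**2 - lpz**2          # = pT(lepton)^2 for a massless lepton
    a = mu * lpz / pt2
    disc = a**2 - (le**2 * met2 - mu**2) / pt2
    if disc < 0:                  # no real solution: keep the real part
        return [a]
    r = np.sqrt(disc)
    return [a - r, a + r]

# Hypothetical lepton four-momentum and missing momentum [GeV].
lep = (30.0, 10.0, 25.0, np.sqrt(30.0**2 + 10.0**2 + 25.0**2))
print(neutrino_pz(lep, -20.0, 15.0))
```

The twofold ambiguity (and the frequent lack of real solutions under detector resolution) is exactly the degeneracy that a conditional normalising flow can resolve statistically.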

Project type: Data analysis
Prerequisites: Experience with Linux and Python preferable but not essential. Some familiarity with machine-learning concepts would be an advantage.

Preferred dates: July/August
Primary supervisor: Dr Jonathan Jamieson

Secondary supervisor: Dr Jay Howarth

 

Reactor constraints for the T2K experiment

The T2K experiment is based in Japan and studies the oscillation of neutrinos as they travel 295 km beneath the Japanese Alps. The oscillation is characterised by a frequency (related to the neutrino masses) and an amplitude (related to the quantum-mechanical mixing of neutrino states). T2K appears to see a significant difference between the oscillations of neutrinos and antineutrinos, pointing to the largest known difference between the physics of matter and antimatter. However, the result relies on accurate measurements of neutrino oscillations in other systems, including neutrinos from reactors.

To date, all measurements at T2K have relied on reactor neutrino data to provide a constraint on one of the four parameters that govern the mixing (an amplitude), but the reactor experiments are also capable of providing information about the neutrino masses (a frequency), which should improve T2K's measurements of the matter-antimatter asymmetry. Glasgow students have developed the analysis tools to incorporate this '2-dimensional' information into T2K's analysis, but one piece is still missing: there is no analysis that extracts a fully correlated '2-d' measurement from one of the reactor experiments.

The aim of the project is to see if we can extract this 2-d constraint by reanalysing the data, using the fit itself to deduce the missing details of the published analysis. It can be done in Python on the School's Jupyter server, or in your preferred programming language.
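The core of such a reanalysis is a chi-square scan over both parameters at once. The toy sketch below uses the standard two-flavour survival probability with an invented measurement and baseline, just to show the mechanics of a 2-d grid scan:

```python
import numpy as np

# Two-flavour survival probability:
#   P = 1 - sin^2(2*theta13) * sin^2(1.267 * dm2 * L / E),
# with dm2 in eV^2 and L/E in m/MeV.
L_over_E = 400.0           # m/MeV, e.g. ~1.6 km baseline / ~4 MeV (assumed)
p_obs, p_err = 0.92, 0.01  # hypothetical measured survival probability

amps = np.linspace(0.05, 0.15, 201)      # sin^2(2*theta13) grid (amplitude)
dm2s = np.linspace(2.0e-3, 3.0e-3, 201)  # dm2 grid [eV^2] (frequency)
A, D = np.meshgrid(amps, dm2s)

p_pred = 1 - A * np.sin(1.267 * D * L_over_E) ** 2
chi2 = ((p_pred - p_obs) / p_err) ** 2

i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
print(f"best fit: sin^2(2theta13)={A[i, j]:.3f}, "
      f"dm2={D[i, j]:.2e} eV^2, chi2_min={chi2.min():.2f}")
```

With a single observed number the minimum is a degenerate valley in the (amplitude, frequency) plane; a real analysis fits the full measured energy spectrum, which is what breaks the degeneracy and makes a fully correlated 2-d constraint valuable.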

Project type: Simulation
Prerequisites: Familiarity with a numerical programming language with graphing capability (such as Python or C) is essential.
Preferred dates: TBC
Primary supervisor: Dr Phillip Litchfield

Secondary supervisor: TBC


Analysing quarkonia production in jets with the LHCb experiment at CERN

The production of quarkonia, mesons with either c-cbar or b-bbar quark content, has long been a source of contention in particle physics: open questions range from the underestimation of their production cross-sections by theory calculations, relative to experimental data, to the long-standing polarisation puzzle. Hence, any measurement that gives a better idea of how these states are produced is invaluable.

The Large Hadron Collider beauty (LHCb) experiment, based at the LHC at CERN, has been a world leader in many measurements, including rare decays, spectroscopy and CP violation. The first measurements of quarkonia production in jets (large sprays of energetic particles) were performed by LHCb, which clustered J/ψ mesons into jets and measured the momentum fraction that the J/ψ carries within the jet. This measurement had unexpected results, with experimental data showing significantly lower momentum fractions than theory predictions, and it considerably aided theory calculations, allowing event generators such as Pythia to better model quarkonia production.

In this project you will analyse LHCb data samples from LHC Run 2 to look at χc production in jets, where the χc mesons are excited charmonium states. It is unknown whether these will exhibit the same behaviour as the J/ψ or completely new behaviour; this will be the first time this measurement has been performed by any experiment. The project will involve coding, in either C++ or Python, mainly using the fitting tool RooFit. If there is interest, the project can also include simulation experience using Pythia, a Monte Carlo package that simulates particle interactions.
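The central observable is simple to define: the fragmentation fraction z = pT(quarkonium)/pT(jet). The toy sketch below uses invented jet momenta and an assumed z shape, purely to show how the observable is built and histogrammed:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical candidates: jet transverse momenta, with the chi_c momentum
# generated from an assumed fragmentation shape (Beta distribution).
pt_jet = rng.uniform(20.0, 60.0, size=5_000)   # GeV
z_true = rng.beta(4, 2, size=5_000)            # invented z shape
pt_chic = z_true * pt_jet                      # GeV

# The measured observable per candidate, histogrammed in z.
z = pt_chic / pt_jet
hist, edges = np.histogram(z, bins=10, range=(0.0, 1.0))
print(hist, f"mean z = {z.mean():.2f}")
```

In the real measurement each z bin's yield comes from a RooFit mass fit (to separate signal from background) rather than simple counting, and efficiency corrections are applied bin by bin.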

Project type: Data Analysis
Prerequisites: Familiarity with a programming language such as Python or C++
Preferred dates: TBC
Primary supervisor: Dr Naomi Cooke
Secondary supervisor: Dr Mark Whitehead

 

Exploring and testing new simulation models in particle physics

The production of quarkonia, mesons with either c-cbar or b-bbar quark content, has long been a source of contention in particle physics: open questions range from the underestimation of their production cross-sections by theory calculations to the long-standing polarisation puzzle. A new breakthrough in simulation modelling could resolve the discrepancy observed between theoretical models and experimental data. This model describes quarkonia production in the parton shower, which is simply the radiation of quarks and gluons.

In this project, you will test this new model, which is implemented in the Monte Carlo simulation package Pythia 8, a tool that simulates high-energy particle interactions. First, you will make theoretical predictions using Pythia 8 to see the contribution of this model to Standard Model processes such as decays of the famous Higgs boson, and then to inclusive branching-fraction measurements of quarkonia. Finally, you will compare these predictions to current experimental data from the Large Hadron Collider. The final stage of the project will use either Pythia 8 or Rivet, depending on preference. You will also learn how to handle big datasets by running parallel jobs on computing clusters.

Project type: Simulation
Prerequisites: Familiarity with a programming language such as Python or C++
Preferred dates: TBC
Primary supervisor: Dr Naomi Cooke
Secondary supervisor: Dr Mark Whitehead

 

Challenging new-physics models with ATLAS double-Higgs data

One of the major unsolved problems in experimental particle physics is the search for evidence that the Higgs field (and its excitation, the Higgs boson) "self-couples", i.e. that one Higgs boson can split into two. Detecting this in the Standard Model is one of the major current and future efforts of the ATLAS collaboration at the Large Hadron Collider. Two Higgs bosons might also be produced by exotic "beyond Standard Model" (BSM) mechanisms, and these can be a little easier to detect: the Glasgow ATLAS group has been leading a data analysis to test "two-Higgs-doublet" BSM models. But this is not the only family of BSM models that could be sensitive to our measurements: we want to preserve a version of the analysis that can be used for general BSM-model testing. This project will work on translating the data analysis (including its machine-learning elements) to the simplified Rivet system (https://rivet.hepforge.org/), understanding and validating its behaviour, and testing the effects on a new BSM model with the Contur platform (https://hepcedar.gitlab.io/contur-webpage/), ensuring lasting impact from this new measurement.

Project type: Data analysis/simulation
Prerequisites: Python/C++ and Linux preferable
Preferred dates: TBC
Primary supervisor: Prof Andy Buckley
Secondary supervisor: Dr Giuseppe Callea