Summer Student Opportunities 2016
The Particle Physics Experiment (PPE) group, in conjunction with the School of Physics & Astronomy, operates an annual summer student scheme whereby undergraduates can work on a research topic in experimental particle physics.
Students are typically funded for 6 weeks during a pre-arranged period over the summer (June-September). Please note that preference is given to 3rd and 4th year students. This opportunity is typically for University of Glasgow students, although students from elsewhere have also been supported.
The list of projects available for Summer 2016 will be added gradually to this webpage until the end of January 2016. The deadline for receipt of applications is Friday 4th March 2016.
Note that the additional costs of travel and accommodation cannot be met by the Group/School.
Application Procedure
Eligibility: Please note that only students who will be enrolled for a further year of study after the summer are eligible for funding (for example, students finishing their third year, or students finishing their fourth year who have already been accepted onto a 5-year MSci degree). Priority will be given to students with a strong interest in experimental particle physics who are considering continuing to a PhD in the field.
Requirement: Students who are accepted onto the summer student programme must write a report at the end of their 6-week project, which provides an opportunity to develop their communication skills.
Application: Applications should be sent by email to Michael Alexander.
Your application must contain the following documents as .pdf files, named with your last name, then first name, followed by the document type (using my name as an example):
- AlexanderMichaelCV.pdf (CV);
- AlexanderMichaelInterest.pdf (a brief statement of research interest);
- AlexanderMichaelResearch.pdf (a brief description of any research you have done previously, as well as computing projects);
- AlexanderMichaelGrades.pdf (a record of your grades obtained so far at university);
- AlexanderMichaelProjects.pdf (a ranked list of your preferred projects as offered);
- in addition, information on any presentations, posters or reports that you have written for research or academic projects, which will help us to judge your scientific communication skills. Documents prepared in LaTeX rather than Word reflect well on the applicant, since project reports in the scientific community are typically written in LaTeX.
Process: Applications will be ranked by merit. Using the matrix of ranked students and each student's ranked project preferences, students will be suggested to supervisors, who will review the applications and may then set up an informal meeting to check that the student has the skills and interest needed for that particular project. Students are therefore encouraged to contact supervisors directly during the application period to learn more about the projects, so that they are well informed when compiling their ordered list of preferred projects.
Funding: The projects (each formed by a supervisor-student pair) will compete for funding with projects from other physics groups; the final decision rests with the Head of School. On average, 4-5 projects in the particle physics experimental (PPE) group are funded each year. This year we aim to fund 6 projects in PPE.
Previous Schools: Please see the list on the left for details of previous years' summer student projects.
Page updated 29/01/2016
Improving ATLAS
The ATLAS experiment, one of the two general-purpose detectors at the LHC at CERN, will be upgraded over the next decade in order to handle the increased collision rate of the high-luminosity Large Hadron Collider (LHC). Various subsystems will be improved to take full advantage of the new collision environment, including the inner pixel detector. This is the crucial subsystem for locating the interaction point of the initial collision and identifying the daughter particle trajectories and subsequent decays.
Studies are already underway to investigate the optimum specifications of the new pixel sensors: geometry, material, biasing, and electronic and mechanical performance. Testbeam data have been collected for a range of devices under various operating conditions. These data are then analysed and compared before conclusions are drawn about the improved system.
You will be involved in the analysis of data collected from novel pixel detectors in various testbeam environments. This will characterise the detector response and help improve the next generation of ATLAS pixel sensors.
Project type: data analysis through computing, hardware
Prerequisites: reasonable familiarity with Linux and C++; some experience of the ROOT analysis package is preferred but not essential.
Preferred dates: June or August
Main Supervisor: Kenneth Wraight
Second Supervisor: Richard Bates
Development of Job Analytics Framework for UKI-SCOTGRID-GLASGOW
Data flow of the cgroup based monitoring platform developed at UKI-SCOTGRID-GLASGOW
UKI-SCOTGRID-GLASGOW is one of the largest UK Tier-2 sites of the Worldwide LHC Computing Grid (WLCG), with over 2.4 PB of storage and 4000 CPU cores. Over the past 24 months a system for monitoring and analysing the resource requirements of jobs running at the Scotgrid site has been developed, using cgroups as the instrumentation platform.
This framework harvests memory, runtime, CPU usage and other metrics for each job run at the site, and can be used to analyse the resources requested and used. The proposed project is to turn the initial framework into a production-ready system by examining the requirements of the backend database system and expanding its analytical capabilities.
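To illustrate the kind of instrumentation involved, here is a minimal sketch in Python of reading a per-job memory metric from a cgroup hierarchy. The paths, job layout and metric name are assumptions for illustration; the real framework's layout may well differ.

```python
# Minimal sketch (hypothetical cgroup layout): harvest one memory metric
# per job from a cgroup v1-style hierarchy, skipping jobs with no cgroup.
import os

def read_cgroup_metric(job_id, metric="memory.max_usage_in_bytes",
                       root="/sys/fs/cgroup/memory/jobs"):
    """Return an integer metric for one job's cgroup, or None if absent."""
    path = os.path.join(root, job_id, metric)
    try:
        with open(path) as f:
            return int(f.read().strip())
    except OSError:
        return None

def summarise(jobs, root="/sys/fs/cgroup/memory/jobs"):
    """Collect peak memory for a list of job IDs, dropping missing cgroups."""
    metrics = {j: read_cgroup_metric(j, root=root) for j in jobs}
    return {j: m for j, m in metrics.items() if m is not None}
```

In a production system, values like these would be pushed into the backend database at regular intervals rather than read on demand.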
Prerequisites: A good understanding of computing principles is required. Some knowledge of Python would be very helpful, as would some knowledge of database design. An understanding of the ROOT analysis framework would be useful, but not essential.
Preferred Dates: Starting around mid-June 2016 (negotiable).
Main Supervisor: Gareth Roy
Second Supervisor: David Britton
Novel Simulations for HEP
Sensor Simulation
Modern particle physics research requires the development and application of high-resolution detectors. For high-energy collision experiments, such as those found at the LHC, these detectors are used for tracking particle trajectories; however, applications can be found from medical imaging to dosimetry aboard the ISS. Novel sensors are tested electronically in bench tests, but to assess their response to high-energy radiation, detectors are taken to testbeam facilities. Often the detectors under test are placed in the centre of a group of well-understood devices. A beam of particles then traverses the set of detectors, and the measured particle trajectories can be compared between the trusted and tested sensors (see telescope schematic).
In order to fully understand measured testbeam data, simulations of the testbeam set-up are used to predict the detector response. These simulations involve sensor material, pixel geometry, device orientation and incident radiation (see sensor simulation). Based on the outcome of the simulations, measurements can be planned and testbeam strategies devised ahead of data-taking.
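The flavour of such a simulation can be conveyed with a toy sketch: smear "true" hit positions with a Gaussian detector resolution, digitise onto a pixel grid, and form the residual between the reconstructed pixel centre and the true position. The resolution and pitch values below are illustrative assumptions, not those of any real sensor, and the real simulation chain models material, geometry and orientation in far more detail.

```python
# Toy testbeam-simulation step: Gaussian smearing for sensor resolution,
# then digitisation onto a pixel grid. Units in mm; numbers are made up.
import random

def simulate_hits(true_positions, resolution=0.010, pitch=0.050):
    """Return (pixel_index, residual) per hit.

    residual = reconstructed pixel centre minus true position.
    """
    out = []
    for x in true_positions:
        measured = random.gauss(x, resolution)   # finite sensor resolution
        pixel = int(measured // pitch)           # digitisation onto pixels
        centre = (pixel + 0.5) * pitch           # reconstructed position
        out.append((pixel, centre - x))
    return out
```

Comparing the residual distribution between simulation and data is one simple way to characterise a device's response.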
You will be involved in the simulation and pseudo-analysis of novel pixel detectors in various testbeam environments. This will help improve the reconstruction of past and future testbeam data and hence aid the characterisation of detectors used for the upgrade of LHC experiments and beyond.
Telescope Schematic
Project Type: Simulation/data analysis through computing.
Prerequisites: Reasonable familiarity with Linux and C++; some experience of the ROOT analysis package is preferred but not essential.
Preferred Dates: June or August
Main Supervisor: Alex Morton
Second Supervisor: Kenneth Wraight
Search for the Higgs boson produced in association with top-quark pairs with the ATLAS experiment
Production of a top-quark pair in association with a Higgs-boson, where the Higgs-boson decays into a b-quark pair
One of the important physics goals of the Large Hadron Collider in the current data-taking run is to study the characteristics of the Higgs boson, for example its couplings to other fundamental particles. As the heaviest quark, the top quark has the strongest coupling to the recently discovered Higgs boson (the Yukawa coupling), so measuring this coupling is one of the most interesting tests of the Standard Model Higgs boson. It can be probed by studying the production of Higgs bosons in association with top-quark pairs: the ttH channel. At the measured Higgs mass of about 125 GeV, the Higgs decays predominantly into a b-quark pair. This channel is very challenging because it suffers from large backgrounds (events from different production processes that can look similar to the signal). The main background is the production of a top-quark pair with an additional b-quark pair, which has exactly the same final-state particles (an irreducible background). The discrimination of ttH and ttbb events is therefore of the utmost importance to the sensitivity of the ttH channel. For this separation, neural networks are a useful tool, allowing several variables to be combined into a final discriminant between the two processes.
Production of background events that mimic the Higgs signal events
This summer project focuses on investigating methods to improve the signal/background separation. You will learn about neural network training and how to statistically analyse the results.
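The principle behind such a discriminant can be illustrated with a deliberately tiny stand-in: a single-neuron logistic discriminant trained by gradient descent on two mock kinematic variables. This is an illustration only; the actual analysis uses multi-layer neural networks (via dedicated tools) with many input variables, and the variable names here are invented.

```python
# Toy signal/background discriminant: one logistic neuron trained by
# stochastic gradient descent on two mock variables. Label 1 = "ttH-like",
# label 0 = "ttbb-like". Not the ATLAS tools -- an illustration only.
import math, random

def sigmoid(z):
    if z < -60.0:                 # guard against overflow in exp()
        return 0.0
    return 1.0 / (1.0 + math.exp(-z))

def train(signal, background, lr=0.1, epochs=200):
    """Fit weights w and bias b by minimising the cross-entropy loss."""
    w = [0.0, 0.0]; b = 0.0
    data = [(x, 1) for x in signal] + [(x, 0) for x in background]
    for _ in range(epochs):
        random.shuffle(data)
        for x, y in data:
            p = sigmoid(w[0]*x[0] + w[1]*x[1] + b)
            g = p - y             # gradient of the cross-entropy loss
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b    -= lr * g
    return w, b

def discriminant(x, w, b):
    """Output near 1 for signal-like events, near 0 for background-like."""
    return sigmoid(w[0]*x[0] + w[1]*x[1] + b)
```

Cutting on the discriminant output (or fitting its shape) then gives the final separation between the two processes.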
Project type: Data analysis through computing (simulation and real data).
Prerequisites: Good familiarity with Linux, C++ and Python and/or shell scripting.
Preferred dates: May-July
Main Supervisor: Sarah Boutle
Second Supervisor: Tony Doyle
Visualising LHC event simulation and analysis
Illustration of event display and analysis -- to be made more dynamic!
Particle collisions at the LHC are complex phenomena, and their simulation is accordingly many-faceted. The simulations are based on a core of quantum field theory calculations, enhanced by a mix of approximated quantum chromodynamics and models of hadronisation. The resulting simulated events are a myriad of different particle species and kinematics, which somehow encode the structure of the fundamental interaction.
The job of particle experimentalists is to perform the decoding, using methods like sequential clustering of nearby particles into "jets", but it is hard to explain this process to non-specialists when the representation is mathematics and computer code. This project will develop new interactive 3D visualisations of event simulations and the methods used to analyse them, and investigate the feasibility of gesture-based interactivity like Kinect or Leap Motion sensors.
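As a feel for what "sequential clustering of nearby particles into jets" means, here is a much-simplified toy: repeatedly merge the closest pair of particles in (eta, phi) until every remaining separation exceeds a radius R. Real LHC clustering (e.g. the anti-kt algorithm) uses momentum-weighted distance measures and proper four-vector recombination; this sketch keeps only the sequential-merging idea.

```python
# Toy sequential jet clustering: particles are (pt, eta, phi) tuples;
# merge the closest pair in (eta, phi) until all pairs are further apart
# than R. A simplified cousin of the LHC's anti-kt algorithm.
import math

def delta_r(a, b):
    """Angular separation in the (eta, phi) plane, wrapping phi."""
    dphi = abs(a[2] - b[2])
    if dphi > math.pi:
        dphi = 2 * math.pi - dphi
    return math.hypot(a[1] - b[1], dphi)

def cluster(particles, R=0.4):
    """Return the list of merged pseudo-jets."""
    jets = list(particles)
    while len(jets) > 1:
        pairs = [(delta_r(jets[i], jets[j]), i, j)
                 for i in range(len(jets)) for j in range(i + 1, len(jets))]
        d, i, j = min(pairs)
        if d > R:
            break
        a, b = jets[i], jets[j]
        pt = a[0] + b[0]                    # pt-weighted recombination
        eta = (a[0]*a[1] + b[0]*b[1]) / pt
        phi = (a[0]*a[2] + b[0]*b[2]) / pt  # naive: ignores phi wrapping
        jets = [jets[k] for k in range(len(jets)) if k not in (i, j)]
        jets.append((pt, eta, phi))
    return jets
```

Animating merge sequences like this one, step by step in 3D, is exactly the sort of thing the visualisation could make accessible to non-specialists.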
Project Type: Scientific computing and public outreach
Prerequisites: Strong programming skills, ideally some familiarity with 3D (game) programming
Dates Available: June-August
Main Supervisor: Andy Buckley
Second Supervisor: Chris Pollard
Measuring b-jets and vector bosons in LHC data
A W+b event at 13 TeV in the ATLAS detector
One of the main aims in Run 2 of the Large Hadron Collider, which began in 2015, is the search for evidence that the Higgs boson couples to bottom quarks. Despite bottom pairs being the predicted dominant decay mode for a Higgs mass of 125 GeV, this decay has not yet been unambiguously identified because of very high "fake rates" from Standard Model processes. To find the H -> bb decay, we hence need to understand the backgrounds very well.
In this project, you will use and write C++ and Python computer codes to analyse simulations and LHC data, to select and study the behaviours of events which contain "jets" flagged as containing b-quark decay products, plus an electroweak vector boson whose presence helps to identify Higgs production. The aim is to investigate the behaviour of physics signatures which could be used to identify H -> bb in the full Run 2 dataset, and to identify areas where better simulation is required for that discovery.
Project Type: Computational data analysis
Prerequisites: Good familiarity with Linux, ideally some C++ and Python
Dates Available: June-July
Main Supervisor: Chris Pollard
Second Supervisor: Andy Buckley
High-speed data transfer for experiments at the HL-LHC
ATLAS upgrade pixel module data tape
The experiments based at the LHC have been hugely successful in making discoveries and performing precision measurements with the data collected at the highest-energy particle physics collider ever built. Highlights include the discovery of the Higgs particle (at ATLAS and CMS) and precision measurements constraining physics beyond the Standard Model (at LHCb).
To further our knowledge, the experiments will be upgraded to take data at even higher luminosity, significantly increasing the available statistics. The LHCb experiment will be upgraded in 2018 to use the full luminosity of the present LHC. In 2023 ATLAS will be upgraded at the same time that the LHC machine is replaced by the High Luminosity LHC, with an order of magnitude higher luminosity.
These upgrades result in order-of-magnitude increases in the interaction rate in the experiments. To take advantage of this, the experiments' data recording and transmission rates must be similarly increased. The data rates expected from the LHCb and ATLAS silicon pixel detector systems are predicted to be 5 Gb/s, which is unprecedented in particle physics experiments. These data must be transmitted electrically for several metres before being converted into an optical signal. Such data rates make broadband speeds seem rather lacklustre, with the fastest package (Virgin fibre optic) boasting a pitiful 200 Mb/s.
The development of the highly performant data transmission cables is being led by the Glasgow group for both the LHCb and ATLAS experiments. This project includes the characterisation of the prototypes using state of the art equipment, the analysis of the data and comparison to simulation. Tape characteristics will be extracted and used as input in circuit simulations. The project therefore includes both laboratory based measurements and data analysis.
The project will be based in the Glasgow experimental particle physics laboratory for advanced detector development.
High speed network analyser
Project Type: Measurements and data analysis
Prerequisites: Interest in electronics is beneficial but not required
Preferred Dates: Mid-June to end July
Main Supervisor: Richard Bates
Second Supervisor: Leyre Flores
Measurement of charmed hadron lifetimes at the LHCb experiment
A proton-proton collision recorded by LHCb from June 2015
The LHCb experiment on the Large Hadron Collider (LHC) at CERN is designed specifically to make high-precision measurements of decays of hadrons containing charm and beauty quarks. These are compared to the theoretical predictions of the Standard Model (SM) in order to look for discrepancies which may indicate new physics effects. Measurements of the lifetimes of charmed hadrons are challenging, both theoretically and experimentally, but with sufficient precision may reveal new physics. Should a discrepancy between theory and experiment be found, it could indicate interference from non-SM particles, which can enhance or suppress the decay of charmed hadrons. LHCb has recorded the world's largest datasets of charmed hadron decays; combined with its high-precision tracking system, this makes it an ideal place to perform such tests of the SM.
Using both real and simulated data from LHCb you will work towards a measurement of the lifetime of one or more species of charmed hadron. This will require extensive use of the ROOT data analysis software package in order to parametrise signal and background, account for detector resolution and efficiency effects, and extract the lifetime of the signal decays.
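The core of a lifetime measurement is an unbinned maximum-likelihood fit of the decay-time distribution. The real analysis does this in ROOT, with resolution and efficiency effects included; the toy sketch below strips all of that away and fits a pure exponential, where the likelihood scan should simply recover the sample mean.

```python
# Toy lifetime extraction: unbinned maximum-likelihood fit of exponential
# decay times via a coarse likelihood scan. Real fits (ROOT/RooFit) add
# resolution, efficiency and background models on top of this idea.
import math

def nll(tau, times):
    """Negative log-likelihood of decay times under lifetime tau:
    -sum(log((1/tau) * exp(-t/tau))) = sum(log(tau) + t/tau)."""
    return sum(math.log(tau) + t / tau for t in times)

def fit_lifetime(times, lo=0.1, hi=5.0, steps=500):
    """Return the grid point minimising the negative log-likelihood."""
    grid = [lo + (hi - lo) * k / steps for k in range(steps + 1)]
    return min(grid, key=lambda tau: nll(tau, times))
```

For a pure exponential, the maximum-likelihood estimate is exactly the sample mean, which makes this toy easy to validate before adding realistic complications.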
Prerequisites: Some programming experience, particularly with Linux shell scripting, C++ and Python. Prior experience with ROOT would be beneficial, but not essential.
Preferred Dates: June-July
Main Supervisor: Michael Alexander
Second Supervisor: Lars Eklund
Neutrino energy corrections to bottom-quark-initiated jets for searches of Higgs bosons decaying to bottom-quark pairs
mbb becomes narrower, and thus a better discriminant, as the muon-in-jet and PtReco (denoted here Resolution) corrections are added in the Run 1 ATLAS VH, H->bb search
The Higgs boson discovered at the LHC in 2012 has been observed coupling directly to W and Z bosons and tau leptons, and indirectly to top quarks. In order to probe whether it is indeed the particle predicted by the Standard Model (SM), or by a theory beyond the SM, the direct couplings of the Higgs boson to quarks must also be measured and compared with the SM prediction. Of all the quarks, the Higgs boson decays most often to a pair of bottom (b) quarks (58% of the time). When the Higgs boson is produced alone in gluon-gluon fusion, the signal (S) is overwhelmed by the regular multi-jet background (B) produced in the SM. By requiring the Higgs boson to be produced in association with a vector (W or Z) boson that decays leptonically, data events can be selected using charged-lepton triggers, which greatly enhances the S/B ratio.

The S/B ratio is increased further by improving the reconstruction of the di-b-jet invariant mass (mbb), the best single S/B discriminant. b-jets are special and need particular treatment. ATLAS already uses two b-jet-specific energy corrections: one to account for semileptonic decays involving muons, which occur in 12% of cases (the muon-in-jet correction), and one for energy deposited outside the radius of the cone jet reconstruction algorithm, especially at low transverse momentum (the PtReco correction). The PtReco correction also accounts, on average, for the contribution of neutrinos in the jet, present in 24% of cases.
In this project we try to estimate the neutrino contribution individually for each jet, instead of applying an average correction. Two methods are envisaged: an MVA-based regression, or conservation of momentum and energy using the primary vertex, secondary vertex and associated tracks. Once identified, the neutrino contribution should also be subtracted from the missing transverse energy in the event generated by the W or Z boson produced together with the Higgs boson. Overall, these methods are expected to improve the Higgs boson search sensitivity, and to be applicable to other searches that involve b-jets, such as ttH with H->bb.
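The basic mechanics of a muon-in-jet style correction can be sketched with four-vectors: find a muon inside a b-jet, add its four-momentum back onto the jet, and recompute mbb. This is a simplification (massless four-vectors, no neutrino term, and the real ATLAS correction is more involved), intended only to show why adding lost decay products narrows the mass peak.

```python
# Four-vector sketch of a muon-in-jet style correction. Vectors are
# (E, px, py, pz), built massless from (pt, eta, phi). Simplified: the
# real correction handles masses, overlaps and calibration in detail.
import math

def four_vec(pt, eta, phi):
    """(E, px, py, pz) for a massless particle."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = pt * math.cosh(eta)
    return (e, px, py, pz)

def add_muon_to_jet(jet, muon):
    """Add the muon's four-momentum back onto the jet, component-wise."""
    return tuple(a + b for a, b in zip(jet, muon))

def inv_mass(v1, v2):
    """Invariant mass of two four-vectors, e.g. the di-b-jet mass mbb."""
    e = v1[0] + v2[0]
    p2 = sum((v1[i] + v2[i]) ** 2 for i in (1, 2, 3))
    return math.sqrt(max(e * e - p2, 0.0))
```

A per-jet neutrino estimate, once available, would be added back in exactly the same way as the muon here.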
Feynman diagram of a b quark decaying semileptonically with a muon and a neutrino among the decay products
Project Type: Data analysis through computing (simulation)
Prerequisites: Good familiarity with Linux, C++, Python, ROOT and shell scripting
Preferred Dates: May-June 2016
Main Supervisor: Adrian Buzatu
Second Supervisor: Aidan Robson
Optimising limit setting on new physics using parallel processing
Marginalised exclusion contours on new physics in top quark observables (SM=(0,0))
The Standard Model of particle physics explains an extraordinary range of data with wonderful precision, and the 2012 discovery of the Higgs boson served as a further confirmation of its basic correctness. But the existence of the Higgs also suggests that there must be physics beyond the SM, to stabilise the Higgs mass at its relatively low value of 125 GeV.
In Glasgow particle experiment and theory groups we have built a tool for simulating the influence of generic extensions to the Standard Model on experimental observables in the top quark sector. Constraints on new physics are explored in a space of 12 coefficients for new field theory operators, which encode effective top quark interactions via new particles, and projected into 2-dimensional limit contours as shown in the accompanying image.
At present, the 12-dimensional space is factorised into at most 7 operators at a time, but this is an approximation which cannot be sustained indefinitely: we need to be able to explore all 12 dimensions simultaneously. Integrating over such a large space to produce marginalised 2D limit contours is a technical as well as a physics challenge. This project will attempt to achieve this goal by parallelising the 12-dimensional statistical limit-setting, in particular via the Python multiprocessing module and the PyOpenCL/PyCUDA packages for GPU processors.
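The multiprocessing part of the task can be sketched in a few lines: split the grid of operator-coefficient points among worker processes, each evaluating the likelihood independently. The likelihood below is a hypothetical stand-in (a quadratic with its minimum at the SM point), not the real top-sector statistics, and the real scan covers far more points in up to 12 dimensions.

```python
# Sketch of parallel limit-scanning with the stdlib multiprocessing
# module. nll_point is a hypothetical stand-in likelihood; the project's
# real targets are 12-dimensional scans and GPUs via PyOpenCL/PyCUDA.
import multiprocessing as mp

def nll_point(coeffs):
    """Stand-in negative log-likelihood: minimum at the SM point (0,...,0)."""
    return sum(c * c for c in coeffs)

def scan(points, processes=4):
    """Evaluate the likelihood at every grid point across worker processes."""
    with mp.Pool(processes) as pool:
        return pool.map(nll_point, points)
```

Since the evaluations are independent, the same structure maps directly onto GPU kernels, where each thread handles one grid point.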
Project Type: Scientific computing
Prerequisites: Strong programming skills, particularly in Python
Dates Available: July-August
Main Supervisor: Andy Buckley
Second Supervisor: Michael Russell