Summer Student Opportunities 2017
The Particle Physics Experiment (PPE) group, in conjunction with the School of Physics & Astronomy, operates an annual summer student scheme whereby undergraduates can work on a research topic in experimental particle physics.
Students are typically funded for 6 weeks during a pre-arranged period over the summer (June-September). Please note that preference is given to 3rd and 4th year students. This opportunity is typically for University of Glasgow students, although students from elsewhere have also been supported.
The list of projects available for Summer 2017 will be added gradually to this webpage until the beginning of February 2017. The deadline for receipt of applications is Friday 3rd March 2017.
Note that the additional costs of travel and accommodation cannot be met by the Group/School.
Application Procedure
Eligibility: Please note that only students who will be enrolled for a further year of study after the summer are eligible for funding (for example, students finishing their third year, or students finishing their fourth year who have already been accepted onto a 5-year MSci degree). Priority will be given to students with a strong interest in experimental particle physics who are considering continuing to a PhD in the field.
Requirement: Students who are accepted onto the summer student programme must write a report at the end of their 6-week project, which provides an opportunity to further develop their communication skills.
Application: Applications should be sent by email to Michael Alexander with title "PPE summer project application".
Your application must contain the following documents as .pdf files, named with your last name, first name, and the document type (using my name as an example):
- AlexanderMichaelCV.pdf (CV);
- AlexanderMichaelInterest.pdf (a brief statement of research interest);
- AlexanderMichaelResearch.pdf (a brief description of any research you have done previously, as well as computing projects);
- AlexanderMichaelGrades.pdf (a record of your grades obtained so far at university);
- AlexanderMichaelProjects.pdf (a ranked list of your preferred projects as offered);
- in addition, include information on any presentations, posters or reports that you have produced for research or academic projects; this will help us to judge your scientific communication skills. It always reflects well on a student if the documents are written in LaTeX rather than Word, since project reports in the scientific community are written in LaTeX.
Process: Applications will be ranked by merit. Using the matrix of ranked students and each student's ranked project preferences, students will be suggested to supervisors, who will review the applications and may then arrange an informal meeting to see whether the student has the skills and interest needed for that particular project. Students are therefore encouraged to contact supervisors directly during the application period to learn more about the projects, so that they are well informed when compiling their ordered list of preferred projects.
Funding: The projects (each formed by a supervisor-student pair) will compete for funding with projects from other physics groups, with the final decision resting with the Head of School. On average, 4-5 projects in the Particle Physics Experiment (PPE) group are funded every year; this year we aim to fund 6 projects in PPE.
Previous Schools: Please see the list on the left for details of previous years' summer student projects.
Page updated 08/02/2017
Improving ATLAS
The ATLAS experiment, one of the two general-purpose detectors at the LHC at CERN, will be upgraded over the next decade in order to handle the increased collision rate of the high luminosity Large Hadron Collider (HL-LHC). Various subsystems will be improved to take full advantage of the new collision environment, including the inner pixel detector. This is the crucial subsystem for locating the interaction point of the initial collision and identifying the daughter particle trajectories and subsequent decays.
Studies are already underway to determine the optimum specifications of the new pixel sensors: geometry, material, biasing, and electronic and mechanical performance. Testbeam data have been collected for a range of devices under various operating conditions; these data are analysed and compared in order to draw conclusions about the improved system.
You will be involved in the analysis of data collected from novel pixel detectors in various testbeam environments. This will characterise the detector response and help improve the next generation of ATLAS pixel sensors.
Project type: data analysis through computing, hardware
Prerequisites: reasonable familiarity with Linux and C++; some experience of the ROOT analysis package is preferred but not essential.
Preferred dates: June or August
Main Supervisor: Kenneth Wraight
Second Supervisor: Richard Bates
Optimising combinatorics in multi-body final states in LHCb
The LHCb experiment records particles produced in proton-proton collisions at the Large Hadron Collider at CERN (Geneva) to perform precision measurements of CP violation and rare decays in the beauty and charm sectors. A recurrent issue is the construction of particles decaying into many-body final states. In the dense hadronic environment of the LHC, the number of combinations of possible final state tracks that form a candidate for subsequent physics analysis increases with a high power of the track density. This poses a significant burden on available resources in terms of disk space and CPU, since all these combinations must be formed and saved in order to perform an analysis.
There are situations in which some of the final state particles share many detector hits with already existing candidates, so that the resulting candidate would essentially be a duplicate of an existing one. In some cases, this allows the creation of the new candidate to be skipped, which can lead to substantial savings in computing resources. The aim of this project is to integrate such a strategy into the LHCb multi-body combiner software package, using a fast probabilistic data structure called a Bloom filter to perform the test for shared detector hits. A Bloom filter allows fast approximate membership tests and approximate set intersections. You can monitor the savings achieved by your changes to our C++ particle combiner code using one of the many multi-body modes studied at LHCb.
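To give a flavour of the idea, here is a minimal Python sketch of a Bloom filter (a toy illustration only, not the LHCb implementation; the hit IDs, array size and hash choice are made up for the example):

```python
# Toy Bloom filter: detector hit IDs are hashed to positions in a fixed-size
# bit array, giving fast approximate membership tests with no false
# negatives and a tunable false-positive rate.
import hashlib

class BloomFilter:
    def __init__(self, n_bits=1024, n_hashes=3):
        self.n_bits = n_bits
        self.n_hashes = n_hashes
        self.bits = 0  # a Python int serves as the bit array

    def _positions(self, item):
        # Derive n_hashes bit positions from one digest of the item.
        digest = hashlib.sha256(str(item).encode()).digest()
        for i in range(self.n_hashes):
            chunk = digest[4 * i:4 * i + 4]
            yield int.from_bytes(chunk, "little") % self.n_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def __contains__(self, item):
        return all(self.bits >> pos & 1 for pos in self._positions(item))

# Hypothetical usage: compare a new candidate's hits against an existing
# candidate's filter; a large overlap flags a likely duplicate.
existing = BloomFilter()
for hit_id in (101, 102, 103, 250, 251):
    existing.add(hit_id)

new_candidate_hits = [101, 102, 103, 250, 999]
shared = sum(1 for h in new_candidate_hits if h in existing)
print(f"{shared}/{len(new_candidate_hits)} hits shared (approximate)")
```

Because the filter is a fixed-size bit array, the shared-hit test costs a few hash evaluations per hit rather than a full comparison of hit lists, which is what makes it attractive inside the combiner's inner loop.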
Project Type: Data analysis, software.
Prerequisites: Some familiarity with particle/detector physics, Linux, C++ and/or Python would be beneficial, but interest in the topic is the main factor.
Preferred Dates: Starting around mid-June 2017 (negotiable).
Main Supervisor: Manuel Schiller
Second Supervisor: Michael Alexander
Novel Simulations for HEP
Sensor Simulation
Modern particle physics research requires the development and application of high resolution detectors. In high energy collision experiments, such as those at the LHC, these detectors are used for tracking particle trajectories; however, applications range from medical imaging to dosimetry aboard the ISS. Novel sensors are tested electronically in bench tests, but to assess their response to high energy radiation the detectors are taken to testbeam facilities. Often the devices under test are placed in the centre of a group of well-understood devices. A beam of particles then traverses the set of detectors, and the measured particle trajectories can be compared between the trusted and tested sensors (see telescope schematic).
In order to fully understand measured testbeam data, simulations of the testbeam set-up are used to predict the detector response. These simulations account for the sensor material, pixel geometry, device orientation and incident radiation (see sensor simulation). Based on the simulation results, measurements can be planned and testbeam strategies devised ahead of data-taking.
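As a toy illustration of what such a simulation involves (not the group's actual framework; the pixel pitch and plane positions are assumed values), the following Python sketch propagates straight tracks through a telescope of pixel planes and digitises the crossing points:

```python
# Toy telescope simulation: straight tracks traverse a set of pixel planes;
# each plane digitises the crossing point to a pixel index.
import random

PITCH = 0.055                        # pixel pitch in mm (assumed)
PLANES_Z = [0.0, 20.0, 40.0, 60.0]   # plane positions along the beam in mm

def simulate_track():
    # Track model: straight line x(z) = x0 + slope * z with a small slope.
    x0 = random.uniform(0.0, 10.0)
    slope = random.gauss(0.0, 1e-3)
    hits = []
    for z in PLANES_Z:
        x = x0 + slope * z
        pixel = int(x / PITCH)       # digitisation: which pixel column fired
        hits.append((z, pixel))
    return hits

for z, pixel in simulate_track():
    print(f"plane at z={z:5.1f} mm -> pixel column {pixel}")
```

A full simulation adds material effects, charge sharing between pixels and detector noise, but the structure, generate tracks, propagate, digitise and compare with reconstruction, is the same.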
You will be involved in the simulation and pseudo-analysis of novel pixel detectors in various testbeam environments. This will help improve the reconstruction of past and future testbeam data and hence aid the characterisation of detectors used for the upgrade of LHC experiments and beyond.
Telescope Schematic
Project Type: Simulation/data analysis through computing.
Prerequisites: Reasonable familiarity with Linux and C++; some experience of the ROOT analysis package is preferred but not essential.
Preferred Dates: June or August
Main Supervisor: Kenneth Wraight
Second Supervisor: Dima Maneuski
Search for the Higgs boson produced in association with top-quark pairs with the ATLAS experiment
Production of a top-quark pair in association with a Higgs-boson, where the Higgs-boson decays into a b-quark pair
One of the important physics goals of the Large Hadron Collider in the current data-taking run is to study the characteristics of the Higgs boson, for example its coupling to other fundamental particles. Being the heaviest quark, the top quark has the strongest (Yukawa) coupling to the recently discovered Higgs boson, so the measurement of this coupling is one of the most interesting tests of the Standard Model Higgs boson. This coupling can be probed by studying the production of Higgs bosons in association with top-quark pairs: the ttH channel. At the measured Higgs mass of about 125 GeV, the Higgs decays predominantly into a b-quark pair. This channel is very challenging, since it suffers from large backgrounds (backgrounds: events from different production processes that can look similar to the signal). The main background is the production of a top-quark pair with an additional b-quark pair, which has exactly the same final state particles (an irreducible background). The discrimination of ttH and ttbb events is therefore of the utmost importance to the sensitivity of the ttH channel. This summer project focuses on investigating methods to improve the signal/background separation.
Production of background events that mimic the Higgs signal events
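As a schematic example of the kind of multivariate separation commonly used in such analyses (toy data and invented features, not the ATLAS analysis itself), the following Python sketch trains a boosted decision tree to separate signal-like from background-like events:

```python
# Toy signal/background separation with a boosted decision tree.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
# Invented features: e.g. a b-tagging discriminant and a mass-like variable.
signal = rng.normal(loc=[1.0, 125.0], scale=[0.5, 15.0], size=(n, 2))
background = rng.normal(loc=[0.0, 100.0], scale=[0.7, 30.0], size=(n, 2))

X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

bdt = GradientBoostingClassifier(n_estimators=100, max_depth=3)
bdt.fit(X_train, y_train)
print(f"test accuracy: {bdt.score(X_test, y_test):.3f}")
```

In the real analysis the discriminating variables are kinematic and b-tagging quantities built from reconstructed jets and leptons, and the classifier output feeds into the statistical fit.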
Project type: Data analysis through computing (simulation and real data).
Prerequisites: Good familiarity with Linux, C++ and Python and/or shell scripting.
Preferred dates: May-July
Main Supervisor: Sarah Boutle
Second Supervisor: Tony Doyle
Electrical testing of Silicon Strip Detectors for the ATLAS experiment at the Large Hadron Collider
In 2021, installation will begin on the upgrade of the ATLAS experiment at the high luminosity (HL) LHC at CERN. The new silicon-based particle tracker will require over 20,000 strip detectors, all of which will be assembled and tested at some 20 institutes over a 3-year period. At present the UK is due to build 50% of the strip barrel tracker, with the remainder built by the US. For the next 2 years, an R&D stage continues to study the design and performance of the detectors that will be used in the upgrade of ATLAS. This involves characterisation of both the silicon semiconductor sensors and the electrical performance of the readout CMOS chips. We are looking for a student who is keen to study hardware used for particle physics experiments. Work will include the use of FPGA systems for DAQ readout, semiconductor characterisation of silicon detectors (charge collection and noise levels) and analysis of detector performance from preliminary particle test beams.
Project Type: Instrument and data analysis.
Prerequisites: Knowledge of computer programming and semiconductor physics is preferred but not essential.
Dates Available: June-August (full time)
Main Supervisor: Andrew Blue
Second Supervisor: Craig Buttar
Improving top quark modelling for the LHC
The top quark is a special object at particle colliders: with a mass of 172 GeV it is by far the heaviest quark, and consequently it is the only one to decay before it can form bound hadrons. Unlike for the b-quark, measurements of the top require reconstruction of complex experimental signatures, reflecting the top decay chain through W bosons and b-quarks.
These properties make the top both an important test-bed for Standard Model and beyond-Standard-Model physics concepts, and a challenging system for calculations and simulations. In this project we will look at optimising the free parameters of the current state-of-the-art top quark simulations to better fit LHC data, and at constraining their allowed ranges of variation within experimental precision.
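As a toy illustration of the tuning idea (the real workflow uses full simulation and dedicated tuning tools; the observable and its parameter dependence below are invented), the following Python sketch finds the parameter value that minimises the chi-square between a prediction and pseudo-data:

```python
# Toy parameter tuning: minimise the chi-square between a parameter-dependent
# prediction and a binned pseudo-data observable.
import numpy as np
from scipy.optimize import minimize_scalar

# Invented binned observable: pseudo-data with uncertainties.
data = np.array([120.0, 95.0, 60.0, 30.0])
errors = np.array([5.0, 4.0, 3.0, 2.0])

def prediction(p):
    # Stand-in for a simulated distribution whose shape depends on p.
    base = np.array([100.0, 90.0, 70.0, 40.0])
    return base * (1.0 + p * np.array([0.3, 0.1, -0.1, -0.3]))

def chi2(p):
    return np.sum(((data - prediction(p)) / errors) ** 2)

best = minimize_scalar(chi2, bounds=(-2.0, 2.0), method="bounded")
print(f"best-fit parameter: {best.x:.3f}, chi2 = {best.fun:.2f}")
```

The chi-square values around the minimum also define the allowed range of the parameter within the experimental precision, which is the constraint the project aims to extract.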
Project Type: Data analysis and optimisation.
Prerequisites: Good Unix familiarity, Python highly desirable
Dates Available: June-August
Main Supervisor: Andy Buckley
Second Supervisor: Mark Owen
Data visualisation for particle physics
This project should appeal to students interested in data visualisation and in software library and tool design. It will primarily involve extending and refining the plotting machinery in the Rivet data analysis toolkit, with the aim of making a fast, portable, user-friendly and attractive system for presenting physics data. The scope and ambition will depend largely on the experience of the successful applicant. As a possible side-project or alternative, we may also look at extending a collider-event visualisation tool to use a new 3D toolkit: applicants with experience in 3D programming, possibly using game engines, should let us know!
Project Type: Data visualisation for particle physics
Prerequisites: Strong Unix familiarity and Python programming, familiarity with graphics libraries and LaTeX highly desirable
Preferred Dates: June-August
Main Supervisor: Andy Buckley
Second Supervisor: TBC
Jet physics at the LHC
At hadron colliders, the fundamental particles that interact via the strong nuclear force ("partons") are not visible in isolation, but instead "fragment" to create collimated flows of energy aligned with the fundamental parton direction. These flows are called "jets".
QCD calculations and models make many predictions about jet properties, and in recent years many of them have been tested. However, there are still aspects which need experimental validation, especially measures of their structure related to the type of the original parton, and "soft-QCD" effects where perturbative calculations break down.
This project will prototype methods for probing these aspects of jet physics, starting on Monte Carlo simulated events, and hopefully building to first studies on real ATLAS experiment data.
Project Type: Data analysis and algorithm development.
Prerequisites: Good Unix familiarity, Python and C++ highly desirable
Preferred Dates: June-August
Main Supervisor: Andy Buckley
Second Supervisor: TBC
Simulating Equalisation for High-Speed Data Links
The experiments based at the LHC have been hugely successful in making discoveries and performing precision measurements with the data collected at the highest energy particle physics collider ever built. Highlights include the discovery of the Higgs particle (at ATLAS and CMS) and precision measurements constraining physics beyond the Standard Model (at LHCb).
To further our knowledge, the experiments will be upgraded to be able to take data at even higher luminosity to significantly increase the available statistics. The LHCb experiment will be upgraded in 2018 to use the full luminosity of the present LHC. In 2023 ATLAS will be upgraded at the same time that the LHC machine is replaced by the High Luminosity LHC, with an order of magnitude higher luminosity.
Both of these upgrades result in order-of-magnitude increases in the interaction rate in the experiments. To take advantage of this, the experiments' data recording and transmission rates must be similarly increased. The data rates expected from the LHCb and ATLAS silicon pixel detector systems are predicted to be 5 Gb/s, which is unprecedented in particle physics experiments. These data must be transmitted electrically over several metres before being converted into an optical signal. Such data rates make domestic broadband speeds seem rather lacklustre, with the fastest package (Virgin fibre optic) boasting a pitiful 200 Mb/s.
The Glasgow group is leading the development of these high-performance data transmission cables for both the LHCb and ATLAS experiments, as well as testing the full link performance. At the moment we have a set-up which allows us to test full links: chip (VeloPix), cables, flexes, opto-box and vacuum feed-through. The characterisation results show that, because of the losses introduced by the different elements of the link, a new ASIC, an equaliser, must be included in the link in order to recover the data from the detector. The purpose of this equaliser is to counteract the losses of the link: ideally the electrical transmission link would have a flat frequency response, but the high frequencies are attenuated by the link, so the equaliser chip boosts them to compensate.
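As a first-order sketch of the equalisation idea (toy transfer functions with an assumed 1 GHz cable corner, not the actual link model), the following Python snippet shows how a high-frequency boost can flatten the response of a lossy channel:

```python
# Toy frequency-domain model: the cable behaves like a first-order low-pass
# filter, and the equaliser applies a complementary high-frequency boost
# (a zero at the cable corner, a pole a decade above) so that the combined
# response is approximately flat over the band of interest.
import numpy as np

freqs = np.logspace(6, 10, 9)   # 1 MHz to 10 GHz
f_c = 1e9                       # assumed cable 3 dB corner frequency: 1 GHz

def db(x):
    return 20 * np.log10(np.abs(x))

cable = 1.0 / (1.0 + 1j * freqs / f_c)
equaliser = (1.0 + 1j * freqs / f_c) / (1.0 + 1j * freqs / (10 * f_c))

for f, c, e in zip(freqs, cable, equaliser):
    print(f"{f:10.3g} Hz: cable {db(c):6.1f} dB, equalised {db(c * e):6.1f} dB")
```

Real links have several loss mechanisms (skin effect, dielectric loss, connectors), so the equaliser design is a compromise rather than an exact inverse, which is why simulation and measurement of the full chain are both needed.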
The project will consist of simulating different conceptual designs of equaliser and then testing how commercial equalisers perform in the full link, which requires a general understanding of the transmission chain. It is a hands-on project that combines computer simulations with hardware, so it will be a great way of understanding the fundamentals of signal integrity.
Project Type: Measurements, simulations and data analysis
Prerequisites: Interest in electronics is beneficial but not required
Preferred Dates: Mid-June to end-July
Main Supervisor: Richard Bates
Second Supervisor: Leyre Flores
Measurement of charmed hadron lifetimes at the LHCb experiment
A proton-proton collision recorded by LHCb from June 2015
The LHCb experiment at the Large Hadron Collider (LHC) at CERN is designed specifically to make high precision measurements of decays of hadrons containing charm and beauty quarks. These are compared with the theoretical predictions of the Standard Model (SM) in order to look for discrepancies which may indicate new physics effects. Measurements of the lifetimes of charmed hadrons are challenging, both theoretically and experimentally, but with sufficient precision may reveal new physics: a discrepancy between theory and experiment could indicate interference from non-SM particles, which can enhance or suppress the decay of charmed hadrons. LHCb has recorded the largest datasets of decays of charmed hadrons in the world; combined with its high precision tracking system, this makes it an ideal place to perform such tests of the SM.
Using both real and simulated data from LHCb you will work towards a measurement of the lifetime of one or more species of charmed hadron. This will require extensive use of the ROOT data analysis software package in order to parametrise signal and background, account for detector resolution and efficiency effects, and extract the lifetime of the signal decays.
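As a toy illustration of the core measurement (the real analysis uses ROOT and must also model background, resolution and efficiency), the following Python sketch performs a maximum-likelihood lifetime estimate on a pure exponential decay-time sample with an assumed lifetime:

```python
# Toy unbinned maximum-likelihood lifetime fit: for a pure exponential PDF,
# the ML estimator of the lifetime is simply the mean of the decay times.
import numpy as np

rng = np.random.default_rng(7)
true_tau = 0.410                                 # assumed lifetime in ps
times = rng.exponential(true_tau, size=10000)    # toy decay-time sample

tau_hat = times.mean()                   # ML estimate for an exponential
tau_err = tau_hat / np.sqrt(len(times))  # statistical uncertainty
print(f"fitted lifetime: {tau_hat:.4f} +/- {tau_err:.4f} ps")
```

The project's work lies in the parts this sketch deliberately omits: parametrising the background, convolving the exponential with the detector's decay-time resolution, and correcting for selection efficiency as a function of decay time.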
Prerequisites: Some programming experience, particularly with Linux shell scripting, C++ and Python. Prior experience with ROOT would be beneficial, but not essential.
Preferred Dates: June-July
Main Supervisor: Michael Alexander
Second Supervisor: Lars Eklund
Optimising limit setting on new physics using parallel processing
Marginalised exclusion contours on new physics in top quark observables (SM=(0,0))
The Standard Model of particle physics explains an extraordinary range of data with wonderful precision, and the 2012 discovery of the Higgs boson served as a further confirmation of its basic correctness. But the existence of the Higgs also suggests that there must be physics beyond the SM, to stabilise the Higgs mass at its relatively low value of 125 GeV.
In the Glasgow particle physics experiment and theory groups we have built a tool for simulating the influence of generic extensions to the Standard Model on experimental observables in the top quark sector. Constraints on new physics are explored in a space of 12 coefficients of new field theory operators, which encode effective top quark interactions via new particles, and are projected into 2-dimensional limit contours as shown in the accompanying image.
At present, the 12-dimensional space is factorised into at most 7 operators at a time, but this approximation cannot be sustained indefinitely: we need to be able to explore all 12 dimensions simultaneously. Integrating over such a large space to produce marginalised 2D limit contours is a technical as well as a physics challenge. This project will attempt to achieve this goal by parallelising the 12-dimensional statistical limit-setting, in particular via the Python multiprocessing module and the PyOpenCL/PyCUDA packages for GPU processing.
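As a minimal sketch of the strategy (with a toy likelihood standing in for the real EFT fit, and a 2D slice rather than the full 12D space), the following Python snippet farms grid points out to worker processes with multiprocessing.Pool:

```python
# Toy parallel limit-setting: evaluate a chi-square over a grid of operator
# coefficients, distributing grid points across worker processes.
import itertools
import numpy as np
from multiprocessing import Pool

def chi2(point):
    # Stand-in for the expensive likelihood of the fit at one grid point.
    coeffs = np.array(point)
    return float(np.sum(coeffs ** 2))   # toy: SM (all zeros) is the minimum

if __name__ == "__main__":
    axis = np.linspace(-1.0, 1.0, 5)
    # 2D slice of the operator space for illustration; the real task is 12D.
    grid = list(itertools.product(axis, axis))
    with Pool() as pool:
        values = pool.map(chi2, grid)
    best = grid[int(np.argmin(values))]
    print(f"minimum chi2 at coefficients {best}")
```

Because grid points are independent, the problem is embarrassingly parallel; the same pattern maps onto GPU kernels via PyOpenCL or PyCUDA, where the challenge shifts to keeping the 12-dimensional grid and its marginalisation tractable in memory.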
Project Type: Scientific computing
Prerequisites: Strong programming skills, particularly in Python
Dates Available: July-August
Main Supervisor: Andy Buckley
Second Supervisor: Michael Russell