Summer Student Projects 2015
The following list gives some idea of the range and diversity of projects offered under the Summer Studentship Scheme, along with the supervisors involved in each.
Measuring the LHC underlying event at 13 TeV
The Large Hadron Collider re-starts for data-taking in April/May 2015.
The targets of this two-year run are to measure the Higgs boson in more detail and to search for signs of new physics beyond the Standard Model. However, the first data analyses will be those that measure the backgrounds at 13 TeV, particularly from quantum chromodynamics (QCD) at this unprecedented energy.
First among these measurements are studies of the so-called "minimum bias" (MB) and "underlying event" (UE) at 13 TeV. These are, respectively, analyses of all events which leave some signature in the ATLAS detector (i.e. they are not "biased" by requiring particular kinds of activity, provided there is some), and analyses of such inclusive activity in events which also contain a highly energetic interaction. UE studies provide a connection between soft, whole-proton collisions and the super-high-energy quark/gluon collisions for which the LHC was built. Understanding the UE is also crucial for accurately calibrating the other high-energy objects to be used in 13 TeV data-taking.
This project will involve being part of the early data-taking task force to measure the MB and UE on the first ATLAS Run 2 data. It will be busy and hard work, with a steep learning curve... but also very exciting, challenging and a unique opportunity to contribute to this new phase of the LHC project. Please contact Dr Buckley if you are interested.
Project type: data analysis through computing (simulation and real data).
Prerequisites: good familiarity with Linux, C++ and Python.
Preferred dates: around 1st June to 15th July
Supervisor: Andy Buckley
2nd supervisor: Chris Pollard
Fast modelling of LHC simulation and data analysis
Testing models of physics at the Large Hadron Collider requires huge amounts of computing power - just as much as is needed to process the vast amount of data recorded by the experimental detectors! Billions of collision events need to be simulated to model both new physics theories and their Standard Model backgrounds, and it can take weeks or months to make a new simulated dataset. For some purposes this is far too slow: models of crucial backgrounds often have free parameters which must be fitted to data, and it is impossible to simulate millions of events for every one of the thousands of iterations that a fitting algorithm requires. For this we have developed a method called 'Professor', which creates very fast parameterisations of how physics observables respond to parameter changes, so they can be used much more flexibly.
This approach has very wide applications, from optimising physics models, to exploring the likelihoods of beyond-Standard Model physics scenarios, to making interactive LHC demonstrations for open days and school outreach (this could even be Web-based).
The Professor code has so far been written with only the first of these tasks in mind. This project will focus on generalising the code to be useful in generic situations such as those described above - the exact focus is up to you! Please contact Dr Buckley for more information.
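The core idea - replacing an expensive simulation with a cheap fitted parameterisation of each observable's response - can be illustrated with a toy sketch. Everything here (the quadratic "MC prediction", the anchor points, the polynomial order) is an invented stand-in, not the actual Professor code or its interface:

```python
import numpy as np

# Toy "observable" whose true response to one model parameter p is quadratic,
# standing in for an expensive MC prediction of a single histogram bin.
def expensive_mc_prediction(p):
    return 1.0 + 0.5 * p + 0.2 * p**2

# Sample the parameter space at a handful of anchor points (cheap here,
# but weeks of CPU time for a real event-generator run) ...
anchors = np.linspace(-2.0, 2.0, 7)
values = np.array([expensive_mc_prediction(p) for p in anchors])

# ... and fit a second-order polynomial parameterisation to the response.
coeffs = np.polyfit(anchors, values, deg=2)

# The parameterisation can now be evaluated millions of times inside a
# fit loop, instead of re-running the simulation at every iteration.
def fast_prediction(p):
    return np.polyval(coeffs, p)

print(fast_prediction(1.3))
```

In practice the parameterisation is done per histogram bin and in many parameters at once, but the principle - fit once, then evaluate instantly - is the same.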
Project type: numerical methods, simulation, software development, science outreach.
Prerequisites: good familiarity with Linux and Python. Experience with Cython, C++, matplotlib, and/or Flask would be ideal but is not essential.
Preferred dates: around 1st July to 15th Aug (negotiable).
Supervisor: Chris Pollard
2nd supervisor: Andy Buckley
Remote Monitoring of X-ray Imaging
X-ray imaging is used in particle physics detector development research, with applications ranging from material analysis, to charge collection efficiency measurements of hadron-irradiated devices.
This project involves the use of remote monitoring hardware and software to control and catalogue various elements of a new X-ray testing setup within the PPE group. This would include monitoring background radiation, temperature and humidity, and X-ray energy and intensities.
Such data should be viewable by the user either locally or remotely, and archived for later analysis. The work will be based around Raspberry Pi single-board computers and Arduino microcontrollers, programmed to send data to Graphite, an open-source package designed for remote monitoring and time-series graphing, currently used by the Grid.
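As a flavour of what the monitoring side involves: Graphite accepts samples over TCP in a simple plaintext protocol, one `metric value timestamp` line per reading. A minimal sketch of pushing a sensor reading from a Raspberry Pi is below; the hostname and metric path are hypothetical placeholders:

```python
import socket
import time

GRAPHITE_HOST = "graphite.example.org"  # hypothetical monitoring host
GRAPHITE_PORT = 2003                    # Graphite's plaintext-protocol port

def graphite_line(metric, value, timestamp=None):
    """Format one sample in Graphite's plaintext protocol:
    '<metric.path> <value> <unix-timestamp>\\n'."""
    if timestamp is None:
        timestamp = int(time.time())
    return "%s %s %d\n" % (metric, value, timestamp)

def send_sample(metric, value):
    """Push one reading (e.g. from a temperature sensor) to Graphite."""
    with socket.create_connection((GRAPHITE_HOST, GRAPHITE_PORT), timeout=5) as s:
        s.sendall(graphite_line(metric, value).encode("ascii"))

# Formatting only - sending would require a real Graphite server.
print(graphite_line("xraylab.enclosure.temperature", 21.4, 1420070400))
```

The same pattern works from an Arduino via its Ethernet/WiFi libraries, since only a raw TCP socket and a formatted text line are needed.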
Project type: software development, hardware, X-ray imaging.
Prerequisites: knowledge of Arduino programming; experience of readout systems and monitoring software would be ideal but not essential.
Preferred dates: June - August
Supervisor: Andrew Blue
2nd Supervisor: Craig Buttar
Improving ATLAS
The ATLAS experiment, one of the two general-purpose detectors at the Large Hadron Collider (LHC), will be upgraded over the next decade in order to handle the increased collision rate of the high-luminosity LHC. Various subsystems will be improved to take full advantage of the new collision environment, including the inner pixel detector. This is the crucial subsystem for locating the interaction point of the initial collision and for identifying the daughter-particle trajectories and subsequent decays.
Studies are already underway to investigate the optimum specifications of the new pixel sensors: geometry, material, biasing, and electronic and mechanical performance. Testbeam data have been collected for a range of devices under various operating conditions. These data are then analysed and compared to draw conclusions about the improved system.
You will be involved in the analysis of data collected from novel pixel detectors in various testbeam environments. This will characterise the detector response and help improve the next generation of ATLAS pixel sensors.
Project type: data analysis through computing, hardware
Prerequisites: reasonable familiarity with Linux and C++; some experience of the ROOT analysis package is preferred but not essential.
Preferred dates: June or August
Supervisor: Kenneth Wraight
2nd supervisor: Richard Bates
Ultra-Fast Readout System for Medipix X-ray Imaging Detector
Medipix is a state-of-the-art multi-pixel detector of radiation. It has many applications in the science arena, from monitoring of radiation background at the Large Hadron Collider (LHC), to Computed Tomography (CT) of small animals for novel drug development. The system comprises a silicon detector attached to a readout chip, designed at CERN and fabricated at IBM facilities.
The aim of this project is to develop a prototype readout system for the chip that is capable of acquiring data at 2,000 frames per second, 100 times faster than existing systems. Such speeds will enable applications of the system that deliver new science and discoveries.
On project completion you will have the skills to develop complex PCBs and to characterise the building blocks of integrated circuits. You will learn about the ARM processor architecture and gain experience of USB3 interface development.
Type of work: Lab-based, basic data analysis
Prerequisites: Good knowledge of C++, basic understanding of electronics, basic knowledge of radiation detection mechanisms.
Preferred dates: May - July
Supervisor: Dima Maneuski
Backup supervisor: Val O'Shea
Development of technology towards next generation radiopharmaceuticals for nuclear medicine and PET
Radiochemical purity testing is a crucial step in the quality control of radiopharmaceuticals produced for Positron Emission Tomography (PET) imaging. It is essential to ascertain the purity of radiopharmaceutical products before they can be administered to patients. A well-established analytical technique used to identify the radioactive components in a radiopharmaceutical formulation – Thin Layer Chromatography (TLC) – is based on the detection of the annihilation gamma rays from the active radionuclide in the radiopharmaceutical. High-energy gamma detection, however, inherently suffers from low detection efficiency, high background noise, and very poor linearity.
We have developed and patented a novel positron detection technique. This approach has been extensively tested over the past two years and has proved far superior to the existing techniques offered on the open market.
As a summer student, you will join this effort to design the next-generation TLC scanner based on the existing developments. During the project you will be required to develop a user-friendly mechanical system and enclosure to hold the detector and TLC plate, learning the industry-standard mechanical design package SolidWorks in the process. You will learn how to operate a Medipix detector, and how to acquire and analyse data using C++ and Java. You will become familiar with many aspects of nuclear medicine, PET, and radiopharmaceuticals from a particle physics point of view.
Type of work: Lab-based, basic data analysis
Prerequisites: Basic understanding of radiation detection mechanisms, some experience in lab work.
Preferred dates: May - July
Supervisor: Dima Maneuski
Backup supervisor: Val O'Shea
Characterisation of pixel and strip HV-CMOS sensors for ATLAS upgrade
HV-CMOS sensors combine the elements of Si radiation detectors with CMOS electronic capabilities. This allows sensors to be developed with built-in processing that can provide improved performance, such as reduced pixel size, leading to increased spatial resolution and in situ z-measurements in strip detectors.
CMOS sensors can be manufactured in different ways, using high-resistivity substrates or special high-voltage (HV) processes to provide the 'sensing' region. As with all detector designs, a number of trade-offs must be made in terms of signal-to-noise, power, and signal complexity. For operation at the Large Hadron Collider (LHC), these devices must also be resistant to radiation (radiation-hard).
This project will characterise a number of CMOS sensors with different structures (high-resistivity vs high-voltage) in a range of operating modes using strip and pixel readout, looking at signal-to-noise, power dissipation and radiation hardness.
You will gain a diverse range of skills in sensor characterisation, such as building experimental setups, analysing data using ROOT and C++, and data acquisition using LabVIEW and C++.
Type of work: Lab-based, basic data analysis
Prerequisites: Some knowledge of C++, ROOT and LabVIEW. Basic understanding of radiation detection. Basic understanding of electronics.
Supervisor: Dima Maneuski
Backup supervisor: Richard Bates
Higgs-boson production in association with top-quark pairs
Being the heaviest quark, the top quark has the strongest coupling (Yukawa coupling) to the recently discovered Higgs boson. The measurement of this coupling is therefore one of the most interesting tests of the Standard Model Higgs boson. The Higgs boson can be produced in association with a top-quark pair, the so-called ttH channel. At the measured Higgs mass of about 125 GeV, the Higgs decays predominantly into a b-quark pair.

This channel is very challenging, since it suffers from large backgrounds (background: events from different production processes that can look similar to the signal). The main background is the production of a top-quark pair with an additional b-quark pair, which has exactly the same final-state particles (an irreducible background). The discrimination of ttH and ttbb events is therefore of the utmost importance. For this separation, neural networks are a useful tool: they allow us to combine several variables into a final discriminant between the two processes. This summer project focuses on the use of this neural network and the optimisation of the variables used for the final discriminant, in order to improve the signal/background separation. The project would involve running the neural network and the statistical analysis of the results.
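To give a feel for the technique, the sketch below trains a small neural network to separate two event classes using several input variables, with a per-event signal probability as the final discriminant. The data here are toy Gaussians, not real ttH/ttbb events, and the scikit-learn network is a generic stand-in for the analysis's actual neural-network framework:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

# Toy stand-ins for discriminating variables (e.g. b-tag weights, invariant
# masses): "signal" and "background" drawn from shifted Gaussians.
n = 2000
signal = rng.normal(loc=+0.5, scale=1.0, size=(n, 3))
background = rng.normal(loc=-0.5, scale=1.0, size=(n, 3))

X = np.vstack([signal, background])
y = np.array([1] * n + [0] * n)  # 1 = signal, 0 = background

# A small feed-forward network combining the input variables.
nn = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
nn.fit(X, y)

# The per-event signal probability is the "final discriminant":
# cutting or fitting on it separates the two processes.
disc = nn.predict_proba(X)[:, 1]
print("mean discriminant: signal %.2f, background %.2f"
      % (disc[:n].mean(), disc[n:].mean()))
```

Optimising the analysis then means choosing input variables that push the signal and background discriminant distributions as far apart as possible.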
Prerequisites: The applicant should be familiar with Linux, C++, Python and/or bash-scripting.
Preferred dates: Mid-May to Mid-July
Main supervisor: Sarah Boutle
Backup supervisor: Andrea Knue
Characterisation of high-speed readout for the VELO upgrade
The LHCb Experiment will go through a major upgrade in 2018-2019 replacing most of its detectors and completely changing the trigger and data acquisition system. This upgrade will allow the experiment to operate at higher luminosity and to select the signal candidates more efficiently. One of the technical challenges for the upgrade is to cope with the increased data rates produced in the detector and to transport this data off-detector for event selection and storage.
This project involves Research and Development (R&D) work towards the upgrade of LHCb's vertex detector, the VELO. Glasgow will design and deliver the high-speed readout links and front-end control electronics. The project will entail characterisation of prototypes of the data links, analysis of this data, and comparison with simulations of the expected performance. It may also include performing simulations of the transmission link and characterising prototypes of the front-end control electronics - hence the project will include both measurements in the lab and analysis of the data taken.
Project type: Measurements and data analysis.
Preferred dates: mid-June to end of July
Supervisor: Lars Eklund
Second Supervisor: Kenneth Wraight
Measuring rare meson decays with NA62 - Feasibility Studies
NA62 is a CERN experiment that aims to measure the very rare kaon decay $K^+ \to \pi^+ \nu \bar{\nu}$ and extract a 10% measurement of the CKM parameter $|V_{td}|$. NA62 will take data until 2017, and plans to collect about 100 $K^+ \to \pi^+ \nu \bar{\nu}$ events if the Standard Model prediction is correct.
In addition to the headline $K^+ \to \pi^+ \nu \bar{\nu}$ measurement, there is a proposal to measure $R_{K\pi} = \Gamma(K \to e \nu (\gamma))/\Gamma(\pi \to e \nu (\gamma))$, either with beam pions or with pions from in-beam kaon decays. A feasibility study is needed in order to understand which option is more advantageous.
The advantage of using beam pions here is that the acceptance effects cancel out, but one needs special runs to determine the beam pion characteristics and then assume these are stable. When using pions from kaon decay the measurement does not depend on the beam composition; however, the disadvantage is that acceptances will be very different.
The aim of this summer project is to analyse Monte Carlo simulations, fine-tuned using experimental data from the 2014 pilot physics run, and compare detector acceptances for the two cases described above.
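The core quantity in such a feasibility study is the detector acceptance, estimated from Monte Carlo as the fraction of generated events that pass the selection. A minimal sketch with its binomial uncertainty is below; the event counts are purely illustrative, not NA62 results:

```python
import numpy as np

def acceptance(n_pass, n_gen):
    """Acceptance and its binomial uncertainty from MC event counts."""
    eps = n_pass / n_gen
    err = np.sqrt(eps * (1.0 - eps) / n_gen)
    return eps, err

# Hypothetical MC counts for the two strategies (illustrative only):
# pions taken directly from the beam vs pions from in-beam kaon decays.
beam_eps, beam_err = acceptance(52000, 100000)
decay_eps, decay_err = acceptance(31000, 100000)

print("beam-pion acceptance:  %.3f +- %.3f" % (beam_eps, beam_err))
print("decay-pion acceptance: %.3f +- %.3f" % (decay_eps, decay_err))
print("ratio: %.2f" % (beam_eps / decay_eps))
```

Comparing such acceptances (and how well their systematics cancel in the $R_{K\pi}$ ratio) is what decides which option is more advantageous.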
Project type: numerical methods, simulation, software development, data analysis
Prerequisites: familiarity with Linux and C++. Previous knowledge of ROOT would be ideal.
Preferred dates: around 1st June to 15th July (negotiable)
Supervisor: Dan Protopopescu
2nd supervisor: Bruno Angelucci
General
For general questions on the summer student programme, please contact Michael Alexander. For specific questions on research topics that interest you, contact any of the RAs/academics working in the area (please see the members page).