Summer Student Projects 2014
The following list gives some idea of the range and diversity of Summer Studentship Scheme projects, along with the named individuals involved in each.
Distributed Resiliency for Grid Data Storage
Data storage on the Worldwide Data Grid used for Large Hadron Collider (LHC) physics work consists of many "Storage Elements", each a relatively large storage array (on the scale of hundreds of terabytes to petabytes). These are organised logically by "logical file catalogues", such as the LFC service, or experiment-specific services such as ATLAS' DQ2/Rucio. The purpose of the logical file catalogues is to provide a "generic" way to locate files, regardless of where on the Grid they actually end up.
Generally, for the larger experiments using the Grid, file resilience is provided by simply making multiple copies of the file - the logical file catalogue remembers where all the copies corresponding to the original entry are, so if one is lost, we can easily find the other copies.
Smaller experiments, having fewer resources, often do not have enough space on the Grid to make multiple copies of files. However, the risk of file loss is non-zero, especially if their data are distributed across many Storage Elements.
One way to improve data resilience, without making full copies of a file, is via Erasure Coding schemes (such as Reed-Solomon coding).
This project is to develop a set of tools for Grid data management, wrapping the existing framework, in order to add distributed Erasure Coded resilience to files on the Grid. We will be using the Least-Authority File System implementation of Reed-Solomon coding as a basis for the mechanism.
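To give a flavour of the idea, the toy sketch below implements a (2, 3) erasure code using simple XOR parity: two data blocks become three shares, any two of which recover the original, at 1.5x the storage cost rather than the 2-3x cost of full replication. This is only an illustrative stand-in; the project itself would wrap the Reed-Solomon implementation mentioned above (the zfec library used by Tahoe-LAFS), which generalises the same idea to arbitrary numbers of shares.

```python
# Toy illustration of (k, m) erasure coding, here a (2, 3) XOR-parity code.
# The real project would wrap a Reed-Solomon library, which generalises this
# idea to arbitrary k (required shares) and m (total shares).

def encode(d0: bytes, d1: bytes) -> list:
    """Split two equal-length data blocks into 3 shares; any 2 recover the original."""
    parity = bytes(a ^ b for a, b in zip(d0, d1))
    return [d0, d1, parity]            # 1.5x storage instead of 2x-3x for full copies

def decode(shares: dict) -> tuple:
    """Recover (d0, d1) from any two surviving shares, keyed by share index."""
    if 0 in shares and 1 in shares:
        return shares[0], shares[1]
    if 0 in shares and 2 in shares:    # d1 = d0 XOR parity
        return shares[0], bytes(a ^ b for a, b in zip(shares[0], shares[2]))
    if 1 in shares and 2 in shares:    # d0 = d1 XOR parity
        return bytes(a ^ b for a, b in zip(shares[1], shares[2])), shares[1]
    raise ValueError("need at least two shares")

d0, d1 = b"half one", b"half two"
shares = encode(d0, d1)
# Share 1 has been "lost", yet the file is still recoverable:
assert decode({0: shares[0], 2: shares[2]}) == (d0, d1)
```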
Prerequisites: This project does not require a deep mathematical understanding of Reed-Solomon or other erasure codes. It will require some experience in C++ and Python, and the ability to learn some relatively simple data management tools. Depending on how advanced the project gets, it may also require the acquisition of Grid credentials (this takes about a day or two and can be done when necessary).
Project type: computing
Preferred dates: July - August 2014
Supervisor: Sam Skipsey
Backup Supervisor: David Crooks
Searching for the Higgs boson decaying to bottom quarks with ATLAS
This is a computational data-analysis project contributing to the search for the Higgs boson decaying to a bottom-quark pair in data collected by the ATLAS detector. Prior experience with Linux, C++ and Python is required.
The 2013 Nobel Prize in Physics was awarded to two theorists, Francois Englert and Peter Higgs, who 50 years ago predicted the existence of a new elementary ingredient in the recipe of the Universe: the Higgs field and its corresponding elementary particle, the Higgs boson. The theory was confirmed by the discovery of the Higgs boson by the ATLAS and CMS experiments at CERN in 2012. The theory offers a mechanism by which elementary particles acquire mass: the vacuum is not truly empty, but is instead filled with the Higgs field. Even when moving through a vacuum, elementary particles therefore feel the viscosity of the Higgs field, or in other words collide with the Higgs particles that form the field, which slows them down; this is how they acquire mass. Consequently, the Higgs boson, once produced in the laboratory, should decay to all particles that have mass, and the probability of decay should increase with the daughter-particle mass.
There are two types of elementary particle: particles of matter (fermions) and particles that carry forces (bosons). Although the largest probability of decay is to bottom-quark pairs (about 60% of the time at the measured mass of about 125 GeV), the Higgs boson has so far been observed only in decays to bosons (photons, W and Z bosons). It is therefore important to observe experimentally that the Higgs boson also decays to fermions, such as bottom quarks.
Our project contributes to the search for the Higgs boson decaying to bottom quarks. More precisely, Glasgow is leading the ATLAS Collaboration effort to improve the measurement of the energy of the bottom quarks by performing calibrations with machine-learning techniques (artificial neural networks). This in turn translates into a better measurement of the Higgs-boson-candidate mass and a better separation of the Higgs-boson signal from the other, non-Higgs processes, collectively known as background.
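As a rough illustration of what such a calibration involves (not the actual ATLAS procedure or data), the sketch below trains a small neural-network regressor on toy events to learn a multiplicative correction that maps a biased, smeared jet energy back towards the true value; the input variables and their distributions are invented.

```python
# Illustrative sketch only: a small neural-network regression that learns a
# multiplicative correction mapping a biased, smeared b-jet energy back
# towards its true value. Toy data and invented variables, not ATLAS software.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 5000
e_true = rng.uniform(30.0, 200.0, n)       # "true" b-jet energy [GeV]
mu_frac = rng.uniform(0.0, 0.3, n)         # toy proxy for energy carried by a soft muon
e_reco = e_true * (0.85 + 0.3 * mu_frac) + rng.normal(0.0, 5.0, n)  # biased, smeared measurement

X = np.column_stack([e_reco, mu_frac])
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
model.fit(X, e_true / e_reco)              # learn the per-jet correction factor

e_calibrated = model.predict(X) * e_reco
print("mean bias before / after calibration [GeV]:",
      float(np.mean(e_reco - e_true)), float(np.mean(e_calibrated - e_true)))
```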
This project is computational and requires prior familiarity with Linux, and either C++ or Python. Preference will be given to students who are motivated to continue with a PhD in experimental particle physics and have experience with the CERN-based data analysis software ROOT.
Project type: advanced data analysis through computing (simulation and real data)
Preferred dates: around 1st June to 15th July 2014.
Supervisor: Adrian Buzatu
Backup supervisor: Aidan Robson
Performance evaluation of silicon detectors for medical and material characterisation applications
Silicon detectors developed for the Large Hadron Collider (LHC) exhibit many excellent properties that make them very attractive for use in areas outside particle physics. The excellent position resolution of strip detectors and the good energy resolution of scintillation detectors make them the natural choice for next-generation Compton cameras for Single Photon Emission Computed Tomography (SPECT) and for electron counting in novel Electron Microscopy techniques.
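For context, a Compton camera reconstructs the photon scattering angle purely from the energies deposited in a position-sensitive scatterer and an absorber, which is why both good position and good energy resolution matter. A minimal sketch of that kinematic reconstruction is below; the detector layout and energy values are hypothetical examples.

```python
# Illustrative Compton-camera kinematics: the scattering angle is obtained
# purely from the energies deposited in the scatterer and absorber layers.
# The example energies below are hypothetical.
import math

M_E_C2 = 0.511  # electron rest energy [MeV]

def compton_angle(e_scatterer: float, e_absorber: float) -> float:
    """Scattering angle (radians) for a photon depositing e_scatterer MeV in the
    first (strip) layer and then being fully absorbed with e_absorber MeV."""
    e_initial = e_scatterer + e_absorber          # total photon energy
    e_scattered = e_absorber                      # photon energy after the first scatter
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_scattered - 1.0 / e_initial)
    return math.acos(cos_theta)

# A Tc-99m SPECT photon (about 140 keV) depositing 30 keV in the scatterer:
print(math.degrees(compton_angle(0.030, 0.110)))
```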
In this project you will take part in the characterisation of the building blocks of a new generation of spin-off technologies from particle physics. The work package will include participation in the characterisation of silicon photon detectors and the development of novel signal-processing and digitisation techniques.
On project completion you will have gained a better understanding of radiation detection techniques and integrated circuits, and will have enhanced your SolidWorks design skills and PCB development experience. You are expected to have some understanding of semiconductor detectors and integrated circuits. Preference will be given to candidates with knowledge of C++ or PCB design.
Type of work: Lab-based
Preferred dates: May - July 2014
Supervisor: Dzmitry Maneuski
Backup supervisor: Paul Soler
Evaluation of single photon counting detectors for high definition imaging and particle identification
Medipix is a state-of-the-art multi-pixel radiation detector. It has found applications in many areas of science, from monitoring the radiation background at the Large Hadron Collider (LHC) to Computed Tomography (CT) of small animals for novel drug development. The system comprises a silicon detector attached to a readout chip, designed at CERN and fabricated at IBM facilities. The assembly is connected to a PC via a dedicated USB readout interface and controlled by specialised software.
The Particle Physics Experiment (PPE) group has recently acquired a modern high-energy, high-current X-ray generation machine. You will take part in the development of a setup for a wide range of X-ray radiography applications. Your duties will include assembling parts of the setup and evaluating individual aspects of the system.
In addition, you will work on setting up the Medipix system for data acquisition and on the analysis of various types of interactions. This work will form part of the teaching curriculum for the undergraduate labs based on the Medipix technology.
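As a hedged sketch of the kind of analysis involved (not the actual Medipix acquisition software), the example below reads one frame of per-pixel counts from a Medipix-style 256x256 matrix, masks obviously noisy pixels, and groups the remaining hit pixels into clusters; the file name and thresholds are placeholders.

```python
# Illustrative sketch, not the Medipix control software: basic analysis of a
# single frame from a 256x256 single-photon-counting pixel matrix.
# The file name and the noisy-pixel threshold are hypothetical.
import numpy as np
from scipy import ndimage

frame = np.loadtxt("frame_0001.txt")           # 256x256 array of per-pixel counts
assert frame.shape == (256, 256)

hot = frame > frame.mean() + 10 * frame.std()  # crude mask for noisy ("hot") pixels
frame = np.where(hot, 0, frame)

hits = frame > 0                               # pixels that registered at least one count
labels, n_clusters = ndimage.label(hits)       # group touching pixels into clusters
sizes = ndimage.sum(hits, labels, range(1, n_clusters + 1))

print(f"{n_clusters} clusters; mean cluster size {sizes.mean():.2f} pixels")
```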
On project completion you will have gained a better understanding of radiation detection techniques and the use of complex equipment, and will have enhanced your programming and data analysis skills. You are expected to have some understanding of semiconductor detectors. Preference will be given to candidates with knowledge of C++/Java who also have excellent writing skills.
Type of work: Lab-based, basic data analysis
Preferred dates: May - July 2014
Supervisor: Dzmitry Maneuski
Backup supervisor: Val O'Shea
Jet substructure for LHC Run 2
Jets are the collimated bunches of hadrons measured in our detectors, created in high-energy particle collisions. As we go to higher energies at the Large Hadron Collider (LHC), Higgs bosons, or as yet undiscovered heavy particles, are produced with very high energy, and the decay products of these "boosted" particles tend to be contained within a single jet spread over a larger area of the detector. The internal structure of these jets can be exploited to identify the original particles that decayed into them.
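One widely used substructure technique, given here purely as an illustrative example, is jet "trimming": the jet is re-clustered into small subjets and only subjets carrying more than a chosen fraction of the jet's transverse momentum are kept, removing soft contamination while retaining the hard decay products. The sketch below applies that selection to a set of already-clustered subjet four-momenta; the numerical values are invented, and a real analysis would obtain the subjets from a clustering library such as FastJet.

```python
# Illustrative sketch of jet "trimming": keep only subjets carrying more than a
# fraction f_cut of the original jet pT. The subjet four-momenta (px, py, pz, E)
# below are invented; in practice they come from re-clustering with FastJet.
import math

def pt(p):
    px, py, _, _ = p
    return math.hypot(px, py)

def mass(p):
    px, py, pz, e = p
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

def trim(subjets, f_cut=0.05):
    jet_pt = pt([sum(c) for c in zip(*subjets)])            # pT of the full jet
    return [s for s in subjets if pt(s) > f_cut * jet_pt]   # drop soft subjets

subjets = [
    (120.0, 10.0, 40.0, 135.0),   # hard subjet, e.g. one decay product of a W/Higgs
    (90.0, -15.0, 25.0, 100.0),   # second hard subjet
    (3.0, 2.0, 1.0, 4.0),         # soft contamination (pile-up / underlying event)
]
kept = trim(subjets)
total = [sum(c) for c in zip(*kept)]
print(f"{len(kept)} subjets kept, trimmed jet mass = {mass(total):.1f} GeV")
```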
In this project, we will study the use of these substructure techniques and try to optimise them for LHC Run 2, using the knowledge gained from Run 1 data. Preference will be given to candidates who are familiar with C++ and/or Python.
Project type: advanced data analysis through computing (simulation and real data)
Preferred dates: May - July 2014 (negotiable).
Supervisor: Deepak Kar
Backup supervisor: Craig Buttar
Searching for Higgs self-coupling and new physics at the High Luminosity LHC with the ATLAS experiment
The high-luminosity LHC will provide a factor of 10 more data (3000 fb-1) than the current running of the LHC. This will allow searches for rare processes such as the Higgs self-coupling, and will extend the mass range over which new physics can be searched for well into the TeV region.
With the discovery of the Higgs boson, the ATLAS physics programme will focus on studying its properties, particularly its couplings to other particles. This will determine whether the observed boson is the Higgs boson predicted by the Standard Model, or whether it is a more exotic alternative. In addition to studying the properties of the boson, it is important to understand whether the Higgs potential describes electroweak symmetry breaking. This is probed by searching for the Higgs self-coupling in the reaction pp-->HH-->X. This is a hugely challenging channel because of its extremely low rate, and because it can be masked by high-rate background events including single Higgs production (as particle physicists say: "Today's discovery is tomorrow's background"). However, preliminary studies of pp-->HH-->bbbar+gamma-gamma and pp-->HH-->bbbar+tau-taubar have shown promising results.
This project will focus on the pp-->HH-->bbbar+tau-taubar channel, looking at the effectiveness of mass variables such as MT2, and at multivariate analysis methods such as boosted decision trees, in order to identify the signal in the presence of large backgrounds.
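For orientation, MT2 is the smallest possible value of the larger of the two transverse masses, minimised over all ways of splitting the measured missing transverse momentum between the two (assumed invisible) decay products. The sketch below evaluates this by brute-force numerical minimisation on invented example momenta; real analyses use dedicated, much faster algorithms.

```python
# Illustrative numerical MT2 calculation (not an ATLAS tool): minimise, over all
# splittings of the missing transverse momentum between two assumed-massless
# invisible particles, the larger of the two transverse masses.
# The input momenta below are hypothetical example values in GeV.
import numpy as np
from scipy.optimize import minimize

def m_t(vis_pt, vis_m, inv_pt):
    """Transverse mass of a visible system plus a massless invisible particle."""
    et_vis = np.sqrt(vis_m**2 + vis_pt @ vis_pt)
    et_inv = np.sqrt(inv_pt @ inv_pt)
    return np.sqrt(max(vis_m**2 + 2.0 * (et_vis * et_inv - vis_pt @ inv_pt), 0.0))

def mt2(vis1_pt, vis1_m, vis2_pt, vis2_m, met):
    def worst(q1):
        q1 = np.asarray(q1)
        q2 = met - q1                        # the remainder of the missing momentum
        return max(m_t(vis1_pt, vis1_m, q1), m_t(vis2_pt, vis2_m, q2))
    best = min(
        (minimize(worst, x0, method="Nelder-Mead") for x0 in (met / 2, met, 0 * met)),
        key=lambda r: r.fun,
    )
    return best.fun

# e.g. the two tau candidates and missing ET in an HH -> bbbar tau-taubar event
vis1 = np.array([60.0, 10.0]);  m1 = 1.8
vis2 = np.array([-40.0, 20.0]); m2 = 1.8
met  = np.array([-15.0, -25.0])
print(f"MT2 = {mt2(vis1, m1, vis2, m2, met):.1f} GeV")
```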
The high-luminosity data set will also extend the mass range that can be probed in the search for new physics. One of the key places to look for a range of new-physics processes is in searches for resonances in the ttbar system. In the high-mass regime the analysis must deal with highly boosted tops, where the decay products (t->Wb; W->jj or l+nu) merge into a single reconstructed object. New reconstruction methods are being investigated to identify boosted tops and to reconstruct the elements of the top decay.
This project will focus on how boosted tops can be reconstructed, and will apply this to the high-luminosity data set to estimate the mass range that can be probed. You will use ATLAS data sets and skeleton analysis code to develop your own analysis. This will require programming in C++ and using the ROOT analysis package; training will be given as required.
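As a minimal sketch of what a ROOT-based analysis loop looks like (shown here with PyROOT rather than C++, and with entirely hypothetical file, tree and branch names), the example below opens a file, loops over a TTree and fills a histogram.

```python
# Minimal sketch of a ROOT-based analysis loop. The file name, tree name and
# branch names are hypothetical placeholders, not an actual ATLAS data format.
import ROOT

f = ROOT.TFile.Open("ttbar_sample.root")        # hypothetical input file
tree = f.Get("nominal")                         # hypothetical tree name
h_mass = ROOT.TH1F("h_mass", "top candidate mass;m [GeV];events", 100, 0.0, 500.0)

for event in tree:                              # PyROOT iterates over tree entries
    if event.jet_n >= 4:                        # hypothetical branch: number of jets
        h_mass.Fill(event.top_mass)             # hypothetical branch: candidate mass

out = ROOT.TFile("histograms.root", "RECREATE")
h_mass.Write()
out.Close()
```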
Preferred Dates: end of June - mid/end August 2014 (flexibility is possible)
Supervisor: Danilo Ferreira de Lima
Backup Supervisors: James Ferrando; Craig Buttar
Higgs-boson production in association with top-quark pairs
Particles in the Standard Model gain mass by interacting with the Higgs field. Being the heaviest quark, the top quark therefore has the highest probability of interacting with the recently found Higgs boson (the Yukawa coupling). The measurement of this coupling is one of the most interesting tests of the Standard Model.
The Higgs boson can be produced in association with a top-quark pair, the so-called ttH channel. At the measured Higgs mass of about 125 GeV, the Higgs decays predominantly into a b-quark pair. This channel is very challenging since it suffers from large backgrounds (background: events from different production processes that can look similar to the signal). The main background is the production of a top-quark pair with an additional b-quark pair, since it has exactly the same final-state particles (an irreducible background). The discrimination of ttH and ttbb events is therefore of the utmost importance.
For this separation, neural networks are a useful tool, allowing several variables to be combined into a final discriminant between the two processes. The topic of the project is the use of this neural network and the optimisation of the variables used for the final discriminant, in order to improve the signal/background separation. The project involves running the neural network and performing a statistical analysis of the results.
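Purely as an illustration of the workflow (not the ATLAS tools or input variables actually used), the sketch below trains a small neural network on toy "ttH-like" and "ttbb-like" events and evaluates a per-event discriminant between 0 and 1.

```python
# Illustrative sketch only: separate toy "ttH-like" (signal) from "ttbb-like"
# (background) events with a small neural network. The input variables and
# their toy distributions are invented for illustration.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 10000
# Toy stand-ins for discriminating variables, e.g. a b-jet pair mass and an angular variable
sig = np.column_stack([rng.normal(125, 20, n), rng.normal(2.0, 0.6, n)])
bkg = np.column_stack([rng.exponential(80, n) + 30, rng.normal(2.6, 0.8, n)])

X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

nn = MLPClassifier(hidden_layer_sizes=(20, 20), max_iter=500, random_state=0)
nn.fit(X_train, y_train)
discriminant = nn.predict_proba(X_test)[:, 1]   # per-event output in [0, 1]
print("test accuracy:", nn.score(X_test, y_test))
```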
Project type: data analysis using computing, using simulated data.
Prerequisites: the applicant should be familiar with Linux, C++, Python and/or bash-scripting.
Preferred dates: mid-May to mid-July 2014.
Supervisor: Andrea Knue
Backup supervisor: Tony Doyle
Development of a Pixel system test setup
The ATLAS tracker will be replaced to cope with the experimental requirements of the forthcoming High-Luminosity LHC. The inner region of the new tracker consists of silicon pixel detectors, which are read out using a dedicated readout ASIC, the FE-I4. Glasgow is developing a "quad" assembly that comprises a single large-area silicon pixel sensor bump-bonded to four FE-I4 chips.
The performance of the individual assemblies, each consisting of a single sensor and readout chip, must be characterised to determine whether they achieve the required signal-to-noise and efficiency. These single assemblies are then placed into multi-module systems, which must show the same performance.
This lab-based project will use the RCE data acquisition system to characterise single- and multi-assembly systems, measuring noise performance, signal-to-noise with radioactive sources, and detection efficiency. In addition to working with the RCE system, other system elements, including LV and HV power supplies, will need to be integrated.
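A standard ingredient of such noise measurements is the threshold scan: for each pixel, the hit occupancy is recorded as a function of injected charge and the resulting S-curve is fitted with an error function, whose midpoint gives the threshold and whose width gives the noise. The sketch below shows such a fit on invented data points.

```python
# Illustrative S-curve fit for one pixel: occupancy versus injected charge is
# fitted with an error function; the fitted midpoint is the threshold and the
# width is the noise (ENC). The data points below are invented.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def s_curve(q, threshold, noise):
    return 0.5 * (1.0 + erf((q - threshold) / (np.sqrt(2.0) * noise)))

charge = np.linspace(2000, 4000, 21)                     # injected charge [electrons]
occupancy = s_curve(charge, 3000.0, 150.0)               # ideal response of a toy pixel
occupancy += np.random.default_rng(2).normal(0, 0.02, charge.size)  # measurement scatter

(thr, enc), _ = curve_fit(s_curve, charge, occupancy, p0=[2500.0, 100.0])
print(f"threshold = {thr:.0f} e, noise = {enc:.0f} e ENC")
```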
Prerequisites: The skills required for this project include programming in Linux and C++ and using the ROOT analysis package, a basic knowledge of radiation detector systems, and an elementary knowledge of electronic circuits.
Training will be given as required.
Preferred dates: end June - end August 2014
Supervisor: Craig Buttar
Backup Supervisor: Richard Bates
Characterisation of HV-CMOS sensors
HV-CMOS sensors combine the elements of silicon radiation detectors with the capabilities of CMOS electronics. This allows sensors to be developed with built-in processing that can provide improved performance, such as reduced pixel size, leading to increased spatial resolution and in-situ z-measurements in strip detectors.
CMOS sensors can be manufactured in different ways, using high-resistivity substrates or special HV processes to provide the sensing region. As with all detector designs, there are a number of trade-offs to be made in terms of signal-to-noise, power, and signal complexity. For operation at the LHC, these devices must also be resistant to radiation (radiation hard).
This project will characterise a number of CMOS sensors with different structures (high-resistivity vs high-voltage) in a range of modes of operation using strip and pixel readout, looking at signal-to-noise, power dissipation and radiation hardness.
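To make the trade-off concrete, the back-of-the-envelope sketch below estimates how the depleted depth, and hence the collected signal and signal-to-noise, scales with substrate resistivity and bias voltage; all numbers (mobility, electron-hole pairs per micron, ENC, example operating points) are illustrative assumptions rather than properties of the actual devices under study.

```python
# Back-of-the-envelope sketch of the signal-to-noise trade-off discussed above:
# the depleted depth (and hence the collected signal) grows with substrate
# resistivity and bias voltage, and is compared to an assumed electronics noise.
# All numbers below are illustrative assumptions, not measured values.
import math

EPS_SI = 11.9 * 8.854e-12      # permittivity of silicon [F/m]
MU_H = 0.045                   # hole mobility [m^2/Vs], assumed p-type substrate
EH_PER_UM = 75.0               # assumed e-h pairs per micron from a minimum-ionising particle

def depletion_depth_um(resistivity_ohm_cm: float, bias_v: float) -> float:
    rho = resistivity_ohm_cm * 1e-2                       # convert to Ohm*m
    return math.sqrt(2.0 * EPS_SI * MU_H * rho * bias_v) * 1e6

def signal_to_noise(resistivity_ohm_cm: float, bias_v: float, enc_electrons: float) -> float:
    signal = EH_PER_UM * depletion_depth_um(resistivity_ohm_cm, bias_v)
    return signal / enc_electrons

# e.g. a 20 Ohm*cm HV process at 80 V versus a 1 kOhm*cm high-resistivity substrate at 10 V
for rho, v in [(20.0, 80.0), (1000.0, 10.0)]:
    print(f"{rho} Ohm*cm at {v} V:",
          f"depth = {depletion_depth_um(rho, v):.1f} um,",
          f"S/N = {signal_to_noise(rho, v, enc_electrons=100.0):.1f}")
```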
Prerequisites: The skills required for this project include programming in Linux and C++ and using the ROOT analysis package, a basic knowledge of radiation detector systems and an elementary knowledge of electronic circuits. Training will be given as required.
Project type: data analysis
Preferred dates: end June - August (flexible, for discussion)
Supervisor: Richard Bates
Backup supervisor: Craig Buttar