Research Projects Archive

2019-2020


“High-performance computing infrastructure and mesh-based data representation for phase-space calculations in nuclear femtography”

Principal Investigator: Nikos Chrisochoides (Computer Science Dept., Old Dominion University)
Collaborators: Gagik Gavalian (Jefferson Lab), Christian Weiss (Jefferson Lab), Yaohang Li (Computer Science Dept., Old Dominion University), Xu Liu (Computer Science Dept., William and Mary)

An interdisciplinary project to leverage advanced computational methods from medical imaging for the processing of phase-space distributions in nuclear femtography experiments, with the aim of enabling next-generation Monte Carlo simulations, data analysis, and physics model comparison. The objectives are:

  1. Design and implement a high-performance computing infrastructure for multithreaded phase-space calculations in Monte Carlo simulations;
  2. Design and implement a parallel mesh-based representation of phase-space data using tessellation methods (a minimal illustrative sketch follows below);
  3. Explore the potential of physics model comparison enabled by the new technologies.

The developments will be carried out in a collaborative effort of computer scientists and experimental and theoretical nuclear physicists.
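
As a purely illustrative sketch of the second objective (not the project's actual infrastructure), the Python snippet below tessellates a mock three-dimensional phase-space sample with a Delaunay triangulation and evaluates a placeholder per-cell quantity in parallel across processes; the sampling, the per-cell calculation, and the parallelization granularity are assumptions made only to make the idea concrete.

    # Minimal sketch (not the project's implementation): Delaunay tessellation of a
    # sampled phase-space region, with a per-cell quantity evaluated in parallel.
    import numpy as np
    from math import factorial
    from multiprocessing import Pool
    from scipy.spatial import Delaunay

    def cell_volume(vertices):
        # Volume of a d-simplex from its (d+1, d) array of vertex coordinates.
        edges = vertices[1:] - vertices[0]
        return abs(np.linalg.det(edges)) / factorial(vertices.shape[1])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        points = rng.uniform(size=(2000, 3))          # mock 3D phase-space sample
        mesh = Delaunay(points)                       # tessellation of the sample
        cells = [points[s] for s in mesh.simplices]   # one (4, 3) array per cell
        with Pool() as pool:                          # process-level parallelism
            volumes = pool.map(cell_volume, cells)    # placeholder per-cell work
        print(f"{len(cells)} cells, summed volume {sum(volumes):.3f}")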

“QCD theory and machine learning for global analysis”

Principal Investigator: Nobuo Sato (Dept. of Physics, Old Dominion University)
Collaborators: Ian Cloet (Argonne National Lab), Michelle Kuchera (Dept. of Physics, Davidson College), Yaohang Li (Computer Science Dept., Old Dominion University), Wally Melnitchouk (Jefferson Lab), Andreas Metz (Dept. of Physics, Temple University).

The goal of this project is to build the next generation of global QCD analysis tools using machine-learning techniques to study the quantum probability distributions characterizing the internal structure of the nucleon. In concert, QCD-inspired models will be developed and used to calculate Wigner distributions and their projections onto generalized parton distributions and transverse-momentum-dependent distributions. QCD theory will be used to train and optimize machine-learning algorithms by putting physical constraints on the mapping between observables and the quantum probability distributions. The resulting product will be a critical tool for the nuclear physics community, opening up new possibilities for collaboration with computer science in the exploration and visualization of the inner structure of hadrons and nuclei.
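
As a hedged, self-contained illustration of how a physics constraint can steer a flexible fit (the functional form, mock data, and sum-rule-style penalty below are assumptions for illustration only, not the machinery developed in this project), a toy version of the idea looks as follows:

    # Minimal sketch (illustrative only): fit a flexible PDF-like shape to mock
    # data while penalizing violation of a physics constraint (a normalization
    # integral), mimicking how theory input can constrain a learned mapping.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.integrate import trapezoid
    from scipy.special import beta

    def pdf(x, p):
        # Toy shape q(x) = N * x**a * (1 - x)**b  (an assumed form, for illustration).
        N, a, b = p
        return N * x**a * (1.0 - x)**b

    # Mock "observables": the toy shape sampled at a few x values, with 5% noise.
    rng = np.random.default_rng(1)
    truth = (1.0 / beta(0.5, 4.0), -0.5, 3.0)   # normalized so the integral is 1
    x_data = np.linspace(0.1, 0.8, 15)
    y_data = pdf(x_data, truth) * (1.0 + 0.05 * rng.normal(size=x_data.size))
    y_err = 0.05 * y_data

    x_grid = np.linspace(1e-3, 0.999, 400)      # grid for the constraint integral

    def loss(p):
        chi2 = np.sum(((pdf(x_data, p) - y_data) / y_err) ** 2)
        # Physics-inspired penalty: push the number integral toward 1,
        # a stand-in for the sum rules used to constrain real global fits.
        norm = trapezoid(pdf(x_grid, p), x_grid)
        return chi2 + 1e3 * (norm - 1.0) ** 2

    fit = minimize(loss, x0=[1.0, -0.3, 2.0], method="Nelder-Mead")
    print("best-fit (N, a, b):", fit.x)

The penalty term plays the role of the theory input: parameter sets that describe the mock data but violate the normalization integral are disfavored.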

“Visualizing Femtoscale Dynamics”

Principal Investigator: David Heddle (Dept. of Physics, Computer Science and Engineering, Christopher Newport University)
Collaborators: Volker Burkert, Latifa Elouadrhiri (Jefferson Lab), Francois Girod and Peter Schweitzer (Dept. of Physics, University of Connecticut), Nicholas Polys (Advanced Research Computing, Virginia Tech), Marc Vanderhaeghen (Institute for Nuclear Physics, JGU Mainz, Germany)

This project aims at the characterization and visualization of the internal mechanical and dynamical properties of the proton through the analysis of data for Deeply Virtual Compton Scattering, which provides a quantitative picture of the nucleon encapsulated in Generalized Parton Distributions. Towards this end, a team of experts has been assembled across a spectrum of disciplines, including both theoretical and experimental nuclear physics, visualization, imaging, and computing, to develop the groundwork for the visualization of femtoscale dynamics.
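
For orientation only (a standard leading-order, leading-twist relation rather than a statement of this project's formalism): DVCS observables access GPDs through Compton form factors such as

    \mathcal{H}(\xi, t) = \int_{-1}^{1} dx \, H(x, \xi, t)
        \left( \frac{1}{\xi - x - i\epsilon} - \frac{1}{\xi + x - i\epsilon} \right),

so the x-dependence of H(x, \xi, t) enters the data only through such convolutions, which is part of what makes the imaging and visualization task nontrivial.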

“Data Visualisation”

Principal Investigator: Carl Carlson (Dept. of Physics, William and Mary)
Collaborator: Marc Vanderhaeghen (Institute for Nuclear Physics, JGU Mainz, Germany)

The flow of data from Jefferson Lab and other labs across the world is already significant and set to grow by leaps and bounds. The data on cross sections, asymmetries, and other quantities are being measured as functions of a greater number of kinematic variables, making the recognition of patterns or regularities in the data increasingly challenging. The aim of the project is to exploit ideas developed in the medical imaging field to produce two- and higher-dimensional slices of the data that can reveal the salient features of nucleon structure.
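
A minimal sketch of the slicing idea (the kinematic variables, their mock distributions, and the binning below are placeholders, not Jefferson Lab data): bin mock events in four variables and view a two-dimensional slice with one variable fixed and another integrated over.

    # Minimal sketch (assumptions throughout): 2D slices of a mock
    # multidimensional event histogram, in the spirit of imaging-style
    # views of high-dimensional cross-section data.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)
    # Mock events in four kinematic variables (names and shapes are placeholders).
    xB  = rng.beta(2, 5, size=100_000)
    Q2  = rng.gamma(2.0, 1.5, size=100_000)
    t   = -rng.exponential(0.4, size=100_000)
    phi = rng.uniform(0, 2 * np.pi, size=100_000)

    counts, edges = np.histogramdd(
        np.column_stack([xB, Q2, t, phi]),
        bins=(20, 20, 10, 12),
    )

    # Slice: fix one t bin, integrate over phi, view counts in the (xB, Q2) plane.
    t_bin = 3
    slice_2d = counts[:, :, t_bin, :].sum(axis=-1)

    plt.imshow(slice_2d.T, origin="lower", aspect="auto",
               extent=[edges[0][0], edges[0][-1], edges[1][0], edges[1][-1]])
    plt.xlabel("x_B (mock)")
    plt.ylabel("Q^2 (mock)")
    plt.title(f"Counts in (x_B, Q^2), t bin {t_bin}, integrated over phi")
    plt.colorbar(label="events")
    plt.show()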

“Parton Distribution Functions from Lattice QCD”

Principal Investigator: Kostas Orginos (Dept. of Physics, William and Mary)
Collaborators: Andreas Stathopoulos, Eloy Romero (Computer Science Dept., William and Mary), Joseph Karpie (Dept. of Physics, William and Mary)

A major achievement in acquiring a first-principles understanding of proton structure through lattice QCD computations has been the development of new methods for the direct calculation of parton distribution functions, but several computational and theoretical challenges remain. This project will bring together computer scientists and physicists to address two of the principal problems: the solution of the inverse problem of obtaining the measures of proton structure from the quantities computed in lattice QCD, and the development of new computational methods to more efficiently exploit the emerging generation of exascale supercomputers and remove some of the computational bottlenecks.
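
As a hedged sketch of the first problem (the kernel, mock data, and Tikhonov regularization below are illustrative choices, not the methods this project will develop): recovering a distribution from a finite set of smooth integral transforms is an ill-posed inversion that requires some form of regularization.

    # Minimal sketch (illustrative assumptions): recover a distribution q(x) from
    # mock matrix elements M(nu) = int_0^1 dx cos(nu * x) q(x), a discretized
    # ill-posed inversion stabilized with Tikhonov regularization.
    import numpy as np

    # Discretize x and build the kernel matrix K[i, j] = cos(nu_i * x_j) * dx.
    x = np.linspace(0.0, 1.0, 200)
    dx = x[1] - x[0]
    nu = np.linspace(0.5, 12.0, 25)               # mock "Ioffe-time"-like points
    K = np.cos(np.outer(nu, x)) * dx

    q_true = 6.0 * x * (1.0 - x)                  # toy target distribution
    rng = np.random.default_rng(3)
    data = K @ q_true + 0.002 * rng.normal(size=nu.size)   # noisy mock data

    # Tikhonov-regularized least squares: minimize ||K q - data||^2 + lam * ||q||^2.
    lam = 1e-3
    q_rec = np.linalg.solve(K.T @ K + lam * np.eye(x.size), K.T @ data)
    print("max |q_rec - q_true| =", float(np.max(np.abs(q_rec - q_true))))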

“Wigner Imaging”

Principal Investigator: Simonetta Liuti (Dept. of Physics, University of Virginia)
Collaborators: Peter Alonzi (School of Data Science, University of Virginia), Matthias Burkardt (Dept. of Physics, New Mexico State University), Dustin Keller, Olivier Pfister (Dept. of Physics, University of Virginia), Petra Reinke (School of Engineering and Applied Science, University of Virginia)

The science of Nuclear Femtography probed by deeply virtual exclusive reactions has revolutionized our approach to exploring the internal structure of the nucleon. A new generation of current experiments, and those planned at the future EIC, could in principle allow us to incorporate all the information from data and phenomenology into a tomographic image connecting the deepest part of the quantum world with what we see as everyday matter around us. However, harnessing and organizing information from experiment and increasing the reach of this emergent field will require going beyond the standard computational toolbox. This pilot project is an effort in this direction. It consists of two parts: computational and visualization on one side, and theoretical on the other. The first element will be to examine and evaluate the use of new state-of-the-art computational methods and techniques, including visualization, to address the many layers of analysis that are necessary to extract the signal from its complex background after the large experimental data sets are acquired. The second element will be to develop a flexible model of the Wigner distribution which underlies the theoretical description of the data.
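
For context, stated schematically rather than as part of the project's formalism: integrating the quark Wigner distribution over the transverse position yields a transverse-momentum-dependent distribution, while integrating it over the transverse momentum yields an impact-parameter distribution connected to GPDs by Fourier transform,

    \int d^2 b_T \, W(x, \vec{k}_T, \vec{b}_T) \sim f(x, \vec{k}_T), \qquad
    \int d^2 k_T \, W(x, \vec{k}_T, \vec{b}_T) \sim q(x, \vec{b}_T),

which is why a flexible model of the Wigner distribution can serve as a common underlying object for both tomographic views.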

“Next-generation Visual Analysis Workspace for Multidimensional Nuclear Femtography Data”

Principal Investigator: Nicholas Polys (Advanced Research Computing, Virginia Tech)
Collaborators: Srijith Rajamohan (Advanced Research Computing, Virginia Tech), Markus Diefenthaler and Dmitry Romanov (Jefferson Lab)

The analysis of experimental data in five or more kinematic dimensions, which enables us to constrain Generalized Parton Distributions and Transverse-Momentum-Dependent Distributions, is a multidimensional data-science challenge. This project will apply recent advances in scientific visualization to gain more insight into the multidimensional datasets that lie at the forefront of Nuclear Femtography. In this initial phase, the project will explore Semantic Interactions as a visualization technique to analyze an ensemble of scientific data sets, motivated by ongoing research and development into the direct connection between Semantic Interactions and machine learning.
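
As a heavily simplified sketch of the semantic-interaction idea assumed here (in a real workspace the metric weights would be inferred from the analyst's direct manipulation of the layout; below they are simply set by hand): project mock five-dimensional kinematic records to two dimensions through a weighted distance metric.

    # Minimal sketch (assumptions throughout): a 2D layout of 5D records driven
    # by a weighted distance metric; the weights stand in for what a semantic-
    # interaction system would learn from the analyst's interactions.
    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(4)
    points = rng.normal(size=(150, 5))             # mock 5D kinematic records

    weights = np.array([3.0, 3.0, 0.5, 0.5, 0.5])  # stand-in for learned weights

    # Weighted Euclidean dissimilarities between all pairs of records.
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((weights * diff**2).sum(axis=-1))

    layout = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dist)
    print(layout.shape)                            # (150, 2) coordinates for a 2D workspace

Updating the weights from user-moved points, and coupling that update to a machine-learning model, is the part this sketch leaves out.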

http://vis.arc.vt.edu/projects/JeffersonLab/Femtography/
http://vis.arc.vt.edu/projects/JeffersonLab/SemanticInteraction/