From the types of samples to the techniques used to study them, user experiences at beamlines around the world vary, but one commonality connects them: beamtime is precious. At different facilities, users encounter different beamline controls and varying availability of compute infrastructure for processing their data. Beyond familiarizing themselves with different equipment and software setups, they also need to ensure that they’re collecting meaningful, consistent data no matter where they are. For the past several months, the ALS Computing group has been traveling around the world for beamtime. Their firsthand experience is informing the development of a suite of tools aimed at lowering the barriers to advanced data processing for all users.
Today’s beamtime experience
As a beamline scientist at the ALS, Dula Parkinson has helped numerous users with microtomography, a technique that can yield ten gigabytes of data in two seconds. “In many cases, users won’t have done this kind of experiment or analysis before, and they won’t have the computing infrastructure or software needed to analyze the huge amounts of complex data being produced,” he said.
Computational tools and machine-learning models can help users at every stage, from adjusting the experimental setup in real time to processing the data after the experiment has concluded. Removing these bottlenecks makes limited beamtime more efficient and helps users glean scientific insights more quickly.
As a former beamline scientist himself, Computing Program Lead Alex Hexemer has first-hand knowledge of the user experience. He was instrumental in the creation of a dedicated computing group at the ALS in 2018, which continues to grow in both staff numbers and diversity of expertise. A current focus for the group is to advance the user experience with intuitive interfaces.
Computing approach to beamtime
Recently, Hexemer and two of his group members, Wiebke Koepp and Dylan McReynolds, traveled to Diamond Light Source, where they worked with Beamline Scientist Sharif Ahmed to test some of their tools during a beamline experiment. “It is always useful to see other facilities from the user’s perspective,” McReynolds said. “We want our software to be usable at many facilities, so getting to test in other environments was very valuable.”
Computational infrastructure is an essential complement to the beamline instrumentation. To standardize their experiments across different microtomography beamlines, the team performed measurements on a reference material: sand with standardized size distributions. Each scan captures a “slice” of the sample; the slices then need to be reconstructed into three-dimensional images comprising 50 to 200 gigabytes of data.
Read more on ALS website
Image: The ALS Computing group performed experiments and tested their machine learning models at Beamline 8.3.2. Clockwise from back left: Tanny Chavez, Dylan McReynolds, Raja Vyshnavi Sriramoju, Seij De Leon, Dula Parkinson, Wiebke Koepp.