One of the important ways Berkeley Lab has revolutionized science is its development of first-of-their-kind scientific tools. These tools have often accelerated science, and sometimes even changed its trajectory. The Lab, of course, got its start with E.O. Lawrence's invention of the cyclotron, which accelerated scientific discovery at the subatomic level. Today, scientists around the world take advantage of the Lab's cutting-edge user facilities. And scientists at the Lab continue to develop new tools every day, sometimes in service of their own research, only to realize that the tools themselves may be valuable to other researchers.
A Drop-on-Tape Method for Efficient and Accurate Probing of Liquid Samples
Ten years ago, Jan Kern, a scientist in the Biosciences Area, was working on a research project that involved running continuous liquid streams of samples through an X-ray beam. He and his team were working with proteins of which they had a limited supply, and running these proteins in continuous streams was proving not only wasteful but also hard to control, no matter how hard the team tried to optimize the process. It was frustrating, to say the least.
One day, one of the project’s DOE program managers, Robert Stack (now retired), mentioned that a Bay Area liquid-handling robotics company called Labcyte (now part of Beckman Coulter) had some interesting ideas on how to manipulate droplets. Jan and his colleagues Junko Yano and Vittal Yachandra called up Richard Sterns, the scientist at Labcyte whom Robert had mentioned.
That conversation with Richard and his colleagues gave the team at the Lab some ideas. What if the liquid samples were presented as droplets rather than a continuous stream, and the droplets were steered toward the X-ray beam? Jan collaborated with scientists at Stanford’s Linac Coherent Light Source (LCLS) facility and the Diamond Light Source facility in the UK to develop and test different ways of doing this. Positioning free-flying droplets accurately proved difficult, so the team decided to put the droplets on a tape drive. After many iterations, they settled on a continuous Kapton tape that ran like a conveyor belt: droplets were deposited onto the tape, probed with the X-ray beam, and the tape was then washed and recirculated to carry the next set of droplet samples. The system worked like a dream.
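The geometry of the conveyor also sets the experiment’s clock: once droplet deposition is synchronized to the X-ray pulses, the time between triggering a reaction in a droplet and probing it is simply the tape’s travel time from one point to the other. A back-of-the-envelope sketch makes the point; the tape speed, distance, and repetition rate below are illustrative assumptions, not the published parameters of the system:

```python
# Illustrative numbers only -- not the parameters of the published system.
tape_speed = 30.0        # cm/s, assumed tape speed
trigger_to_beam = 15.0   # cm, assumed distance from trigger point to X-ray beam
pulse_rate = 10.0        # Hz, assumed X-ray pulse repetition rate

# Reaction time probed = tape travel time from trigger point to beam.
reaction_time = trigger_to_beam / tape_speed   # 0.5 s with these numbers
# One droplet per X-ray pulse fixes the droplet spacing along the tape.
droplet_spacing = tape_speed / pulse_rate      # 3.0 cm with these numbers

print(f"reaction time probed: {reaction_time:.2f} s")
print(f"droplet spacing:      {droplet_spacing:.1f} cm")
```

Under these assumptions, changing the tape speed or the trigger position scans the reaction time without altering the sample delivery itself.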
“With other processes that used solid supports, you had to throw away the support after it had been used for half an hour. This conveyor belt could be run for 12 hours or more without intervention. It was so much better,” said Jan. He and his partners published their findings in 2017.
Jan noted that the process was developed for a specific experiment using photoactive proteins, but the team realized it could also be used for other enzymes that can be triggered by a reactant, such as a gas or a chemical substrate. Working with scientists at the Lab and elsewhere, they conducted 30 experiments on about a dozen different enzyme systems, including work at four X-ray laser facilities at Stanford and in Japan, Korea, and Switzerland. The drop-on-tape process for liquid handling gained traction in the scientific community, and similar systems are now under development at synchrotron facilities in the UK, Germany, Japan, and Australia.
Jan believes the technology could have further applications. Beyond X-ray lasers and synchrotron beamlines, the system could be adapted for other detection modes, such as spectroscopy, in which optical signals would be measured from the same sample. It could also be adapted for inorganic samples; in materials science, for example, crystals are often too small for traditional crystallography. In fact, Aaron Brewster, a scientist in the Molecular Biophysics and Integrated Bioimaging (MBIB) Division who works with researchers at the Molecular Foundry, is examining the use of the system for the very small molecule crystals studied by his team.
“It will be very interesting to see the applications of this system broaden beyond structural biology,” said Jan.
The continuous operation of the system also opens the door for automation down the road. Jan’s team is now building a system for the Advanced Light Source (ALS) that includes an automation aspect. “The amount of data that the system can collect also means that data analysis becomes very important,” said Jan. “Partners including Paul Adams, Nick Sauter, and Aaron Brewster have done critical work on data analysis, allowing scientists to get fast feedback on the data that the ALS system will be generating.”
View the drop-on-tape method video (31:35 – 33:20)
A Tool to Bypass Chip Failures
In the Physical Sciences Area, particle physicist Dan Dwyer faced a challenge of a different sort. Since 2016, he has been working on the Deep Underground Neutrino Experiment (DUNE) project, which will use detectors in giant underground cryogenic tanks for neutrino science and proton decay studies. These studies seek answers to several fundamental questions about the nature of matter and the evolution of the universe.
Dan had an idea: move to pixelated 3D detectors to solve challenging problems with traditional 2D wire-based neutrino detectors. To validate the idea, he reached out to Carl Grace, a scientist and engineer in the Engineering Division who leads the Lab’s custom integrated circuit design and development efforts for DUNE. Carl determined that Dan’s idea was feasible with available technology, and together, in 2018, they designed the first-generation detector, a novel 3D liquid argon particle detector built on the LArPix (Liquid Argon Pixel) system. For simplicity, the team used a traditional daisy-chain structure for data input and output among the detector’s chips. The drawback of a daisy chain is that if one chip fails, it creates a dead end and data communication stops. And replacing a damaged chip in an underground instrument submerged in a cryogenic tank would be prohibitively expensive.
Dan, Carl, and their teams brainstormed other solutions. Adding thousands of independent backup wires was a no-go; the heat introduced by the wires would damage the detector. Another option was a communication “bus” that the chips would share, each taking control of the bus when its turn came to communicate. But what if one chip seized the bus and never let go? Fixing such a problem, given the difficulty of accessing the chip, could take months or even a year. They needed a more reliable solution.
Dan had an idea: what if they simply added a software layer on top of the existing circuitry to reroute around any failed components? If one chip turned out to be a bad actor, the software could automatically reconfigure the data path to avoid it without compromising the rest of the network. The idea had real potential. It would solve the reliability problem without anyone having to physically access the failed chip. As a bonus, the team could keep all the circuits that had already been tested in the first system.
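The core of the idea, recomputing each chip’s upstream data path over a mesh of chip-to-chip links so that traffic flows around a dead node, can be sketched in a few lines. Everything below (the link map, the `reroute` helper, the chip names) is a hypothetical illustration of that graph-level logic, not the actual LArPix firmware or interfaces:

```python
from collections import deque

# Hypothetical model: chips in a small mesh, each entry listing the
# neighbors a chip can exchange data with over its chip-to-chip links.
LINKS = {
    "ctrl": ["c00"],
    "c00": ["ctrl", "c01", "c10"],
    "c01": ["c00", "c11"],
    "c10": ["c00", "c11"],
    "c11": ["c01", "c10"],
}

def reroute(links, failed, root="ctrl"):
    """Recompute an upstream forwarding hop for every reachable chip,
    skipping failed ones (breadth-first search from the controller)."""
    upstream = {root: None}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for nbr in links[node]:
            if nbr not in failed and nbr not in upstream:
                upstream[nbr] = node   # nbr forwards its data via node
                queue.append(nbr)
    return upstream  # chips absent from the map are unreachable

# With chip c01 dead, c11's data is rerouted through c10 instead:
# {'ctrl': None, 'c00': 'ctrl', 'c10': 'c00', 'c11': 'c10'}
print(reroute(LINKS, failed={"c01"}))
```

In the real detector the reconfiguration has to happen over the chips’ physical links, but this is why a single bad chip no longer dead-ends the whole chain: as long as any healthy path exists, the network routes around the failure.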
Dan and Carl’s teams designed and built a prototype Hydra I/O system in early 2019 as part of the second-generation LArPix chip, then tested and debugged it. By the fall, the system was working just as the team wanted.
“We called it Hydra I/O because if you cut off one head it can grow two more,” said Dan.
Development moved rapidly after that. Since the start of the pandemic, the team has ramped up production of the mid-scale prototype detector, with more than 300,000 channels of electronics in four modules. The modules were finished earlier this spring and assembled at the University of Bern in Switzerland. The system worked beautifully, and the detectors were then moved to Fermilab, where they will be installed in a cryostat this fall for the next stage: the demonstration.
“At this stage the detector will actually collect neutrino data,” said Dan. “It’s very exciting, and Hydra I/O was a key component in taking the project from a bench experiment in 2019 to a real physics experiment in 2023.” If the demonstration works as expected, the plan is to build the full-size DUNE detector later this decade.
As for Hydra I/O’s future, the team has talked about adapting it for other experiments. The innovation won an R&D 100 Award in 2022, and Dan and Carl are now working with Dan McKinsey’s group at UC Berkeley to test Hydra I/O with a high-pressure helium particle detector. Beyond that, there are other potential applications, particularly in extreme environments where system reliability is critical: space, underwater, defense, telecommunications, and scientific instrumentation.
Building a Cold Microscope
Cryogenic electron microscopy, or cryoEM for short, is another critical research tool the Lab is developing. In the last five years, with growing interest in quantum computing and superconducting materials, the need to examine materials that only exist or operate at low temperatures has exploded. But today’s electron microscopes, which are optimized to function at room temperature, don’t work well for examining many temperature-sensitive materials. Image resolution is poor, partly due to the significant drift that occurs when “cold” samples sit in a room-temperature environment. In addition, many materials change their structure at room temperature.
Andy Minor, facility director of the National Center for Electron Microscopy (NCEM) at the Molecular Foundry, said, “Having your sample drift around while you’re trying to examine it at atomic resolution is a problem. Fundamentally, having a 300-degree difference in temperature between the sample and the instrument is just too limiting.”
Other approaches to reducing drift from thermal expansion, such as putting a low-temperature sample stick into a room-temperature microscope, have been explored, but they don’t work well.
CryoEM itself has been explored since the 1960s; Lab scientist Robert Glaeser has been recognized for his groundbreaking work in the field. The biological science community relies on averaging (combining many images of identical particles) to improve resolution when imaging cryogenically cooled samples with room-temperature microscopes, but that approach is unavailable for materials samples that are unique and can’t be averaged. New approaches to cryoEM could therefore significantly expand the range of materials that can be explored. Another advantage of a cold microscope is that the electron source itself can be improved: colder sources make higher-quality beams, but they are a challenge to integrate into room-temperature microscopes.
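The averaging trick is easy to see in a toy calculation: if the noise in each exposure is independent, averaging N images of the same structure shrinks the noise by roughly a factor of the square root of N, which is exactly what a one-of-a-kind sample cannot offer. A minimal numerical sketch, with arbitrary image size, frame count, and noise level:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
truth = rng.normal(size=(64, 64))   # stand-in for the "true" image
n_frames, sigma = 1_000, 5.0        # number of exposures, per-frame noise

# Average many noisy exposures of the *same* structure.
acc = np.zeros_like(truth)
for _ in range(n_frames):
    acc += truth + rng.normal(scale=sigma, size=truth.shape)
avg = acc / n_frames

print(f"single-frame noise: {sigma:.2f}")
print(f"averaged noise:     {(avg - truth).std():.3f}")  # ~ sigma / sqrt(n_frames)
```

With a unique material there is only one “frame” to work with, so the resolution gains have to come from the instrument itself.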
Andy and his team have been working on building a new type of cryoEM instrument since 2019. That year, Andy, fellow Lab scientist Peter Denes, and Cornell University scientist David Muller organized a workshop to explore the many things that could be achieved with such a microscope. Following the workshop, the team published a paper and successfully pitched the DOE for funds to conduct initial development.
In the last three years, a team led by Peter and Andy, leveraging the expertise of the Berkeley Center for Magnet Technology, has built significant components of a cryoEM: a stable lens and a superconducting tip through which to shoot electron beams at the sample. But a cryoEM has hundreds of lenses, and it also requires detectors and software. Putting all these components together, first into a prototype low-voltage system, and then testing the instrument’s performance will be an expensive proposition.
The good news is that there is strong demand for better electron microscopes. “Everyone who has a modern electron microscope would like a better one — semiconductor companies, pharmaceutical companies, materials researchers, and biologists,” said Andy.
“The impact of an ultrastable and ultracold cryoEM could be very large,” he continued. “Understanding the structure of materials at atomic scales is critical, and cryoEM would open so many doors, whether for developing new battery materials or quantum computers or biological materials. It will be interesting to see what people do with it.”