As AI continues to evolve, it has transformed from an auxiliary tool into a core driver of paradigm shifts in scientific research. Its applications now permeate the entire scientific workflow, from hypothesis generation to experimental validation, significantly enhancing efficiency and expanding the boundaries of discovery.
“AI for Science” has become a major trend, with deep integration already achieved across disciplines such as life sciences, materials science, earth and environmental sciences, engineering, and astronomy. For example, the 2024 Nobel Prize in Chemistry honored three scientists from the U.S. and U.K. for computational protein science, including the AI system AlphaFold 2, which predicted the structures of approximately 200 million proteins, covering nearly every protein known to science.
1. How is AI applied in high-energy physics?
Among scientific fields, high-energy physics (also known as particle physics) may have been one of the earliest to integrate artificial intelligence.
High-energy physics, a dominant branch of physics in the latter half of the 20th century, studies subatomic particles and their behavior under high-energy conditions. The notion of an “elementary particle”, the most fundamental building block of matter, has evolved over time. Atoms were once considered elementary; later, electrons and atomic nuclei were discovered. Protons and neutrons were then found within nuclei, and these in turn were revealed to consist of even more fundamental quarks. According to the Standard Model of particle physics, there are 61 elementary particles: 36 quarks, 12 leptons, 12 gauge bosons, and one Higgs boson. All known matter is built from these.
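For readers wondering how that figure of 61 is reached, one standard tally counts colors and antiparticles separately: six quark flavors in three colors, each with an antiparticle; six leptons and their antiparticles; the twelve gauge bosons (eight gluons, the photon, the W pair, and the Z); and the Higgs:

$$
\underbrace{6 \times 3 \times 2}_{\text{quarks}}
+ \underbrace{6 \times 2}_{\text{leptons}}
+ \underbrace{8 + 1 + 2 + 1}_{\text{gluons},\ \gamma,\ W^{\pm},\ Z}
+ \underbrace{1}_{\text{Higgs}}
= 36 + 12 + 12 + 1 = 61.
$$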
Particle physicists aim to uncover the internal structure and properties of these particles and discover new, even more fundamental ones.
As early as the 1980s, particle physicists began exploring AI. By around 2010, machine learning had become a standard analytical tool. Today, deep learning and other advanced AI techniques are deeply embedded in data analysis and signal filtering. With AI’s growing capabilities, scientists now harbor even greater expectations.
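To make the “signal filtering” use case concrete, the sketch below shows the kind of classifier high-energy physics analyses routinely train to separate rare signal events from overwhelming background. It is a minimal illustration only: the synthetic features and the gradient-boosted trees stand in for experiment-specific variables and tools (such as ROOT’s TMVA), not for any actual IHEP pipeline.

```python
# Minimal sketch: signal-vs-background event classification,
# the bread-and-butter ML task in collider data analysis.
# The features and synthetic data below are illustrative assumptions,
# not an actual experiment's variables or workflow.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Toy stand-ins for per-event kinematic features
# (e.g. invariant mass, transverse momentum, missing energy).
n = 5_000
background = rng.normal(loc=0.0, scale=1.0, size=(n, 4))
signal = rng.normal(loc=0.7, scale=1.0, size=(n, 4))  # slightly shifted
X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = background, 1 = signal

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Boosted decision trees were the workhorse HEP classifier well before
# deep learning; scikit-learn's implementation stands in here for
# experiment-specific toolkits.
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_train, y_train)

scores = clf.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, scores):.3f}")

# Events above a score threshold are kept as signal candidates; the
# threshold trades signal efficiency against background rejection.
keep = scores > 0.9
print(f"Selected {keep.sum()} of {len(scores)} test events")
```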
At the Guanghe Organization’s 2025 AI Innovation Conference, Chen Gang, researcher at the Institute of High Energy Physics (IHEP), Chinese Academy of Sciences (CAS), and Director of the National High Energy Physics Scientific Data Center, shared his insights on how AI empowers high-energy physics research in an interview with Xinshiyeh and other media.
Chen stated that over the past five years, the National High Energy Physics Scientific Data Center and IHEP have undertaken extensive work in AI. “We use AI technologies—including large models and agents—for scientific data analysis. This has significantly improved both the quality and precision of our analyses. Moreover, by training these models, we are gradually enabling them to conduct scientific research autonomously. It’s akin to mentoring a student: we guide large models or agents to analyze physical data and reproduce known phenomena. Currently, they possess capabilities roughly equivalent to a graduate student.”
He emphasized, “If we can advance further, these systems may eventually achieve autonomous scientific discovery—a breakthrough of profound significance. Today’s AI primarily enhances research efficiency and intelligence, but true autonomous discovery remains unrealized. Achieving this would be transformative, though the path ahead is still long.”
Notably, the public often associates high-energy physics solely with its experimental instrument—the large particle collider. In China, debates have arisen over whether to build such a collider. As AI grows more powerful, is constructing a large collider still necessary?
When this question was put to an AI model, it responded that AI does not diminish but greatly enhances the scientific value and necessity of colliders. AI enables us to design and interpret far more complex and powerful experiments. Weighing the massive investment in building a collider directly against the cost of developing AI tools is a misconception: the two are inseparable “two sides of the same coin” in humanity’s quest to understand the deepest structure of matter. The collider is the “exploration vessel” pushing the frontier of knowledge, while AI is the “intelligent compass” steering it into the unknown.
Indeed, the international community is moving ahead with plans for future large colliders: Europe is actively planning the Future Circular Collider (FCC) as the successor to the Large Hadron Collider (LHC).
2. How do domestic computing solutions empower scientific research?
The rapid development of AI relies fundamentally on underlying resources—computing power, electricity, and more.
Before 2020, global research labs commonly adopted “Intel CPU + NVIDIA GPU” as the standard hardware stack for high-performance computing and AI research. However, due to geopolitical factors, China must achieve self-reliance and control in computing power. Consequently, many domestic labs and institutions are intensifying support for homegrown computing solutions.
Regarding the real-world performance of domestic platforms, Chen Gang noted, “Today’s domestic processors, computing accelerators, and storage devices perform very well. When integrated with advanced domestic software systems, their actual performance is no worse than foreign alternatives. I often tell my international colleagues that we’ve already migrated our computing and storage stacks to domestic hardware platforms—they’re very interested and hope to adopt Chinese solutions themselves.”
Chen added that high-energy physics previously relied heavily on CPU clusters for scientific computing and data analysis. Now, they are transitioning to domestic accelerators like Hygon’s DCUs for data processing. This shift requires porting existing algorithms and software to new hardware architectures. He stressed, “This transition presents a great opportunity: we can fully leverage domestic chips while simultaneously driving their further development—a mutually reinforcing process.”
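What such a port involves varies by workload, but for ML-heavy code much of it reduces to writing device-agnostic programs and rebuilding against the vendor’s toolkit. The sketch below is a hypothetical illustration in PyTorch: it assumes the target accelerator ships a PyTorch build whose backend is exposed through the torch.cuda namespace, as ROCm/HIP-based stacks do. The specifics of Hygon’s DCU toolchain are an assumption here, not something stated in the interview.

```python
# Minimal sketch of device-agnostic PyTorch code, the usual first step
# when porting GPU workloads to a new accelerator. ASSUMPTION: the
# target accelerator provides a PyTorch build whose backend appears
# under the torch.cuda namespace (true of ROCm/HIP-based stacks);
# Hygon's actual toolchain details are not confirmed by the source.
import torch

def pick_device() -> torch.device:
    """Prefer whatever accelerator the local PyTorch build exposes."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
print(f"Running on: {device}")

# A small matrix multiply as a stand-in for a real analysis kernel.
# Nothing below names a specific vendor: the same script runs on
# NVIDIA GPUs, ROCm-compatible accelerators, or plain CPUs.
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b
print(f"Result checksum: {c.sum().item():.4f}")
```

Code written this way needs no source changes when the underlying hardware swaps out; the porting effort then concentrates on rebuilding dependencies and validating numerical accuracy, the point Chen emphasizes above.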
Sugon (Dawning Information Industry Co., Ltd.) has maintained a long-standing collaboration with IHEP. Public records show that in 2017, Sugon won a bid to supply servers and components to IHEP. In April 2025, IHEP selected Sugon’s AI solution as its computing backbone and, working with DeepAI’s Shensuan intelligent engine, independently developed Xiwu, the first L2-level large model in high-energy physics, focused on knowledge mining and discovery. In August 2025, Sugon and IHEP jointly launched the “Scientific Large Model Joint Solution Based on Domestic GPU Accelerators,” aiming to deepen AI integration into scientific research.
Reflecting on collaborations with Sugon and Hygon, Chen highlighted that IHEP has worked closely with these vendors over an extended period on software validation and optimization within their ecosystems. “This ensures both full utilization of hardware performance and acceptable computational accuracy—critical for the credibility of scientific results. Sugon has provided strong support in personnel and resources, yielding significant outcomes that set a valuable precedent for future collaborations between scientific data centers and hardware vendors.”
Moreover, given IHEP’s extensive international collaborations in scientific data, these partnerships help domestic enterprises “go global.” In talent development, joint efforts between researchers and engineers cultivate young scientists into domain experts while enabling companies to refine their products through real-world feedback.
At the Guanghe Organization conference, 22 leading universities, research institutes, and enterprises—including Guangzhou National Laboratory, Tianjin University, Hunan Applied Mathematics Center, IHEP, National Astronomical Observatories, Institute of Atmospheric Physics (CAS), BGP Inc. (China National Petroleum Corporation), Sugon, and Hefei Big Data Company—jointly launched the “Scientific Intelligence Collaborative Research Initiative.”
This initiative aims to systematically advance the deep integration of AI with critical fields such as bioinformatics, earth sciences, energy materials, healthcare, and industrial intelligence. Key focus areas include:
Development of scientific large models
Construction of hyper-intelligent converged computing platforms
Optimization of model training and inference
Open sharing of scientific data
This marks a pivotal shift in China’s “AI for Science” trajectory—from isolated technical breakthroughs toward systematic, ecosystem-driven, and scalable transformation. According to the Guanghe Organization, the initiative plans to deploy more than 20 industry-level AI4S software-hardware co-optimized solutions within the next three years.