Machine learning engineering is a branch of engineering that applies ongoing developments in data science to complex problems involving large amounts of disparate data, creating cost-effective and far-reaching benefits.
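To make the idea concrete, here is a minimal sketch of the kind of workflow a machine learning engineer puts into production; the dataset, model, and evaluation choices here are illustrative assumptions, not anything the field prescribes:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# A small tabular dataset stands in for "large amounts of disparate data".
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Train a model and evaluate it on held-out data, the core loop that
# machine learning engineering wraps in reliable, repeatable infrastructure.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```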
Exascale computers are the world’s fastest supercomputers. They deliver a performance of at least one exaflop, or one quintillion (10^18) calculations per second. Combined with simulation, they are positioned to help tackle some of the world’s greatest challenges in areas such as national security, climate, medicine, energy, and water.
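A back-of-the-envelope calculation conveys the scale; the world-population figure is a rough assumption, and the comparison (everyone on Earth computing by hand) is only an illustration:

```python
EXAFLOP = 1e18          # one quintillion calculations per second
WORLD_POPULATION = 8e9  # rough assumption: about 8 billion people

# If every person on Earth performed one calculation per second, how long
# would it take to match what an exascale machine does in one second?
seconds = EXAFLOP / WORLD_POPULATION
years = seconds / (60 * 60 * 24 * 365)
print(f"{seconds:.2e} seconds, roughly {years:.1f} years")  # about 4 years
```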
Quantum information science (QIS) lies at the intersection of quantum mechanics and computer and information science. QIS seeks to understand how information is processed and transmitted using quantum mechanical principles; it is the merger of quantum mechanics with information and computation theory. QIS comprises four major areas: quantum computing, quantum communication, quantum sensing, and quantum foundational science.
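As a small illustration of information being processed by quantum mechanical principles, the sketch below simulates a single qubit with plain linear algebra; this is a toy classical simulation, not how a quantum computer is actually programmed:

```python
import numpy as np

# A qubit state is a unit vector in C^2; the basis state |0> is [1, 0].
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0

# Born rule: measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- a fair coin flip, produced by quantum superposition
```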
Cyber resilience, sometimes referred to as cyber resiliency, is the ability to weather adverse events in a computing environment. The National Institute of Standards and Technology (NIST) defines cyber resilience as “the ability to anticipate, withstand, recover from, and adapt to adverse conditions, stresses, attacks, or compromises on systems that use or are enabled by cyber resources.” Cyber resilience applies to both physical and virtual assets.
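The NIST definition can be made concrete with a toy sketch: the hypothetical resilient_call below withstands transient failures (retry with backoff) and recovers by failing over to a backup. The service functions are invented stand-ins, not any real API:

```python
import random
import time

def call_primary():
    """Hypothetical service call that may fail under adverse conditions."""
    if random.random() < 0.7:
        raise ConnectionError("primary service unavailable")
    return "response from primary"

def call_backup():
    """Hypothetical degraded-but-functional fallback service."""
    return "response from backup (reduced capability)"

def resilient_call(retries=3, base_delay=0.1):
    # Withstand: retry transient failures with exponential backoff.
    for attempt in range(retries):
        try:
            return call_primary()
        except ConnectionError:
            time.sleep(base_delay * 2 ** attempt)
    # Recover: fail over to a backup so the mission continues.
    return call_backup()

print(resilient_call())
```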
Americans rely on critical infrastructures to protect the nation, maintain a strong economy, and enhance quality of life. These infrastructures—which include the electrical power grid, transportation systems, information networks, banking and finance systems, manufacturing and distribution, and more—are evolving and modernizing. They have become increasingly complex, connected, and vulnerable to adverse conditions, such as cyber and physical attacks.
Advanced computing testbeds, the proving grounds for new machines, are central to the development of next-generation computers. They allow researchers to explore a complex, non-linear design space and to evaluate new computing technologies for performance and efficiency on critical scientific workloads. These “laboratories of machines,” in which multiple components are available for experimentation, are critical to the next great advances in computation.
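As a loose software analogy for how a testbed is used, the sketch below times a stand-in workload (dense matrix multiplication) across several configurations; real testbed evaluation spans hardware, compilers, and full scientific applications:

```python
import time
import numpy as np

def benchmark(workload, configs, repeats=3):
    """Time a workload across candidate configurations, testbed-style."""
    results = {}
    for name, kwargs in configs.items():
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            workload(**kwargs)
            times.append(time.perf_counter() - start)
        results[name] = min(times)  # best-of-N reduces timing noise
    return results

# A stand-in "scientific workload": dense matrix multiplication.
def matmul_workload(n):
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    a @ b

configs = {"small": {"n": 256}, "medium": {"n": 512}, "large": {"n": 1024}}
for name, seconds in benchmark(matmul_workload, configs).items():
    print(f"{name}: {seconds:.4f} s")
```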
Graph analytics is the evaluation of information that has been organized as objects and their connections. The purpose of graph analytics is to understand how the objects relate or could relate.
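A minimal sketch of both purposes, using an invented toy graph: direct connections show how objects relate, and a simple common-neighbors score suggests how unconnected objects could relate:

```python
from itertools import combinations

# Objects (nodes) and their connections (edges), stored as adjacency sets.
edges = [("Ana", "Bo"), ("Ana", "Cy"), ("Bo", "Cy"),
         ("Bo", "Dee"), ("Cy", "Dee"), ("Dee", "Eve")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# How objects relate: who is directly connected to whom.
print(sorted(adj["Bo"]))  # ['Ana', 'Cy', 'Dee']

# How objects *could* relate: unconnected pairs ranked by shared neighbors,
# a simple common-neighbors link-prediction score.
for u, v in combinations(sorted(adj), 2):
    if v not in adj[u]:
        score = len(adj[u] & adj[v])
        if score:
            print(f"{u} -- {v}: {score} shared neighbor(s)")
```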