Batteries were invented in 1800, but their complex chemical processes are still being explored and improved. While batteries come in several types, at its essence a battery is a device that converts chemical energy into electrical energy.
Machine learning engineering is a branch of engineering that applies ongoing developments in data science to complex problems involving large amounts of disparate data, creating cost-effective and far-reaching benefits.
Exascale computers are the world’s fastest supercomputers. They achieve a performance of at least one exaflop, or one quintillion (10^18) calculations per second. Combined with simulation, they are positioned to help tackle some of the world’s greatest challenges in areas such as national security, climate, medicine, energy, and water.
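To put one quintillion calculations per second in perspective, a minimal back-of-the-envelope sketch; the 10^18 figure follows from the definition of an exaflop, while the ~100-gigaflop laptop rate is an assumed round number for comparison, not a measured spec:

```python
# Illustrative arithmetic only: comparing one exaflop to an assumed
# ~100-gigaflop laptop. Neither machine is a specific real system.
EXAFLOP = 10**18        # operations per second at one exaflop (by definition)
LAPTOP = 100 * 10**9    # assumed laptop rate: ~100 gigaflops

# Time a laptop would need to match one second of exascale work
seconds = EXAFLOP / LAPTOP
days = seconds / 86_400
print(f"{seconds:.0f} s, roughly {days:.0f} days")
```

At these assumed rates, one second of exascale computation corresponds to months of laptop time, which is why such machines matter for simulation at scale.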
Fish passage refers to the ability of fish to move effectively and efficiently through a water system, including successfully navigating human-made structures, such as dams and culverts, that can impede their journey.
At the intersection of quantum mechanics and computer and information science lies quantum information science (QIS), the merger of quantum mechanics with information and computation theory. QIS seeks to understand how information is processed and transmitted using quantum mechanical principles. It comprises four major areas: quantum computing, quantum communication, quantum sensing, and quantum foundational science.
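A toy sketch of one quantum-information building block mentioned above: a single qubit's state is a pair of complex amplitudes, and the squared magnitudes of those amplitudes give the probabilities of measuring 0 or 1. The specific amplitudes below are an arbitrary example, not from the text:

```python
# Minimal qubit illustration: amplitudes -> measurement probabilities.
# The amplitudes here are example values for an equal superposition.
import math

alpha = complex(1 / math.sqrt(2), 0)   # amplitude for |0>
beta = complex(0, 1 / math.sqrt(2))    # amplitude for |1>

p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
assert math.isclose(p0 + p1, 1.0)      # a valid state's probabilities sum to 1
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")
```

This probabilistic structure, absent from classical bits, is what the four QIS areas exploit in different ways.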
Climate science, or climatology, is the study of Earth’s climate. Climate scientists want to better understand our planet’s atmosphere and how it affects various ecosystems. Many equate climate with the weather. And indeed, the word climate is usually defined as the average weather conditions in a particular area over a long-term period, such as years or decades. But climate science touches far more than weather trends.
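The "average weather over a long-term period" definition above can be sketched in a few lines: climatologists conventionally compute a climate "normal" as a multi-decade mean of a weather variable. The temperatures below are made-up illustrative values, not real data:

```python
# Minimal sketch of climate as a long-term average of weather.
# Yearly mean temperatures are hypothetical, in degrees Celsius.
from statistics import mean

yearly_mean_temps_c = [14.2, 13.8, 14.5, 14.1, 13.9, 14.4]

climate_normal = mean(yearly_mean_temps_c)
print(f"Long-term average: {climate_normal:.2f} °C")
```

Real climate normals average 30 years of observations per station; the principle is the same.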
Cyber resilience, which is also sometimes referred to as cyber resiliency, is the ability to weather adverse events in a computing environment. The National Institute of Standards and Technology (NIST) defines cyber resilience as “the ability to anticipate, withstand, recover from, and adapt to adverse conditions, stresses, attacks, or compromises on systems that use or are enabled by cyber resources.” Cyber resilience applies to both physical and virtual assets.
Americans rely on critical infrastructures to protect the nation, maintain a strong economy, and enhance quality of life. These infrastructures—which include the electrical power grid, transportation systems, information networks, banking and finance systems, manufacturing and distribution, and more—are evolving and modernizing. They have become increasingly complex, connected, and vulnerable to adverse conditions, such as cyber and physical attacks.
Export controls are U.S. laws and regulations that govern the shipment, transmission, or transfer of sensitive equipment, information, and software to foreign countries, persons, or entities. Export controls exist to protect the national security and foreign policy interests of the United States. These laws and regulations ensure adequate oversight of the transfer and use of the products and materials required to develop proliferation-sensitive parts of the nuclear fuel cycle.
Bioinformatics uses computers to make sense of the vast amounts of data researchers can now glean from living things, which can be as seemingly simple as a single cell or as complex as the human immune response. Bioinformatics helps researchers decipher the human genome, examine the global picture of a biological system, develop new biotechnologies, and perfect new legal and forensic techniques, and it will be used to create the personalized medicine of the future.
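A toy illustration of the kind of sequence analysis bioinformatics automates: counting nucleotides in a DNA string and computing its GC content. The sequence below is made up for illustration; real analyses run over genome-scale data:

```python
# Hypothetical DNA fragment; real inputs are millions of bases long.
from collections import Counter

seq = "ATGCGCGTATTACGCGGC"

counts = Counter(seq)                             # per-nucleotide tallies
gc_content = (counts["G"] + counts["C"]) / len(seq)
print(counts, f"GC content: {gc_content:.1%}")
```

Simple statistics like GC content scale from this toy string to whole genomes, which is precisely why computers are indispensable here.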
Advanced computing testbeds, the proving grounds for new machines, are central to the development of next-generation computers. They allow researchers to explore a complex and non-linear design space and facilitate the evaluation of new computing technologies in terms of performance and efficiency on critical scientific workloads. These “laboratories of machines,” in which multiple components are available for experimentation, are critical to the next great advances in computation.
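A minimal sketch of the kind of measurement a testbed enables: timing a small numerical kernel and reporting its throughput. The kernel here is a stand-in for a real scientific workload, and the numbers it produces depend entirely on the machine it runs on:

```python
# Stand-in workload: not a real scientific benchmark, just a timed loop.
import time

def workload(n: int) -> float:
    """A simple kernel: sum of n multiply-adds."""
    total = 0.0
    for i in range(n):
        total += i * 0.5
    return total

n = 1_000_000
start = time.perf_counter()
workload(n)
elapsed = time.perf_counter() - start
print(f"~{n / elapsed:,.0f} loop iterations per second")
```

Testbed evaluations follow the same pattern at far larger scale, sweeping real workloads across candidate hardware to compare performance and efficiency.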