POROUS MATERIALS FOR ENHANCED IONIZATION EFFICIENCY (iEdison No. 0685901-19-0013)
We have employed established porous engineered materials, such as metal-organic frameworks (MOFs), to enhance ionization efficiency and simplify sample loading, greatly expanding our capabilities in low-level characterization of environmentally and/or strategically important elements.
PERFORMANCE AND USABILITY ENHANCEMENTS FOR CONTINUOUS SUBGRAPH MATCHING QUERIES ON GRAPH-STRUCTURED DATA
StreamWorks is a network analysis framework that enables an analyst to monitor and analyze streaming computer network traffic data to identify emerging computer network intrusions and threats. Different types of graphical query patterns may be defined for specific types of cyberattacks, including various network scans, reflector attacks, flood attacks, viruses, and worms. StreamWorks will support subgraph matching on computer network attributes such as hostnames, IP addresses, protocols, ports, packet sizes, machine types, and message types. The speed of subgraph pattern matching will be accelerated by collecting and utilizing node and edge frequency information to optimize search paths through a massive data graph. Computer network intrusion analysis will involve live computer network data streamed in at high data rates and the analysis of data graphs consisting of millions to billions of edges. For known patterns, specific graphical query patterns are collected in a library and continuously and efficiently matched against the dynamic graph as it is updated. Each graph query is captured as a subgraph join tree, which decomposes the query graph into smaller search subpatterns. These smaller subpatterns signify precursor events that emerge before the full query pattern is complete. As precursor events are detected in data streams, they are matched to the nodes of different subgraph join trees. Matching that occurs higher in a join tree indicates a higher probability that a specific type of attack is occurring. A similarity or confidence score may be computed for the partial matching through training on collected computer network traffic data to measure the frequencies of occurrence of partial subpatterns as precursors to the full graph query pattern. For unknown patterns or zero-day exploits, the same analysis framework may be applied to track the emergence of small subpatterns as they appear in the data stream.
The system may be seeded with hints to look for small graph patterns that involve rare events (based on collected statistics), events involving critical resources such as an authentication server, domain name server, or database, or particular host machines of specific suspicion or interest to analysts. When seeded subpatterns are found in the data stream, they are tracked and monitored within subgraph join trees. Here, subpatterns are joined based on specific criteria, such as when the subpatterns grow beyond a certain threshold size, additional critical resources are introduced into a subpattern, or important types of interactions or communications are detected. Thus, full attack patterns may dynamically emerge from the small seeded patterns or hints. The initial seeded patterns may have confidence scores generated from collected statistics or assigned by analysts, which are then propagated up through the subgraph join tree. Additionally, StreamWorks will provide mechanisms for analysts to vet tracked subpatterns so as to improve analysis and performance by excluding benign patterns from monitoring and assessment. The advanced dynamic graph algorithms have been packaged into a streaming network analysis framework known as StreamWorks. With StreamWorks, a scientist or analyst may detect and identify precursor events and patterns as they emerge in complex networks. This analysis framework is intended to be used in a dynamic environment where network data is streamed in and appended to a large-scale dynamic graph. An interactive graph query construction tool has been developed that allows an analyst to build a query graph. Various cyberattack templates have been developed for querying the dynamic graph, where an analyst may tailor the attributes of a cyberattack query by adjusting parameters of the cyberattack template.
The dynamic results, which are the subpatterns of the template that are matched in the dynamic graph, are returned to the analyst in a visualization showing the emerging and evolving patterns along with a visualization of the subgraph join tree containing statistics on the level of matching per partial subgraph pattern.
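The join-tree idea described above can be illustrated with a minimal Python sketch. This is not the StreamWorks implementation; the class names, the left-deep tree shape, and the three-edge scan-then-exploit query are all illustrative assumptions, intended only to show how partial matches higher in the tree signal a more complete attack pattern.

```python
# Illustrative sketch of a subgraph join tree (names and query are hypothetical,
# not the StreamWorks API). A query graph is decomposed into smaller edge
# subpatterns; matches higher in the tree indicate higher attack likelihood.
from dataclasses import dataclass, field

@dataclass
class JoinNode:
    """A node in the join tree: a subpattern covering one or more query edges."""
    edges: frozenset            # query edges covered by this subpattern
    children: tuple = ()        # child subpatterns joined to form this node
    matches: list = field(default_factory=list)

def build_join_tree(query_edges):
    """Decompose a query graph into a left-deep join tree of edge subpatterns."""
    nodes = [JoinNode(frozenset([e])) for e in query_edges]
    root = nodes[0]
    for leaf in nodes[1:]:
        root = JoinNode(root.edges | leaf.edges, children=(root, leaf))
    return root

def match_level(root, detected_edges):
    """Return the size of the largest subpattern fully covered by detected
    edges; larger values mean matching higher in the join tree."""
    detected = set(detected_edges)
    best = 0
    stack = [root]
    while stack:
        node = stack.pop()
        if node.edges <= detected:
            best = max(best, len(node.edges))
        stack.extend(node.children)
    return best

# Hypothetical query pattern for a scan-then-exploit attack (3 edges).
query = [("scanner", "scan", "host"),
         ("scanner", "auth_fail", "host"),
         ("scanner", "exploit", "host")]
tree = build_join_tree(query)
print(match_level(tree, query[:2]))   # → 2 (precursor: 2 of 3 query edges seen)
```

A confidence score could be attached to each `JoinNode` and propagated upward as the text describes, with partial matches reported to the analyst before the full pattern completes.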
ELECTRON BEAM MASKS FOR COMPRESSIVE SENSORS
Transmission microscopy imaging systems include a mask and/or other modulator situated to encode image beams, e.g., by deflecting the image beam with respect to the mask and/or sensor. The beam is modulated/masked either before or after transmission through a sample to induce a spatially and/or temporally encoded signal by modifying any of the beam/image components including the phase/coherence, intensity, or position of the beam at the sensor. For example, a mask can be placed/translated through the beam so that several masked beams are received by a sensor during a single sensor integration time. Images associated with multiple mask displacements are then used to reconstruct a video sequence using a compressive sensing method. Another example of masked modulation involves a mechanism for phase-retrieval, whereby the beam is modulated by a set of different masks in the image plane and each masked image is recorded in the diffraction plane.
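The temporal encoding described above can be sketched numerically. The snippet below shows only the forward model (several mask-translated frames summed into one sensor integration); the compressive-sensing reconstruction step, which requires a sparsity-regularized solver, is not shown. The frame data, mask, and shift pattern are synthetic assumptions for illustration.

```python
# Sketch of the coded-mask forward model: within one sensor integration time,
# a translating binary mask modulates successive image frames and the sensor
# records their sum. Recovering the frame sequence from y requires a
# compressive-sensing solver with a sparsity prior (not shown here).
import numpy as np

rng = np.random.default_rng(0)
T, H, W = 8, 32, 32                      # frames per integration; image size
frames = rng.random((T, H, W))           # stand-in for the true frame sequence x_t
base_mask = rng.random((H, W)) > 0.5     # random binary mask

# Translate the mask by one column per frame, as when a mask is swept
# through the beam during a single integration time.
masks = np.stack([np.roll(base_mask, t, axis=1) for t in range(T)])

# Single coded measurement recorded by the sensor: y = sum_t (mask_t * x_t).
y = (masks * frames).sum(axis=0)
print(y.shape)                           # → (32, 32): one snapshot encodes T frames
```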
SENSING ANALYTICAL INSTRUMENT PARAMETERS, SPECIMEN CHARACTERISTICS, OR BOTH FROM SPARSE DATASETS
Disclosed are methods for sensing conditions of an electron microscope system and/or a specimen analyzed thereby. Also disclosed are sensor systems and electron microscope systems able to sense system conditions, and/or conditions of the specimen being analyzed by such systems. In one embodiment, a sparse dataset can be acquired from a random sub-sampling of the specimen by an electron beam probe of the electron microscope system. Instrument parameters, specimen characteristics, or both can be estimated from the sparse dataset.
OPTIMIZED SUB-SAMPLING IN AN ELECTRON MICROSCOPE
Compressive Sensing (CS) allows a signal to be sparsely measured first and accurately recovered later in software [1]. In scanning transmission electron microscopy (STEM), it is possible to compress an image spatially by reducing the number of measured pixels, which decreases electron dose and increases sensing speed [2,3,4]. The two requirements for CS to work are: (1) sparsity of the basis coefficients and (2) incoherence of the sensing system and the representation system. However, when pixels are missing from the image, it is difficult to have an incoherent sensing matrix. Nevertheless, dictionary learning techniques such as Beta-Process Factor Analysis (BPFA) [5] are able to simultaneously discover a basis and the sparse coefficients in the presence of missing pixels. On top of CS, we would like to apply active learning [6,7] to further reduce the proportion of pixels being measured while maintaining image reconstruction quality. Suppose we initially sample 10% of the pixels at random. We wish to select the next 1% of pixels that are most useful in recovering the image. Now we have 11% of the pixels, and we want to decide the next 1% of "most informative" pixels. Active learning methods are online and sequential in nature. Our goal is to adaptively discover the best sensing mask during acquisition using feedback about the structures in the image. In the end, we hope to recover a high-quality reconstruction with a dose reduction relative to the non-adaptive (random) sensing scheme. To do this, we try three metrics applied to the partial reconstructions for selecting the new set of pixels: (1) variance, (2) Kullback-Leibler (K-L) divergence using a Radial Basis Function (RBF) kernel, and (3) entropy. Figs. 1 and 2 display the comparison of Peak Signal-to-Noise Ratio (PSNR) for these three active learning methods at different percentages of sampled pixels. At the 20% level, all three active learning methods underperform the original CS without active learning.
However, they all beat the original CS as more of the "most informative" pixels are sampled. One can also argue that CS equipped with active learning requires fewer sampled pixels to achieve the same PSNR than CS with randomly sampled pixels, since all three PSNR curves with active learning grow at a faster pace than the curve without active learning once the pixel fraction exceeds 20%. For this particular STEM image, by observing the reconstructed images and the sensing masks, we find that while the RBF-kernel method acquires samples more uniformly, the entropy-based method samples more heavily in areas of significant change, and thus less uniformly. The K-L divergence method performs best in terms of reconstruction error (PSNR) for this example [8].
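One step of the adaptive sampling loop can be sketched as follows. This toy uses the variance metric only, a synthetic test image, and a crude neighborhood mean-fill in place of BPFA dictionary learning; all of those substitutions are assumptions for illustration, not the method evaluated in the figures.

```python
# Toy sketch of one variance-based active-learning step: start from a 10%
# random mask, form a cheap partial reconstruction (mean-fill stand-in for
# BPFA), then measure the 1% of unmeasured pixels with highest local variance.
import numpy as np

rng = np.random.default_rng(1)
H = W = 64
xg, yg = np.meshgrid(np.arange(W), np.arange(H))
image = np.sin(xg / 5.0) * np.cos(yg / 7.0)          # synthetic test image

measured = rng.random((H, W)) < 0.10                 # initial 10% random mask

def reconstruct(img, mask):
    """Crude stand-in for BPFA: fill unmeasured pixels with the mean of
    measured pixels in the 3x3 neighborhood."""
    vals = np.where(mask, img, 0.0)
    num = np.zeros_like(img)
    cnt = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            num += np.roll(np.roll(vals, dy, 0), dx, 1)
            cnt += np.roll(np.roll(mask.astype(float), dy, 0), dx, 1)
    rec = num / np.maximum(cnt, 1.0)
    return np.where(mask, img, rec)

def local_variance(rec):
    """3x3 local variance of the partial reconstruction."""
    mean = sum(np.roll(np.roll(rec, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    sq = sum(np.roll(np.roll(rec ** 2, dy, 0), dx, 1)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return sq - mean ** 2

# One active-learning step: rank currently unmeasured pixels by variance
# and add the top 1% to the sensing mask.
rec = reconstruct(image, measured)
score = np.where(measured, -np.inf, local_variance(rec))
k = int(0.01 * H * W)                                # next 1% of pixels
before = int(measured.sum())
idx = np.argpartition(score.ravel(), -k)[-k:]
measured.ravel()[idx] = True
print(int(measured.sum()) - before)                  # → 40 new pixels measured
```

Iterating this step (re-reconstruct, re-score, add 1%) yields the adaptive sensing mask; swapping the scoring function gives the K-L divergence and entropy variants.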
Multidimensional Structured Data Visualization Method and Apparatus, Text Visualization Method and Apparatus, Method and Apparatus for Visualizing and Graphically Navigating the World Wide Web, Method and Apparatus for Visualizing Hierarchies
The invention consists of three parts. The first is a method for generating graphical representations of the contents of large internet directory structures. The visual metaphor employed for this purpose is that of a shaded-relief terrain image. The second component is a design for a human-computer interface capable of supporting a variety of graphical interactions with such representations, including graphical browsing, querying, and "bookmarking" of sites or features of interest. The third is an information delivery architecture that enables internet browser users to use such maps to browse, query, bookmark, and otherwise navigate the display to locate web sites of interest. Key innovations described here include: a method for generating spatial representations of large quantities (10,000,000+) of web sites that organizes, depicts, and enables graphical navigation of the sites; a method for generating a graphical underlayment for such spatial representations, in the form of a shaded-relief terrain image, that conveys to the viewer information about the topical variability of the site distribution in an intuitive and visually compelling way; a user interface that seamlessly integrates directory, query-results, and "bookmark" browsing and enables their visual cross-referencing; and a general architecture for delivering the previously described visualization components to internet browser users with extremely low bandwidth requirements. This invention could serve as the basis for an internet portal, providing users with an exciting and effective new means for accessing and interacting with information on the World Wide Web.
System and Process for Production of Isotopes and Isotope Compositions
The availability of longer-lived positron emitters has made possible PET-based imaging of tumors by radiolabeling monoclonal antibodies (mAbs), mAb fragments, and aptamers, a process referred to as immunoPET. ImmunoPET combines the high sensitivity and spatial resolution of PET imaging with the antigen specificity of mAbs. 89Zr is gaining tremendous interest as an immunoPET isotope due to its ease of production using monoisotopic (natural) yttrium targets and moderate-to-low-energy medical cyclotrons. In addition to opportunities for new and emerging medical modalities, the long half-life of 89Zr (t1/2 = 78.4 hr) enables the potential for off-site isotope production and distribution. PNNL developed new column-based purification methods capable of producing ultra-high-purity 89Zr for use in immunoPET diagnostic imaging. The new methods yield improved 89Zr product purities compared to the 89Zr currently available in Europe and the U.S. Additionally, PNNL integrated laboratory automation into the purification process, making highly reproducible, remote purification of 89Zr possible. Scale-up of 89Zr purification processes will result in high dose rates to personnel, so it is important that the production method be capable of being performed in remote, shielded locations.
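A short worked example makes the off-site distribution point concrete: with the 78.4 hr half-life quoted above, a substantial fraction of the activity survives a shipment of a day or two. The 48 hr delay below is an assumed illustrative shipping time, not a figure from the text.

```python
# Worked decay example for 89Zr (t1/2 = 78.4 hr, from the text): fraction of
# activity remaining after an assumed shipping delay.
T_HALF_HR = 78.4

def fraction_remaining(hours):
    """A(t)/A0 = 2^(-t / t_half)"""
    return 2.0 ** (-hours / T_HALF_HR)

print(round(fraction_remaining(48.0), 3))   # → 0.654 after a 48 hr shipment
```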
SYSTEM AND METHOD OF STORING AND ANALYZING INFORMATION
SGEM is a scalable semantic graph database. The software platform comprises three major components: 1) a SPARQL-to-C compiler, 2) a multithreaded graph library (SGLib), and 3) a custom multithreaded runtime layer (GMT). The compiler converts SPARQL queries to data-parallel C code with calls to SGLib and GMT methods. SGLib is a library of graph methods and data structures. It includes methods to ingest RDF triple files, store data in a compressed neighbor graph, and run multithreaded graph algorithms customized for query processing and GMT. GMT is a multithreaded runtime system for commodity servers, clusters, and cloud systems, customized for query processing. GMT manages a global address space, thread scheduling, and data aggregation.
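The "compressed neighbor graph" storage can be illustrated with a small sketch. The snippet below builds a standard compressed-sparse-row (CSR) layout from RDF-style triples; the triple data, dictionary encoding, and layout details are illustrative assumptions, not the actual SGLib structures (which are in C).

```python
# Illustrative CSR ("compressed neighbor graph") ingest of RDF-style triples.
# Subjects/objects and predicates are dictionary-encoded to integer ids, and
# each vertex's (predicate, neighbor) pairs are stored contiguously.
from collections import defaultdict

triples = [
    ("alice", "knows",   "bob"),
    ("alice", "worksAt", "pnnl"),
    ("bob",   "knows",   "carol"),
]

ids, preds = {}, {}
def vid(v): return ids.setdefault(v, len(ids))      # vertex id
def pid(p): return preds.setdefault(p, len(preds))  # predicate id
edges = [(vid(s), pid(p), vid(o)) for s, p, o in triples]

# Build CSR: offsets[v] .. offsets[v+1] indexes v's (predicate, neighbor) list.
adj = defaultdict(list)
for s, p, o in edges:
    adj[s].append((p, o))
n = len(ids)
offsets, neighbors = [0], []
for v in range(n):
    neighbors.extend(sorted(adj[v]))
    offsets.append(len(neighbors))

def out_edges(v):
    """All (predicate_id, object_id) pairs leaving vertex v."""
    return neighbors[offsets[v]:offsets[v + 1]]

print(out_edges(ids["alice"]))   # → [(0, 1), (1, 2)]
```

A query engine scans `neighbors[offsets[v]:offsets[v+1]]` per vertex, which is cache-friendly and easy to partition across threads, the kind of access pattern a runtime like GMT is described as scheduling.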
Slow Magic Angle Spinning Nuclear Magnetic Resonance Device and Process for Metabolomics Profiling of Tissues and Biofluids
Line broadening in biological samples can be suppressed using the 1H-PASS method applied at a sample spinning rate of about 100 Hz or less. A novel slow-MAS probe has been developed that is capable of high-resolution, high-sensitivity metabolic profiling of biological samples with volumes ranging from a few hundred nanoliters to a few milliliters. The nanoliter capability may make it possible to follow metabolic changes through continued investigation of a single small laboratory animal over a long period of time using minimally invasive blood and tissue biopsy samples, while the milliliter capability would allow minimally destructive studies of intact biological objects with sizes as large as a few cm.