Presentation + Paper
7 June 2024

Grounding ontologies with pre-trained large language models for activity based intelligence
Anee Azim, Leon Clark, Caleb Lau, Miles Cobb, Kendall Jenner
Abstract
The development of Activity Based Intelligence (ABI) requires an understanding of individual actors’ intents, their interactions with other entities in the environment, and how these interactions facilitate accomplishment of their goals. Statistical modelling alone is insufficient for such analyses, mandating higher-level representations such as ontologies to capture important relationships. However, constructing ontologies for ABI, ensuring they remain grounded in real-world entities, and maintaining their applicability to downstream tasks requires substantial hand-tooling by domain experts. In this paper, we propose the use of a Large Language Model (LLM) to bootstrap a grounding for such an ontology. Subsequently, we demonstrate that the experience encoded within the weights of a pre-trained LLM can be used in a zero-shot manner to provide a model of normalcy, enabling ABI analysis at the semantic level, agnostic to the precise coordinate data. This is accomplished through a sequence of two transformations, applied to a kinematic track, toward natural language narratives suitable for LLM input. The first transformation generates an abstraction of the low-level kinematic track, embedding it within a knowledge graph using a domain-specific ABI ontology. The second employs a template-driven narrative generation process to form natural language descriptions of behavior. Computation of the LLM perplexity score upon these narratives achieves grounding of the ontology. This use does not rely on any prompt engineering. In characterizing the perplexity score for any given track, we observe significant variability given chosen parameters such as sentence verbosity, attribute count, clause ordering, and so on. Consequently, we propose an approach that considers multiple generated narratives for an individual track and the distribution of perplexity scores for downstream applications.
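The two-stage transformation described above (kinematic track, to an ontology-grounded semantic abstraction, to template-driven narratives) can be sketched roughly as follows. This is a minimal illustration, not the authors' pipeline: the attribute names, behavior thresholds, and sentence templates are all hypothetical stand-ins for the domain-specific ABI ontology and narrative templates the paper uses, and the templates deliberately vary verbosity and clause ordering, since the abstract notes perplexity is sensitive to both.

```python
import math

# Stage 1 (illustrative): abstract a raw kinematic track into semantic
# attributes. The paper embeds tracks in a knowledge graph via an ABI
# ontology; a flat attribute dict stands in for that structure here.
def abstract_track(track):
    """Map raw (t, x, y) points to hypothetical semantic attributes."""
    speeds = [math.hypot(x2 - x1, y2 - y1) / (t2 - t1)
              for (t1, x1, y1), (t2, x2, y2) in zip(track, track[1:])]
    mean_speed = sum(speeds) / len(speeds)
    return {
        "actor": "vessel",
        # Threshold is an arbitrary illustrative choice, not from the paper.
        "behavior": "loitering" if mean_speed < 1.0 else "transiting",
        "region": "harbor approach",
    }

# Stage 2 (illustrative): template-driven narrative generation, with
# several phrasings per attribute set (different verbosity and clause order).
TEMPLATES = [
    "A {actor} was observed {behavior} in the {region}.",
    "In the {region}, a {actor} was {behavior}.",
    "The {actor}, located in the {region}, appeared to be {behavior}.",
]

def narratives(attrs):
    """Render every template against one track's semantic attributes."""
    return [t.format(**attrs) for t in TEMPLATES]

track = [(0, 0.0, 0.0), (10, 2.0, 1.0), (20, 3.5, 1.5)]
for sentence in narratives(abstract_track(track)):
    print(sentence)
```

Each narrative would then be scored by a pre-trained LLM with no prompt engineering; the multiple phrasings per track are what give rise to the distribution of perplexity scores the abstract proposes to use.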
We demonstrate the successful application of this methodology against a semantic track association task. Our subsequent analysis establishes how such an approach can be used to augment existing kinematics-based association algorithms.
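Because perplexity varies with phrasing, scoring a track means summarizing a distribution rather than a single number. A minimal sketch of that aggregation, assuming a hypothetical `log_probs(text)` function that returns per-token log-probabilities from a pre-trained LLM (here faked with a toy stand-in; the real function would sum next-token log-probabilities under the model):

```python
import math
from statistics import mean, median

def perplexity(token_log_probs):
    """Perplexity = exp(-mean per-token log-probability)."""
    return math.exp(-mean(token_log_probs))

def narrative_perplexities(narratives, log_probs):
    """Score every template-generated narrative for one track.

    `log_probs` is a hypothetical stand-in for a pre-trained LLM
    scoring function returning one log-probability per token.
    """
    return [perplexity(log_probs(n)) for n in narratives]

def association_score(ppls):
    """Summarize the perplexity distribution for downstream use; the
    median is one robust choice when some templates produce outliers."""
    return median(ppls)

# Toy stand-in for an LLM: assign each whitespace token a fixed
# log-probability so the example runs without a model.
def fake_log_probs(text):
    return [-2.0 for _ in text.split()]

ppls = narrative_perplexities(
    ["A vessel was loitering.",
     "The vessel appeared to loiter near the harbor approach."],
    fake_log_probs,
)
print(association_score(ppls))
```

In a track-association setting, such distribution summaries for candidate track pairings could be combined with kinematics-based association costs; the specific combination rule is the subject of the paper's analysis, not something this sketch reproduces.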
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Anee Azim, Leon Clark, Caleb Lau, Miles Cobb, and Kendall Jenner "Grounding ontologies with pre-trained large language models for activity based intelligence", Proc. SPIE 13057, Signal Processing, Sensor/Information Fusion, and Target Recognition XXXIII, 130570L (7 June 2024); https://doi.org/10.1117/12.3013332
KEYWORDS
Semantics
Data modeling
Intelligence systems
Machine learning
Tracking and scene analysis