Sensor, Data, and Information Fusion
|
Title
|
Abstract
|
Distributed Data Fusion and
Resource Management for CEMA
|
Presentation to the Old Crows Cyber and Electromagnetic Activities (CEMA) 2022 conference in Belcamp, MD. Covers architectures and algorithms to fuse cyber and electromagnetic (EM) data to provide higher-level information in support of commanders' situational awareness.
|
Distributed Data Fusion and Resource
Management for Cyberspace and Electromagnetic Activities
|
Cyberspace
and Electromagnetic Activities (CEMA) consist of cyberspace operations, electronic
warfare, and electromagnetic spectrum management operations. Distributed
Data Fusion and Resource Management for CEMA (DDFRM-CEMA) is an integrated
estimation and sensor/source management process that has matured over a
series of programs addressing the various functions that have ultimately
been integrated into a complete analysis process. The CEMA Data Fusion (DF) Level 0-3 functions make inferences from CEMA sensor and source data to objects and events, develop linkages between them, and assert predictions about them. The Resource Manager (RM) Level 4 DF function exploits an information-theoretic approach that optimizes data/information collection to satisfy layered Commander's Critical Information Requirements (CCIR) and disambiguate DF hypotheses. This process, called Information Based Sensor/Source Management (IBSM), measures information by the expected decrease in uncertainty of the estimated value. It uses a goal lattice and a sensor/source Applicable Function Table (AFT) to maximize the expected information value rate (EIVR) through sensor cues and source requests. This data-pull scheme is essential for CEMA DF, where data-push is infeasible; pushing Packet Captures (PCAPs), for example, would multiply the volume of data to be moved.
operations are made semantically consistent by a formal and extensible
ontology that can go from CEMA modalities to organizational behaviors,
intentions, and plans, and whose formal structure reinforces mathematically
correct relationships. The ontology represents relationships (temporal,
whole-part, causality, etc.) with which to fuse attack patterns from sensed observations and extracted features. DDFRM-CEMA is a unique analytical toolkit and integrated estimation and action-taking process that offers distinctive features and benefits for complex problems in the CEMA problem space. Keywords: cyberspace, CEMA, fusion, ontology, resource management, optimization, directed graphs, artificial intelligence, IBSM
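As a rough illustration of the IBSM idea (not drawn from the paper -- the sensors, hypotheses, AFT entries, goal weights, and timings below are all invented), a minimal Python sketch of greedy EIVR maximization might look like:

```python
# Hedged sketch of Information Based Sensor/Source Management (IBSM):
# a greedy scheduler that cues the sensor/source whose goal-weighted
# expected entropy reduction per unit time (a proxy for EIVR) is highest.
# Sensor names, hypotheses, and AFT entries are hypothetical.
import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete hypothesis distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Applicable Function Table (AFT): which sensor/source can refine which
# hypothesis, the distribution expected after collection, the collection
# time, and a goal-lattice weight (mission value). All values invented.
aft = {
    ("EW_receiver", "emitter_ID"): {"post": [0.85, 0.10, 0.05], "time": 2.0, "goal_wt": 0.9},
    ("PCAP_query",  "emitter_ID"): {"post": [0.70, 0.20, 0.10], "time": 8.0, "goal_wt": 0.9},
    ("PCAP_query",  "intrusion"):  {"post": [0.95, 0.05],       "time": 8.0, "goal_wt": 0.7},
}
prior = {"emitter_ID": [0.4, 0.35, 0.25], "intrusion": [0.6, 0.4]}

def eivr(action):
    """Expected information value rate: goal-weighted entropy drop per second."""
    (sensor, hyp), row = action
    gain = entropy(prior[hyp]) - entropy(row["post"])
    return row["goal_wt"] * gain / row["time"]

best = max(aft.items(), key=eivr)
print("Cue:", best[0], "EIVR = %.3f bits/s" % eivr(best))
```

With these invented numbers the scheduler cues the fast EW receiver rather than the slow PCAP pull, which is the data-pull behavior the abstract argues for.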
|
Cyber Ontology (CybOnt) Data Fusion
|
CybOnt performs
ontology-based fusion for cyber threat behavior estimation to contribute to
an operator's cyber Situational Awareness (SA) and Situational
Understanding (SU). It is unique in
that (1) it is architected following the Joint Directors of Laboratories (JDL) fusion levels, (2) it uses a formal ontology for the T-Box (types) and A-Box (actuals), and (3) it computes mathematically principled -- and thus robust -- likelihood ratios of attack behavior hypotheses. Inference links are visualized in a graph
database tool that allows customized viewing tailored to operator
requirements. The likelihood ratios
can be thresholded to give operators control over
display clutter. It runs in a
tactical cloud environment and uses big data technologies.
Keywords:
cyber, fusion, ontology, artificial intelligence. DISTRIBUTION STATEMENT C. Distribution
authorized to U.S. Government Agencies and their contractors; critical
technology; 2018-09-27. Available
upon request to authorized personnel.
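A minimal sketch of the likelihood-ratio thresholding idea (the hypotheses, probabilities, and threshold below are invented, not CybOnt's actual values):

```python
# Hedged sketch (hypothetical values): log-likelihood ratios of attack
# behavior hypotheses, thresholded so operators can tune display clutter.
import math

# P(evidence | attack behavior) vs. P(evidence | benign) per hypothesis;
# hypothesis names and probabilities are illustrative only.
hypotheses = {
    "lateral_movement": (0.30, 0.02),
    "data_exfil":       (0.12, 0.08),
    "port_scan":        (0.50, 0.45),
}

THRESHOLD = 0.5  # operator-adjustable: raise to declutter the display
for name, (p_attack, p_benign) in hypotheses.items():
    llr = math.log10(p_attack / p_benign)
    shown = "SHOW" if llr >= THRESHOLD else "hide"
    print(f"{name:18s} log10 LR = {llr:+.2f} -> {shown}")
```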
|
A Data
Fusion Approach to Indications and Warnings of Terrorist Attacks
|
Indications
and Warning (I&W) of terrorist attacks, particularly IED attacks,
requires detection of networks of agents and patterns of behavior. Social
Network Analysis tries to detect a network; activity analysis tries to
detect anomalous activities. This work builds on both to detect elements of an activity model of terrorist attack activity -- the agents, resources, networks, and behaviors. The activity model is expressed as RDF triple statements where the tuple positions are elements or subsets of a formal
ontology for activity models. The advantage of a model is that elements are
interdependent and evidence for or against one will influence others so
that there is a multiplier effect. The advantage of the formality is that
detection could occur hierarchically, that is, at different levels of
abstraction. The model matching is expressed as a likelihood ratio between
input text and the model triples. The likelihood ratio is designed to be
analogous to track correlation likelihood ratios common in JDL fusion level
1. This required development of a semantic distance metric for positive and
null hypotheses as well as for complex objects. The metric uses the Web 1Terabyte database of one- to five-gram frequencies for priors. This size requires the use of big data technologies, so a Hadoop cluster is used in conjunction with OpenNLP natural language processing and Mahout clustering software. Distributed data fusion MapReduce jobs
distribute parts of the data fusion problem to the Hadoop nodes. For the
purposes of this initial testing, open source models and text inputs of
similar complexity to terrorist events were used as surrogates for the
intended counter-terrorist application.
Keywords: Data Fusion, Hadoop, Mahout,
Semantic Distance, Probability Mass, Activity Model Matching
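As a toy illustration only (the paper's actual metric and priors are not reproduced here; the term counts and distance function below are invented), triple matching against a frequency-based null hypothesis could be sketched as:

```python
# Crude sketch: score how well an observed (subject, predicate, object)
# statement matches an activity-model triple using a semantic distance
# backed by invented n-gram counts standing in for Web 1Terabyte data.
import math

NGRAM = {"person": 6e7, "obtain": 5e6, "acquire": 2e6,
         "fertilizer": 1e6, "TOTAL": 1e12}  # illustrative counts only

def prior(term):
    """Prior probability of a term from its (surrogate) corpus frequency."""
    return NGRAM.get(term, 1.0) / NGRAM["TOTAL"]

def sem_dist(a, b):
    """0 if identical; otherwise an information-based mismatch cost."""
    if a == b:
        return 0.0
    return -0.5 * (math.log(prior(a)) + math.log(prior(b)))

def triple_log_lr(observed, model):
    """Log-likelihood ratio: observation instantiates the model triple
    vs. the null hypothesis that it arose by chance (frequency prior)."""
    llr = 0.0
    for obs_term, model_term in zip(observed, model):
        p_match = math.exp(-sem_dist(obs_term, model_term))
        llr += math.log(p_match / prior(obs_term))
    return llr

model_triple = ("person", "acquire", "fertilizer")
observation  = ("person", "obtain",  "fertilizer")
print("log LR = %.1f" % triple_log_lr(observation, model_triple))  # > 0: match
```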
|
A
Mathematical Cyber Ontology (CybOnt) for Cyber Data Fusion and Cyber Data
Exchange
|
Silver
Bullet Solutions, Inc., with teammates CUBRC, Inc. and Edutainiacs,
Inc., is developing a mathematical ontology for cyber events, entities, behaviors, associations, and intentions -- for cyber Situation Awareness (SA) and associated cyber Command and Control (C2): Network Operations (NetOps), Defensive Cyber Operations (DCO), and Offensive Cyber Operations (OCO). This Cyber Ontology (CybOnt) will improve
interoperable data exchange between cyber operations nodes and enable data
fusion for detection of cyber attacks as they are being planned and before
they become incidents.
|
An
Information Fusion Framework for Data Integration
|
Despite high demand and years of dozens of product offerings, enterprise data integration remains a manually intensive effort, with custom development for each data interface. It involves linguistics, ontological models, uncertain reasoning, inference, and other inexact and not fully understood sciences. This paper presents an approach for making progress in
data integration technology by paralleling progress made in the data fusion
community where the fundamental problems are now being appreciated. A framework
for information fusion as a means to achieve data integration is presented.
|
Real-time
DBMS for Data Fusion
|
As data and information fusion technologies and applications evolve, the need to cope with large amounts and diverse types of data is increasing. Although there would be many benefits to employing Database Management Systems (DBMS) in automated fusion processes, the data access throughput requirements of those processes have vastly exceeded the performance of off-the-shelf DBMSs. The availability of large random access memories is allowing the development of real-time database management systems. While these are currently being used in financial
market and telecommunications applications, their ability to bring DBMS
benefits to data fusion applications has yet to be explored. This paper
presents results of experimentation with these emergent real-time DBMSs
for automated fusion applications. We used algorithms, data
characteristics, and scenarios from deployed and R&D systems. The
application-dependent data structures were converted to
Entity-Relationship models and generated into real-time and conventional
DBMSs. Keywords: Data Fusion, DBMS, real-time, correlation, knowledgebase,
embedded
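For a feel of the experiments' style (the schema, row count, and gating query below are invented, and in-memory SQLite merely stands in for the commercial real-time DBMSs that were actually tested), a throughput probe might look like:

```python
# Hedged sketch: an in-memory database standing in for a real-time DBMS.
# The track table is a toy Entity-Relationship model; values are illustrative.
import sqlite3, time, random

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE track (
    track_id INTEGER PRIMARY KEY,
    x REAL, y REAL, t REAL)""")
db.execute("CREATE INDEX idx_xy ON track (x, y)")

# Load surrogate tracks.
rows = [(i, random.uniform(0, 100), random.uniform(0, 100), 0.0)
        for i in range(100_000)]
db.executemany("INSERT INTO track VALUES (?,?,?,?)", rows)
db.commit()

# Time a correlation-style gating query: all tracks near a new report.
start = time.perf_counter()
hits = db.execute(
    "SELECT track_id FROM track WHERE x BETWEEN ? AND ? AND y BETWEEN ? AND ?",
    (49.0, 51.0, 49.0, 51.0)).fetchall()
print(f"{len(hits)} gated tracks in {(time.perf_counter()-start)*1e3:.2f} ms")
```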
|
Early
Experiments with Ontology-Based Fusion
|
There
is a growing sense in the fusion community that an underlying ontology
would improve fusion. We explored the idea and reported the results of
experimentation to ISIF in 2002 and 2003. In 2004 we continued with theory development and experimentation. The experimentation described in this paper involved the development of a formal ontology from a data model so that automated processes can reason dynamically by virtue of the formal properties of the ontology relationship types. The experimentation has been for a next generation fusion architecture that is an open architecture in the well-documented sense and that adds an ontology layer for further decoupling and coordination of software components. The experimentation has involved rehosting
of existing fusion algorithms to operate within the ontology and a
publish/subscribe architecture. While the experiments to date have shown
how the components can be made to interoperate at the data level, we
believe the architecture will ultimately promote or enforce probabilistic
interoperability between components providing a fusion open architecture at
the probabilistic level. Keywords: Data Fusion, ontology, semantic modeling
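A minimal sketch of what reasoning from the formal properties of relationship types can mean, assuming hypothetical relationship types and facts:

```python
# Hedged sketch: an automated process derives new facts purely from the
# declared formal properties (here, transitivity) of relationship types.
# The relationship types and facts are hypothetical.
PROPERTIES = {
    "part_of":     {"transitive": True},
    "located_in":  {"transitive": True},
    "detected_by": {"transitive": False},
}
FACTS = [("antenna", "part_of", "radar"),
         ("radar",   "part_of", "ship"),
         ("ship",    "located_in", "strait")]

def infer(facts):
    """Closure over transitive relationship types (naive fixpoint)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(derived):
            for (c, r2, d) in list(derived):
                if b == c and r1 == r2 and PROPERTIES[r1]["transitive"]:
                    if (a, r1, d) not in derived:
                        derived.add((a, r1, d))
                        changed = True
    return derived

for triple in sorted(infer(FACTS) - set(FACTS)):
    print("inferred:", triple)   # e.g., ('antenna', 'part_of', 'ship')
```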
|
Airport
Movement Area Knowledge-Assisted Association and Tracking
|
This
white paper describes an approach for improving airport movement area
aircraft and vehicle tracking using knowledge-based techniques. This design
employs a knowledge-based fusion approach that would take into account airport geography, vehicle movement patterns, static prior data, expert rules, and heuristics about sensor characteristics.
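As a rough sketch of the idea (the taxiway map, gate size, and geography weighting below are invented, not the white paper's design), a knowledge-adjusted association score might be computed as:

```python
# Hedged sketch of knowledge-assisted association: a kinematic gate score
# is down-weighted by airport-geography knowledge (hypothetical map and
# weights) so off-taxiway associations are penalized.
import math

TAXIWAY_SEGMENTS = [((0, 0), (100, 0)), ((100, 0), (100, 80))]  # toy map

def dist_to_segment(p, seg):
    """Distance from point p to a taxiway centerline segment."""
    (x1, y1), (x2, y2) = seg
    px, py = p
    dx, dy = x2 - x1, y2 - y1
    t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))

def association_score(report, predicted, gate=15.0):
    """Kinematic score, down-weighted when the report is off any taxiway."""
    kinematic = math.exp(-math.hypot(report[0] - predicted[0],
                                     report[1] - predicted[1]) / gate)
    off_taxiway = min(dist_to_segment(report, s) for s in TAXIWAY_SEGMENTS)
    geography = math.exp(-off_taxiway / 10.0)  # expert-rule weight (assumed)
    return kinematic * geography

print(association_score(report=(60, 2), predicted=(58, 0)))   # on taxiway
print(association_score(report=(60, 40), predicted=(58, 0)))  # off taxiway
```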
|
Bayes
Networks for Diverse-State and Large-Scale Fusion
|
Generalized inference provides an elegant formulation for fusing sources that have many diverse states that are nonetheless interrelated, albeit often in weak and complex ways. Indeed, levels 1 through 3 fusion can be characterized as
inferring states from evidence; estimation can be viewed as a specific
inference discipline. Unfortunately, the elegant inference formulation
rapidly becomes intractably complex for any real-world problems due to the
permutations of interrelationships between the interacting state variables.
Bayesian networks provide a way of coping with this complexity. They are techniques for making probabilistic inference tractable and have been the subject of broad literature and research for quite some time. This paper
describes the application of the Bayes network technique to a real-world
large-scale fusion problem. It provides experience with the many
adaptations and extensions that are required and illustrates some issues
that need further research. Keywords: Entity-Relationship Modeling,
Semantic Network, Inference Network, Data Fusion
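The paper's large-scale application is not reproduced here, but a minimal sketch of the underlying technique, with a hypothetical two-evidence network and invented conditional probability tables, is:

```python
# Hedged sketch: a three-node Bayesian network (hypothetical CPTs)
# illustrating how conditional independence keeps inference tractable.
# Structure: Emitter_Active -> Intercept, Emitter_Active -> Track_Maneuver.
P_ACTIVE = 0.2
P_INTERCEPT = {True: 0.9, False: 0.1}   # P(intercept | active)
P_MANEUVER  = {True: 0.6, False: 0.2}   # P(maneuver  | active)

def posterior_active(intercept, maneuver):
    """P(active | evidence) by direct enumeration over the one root node."""
    def joint(active):
        p = P_ACTIVE if active else 1 - P_ACTIVE
        p *= P_INTERCEPT[active] if intercept else 1 - P_INTERCEPT[active]
        p *= P_MANEUVER[active] if maneuver else 1 - P_MANEUVER[active]
        return p
    num = joint(True)
    return num / (num + joint(False))

print("P(active | intercept, maneuver)    = %.3f" % posterior_active(True, True))
print("P(active | intercept, no maneuver) = %.3f" % posterior_active(True, False))
```

Real-world networks have thousands of such variables; the network factorization is what keeps the joint distribution from becoming intractable.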
|
Multi-Hypothesis
Database for Large-Scale Data Fusion
|
Progress
in deploying large-scale fusion systems has been slow in recent years
despite substantial algorithmic developments. One reason is that there has not
been a way to address a large-scale enterprise in a tractable manner that
allows modular and collaborative evolution of fusion algorithms.
Information and data modeling techniques have become quite mature over the
past 20 years so that it is now possible to model the information domain of
a large-scale enterprise tractably.
By extending information modeling constructs to semantic and
inference nets, it is possible to use these information models as a basis
for large-scale fusion. This paper
shows how to instrument an information model into a fusion inference
structure. Algorithm encapsulation
and computing techniques are discussed. This approach could lead to
foundations for large-scale fusion in defense, intelligence, law
enforcement, and air traffic control systems. Keywords: Entity-Relationship Modeling, Semantic
Network, Inference Network, Data Fusion
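A minimal sketch of what such a multi-hypothesis structure could look like, with invented entity and report identifiers:

```python
# Hedged sketch: a multi-hypothesis store in which an information-model
# entity keeps alternative association hypotheses with scores, so fusion
# algorithms can evolve modularly against one schema. Names are invented.
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    entity_id: str          # information-model entity this instantiates
    evidence: tuple         # supporting report IDs
    log_likelihood: float

@dataclass
class MultiHypothesisStore:
    by_entity: dict = field(default_factory=dict)

    def add(self, hyp: Hypothesis):
        self.by_entity.setdefault(hyp.entity_id, []).append(hyp)

    def best(self, entity_id: str) -> Hypothesis:
        return max(self.by_entity[entity_id], key=lambda h: h.log_likelihood)

store = MultiHypothesisStore()
store.add(Hypothesis("vessel_7", ("radar_12", "ais_3"), -1.2))
store.add(Hypothesis("vessel_7", ("radar_12",), -2.9))
print(store.best("vessel_7"))
```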
|
Data Management for the
Warfighter Information Processing Cycle
|
The
Data Access Function (DAF) provides net-centric services and means to
access information within and relevant to the Warfighter Information
Processing Cycle (WIPC). Within the
Combat System, the DAF provides access to sensor, track, reference, context, and sensor tasking and cueing information.
The DAF consists of many data access services needed to meet the
broad range of QoS, IA, and topology requirements
and information types accessed across the WIPC. The DAF helps WIPC services operate
autonomously with respect to each other; by separating the functionality of
the service from the data, the services interact via the commonly
understood and accessed data without any knowledge of, or explicit interaction with, each other.
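A minimal sketch of this data-mediated decoupling, with hypothetical topic and record names:

```python
# Hedged sketch of the DAF decoupling idea: services publish and subscribe
# to typed data topics and never reference each other directly.
from collections import defaultdict

class DataAccessFunction:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, record):
        for cb in self._subscribers[topic]:
            cb(record)

daf = DataAccessFunction()
# A fusion service consumes track data without knowing who produced it.
daf.subscribe("track", lambda r: print("fusion service got:", r))
# A sensor service produces track data without knowing who consumes it.
daf.publish("track", {"id": 42, "x": 10.0, "y": 3.5})
```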
|
Ontology-Based Inference with Inferlets
|
We
propose to develop a massive ontology and use it as a framework for
class-level inference for improved situation awareness. Our proposal is to conduct research on concepts and preliminary experiments about which we have written and presented conference papers at academic and DoD sensor and
information fusion conferences. If successful, the ontology-based approach
will leverage COTS database technologies and DARPA, ONR, AFRL, and other
fusion, inference, and cognition technologies. It will enable massive
fusion inference networks for weak evidence accumulation and long indirect
inferencing to improve situation awareness. Our approach is simple and
elegant, yet rigorous and comprehensive.
|
Information Exchange
Requirements (IER) Driven Fusion
|
The technical concept has four principal elements:
1. Information Exchange Requirements Processor (IER-P) that decomposes IERs into inter-related objects and events and then links them to types of sensor and source support evidence. This is necessary since IERs are usually not directly observable but are, rather, satisfied by fusion of multiple sensors and sources. The IER ontology dictates the workflow.
2. BrainLike Process (BLP) that tailors FMV and
imagery feature extraction to provide the required evidence
3. Sources Query Process that prepares Hadoop map jobs to retrieve object- and event-of-interest data from DCGS-N sources
4. Fusion Process that performs 1) Hadoop reduction using the returned DCGS-N key-value pairs, and 2) real-time likelihood updates as sensor features arrive (sketched below).
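A toy sketch of element 4, with invented keys and log-evidence weights (the real system runs on Hadoop rather than in-process dictionaries):

```python
# Hedged sketch: a toy map/reduce pass over returned key-value pairs,
# followed by incremental likelihood updates as new features arrive.
from collections import defaultdict
import math

# "Map" output as returned key-value pairs: (object_of_interest, log-evidence).
returned_kv = [("vessel_A", 0.8), ("vessel_A", 0.3), ("vessel_B", -0.2)]

# "Reduce": sum log-evidence per object of interest.
log_like = defaultdict(float)
for key, log_ev in returned_kv:
    log_like[key] += log_ev

def on_sensor_feature(key, log_ev):
    """Real-time update as a newly extracted feature arrives."""
    log_like[key] += log_ev
    print(f"{key}: likelihood ratio = {math.exp(log_like[key]):.2f}")

on_sensor_feature("vessel_A", 0.5)   # running LR for vessel_A rises
```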
|
Pedigree in the Warfighter Information Processing Cycle (WIPC)
|
All
information in the WIPC has a source -- even a lineage of sources. Within the WIPC, information lineage is referred to as Pedigree, and information about the source is called Source Metadata (together, P&SM). Pedigree is a chain of observations or object beliefs, along with a description of how those observations or object beliefs were arrived at, while Source Metadata is a characterization of the source, whether it be a sensor, an individual operator, or a system of machines and operators. P&SM lineage describes how a piece of
information came about; P&SM descendancy
describes how a piece of information was used.
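A minimal sketch of such a pedigree chain, with invented item names, methods, and metadata:

```python
# Hedged sketch: each information item records the items it was derived
# from (lineage) and how; descendancy would be the reverse traversal.
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    method: str                    # how this belief/observation was arrived at
    derived_from: list = field(default_factory=list)
    source_metadata: dict = field(default_factory=dict)

obs = Item("radar_obs_1", "measurement",
           source_metadata={"type": "sensor", "id": "SPY-1"})
track = Item("track_42", "kinematic fusion", derived_from=[obs])
assessment = Item("threat_assessment_9", "level-2 inference", derived_from=[track])

def lineage(item, depth=0):
    """Walk the pedigree chain: how this information came about."""
    print("  " * depth + f"{item.name} ({item.method})")
    for parent in item.derived_from:
        lineage(parent, depth + 1)

lineage(assessment)
```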
|
Next Generation Fusion
Architecture
|
This project describes the creation of a Next Generation Fusion Architecture: an open information architecture for Command and Control (C2) and Weapons Control systems that require advanced sensor and data fusion. This Next Generation Fusion
Architecture provides a
foundation for advanced fusion algorithms including non-kinematic level 1
fusion, level 2 and 3
complex assessments, more broadly scoped Situation Awareness and Battle
Management
information analysis, and level 4 process adaptation. The architecture supports increased automation and higher-quality data fusion through enforced integration and integrity of data, thus allowing advanced mechanisms, such as ontology-based inference, as well as the ability to execute multiple kinds of fusion algorithms that interoperate autonomously, yet synergistically.
|
Enterprise Architecture, System of Systems Engineering, and Model-Based Systems Engineering
|
Title
|
Abstract
|
Analyzing
and Presenting Multi-Nation Process Interoperability Data for End-Users
|
Silver
Bullet was tasked by the DoD CIO to brief the International Enterprise
Architecture Conference in London in 2008.
The International Defence Enterprise Architecture Specification (IDEAS) was
developed to deliver a unified specification for the exchange of military
architectures between coalition partners.
The nations/organizations are Australia, Canada, the UK, and the USA, with Sweden and NATO as observers. The BORO methodology (http://www.boroprogram.org/) is being used to re-engineer and deconflict legacy systems. It 1) provides a precise, mathematical approach to comparing information, 2) is very easy to understand, so stakeholders readily commit to using the methodology, and 3) is guaranteed to produce a correct representation and is fully transparent; stakeholders are involved at every stage, so buy-in is kept all the way through. Its layers are 1) a foundation based on set theory, 2) common patterns based on the foundation, and 3) domain patterns that specialize the common patterns. This fits well with the main aspects of interoperability: 1) communication, 2) system, and 3) procedural/doctrinal. An experiment was conducted between the nations to address a current issue in warfare, Casualty Management. The use cases are 1) the Scud missile attack in Desert Storm and 2) Operation Desert Storm overall. The conclusions are that exchanging architecture data during the coalition operations planning process can automate interoperability comparisons to reduce resource requirements, speed the process, potentially detect issues that may have been missed, and de-bias national interpretations of other doctrines. But this depends on a precise data exchange standard, and IDEAS's grounding in a formal ontology provides such precision (a toy comparison is sketched below).
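A toy illustration of the kind of set-theoretic comparison the foundation enables (the activities below are invented, not from the experiment):

```python
# Hedged sketch in the spirit of the IDEAS/BORO set-theoretic foundation:
# two nations' process definitions compared extensionally, as sets of
# activities. All names are invented for illustration.
nation_a_casualty_mgmt = {"locate", "triage", "stabilize", "evacuate", "treat"}
nation_b_casualty_mgmt = {"locate", "triage", "evacuate", "treat", "report"}

shared = nation_a_casualty_mgmt & nation_b_casualty_mgmt
only_a = nation_a_casualty_mgmt - nation_b_casualty_mgmt
only_b = nation_b_casualty_mgmt - nation_a_casualty_mgmt

print("interoperable activities:", sorted(shared))
print("gaps (A only):", sorted(only_a))   # potential coalition seam
print("gaps (B only):", sorted(only_b))
```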
|
Implementation
of the International Defence Enterprise Architecture Specification (IDEAS)
Foundation in DoD Architecture Framework 2.0
|
Silver
Bullet was tasked by the DoD CIO to brief the International Enterprise
Architecture Conference in London in 2010.
Why DoD used IDEAS -- benefits:
1. Re-use of common patterns saved a lot of work
2. Reconciliation and analysis tool
3. Information pedigree model
4. Design reification and requirements traceability
5. Services description
6. Semantic precision
7. Mathematical precision
How we implemented IDEAS
Implementation challenges
|
DoDAF
2.0 Meta Model (DM2) Briefing for the JAWG
|
Silver
Bullet was tasked by the DoD CIO to brief the Joint Staff Architecture
Working Group on:
DoDAF Meta Model (DM2) pieces
Formal ontologic foundation: International Defence
Enterprise Architecture Specification (IDEAS) overview
Why we used IDEAS -- benefits:
Simplification
Quality
Expressiveness
The Physical Exchange Specification (PES)
Active Configuration Management
GFI Resources
|
DM2 Ontologic Foundation and Pedigree Model
|
Silver
Bullet was tasked by the DoD CIO to brief the NSA Commercial Solutions
Center (NCSC) on:
The DM2 foundation
The DoDAF Physical Exchange Specification (PES)
Exchange of DM2 PES XML documents
PES XSD XML document examples
UPDM Search and Rescue
ISP samples
|
DoD
Architectures and Systems Engineering Integration
|
Silver
Bullet was tasked by the DoD CIO to brief the NDIA 15th Annual Systems
Engineering Conference on:
1. DoDAF evolution plan
2. Fit-for-purpose (FFP) and legacy views
3. DoDAF reification, requirements, and SE V model
4. DoDAF meta-model for:
DOTMLPF
temporality, behavior, scenarios, M&S, executable
architectures
5. DoDAF artifacts × SE documents and artifacts
|
Leveraging
DoDAF 2.0 in the DoD Enterprise
|
Silver
Bullet was tasked by the DoD CIO to brief the International Enterprise
Architecture Conference in London on:
DoD CIO's Role with DoDAF V2.0
DoDAF V2.0's Role in DoD's Six Core Processes
Types of Architectures in DoD
Reference Architectures and DoDAF V2.0
Two examples
Enterprise-wide Access to Network and Collaboration
Services (EANCS) Reference Architecture
Active Directory Optimization Reference Architecture
(ADORA)
Vision of the role of the DoDAF Meta Model (DM2) in
empowering architecture roles in core processes.
|
Overview
and Role of Enterprise Architecture in DoD Governance
|
Silver
Bullet was tasked by the DoD CIO to separately brief DISA and the joint
VA-DHA workshop on:
Requirements for EA (the six core processes)
Data Centric paradigm (why EA data is essential to success)
Metamodel
Method
Presentation
Fit-for-purpose
CM
|
Briefing to the Software Engineering Institute (SEI) Army Strategic Software Improvement Program (ASSIP) Action Group (AAG)
|
Silver Bullet was tasked by the DoD CIO to brief the Software Engineering Institute (SEI) Army Strategic Software Improvement Program (ASSIP) Action Group (AAG) on:
DM2 Purposes
DM2 Modeling Conventions
Foundation Ontology
Partial walkthrough of a sample of DM2 LDM data groups
Thoughts as to how this could aid software-intensive PEOs
|
Lessons
Learned from Implementing Enterprise Integrated Architecture Data
Repository
|
Briefing
to Command Information Superiority Architectures (CISA) Worldwide 30
October 2002 for the Department of the Navy. Discussed issues implementing the
Department of Navy Integrated Architecture Database (DIAD) with screenshots of DIAD's many tools.
|
DoD
Information Enterprise Architecture (DIEA) and the DoD Business Capability
Acquisition Cycle (BCAC)
|
Briefing
to Department of Navy (DON) Information Technology Conference 2018 on:
DIEA v3.0 Status and Plans Overview
DoDI 8270, DoD Architectures (DRAFT)
Integration of DIEA into BEA supporting BCAC
|
DoD
Information Enterprise Architecture (IEA) Version 3.0
|
Briefing
to DON IT Conference 2017 on revamp of the DoD Information Enterprise
Architecture.
|
DoDAF
In-Depth
|
DoD
CIO Architecture and Interoperability Directorate standard DoDAF brief
developed and presented by Silver Bullet.
Provides:
DoDAF Basic Concepts
Walkthrough of DoDAF Meta-Model (DM2)
Walkthrough of DoDAF Model (View) Types
DoDAF and Systems Engineering:
Refinement Levels and Traceability
|
Enterprise
Taxonomies
|
Briefing
for AFD Working Group 11 July 2002 sponsored by the DON CIO.
|
Lines
of Sight and Provable Traceability
|
Presentation
by Mr. David McDaniel and Gregory Schaefer, Silver Bullet Solutions, Inc.
at the 2014 Integrated EA Conference.
Discussed:
Why traceability is important
Issues with traceability
Ontology and predicate calculus of traceability
Application to architectural patterns (a toy traceability-closure sketch follows)
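A toy sketch of traceability as a provable predicate, with invented trace links (not the presentation's formalism):

```python
# Hedged sketch of "provable traceability" as predicate calculus:
# traces(x, z) holds iff there is a chain of direct trace links from
# x to z (reflexive-transitive closure). The links are invented.
LINKS = {("capability_1", "operational_activity_3"),
         ("operational_activity_3", "system_function_7"),
         ("system_function_7", "requirement_12")}

def traces(x, z, links=frozenset(LINKS)):
    """True iff a line of sight exists from x to z through trace links."""
    if x == z:
        return True
    return any(traces(b, z, links) for (a, b) in links if a == x)

# A "line of sight" from capability down to requirement is provable:
print(traces("capability_1", "requirement_12"))    # True
print(traces("capability_1", "system_function_9")) # False: broken trace
```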
|
Interpretation
of UPDM 1.0 Search and Rescue (SaR) Example
Diagrams in DoDAF 2.0 / DM2 Concepts
|
Extracts
for DoD EA Conference May 2010
|
Repository,
Process, and Tools Support for Set Based Design (SBD)
|
How
NEAR, ExARM, and ExAMS
can complement and support SBD
|
The
Role of Information Elements in Net-Centric Data Management
|
Presentation
to the Sixteenth Systems and Software
Technology Conference, April 2004.
Provided:
1. Definition of Information Elements and Roles in
Architecture
System Engineering
Information Requirements Description
Systems Analysis
Capabilities Definition
2. Net-Centric Data Strategy Goals and Elements and IE Roles in the Elements
COI Determination and Interaction
Understanding and Discovery
Ontologies
Taxonomies
Harmonization and Mediation
Metadata Attributes
|
|