Information Characterization & Exploitation

Intelligent Systems: From Theory to Practice


Micro-Copter UAV

SEP18 - Unmanned aerial vehicles (UAVs) are now a proven commodity on the modern battlefield, providing real-time imagery feeds, targeting, prosecution, and round-the-clock monitoring of high-value targets. However, command, control, and tasking decisions for UAV assets are still executed at locations remote from the frontlines. The push for the next generation of UAV systems is to equip frontline soldiers with scaled-down UAVs, giving them an invaluable resource for timely and well-informed execution of the observe, orient, decide, and act (OODA) loop.

A micro-copter is a vertical takeoff and landing (VTOL) platform that provides many advantages over conventional aircraft platforms, including ease of launch and recovery, hover capability, and a more compact form factor. The ICE Lab has developed hardware and software solutions for several micro-copter applications, including direct georeferencing from smartphones, image analysis, sensor system integration, and custom gimbal design.

Posted By Dr. Adrian Peter

Large-scale Clustering for Big Data Analytics: A MapReduce Implementation of Hierarchical Affinity Propagation

JUL22 - This project enables users to perform hierarchical clustering over extremely large datasets. The research team developed a distributed software system that reads data from multiple input sources through a common interface, clusters the data according to a user-defined similarity metric, and presents the extracted clusters in an interactive, web-based visualization. To handle Big Data-scale inputs, the team derived and implemented a distributed version of the Hierarchical Affinity Propagation (HAP) clustering algorithm using the MapReduce framework. This parallelization allows the algorithm to run in best-case linear time on datasets of any cardinality, and permits execution within a scalable cloud-computing framework such as Amazon's Elastic Compute Cloud (EC2).
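The flat (non-hierarchical) affinity propagation message updates give a sense of what was parallelized: each responsibility update reads only a row of the message matrices and each availability update only a column, which is what makes a MapReduce split natural. The following is a single-machine NumPy sketch of standard affinity propagation for illustration, not the team's distributed HAP implementation.

```python
import numpy as np

def affinity_propagation(S, damping=0.7, iters=200):
    """Single-machine affinity propagation on a similarity matrix S.
    The row-wise responsibility and column-wise availability updates
    are the independent pieces a MapReduce version distributes."""
    n = S.shape[0]
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # responsibilities: r(i,k) = s(i,k) - max_{k'!=k} [a(i,k') + s(i,k')]
        AS = A + S
        idx = np.argmax(AS, axis=1)
        first = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf          # mask the max to find runner-up
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = damping * R + (1 - damping) * Rnew
        # availabilities: a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        Anew = Rp.sum(axis=0)[None, :] - Rp
        dA = Anew.diagonal().copy()
        Anew = np.minimum(Anew, 0)
        np.fill_diagonal(Anew, dA)
        A = damping * A + (1 - damping) * Anew
    exemplars = np.where(np.diag(A + R) > 0)[0]
    labels = exemplars[np.argmax(S[:, exemplars], axis=1)]
    labels[exemplars] = exemplars                # exemplars label themselves
    return exemplars, labels
```

Each iteration touches rows (map over data points) and column sums (reduce per candidate exemplar), which is the decomposition a MapReduce job exploits.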

Posted By Dr. Adrian Peter 

Density Estimation for Streaming Data Analytics

JUN12 - Stream computing is rapidly gaining momentum in markets ranging from healthcare to commercial business to defense, an accelerated adoption driven by the promise of delivering actionable decision support in a timely manner. By processing high-volume data on the wire, we can greatly reduce reliance on the traditional paradigms of data warehousing and batch mining of information sources. To deliver on the promise of on-the-wire actionable intelligence, the backend data-ingest and routing infrastructure must be supported by advanced analytic algorithms that extract value-added information from the stream and enable application-specific analysis and discovery. Given this demand, there is an immediate need to accelerate the development of analytics for data on the move. We propose to address this deficiency directly by investigating and delivering solutions that significantly advance the state of the art in the statistical methodologies at the heart of advanced analytics. In particular, this project seeks to develop novel density estimation techniques that enable robust data characterization in an incremental and computationally efficient manner suitable for the streaming paradigm.
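To make "incremental density estimation" concrete, here is a minimal streaming kernel density estimator on a fixed grid with exponential forgetting: each sample adds a Gaussian bump in O(grid size) time, and old mass decays so the estimate tracks a drifting stream. This is a generic sketch, not the estimator developed in this project; the grid, bandwidth, and decay rate are illustrative choices.

```python
import numpy as np

class StreamingKDE:
    """Fixed-grid kernel density estimate with exponential forgetting:
    each new sample adds a Gaussian bump; older mass decays by `decay`,
    so the estimate adapts to distribution drift in the stream."""

    def __init__(self, grid, bandwidth=0.3, decay=0.999):
        self.grid = grid
        self.dx = grid[1] - grid[0]
        self.h = bandwidth
        self.decay = decay
        self.mass = np.zeros_like(grid)

    def update(self, x):
        # one O(len(grid)) step per arriving sample -- no data is stored
        kern = np.exp(-0.5 * ((self.grid - x) / self.h) ** 2)
        self.mass = self.decay * self.mass + kern

    def density(self):
        p = self.mass.copy()
        p /= p.sum() * self.dx      # renormalize to integrate to 1 on the grid
        return p
```

The per-sample cost is constant in the number of samples seen, which is the property that matters for on-the-wire processing.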

Posted By Michal Frystacky

Sliding Wavelets for Indexing and Retrieval

APR06 - Shape representation and retrieval of stored shape models are becoming increasingly prominent in fields such as medical imaging, molecular biology, and remote sensing. We present a novel framework that directly addresses the need for a rich and compressible shape representation while simultaneously providing an accurate method to index stored shapes. The core idea is to represent point-set shapes as the square root of a probability density expanded in a wavelet basis. We then use this representation to develop a natural similarity metric that respects the geometry of these distributions: under the wavelet expansion, distributions are points on a unit hypersphere, and the distance between two distributions is the arc length separating them. The process uses a linear assignment solver for non-rigid alignment between densities prior to matching; this amounts to "sliding" wavelet coefficients, akin to the sliding-block puzzle L'Âne Rouge. (We acknowledge support from the National Science Foundation, NSF IIS-0307712.)
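The hypersphere geometry reduces to a very short computation: because densities integrate to one, the wavelet coefficient vector of √p has unit norm, so two shapes are points on the unit hypersphere and their distance is the arc length arccos⟨c₁, c₂⟩. A minimal sketch (the coefficient vectors here are assumed to come from some wavelet expansion upstream):

```python
import numpy as np

def sphere_distance(c1, c2):
    """Geodesic arc length between two coefficient vectors of sqrt-densities.
    Normalization guards against small numerical drift off the unit sphere."""
    c1 = c1 / np.linalg.norm(c1)
    c2 = c2 / np.linalg.norm(c2)
    return np.arccos(np.clip(np.dot(c1, c2), -1.0, 1.0))
```

The clip keeps floating-point round-off from pushing the inner product outside arccos's domain; identical shapes give distance 0 and orthogonal expansions give π/2.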

Posted By Michal Frystacky

Wavelet Density Estimation

APR06 - Wavelet-based density estimators have gained popularity due to their ability to approximate a large class of functions, adapting well to difficult situations such as densities with abrupt changes. Working with wavelet density estimators brings theoretical considerations (e.g., non-negativity, integrability) and empirical issues (e.g., computation of basis coefficients) that must be addressed to obtain a bona fide density. We present a new method to accurately estimate a non-negative density that directly addresses many of the problems in practical wavelet density estimation. We cast the estimation procedure in a maximum-likelihood framework that estimates the square root of the density, √p, allowing us to obtain the natural non-negative density representation (√p)². Analysis of this method reveals a remarkable theoretical connection with the Fisher information of the density and consequently leads to an efficient constrained optimization procedure for estimating the wavelet coefficients. (We acknowledge support from the National Science Foundation, NSF IIS-0307712.)
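The constrained maximum-likelihood idea can be sketched in a few lines. The example below substitutes an orthonormal cosine basis on [0, 1] for a wavelet basis (purely for brevity) and uses plain projected gradient ascent rather than the Fisher-information-based procedure described above; the unit-norm constraint on the coefficients is what guarantees (√p)² integrates to one.

```python
import numpy as np

def cosine_basis(x, n_basis):
    # psi_0 = 1, psi_k = sqrt(2) cos(k pi x): orthonormal on [0, 1]
    cols = [np.ones_like(x)] + [np.sqrt(2) * np.cos(k * np.pi * x)
                                for k in range(1, n_basis)]
    return np.stack(cols, axis=-1)

def fit_sqrt_density(samples, n_basis=6, steps=300, lr=0.05):
    """Projected-gradient MLE of the sqrt-density expansion coefficients.
    Maximizes mean log f(x)^2 where f = sum_k c_k psi_k, then projects c
    back onto the unit sphere so the squared expansion is a bona fide density."""
    B = cosine_basis(samples, n_basis)       # (n_samples, n_basis)
    c = np.zeros(n_basis)
    c[0] = 1.0                               # start at the uniform density
    for _ in range(steps):
        f = B @ c                            # sqrt-density at the samples
        # keep the gradient finite if f passes near zero at some sample
        f = np.where(f >= 0, np.maximum(f, 1e-3), np.minimum(f, -1e-3))
        grad = 2.0 * (B / f[:, None]).mean(axis=0)   # d/dc of mean log f^2
        c = c + lr * grad
        c /= np.linalg.norm(c)               # unit-norm constraint
    return c
```

Renormalizing after each Euclidean step is a simple stand-in for the more principled constrained optimization the post describes; it keeps the iterate on the sphere of valid square-root densities.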

Posted By Michal Frystacky

Shape Analysis with Parametric Mixtures

MAR23 - Shape matching plays a prominent role in the analysis of medical and biological structures. We present a unifying framework for shape matching that uses mixture models to couple both the shape representation and its deformation. The theoretical foundation is drawn from information geometry, where information matrices establish intrinsic distances between parametric densities. When a parameterized probability density function represents a landmark-based shape, the modes of deformation are automatically established through the information matrix of the density. We first show that, given two shapes parameterized by Gaussian mixture models, the well-known Fisher information matrix of the mixture model is a natural, intrinsic tool for computing shape geodesics. We have also developed a new Riemannian metric based on generalized φ-entropy measures. In sharp contrast to the Fisher-Rao metric, our new metric is available in closed form, making geodesic computations considerably more efficient.
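The closed-form contrast is easiest to see in the simplest case. For a single univariate Gaussian family (not a mixture, where no closed form exists and numerical geodesic integration is required), the Fisher-Rao geodesic distance is known exactly via the hyperbolic geometry of the (μ, σ) half-plane:

```python
import math

def fisher_rao_gaussian(mu1, sig1, mu2, sig2):
    """Closed-form Fisher-Rao distance between N(mu1, sig1^2) and
    N(mu2, sig2^2): the Fisher metric ds^2 = dmu^2/s^2 + 2 ds^2/s^2
    is a scaled Poincare half-plane metric, giving an arccosh formula."""
    num = (mu1 - mu2) ** 2 + 2.0 * (sig1 - sig2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (4.0 * sig1 * sig2))
```

A sanity check on the formula: for a pure scale change σ → kσ at fixed mean, it reduces to √2·|ln k|. For mixtures this closed form is lost, which is the gap the φ-entropy metric above is designed to fill.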

Posted By Michal Frystacky