- Participated in carrying out sensitivity and portability studies of the Short Term Integrated Forecasting System (STIFS) for the Department of Energy.
- Carried out programming enhancement of the Dynamic General Equilibrium Model (DGEM) for the Federal Emergency Management Agency.
- Developed an interactive program for verifying the integrity of pressure transfer standards for the Pressure Division at the National Institute of Standards and Technology.
- Participated in the implementation of Linear Programming and Integer Programming packages.
- Developed, analyzed, and implemented optimal expected-time algorithms for computing Voronoi diagrams in two and three dimensions.
- Developed, analyzed, and implemented optimal expected-time algorithms for computing Delaunay triangulations constrained by line segments.
- Developed, analyzed, and implemented an algorithm for computing Delaunay triangulations for comet-shaped polygons.
- Developed, analyzed, and implemented an algorithm for computing the growth surface for the slopes at the boundary of a polygon.
- Identified 3-dimensional line segment insertion problems that can be approached algorithmically as 2-dimensional problems.
- Implemented an algorithm for computing two-dimensional Delaunay triangulations and their associated Voronoi diagrams using incremental topological flipping. For a Delaunay triangulation and Voronoi diagram obtained in this manner, implemented an additional program for computing the areas of the Voronoi cells, the lengths of the faces of these cells, and the inter-neighbor distance for each face.
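The per-cell measurements mentioned above (cell areas and face lengths) can be sketched as follows. This is an illustrative NumPy fragment acting on the vertices of a single convex cell supplied by hand, not the triangulation code itself:

```python
import numpy as np

def cell_measurements(vertices):
    """Area (shoelace formula) and edge lengths of a convex polygonal
    Voronoi cell, given its vertices in counterclockwise order."""
    v = np.asarray(vertices, float)
    x, y = v[:, 0], v[:, 1]
    # Shoelace formula; abs() makes the vertex orientation irrelevant.
    area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    # Length of each edge from vertex i to vertex i+1 (cyclically).
    edges = np.linalg.norm(np.roll(v, -1, axis=0) - v, axis=1)
    return area, edges

# Unit-square cell, e.g. the Voronoi cell of the center of a 3x3 grid:
area, edges = cell_measurements([[0.5, 0.5], [1.5, 0.5], [1.5, 1.5], [0.5, 1.5]])
```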
- Implemented an algorithm for computing Regular/Delaunay triangulations in three dimensions based on incremental topological flipping and lexicographical manipulations. For a Regular/Delaunay tetrahedralization obtained in this manner, implemented additional programs for computing the Power/Voronoi diagram associated with the tetrahedralization, the volumes of the Power/Voronoi cells, the areas of the facets of these cells, and the inter-neighbor distance for each facet.
- Developed and implemented algorithms for inserting line segments and triangles into a 3-dimensional triangulation based on topological flipping and Steiner points.
- Developed and implemented a method for computing dredging volume estimates based on Delaunay triangulations for the U.S. Army Engineer Topographic Laboratories.
- Participated in the evaluation of the Intergraph InRoads software for hydrographic volume determination.
- Assisted with the implementation of a contour-to-grid interpolation algorithm based on Delaunay triangulations for the U.S. Army Engineer Topographic Laboratories.
- Implemented a program based on constrained Delaunay triangulations for the probabilistic computation of Poiseuille flow velocity fields through tubes of general cross section for the Polymers Division at NIST.
- Assisted with the development of triangulation-based surface modelling software for the U.S. Army Engineer Topographic Laboratories.
- Developed and implemented a method for computing volume estimates of 3-dimensional bodies based on Delaunay tetrahedralizations and the insertion of line segments and triangles into tetrahedralizations for BFRL at NIST.
- Developed and implemented a method for approximately locating peaks in mass spectral data and computing the areas beneath them for the Polymers Division at NIST.
- Developed and implemented a linear method for statistically locating and eliminating peaks in mass spectral data caused by noise for the Polymers Division at NIST.
- Assisted with the development of software based on tetrahedralizations for simulating tomography as performed with scanning transmission electron microscopy (STEM) for the Optical Technology Division at NIST.
- Developed and implemented a scheme for representing as integers decimal input numbers that have been stored in a computer as double-precision floating-point numbers, and for carrying out additions, subtractions, and multiplications with them in an exact manner.
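The idea can be illustrated with a minimal Python sketch; the scale factor of 10**digits and the helper name `to_scaled_int` are assumptions for illustration, not the original implementation:

```python
def to_scaled_int(x, digits=6):
    """Map a double that originated as a decimal number with at most
    `digits` fractional digits to an exact integer by scaling with
    10**digits; round() absorbs the binary representation error
    introduced when the decimal was stored as a double."""
    return round(x * 10**digits)

# Python integers are arbitrary precision, so arithmetic on the scaled
# values is exact (note that the scale doubles under multiplication):
a, b = to_scaled_int(0.1), to_scaled_int(0.2)
total = a + b        # 0.3 at scale 10**6, exactly
product = a * b      # 0.02 at scale 10**12, exactly
```

The floating-point sum 0.1 + 0.2 famously differs from 0.3 as doubles, while the scaled-integer sum is exact at the chosen scale.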
- Implemented a program for computing an estimate of the surface of a 3-dimensional object as a power crust from a point cloud sampled from the surface, given a known point that lies in the interior of the object at a reasonable distance from the surface.
- Developed and implemented an algorithm for computing dynamic Voronoi diagrams in 2-d space, that is, Voronoi diagrams of moving points in the plane.
- For the purpose of visualizing defect structures in block copolymer thin films, implemented a program using MATLAB Handle Graphics for producing data files and a script file for plotting, in the form of a movie, the development of Voronoi diagrams of the moving centroids of individual microdomains for the Polymers Division at NIST.
- For the Computational Biology and Medical Change Analysis Validation projects, implemented the image segmentation algorithms Otsu, Maximum Entropy, K-means, Watershed, Pseudo-watershed, and Canny.
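Of these, Otsu's method is the simplest to sketch. The following NumPy fragment is an illustrative reimplementation of the standard algorithm, not the project code:

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Otsu's method: choose the gray level that maximizes the
    between-class variance of the background/foreground split."""
    hist, edges = np.histogram(image, bins=nbins)
    p = hist / hist.sum()                 # bin probabilities
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                     # weight of the background class
    w1 = 1.0 - w0                         # weight of the foreground class
    mu0 = np.cumsum(p * centers)          # unnormalized background mean
    mu_t = mu0[-1]                        # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu0) ** 2 / (w0 * w1)
    # Empty classes yield 0/0; treat those candidates as variance 0.
    return centers[np.argmax(np.nan_to_num(between))]
```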
- Developed and implemented a method based on the pseudo-watershed concept for linking Canny edge points, that is, for filling gaps in the edges identified by the Canny algorithm.
- Implemented programs for comparing a segmentation of an image of cells to the ground-truth segmentation of the image, based on misclassification errors, false-positive and false-negative errors, and the roundedness and roughness of cells.
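The false-positive/false-negative portion of such a comparison can be sketched as follows; this is an illustrative NumPy fragment, and the function name and rate conventions are assumptions rather than the original programs:

```python
import numpy as np

def segmentation_errors(pred, truth):
    """Pixelwise comparison of a binary segmentation to ground truth:
    false-positive, false-negative, and total misclassification rates,
    each expressed as a fraction of the number of pixels."""
    pred = np.asarray(pred, bool)
    truth = np.asarray(truth, bool)
    n = pred.size
    fp = np.logical_and(pred, ~truth).sum() / n   # marked cell, truly background
    fn = np.logical_and(~pred, truth).sum() / n   # marked background, truly cell
    return fp, fn, fp + fn
```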
- In collaboration with colleagues on the Computational Biology project, carried out a comparison of segmentation algorithms (Otsu, K-means, Canny, etc.) through the analysis of various segmentations of images of cells obtained with the algorithms.
- For any positive integer d, implemented the K-means algorithm for computing K d-dimensional means and partitioning a set of points in d-dimensional space into K disjoint sets so that each point in a set is closer to the mean of its own set than to any of the other means.
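The partitioning just described can be sketched with Lloyd's iteration in any dimension d; the deterministic initialization below is a simplification for illustration, not the original implementation:

```python
import numpy as np

def kmeans(points, k, iters=100):
    """Lloyd's algorithm: alternately assign each point to its nearest
    mean, then recompute each mean as the centroid of its points."""
    points = np.asarray(points, float)
    means = points[:k].copy()   # simplistic init; real code would randomize
    for _ in range(iters):
        # Squared distances, shape (n_points, k), via broadcasting.
        d2 = ((points[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(1)
        new = np.array([points[labels == j].mean(0) if (labels == j).any()
                        else means[j] for j in range(k)])
        if np.allclose(new, means):
            break               # assignments stable: converged
        means = new
    return means, labels
```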
- For any positive integer d, implemented the Expectation-Maximization (EM) algorithm, a data clustering algorithm, for computing an optimal d-dimensional Gaussian Mixture Model (GMM) of K Gaussians corresponding to a given finite set of d-dimensional data points (training set), under the assumption that each Gaussian has a covariance matrix that is a scalar multiple of the identity matrix. Once the optimal GMM is computed, the Gaussians and convexity coefficients are used to define K classes of points, and each data point is assigned to the class to which it is most likely to belong.
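A compact sketch of EM under the spherical-covariance assumption; the initialization and the small numerical guards are illustrative choices, not the original implementation:

```python
import numpy as np

def em_spherical_gmm(X, k, iters=200):
    """EM for a Gaussian mixture whose K components each have a
    spherical covariance sigma_j**2 * I.  Returns the mixture weights
    (convexity coefficients), means, variances, and the most likely
    component of each data point."""
    X = np.asarray(X, float)
    n, d = X.shape
    mu = X[:k].copy()             # simplistic init; real code would randomize
    var = np.full(k, X.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        log_r = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        log_r -= log_r.max(1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(1, keepdims=True)
        # M-step: re-estimate weights, means, and spherical variances.
        nk = r.sum(0) + 1e-12
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = (r * d2).sum(0) / (d * nk) + 1e-9
    return pi, mu, var, r.argmax(1)
```

The final `argmax` performs the class assignment described above: each point goes to the component under which it is most likely.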