This program co-registers two images using a constant sample/line translation. This implies, of course, that the
internal geometry of both images must be nearly the same so that the translation can be computed. That is, this
program will not work if the translation varies significantly throughout the image. However, there are still useful
capabilities in this program if this condition is not met. If the condition of near-constant translation is met, then
the translation can be computed to sub-pixel accuracy. The basic principle behind this program is to compute
an average translation by computing local translations spaced evenly throughout the image. The number and
spacing of local translations are user defined. This allows for many output options, including 1) directly creating the
translated image, 2) creating a control network which can be used in other programs (e.g., qnet, warp), especially
if the translation is not constant across the image, or 3) creating a flat-field file usable in spreadsheets or plotting
packages to visualize magnitude and direction of varying translations.
NOTE: This program can utilize many different techniques for computing the translation. It is recommended that
you read "Automatic Registration in Isis 3.0". It is essential for understanding how to create a "registration
definition" file and how to size your search and pattern chips. We will continue the discussion of the functionality
of this program assuming the reader has a fundamental knowledge of Automatic Registration. For the brave, we
give an example of a definition file (REGDEF):
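As an illustrative sketch only (the group and keyword names follow the usual ISIS AutoReg PVL conventions, and the values mirror the defaults mentioned under the TEMPLATE parameter; consult the template files shipped with Isis for the authoritative form), such a definition file might look like:

```
Object = AutoRegistration
  Group = Algorithm
    Name      = MaximumCorrelation
    Tolerance = 0.7
  EndGroup

  Group = PatternChip
    Samples = 20
    Lines   = 20
  EndGroup

  Group = SearchChip
    Samples = 50
    Lines   = 50
  EndGroup
EndObject
```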
You will provide two input cubes: one will be translated (FROM) and the other is held fixed (MATCH).
The images must have the same number of samples and lines and can have only one band (use cube
attributes if necessary). A sparse grid will be defined across the held image using the parameters ROWS
and COLUMNS. If the user does not provide values for the sparse grid size it will be automatically computed
as follows: COLUMNS = (image samples - 1) / search chip samples + 1. Similarly for ROWS. Conceptually,
the sparse grid defined by ROWS and COLUMNS will be laid on top of both images with even spacing between
the rows (or columns), but no row or column will touch the edge of the image. That is, the grid is internal to the
image.
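The default grid sizing and the interior placement described above can be sketched as follows. The function names and the even-spacing formula are assumptions for illustration; only the COLUMNS/ROWS formula comes from the text.

```python
def default_grid(image_samples, image_lines, search_samples, search_lines):
    # COLUMNS = (image samples - 1) / search chip samples + 1 (integer division);
    # ROWS is computed the same way from the line dimensions.
    columns = (image_samples - 1) // search_samples + 1
    rows = (image_lines - 1) // search_lines + 1
    return rows, columns

def grid_points(image_samples, image_lines, rows, columns):
    # Spread the grid evenly and keep it interior to the image:
    # no grid point falls on an image edge.
    line_step = image_lines / (rows + 1)
    samp_step = image_samples / (columns + 1)
    return [((r + 1) * line_step, (c + 1) * samp_step)
            for r in range(rows) for c in range(columns)]
```

For example, a 1024x1024 image with a 50x50 search chip defaults to a 21x21 grid.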
At each grid intersection we will compute a translation. This is done by centering the search chip at the grid
intersection for the image to be translated (FROM) and centering the pattern chip at the grid intersection for the
held image (MATCH). The pattern chip is walked through the search chip to find the best registration (if any).
Again, the details of how this is done are described in the document "Automatic Registration in Isis 3.0". The local
translation is recorded at all grid intersections that had a successful registration, and it is written to the control
network and/or flat file if requested. The average of the local translations is then used to compute an overall
sub-pixel translation which can be applied to the output image (TO).
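The averaging step is simple enough to sketch directly. This is an illustrative helper (the name and tuple layout are assumptions), showing how successful local registrations reduce to one overall sub-pixel offset:

```python
def average_translation(local_offsets):
    """local_offsets: list of (sample_offset, line_offset) pairs from the
    grid intersections that registered successfully. Returns the mean
    (sample, line) translation applied to the whole image."""
    if not local_offsets:
        raise ValueError("no successful registrations; cannot compute a translation")
    n = len(local_offsets)
    mean_samp = sum(s for s, _ in local_offsets) / n
    mean_line = sum(l for _, l in local_offsets) / n
    return mean_samp, mean_line
```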
Here are some tips to improve the odds of a successful registration. In general, don't make the pattern chip too
small; 20x20 is probably a good starting point. Also, the larger the translation, the larger the search chip will need
to be. So if your translation is only a couple of pixels, keep the search chip slightly bigger than the pattern
(e.g., 25x25 vs 20x20). However, if the translation is large, you will need to expand the search area. For example,
if the translation is roughly 45 pixels and your pattern is 20x20, the search area should be 20+2*45, or 110x110.
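The sizing rule in that example generalizes to a one-line formula; the helper name here is hypothetical:

```python
def min_search_size(pattern_size, expected_translation):
    # The search chip must cover the pattern chip plus the expected
    # translation in both directions along each axis.
    return pattern_size + 2 * expected_translation
```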
This application replaces the following applications existing in previous versions of Isis, which have been deprecated from the current version of Isis:
Added warp option and fixed bug in control net creation.
Elizabeth Miller (2006-03-23): Fixed appTest.
Jacob Danton (2006-01-06): Fixed appTest to comply with changes made to the ControlMeasure class.
Jacob Danton (2006-04-05): Added error reporting when the registration was a failure.
Kris Becker (2006-06-15): Set the MATCH file as the reference image so it can be used in subsequent processing. Implemented use of unique serial numbers for each image. Issues still remain with handling band-to-band registrations within files. One alternative is to extract bands to separate files; a fallback approach is to use filenames as the serial number. This solution/alternative is unique to coreg, however.
Brendan George (2006-10-02): Modified call for current time to point to Time class, instead of Application class.
Brendan George (2006-12-08): Modified to reflect changes to the SerialNumber class.
Steven Lambright (2008-06-23): Updated to properly check AutoReg::Register()'s return status.
Noah Hilt (2008-08-13): Added two new optional arguments to AutoReg: WindowSize and DistanceTolerance. These two arguments affect how AutoReg gathers and compares data for surface modeling and accuracy. Added more statistics to the Translation group, including min/max and standard deviation of line/sample changes. Added the AutoReg statistics to be displayed as well.
The template to use with the AutoReg class. Default will be maximum correlation function
with a tolerance of 0.7, a search cube of 50x50 pixels, and a pattern cube of 20x20 pixels.
There will also be other templates available in the default directory.
The transformation type to use on the output file. The options are TRANSLATE or WARP.
If WARP is selected, the CNETFILE and DEGREE parameters are required.
Defaults to TRANSLATE.

Type: string
Default: TRANSLATE

Option List:
TRANSLATE (Output Translated Image): Runs the translate application on the input file to get the output file. Exclusions: DEGREE.
WARP (Output Warped Image): Runs the warp application on the input file to get the output file. If this option is selected, the CNETFILE and DEGREE parameters must also be entered.
This file will contain the ControlNet created in the coreg application. The data
will be in Pvl format. This option is required if the WARP option is selected for
the output file.
This file will contain the data collected from the coreg application. The data
will be comma separated and contain the sample/line position in the first input
cube, the sample/line position found in the search cube, and the sample difference
and line difference between the two.
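Only the column order of this flat file is documented above, so as a hedged sketch, it could be consumed from Python like this (the reader function and any header handling are assumptions; the six-column order follows the description):

```python
import csv

def read_flatfile(path):
    """Parse a coreg flat file: sample/line in the first input cube,
    sample/line found in the search cube, and the sample and line
    differences between the two, one registration per row."""
    rows = []
    with open(path, newline="") as f:
        for rec in csv.reader(f):
            samp, line, m_samp, m_line, d_samp, d_line = map(float, rec[:6])
            rows.append((samp, line, m_samp, m_line, d_samp, d_line))
    return rows
```

Each tuple can then be fed to a spreadsheet or plotting package to visualize the magnitude and direction of varying translations.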
The number of columns of points to use in the coreg process. If not entered,
it will default to COLUMNS = (image samples - 1) / search chip samples + 1.