Schedule

From WikiEducator
Image Morphing: A Comparative Study


Prashant K. Oswal and Prashanth Y. Govindaraju
Department of Electrical and Computer Engineering, Clemson University, Clemson


Abstract


A comparison of various techniques for morphing one digital image into another is made. We compare morphing techniques such as Feature Based, Mesh based, and Thin Plate Spline based image morphing on attributes such as Computational Time, Visual Quality of the Morphs obtained, and Complexity involved in Selection of Features. We demonstrate the pros and cons of the various techniques so as to allow users to make an informed decision to suit their particular needs.

Keywords: Morphing, Feature Based Morphing, Thin Plate Spline Morphing, Mesh Morphing.


  1. Introduction

Morphing is defined as the animated transformation of one image into another. It involves the image processing techniques of warping and cross dissolving. Morphing sequences produced by cross-dissolving alone (e.g., linear interpolation to fade from the source image to the destination image) are visually poor because, in general, the features of the source and destination will not be aligned; when we simply cross dissolve, a double-exposure effect is apparent in the misaligned regions. To overcome this problem, warping is used to align the two images before cross dissolving. Warping determines the way in which the pixels in one image should be mapped to the pixels in the other image. For warping to work, the mapping of a few important pixels needs to be specified. The motion of the other pixels is obtained by interpolating the information specified for these control pixels. Since cross dissolving is very simple, warping is the major problem of morphing techniques: morphing is simply a cross-dissolve applied to warped imagery. The different warping techniques differ in the way the mapping for the control pixels is specified and in the interpolation technique used for the other pixels. This set of control pixels usually marks prominent features in the images.
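The cross-dissolve referred to above is just a per-pixel linear blend. A minimal sketch in Python/NumPy (our illustration, not code from the paper):

```python
import numpy as np

def cross_dissolve(img0, img1, t):
    """Linear fade from img0 (t = 0) to img1 (t = 1).

    Applied alone, misaligned features in the two images produce
    the double-exposure artifact described in the text.
    """
    img0 = np.asarray(img0, dtype=float)
    img1 = np.asarray(img1, dtype=float)
    return (1.0 - t) * img0 + t * img1
```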

In this paper we review different morphing techniques and compare them on various attributes. We have defined attributes such as Computational Time, Visual Quality of the Morphs obtained, and Complexity involved in Selection of Features to make an effective comparison.

The rest of the paper is organized as follows: Section 2 explains the principle behind morphing algorithms; Section 3 deals with the various warping techniques, with Section 3.1 covering Mesh Warping, Section 3.2 covering Feature Based Image Warping, and Section 3.3 covering Thin Plate Spline Based Warping; Section 4 presents our results; and Section 5 gives a brief conclusion.


2. Morphing Principle


The basic principle behind image morphing is explained in this section. "Morphing" refers to the combination of generalized image warping with a cross-dissolve between image elements. In order to morph between two images we define corresponding control pixels in the source image I0 and the destination image I1. We then define each intermediate frame I of the metamorphosis by creating a new set of control pixels, interpolating the control pixels from their positions in I0 to their positions in I1. Both images I0 and I1 are then warped toward the positions of the control pixels in I, and the two warped images are cross-dissolved throughout the metamorphosis. The different morphing techniques therefore differ mainly in the way in which they perform warping. The following sections describe various image warping techniques.
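The frame-generation loop just described can be sketched as follows. Here `warp` is a stand-in for whichever warping technique (mesh, feature based, TPS) is in use, and the function names are our own:

```python
import numpy as np

def morph_frame(I0, I1, pts0, pts1, t, warp):
    """One intermediate frame of the metamorphosis at time t in [0, 1].

    warp(image, from_pts, to_pts) is any warping routine that moves
    the control pixels from_pts onto to_pts; it is supplied by the
    caller, not implemented here.
    """
    pts0 = np.asarray(pts0, dtype=float)
    pts1 = np.asarray(pts1, dtype=float)
    # interpolate the control pixels between their endpoint positions
    pts_t = (1.0 - t) * pts0 + t * pts1
    # warp both images toward the interpolated control pixels ...
    w0 = warp(I0, pts0, pts_t)
    w1 = warp(I1, pts1, pts_t)
    # ... and cross-dissolve the two warped images
    return (1.0 - t) * np.asarray(w0, dtype=float) + t * np.asarray(w1, dtype=float)
```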


3. Warping Techniques


3.1 Mesh Warping: Mesh warping is a two-pass algorithm that accepts a source image and two 2-D arrays of coordinates, S and D. The S coordinates are the control pixels in the source image, and the D coordinates specify the locations to which the S coordinates map. The final image is the source image warped by means of the meshes S and D. The 2-D arrays in which the control points are stored impose a rectangular topology on the mesh. The only constraint is that the meshes defined by the two arrays be topologically equivalent, i.e., with no folding or discontinuities. The entries in D are therefore coordinates that may wander as far from S as necessary, as long as they do not cause self-intersection.

The first pass resamples each row independently. An intermediate array of points I, whose x coordinates are the same as those in D and whose y coordinates are the same as those in S, is created. Vertical splines are generated to fit each column of data in S and I. The data for each span (region) in a row is then interpolated to create the intermediate image I.

The second pass is responsible for re-sampling each column independently. Horizontal splines are generated to fit each row of data in arrays I and D. The data for each span (region) in a column is interpolated from intermediate image I to create destination image D.

The collection of vertical splines fitted through S and I in the first pass, along with the horizontal splines fitted through I and D in the second pass, are shown in Figure 1.


[[Image:]]
Figure 1
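A much-simplified sketch of the two-pass idea, with piecewise-linear interpolation (`np.interp`) standing in for the vertical and horizontal splines; the per-row/per-column control-coordinate layout here is our own assumption, not the paper's:

```python
import numpy as np

def resample_rows(img, src_x, dst_x):
    """One pass over the rows: for row y, the source control
    x-coordinates src_x[y] are moved onto dst_x[y]; every other pixel
    follows by interpolation (linear here, splines in the full
    algorithm)."""
    H, W = img.shape
    out = np.empty((H, W), dtype=float)
    xs = np.arange(W, dtype=float)
    for y in range(H):
        # source position that each destination pixel samples from
        sample_at = np.interp(xs, dst_x[y], src_x[y])
        out[y] = np.interp(sample_at, xs, img[y])
    return out

def mesh_warp(img, src_x, dst_x, src_y, dst_y):
    """Two-pass mesh warp: rows first, then columns via transpose.
    src_x/dst_x are (H, K) per-row control x-coordinates;
    src_y/dst_y are (W, K) per-column control y-coordinates."""
    intermediate = resample_rows(img, src_x, dst_x)        # first pass
    return resample_rows(intermediate.T, src_y, dst_y).T   # second pass
```

With identical source and destination control coordinates the warp reduces to the identity, which is a convenient sanity check.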


3.2 Feature Based Image Warping: This method gives the animator a high level of control over the process. The animator interactively selects corresponding feature lines in the two images to be morphed. The algorithm uses these lines to relate features in the source image to features in the destination image. It is based upon fields of influence surrounding the selected feature lines, and it uses reverse mapping (i.e., it steps through the destination image pixel by pixel and samples the correct pixel from the source image) to warp the image.

A pair of lines (one defined relative to the source image, the other defined relative to the destination image) defines a mapping from one image to the other.

[[Image:]]


Figure 2

The following parameters are calculated:

u = ((X − P) · (Q − P)) / ||Q − P||²

v = ((X − P) · Perpendicular(Q − P)) / ||Q − P||

X' = P' + u (Q' − P') + v · Perpendicular(Q' − P') / ||Q' − P'||

Where X is the pixel co-ordinate in the destination image and X’ is the corresponding pixel co-ordinate in the source image, PQ is a line segment in the destination image and P’Q’ is the corresponding line segment in the source image, u is the position along the line, and v is the distance from the line. The value u goes from 0 to 1 as the pixel moves from P to Q, and is less than 0 or greater than 1 outside that range. The value for v is the perpendicular distance in pixels from the line.
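Under the definitions above, the single-pair mapping can be written directly. A sketch following the u, v construction in the text (function names are ours):

```python
import numpy as np

def perp(v):
    """Vector perpendicular to v (v rotated by 90 degrees)."""
    return np.array([-v[1], v[0]], dtype=float)

def map_point(X, P, Q, Ps, Qs):
    """Map destination pixel X to its source position X' using one
    feature-line pair: PQ in the destination image, P'Q' (here Ps, Qs)
    in the source image."""
    X, P, Q, Ps, Qs = (np.asarray(z, dtype=float) for z in (X, P, Q, Ps, Qs))
    d = Q - P
    u = np.dot(X - P, d) / np.dot(d, d)              # position along PQ
    v = np.dot(X - P, perp(d)) / np.linalg.norm(d)   # signed distance from PQ
    ds = Qs - Ps
    return Ps + u * ds + v * perp(ds) / np.linalg.norm(ds)
```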


The algorithm is as follows:

For each pixel X in the destination image
    Find the corresponding u, v
    Find X' in the source image for that u, v
    DestinationImage(X) = SourceImage(X')


The algorithm transforms each pixel coordinate by a rotation, translation, and/or a scale, thereby transforming the whole image.

The above algorithm handles the case of a single feature line. In a normal morphing scenario, however, there are multiple features in the images to be morphed, and consequently multiple feature line pairs are specified.

The displacement of a point in the source image is then a weighted sum of the mappings due to each line pair, with the weights determined by distance and line length. The weight assigned to each line should be strongest when the pixel is exactly on the line, and weaker the further the pixel is from it. The equation we use is

weight = ( length^p / ( a + dist ) )^b

Where length is the length of a line, dist is the distance from the pixel to the line, and a, b, and p are constants that can be used to change the relative effect of the lines. For multiple lines, the algorithm is as follows:

For each pixel X in the destination
    DSUM = (0, 0)
    weightsum = 0
    For each line Pi Qi
        Calculate u, v based on Pi Qi
        Calculate Xi' based on u, v and Pi'Qi'
        Calculate displacement Di = Xi' - X
        dist = shortest distance from X to Pi Qi
        weight = ( length^p / ( a + dist ) )^b
        DSUM += Di * weight
        weightsum += weight
    X' = X + DSUM / weightsum
    DestinationImage(X) = SourceImage(X')
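The multi-line loop above translates almost line for line into code. A sketch (function names are ours, and the a, b, p defaults are arbitrary illustrative values):

```python
import numpy as np

def warp_point(X, dst_lines, src_lines, a=0.5, b=1.0, p=0.5):
    """Map one destination pixel X to its source position using all
    feature-line pairs.  dst_lines/src_lines are sequences of (P, Q)
    endpoint pairs; a, b, p control the relative effect of the lines."""
    X = np.asarray(X, dtype=float)
    perp = lambda w: np.array([-w[1], w[0]], dtype=float)
    dsum = np.zeros(2)
    weightsum = 0.0
    for (P, Q), (Ps, Qs) in zip(dst_lines, src_lines):
        P, Q, Ps, Qs = (np.asarray(z, dtype=float) for z in (P, Q, Ps, Qs))
        d, ds = Q - P, Qs - Ps
        u = np.dot(X - P, d) / np.dot(d, d)
        v = np.dot(X - P, perp(d)) / np.linalg.norm(d)
        Xi = Ps + u * ds + v * perp(ds) / np.linalg.norm(ds)
        # shortest distance from X to the segment PQ
        if u < 0:
            dist = np.linalg.norm(X - P)
        elif u > 1:
            dist = np.linalg.norm(X - Q)
        else:
            dist = abs(v)
        weight = (np.linalg.norm(d) ** p / (a + dist)) ** b
        dsum += (Xi - X) * weight
        weightsum += weight
    return X + dsum / weightsum
```

Applying `warp_point` at every destination pixel and sampling the source image at the returned position completes the reverse mapping.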


3.3 Thin Plate Spline (TPS) Based Image Warping:

The thin plate spline is a conventional tool for surface interpolation over scattered data: an interpolation method that finds a "minimally bent" smooth surface passing through all given points. The name "thin plate" comes from the fact that a TPS more or less simulates how a thin metal plate would behave if it were forced through the same control points.

Denote the target function values vi at locations (xi, yi) in the plane, with i = 1, 2, …, p, where p is the number of feature points. In particular, we will set vi equal to each of the target coordinates in turn to obtain one continuous transformation for each coordinate. We assume that the locations (xi, yi) are all different and are not collinear.

[[Image:]]


Figure 3

Figure 3 is a simple example of coordinate transformation using TPS. Consider two sets of points for which the correspondences are assumed to be known (a). TPS warping aligns the points, and the bending of the grid shows the deformation needed to bring the two sets on top of each other (b). Note that when TPS is applied to coordinate transformation we actually use two splines, one for the displacement in the x direction and one for the displacement in the y direction; the two resulting transformations are combined into a single mapping.


Thin plate splines (TPS) interpolate the specified points while minimizing an approximate curvature (the integrated squared second derivative),


I_f = ∫∫_R² ( f_xx² + 2 f_xy² + f_yy² ) dx dy


and the interpolant has the form

f(x, y) = a1 + ax·x + ay·y + Σi wi U( ||(xi, yi) − (x, y)|| ),   with the sum over i = 1 … p

Where U(r) = r² log r and f(x, y) is the desired displacement at a point (x, y) to warp from the source image to the destination image. In order for f(x, y) to have square-integrable second derivatives, we require that:


Σi wi = 0,   Σi wi·xi = 0,   Σi wi·yi = 0


The weights wi, a1, ax, ay, i = 1 … p, needed to interpolate the data can be found by solving the block matrix system

[ K    P ] [ w ]     [ v ]
[ Pᵀ   O ] [ a ]  =  [ o ]

Where Kij = U( ||(xi, yi) − (xj, yj)|| ), the i-th row of P is (1, xi, yi), O is a 3 × 3 matrix of zeros, o is a 3 × 1 column vector of zeros, w and v are the column vectors formed by the wi and vi respectively, and a is the column vector with elements a1, ax, ay.
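The block system can be assembled and solved directly with NumPy. A sketch under the stated assumption of distinct, non-collinear points (function names are ours):

```python
import numpy as np

def U(r):
    """TPS radial basis U(r) = r^2 log r, with U(0) defined as 0."""
    return np.where(r == 0, 0.0, r**2 * np.log(np.maximum(r, 1e-12)))

def tps_fit(pts, v):
    """Solve [K P; P^T O][w; a] = [v; o] for the TPS weights that
    interpolate the values v at the points pts."""
    pts = np.asarray(pts, dtype=float)
    p = len(pts)
    r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    K = U(r)
    P = np.hstack([np.ones((p, 1)), pts])    # i-th row is (1, xi, yi)
    A = np.zeros((p + 3, p + 3))
    A[:p, :p] = K
    A[:p, p:] = P
    A[p:, :p] = P.T                          # lower-right block O stays zero
    rhs = np.concatenate([np.asarray(v, dtype=float), np.zeros(3)])
    sol = np.linalg.solve(A, rhs)
    return sol[:p], sol[p:]                  # w, (a1, ax, ay)

def tps_eval(x, y, pts, w, a):
    """Evaluate f(x, y) = a1 + ax*x + ay*y + sum_i wi U(||(xi, yi) - (x, y)||)."""
    pts = np.asarray(pts, dtype=float)
    r = np.linalg.norm(pts - np.array([x, y], dtype=float), axis=1)
    return a[0] + a[1] * x + a[2] * y + np.dot(w, U(r))
```

As the text notes, one such spline is fitted for each coordinate (vi set to the x and then the y target coordinates) and the two are combined into the 2-D warp.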


[[Image:]]
Figure 4: Results of Morphing using (a) Simple Cross Dissolve, (b) Mesh Warping, (c) Thin Plate Spline Warping, and (d) Feature Based Warping





Activity | Nodal person/Agency | Target Date
Preparation of draft proposal | Pankaj and Uma | 20th July 2007
Final proposal | Pankaj and Uma & Mohan, Som | 31st July 2007
Identifying academics (3 from each school) from School of Studies offering professional programmes | Uma and Pankaj | 10th August 2007
Planning Meeting at Bangalore | Mohan to coordinate (Som, Pankaj and Uma will participate) | 18th and 19th August 2007
Formulation of teams and commencement of online discussions | Pankaj, Uma & Som | 23rd August 2007
Identification of target subjects and/or topics by teams; definition of key learning competencies that need to be developed among learners | Som will help through emails | 31st August 2007
Development of Project Platform on IGNOU's Server | Uma | 30th September 2007
Development of key competencies and learning outcomes | Som will help through emails | All of October 2007
Orientation workshop on SBL in Delhi | Lead resource by Som; assisted by Uma and Pankaj | First week of November 2007
Development of Scenarios by teams (using IGNOU's Platform) | All participants | All of November 2007
First draft in to Som via WikiEducator | All participants | 15th December 2007
Editing feedback and further refinements (on the WikiEducator) | Som to all participants | December 2007 and January 2008
Second revision workshop | Som | February 2008
Language and subject content editing | Som (Assistant) | End of February 2008
Final review of write-up and editing | Som | End of March 2008
Publication and its distribution | COL/IGNOU | April 2008