CPS843/CP8307 Introduction to Computer Vision Assignment 4


1 Optical flow estimation (45 points)
In this part, you will implement the Lucas-Kanade optical flow algorithm that computes the pixelwise motion between
two images in a sequence. Compute the optical flow fields for the three image sets labeled synth, sphere and
corridor. Before running your code on the images, you should first convert your images to grayscale and map the
intensity values to the range [0, 1].
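For example, the conversion might look like this (the filename is illustrative):

    img1 = imread('synth1.png');                  % illustrative filename
    if size(img1, 3) == 3, img1 = rgb2gray(img1); end
    img1 = im2double(img1);                       % intensities now in [0, 1]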
1. [20 points] Recall from lecture: to compute the optical flow at a pixel, compute the spatial derivatives (in the
first frame), $I_x$ and $I_y$, compute the temporal derivative, $I_t$, and then, over a window centred around each
pixel, solve the following:

$$
\begin{pmatrix}
\sum\sum I_x I_x & \sum\sum I_x I_y \\
\sum\sum I_x I_y & \sum\sum I_y I_y
\end{pmatrix}
\begin{pmatrix} u \\ v \end{pmatrix}
= -
\begin{pmatrix}
\sum\sum I_x I_t \\
\sum\sum I_y I_t
\end{pmatrix}.
\tag{1}
$$
Write a MATLAB function, call it myFlow, that takes as input two images, img1 and img2, the window length
used to compute flow around a point, and a threshold, τ. The function should return three outputs: images u and
v, which contain the horizontal and vertical components of the estimated optical flow, respectively, and a binary
map that indicates where the flow estimate is valid.
To compute spatial derivatives, use the five-point derivative-of-Gaussian convolution filter (1/12)*[-1 8 0 -8 1]
(make sure the filter is flipped correctly); the image origin is located in the top-left corner of the image, with
the positive direction of the x and y axes running to the right and down, respectively. To compute the temporal
derivative, apply Gaussian filtering with a small σ value (e.g., a 3 × 3 filter with σ = 1) to both images and
then subtract the first image from the second. Since Lucas-Kanade only works for small displacements
(roughly a pixel or less), you may have to resize the input images (use MATLAB’s imresize) to get a reasonable
flow field. Hint: The (partial) derivatives can be computed once by applying the filters across the entire image.
Further, to efficiently compute the component-wise summations in (1), you can apply a smoothing filter (e.g.,
box filter, Gaussian, etc.) to the image containing the product of the gradients; see the sketch at the end of this
question.
Recall that the optical flow estimate is only valid in regions where the 2 × 2 matrix on the left side of (1) is
invertible. Such a matrix is invertible when its smallest eigenvalue is nonzero; in practice, require the smallest
eigenvalue to be greater than some threshold, τ, e.g., τ = 0.01. At image points where the flow is not computable,
set the flow value to zero.
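To make the above concrete, here is a minimal sketch of how myFlow might be structured. It is not a complete
solution: it assumes img1 and img2 are already grayscale doubles in [0, 1], and it uses conv2 and fspecial for all
filtering.

    function [u, v, valid] = myFlow(img1, img2, winLen, tau)
    % Minimal Lucas-Kanade sketch (assumes grayscale doubles in [0, 1]).
    d = (1/12) * [-1 8 0 -8 1];           % five-point derivative filter
    Ix = conv2(img1, d,  'same');         % conv2 flips the kernel for us
    Iy = conv2(img1, d', 'same');         % y axis points down
    g  = fspecial('gaussian', [3 3], 1);  % small Gaussian before differencing
    It = conv2(img2, g, 'same') - conv2(img1, g, 'same');

    % Windowed sums of gradient products via a box filter
    b   = ones(winLen);
    Sxx = conv2(Ix.*Ix, b, 'same');  Sxy = conv2(Ix.*Iy, b, 'same');
    Syy = conv2(Iy.*Iy, b, 'same');
    Sxt = conv2(Ix.*It, b, 'same');  Syt = conv2(Iy.*It, b, 'same');

    [nr, nc] = size(img1);
    u = zeros(nr, nc); v = zeros(nr, nc); valid = false(nr, nc);
    for r = 1:nr
        for c = 1:nc
            A = [Sxx(r,c) Sxy(r,c); Sxy(r,c) Syy(r,c)];
            if min(eig(A)) > tau          % invertible "enough"?
                uv = -(A \ [Sxt(r,c); Syt(r,c)]);
                u(r,c) = uv(1); v(r,c) = uv(2);
                valid(r,c) = true;
            end
        end
    end
    end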
2. [5 points] Visualize the flow fields using the function flowToColor. Play around with the window size and
explain what effect this parameter has on the result.
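For example, assuming the provided flowToColor takes an M × N × 2 flow array (as in the common Middlebury
implementation), the comparison might be driven by:

    % Visualize flow estimates for two illustrative window lengths
    for winLen = [5 15]
        [u, v, valid] = myFlow(img1, img2, winLen, 0.01);
        figure; imshow(flowToColor(cat(3, u, v)));
        title(sprintf('window length = %d', winLen));
    end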
3. [10 points] Another way to visualize the accuracy of the computed flow field is to warp img2 with the computed
optical flow field and compare the result with img1. Write a function, call it myWarp, that takes img2 and the
estimated flow, u and v, as input and outputs the (back)warped image. If the images are identical except for a
translation and the estimated flow is correct, then the warped img2 will be identical to img1 (ignoring discretization artifacts). Hint: Use MATLAB’s functions interp2 (try bicubic and bilinear interpolation) and meshgrid.
Be aware that interp2 may return NaNs. In particular, this may occur around the image boundaries, since
data is missing to perform the interpolation. Make sure your code handles this situation in a reasonable way.
Visualize the difference between the warped img2 and img1 by: (i) taking the difference between the two images
and displaying its absolute value (use an appropriate scale factor for imshow), and (ii) using imshow to display
img1 and the warped img2 consecutively in a loop for a few iterations; the output should appear approximately
stationary. When running imshow in a loop, you will need to invoke drawnow to force MATLAB to render the
new image to the screen.
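A minimal sketch of myWarp, under the assumption that (u, v) maps pixel coordinates of img1 into img2 (i.e.,
backward warping), might look like:

    function warped = myWarp(img2, u, v)
    % Backward-warp img2 by the estimated flow (sketch only).
    [nr, nc] = size(img2);
    [X, Y] = meshgrid(1:nc, 1:nr);                  % sampling grid of img1
    warped = interp2(img2, X + u, Y + v, 'cubic');  % also try 'linear'
    warped(isnan(warped)) = 0;  % interp2 returns NaN outside the image;
                                % zero-filling is one reasonable choice
    end

The flicker test in (ii) can then be driven by a loop such as:

    warped = myWarp(img2, u, v);
    for k = 1:10
        imshow(img1);   drawnow; pause(0.25);
        imshow(warped); drawnow; pause(0.25);
    end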
4. [10 points] In this question you will implement the Kanade-Lucas-Tomasi (KLT) tracker; a sketch of the tracking loop is given at the end of this question. The steps to implement are as follows:
• Detect a set of keypoints in the initial frame using the Harris Corner detector. Here, you can use the code
outlined in the lecture as a starting point.
• Select 20 random keypoints in the initial frame and track them from one frame to the next; for each
keypoint, use a window size of 15 × 15. The tracking step consists of computing the optical flow vector
for each keypoint and then shifting the window, i.e., $x_i^{t+1} = x_i^t + u$ and $y_i^{t+1} = y_i^t + v$, where $i$ denotes the
keypoint index and $t$ the frame. This step is to be repeated for each frame using the estimated window
position from the previous tracking step.
• Discard any keypoints whose predicted location moves out of the frame or is near the image borders.
Since the displacement values (u, v) are generally not integer-valued, you will need interpolation to evaluate
the flow at subpixel locations; use interp2.
• Display the image sequence and overlay the 2D path of the keypoints using line segments.
• Display a separate image of the first frame and overlay a plot of the keypoints which have moved out of
the frame at some point in the sequence.
For this question, you will use the images from the Hotel Sequence.
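To tie the steps together, here is a rough sketch of the tracking loop. It assumes myFlow from Part 1, a
hypothetical helper harris (based on the lecture code) that returns corner coordinates in the first frame, and a
cell array frames of grayscale double images; the two display steps are left out.

    numPts = 20; winLen = 15; tau = 0.01;
    [nr, nc] = size(frames{1});
    margin = ceil(winLen / 2);                % "near the border" cutoff
    [cx, cy] = harris(frames{1});             % hypothetical detector
    sel = randperm(numel(cx), numPts);        % 20 random keypoints
    px = cx(sel); py = cy(sel);
    px = px(:); py = py(:);                   % column vectors of positions
    alive = true(numPts, 1);                  % still inside the frame?
    paths = arrayfun(@(i) [px(i) py(i)], (1:numPts)', 'UniformOutput', false);
    for t = 1:numel(frames) - 1
        [u, v] = myFlow(frames{t}, frames{t+1}, winLen, tau);
        du = interp2(u, px, py, 'linear');    % flow at subpixel locations
        dv = interp2(v, px, py, 'linear');
        px(alive) = px(alive) + du(alive);
        py(alive) = py(alive) + dv(alive);
        % drop keypoints that leave the frame or approach the border
        alive = alive & ~isnan(px) & px > margin & px < nc - margin ...
                      & py > margin & py < nr - margin;
        for i = find(alive)'
            paths{i}(end+1, :) = [px(i) py(i)];  % 2D path for the overlay
        end
    end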