A typical digital imaging system consists of four functional modules: (i) a set of optical lenses, (ii) the analog front-end (AFE) module including a color filter array (CFA), a complementary metal-oxide-semiconductor image sensor (CIS), and an analog-to-digital converter (ADC), (iii) the digital back-end (DBE) module including various image signal processing (ISP) subsystems, and (iv) the display device, as shown in Figure 2. The proposed digital zooming system belongs to the DBE module in the ISP chain.
The proposed digital zooming subsystem consists of: (i) estimation of the edge orientation followed by edge refinement, (ii) selective interpolation using either cubic-spline or directionally weighted one-dimensional (1D) linear interpolation along the estimated edge orientation, and (iii) restoration filtering as shown in Figure 3.
Edge orientation estimation and refinement
In order to determine the edge orientation, the input image is convolved with the four 5 × 5 FIR steerable filters given in Table 1 as

dθ(x, y) = fL(m, n) ∗ Gθ(m, n),     (14)

where fL(m, n) represents a 5 × 5 local block of the input image centered at (x, y), and Gθ(m, n) the 5 × 5 FIR steerable filter rotated by angle θ ∈ {0°, 45°, 90°, 135°}.
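A minimal sketch of this filtering step is given below. The kernels shown are generic derivative-of-Gaussian stand-ins for the Table 1 coefficients, which are not reproduced here, and the function names are illustrative assumptions.

import numpy as np
from scipy.ndimage import convolve

def steerable_kernels(size=5, sigma=1.0):
    # Hypothetical stand-ins for the four 5x5 FIR steerable kernels of Table 1.
    # A first-derivative-of-Gaussian kernel is rotated to 0, 45, 90 and 135
    # degrees; the paper's actual coefficients differ.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    kernels = {}
    for theta in (0, 45, 90, 135):
        t = np.deg2rad(theta)
        # Derivative of the Gaussian along the direction theta.
        k = -(xx * np.cos(t) + yy * np.sin(t)) / sigma**2 * g
        kernels[theta] = k / np.abs(k).sum()
    return kernels

def steerable_responses(image):
    # d_theta(x, y): response of each steerable filter at every pixel (Eq. 14).
    return {theta: convolve(image.astype(float), k, mode='nearest')
            for theta, k in steerable_kernels().items()}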
The initial edge orientation is determined by minimizing the mean of dθ(x, y) as (Kang et al. 2013a, b)

θI(x, y) = arg minθ Dθ(x, y),     (15)

and

Dθ(x, y) = (1/25) Σ(m,n) |dθ(m, n)|,     (16)

where the sum is taken over the 5 × 5 local block centered at (x, y).
If the mean value Dθ(x, y) is less than a pre-specified threshold, the corresponding pixel is considered to be in the non-slanted edge region. In this work, a threshold value of 0.075 was used for the empirically optimum sensitivity of the steerable filters. Given an initial edge orientation θI(x, y), the refined edge orientation is selected from among eighteen directions. The proposed edge refinement algorithm is summarized as follows.
The refined edge orientation θ∗ is finally quantized with an interval of 10°. Figure 4 shows the results of edge orientation estimation using the four directionally steerable filters followed by edge refinement. As shown in Figure 4c, the proposed method provides more accurate and continuous edge orientation, which makes the result of the proposed directionally adaptive interpolation look more natural.
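The sketch below illustrates the initial orientation selection of Eqs. (15)-(16), the 0.075 threshold test, and one plausible refinement over the eighteen 10°-spaced candidates. The refinement criterion (minimizing the local directional intensity variation within ±30° of the initial estimate) and the assumption of an image normalized to [0, 1] are choices made for illustration, not the authors' exact procedure.

import numpy as np
from scipy.ndimage import uniform_filter, shift

def initial_orientation(responses, threshold=0.075, window=5):
    # Pick theta_I(x, y) minimizing the local mean of |d_theta| (Eqs. 15-16).
    # `responses` is a dict of per-orientation filter responses; the image is
    # assumed to be normalized to [0, 1] so that the 0.075 threshold applies.
    thetas = sorted(responses)
    means = np.stack([uniform_filter(np.abs(responses[t]), window) for t in thetas])
    idx = means.argmin(axis=0)
    theta_i = np.array(thetas)[idx]
    non_slanted = means.min(axis=0) < threshold   # flat / non-slanted mask
    return theta_i, non_slanted

def refine_orientation(image, theta_i, candidates=None, window=5):
    # Hypothetical refinement: among the eighteen quantized directions, keep
    # the one whose directional intensity variation is smallest near theta_I.
    if candidates is None:
        candidates = np.arange(0, 180, 10)   # 0, 10, ..., 170 degrees
    image = image.astype(float)
    costs = []
    for theta in candidates:
        t = np.deg2rad(theta)
        dy, dx = np.sin(t), np.cos(t)
        # Absolute difference between each pixel and its neighbour one unit
        # step along the candidate direction (bilinear sub-pixel shift; the
        # sign convention for the row axis is illustrative).
        diff = np.abs(image - shift(image, (dy, dx), order=1, mode='nearest'))
        costs.append(uniform_filter(diff, window))
    costs = np.stack(costs)
    # Restrict the search to candidates within +/- 30 degrees of theta_I.
    dist = np.abs(((candidates[:, None, None] - theta_i[None]) + 90) % 180 - 90)
    costs = np.where(dist <= 30, costs, np.inf)
    return candidates[costs.argmin(axis=0)]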
Directionally adaptive image interpolation
In order to interpolate the slanted edge region without jagging artifacts, the proposed method computes the line passing through the point P3 with the refined edge orientation θ∗, as shown in Figure 5.
The intensity value of P1 is first determined by the proposed adaptive cubic-spline interpolation using the activity-map (Efstratiadis et al. 1990) at v2 and the four pixels on the same horizontal line as

P1 = f(S1)v1 + f(S2)v2 + f(S3)v3 + f(S4)v4,     (17)

where Sk represents the distance from the interpolation point P1 to the pixel vk, and f(·) represents a cubic-spline weight function defined as

f(S) = (a + 2)|S|³ − (a + 3)|S|² + 1,   for |S| ≤ 1,
f(S) = a|S|³ − 5a|S|² + 8a|S| − 4a,     for 1 < |S| ≤ 2,
f(S) = 0,                               otherwise,     (18)

where a represents a cubic-spline weight function parameter.
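Assuming the standard cubic-convolution form of Eq. (18), the weight function and the four-pixel interpolation of Eq. (17) can be sketched as follows; the symmetric sample spacing encoded in interpolate_p1 is an assumption about the geometry of Figure 5.

import numpy as np

def cubic_weight(s, a=-1.0):
    # Cubic-spline (cubic-convolution) weight f(S) of Eq. (18) with parameter a.
    s = abs(s)
    if s < 1.0:
        return (a + 2.0) * s**3 - (a + 3.0) * s**2 + 1.0
    if s < 2.0:
        return a * s**3 - 5.0 * a * s**2 + 8.0 * a * s - 4.0 * a
    return 0.0

def interpolate_p1(pixels, frac, a=-1.0):
    # Interpolate P1 from the four pixels on one horizontal line (Eq. 17).
    # `pixels` are the two samples on each side of P1 and `frac` in [0, 1) is
    # the offset of P1 from the second sample, so the four distances are
    # 1 + frac, frac, 1 - frac and 2 - frac.
    distances = (1.0 + frac, frac, 1.0 - frac, 2.0 - frac)
    weights = np.array([cubic_weight(s, a) for s in distances])
    # The weights of this kernel already sum to 1; the division is a safeguard.
    return float(np.dot(weights, pixels) / weights.sum())

For the symmetric case frac = 0.5 with samples [10, 20, 30, 40] and a = −1, interpolate_p1 returns 25.0, the expected midpoint value.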
In this work, the initial parameter value of a = - 1 was used. For the spatially adaptive interpolation without blurring artifacts, the cubic-spline weight function parameter is changed according to the strength of the edge using the activity-map (Efstratiadis et al. 1990) as
(19)
where the tuning parameter σ is chosen so that αMAP(x, y) distributes as uniformly as possible in [0, 1], and Var(x, y) is the local variance of the pixel located at (x, y). In this work, a tuning parameter of σ = 250 was used.
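Since Eq. (19) is not reproduced above, the sketch below uses one common normalization of the local variance, Var/(Var + σ), which maps the activity into [0, 1] and reflects the role of σ described in the text; the exact expression used in the paper may differ.

import numpy as np
from scipy.ndimage import uniform_filter

def activity_map(image, sigma=250.0, window=5):
    # Assumed activity measure: local variance normalized into [0, 1].
    # Var(x, y) is the variance in a small window; sigma = 250 is the tuning
    # parameter quoted in the text. The form Var / (Var + sigma) is an
    # assumption standing in for Eq. (19).
    image = image.astype(float)
    mean = uniform_filter(image, window)
    mean_sq = uniform_filter(image**2, window)
    var = np.maximum(mean_sq - mean**2, 0.0)   # local variance per pixel
    return var / (var + sigma)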
Thus, the adaptive cubic-spline weight function parameter is determined as
(20)
where αMAP(x, y) represents the activity value at v2.
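Equation (20) is likewise not reproduced above; a simple stand-in that blends the parameter between the non-edge value a = −0.5 and the edge value a = −1 according to the activity value at v2 is sketched below as an assumption.

def adaptive_cubic_parameter(alpha_map_v2, a_flat=-0.5, a_edge=-1.0):
    # Hypothetical stand-in for Eq. (20): blend the cubic-spline parameter
    # between its flat-region and edge-region values using the activity value
    # alpha_MAP at v2 (0 = flat region, 1 = strong edge).
    return a_flat + (a_edge - a_flat) * alpha_map_v2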
The intensity value of P2 is also determined in the same manner in the vertical direction. Given P1 and P2, the intensity value of P3 is determined by the weighted linear interpolation along the edge line as (Kang et al. 2013a, b)

P3 = (w2 P1 + w1 P2) / (w1 + w2),     (21)

where w1 represents the distance between P1 and P3, and w2 the distance between P2 and P3.
To reduce the computational load of the interpolation process, a simple cubic-spline interpolation with a = −0.5 is used if a pixel is not in the slanted edge region. By using the directionally optimized interpolation, the proposed method can significantly reduce jagging artifacts in the slanted edge region.
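The final combination of the two directional estimates and the flat-region fallback can be sketched as follows, using the inverse-distance form of Eq. (21); interpolate_p3 and interpolate_pixel are illustrative names.

def interpolate_p3(p1, p2, w1, w2):
    # Weighted 1D linear interpolation along the edge line (Eq. 21).
    # w1 is the distance |P1 P3| and w2 the distance |P2 P3|, so the closer of
    # the two directional estimates receives the larger weight.
    return (w2 * p1 + w1 * p2) / (w1 + w2)

def interpolate_pixel(p1, p2, w1, w2, on_slanted_edge, fallback_value):
    # Use the directional estimate on slanted edges; otherwise fall back to the
    # plain cubic-spline result with a = -0.5 (computed by the caller).
    if on_slanted_edge:
        return interpolate_p3(p1, p2, w1, w2)
    return fallback_value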
Directionally adaptive image restoration
The proposed TCLS restoration filters are generated using directionally adaptive smoothness constraints Cθ(u, v) according to the estimated edge orientation. To remove blurring artifacts caused by the interpolation process, the proposed restoration method performs 2D convolution using five 5 × 5 directionally adaptive TCLS filters according to the edge orientation θ as
f̂(x, y) = hθ(x, y) ∗ ĝ(x, y),     (22)

where ∗ represents the 2D convolution operator, ĝ(x, y) is the interpolated image, hθ(x, y) is the impulse response of the directional TCLS filter of orientation θ, and f̂(x, y) is the restored HR image.
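The restoration stage then reduces to selecting one of the five 5 × 5 filters per pixel and convolving, as sketched below. The TCLS impulse responses are treated as precomputed inputs, since their constrained least-squares design with the directional constraints Cθ(u, v) is not reproduced here, and the per-pixel selection scheme is an assumption about how the orientation map drives the convolution.

import numpy as np
from scipy.ndimage import convolve

def tcls_restore(g_hat, orientation, filters):
    # Directionally adaptive restoration (Eq. 22).
    # g_hat       : interpolated image.
    # orientation : per-pixel label into `filters` (e.g. non-edge, 0, 45, 90, 135).
    # filters     : dict mapping each label to a precomputed 5x5 TCLS impulse
    #               response h_theta; the coefficients are assumed to be given.
    restored = g_hat.astype(float)
    for label, h in filters.items():
        # Convolve once per filter, then keep the response where that
        # orientation label applies.
        response = convolve(g_hat.astype(float), h, mode='nearest')
        restored = np.where(orientation == label, response, restored)
    return restored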