
Z Transform and Its Properties

From DTFT to Z Transform

From the DTFT, we have,

$$
x[t] = \frac{1}{2\pi} \int_{-\pi}^{+\pi} X(\omega) e^{j \omega t} \, \mathrm{d}\omega \\
X(\omega) = \sum_{t = -\infty}^{+\infty} x[t] e^{-j \omega t}
$$

Again, we multiply the kernel with a real exponential, just like the Laplace Transform (the z transform is a discrete version of the Laplace Transform). We write $z = e^{j \omega + \sigma}$, thus,

$$
X(z) = \sum_{t = -\infty}^{+\infty} x[t] z^{-t}
$$

If we would like $X(z)$ to converge on the positive side, we need,

$$
\lim_{t \to +\infty} \left| \frac{x[t + 1]\, z^{-(t + 1)}}{x[t]\, z^{-t}} \right| < 1
$$

Which is equivalent to,

$$
|z| > \lim_{t \to +\infty} \left| \frac{x[t + 1]}{x[t]} \right|
$$

And vice versa on the negative side.

$$
|z| < \lim_{t \to -\infty} \left| \frac{x[t]}{x[t - 1]} \right|
$$
tip

For points on the boundary, i.e. where,

$$
|z| = \lim_{t \to +\infty} \left| \frac{x[t + 1]}{x[t]} \right|
$$

whether the series converges depends on the phase of $z$. However, on this circle, it is guaranteed that there is at least one point where the series does not converge.

And vice versa for the negative side.
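For example (a standard pair, not derived in the original): take the right-sided signal $x[t] = a^t u[t]$. The ratio is $\left| x[t + 1] / x[t] \right| = |a|$, so the sum converges for $|z| > |a|$, and

$$
X(z) = \sum_{t = 0}^{+\infty} a^t z^{-t} = \frac{1}{1 - a z^{-1}} = \frac{z}{z - a} \quad |z| > |a|
$$

The left-sided signal $x[t] = -a^t u[-t - 1]$ yields the same algebraic expression $\frac{z}{z - a}$ but with region of convergence $|z| < |a|$, which is why the region of convergence must always be stated together with $X(z)$.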

And technically,

$$
x[t] = -j \frac{1}{2\pi} \int_{-j \pi + \sigma}^{+j\pi + \sigma} X(z) z^t \, \mathrm{d} (j \omega + \sigma) \\
= \frac{1}{2\pi j} \int_{e^{-j \pi + \sigma}}^{e^{+j\pi + \sigma}} X(z) z^{t - 1} \, \mathrm{d}z
$$

But wait, $e^{-j \pi + \sigma} = e^{+j\pi + \sigma}$, so what does the integral do?

Let's consider $e^{jx + \sigma}$ where $x$ walks from $-\pi$ to $\pi$. This means that the function walks through a circle with radius $e^\sigma$.

Exponential Function in the Complex Plane

In the picture, that means starting from a point and walking along the blue line (the image of the line that was parallel to the real axis before the transformation). One full round corresponds to $2\pi$ of $x$.

That is to say,

$$
x[t] = \frac{1}{2\pi j} \oint_{C(e^\sigma)} X(z) z^{t - 1} \, \mathrm{d}z
$$

And $C(e^\sigma)$ is a circle with radius $e^\sigma$. The value of $\sigma$ depends on the region of convergence.

For convenience, if we write $C$, it means a circle of radius one.
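As a rough numerical check of this contour-integral formula (a sketch only; $X(z) = \frac{z}{z - a}$ with $a = 0.5$, the transform of $a^t u[t]$, and the radius $r$ are assumed example values), we can sample the circle $C(r)$ and approximate the integral by a sum:

```python
import numpy as np

# Sketch: approximate x[t] = (1 / (2*pi*j)) * contour integral of X(z) z^(t-1) dz
# over a circle of radius r, with X(z) = z / (z - a) as an assumed example.
a = 0.5
r = 0.9                      # contour radius, chosen inside the ROC |z| > |a|
N = 4096                     # number of samples along the contour
omega = 2 * np.pi * np.arange(N) / N
z = r * np.exp(1j * omega)   # points on the circle C(r)
X = z / (z - a)

for t in range(5):
    # On the circle, dz = j * z * d(omega), so the integrand is X(z) * z^(t-1) * j * z.
    integral = np.sum(X * z ** (t - 1) * 1j * z) * (2 * np.pi / N)
    x_t = integral / (2j * np.pi)
    print(t, round(x_t.real, 6), a ** t)   # the two columns should agree: x[t] = a^t
```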

info

There is another way to directly get the inverse z transform, because,

$$
X(z) = \sum_{t = -\infty}^{+\infty} x[t] z^{-t}
$$

Based on the Cauchy integral formula,

$$
\oint_{P} \frac{f(z)}{(z-z_0)^n} \, \mathrm{d}z = 2\pi j \, \frac{f^{(n - 1)}(z_0)}{(n - 1)!} \\
\text{where } f(z) \text{ is analytic within and on } P \text{, and } P \text{ is any closed curve that encloses } z_0
$$

So if we want $x[t]$, then consider,

$$
X(z) z^{t - 1} = \sum_{k = -\infty}^{+\infty} x[k] z^{-k + t - 1}
$$

Thus,

$$
\oint_{C(r)} X(z) z^{t-1} \, \mathrm{d}z = 2 \pi j \, x[t]
$$

Since $X(z)$ must exist on the contour, $r$ must lie within the region of convergence.
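For instance (a standard computation, assumed rather than taken from the original): with $X(z) = \frac{z}{z - a}$, region of convergence $|z| > |a|$, and $t \geq 0$, the integrand is $\frac{z^t}{z - a}$, which has a single pole at $z = a$ inside $C(r)$ for any $r > |a|$. The Cauchy formula with $n = 1$ and $f(z) = z^t$ then gives,

$$
x[t] = \frac{1}{2\pi j} \oint_{C(r)} \frac{z^t}{z - a} \, \mathrm{d}z = a^t \quad t \geq 0
$$

which is indeed the sequence $a^t u[t]$ whose transform is $\frac{z}{z - a}$.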

Similarly, we sometimes also use the single-sided z transform, when we don't care about the negative side.

So in conclusion,

The two-sided z transform is,

$$
x[t] = \frac{1}{2\pi j} \oint_{C(e^\sigma)} X(z) z^{t - 1} \, \mathrm{d}z \\
X(z) = \sum_{t = -\infty}^{+\infty} x[t] z^{-t} \quad e^{\sigma_0} < |z| < e^{\beta_0}
$$

The single-sided z transform is,

$$
x[t] = \frac{1}{2\pi j} \oint_{C(e^\sigma)} X(z) z^{t - 1} \, \mathrm{d}z \\
X(z) = \sum_{t = 0}^{+\infty} x[t] z^{-t} \quad |z| > e^{\sigma_0}
$$

We note the two-sided pair as,

$$
x[t] \xrightarrow{\pm \mathcal{Z}} X(z) \quad e^{\sigma_0} < |z| < e^{\beta_0}
$$

And the single-sided pair as,

$$
x[t] \xrightarrow{\mathcal{Z}} X(z) \quad |z| > e^{\sigma_0}
$$

Properties

Linearity

$$
a x[t] + b y[t] \xrightarrow{\mathcal{Z}} a X(z) + b Y(z) \quad |z| > \max(e^{\sigma_0}, e^{\sigma_1})
$$

$$
a x[t] + b y[t] \xrightarrow{\pm \mathcal{Z}} a X(z) + b Y(z) \quad \max(e^{\sigma_0}, e^{\sigma_1}) < |z| < \min(e^{\beta_0}, e^{\beta_1})
$$
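As a quick example (using the standard pairs $u[t] \xrightarrow{\mathcal{Z}} \frac{z}{z - 1}$, $|z| > 1$, and $a^t u[t] \xrightarrow{\mathcal{Z}} \frac{z}{z - a}$, $|z| > |a|$, which are assumed here rather than derived),

$$
u[t] - a^t u[t] \xrightarrow{\mathcal{Z}} \frac{z}{z - 1} - \frac{z}{z - a} \quad |z| > \max(1, |a|)
$$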

Shifting Time is Shifting Phase

We need to distinguish three cases when we consider time shifting.

  • Two-sided z transform
  • Single-sided z transform of causal signals
  • Single-sided z transform of non-causal signals

Previously, for the Laplace Transform, we only talked about the single-sided transform of causal signals, because time shifting isn't very important in practice for the Laplace Transform. However, since difference equations rely on shifting, it's crucial to consider all three cases.

Again, in the following sections, unless we say otherwise, the region of convergence doesn't change.

Shifting Time in the Two-Sided Z Transform

Suppose,

$$
x[t] \xrightarrow{\pm \mathcal{Z}} X(z)
$$

We have,

$$
\sum_{t = -\infty}^{+\infty} x[t + t_0] z^{-t} \\
= z^{t_0} \sum_{t + t_0 = -\infty}^{+\infty} x[t + t_0] z^{- (t + t_0)} \\
= z^{t_0} X(z)
$$

That's to say,

$$
x[t + t_0] \xrightarrow{\pm \mathcal{Z}} z^{t_0} X(z)
$$
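For example, since the unit impulse satisfies $\delta[t] \xrightarrow{\pm \mathcal{Z}} 1$ (a standard pair, stated here without derivation), the shift property immediately gives,

$$
\delta[t - k] \xrightarrow{\pm \mathcal{Z}} z^{-k}
$$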

Shifting Time in the Single-Sided Z Transform for Causal Signals

Causal signals are signals that are zero for negative time. Thus,

$$
x[t]u[t] \xrightarrow{\mathcal{Z}} X(z)
$$

$$
\sum_{t = 0}^{+\infty} x[t + t_0] u[t + t_0] z^{-t} \\
= z^{t_0} \sum_{t + t_0 = t_0}^{+\infty} x[t + t_0] u[t + t_0] z^{-(t + t_0)} \\
= z^{t_0} \sum_{t + t_0 = 0}^{+\infty} x[t + t_0] u[t + t_0] z^{-(t + t_0)} \\
= z^{t_0} X(z)
$$

Moving the lower limit from $t_0$ to $0$ is allowed because $u[t + t_0]$ vanishes for $t + t_0 < 0$, which requires $t_0 \leq 0$, i.e. a delay. If the shift pushes nonzero samples to negative time ($t_0 > 0$), those samples are lost by the single-sided sum, and the formulas of the next subsection apply instead.

That's to say,

$$
x[t + t_0] u[t + t_0] \xrightarrow{\mathcal{Z}} z^{t_0} X(z) \quad t_0 \leq 0
$$

Shifting Time in the Single-Sided Z Transform for Non-Causal Signals

Again,

$$
x[t] \xrightarrow{\mathcal{Z}} X(z)
$$

We need to distinguish between a shift towards the positive side and a shift towards the negative side. Thus, we assume $t_0 \geq 0$.

For moving towards the negative side (an advance), the result is,

$$
\sum_{t = 0}^{+\infty} x[t + t_0] z^{-t} \\
= z^{t_0} \sum_{t + t_0 = t_0}^{+\infty} x[t + t_0] z^{-(t + t_0)} \\
= z^{t_0} \left( \sum_{t + t_0 = 0}^{+\infty} x[t + t_0] z^{-(t + t_0)} - \sum_{t + t_0 = 0}^{t_0 - 1} x[t + t_0] z^{-(t + t_0)} \right) \\
= z^{t_0} X(z) - z^{t_0} \sum_{t + t_0 = 0}^{t_0 - 1} x[t + t_0] z^{-(t + t_0)}
$$

Kind of ugly, to be honest.

For moving towards the positive side (a delay), the result is,

$$
\sum_{t = 0}^{+\infty} x[t - t_0] z^{-t} \\
= z^{-t_0} \sum_{t - t_0 = -t_0}^{+\infty} x[t - t_0] z^{-(t - t_0)} \\
= z^{-t_0} \left( \sum_{t - t_0 = 0}^{+\infty} x[t - t_0] z^{-(t - t_0)} + \sum_{t - t_0 = -t_0}^{-1} x[t - t_0] z^{-(t - t_0)} \right) \\
= z^{-t_0} X(z) + z^{-t_0} \sum_{t - t_0 = -t_0}^{-1} x[t - t_0] z^{-(t - t_0)}
$$

Well, just as ugly.

In conclusion, for $t_0 \geq 0$,

$$
x[t - t_0] \xrightarrow{\mathcal{Z}} z^{-t_0} X(z) + z^{-t_0} \sum_{t - t_0 = -t_0}^{-1} x[t - t_0] z^{-(t - t_0)}
$$

$$
x[t + t_0] \xrightarrow{\mathcal{Z}} z^{t_0} X(z) - z^{t_0} \sum_{t + t_0 = 0}^{t_0 - 1} x[t + t_0] z^{-(t + t_0)}
$$

It's best to simply memorize the cases of delaying by $1$ and $2$, because they'll be used later,

$$
x[t - 1] \xrightarrow{\mathcal{Z}} z^{-1} X(z) + x[-1] \\
x[t - 2] \xrightarrow{\mathcal{Z}} z^{-2} X(z) + x[-1] z^{-1} + x[-2]
$$

For the forward (advance) operators,

$$
x[t + 1] \xrightarrow{\mathcal{Z}} z X(z) - x[0] z \\
x[t + 2] \xrightarrow{\mathcal{Z}} z^{2} X(z) - x[0] z^{2} - x[1] z
$$
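To see why these identities are worth memorizing, here is a small worked sketch (the equation, the pair $a^t u[t] \xrightarrow{\mathcal{Z}} \frac{z}{z - a}$, and the initial value $y[-1]$ are all assumed for illustration). Consider the first-order difference equation $y[t] - a\, y[t - 1] = \delta[t]$ with a given $y[-1]$. Applying the single-sided transform with the delay-by-$1$ identity,

$$
Y(z) - a \left( z^{-1} Y(z) + y[-1] \right) = 1 \\
Y(z) = \frac{1 + a\, y[-1]}{1 - a z^{-1}} = \left( 1 + a\, y[-1] \right) \frac{z}{z - a} \\
y[t] = \left( 1 + a\, y[-1] \right) a^t \quad t \geq 0
$$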

Z-Scale

$$
x[t] \xrightarrow{\mathcal{Z}} X(z) \quad |z| > e^{\sigma_0}
$$

$$
\sum_{t = 0}^{+\infty} x[t] a^{-t} z^{-t} \\
= \sum_{t = 0}^{+\infty} x[t] (az)^{-t} \\
= X(az)
$$

$$
a^{-t} x[t] \xrightarrow{\mathcal{Z}} X(az) \quad |az| > e^{\sigma_0}
$$

Similarly, for the two-sided transform,

$$
a^{-t} x[t] \xrightarrow{\pm \mathcal{Z}} X(az) \quad e^{\sigma_0} < |az| < e^{\beta_0}
$$
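For example, with the assumed standard pair $u[t] \xrightarrow{\mathcal{Z}} \frac{z}{z - 1}$, $|z| > 1$, the z-scale property gives,

$$
a^{-t} u[t] \xrightarrow{\mathcal{Z}} \frac{az}{az - 1} \quad |az| > 1
$$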

Z-Derivative

$$
x[t] \xrightarrow{\mathcal{Z}} X(z)
$$

That is,

$$
X(z) = \sum_{t = 0}^{+\infty} x[t] z^{-t}
$$

Taking the derivative with respect to $z$,

$$
\frac{\mathrm{d} X(z)}{\mathrm{d} z} = -\sum_{t = 0}^{+\infty} t\, x[t] z^{-t - 1}
$$

$$
-z \frac{\mathrm{d} X(z)}{\mathrm{d} z} = \sum_{t = 0}^{+\infty} t\, x[t] z^{-t}
$$

And thus,

$$
t\, x[t] \xrightarrow{\mathcal{Z}} -z \frac{\mathrm{d} X(z)}{\mathrm{d} z}
$$

The two-sided case is identical,

$$
t\, x[t] \xrightarrow{\pm \mathcal{Z}} -z \frac{\mathrm{d} X(z)}{\mathrm{d} z}
$$
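For example, applying this to the assumed standard pair $a^t u[t] \xrightarrow{\mathcal{Z}} \frac{z}{z - a}$, $|z| > |a|$,

$$
t\, a^t u[t] \xrightarrow{\mathcal{Z}} -z \frac{\mathrm{d}}{\mathrm{d} z} \frac{z}{z - a} = \frac{az}{(z - a)^2} \quad |z| > |a|
$$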

Initial and Final Value Theorems

Suppose the sequence is right-sided, that is, there is an $M$ such that,

$$
x[t] = 0 \quad t < M
$$

And,

$$
X(z) = \sum_{t = M}^{+\infty} x[t] z^{-t}
$$

Thus,

$$
\lim_{z \to +\infty} X(z) z^{M} \\
= \lim_{z \to +\infty} \sum_{t = M}^{+\infty} x[t] z^{-(t - M)} \\
= x[M]
$$

That is,

$$
x[M] = \lim_{z \to +\infty} X(z) z^{M}
$$

For causal signals,

$$
x[0] = \lim_{z \to +\infty} X(z)
$$

As for the final value theorem, consider,

$$
X(z) = \sum_{t = M}^{+\infty} x[t] z^{-t}
$$

And,

$$
\lim_{z \to 1} (z - 1) X(z) \\
= \lim_{z \to 1} \sum_{t = M}^{+\infty} x[t] \left( z^{-(t - 1)} - z^{-t} \right) \\
= \lim_{z \to 1} \left( x[M] z^{-(M - 1)} + \sum_{t = M}^{+\infty} \left( x[t + 1] - x[t] \right) z^{-t} \right) \\
= x[M] + \left( x[+\infty] - x[M] \right) \\
= x[+\infty]
$$

That is to say,

$$
x[+\infty] = \lim_{z \to 1} (z - 1) X(z)
$$

This requires a convergence condition of,

$$
e^{\sigma} < |z|
$$

Where,

$$
e^{\sigma} \leq 1
$$

This is because we are letting $z$ approach one, so $z = 1$ must be reachable from within the region of convergence. More precisely, for $x[+\infty]$ to exist, every pole of $X(z)$ must lie strictly inside the unit circle, except possibly a simple pole at $z = 1$.

In conclusion,

The initial value theorem is,

$$
x[M] = \lim_{z \to +\infty} X(z) z^{M}
$$

The final value theorem is,

$$
x[+\infty] = \lim_{z \to 1} (z - 1) X(z)
$$
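As a quick sanity check (a standard example, not taken from the original): for $x[t] = u[t]$ we have $X(z) = \frac{z}{z - 1}$ with $|z| > 1$, and

$$
x[0] = \lim_{z \to +\infty} \frac{z}{z - 1} = 1 \qquad
x[+\infty] = \lim_{z \to 1} (z - 1) \frac{z}{z - 1} = 1
$$

By contrast, for $x[t] = 2^t u[t]$ the formula would blindly give $\lim_{z \to 1} (z - 1) \frac{z}{z - 2} = 0$ even though the sequence diverges; this is exactly the case ruled out by the convergence condition above.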

Convolution

Consider,

$$
x[t] \xrightarrow{\mathcal{Z}} X(z) \quad e^{\sigma_0} < |z| \\
y[t] \xrightarrow{\mathcal{Z}} Y(z) \quad e^{\sigma_1} < |z|
$$

Then, we define a single-sided convolution,

$$
x[t] \ast y[t] = \sum_{m = 0}^{+\infty} x[t - m] y[m]
$$

Performing the z transform on both sides,

$$
\mathcal{Z}(x[t] \ast y[t]) = \sum_{t = 0}^{+\infty} \sum_{m = 0}^{+\infty} x[t - m] y[m] z^{-t} \\
= \sum_{m = 0}^{+\infty} \sum_{t = 0}^{+\infty} x[t - m] z^{-(t - m)}\, y[m] z^{-m} \\
= X(z) Y(z)
$$

where the last step uses $x[t - m] = 0$ for $t < m$ (the signals are taken to be causal), so the inner sum over $t$ reduces to $X(z)$.

And thus,

$$
x[t] \ast y[t] \xrightarrow{\mathcal{Z}} X(z) Y(z) \quad \max(e^{\sigma_0}, e^{\sigma_1}) < |z|
$$

For the two-sided transform,

$$
x[t] \ast y[t] \xrightarrow{\pm \mathcal{Z}} X(z) Y(z) \quad \max(e^{\sigma_0}, e^{\sigma_1}) < |z| < \min(e^{\beta_0}, e^{\beta_1})
$$

In both cases, the region of convergence is at least the intersection of the two original regions of convergence.
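A small numerical sketch of the single-sided case (the sequences and the test point $z_0$ are arbitrary example values): for finite-length causal signals the single-sided transform is just a polynomial in $z^{-1}$, so we can compare the transform of the convolution with the product of the two transforms at any test point.

```python
import numpy as np

def ztrans(x, z):
    """Single-sided z transform of a finite causal sequence: sum of x[t] * z**(-t)."""
    t = np.arange(len(x))
    return np.sum(x * z ** (-t.astype(float)))

# Two short causal sequences (arbitrary example values).
x = np.array([1.0, 2.0, 0.5])
y = np.array([3.0, -1.0, 4.0, 0.25])

z0 = 1.3 + 0.7j                        # an arbitrary test point
lhs = ztrans(np.convolve(x, y), z0)    # Z{x * y}
rhs = ztrans(x, z0) * ztrans(y, z0)    # X(z) Y(z)
print(np.isclose(lhs, rhs))            # True
```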