
Pulse Signal and Its Representation

We introduce the pulse signal (also called the impulse or Dirac delta), a function that will appear frequently in later sections.

Definition

A pulse signal at time $0$ is written as,

$$\delta(t)$$

It is defined as the limit of a family of probability distributions, each with mean zero, whose mass concentrates at the origin: in the limit, the chance of $t$ being zero is one.

That is to say,

$$\delta(t) = \lim_{\epsilon \to 0} \text{distribution}_\epsilon(t)$$

Where $\epsilon > 0$ indexes the family and controls how tightly the distribution concentrates around $t = 0$.

For example,

$$\delta(t) := \lim_{\epsilon \to 0} \text{rect}_\epsilon(t)$$

Where,

$$\text{rect}_\epsilon(t) := \begin{cases} \frac{1}{2\epsilon} & |t| \leq \epsilon \\ 0 & \text{otherwise} \end{cases} \qquad \epsilon > 0$$
tip

$\text{rect}_\epsilon$ is also called the gate function. It is the uniform distribution on $[-\epsilon, \epsilon]$, where $\epsilon > 0$.

Or, via the Gaussian distribution,

$$\delta(t) := \lim_{\epsilon \to 0} \frac{1}{\sqrt{2\pi\epsilon^2}}\, e^{-\frac{t^2}{2\epsilon^2}}$$
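
As a quick illustration (not part of the definition), the sketch below evaluates both approximating families for shrinking $\epsilon$: the peak value grows without bound while values away from the origin vanish. The helper names `rect_eps` and `gauss_eps` are ours, written to mirror the formulas above.

```python
# Illustrative sketch: both families concentrate at t = 0 as epsilon shrinks.
import numpy as np

def rect_eps(t, eps):
    """Gate function: uniform density of height 1/(2*eps) on [-eps, eps]."""
    return 1.0 / (2.0 * eps) if abs(t) <= eps else 0.0

def gauss_eps(t, eps):
    """Zero-mean Gaussian density with standard deviation eps."""
    return np.exp(-t**2 / (2.0 * eps**2)) / np.sqrt(2.0 * np.pi * eps**2)

for eps in (1.0, 0.1, 0.01):
    # the peak value blows up like 1/eps, while the value at t = 0.5 dies off
    print(eps, rect_eps(0.0, eps), gauss_eps(0.0, eps), gauss_eps(0.5, eps))
```
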
tip

Whenever you need to prove a property of the pulse signal, keep in mind that $\delta$ is the limit of a family of probability distributions.

note

There is more than one way to define the $\delta$ signal. We use the probability-distribution limit as the definition; it can also be defined via test functions.

Obviously, $\delta(0) = +\infty$, and $\delta(t) = 0$ for $t \neq 0$.

We often draw the $\delta(t)$ signal as an arrow pointing upwards at $t = 0$, labeled $1$ for its unit area.

tip

In a physical sense, the dimension is,

$$[\delta(t)] = \frac{1}{[t]}$$

Properties

Even

This is clear because each approximating distribution ($\text{rect}_\epsilon$ and the Gaussian) is an even function of $t$; equivalently, $\delta(t) = \delta(-t) = 0$ for every $t \neq 0$.

Unit Area

$$\int_{-\infty}^{+\infty} \delta(t)\,\mathrm{d}t = 1$$

Because $\delta$ is the limit of probability distributions, each of which has unit area, the area under the curve is $1$. For the gate function, explicitly, $\int_{-\infty}^{+\infty} \text{rect}_\epsilon(t)\,\mathrm{d}t = \int_{-\epsilon}^{+\epsilon} \frac{1}{2\epsilon}\,\mathrm{d}t = 1$ for every $\epsilon > 0$.
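
A small numerical check of the unit-area property, using the Gaussian family as a stand-in for $\delta$ (the helper `gauss_eps` mirrors the formula from the definition; the chosen $\epsilon$ values are arbitrary):

```python
# Illustrative sketch: the Gaussian stand-in for delta keeps unit area as eps shrinks.
import numpy as np
from scipy.integrate import quad

def gauss_eps(t, eps):
    return np.exp(-t**2 / (2.0 * eps**2)) / np.sqrt(2.0 * np.pi * eps**2)

for eps in (0.1, 0.01, 0.001):
    # the tails beyond +-10*eps are negligible, so this approximates the full integral
    area, _ = quad(gauss_eps, -10.0 * eps, 10.0 * eps, args=(eps,))
    print(eps, area)   # ~1.0 every time
```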

Selective

$$\int_{-\infty}^{+\infty} \delta(t - t_0)\, f(t)\,\mathrm{d}t = f(t_0)$$

Consider,

$$\int_{-\infty}^{+\infty} \delta(t)\, f(t)\,\mathrm{d}t = \mathbb{E}_{t \sim \delta(t)}\big(f(t)\big) = f(0)$$

since all of the probability mass of $\delta$ sits at $t = 0$. The general statement then follows from the substitution $\tau = t - t_0$, which gives $\int_{-\infty}^{+\infty} \delta(\tau)\, f(\tau + t_0)\,\mathrm{d}\tau = f(t_0)$.
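
Numerically, the selective (sifting) property can be sanity-checked with a narrow Gaussian in place of $\delta$; the choices $f = \cos$, $t_0 = 0.7$ and $\epsilon = 10^{-3}$ below are illustrative assumptions, not part of the proof:

```python
# Illustrative sketch of the sifting property with a narrow Gaussian for delta.
import numpy as np
from scipy.integrate import quad

def gauss_eps(t, eps):
    return np.exp(-t**2 / (2.0 * eps**2)) / np.sqrt(2.0 * np.pi * eps**2)

f, t0, eps = np.cos, 0.7, 1e-3           # arbitrary test function and shift
val, _ = quad(lambda t: gauss_eps(t - t0, eps) * f(t), t0 - 10 * eps, t0 + 10 * eps)
print(val, f(t0))                        # both ~ cos(0.7) ~ 0.7648
```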

Derivative

$$\int_{-\infty}^{+\infty} f(t) \left(\frac{\mathrm{d}}{\mathrm{d}t}\right)^{n} \delta(t - t_0)\,\mathrm{d}t = (-1)^{n} \left(\frac{\mathrm{d}}{\mathrm{d}t}\right)^{n} f(t) \,\Bigg|_{t = t_0}$$

This follows easily from integration by parts. For $n = 1$, consider,

$$\begin{aligned}
\int_{-\infty}^{+\infty} f(t)\, \frac{\mathrm{d}}{\mathrm{d}t} \delta(t - t_0)\,\mathrm{d}t
&= \int_{-\infty}^{+\infty} f(t)\,\mathrm{d}\delta(t - t_0) \\
&= \Big(f(t)\,\delta(t - t_0)\Big)\Big|_{-\infty}^{+\infty} - \int_{-\infty}^{+\infty} \frac{\mathrm{d}}{\mathrm{d}t} f(t)\,\delta(t - t_0)\,\mathrm{d}t \\
&= -\,\frac{\mathrm{d}}{\mathrm{d}t} f(t)\,\Big|_{t = t_0}
\end{aligned}$$

Applying the same integration by parts recursively $n$ times gives the general result, picking up one factor of $(-1)$ each time.
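
A numerical illustration of the $n = 1$ case, differentiating the Gaussian stand-in for $\delta$ analytically (the test function $f = \sin$ and the values $t_0 = 0.5$, $\epsilon = 10^{-3}$ are arbitrary choices):

```python
# Illustrative sketch of the n = 1 derivative property with a Gaussian for delta.
import numpy as np
from scipy.integrate import quad

def dgauss_eps(t, eps):
    # exact derivative of the Gaussian density with standard deviation eps
    g = np.exp(-t**2 / (2.0 * eps**2)) / np.sqrt(2.0 * np.pi * eps**2)
    return -t / eps**2 * g

f, t0, eps = np.sin, 0.5, 1e-3           # arbitrary test function and shift
val, _ = quad(lambda t: f(t) * dgauss_eps(t - t0, eps), t0 - 10 * eps, t0 + 10 * eps)
print(val, -np.cos(t0))                  # both ~ -f'(t0) = -cos(0.5) ~ -0.8776
```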

Scale

$$\delta(\alpha t) = \frac{1}{|\alpha|}\,\delta(t)$$

This follows from the definition. First assume $\alpha > 0$.

We first calculate the CDF of $\alpha t$, where $t$ follows $\text{distribution}_\epsilon$:

$$F(t') = P(\alpha t \leq t') = P\!\left(t \leq \frac{t'}{\alpha}\right) = \int_{-\infty}^{\frac{t'}{\alpha}} \text{distribution}_\epsilon(t)\,\mathrm{d}t$$

Because the derivative of the CDF is the PDF,

$$p(t') = \frac{1}{\alpha}\,\text{distribution}_\epsilon\!\left(\frac{t'}{\alpha}\right)$$

We then let $\epsilon \to 0$, so,

$$\delta(\alpha t) = \frac{1}{\alpha}\,\delta(t)$$

If $\alpha < 0$, we can use the evenness of the delta function, which is where the absolute value $|\alpha|$ comes from.
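
A numerical illustration of the scaling property with a negative $\alpha$; the Gaussian stand-in for $\delta$, $f = \cos$ and $\alpha = -2.5$ are illustrative assumptions:

```python
# Illustrative sketch of the scaling property with a Gaussian for delta.
import numpy as np
from scipy.integrate import quad

def gauss_eps(t, eps):
    return np.exp(-t**2 / (2.0 * eps**2)) / np.sqrt(2.0 * np.pi * eps**2)

f, alpha, eps = np.cos, -2.5, 1e-3
half = 10.0 * eps / abs(alpha)           # gauss_eps(alpha*t) is negligible beyond this
val, _ = quad(lambda t: gauss_eps(alpha * t, eps) * f(t), -half, half)
print(val, f(0.0) / abs(alpha))          # both ~ 1 / 2.5 = 0.4
```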

Decomposition

Suppose a polynomial $P(x)$ can be decomposed into a product of linear terms $p_i = x - \zeta_i$, with the roots $\zeta_i$ all distinct.

That is,

$$P(x) = \prod_{i=0}^{n} p_i$$

Then,

$$\delta(P(x)) = \sum_{i=0}^{n} \frac{\delta(x - \zeta_i)}{|P'(\zeta_i)|}$$

This is hard to prove directly, so we argue indirectly, through the integral against a test function.

Consider,

$$\int_{-\infty}^{+\infty} \delta(P(x))\, f(x)\,\mathrm{d}x$$

Because $\delta(P(x))$ is zero except near the roots $\zeta_i$ of $P$, the integral reduces to a sum of contributions from small neighborhoods of the roots:

$$\sum_{i=0}^{n} \int_{\zeta_i - \epsilon_i}^{\zeta_i + \epsilon_i} \delta(P(x))\, f(x)\,\mathrm{d}x$$

We can use substitution if we let $\epsilon_i \to 0$. In that limit $P(x)$ is locally linear around each root, $P(x) \approx P'(\zeta_i)(x - \zeta_i)$, so,

$$\mathrm{d}x = \frac{1}{|P'(\zeta_i)|}\,\mathrm{d}P(x)$$

So,

$$\begin{aligned}
\int_{-\infty}^{+\infty} \delta(P(x))\, f(x)\,\mathrm{d}x
&= \sum_{i=0}^{n} \int_{\zeta_i - \epsilon_i}^{\zeta_i + \epsilon_i} \delta(P(x))\, f(x)\,\mathrm{d}x \\
&= \sum_{i=0}^{n} \int_{\zeta_i - \epsilon_i}^{\zeta_i + \epsilon_i} \frac{1}{|P'(\zeta_i)|}\, \delta(P(x))\, f(x)\,\mathrm{d}P(x) \\
&= \sum_{i=0}^{n} \frac{f(\zeta_i)}{|P'(\zeta_i)|}
\end{aligned}$$

Which is equivalent to,

$$\delta(P(x)) = \sum_{i=0}^{n} \frac{\delta(x - \zeta_i)}{|P'(\zeta_i)|}$$
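
As an illustrative check, take $P(x) = x^2 - 1$ with roots $\pm 1$ and $|P'(\pm 1)| = 2$, so the formula predicts $\int \delta(P(x))\, f(x)\,\mathrm{d}x = \frac{f(1) + f(-1)}{2}$. The sketch below verifies this numerically with a narrow Gaussian standing in for $\delta$ (the choice $f = e^x$ is arbitrary):

```python
# Illustrative sketch: delta composed with P(x) = x^2 - 1, roots at +-1, |P'| = 2.
import numpy as np
from scipy.integrate import quad

def gauss_eps(t, eps):
    return np.exp(-t**2 / (2.0 * eps**2)) / np.sqrt(2.0 * np.pi * eps**2)

P = lambda x: x**2 - 1.0
f, eps = np.exp, 1e-4                    # arbitrary test function
val = 0.0
for root in (-1.0, 1.0):
    # gauss_eps(P(x)) is only non-negligible within a few multiples of eps of each root
    part, _ = quad(lambda x: gauss_eps(P(x), eps) * f(x), root - 5e-3, root + 5e-3)
    val += part
print(val, (np.exp(1.0) + np.exp(-1.0)) / 2.0)   # both ~ 1.5431
```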

Integration

If we integrate $\delta(t)$, we get a step function,

info

By integrate, we always mean a definite integral from $-\infty$ up to a variable upper limit, so that integrating a function yields another function.

$$u(t) := \int_{-\infty}^{t} \delta(\tau)\,\mathrm{d}\tau$$

Or, alternatively,

$$u(t) = \begin{cases} 0 & \text{if } t < 0 \\ 1 & \text{if } t \geq 0 \end{cases}$$

The step function is another function that will appear frequently in later sections.
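
Numerically, running the integral of a Gaussian stand-in for $\delta$ up to a variable upper limit produces a smoothed step that tends to $u(t)$ as $\epsilon \to 0$ (the finite lower limit and $\epsilon = 0.01$ below are illustrative choices):

```python
# Illustrative sketch: the running integral of a Gaussian for delta approaches u(t).
import numpy as np
from scipy.integrate import quad

def gauss_eps(t, eps):
    return np.exp(-t**2 / (2.0 * eps**2)) / np.sqrt(2.0 * np.pi * eps**2)

eps = 0.01
for t in (-0.1, -0.05, 0.05, 0.1):
    # the lower limit -0.3 stands in for -infinity; the density is negligible below it
    u_approx, _ = quad(gauss_eps, -0.3, t, args=(eps,))
    print(t, round(u_approx, 4))         # ~0 for t < 0, ~1 for t > 0
```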

info

It is obvious that,

$$\text{rect}_\epsilon(t) = \frac{u(t + \epsilon) - u(t - \epsilon)}{2\epsilon}$$
info

More precisely,

$$u(t) = \begin{cases} 0 & \text{if } t < 0 \\ 1 & \text{if } t > 0 \\ \text{undefined} & \text{if } t = 0 \end{cases}$$

The value at $0$ depends on whether we are integrating $\delta(t - 0^{-})$ or $\delta(t - 0^{+})$, that is, on whether the impulse is taken to sit just before or just after the origin.