Generalising the Factorial

Recall that for any n \in \mathbb N_0 and s \in \mathbb R^+, F_n(s) := \mathcal L\{t^n\} = n!/s^{n+1}. Setting s = 1, we obtain n! = F_n(1), and consequently, F_n(s) = F_n(1) \cdot s^{-(n+1)}. In particular, F_0(1) = 1.
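As a quick numerical sanity check (purely illustrative, and not part of the argument), the following Python sketch, which assumes the mpmath library, compares the integral defining F_n(s) against n!/s^{n+1} and against F_n(1) \cdot s^{-(n+1)} for a few arbitrary sample values of n and s.

```python
# Illustrative check of F_n(s) = L{t^n}(s) = n!/s^(n+1) and F_n(s) = F_n(1) * s^(-(n+1)),
# assuming the mpmath library; the sample values of n and s are arbitrary.
from mpmath import mp, quad, inf, exp, factorial

mp.dps = 25  # working precision (decimal places)

def F(n, s):
    # Laplace transform of t^n at s, computed by direct quadrature
    return quad(lambda t: t**n * exp(-s * t), [0, inf])

for n in (0, 1, 3):
    for s in (1, 2, 3.5):
        print(n, s, F(n, s), factorial(n) / s**(n + 1), F(n, 1) * s**(-(n + 1)))
```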

Problem 1. For \alpha \in \mathbb N_0, prove that F_\alpha(s) := \mathcal L\{t^\alpha\}(s) is well-defined for \mathrm{Re}(s) > 0.


Solution. Write s = \sigma + i\tau with \sigma > 0. Then

\begin{aligned} \mathcal L\{t^{\alpha}\} &=  \int_0^{\infty} t^{\alpha} e^{-st}\, \mathrm dt \\ &=  \int_0^{\infty} t^{\alpha} e^{-(\sigma + i \tau)t}\, \mathrm dt \\ &= \int_0^{\infty} t^{\alpha} e^{-\sigma t} e^{-i (\tau t)}\, \mathrm dt. \end{aligned}

Taking absolute values and applying the triangle inequality,

\begin{aligned}\left| \int_0^{\infty} t^{\alpha} e^{-st}\, \mathrm dt \right| &\leq \int_0^{\infty} t^{\alpha} e^{-\sigma t} \left|e^{-i (\tau t)}\right| \, \mathrm dt \\ &=\int_0^{\infty} t^{\alpha} e^{-\sigma t} \, \mathrm dt = F_\alpha(\sigma) = \frac{\alpha!}{\sigma^{\alpha+1}} < \infty,\end{aligned}

since |e^{-i (\tau t)}| = 1. Hence the integral converges absolutely, and F_\alpha(s) is well-defined.
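Purely as an illustration of this bound, here is a minimal numerical sketch, assuming the mpmath library; the test values \alpha = 4 and s = 1 + 0.5i are arbitrary.

```python
# Illustrative check that |L{t^a}(sigma + i*tau)| <= L{t^a}(sigma), assuming mpmath.
from mpmath import mp, quad, inf, exp, mpc, fabs

mp.dps = 25
a = 4                    # a fixed alpha in N_0 (arbitrary choice)
s = mpc(1.0, 0.5)        # sigma = 1.0 > 0, tau = 0.5 (arbitrary choice)

lhs = fabs(quad(lambda t: t**a * exp(-s * t), [0, inf]))     # |L{t^a}(s)|
rhs = quad(lambda t: t**a * exp(-s.real * t), [0, inf])      # L{t^a}(sigma) = F_a(sigma)
print(lhs, rhs, lhs <= rhs)                                  # expect True
```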

For \alpha \in \mathbb C and s \in \mathbb C, define F_\alpha(s) := \mathcal L\{t^\alpha\} (s) whenever the right-hand side is well-defined. For example, by Problem 1, F_\alpha(s) is well-defined for \alpha \in \mathbb N_0 and s \in \mathbb C such that \mathrm{Re}(s) > 0.

Problem 2. Prove that for \mathrm{Re}(\alpha) > -1 and \mathrm{Re}(s) > 0, F_\alpha(s) is well-defined. Furthermore, prove that

\displaystyle F_{\alpha+1}(s) = \frac{\alpha+1}{s} \cdot F_\alpha(s).

Deduce that F_\alpha(s) = F_\alpha(1) \cdot s^{-(\alpha+1)}.


Solution. Write \alpha = \beta + i \gamma with \beta > -1, and s = \sigma + i\tau with \sigma > 0. For t \in (0, \infty),

t^{\alpha} = t^{\beta + i \gamma} = t^{\beta} \cdot e^{i (\gamma \ln(t))}

implies that |t^{\alpha}| = t^{\beta}, so that |\mathcal L\{t^{\alpha}\}(s)| \leq \int_0^\infty t^{\beta} e^{-\sigma t}\, \mathrm dt. Since \beta > -1, the integrand is integrable near t = 0 and decays exponentially as t \to \infty, so the integral converges; hence F_\alpha(s) = \mathcal L\{t^\alpha\}(s) is well-defined (and so is F_{\alpha+1}(s), since \mathrm{Re}(\alpha + 1) > 0). Integrating by parts,

\begin{aligned}F_{\alpha+1}(s) &= \int_0^\infty t^{\alpha+1}e^{-st}\, \mathrm dt \\ &= \left[t^{\alpha+1} \cdot \frac{e^{-st}}{-s} \right]_0^\infty - \frac{\alpha+1}{-s} \int_0^\infty t^\alpha e^{-st}\, \mathrm dt \\ &= (0 - 0) + \frac{\alpha + 1}{s} F_\alpha(s) \\ &= \frac{\alpha + 1}{s} \cdot F_\alpha(s), \end{aligned}

where the boundary term vanishes because \mathrm{Re}(\alpha + 1) > 0 gives t^{\alpha+1} \to 0 as t \to 0^+, and \mathrm{Re}(s) > 0 gives t^{\alpha+1} e^{-st} \to 0 as t \to \infty.

On the other hand, by properties of Laplace transforms,

\displaystyle F_{\alpha+1}(s) = \mathcal L\{t \cdot t^{\alpha}\} = -\frac{\mathrm d}{\mathrm ds} F_\alpha(s).

Hence,

\displaystyle \frac{\mathrm d}{\mathrm ds} F_\alpha(s) = - \frac{\alpha + 1}{s} \cdot F_\alpha(s).

We can verify that F_\alpha(s) = F_\alpha(1) \cdot s^{-(\alpha+1)} satisfies the initial value problem

\displaystyle \frac{\mathrm dy}{\mathrm ds} = -\frac{\alpha+1}{s} \cdot y,\quad y(1) = F_\alpha(1).

Since F_\alpha(s) solves the same initial value problem, the existence and uniqueness theorem gives F_\alpha(s) = F_\alpha(1) \cdot s^{-(\alpha+1)}, as required.
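Here is a minimal numerical sketch of both claims, assuming the mpmath library; the test value \alpha = -1/2 and the sample points for s are arbitrary.

```python
# Illustrative check of F_{a+1}(s) = ((a+1)/s) * F_a(s) and F_a(s) = F_a(1) * s^(-(a+1)),
# assuming mpmath; alpha = -0.5 and the sample values of s are arbitrary.
from mpmath import mp, quad, inf, exp, mpf, mpc

mp.dps = 25
a = mpf(-0.5)            # any alpha with Re(alpha) > -1

def F(alpha, s):
    # F_alpha(s) = integral of t^alpha e^{-st} dt over (0, infinity)
    return quad(lambda t: t**alpha * exp(-s * t), [0, inf])

for s in (mpf(2), mpc(1.5, 0.5)):
    print(F(a + 1, s), (a + 1) / s * F(a, s))      # the recursion
    print(F(a, s), F(a, 1) * s**(-(a + 1)))        # the closed form
```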

Definition 1. For \mathrm{Re}(\alpha) > 0, define \Gamma_+(\alpha) := F_{\alpha - 1}(1). We remark that \Gamma_+(\alpha) \in \mathbb R^+ if \alpha \in \mathbb R^+, while for general \alpha with \mathrm{Re}(\alpha) > 0, \Gamma_+(\alpha) is simply a complex number.

Problem 3. Prove that \Gamma_+(1) = 1 and that for any \alpha \in \mathbb C with \mathrm{Re}(\alpha) > 0,

\Gamma_+(\alpha+1) = \alpha \cdot \Gamma_+(\alpha).

In particular, \Gamma_+(n) = (n-1)! for n \in \mathbb N^+.


Solution. By definition,

\displaystyle \Gamma_+(1) = F_0(1) = (\mathcal L\{1\})(1) = \frac 11 = 1.

Setting s = 1 in Problem 2, with \alpha replaced by \alpha - 1 (valid since \mathrm{Re}(\alpha - 1) > -1),

\Gamma_+(\alpha + 1) = F_{\alpha}(1) = \alpha \cdot F_{\alpha - 1}(1) = \alpha \cdot \Gamma_+(\alpha).

The final identity follows inductively from \Gamma_+(1) = (1-1)! = 0! = 1 and

\Gamma_+(n) = (n-1) \cdot \Gamma_+(n-1) = (n-1) \cdot (n-2)! = (n-1)!.
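The recursion and the factorial values can be spot-checked numerically; the sketch below assumes the mpmath library, computes \Gamma_+ from its defining integral, and uses an arbitrary test value \alpha = 2.7.

```python
# Illustrative check of Gamma_+(a+1) = a * Gamma_+(a) and Gamma_+(n) = (n-1)!,
# with Gamma_+ computed from its defining integral F_{a-1}(1); assumes mpmath.
from mpmath import mp, quad, inf, exp, factorial, mpf

mp.dps = 25

def gamma_plus(a):
    # Gamma_+(a) = F_{a-1}(1) = integral of t^(a-1) e^(-t) dt, for Re(a) > 0
    return quad(lambda t: t**(a - 1) * exp(-t), [0, inf])

a = mpf(2.7)                                      # arbitrary positive test value
print(gamma_plus(a + 1), a * gamma_plus(a))       # the two values should agree

for n in range(1, 6):
    print(n, gamma_plus(n), factorial(n - 1))     # Gamma_+(n) vs (n-1)!
```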

Problem 4. Construct an extension of \Gamma_+ to the gamma function \Gamma : \mathbb C \backslash \mathbb Z_{\leq 0} \to \mathbb C such that \Gamma|_{\mathbb R^+} = \Gamma_+ and for any \alpha \in \mathbb C \backslash \mathbb Z_{\leq 0}, \Gamma(\alpha + 1) = \alpha \cdot \Gamma(\alpha).


Solution. Define \Gamma(\alpha) := \Gamma_+(\alpha) whenever \mathrm{Re}(\alpha) > 0, and extend recursively by \Gamma(\alpha) := \Gamma(\alpha + 1)/\alpha whenever \mathrm{Re}(\alpha) \leq 0 and \alpha \notin \mathbb Z_{\leq 0}. We leave it as an exercise in induction to verify that if -n < \mathrm{Re}(\alpha) \leq -(n-1) for some n \in \mathbb N^+, the extension yields

\displaystyle \Gamma(\alpha) = \frac{\Gamma(\alpha+n)}{\alpha \cdot (\alpha+1) \cdot \cdots \cdot (\alpha+n-1)},

which is well-defined: since \alpha \notin \mathbb Z_{\leq 0}, none of the factors \alpha, \alpha + 1, \dots, \alpha + n - 1 vanishes, and \mathrm{Re}(\alpha + n) > 0, so \Gamma(\alpha + n) = \Gamma_+(\alpha + n) is defined. The crucial multiplicativity property is preserved: for n \geq 2 (the case n = 1 is immediate from \Gamma(\alpha) = \Gamma(\alpha+1)/\alpha),

\begin{aligned} \Gamma(\alpha+1) &= \frac{\Gamma((\alpha+1)+n-1)}{ (\alpha+1) \cdot \cdots \cdot ((\alpha+1)+n-2)} \\ &= \frac{\Gamma(\alpha+n)}{(\alpha+1) \cdot \cdots \cdot (\alpha+n-1)} \\ &= \alpha \cdot \frac{\Gamma(\alpha+n)}{\alpha \cdot (\alpha+1) \cdot \cdots \cdot (\alpha+n-1)} \\ &= \alpha \cdot \Gamma(\alpha). \end{aligned}
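The construction above translates directly into a short recursive routine. The sketch below assumes the mpmath library and uses its built-in gamma purely as an independent reference; the helper names gamma_plus and gamma_ext, and the test points, are mine.

```python
# Illustrative implementation of the extension: Gamma(a) = Gamma(a+1)/a until Re(a) > 0,
# with the base case computed from the integral Gamma_+(a) = F_{a-1}(1). Assumes mpmath;
# mpmath.gamma serves only as an independent reference value.
from mpmath import mp, quad, inf, exp, gamma, mpf, mpc

mp.dps = 25

def gamma_plus(a):
    # valid for Re(a) > 0
    return quad(lambda t: t**(a - 1) * exp(-t), [0, inf])

def gamma_ext(a):
    # recursive extension to Re(a) <= 0, for a not a non-positive integer
    if mpc(a).real > 0:
        return gamma_plus(a)
    return gamma_ext(a + 1) / a

for a in (mpf(-0.5), mpf(-2.3)):                  # arbitrary test points
    print(a, gamma_ext(a), gamma(a))              # the last two columns should agree
```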

Problem 5. Evaluate \Gamma(3/2).


Solution. By the recursion \Gamma(\alpha + 1) = \alpha \cdot \Gamma(\alpha) with \alpha = 1/2, we have \Gamma(3/2) = 1/2 \cdot \Gamma(1/2), so it suffices to evaluate the latter. Here,

\begin{aligned} \Gamma(1/2) &= \Gamma_+(1/2) \\ &= F_{-1/2}(1) \\ &= (\mathcal L\{t^{-1/2}\})(1) = \int_0^\infty t^{-1/2} e^{-t}\, \mathrm dt. \end{aligned}

For the latter, make the substitution u = t^{1/2} \Rightarrow \mathrm dt = 2u\, \mathrm du so that

\begin{aligned} \int_0^\infty t^{-1/2} e^{-t}\, \mathrm dt &= \int_0^\infty u^{-1} e^{-u^2} \cdot 2u\, \mathrm du \\ &= 2\int_0^{\infty}e^{-u^2}\, \mathrm du \\ &= \int_{-\infty}^{\infty} e^{-u^2}\, \mathrm du = \sqrt{\pi}, \end{aligned}

where the last two equalities use the evenness of e^{-u^2} and the standard Gaussian integral.

Therefore, \Gamma(3/2) = \frac 12 \sqrt{\pi}.
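A final numerical sanity check of this value, assuming the mpmath library:

```python
# Illustrative check that Gamma(1/2) = sqrt(pi) and Gamma(3/2) = sqrt(pi)/2, assuming mpmath.
from mpmath import mp, quad, inf, exp, sqrt, pi

mp.dps = 25

gamma_half = quad(lambda t: t**(-0.5) * exp(-t), [0, inf])    # Gamma(1/2) = F_{-1/2}(1)
print(gamma_half, sqrt(pi))                                    # should agree
print(0.5 * gamma_half, sqrt(pi) / 2)                          # Gamma(3/2)
```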

—Joel Kindiak, 7 Jun 25, 1243H
