Complex Numbers
\(\mathbb{C} = \{x + y \cdot i \mid x, y \in \mathbb{R}\}\) (set of complex numbers)
\(i^2 = -1\)
\(e^{i\theta} = \cos(\theta) + i \cdot \sin(\theta)\)
Euler’s identity: \(e^{i\pi} = -1\)
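A quick numeric sanity check of Euler's formula using Python's standard `cmath` module (illustrative only, not part of the notes):

```python
import cmath
import math

# Euler's formula: e^{i*theta} = cos(theta) + i*sin(theta)
theta = 0.7  # arbitrary angle in radians
z = cmath.exp(1j * theta)
assert abs(z - complex(math.cos(theta), math.sin(theta))) < 1e-12

# Euler's identity: e^{i*pi} = -1 (up to floating-point rounding)
assert abs(cmath.exp(1j * math.pi) + 1) < 1e-12
```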
Elementary Functions
Exponential function
\(\exp : \mathbb{R} \rightarrow \mathbb{R}^+\)
We write: \(\exp(x) = e^x\)
- \(e^0 = 1\)
- \(\forall x,y \in \mathbb{R}: e^{x+y} = e^x \cdot e^y\)
- \(\forall x \in \mathbb{R}: e^x \neq 0\) and \(e^{-x} = \frac{1}{e^x}\)
- \(\forall x \in \mathbb{R}: e^x > 0\)
- \(\exp\) is strictly monotone increasing
- \(\exp: \mathbb{R} \rightarrow (0,\infty)\) is bijective
- \((e^x)' = e^x\)
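The functional equation and the self-derivative property can be checked numerically; a small sketch with arbitrarily chosen test values:

```python
import math

# Check e^(x+y) = e^x * e^y at arbitrary points
x, y = 1.3, -0.7
lhs = math.exp(x + y)
rhs = math.exp(x) * math.exp(y)
assert abs(lhs - rhs) < 1e-12

# A central difference quotient approximates (e^x)', which should equal e^x
h = 1e-6
deriv = (math.exp(x + h) - math.exp(x - h)) / (2 * h)
assert abs(deriv - math.exp(x)) < 1e-6
```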
Logarithmic function
\(\ln : (0,\infty) \rightarrow \mathbb{R}\)
The natural logarithm is the inverse function of the exponential:
- \(e^{\ln(x)} = x\) for all \(x > 0\)
- \(\ln(e^x) = x\) for all \(x \in \mathbb{R}\)
- \(\ln(x \cdot y) = \ln(x) + \ln(y)\) for \(x, y > 0\)
- \(\ln(x/y) = \ln(x) - \ln(y)\) for \(x, y > 0\)
- \(\ln(x^a) = a \cdot \ln(x)\) for \(x > 0\)
- \(\ln(1) = 0, \quad \ln(e) = 1\)
- \((\ln(x))' = \frac{1}{x}\)
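The inverse relationship and the logarithm laws can likewise be verified numerically (a sketch with arbitrary positive test values):

```python
import math

a, b = 2.5, 7.0  # arbitrary positive reals

# ln is the inverse of exp
assert abs(math.exp(math.log(a)) - a) < 1e-12

# Product and power rules for ln
assert abs(math.log(a * b) - (math.log(a) + math.log(b))) < 1e-12
assert abs(math.log(a ** 3) - 3 * math.log(a)) < 1e-12
```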
Trigonometric functions
\(\sin, \cos : \mathbb{R} \rightarrow [-1,1]\)
- \(\sin^2(x) + \cos^2(x) = 1\) (Pythagorean identity)
- \(\sin(x + y) = \sin(x)\cos(y) + \cos(x)\sin(y)\)
- \(\cos(x + y) = \cos(x)\cos(y) - \sin(x)\sin(y)\)
- \((\sin(x))' = \cos(x)\)
- \((\cos(x))' = -\sin(x)\)
- Domain: $\mathbb{R}$, Period: $2\pi$
\[\tan(x) = \frac{\sin(x)}{\cos(x)}, \quad x \neq \frac{\pi}{2} + k\pi, k \in \mathbb{Z}\]
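A numeric spot check of the Pythagorean identity, the sine addition formula, and the \(2\pi\)-periodicity (angles chosen arbitrarily):

```python
import math

x, y = 0.8, 2.1  # arbitrary angles in radians

# Pythagorean identity
assert abs(math.sin(x) ** 2 + math.cos(x) ** 2 - 1) < 1e-12

# Sine addition formula
lhs = math.sin(x + y)
rhs = math.sin(x) * math.cos(y) + math.cos(x) * math.sin(y)
assert abs(lhs - rhs) < 1e-12

# Period 2*pi: sin(x + 2*pi) = sin(x)
assert abs(math.sin(x + 2 * math.pi) - math.sin(x)) < 1e-12
```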
Sequences and Limits
Definition of a Sequence
A sequence is a function \(a: \mathbb{N} \rightarrow \mathbb{R}\)
Notation: \((a_n)_{n \in \mathbb{N}} \text{ or } (a_n)_{n=1}^{\infty}\)
Convergence of Sequences
A sequence $(a_n)$ converges to $L \in \mathbb{R}$ if:
\(\forall \epsilon > 0 \, \exists N \in \mathbb{N} : \forall n > N : |a_n - L| < \epsilon\)
We write: \(\lim_{n \to \infty} a_n = L\)
Properties:
- If $\lim a_n = L$ and $\lim b_n = M$, then $\lim (a_n + b_n) = L + M$
- $\lim (a_n \cdot b_n) = L \cdot M$
- If $M \neq 0$: $\lim \frac{a_n}{b_n} = \frac{L}{M}$
- Squeeze theorem: If $a_n \leq c_n \leq b_n$ and $\lim a_n = \lim b_n = L$, then $\lim c_n = L$
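The \(\epsilon\)-\(N\) definition can be made concrete for \(a_n = \frac{1}{n}\) with limit \(L = 0\): any \(N \geq \frac{1}{\epsilon}\) works, since \(n > N\) implies \(\frac{1}{n} < \epsilon\). A small sketch (the helper name `witness_N` is ours, purely illustrative):

```python
import math

def witness_N(eps):
    """Return an N such that n > N implies |1/n - 0| < eps."""
    return math.ceil(1 / eps)

eps = 1e-3
N = witness_N(eps)
# Spot-check the definition on a range of indices beyond N
assert all(abs(1 / n) < eps for n in range(N + 1, N + 1000))
```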
Monotone Sequences
Monotone Convergence Theorem:
- Every monotone increasing sequence that is bounded above converges
- Every monotone decreasing sequence that is bounded below converges
Special Important Limits
\(\lim_{n \to \infty} \left(1 + \frac{1}{n}\right)^n = e\)
\(\lim_{n \to \infty} \frac{a^n}{n!} = 0 \text{ for any } a > 0\)
\(\lim_{n \to \infty} \sqrt[n]{n} = 1\)
Series
Definition
An infinite series is:
\(\sum_{n=1}^{\infty} a_n = a_1 + a_2 + a_3 + \ldots\)
The $N$-th partial sum is: \(S_N = \sum_{n=1}^{N} a_n\)
A series converges if $\lim_{N \to \infty} S_N$ exists and is finite.
Geometric Series
\(\sum_{n=0}^{\infty} r^n = \frac{1}{1-r} \quad \text{for } |r| < 1\)
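A numeric check that the partial sums approach the closed form \(\frac{1}{1-r}\) (with an arbitrary choice of \(r\)):

```python
# Geometric series: partial sums vs. the closed form 1/(1-r), |r| < 1
r = 0.5
partial = sum(r ** n for n in range(50))
closed_form = 1 / (1 - r)
# The tail r^50 / (1-r) is already far below the tolerance
assert abs(partial - closed_form) < 1e-12
```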
Convergence Tests
- Divergence Test: If $\lim a_n \neq 0$ (or the limit does not exist), then $\sum a_n$ diverges
- Comparison Test: If $0 \leq a_n \leq b_n$ and $\sum b_n$ converges, then $\sum a_n$ converges
- Ratio Test: Let \(L = \lim_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right|\)
  - If $L < 1$: series converges absolutely
  - If $L > 1$: series diverges
  - If $L = 1$: test is inconclusive
- Root Test: Let \(L = \lim_{n \to \infty} \sqrt[n]{|a_n|}\)
  - If $L < 1$: series converges absolutely
  - If $L > 1$: series diverges
  - If $L = 1$: test is inconclusive
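As a worked ratio-test example: for \(\sum \frac{a^n}{n!}\) the ratio is \(\left|\frac{a_{n+1}}{a_n}\right| = \frac{a}{n+1} \to 0 < 1\), so the series converges absolutely (its sum is in fact \(e^a\)). A numeric sketch:

```python
import math

# Sum a^n / n! for a = 3; the ratio test gives L = 0, so it converges,
# and the partial sums approach e^a
a = 3.0
partial = sum(a ** n / math.factorial(n) for n in range(30))
assert abs(partial - math.exp(a)) < 1e-9
```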
Harmonic Series
\(\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \ldots \quad \text{DIVERGES}\)
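The divergence is slow: the partial sums \(H_N\) grow roughly like \(\ln(N) + \gamma\) (with \(\gamma \approx 0.5772\), the Euler–Mascheroni constant). A numeric illustration:

```python
import math

def H(N):
    """N-th partial sum of the harmonic series."""
    return sum(1 / n for n in range(1, N + 1))

# H_N keeps growing without bound, approximately like ln(N) + 0.5772
assert H(10 ** 5) > 12
assert abs(H(10 ** 5) - (math.log(10 ** 5) + 0.5772)) < 0.01
```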
Power Series
\(\sum_{n=0}^{\infty} c_n(x - a)^n = c_0 + c_1(x-a) + c_2(x-a)^2 + \ldots\)
has a radius of convergence $R \in [0, \infty]$; it converges for all $x$ with $|x - a| < R$ and diverges for all $x$ with $|x - a| > R$.
Limits and Continuity
Limit of a Function
\(\lim_{x \to a} f(x) = L\)
if: \(\forall \epsilon > 0 \, \exists \delta > 0 : 0 < |x - a| < \delta \Rightarrow |f(x) - L| < \epsilon\)
Continuity
A function $f$ is continuous at $a$ if:
\(\lim_{x \to a} f(x) = f(a)\)
Properties of continuous functions:
- Sums, products, and quotients of continuous functions are continuous
- Composition of continuous functions is continuous
- Intermediate Value Theorem: If $f$ is continuous on $[a,b]$ and $y$ is between $f(a)$ and $f(b)$, then $\exists c \in (a,b)$ such that $f(c) = y$
- Extreme Value Theorem: If $f$ is continuous on $[a,b]$, then $f$ attains its maximum and minimum values on $[a,b]$
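The Intermediate Value Theorem is what makes the bisection method work: if \(f\) is continuous and \(f(a)\), \(f(b)\) have opposite signs, a root lies in \((a,b)\). A minimal sketch, finding \(\sqrt{2}\) as the root of \(f(x) = x^2 - 2\):

```python
def bisect(f, a, b, tol=1e-10):
    """Locate a root of continuous f in [a, b], assuming f(a)*f(b) <= 0."""
    while b - a > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m  # sign change in [a, m]
        else:
            a = m  # sign change in [m, b]
    return (a + b) / 2

root = bisect(lambda x: x * x - 2, 1.0, 2.0)
assert abs(root - 2 ** 0.5) < 1e-9
```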
Differentiation
Definition of the Derivative
\(f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}\)
If this limit exists, we say $f$ is differentiable at $a$.
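The definition can be observed numerically: for shrinking \(h\), the difference quotient approaches the derivative. Here \(f = \sin\), whose derivative at \(a\) is \(\cos(a)\):

```python
import math

f, a = math.sin, 1.0
for h in (1e-2, 1e-4, 1e-6):
    quotient = (f(a + h) - f(a)) / h
    # The error of the one-sided quotient shrinks roughly linearly in h
    assert abs(quotient - math.cos(a)) < 10 * h
```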
Differentiation Rules
- Sum Rule: $(f + g)' = f' + g'$
- Product Rule: $(f \cdot g)' = f' \cdot g + f \cdot g'$
- Quotient Rule: $\left(\frac{f}{g}\right)' = \frac{f' \cdot g - f \cdot g'}{g^2}$
- Chain Rule: $(f \circ g)'(x) = f'(g(x)) \cdot g'(x)$
- Power Rule: $(x^n)' = n \cdot x^{n-1}$
Mean Value Theorem
If $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$, then $\exists c \in (a,b)$ such that:
\(f'(c) = \frac{f(b) - f(a)}{b - a}\)
Monotonicity and Local Extrema
- If $f'(x) > 0$ on an interval, then $f$ is strictly increasing on that interval
- If $f'(x) < 0$ on an interval, then $f$ is strictly decreasing on that interval
- If $f'(a) = 0$ and $f''(a) < 0$, then $f$ has a local maximum at $a$
- If $f'(a) = 0$ and $f''(a) > 0$, then $f$ has a local minimum at $a$
Integration
Riemann Integral
The Riemann integral of $f$ on $[a,b]$ is:
\(\int_a^b f(x) \, dx = \lim_{n \to \infty} \sum_{i=1}^{n} f(\xi_i) \Delta x_i\)
where $a = x_0 < x_1 < \ldots < x_n = b$ is a partition, $\xi_i \in [x_{i-1}, x_i]$ are sample points, $\Delta x_i = x_i - x_{i-1}$, and the mesh $\max_i \Delta x_i \to 0$.
If this limit exists, $f$ is Riemann integrable on $[a,b]$.
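A Riemann sum can be computed directly; this sketch uses $n$ equal subintervals with midpoints as the sample points $\xi_i$ and approximates $\int_0^1 x^2 \, dx = \frac{1}{3}$:

```python
def riemann_sum(f, a, b, n):
    """Midpoint Riemann sum of f over [a, b] with n equal subintervals."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 1000)
assert abs(approx - 1 / 3) < 1e-6
```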
Fundamental Theorem of Calculus
Part 1: If $f$ is continuous on $[a,b]$, then:
\(F(x) = \int_a^x f(t) \, dt\)
is differentiable and $F'(x) = f(x)$.
Part 2: If $F$ is an antiderivative of $f$ on $[a,b]$, then:
\(\int_a^b f(x) \, dx = F(b) - F(a)\)
Integration Rules
- Linearity: $\int (af + bg) = a\int f + b\int g$
- Integration by parts: $\int u \, dv = uv - \int v \, du$
- Substitution: $\int f(g(x)) \cdot g'(x) \, dx = \int f(u) \, du$ where $u = g(x)$
Standard Antiderivatives
\(\int x^n \, dx = \frac{x^{n+1}}{n+1} + C \quad (n \neq -1)\)
\(\int \frac{1}{x} \, dx = \ln|x| + C\)
\(\int e^x \, dx = e^x + C\)
\(\int \sin(x) \, dx = -\cos(x) + C\)
\(\int \cos(x) \, dx = \sin(x) + C\)
\(\int \frac{1}{1+x^2} \, dx = \arctan(x) + C\)
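These antiderivatives tie back to the Fundamental Theorem (Part 2): a Riemann sum for $\int_0^\pi \sin(x)\,dx$ should match $[-\cos(x)]_0^\pi = 2$. A closing sanity check:

```python
import math

def riemann(f, a, b, n=10_000):
    """Midpoint Riemann sum of f over [a, b]."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

a, b = 0.0, math.pi
exact = -math.cos(b) - (-math.cos(a))   # F(b) - F(a) with F = -cos
assert abs(riemann(math.sin, a, b) - exact) < 1e-6
```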