Strong Church-Turing Thesis: The intuitive notion of “efficiently computable” is captured by functions efficiently computable by a TM.
Master Method
For $T(n) \le a \cdot T(\frac{n}{b}) + O(n^d)$:
- If $a = b^d$: $T(n) = O(n^d \log n)$
- If $a < b^d$: $T(n) = O(n^d)$
- If $a > b^d$: $T(n) = O(n^{\log_b a})$
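As a sanity check on the three cases, here is a minimal sketch (a hypothetical helper, not from the notes) that classifies a recurrence $T(n) \le a \cdot T(n/b) + O(n^d)$ by comparing $a$ with $b^d$:

```python
import math

def master_method(a: float, b: float, d: float) -> str:
    """Classify T(n) <= a*T(n/b) + O(n^d) into the three master-method cases."""
    if math.isclose(a, b ** d):
        return f"O(n^{d} log n)"             # a = b^d: equal work at every level
    if a < b ** d:
        return f"O(n^{d})"                   # a < b^d: the root level dominates
    return f"O(n^{math.log(a, b):.3f})"      # a > b^d: the leaves dominate

# Mergesort: T(n) <= 2T(n/2) + O(n), so a = b^d  ->  O(n log n)
print(master_method(2, 2, 1))
# Karatsuba multiplication: T(n) <= 3T(n/2) + O(n), so a > b^d  ->  O(n^{log_2 3})
print(master_method(3, 2, 1))
```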
Polynomial time and the class P
$DTIME(T(n)) = $ the set of languages decidable by an $O(T(n))$-time algorithm.
$P = \bigcup_{k \ge 1} DTIME(n^k)$ (union over all constants $k$)
$EXP = \bigcup_{k \ge 1} DTIME(2^{n^k})$ (union over all constants $k$)
P is a subset of EXP.
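A quick check of this containment, filling in the standard reasoning:

```latex
\[
  \text{For each fixed } k:\ n^k \le 2^{n} \text{ for all large } n
  \;\Rightarrow\; DTIME(n^k) \subseteq DTIME(2^{n}) \subseteq EXP,
  \quad\text{hence}\quad P = \bigcup_{k} DTIME(n^k) \subseteq EXP.
\]
```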
We can prove that polynomial-time algorithms cannot solve every decidable problem.
$HWTB$ is the language
Time Hierarchy Theorem
Let $T(n)$ be a time-constructible function and let $\epsilon > 0$. Then there is a problem that cannot be decided in time $T(n)$ but can be decided in time $T(n)^{1+\epsilon}$; that is, $DTIME(T(n))$ is a PROPER subset of $DTIME(T(n)^{1+\epsilon})$.
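One standard consequence, sketched here by plugging $T(n) = 2^n$ (time-constructible) and $\epsilon = 1$ into the theorem as stated:

```latex
\[
  P \subseteq DTIME(2^{n}) \subsetneq DTIME\!\big((2^{n})^{2}\big) = DTIME(2^{2n}) \subseteq EXP,
  \qquad\text{so } P \subsetneq EXP.
\]
```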
Space Complexity
- Use a two-tape TM with a read-only input tape and a separate work tape; only the work-tape cells count toward the space bound, which allows sublinear (e.g. logarithmic) space.
$DSPACE(S(n)) = $ the set of languages decidable by an $O(S(n))$-space algorithm.
$PSPACE = \bigcup_{k \ge 1} DSPACE(n^k)$; $L = DSPACE(\log n)$ (log space). $L$ is a subset of $PSPACE$.
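As an illustration of a log-space algorithm (my example, not from the notes): the only working storage below is a counter of $O(\log n)$ bits, while the input is merely scanned.

```python
def equal_zeros_and_ones(x: str) -> bool:
    """Decide whether a binary string has equally many 0s and 1s.

    The input is scanned left to right (read-only); the only working storage
    is the integer `balance`, which stays within [-n, n] and therefore fits
    in O(log n) bits -- a log-space algorithm in the two-tape model above.
    """
    balance = 0
    for bit in x:
        balance += 1 if bit == "1" else -1
    return balance == 0

print(equal_zeros_and_ones("0101"))  # True
print(equal_zeros_and_ones("0111"))  # False
```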
Space Hierarchy Theorem
For a space-constructible $S(n)$, $DSPACE(o(S(n)))$ is a PROPER subset of $DSPACE(S(n))$ (the space analogue of the time version, with an even tighter gap).
If a TM uses space $S(n) \ge \log_2 n$, then it decides its language in $2^{O(S(n))}$ time, because there are only $2^{O(S(n))}$ possible configurations and a halting computation never repeats one. Consequences:
- $L \subseteq P$
- $PSPACE \subseteq EXP$
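The configuration count behind this bound (a standard estimate, where $Q$ is the state set and $\Gamma$ the work-tape alphabet): a configuration consists of the state, the input-head position, the work-head position, and the work-tape contents, so

```latex
\[
  \#\text{configurations} \;\le\; |Q| \cdot n \cdot S(n) \cdot |\Gamma|^{S(n)}
  \;=\; 2^{O(S(n))} \qquad \text{when } S(n) \ge \log_2 n.
\]
```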
Circuit Complexity
A circuit family $\{C_n\}_{n \ge 1}$ has one circuit per input length, so a single (infinite) family answers the question for every input size.
- The size of a circuit is the total number of gates, counting the inputs as gates too.
- The circuit complexity of a language is the size, as a function of $n$, of the smallest circuit family that decides it.
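A concrete way to picture these definitions (a hypothetical encoding, not from the notes): a circuit as a topologically ordered list of gates, where the size counts every gate including the inputs.

```python
from typing import List, Tuple

# A gate is (kind, input_indices) with kind in {"IN", "NOT", "AND", "OR"};
# gates are listed in topological order and the last gate is the output.
Gate = Tuple[str, Tuple[int, ...]]

def evaluate(circuit: List[Gate], x: List[bool]) -> bool:
    """Evaluate a circuit on input bits x (one bit per "IN" gate, in order)."""
    values, next_input = [], 0
    for kind, ins in circuit:
        if kind == "IN":
            values.append(x[next_input])
            next_input += 1
        elif kind == "NOT":
            values.append(not values[ins[0]])
        elif kind == "AND":
            values.append(values[ins[0]] and values[ins[1]])
        else:  # "OR"
            values.append(values[ins[0]] or values[ins[1]])
    return values[-1]

# XOR on 2 bits has size 7: 2 inputs + 2 NOTs + 2 ANDs + 1 OR (inputs count as gates).
xor = [("IN", ()), ("IN", ()), ("NOT", (0,)), ("NOT", (1,)),
       ("AND", (0, 3)), ("AND", (1, 2)), ("OR", (4, 5))]
print(len(xor), evaluate(xor, [True, False]), evaluate(xor, [True, True]))  # 7 True False
```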
Circuit complexity is ALWAYS bounded by $O(2^n)$:
$f_A(x_1, x_2, \ldots, x_n) = (x_1 \wedge f_A(1, x_2, \ldots, x_n)) \vee (\neg x_1 \wedge f_A(0, x_2, \ldots, x_n))$
This has size recurrence $s(n) \le 2s(n-1) + 5$ (two recursive subcircuits plus a constant number of extra gates).
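Unrolling the recurrence confirms the $O(2^n)$ upper bound claimed above:

```latex
\[
  s(n) \;\le\; 2\,s(n-1) + 5
       \;\le\; 2^{\,n-1} s(1) + 5\big(2^{\,n-1} - 1\big)
       \;=\; O(2^{n}).
\]
```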
W.T.S.: there is a language $L$ whose circuit complexity is at least $2^n/(4n)$.
- Number of possible functions on $n$ bits is $2^{2^n}$
- Number of circuits of size $s$ is $\le 2^{4s\log s}$
Claim 2 (the circuit count): each gate is described by 2 bits for its type and $2\log s$ bits for which gates its inputs come from, so a size-$s$ circuit is described by at most $s(2 + 2\log s) \le 4s\log s$ bits.
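Combining the two counts finishes the argument (a sketch of the standard conclusion): take $s = 2^n/(4n)$; then

```latex
\[
  4 s \log s \;<\; 4 \cdot \frac{2^{n}}{4n} \cdot n \;=\; 2^{n}
  \quad\Longrightarrow\quad 2^{4 s \log s} \;<\; 2^{2^{n}},
\]
```

so circuits of size $2^n/(4n)$ cannot realize all $2^{2^n}$ functions on $n$ bits, and some language must have circuit complexity at least $2^n/(4n)$.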