
ML1_1exam_WS2324

Work through ML1_1exam_WS2324 and compare your solutions. From the course Machine Learning at the Technische Universität Berlin (TU Berlin).

Section 1

Multiple Choice
Question 1
1 P

a
[Figure: a small neural network diagram with input x2, weights -1, 1, 1, 1, and an output node.]
Select an answer

Section 2

Free-text question
Question 2
1 P

a1 = 1, a2 = 0, a3 = 1, a4 = 1, output = 1

Your answer:

Section 3

Mixed
Multiple Choice
20 P

There is only one correct answer.


a
20 P

The expectation maximization algorithm [...]

Select an answer
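Only the stem of this question survives, so as a general reference (not the exam's answer options) here is a sketch of the Expectation-Maximization algorithm for a model with observed data x, latent variables z and parameters \theta:

E-step: \; q^{(t)}(z) = p(z \mid x, \theta^{(t)})
M-step: \; \theta^{(t+1)} = \arg\max_{\theta} \; \mathbb{E}_{q^{(t)}(z)}\!\left[\log p(x, z \mid \theta)\right]

Each iteration does not decrease the marginal log-likelihood \log p(x \mid \theta).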

b
20 P

Which of the following is true in the context of the bias-variance decomposition?

Select an answer
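For reference, the standard bias-variance decomposition of the expected squared error of an estimator \hat{f} at a point x, with y = f(x) + \epsilon and noise variance \sigma^2 (this notation is assumed here, not taken from the exam), is

\mathbb{E}\big[(y - \hat{f}(x))^2\big] = \big(f(x) - \mathbb{E}[\hat{f}(x)]\big)^2 + \mathbb{E}\big[(\hat{f}(x) - \mathbb{E}[\hat{f}(x)])^2\big] + \sigma^2,

i.e. squared bias plus variance plus irreducible noise.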

c
20 P

Why does PCA maximize eigenvalues?

Select an answer
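As a short reminder of where the eigenvalues come from (a standard derivation, not the exam's answer options): for centered data with covariance matrix S, the first principal direction solves

\max_{w : \|w\| = 1} \; w^\top S w \quad\Rightarrow\quad S w = \lambda w, \quad w^\top S w = \lambda,

so the variance captured along an eigenvector equals its eigenvalue, and PCA keeps the directions with the largest eigenvalues.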

d
20 P

In the soft-margin SVM, which of the following does the parameter C control?

Select an answer
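For reference (the standard primal formulation, not copied from the exam sheet), the soft-margin SVM is

\min_{w, b, \xi} \; \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{N} \xi_i \quad \text{s.t.}\quad y_i (w^\top x_i + b) \geq 1 - \xi_i, \;\; \xi_i \geq 0,

so C trades off margin width against the penalty for margin violations: a large C punishes slack heavily, while a small C allows a wider margin with more violations.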

Section 4

Mixed
Parameter Estimation
20 P

The average time to get a letter at the post office follows the distribution p(x|\theta) = \theta (1-\theta)^{x-1}. The variable x is a positive integer (x \in \mathbb{Z}^{+}), and \theta is a real number.


a
5 P

Define the likelihood function p(D|\theta).

Your answer:
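A sketch of one possible answer, assuming the N observations in D = \{x_1, \ldots, x_N\} are drawn i.i.d. from p(x|\theta):

p(D \mid \theta) = \prod_{i=1}^{N} \theta (1-\theta)^{x_i - 1} = \theta^{N} (1-\theta)^{\sum_{i=1}^{N} x_i - N}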

b
5 P

Calculate the likelihood of D = \{1, 1, 2, 1\}.

Your answer:
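Plugging D = \{1, 1, 2, 1\} (so N = 4 and \sum_i x_i = 5) into the likelihood above gives, as a sketch,

p(D \mid \theta) = \theta^{4} (1-\theta)^{5 - 4} = \theta^{4} (1-\theta).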

c
5 P

Now consider a Bayesian approach, with the following prior distribution: p(\theta) = 1 for \theta \in [0, 1], and p(\theta) = 0 elsewhere. Prove that the posterior can be written as p(\theta|D) = 30\, \theta^4 (1 - \theta).

Your answer:
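A sketch of the argument via Bayes' rule, using the flat prior on [0, 1] and the likelihood from b):

p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{\int_0^1 p(D \mid \theta')\, p(\theta')\, d\theta'} = \frac{\theta^4 (1-\theta)}{\int_0^1 \theta'^4 (1-\theta')\, d\theta'} = \frac{\theta^4 (1-\theta)}{\tfrac{1}{5} - \tfrac{1}{6}} = 30\, \theta^4 (1-\theta) \quad \text{for } \theta \in [0, 1].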

d
5 P

Evaluate the probability P(x > 1) using the posterior predictive \int p(x|\theta)\, p(\theta|D)\, d\theta.

Your answer:
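A sketch of the computation, assuming the posterior-predictive reading above: since P(x > 1 \mid \theta) = 1 - P(x = 1 \mid \theta) = 1 - \theta,

P(x > 1 \mid D) = \int_0^1 (1 - \theta)\, 30\, \theta^4 (1-\theta)\, d\theta = 30 \int_0^1 \theta^4 (1-\theta)^2\, d\theta = 30 \left( \tfrac{1}{5} - \tfrac{2}{6} + \tfrac{1}{7} \right) = \tfrac{2}{7}.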

Section 5

Mixed
Kernel
20 P

A function k : X \times X \rightarrow \mathbb{R} defined on a set X is called a Positive Semi-Definite (PSD) kernel if, for any finite set of points \{x_1, x_2, \ldots, x_n\} \subseteq X and any corresponding coefficients c_1, c_2, \ldots, c_n \in \mathbb{R}, the following condition holds: \sum_{i=1}^{n} \sum_{j=1}^{n} c_i c_j k(x_i, x_j) \geq 0 for all n \in \mathbb{N} and for all choices of \{x_1, \ldots, x_n\} and \{c_1, \ldots, c_n\}.


a
10 P

Given the following kernel: k_f(x, x') = f(x)\, k(x, x')\, f(x'), where k is a PSD kernel and f is an arbitrary real-valued function on X. Prove that k_f is a PSD kernel.

Your answer:
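A proof sketch: set d_i = c_i f(x_i). Then for any points x_1, \ldots, x_n and coefficients c_1, \ldots, c_n,

\sum_{i=1}^{n} \sum_{j=1}^{n} c_i c_j\, f(x_i)\, k(x_i, x_j)\, f(x_j) = \sum_{i=1}^{n} \sum_{j=1}^{n} d_i d_j\, k(x_i, x_j) \geq 0,

where the last inequality is exactly the PSD condition for k with coefficients d_1, \ldots, d_n.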

b
10 P

Show that the Gaussian kernel is also a PSD kernel, with k_f(x, x') = \exp(-\frac{\gamma}{2} \|x - x'\|^2). Also define the function f(x) for this case. Hint: you can use the kernel k(x, x') = \exp(\gamma\, x^\top x') and your answer from a).

Your answer:
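A derivation sketch, assuming the Gaussian kernel carries the usual minus sign as written above: expanding \|x - x'\|^2 = \|x\|^2 - 2\, x^\top x' + \|x'\|^2 gives

\exp\!\left(-\tfrac{\gamma}{2}\|x - x'\|^2\right) = \exp\!\left(-\tfrac{\gamma}{2}\|x\|^2\right) \cdot \exp\!\left(\gamma\, x^\top x'\right) \cdot \exp\!\left(-\tfrac{\gamma}{2}\|x'\|^2\right),

which has the form f(x)\, k(x, x')\, f(x') with f(x) = \exp(-\frac{\gamma}{2}\|x\|^2) and k(x, x') = \exp(\gamma\, x^\top x'). Since \exp(\gamma\, x^\top x') is PSD (its Taylor series is a sum of powers of the linear kernel with nonnegative coefficients), part a) gives that the Gaussian kernel is PSD.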

Section 6

Mixed
Exercise 5: Ridge Regression Using a Quadratic Solver
1 P

Given a labeled dataset ((x_1, y_1), \ldots, (x_N, y_N)) we consider the regularized regression problem
\min_{w} \; \|y - Xw\|^2
subject to 0 \leq w_i \leq C for all i, and \sum_i w_i \leq D,
with C, D \in \mathbb{R}, w \in \mathbb{R}^d and X \in \mathbb{R}^{N \times d}.


a
1 P

Show that this problem is equivalent to a problem of this type:
\min_{v} \; v^\top (X^\top X)\, v - 2\, y^\top X v, subject to the same constraints.

Your answer:
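A sketch of the equivalence: expanding the squared norm,

\|y - Xv\|^2 = y^\top y - 2\, y^\top X v + v^\top (X^\top X)\, v,

and since y^\top y does not depend on v, minimizing \|y - Xv\|^2 over the feasible set is the same as minimizing v^\top (X^\top X)\, v - 2\, y^\top X v subject to the same constraints.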

b
1 P

Implement code in Python to calculate w. You can use the cvxopt.solvers.qp solver, which already implements an optimization problem of the following form:
\min_{v} \; v^\top Q v - c^\top v \quad \text{s.t.}\quad A v \leq b

Your answer:
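A minimal Python sketch, assuming the reconstruction above (\min_w \|y - Xw\|^2 with 0 \leq w_i \leq C and \sum_i w_i \leq D). cvxopt.solvers.qp minimizes \frac{1}{2} w^\top P w + q^\top w subject to G w \leq h, so we set P = 2 X^\top X and q = -2 X^\top y. The helper name constrained_regression and the synthetic data at the bottom are illustrative, not part of the exam.

import numpy as np
from cvxopt import matrix, solvers

def constrained_regression(X, y, C, D):
    """Solve min_w ||y - Xw||^2  s.t.  0 <= w_i <= C and sum_i w_i <= D."""
    d = X.shape[1]  # number of features
    # ||y - Xw||^2 = w^T (X^T X) w - 2 y^T X w + const, so in cvxopt's
    # (1/2) w^T P w + q^T w form: P = 2 X^T X, q = -2 X^T y.
    P = matrix(2.0 * X.T @ X)
    q = matrix(-2.0 * X.T @ y)
    # Inequality constraints G w <= h, stacked as three blocks:
    #   -w <= 0 (i.e. w_i >= 0),   w <= C,   1^T w <= D
    G = matrix(np.vstack([-np.eye(d), np.eye(d), np.ones((1, d))]))
    h = matrix(np.hstack([np.zeros(d), C * np.ones(d), [float(D)]]))
    sol = solvers.qp(P, q, G, h)
    return np.array(sol['x']).ravel()

# Illustrative usage with synthetic data (values are made up)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = X @ np.array([0.4, 0.3, 0.0, 0.2]) + 0.01 * rng.normal(size=50)
print(constrained_regression(X, y, C=1.0, D=1.0))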

Section 7

Mixed
Neural Networks
20 P

Consider the following neural network with activation function: \text{step}(a_i) = \begin{cases} 1 & \text{if } a_i > 0 \\ 0 & \text{if } a_i \leq 0 \end{cases}


a
10 P

Give all weights and biases.

Your answer:

b
10 P

Describe the values of all activated neurons for x = (1, 1).

Your answer:
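The network diagram itself is not reproduced in this transcript, so the actual weights cannot be recovered. Purely as a hedged illustration of how such a forward pass is computed with the step activation, the sketch below uses made-up weights and biases (W1, b1, w2, b2 are hypothetical, not the exam's network):

import numpy as np

def step(a):
    # Step activation from the exercise: 1 if a > 0, else 0
    return (np.asarray(a) > 0).astype(float)

# Hypothetical parameters (NOT the exam's network; its diagram is missing)
W1 = np.array([[1.0, -1.0],
               [1.0,  1.0]])   # input-to-hidden weights, shape (2 inputs, 2 hidden units)
b1 = np.array([-0.5, -1.5])    # hidden biases
w2 = np.array([1.0, 1.0])      # hidden-to-output weights
b2 = -1.5                      # output bias

x = np.array([1.0, 1.0])
a_hidden = x @ W1 + b1          # pre-activations of the hidden neurons
z_hidden = step(a_hidden)       # activated hidden neurons
a_out = z_hidden @ w2 + b2      # pre-activation of the output neuron
output = step(a_out)
print(z_hidden, output)         # prints [1. 0.] and 0.0 for these made-up weights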