Other Subjects

Information for Students

For Students of Psychology

Institut Neuropsychologie und Klinische Psychologie

Scientific Director: Prof. Dr. Herta Flor
Tel.: 0621 1703-6302, E-Mail

Secretariat: Angelika Bauder
Tel.: 0621 1703-6302, E-Mail


Abteilung Klinische Psychologie

Head: Prof. Dr. Peter Kirsch
Tel.: 0621 1703-6501, -6511, E-Mail

Secretariat: Ellen Schmucker
Tel.: 0621 1703-6502, E-Mail


Instructors:


Lecture materials:

Date | Instructor | Topic / Download

WS 2017/18

For Students of Pharmacology

Institut für Psychopharmakologie

Scientific Director: Prof. Dr. Rainer Spanagel
Tel.: 0621 1703-6251, E-Mail

Secretariat: Christine Roggenkamp
Tel.: 0621 1703-6252, E-Mail


Instructors:


Course announcements:

For up-to-date information about the Psychopharmacology Seminar, please visit the Events page.

For Students of Statistics

Abteilung Biostatistik

Acting Head: Prof. Dr. Stefan Wellek
Tel.: 0621 1703-6001, E-Mail

Secretariat: Mireille Lukas
Tel.: 0621 1703-6002, E-Mail


Instructors:

For Students of the Life Sciences

Abteilung Molekularbiologie

Head: Prof. Dr. Dusan Bartsch
Tel.: 0621 1703-6202, E-Mail



Instructors:


Biochemisches Labor

Head: apl. Prof. Dr. Patrick Schloss
Tel.: 0621 1703-2901, E-Mail


Instructors:

For Students of Law

Forensische Psychiatrie

Head: apl. Prof. Dr. Harald Dreßing
Tel.: 0621 1703-2941, E-Mail

Secretariat: Martina Herbig
Tel.: 0621 1703-2381, E-Mail


Instructors:

For Students of Physics, Mathematics, and Computer Science

Theoretische Neurowissenschaften

Professor of Theoretical Neuroscience /
Head of Department: Prof. Dr. Daniel Durstewitz
Tel.: 0621 1703-2361, E-Mail

Secretariat: Christine Roggenkamp, M.A.
Tel.: 0621 1703-6252, E-Mail



Instructors:

Exercises:


Courses in the 2018 summer semester:

Up-to-date information about the lecture and its contents can be found here:

Computational Statistics and Data Analysis (MVComp2)

Wednesdays from 18 April to 25 July 2018 (= 15 sessions), INF 227 / SR 1.403
Lecture (2 hrs): Wed 11:00 c.t. to 13:00
Exercises (2 hrs): Wed 14:00 c.t. to 16:00

This lecture introduces students to basic methods and techniques in computational statistics and data analysis that are widely applicable to empirical problems in the natural sciences. It also provides an overview of relevant concepts and theorems in probability theory and mathematical statistics. The lecture is accompanied by computational exercises in Python or Matlab. It will enable students to analyze small and large data sets and interpret the results from a solid, theoretically grounded statistical perspective; to devise statistical models of experimental situations; to infer the parameters of these models from empirical observations; and to test hypotheses about them.
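The model-fitting workflow described above (devise a model, estimate its parameters, test a hypothesis) can be sketched in a few lines of Python, the course's exercise language. This is a minimal illustration with simulated data, not course material; all numbers are arbitrary.

```python
import numpy as np

# Simulate observations from a hypothetical experiment.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)

# Statistical model: data ~ N(mu, sigma^2).
# Maximum-likelihood estimates: sample mean and (biased) sample variance.
mu_hat = data.mean()
sigma2_hat = ((data - mu_hat) ** 2).mean()

# Asymptotic z-test of H0: mu = 0, using the estimated standard error.
se = np.sqrt(sigma2_hat / len(data))
z = (mu_hat - 0.0) / se
print(f"mu_hat={mu_hat:.3f}, sigma2_hat={sigma2_hat:.3f}, z={z:.1f}")
```

With 500 observations the estimates are close to the generating values, and the large z-statistic lets us reject H0 at any conventional level.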

Prerequisites

- Linear (Matrix) Algebra

- Basic calculus (derivatives & integrals)

- Basic programming skills in Python or Matlab

 

Literature

  • Wackerly, D.D., Mendenhall III, W., Scheaffer, R.L. (2008) Mathematical Statistics with Applications. Brooks/Cole.
  • Durstewitz, D. (2017) Advanced Data Analysis in Neuroscience: Integrating Statistical and Computational Models. Springer.
  • Hastie, T., Tibshirani, R., Friedman, J. (2009) The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer.
  • Bishop, C.M. (2007) Pattern Recognition and Machine Learning. Springer.

(Approximate) Table of Contents

 

The lecture will cover all of the general topics 1-15 listed below, though we may not be able to cover every detail within each topic.

All concepts will be introduced along particular, motivating examples and data sets from the natural sciences.

 

  1. Basic concepts in probability theory
    - concepts of probability, sample spaces, set operations on sample space, events; basic axioms & theorems of prob. theory, Bayes’ theorem; counting permutations, partitions & factorials
  2. Random variables; expectations, variances, co-variances, and their properties
  3. Discrete & continuous probability distributions
    - uniform distribution, Bernoulli, binomial distribution, hypergeometric distribution, Poisson distribution and approximation of binomial; Gaussian, Gamma, Beta, [t, F, χ2] distribution; exponential family distributions; conjugate priors
  4. Moment-generating functions and multivariate distributions
    - Tchebysheff’s theorem; moment-generating functions; multi-categorical, multinomial distribution, multivariate normal distribution, [(inverse) Wishart]; conditional and marginal probabilities for the multivariate normal, covariance matrices
  5. Statistical models & inference I
    - empirical samples and population, characteristics of parameters estimates (bias, variance, sufficiency, …)
  6. Statistical models & inference II
    - principles of parameter estimation (Maximum Likelihood, Bayesian inference), numerical methods (Newton-Raphson), [sampling techniques]
  7. Hypothesis tests I
    - test construction, rejection region, confidence intervals; exact tests, asymptotic tests (t- /F- /χ2-test), central limit theorem
  8. Hypothesis tests II
    - Log-likelihood-ratio principle; bootstrap- & permutation-based methods; multiple testing problem
  9. Linear regression
    - General Linear Model; outliers, leverage and robust regression
  10. Ridge & LASSO regression for high-dimensional data, regularization techniques
  11. Nonlinear regression (sect. 2.5, 2.6)
    - Locally Linear Regression, basis expansions & splines, a note on neural networks & big data
  12. Classification I
    - Linear & Quadratic Discriminant Analysis, k-nearest neighbors
  13. Classification II
    - [logistic regression (as an example of a Generalized Linear Model)]; Support Vector Machines and the maximum-margin criterion
  14. Model selection & bias-variance tradeoff
    - basic problem, AIC, BIC; cross-validation; curse of dimensionality
  15. Dimensionality reduction
    - Principal Component Analysis, Factor Analysis, [Locally Linear Embedding] 
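Two of the syllabus topics combine naturally in a short example: ridge regression (topic 10), fitted in closed form, with the penalty strength chosen by cross-validation (topic 14). The sketch below uses only NumPy and simulated data; every name and number is illustrative.

```python
import numpy as np

# Simulated regression problem with a few informative coefficients.
rng = np.random.default_rng(1)
n, p = 80, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + rng.normal(scale=1.0, size=n)

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X'X + lam*I)^(-1) X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(X, y, lam, k=5):
    # k-fold cross-validated mean squared prediction error.
    idx = np.arange(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        b = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((y[fold] - X[fold] @ b) ** 2))
    return float(np.mean(errs))

# Model selection: pick the penalty with the lowest CV error.
lams = [0.01, 0.1, 1.0, 10.0, 100.0]
best = min(lams, key=lambda lam: cv_error(X, y, lam))
print("selected lambda:", best)
```

With lam = 0 the ridge solution reduces to ordinary least squares; increasing lam trades a little bias for lower variance, which is exactly the bias-variance tradeoff that the cross-validation step is navigating.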
[Photo: Students in the large lecture hall (HS1)]