Course Details
Course Code (English)
*
Semester
*
Title (English)
*
Lecture Hours (Weekly)
ECTS Credits
*
Course Type (English)
Prerequisites (English)
The course assumes basic knowledge of mathematics (Computational and Discrete Mathematics, Probability) as well as programming.
Course URL (e.g., on e-class)
Learning Outcomes (English)
The aim of this course is twofold: (a) to provide an in-depth understanding of core mathematical and algorithmic concepts that underpin modern Artificial Intelligence methods, and (b) to familiarize students with the practical implementation of these methods in Python through a variety of real-world examples.

Upon successful completion of the course, students will be able to:
- Solve linear systems and apply them to practical problems.
- Describe and apply affine transformations and projections in problems such as image processing, data visualization, and computer graphics.
- Identify and implement basic first- and second-order optimization algorithms, with applications in machine learning.
- Understand fundamental concepts of information theory and their relevance to Artificial Intelligence.
- Interpret and apply convolution operations in signal and image processing tasks.
- Describe and apply basic machine learning algorithms.
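To illustrate the kind of hands-on Python work the course outcomes describe, here is a minimal sketch of solving an overdetermined linear system via the pseudoinverse and least squares (NumPy is an assumption here; the course does not specify a particular library):

```python
import numpy as np

# Overdetermined system A x = b: three equations, two unknowns,
# so we seek the least-squares solution x = A^+ b.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Moore-Penrose pseudoinverse (cf. Unit 1: pseudoinverse and least squares)
x = np.linalg.pinv(A) @ b

# The dedicated least-squares routine gives the same solution
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x)  # least-squares coefficients, identical to x_lstsq
```

This is the standard connection between the pseudoinverse and linear regression: fitting a line to three points reduces to one matrix expression.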
General Competencies (English)
- Independent thinking
- Promote free, creative and inductive thinking
Course Content (English)
Unit 1: Review of Linear Algebra and Computational Methods
- Vectors, matrices, and fundamental operations (addition, multiplication, transposition).
- Determinant and matrix inversion via Gaussian elimination.
- Linear systems with square and non-square matrices.
- Pseudoinverse and the least squares method.

Unit 2: Linear Transformations and Eigenvalues
- Linear transformations and geometric interpretation.
- Projections and applications in image and data processing.
- Eigenvalues, eigenvectors, and matrix diagonalization.

Unit 3: Multivariable Calculus and Optimization
- Multivariate functions, partial derivatives, differentials, and gradient.
- Jacobian and Hessian matrices: definitions, interpretation, and applications.
- First- and second-order iterative optimization algorithms (e.g., gradient descent, Newton-Raphson).
- Automatic differentiation and backpropagation.

Unit 4: Probability and Information Theory
- Review of basic probability and estimation theory concepts.
- Entropy, mutual information, and Kullback-Leibler divergence.
- Role of information theory in Artificial Intelligence and Machine Learning.

Unit 5: Convolution and Signal/Image Processing
- Definition and numerical implementation of convolution.
- Applications of convolution in signal and image processing.
- Connection to Convolutional Neural Networks.

Unit 6: Introduction to Machine Learning
- Bayes classifier and Linear Discriminant Analysis (LDA).
- Linear regression, Ridge Regression, and LASSO.
- Practical implementation of models in Python and integration of mathematical tools covered in the course.
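As a small example of the Unit 5 material, a moving-average filter expressed as a discrete convolution (a minimal sketch assuming NumPy; the specific filters used in the course are not stated):

```python
import numpy as np

# A 3-tap averaging kernel: convolving with it smooths the signal
signal = np.array([0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0])
kernel = np.ones(3) / 3.0

# "valid" mode keeps only positions where the kernel fully overlaps
# the signal, giving len(signal) - len(kernel) + 1 = 5 outputs
smoothed = np.convolve(signal, kernel, mode="valid")
print(smoothed)
```

The same sliding-window multiply-and-sum, extended to two dimensions and with learned kernels, is exactly the operation inside Convolutional Neural Networks, which is the connection Unit 5 draws.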
Use of ICT (English)
Project implementation in Python
Is it elective?
Unknown
Yes
No
Load within semester (Hours)
Lecture Hours
Lab Hours
Independent Study
*
Project Work
*
Lab Report
*