The goal of this project is to provide a theoretical analysis of the properties of two models commonly used as supervised learning algorithms: the perceptron and support vector machines. From an abstract point of view, these models can be interpreted as constraint satisfaction problems, which exhibit a SAT-UNSAT phase transition in the thermodynamic limit. In this framework the so-called storage problem for the perceptron turns out to be equivalent to a hard-sphere packing problem. The first part of the work focuses on the storage properties of a slightly modified version of the perceptron, in which one parameter is treated as a random variable. The second part is a starting point for approaching the same kind of problem in support vector machines and for pointing out possible sources of error in previously known results. The whole work is carried out with the statistical physics methods and tools used to study disordered systems, such as spin glasses, whose main ingredient is the celebrated replica method.
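As a brief point of reference, the storage problem mentioned above can be written in its standard (Gardner) form; the notation below ($\xi^\mu$, $y^\mu$, $\kappa$, $\alpha$) is introduced here purely for illustration and is not taken from the text. Given $P = \alpha N$ random patterns $\xi^\mu \in \mathbb{R}^N$ with labels $y^\mu = \pm 1$, one asks whether there exists a weight vector $w$ on the sphere $\|w\|^2 = N$ satisfying
\[
  y^\mu \, \frac{w \cdot \xi^\mu}{\sqrt{N}} \;\ge\; \kappa \qquad \text{for all } \mu = 1, \dots, P ,
\]
with margin $\kappa \ge 0$. In the thermodynamic limit $N \to \infty$ at fixed load $\alpha$, the probability that such a $w$ exists drops from one to zero at a critical value $\alpha_c(\kappa)$; this is the SAT-UNSAT transition mentioned above.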