This is a little writeup of a project I did in collaboration with a classmate while taking an algorithmic complexity class. We implemented a faster, but still exact, \(k\)-nearest-neighbors classifier based on k-d trees. I learned a lot and hope this can be interesting to some of you.
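Just to illustrate the idea (this is not our project's code, only a minimal sketch using SciPy's cKDTree, which also answers nearest-neighbor queries exactly; the toy data and the `knn_predict` helper are made up for the example):

```python
from collections import Counter

import numpy as np
from scipy.spatial import cKDTree

# Toy training data: 1000 points in 2D, label depends on the first coordinate.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 2))
y_train = (X_train[:, 0] > 0).astype(int)

tree = cKDTree(X_train)  # build the k-d tree once, reuse it for every query

def knn_predict(x, k=5):
    """Exact k-NN majority vote: the tree prunes whole branches but never approximates."""
    _, idx = tree.query(x, k=k)  # indices of the k exact nearest neighbors
    votes = Counter(y_train[i] for i in np.atleast_1d(idx))
    return votes.most_common(1)[0][0]

print(knn_predict([0.3, -0.2]))
```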
...
The goal of this project is to reproduce the methods and experiments of the following paper:
C. Cortes, X. Gonzalvo, V. Kuznetsov, M. Mohri, and S. Yang, AdaNet: Adaptive Structural Learning of Artificial Neural Networks. We will try to reproduce their method, which consists of building neural networks whose structure is learned and optimized at the same time as their weights. This method will be applied to a binary classification task on images from the CIFAR-10 dataset.
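At a high level, the method grows the network one candidate subnetwork at a time and keeps a candidate only if it improves an objective that trades empirical error against the complexity of the added units; from my reading of the paper, that objective is roughly of the form

\[
F(\mathbf{w}) \;=\; \frac{1}{m}\sum_{i=1}^{m}\Phi\big(1 - y_i f(x_i)\big) \;+\; \sum_{j}\big(\lambda r_j + \beta\big)\,\lvert w_j\rvert ,
\]

where \(f = \sum_j w_j h_j\) is the network output, \(\Phi\) a convex surrogate loss, \(r_j\) a (Rademacher-type) complexity measure of the subnetwork behind unit \(h_j\), and \(\lambda, \beta\) hyperparameters (notation paraphrased from memory, so it may differ slightly from the paper's).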
...
This is Part 3 of my decision tree series. This time around we are going to code a decision tree in Python. I'm going to try to make this code as understandable as possible, but if you are not familiar with Object-Oriented Programming (OOP) or recursion, you might have a tougher time.
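To give you a feel for what "OOP plus recursion" means here, this is a hypothetical minimal sketch (not the code we will actually build): a tree is made of nodes that either hold a split or a prediction, and predicting is a recursive walk from the root down to a leaf.

```python
class Node:
    """A decision tree node: either an internal split or a leaf prediction."""

    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, prediction=None):
        self.feature = feature        # index of the feature this node splits on
        self.threshold = threshold    # split value for that feature
        self.left = left              # subtree for samples with feature <= threshold
        self.right = right            # subtree for samples with feature > threshold
        self.prediction = prediction  # class label if this node is a leaf

    def predict(self, x):
        """Recursively descend until we reach a leaf."""
        if self.prediction is not None:
            return self.prediction
        branch = self.left if x[self.feature] <= self.threshold else self.right
        return branch.predict(x)


# A tiny hand-built tree: "is the first feature above 2.5?"
root = Node(feature=0, threshold=2.5,
            left=Node(prediction="no"),
            right=Node(prediction="yes"))
print(root.predict([3.1]))  # -> 'yes'
```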
...
This is Part 2 of my decision tree series. Here we will see how we can build a decision tree algorithmically, using the CART algorithm of Leo Breiman (one of the big, big names in decision trees).
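As a small teaser of the main ingredient (a sketch of the kind of split criterion CART uses for classification, not the code from this post): CART greedily chooses the split that minimizes the weighted Gini impurity of the two groups it creates.

```python
from collections import Counter

def gini(labels):
    """Gini impurity: chance of mislabeling a random sample
    if we labeled it according to the group's class distribution."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def split_score(left_labels, right_labels):
    """Weighted Gini impurity of a candidate split (lower is better)."""
    n = len(left_labels) + len(right_labels)
    return (len(left_labels) / n) * gini(left_labels) \
         + (len(right_labels) / n) * gini(right_labels)

# A pure split scores 0, a useless one keeps the parent's impurity.
print(split_score(["a", "a"], ["b", "b"]))  # 0.0
print(split_score(["a", "b"], ["a", "b"]))  # 0.5
```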
...
The first subject I want to tackle on this page is decision trees. What are they? How do they work? How can I make one?
I am planning to make a small series, ranging from explaining the concept, to implementing a decision tree inference algorithm, and hopefully all the way up to implementing Random Forests.
All right, let's get started.
...