Advent of Code 2021 - Problem 7

As I was doing the 7th problem of the 2021 Advent of Code, I realized the mean and the median kept popping up in the optimal solutions. There is a fairly simple explanation for this…
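
As a quick illustration of the underlying fact (a minimal sketch with arbitrary data, not code from the post): the median minimizes a sum of absolute distances, while the mean minimizes a sum of squared distances.

```python
import statistics

# Arbitrary example positions, not taken from any actual puzzle input.
positions = [1, 2, 2, 3, 5, 8, 20]

def abs_cost(target):
    """Total cost when every position moves to `target` at one unit per step."""
    return sum(abs(p - target) for p in positions)

def squared_cost(target):
    """Total cost when each move is charged the squared distance instead."""
    return sum((p - target) ** 2 for p in positions)

candidates = range(min(positions), max(positions) + 1)
print(min(candidates, key=abs_cost), statistics.median(positions))    # L1 optimum coincides with the median (both are 3)
print(min(candidates, key=squared_cost), statistics.mean(positions))  # L2 optimum is the integer nearest the mean (6 vs ~5.86)
```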

December 7, 2021 · 12 min

Visualizing the Collatz Conjecture

What is the Collatz conjecture? Let us choose any positive integer \(k>0\) and apply the following rules to \(k\) repeatedly: ...
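
The rules themselves are elided in this excerpt; in the standard formulation, an even \(k\) becomes \(k/2\), an odd \(k\) becomes \(3k+1\), and the conjecture states that every starting value eventually reaches 1. A minimal Python sketch (not the visualization code from the post):

```python
def collatz_sequence(k: int) -> list[int]:
    """Return the Collatz trajectory of k, stopping once it reaches 1."""
    if k <= 0:
        raise ValueError("k must be a positive integer")
    sequence = [k]
    while k != 1:
        k = k // 2 if k % 2 == 0 else 3 * k + 1
        sequence.append(k)
    return sequence

print(collatz_sequence(27))  # a famously long trajectory: 111 steps before reaching 1
```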

September 3, 2021 · 11 min

emoj.yt (emoji URL-shortener)

This is a little write-up of a very small project I did, inspired by Coding Garden with CJ on YouTube & Twitch (specifically this video) and the Net Ninja's Express tutorials: a URL shortener that uses a sequence of emojis to encode each URL. The code is available on GitHub, and you can try it out at emoj.yt. ...
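
I won't spoil the actual scheme here, but the usual way to build a shortener like this is to treat a fixed emoji list as the digits of a base-\(n\) number and encode each link's numeric ID with it. A purely hypothetical sketch (the alphabet and function names are mine, not from the project):

```python
# Hypothetical emoji alphabet; a real shortener would use a much larger one.
ALPHABET = ["😀", "🎉", "🚀", "🐍", "🌈", "🔥", "🍕", "🎸"]
BASE = len(ALPHABET)

def encode_id(n: int) -> str:
    """Encode a non-negative integer ID as a string of emoji 'digits'."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n > 0:
        n, remainder = divmod(n, BASE)
        digits.append(ALPHABET[remainder])
    return "".join(reversed(digits))

def decode_id(code: str) -> int:
    """Invert encode_id back to the integer ID."""
    n = 0
    for symbol in code:
        n = n * BASE + ALPHABET.index(symbol)
    return n

assert decode_id(encode_id(12345)) == 12345
print(encode_id(12345))
```

Each short link then only needs a lookup from that emoji string back to the stored URL.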

August 19, 2020 · 8 min

How can we make linear regression better? Regularization.

If you haven’t read my post on linear regression, I invite you to do so here. In short, it is a method for modelling the relationship between variables \(X_i\) and a target feature \(y\) with a linear model. This is done by learning weights \(\theta_i\) for each \(X_i\), supposing that our model looks something like this: ...
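
The model formula itself is truncated in this excerpt, so I won't reproduce it here. To fix ideas, though, the most common regularized variant (ridge regression) simply adds a penalty on the size of the weights to the least-squares loss, with \(\lambda\) controlling its strength; whether the post ends up using ridge, lasso, or both, this is the general shape, with \(\hat{y}^{(j)}\) the model's prediction for the \(j\)-th sample:

\[
J(\theta) = \sum_{j=1}^{m} \left(y^{(j)} - \hat{y}^{(j)}\right)^2 + \lambda \sum_{i=1}^{n} \theta_i^2
\]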

February 10, 2020 · 8 min

Implementing linear regression, math and Python!

Today I want to explain linear regression. It is one of the simplest statistical learning models and can be implemented efficiently in only a couple of lines of Python code. Being so simple, however, does not mean it is not useful: it is very practical for exploring relationships between features in a dataset and making predictions on a target value. That is why I think it’s important to understand how the method works and how its different parameters affect the outcome. ...
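
To give a taste of how compact it can be, here is a minimal ordinary-least-squares fit with NumPy on made-up data (a closed-form sketch; the post itself may well take a different route, such as gradient descent):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends linearly on two features, plus a little noise.
X = rng.normal(size=(100, 2))
true_theta = np.array([3.0, -1.5])
y = X @ true_theta + 0.5 + 0.1 * rng.normal(size=100)

# Add a column of ones so the intercept is learned as just another weight.
X_b = np.hstack([np.ones((X.shape[0], 1)), X])

# Closed-form least-squares solution, solved via lstsq for numerical stability.
theta, *_ = np.linalg.lstsq(X_b, y, rcond=None)
print(theta)  # roughly [0.5, 3.0, -1.5]
```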

January 9, 2020 · 16 min

KD-trees: Classification of n points in d-sized Euclidean space

This is a little write-up of a project I did in collaboration with a classmate during an algorithmic complexity class. We implemented a faster, but still exact, \(k\)-nearest-neighbors classifier based on k-d trees. I learned a lot and hope this can be interesting to some of you. ...
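
The post builds the tree from scratch; purely as a point of comparison, here is what the same kind of exact \(k\)-NN query looks like with SciPy's ready-made k-d tree (this is obviously not our implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)

# n points in d-dimensional Euclidean space, with arbitrary binary labels.
points = rng.random((1000, 3))
labels = rng.integers(0, 2, size=1000)

tree = cKDTree(points)

def knn_predict(query, k=5):
    """Classify `query` by majority vote among its k nearest neighbours."""
    _, idx = tree.query(query, k=k)
    return int(np.bincount(labels[idx]).argmax())

print(knn_predict(np.array([0.5, 0.5, 0.5])))
```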

April 11, 2019 · 23 min · Luc Blassel & Romain Gautron

Adanet - adaptive network structures

The goal of this project is to reproduce the methods and experiments of the following paper: C. Cortes, X. Gonzalvo, V. Kuznetsov, M. Mohri, S. Yang, AdaNet: Adaptive Structural Learning of Artificial Neural Networks. We will try to reproduce their method, which consists of building neural networks whose structure is learned and optimized at the same time as their weights. This method will be applied to a binary classification task on images from the CIFAR-10 dataset. ...

March 18, 2019 · 9 min · Luc Blassel, Romain Gautron

Let's implement the CART Algorithm

This is Part 3 of my decision tree series. This time around we are going to code a decision tree in Python. I’m going to try to make the code as understandable as possible, but if you are not familiar with Object-Oriented Programming (OOP) or recursion you might have a tougher time. ...
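
For a sense of the recursive structure involved (a stripped-down sketch with invented names, not the code from the post): each node either stores a prediction or a split, and building the tree means finding the best split and recursing on the two halves.

```python
from collections import Counter

class Node:
    """Either a leaf holding a predicted label, or an internal split node."""
    def __init__(self, prediction=None, feature=None, threshold=None, left=None, right=None):
        self.prediction = prediction
        self.feature = feature
        self.threshold = threshold
        self.left = left
        self.right = right

def gini(labels):
    """Gini impurity of a list of class labels."""
    counts = Counter(labels)
    return 1.0 - sum((c / len(labels)) ** 2 for c in counts.values())

def build_tree(rows, labels, depth=0, max_depth=3):
    # Stop when the node is pure or the depth limit is reached: make a leaf.
    if depth >= max_depth or len(set(labels)) == 1:
        return Node(prediction=Counter(labels).most_common(1)[0][0])

    best = None  # (weighted impurity, feature, threshold)
    for feature in range(len(rows[0])):
        for threshold in {row[feature] for row in rows}:
            left = [l for row, l in zip(rows, labels) if row[feature] <= threshold]
            right = [l for row, l in zip(rows, labels) if row[feature] > threshold]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if best is None or score < best[0]:
                best = (score, feature, threshold)

    if best is None:  # no valid split found: fall back to a leaf
        return Node(prediction=Counter(labels).most_common(1)[0][0])

    _, feature, threshold = best
    left = [(r, l) for r, l in zip(rows, labels) if r[feature] <= threshold]
    right = [(r, l) for r, l in zip(rows, labels) if r[feature] > threshold]
    return Node(
        feature=feature,
        threshold=threshold,
        left=build_tree([r for r, _ in left], [l for _, l in left], depth + 1, max_depth),
        right=build_tree([r for r, _ in right], [l for _, l in right], depth + 1, max_depth),
    )

def predict(node, row):
    """Walk down the tree until a leaf is reached."""
    if node.prediction is not None:
        return node.prediction
    branch = node.left if row[node.feature] <= node.threshold else node.right
    return predict(branch, row)

# Tiny toy dataset: the label is 1 exactly when the second feature exceeds 2.
rows = [[1.0, 1.0], [2.0, 3.0], [3.0, 1.5], [4.0, 4.0]]
labels = [0, 1, 0, 1]
tree = build_tree(rows, labels)
print([predict(tree, row) for row in rows])  # expected: [0, 1, 0, 1]
```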

March 2, 2019 · 12 min

The CART Algorithm

This is Part 2 of my decision tree series. Here we will see how to build a decision tree algorithmically using the CART algorithm of Leo Breiman (one of the biggest names in decision trees). ...
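
As a point of reference (the notation in the post may differ), the impurity measure most associated with CART for classification is the Gini index; with \(p_c\) the proportion of class \(c\) in a node:

\[
G = 1 - \sum_{c} p_c^2
\]

A pure node has \(G = 0\), and the split chosen at each step is the one that most reduces the size-weighted impurity of the two children.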

February 27, 2019 · 10 min

What are decision trees?

The first subject I want to tackle on this page is decision trees. What are they? How do they work? How can I make one? I am planning a small series, ranging from explaining the concept to implementing a decision tree inference algorithm, and hopefully all the way up to implementing Random Forests. All right, let’s get started. ...

February 26, 2019 · 8 min