
Model flexibility and number of parameters

This post is some thoughts I had after reading ‘Real numbers, data science and chaos: How to fit any dataset with a single parameter’ by Laurent Boué.

arXiv:1904.12320

The paper above shows that any dataset can be approximated by the following single-parameter function:

$f_\alpha(x) = \sin^2 \left( 2^{x \tau} \arcsin \sqrt{\alpha} \right)$

where $x$ is an integer index into the dataset, $\tau$ is a constant which controls the level of accuracy, and $\alpha$ is a real-valued parameter between 0 and 1 which is fit to the dataset in question.

This might seem impossible at first, but the function is incredibly flexible, and you can pack an awful lot of information into $\alpha$ if you keep enough significant figures. Have a look at the paper to see some examples of line drawings of animals, images and audio recordings, all of which are represented just by a different choice of $\alpha$.
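To make this concrete, here is a minimal sketch of one way the encoding and decoding can work, using mpmath for the arbitrary precision that $\alpha$ needs. It follows the dyadic-shift idea behind the formula, but the helper names (`encode`, `decode`) and the choice of mpmath are mine, not the paper's reference implementation; treat it as an illustration under those assumptions.

```python
# Sketch of the single-parameter encoding: each sample y in [0, 1] contributes
# tau bits to a single number w, and alpha = sin^2(pi * w). Decoding shifts
# those bits back out via f_alpha(x) = sin^2(2^(x*tau) * arcsin(sqrt(alpha))).
from mpmath import mp, mpf, asin, sin, sqrt, pi, floor

def encode(ys, tau=8):
    """Pack a list of values in [0, 1] into a single parameter alpha."""
    # ~0.3 decimal digits are needed per stored bit; 2 per bit is generous.
    mp.dps = max(50, 2 * tau * len(ys))
    bits = []
    for y in ys:
        s = asin(sqrt(mpf(y))) / pi        # s in [0, 1/2]; sin^2(pi*s) == y
        for _ in range(tau):               # keep tau binary digits of s
            s *= 2
            b = int(floor(s))
            bits.append(b)
            s -= b
    w = sum(mpf(b) / 2**(i + 1) for i, b in enumerate(bits))
    return sin(pi * w)**2                  # alpha

def decode(alpha, x, tau=8):
    """Evaluate f_alpha(x) = sin^2(2^(x*tau) * arcsin(sqrt(alpha)))."""
    return sin(2**(x * tau) * asin(sqrt(alpha)))**2

ys = [0.25, 0.9, 0.1, 0.6]
alpha = encode(ys, tau=16)                 # one real number encodes all of ys
print([float(decode(alpha, x, tau=16)) for x in range(len(ys))])
# recovers ys to within roughly 2^-16
```

Larger datasets or a larger $\tau$ simply mean more digits of $\alpha$: the parameter count never changes, only the precision needed to write the parameter down.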

Too many parameters? It depends on the model

“With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.”

This quote is attributed to John von Neumann, and it appears often enough that it had (wrongly) formed a good part of the basis of my understanding of model flexibility, perhaps coupled with the idea of model comparison through AIC and likelihood ratio tests. My misunderstanding was roughly that ‘lots of parameters = bad’: you have overfitted to your data, so you are overestimating the model’s accuracy and will see worse accuracy outside of the fitted data.

Indeed, for some models, such as linear systems of equations, if the number of parameters is at least the number of data points then the training data can be fit perfectly (fitting error of zero). AIC model selection works nicely here: the models can be nested because they share the same structure, and you get a measure of whether you are likely to be overfitting.
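As a quick numerical check of that claim (my own sketch, not part of the original post): a degree-$(n-1)$ polynomial is linear in its $n$ coefficients, so $n$ parameters interpolate $n$ points exactly, noise and all.

```python
# With as many parameters as data points, a linear-in-parameters model
# drives the fitting error on the training data to (numerically) zero.
import numpy as np

rng = np.random.default_rng(0)
n = 6
x = np.linspace(0, 1, n)
y = rng.standard_normal(n)              # arbitrary "data", noise included

coeffs = np.polyfit(x, y, deg=n - 1)    # n coefficients for n points
residuals = y - np.polyval(coeffs, x)
print(np.max(np.abs(residuals)))        # ~1e-13: the fit passes through every point
```

The zero residual says nothing about how the curve behaves between or beyond the fitted points, which is exactly the overfitting worry.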

But in the example above, one parameter with enough precision can draw a line through any data, because the model is incredibly flexible.

On the other hand, some models with many parameters are very inflexible: this SIR model has almost 50 parameters, but you certainly can’t use it to draw an elephant, because however you set the parameters, the structure of the equations constrains the shapes its output can take.
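To see why the parameter count does not buy flexibility here, consider a basic SIR model as a much smaller stand-in (my own sketch, not the nearly-50-parameter model linked above). Whatever values you give its rate parameters, it only ever produces smooth epidemic-style curves:

```python
# Basic SIR model: however beta and gamma are set, the trajectories are
# constrained by the model's structure, not by how many knobs it has.
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

t_span, y0 = (0, 100), [0.99, 0.01, 0.0]
for beta, gamma in [(0.3, 0.1), (0.5, 0.05), (1.0, 0.2)]:
    sol = solve_ivp(sir, t_span, y0, args=(beta, gamma),
                    t_eval=np.linspace(*t_span, 200))
    print(f"beta={beta}, gamma={gamma}: peak infected fraction = {sol.y[1].max():.2f}")
```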

In summary: the number of parameters alone does not determine whether you are likely to be overfitting a model; it also depends on the flexibility of the model itself. Be careful when comparing models with different structures.