
# Maths, Music, and More

NOTE: This explanation is not my own. I based it on the one in William Dunham's book “Journey through Genius” and changed a few parts – I liked how much clearer that book's explanation was than the other, more formal proofs I have seen.

## What is the Basel problem?

$\sum_{n=1}^\infty \frac{1}{n^2} = \frac{1}{1^2}+\frac{1}{2^2}+\frac{1}{3^2}+\frac{1}{4^2}+\dots$

The Basel problem asks for the exact value of this sum – Euler famously showed that it converges to $\frac{\pi^2}{6}$.
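The convergence is easy to check numerically. Here is a quick sketch (not from the original post) comparing partial sums of the series against the known answer, $\pi^2/6$:

```python
import math

def basel_partial_sum(n_terms):
    """Sum the first n_terms of the series 1/n^2."""
    return sum(1 / n**2 for n in range(1, n_terms + 1))

# The partial sums creep up towards pi^2/6 ~ 1.6449...
for n in (10, 100, 10000):
    print(n, basel_partial_sum(n))

print("pi^2/6 =", math.pi**2 / 6)
```

The error after $n$ terms shrinks roughly like $1/n$, so the series converges, but slowly.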

My old high school has a program where they invite primary school kids to the school to do science and maths activities – and this year, they asked me to help run one of these, the Mini Mathematicians class. This problem actually came out of a class we did there, on card shuffling.

After a few more days messing around with the neural network (this time I've trained it on four James Joyce books, around 4 MB of text), I decided to try making a whole book. I found out that Finnegans Wake is roughly 1.3 MB of text, so I made five books of 1.3 MB each, altering a few variables on the neural network each time to see what changed. I picked the one I liked the most (temperature=0.9, for reference) and then spent a few hours formatting it. Then I went to Lulu and had it printed, because why not (the texts are all available on GitHub).
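For readers wondering what the temperature knob actually does: the network produces a score (logit) per character, and dividing those scores by the temperature before converting them to probabilities makes sampling more random (T > 1) or more conservative (T < 1). The real char-rnn sampling code is in Lua/Torch; this is just a minimal Python sketch of the idea, with an assumed `sample_with_temperature` helper:

```python
import math
import random

def sample_with_temperature(logits, temperature=0.9):
    """Sample an index from a list of logits, scaled by temperature.

    Lower temperature sharpens the distribution (safer, more repetitive
    text); higher temperature flattens it (wilder, less coherent text).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the resulting categorical distribution.
    r = random.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1
```

At temperature=0.9 the distribution is slightly sharpened relative to the raw softmax, which is why that setting tends to produce text that is coherent but still surprising.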

Neural Networks are one of my favourite parts of computing – they allow you to do an incredible range of things, from turning handwriting into text, to creating mildly incoherent literature. I’ve played around with creating one before, to count syllables for a haiku generator. That didn’t work so well, so I’ve decided to play around with one already written by someone else, and see what I can do with it.

The first thing I thought of was to make a haiku-generating neural network, but it's hard to find text files containing thousands of haiku. Instead, I've decided to train it on James Joyce novels. Andrej's GitHub (linked above) gives a really good overview of what you need to do to get his neural network up and running, so I'll leave that bit out.

I want to have a look at a very famous (and bizarre) result, the infinite sum

$\sum_{n=1}^\infty n = 1+2+3+4+5+... = -\frac{1}{12}$

At first, it doesn’t seem to make any sense. If you add all these numbers up, the result keeps getting larger, so how does it approach a finite number – and a negative one for that matter?
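Indeed, the ordinary partial sums $1+2+\dots+n$ are just the triangular numbers $n(n+1)/2$, which grow without bound – the $-\frac{1}{12}$ comes from a different notion of summation (analytic continuation of the zeta function), not from these partial sums. A quick check, using a helper of my own rather than anything from the post:

```python
def triangular(n):
    """Partial sum 1 + 2 + ... + n, i.e. the n-th triangular number."""
    return n * (n + 1) // 2

# The partial sums diverge: no ordinary limit exists, let alone -1/12.
for n in (10, 100, 1000):
    print(n, triangular(n))  # 55, 5050, 500500
```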

In this post, I want to have a look at a really neat part of calculus, Taylor series. It's essentially a process you can use to approximate some function, e.g. $e^x$, using a polynomial – the idea is that the higher the order of the polynomial, the more accurate the approximation gets.

## Let’s dive in! (This gets a bit tricky)

How does it work? First, some terminology – the first order approximation is a linear approximation (so $ax+b$), second order is quadratic ($ax^2+bx+c$), and so on. So let's start with a first order approximation of, say, $e^x$.
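Before going through the derivation, the pattern is worth seeing numerically. The Taylor polynomial of $e^x$ about 0 turns out to be $\sum_{k=0}^{N} x^k/k!$ (so the first order approximation is $1+x$), and the error shrinks quickly as the order grows. A short sketch:

```python
import math

def taylor_exp(x, order):
    """Taylor polynomial of e^x about 0, truncated at the given order."""
    return sum(x**k / math.factorial(k) for k in range(order + 1))

# Higher order => smaller error at x = 1 (true value e ~ 2.71828...).
x = 1.0
for order in (1, 2, 4, 8):
    approx = taylor_exp(x, order)
    print(order, approx, "error:", abs(math.exp(x) - approx))
```

The first order approximation at $x=1$ gives $1+1=2$, already in the right neighbourhood, and by order 8 the error is down in the sixth decimal place.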