Tips for strengthening your understanding of a difficult concept

(originally posted March 17th, 2019)

Backpropagation is a hard concept to learn, and so are many ideas in programming and math. Maybe you read an article about it and still feel like you just don’t get it. That’s normal. A good understanding of a difficult concept isn’t something that suddenly appears. It’s like the roots of a tree, starting out small but growing over time, with parts wrapping around and strengthening each other. Here are some ways to shore up a shaky foundation.

Reread the article

It’s hard to understand everything in a written…

How tuning a neural network is like tuning a synthesizer

(originally posted April 6th, 2019)

When I’m making decisions about hyperparameter settings for a neural network (batch size, learning rate, number of layers, size of each layer), I’m reminded of what it was like when I started making electronic music in Apple’s Logic Pro many years ago. When I opened up a new synthesizer for the first time to make some cool sounds, I was met with this:

The Sculpture synthesizer from Apple’s Logic (source)

That is quite a lot of settings to work with, and I had no idea what any of them did. Some of…

Creating a full web app for my fantasy name generator

(originally posted February 10th, 2019)

This week I was able to build a web app with my TensorFlow name generator and put it online using GitHub’s Pages feature. Since I ended last week with a functioning model in Python and a very simple web app that only worked locally, I needed to figure out a number of things to get the app to where it is now.

First, I needed to figure out how to actually get the web app on the web and not just on my own computer…

Porting my PyTorch name generator to TensorFlowJS

(originally posted February 3rd, 2019)

My goal this week was to translate my PyTorch implementation of a name generator into TensorFlow, another of the most popular deep learning frameworks for Python. TensorFlow has a high-level interface called Keras that lets you build models in an intuitive way, layer by layer, so I used that instead of the lower-level building blocks of TensorFlow, which are a bit more complicated. …
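To give a sense of what "layer by layer" looks like in Keras, here is a minimal sketch of a character-level model, not the exact model from the project: the vocabulary size (27: 26 letters plus an end-of-name marker) and the layer sizes are illustrative assumptions.

```python
import numpy as np
from tensorflow import keras

VOCAB_SIZE = 27  # assumed: a-z plus one end-of-name token
SEQ_LEN = 6      # assumed context length for this sketch

# Build the model one layer at a time: embed each letter index,
# run the sequence through an LSTM, then predict the next letter.
model = keras.Sequential([
    keras.layers.Embedding(VOCAB_SIZE, 16),
    keras.layers.LSTM(64),
    keras.layers.Dense(VOCAB_SIZE, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# One forward pass on a dummy batch: a probability over the next letter.
dummy = np.zeros((1, SEQ_LEN), dtype="int32")
probs = model.predict(dummy, verbose=0)
```

Each entry in the list is one layer, which is what makes Keras feel so much more direct than wiring up the equivalent graph by hand.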

Improving my PyTorch fantasy name generator

(originally posted January 20th, 2019)

This week I returned to my fantasy name generation app to make some improvements. These included adjusting the app so it could generate names of varying lengths, adding tests to make sure all the functions were doing the right thing, and adding comments and documentation to my Python files to make the code easier to return to next time.

To make the app generate names of varying lengths, instead of just six-letter names, I first changed how the training process worked. Instead of training just on the six letter…
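One common way to handle names of varying lengths, sketched below with plain Python, is to pad each training name with an end-of-name marker so every example has the same length and the model learns to emit the marker when a name should stop. The marker character `"."` and the maximum length are illustrative assumptions, not necessarily what the app uses.

```python
END = "."      # assumed end-of-name marker
MAX_LEN = 10   # assumed maximum name length

def pad_name(name, max_len=MAX_LEN, end=END):
    """Pad a name with end-of-name markers to a fixed length."""
    return name + end * (max_len + 1 - len(name))

def training_pairs(name):
    """Yield (context, next_char) pairs from a padded name."""
    padded = pad_name(name)
    for i in range(1, len(padded)):
        yield padded[:i], padded[i]

pairs = list(training_pairs("arya"))
```

The last few pairs all predict the end marker, which is exactly how the model learns when a generated name should end rather than always running to a fixed six letters.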

Using PyTorch and a list of baby names as data

(originally posted January 6th, 2019)

My first NLP project is a generator of first names for fiction writers, especially for the fantasy genre. Coming up with fantasy names can be hard because a good name should seem unusual but familiar at the same time. A neural network is a great tool for the job because it can learn the patterns of letters in names, like which letter should come after a certain series of other letters. Injecting a bit of randomness into this process can produce names that range from…
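The "bit of randomness" can be illustrated with a short sketch: instead of always picking the single most likely next letter, sample from the model's probability distribution. The probabilities below are made up for illustration; in the real app they would come from the trained network.

```python
import random

def sample_next_letter(probs, rng=random):
    """Sample a next letter from a {letter: probability} mapping."""
    letters = list(probs)
    weights = [probs[c] for c in letters]
    return rng.choices(letters, weights=weights, k=1)[0]

# Made-up distribution for letters that might follow "th".
after_th = {"e": 0.6, "a": 0.2, "o": 0.15, "u": 0.05}

random.seed(0)
letter = sample_next_letter(after_th)
```

Because unlikely letters still get chosen occasionally, the generator can wander into spellings that feel unusual but still follow familiar letter patterns.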

Joe Bender

I write about machine learning, natural language processing, and learning in general.
