
@ab2005
Created December 2, 2017 19:51

The “classical stack” of Software 1.0 is what we’re all familiar with: it is written in languages such as Python, C++, etc. It consists of explicit instructions to the computer written by a programmer. By writing each line of code, the programmer identifies a specific point in program space with some desirable behavior. In contrast, Software 2.0 is written in neural network weights. No human is involved in writing this code because there are a lot of weights (typical networks might have millions), and coding directly in weights is kind of hard (I tried). Instead, we specify some constraints on the behavior of a desirable program (e.g., a dataset of input-output example pairs) and use the computational resources at our disposal to search the program space for a program that satisfies the constraints. In the case of neural networks, we restrict the search to a continuous subset of the program space where the search process can be made (somewhat surprisingly) efficient with backpropagation and stochastic gradient descent.
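
As a purely illustrative sketch of that search (not from the original text; the dataset, shapes, and hyperparameters are all made up), the snippet below uses plain NumPy to fit a single tanh layer to a toy set of input-output pairs. The "program" is just a weight matrix, and gradient descent walks the continuous weight space toward one that satisfies the constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

# The constraints on the desired program: a toy dataset of input-output pairs.
X = rng.normal(size=(256, 4))            # inputs
true_W = rng.normal(size=(4, 1))         # hidden "program" we are trying to recover
y = np.tanh(X @ true_W)                  # targets

# The Software 2.0 "source code" is just this weight matrix.
W = np.zeros((4, 1))
lr = 0.1

for step in range(500):
    pred = np.tanh(X @ W)                              # forward pass
    err = pred - y
    loss = float(np.mean(err ** 2))
    # Backpropagation: gradient of the mean-squared error w.r.t. the weights.
    grad = X.T @ (err * (1 - pred ** 2)) * (2 / len(X))
    W -= lr * grad                                     # gradient-descent step through weight space

print(f"final loss: {loss:.6f}")  # small loss => the search found a program satisfying the constraints
```

Real systems replace the hand-written gradient with automatic differentiation (e.g., PyTorch or TensorFlow) and use mini-batches, but the search loop has the same shape.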

ab2005 commented Dec 2, 2017

Tomorrow

A large portion of the programmers of tomorrow will not maintain complex software repositories, write intricate programs, or analyze their running times. Instead, they will collect, clean, manipulate, label, analyze, and visualize the data that feeds neural networks.
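
A hedged sketch of that kind of workflow, using a made-up pandas table (the column names, values, and labeling rule are all illustrative, not from the original text):

```python
import pandas as pd

# Collect: in practice this comes from logs, sensors, or annotation tools.
df = pd.DataFrame({
    "temperature": [21.5, 35.2, None, 35.2, 29.9, 41.0],
    "humidity":    [0.40, 0.55, 0.61, 0.55, 0.47, 0.30],
})

# Clean: drop missing values and exact duplicates.
df = df.dropna().drop_duplicates()

# Label: a trivial heuristic stands in for human annotation here.
df["overheating"] = (df["temperature"] > 30.0).astype(int)

# Analyze: summary statistics before the data is handed to training.
print(df.describe())

# This curated table, not hand-written logic, is what "programs" the network.
df.to_csv("training_data.csv", index=False)
```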
