I’ve recently discovered Safari Books Online thanks to a post on HackerNews. One of the first things that caught my eye was one of the Oriole Online Tutorials - Hello, TensorFlow. It was a very quick overview of TensorFlow in a really cool format. There was a video with an accompanying article (which can autoscroll along with the video contents) and code snippets embedded on the page which you can actually work with and run in the browser.
Today, I saw a new recommendation on my Safari Books Online account page: a 3-chapter preview of the as-yet-unreleased book Learning TensorFlow by Tom Hope, Yehezkel S. Resheff, and Itay Lieder. I decided to spend some time reading today and following along with the provided examples and exercises.
What follows is just some general notes I took as I went, and some retrospective on a couple of the exercises. Not particularly organized, saved here in case I want to go back through them later.
First, I started a learning repository which will contain some random TensorFlow learning stuff: https://github.com/drakonka/HelloTensorFlow
Setting up the environment
- Set up TensorFlow virtualenv.
- In Visual Studio Code’s settings.json, create a new setting, python.TensorFlowPython, pointing to the virtualenv’s Python location (note: a “~/” path does not work; it has to be “/home/user/…”). This enables debugging through VS Code.
- In launch.json, set up a new configuration (“TensorFlow”) and point it to use the above python path.
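For reference, here is roughly what that launch.json configuration might look like. The exact keys depend on the VS Code Python extension version, and the ${config:python.TensorFlowPython} reference assumes the custom setting created above:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "TensorFlow",
            "type": "python",
            "request": "launch",
            "program": "${file}",
            "pythonPath": "${config:python.TensorFlowPython}"
        }
    ]
}
```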
- Nodes are operations, edges are Tensor objects.
- The general concept of a TensorFlow graph is: we define the graph first, then send data through it.
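A toy analogy of the define-then-run idea in plain Python (this is not the TensorFlow API; the node structure and names here are made up purely for illustration):

```python
# Construction phase: describe the computation as nested tuples.
# Nodes are operations; edges are the values that will flow between them.
def build_graph():
    a = ("input", "a")
    b = ("input", "b")
    total = ("add", a, b)
    return ("mul", total, ("const", 2))

# Execution phase: only now does data actually flow through the graph.
def run(node, feed):
    kind = node[0]
    if kind == "input":
        return feed[node[1]]
    if kind == "const":
        return node[1]
    left, right = run(node[1], feed), run(node[2], feed)
    return left + right if kind == "add" else left * right

graph = build_graph()
print(run(graph, {"a": 3, "b": 4}))  # (3 + 4) * 2 = 14
```

TensorFlow does the same thing at a much larger scale: nothing is computed when you describe the graph, only when you run it with actual input.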
- A Tensor object can have a manually specified name
- More than one Tensor object in the same graph cannot have the same name; if you specify the same name for more than one, TF appends an underscore and a number to the duplicate
- Tensor objects in different graphs can have the same name
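A quick sketch of the naming rules above, using TF 1.x-style graph code (in newer versions the same thing can be done inside a tf.Graph context):

```python
import tensorflow as tf

# Two tensors given the same name in one graph: TF renames the second.
g1 = tf.Graph()
with g1.as_default():
    c1 = tf.constant(1.0, name="c")
    c2 = tf.constant(2.0, name="c")   # duplicate name becomes "c_1"

# The same name is fine in a *different* graph.
g2 = tf.Graph()
with g2.as_default():
    c3 = tf.constant(3.0, name="c")

print(c1.name, c2.name, c3.name)  # c:0 c_1:0 c:0
```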
- Read something about TF not playing nicely with Python 3, but I haven’t run into those cases with the examples (other than having to fix …)
- TensorFlow is closely coupled with numpy
- Need to read up on a lot of the mentioned math concepts; the last optimization example was pushing it.
- Specify different subdirectories in the root logging dir for TensorBoard per graph to make a nicer separation in TensorBoard UI.
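As a sketch of that layout (the directory names here are made up): if each graph writes its event files under its own subdirectory of the log root, TensorBoard lists each subdirectory as a separate run in its UI.

```shell
# Hypothetical log root with one subdirectory per graph:
mkdir -p logs/graph_a logs/graph_b
ls logs
# Point TensorBoard at the root to get each graph as its own run:
# tensorboard --logdir logs
```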
- Quite a few pre-trained models are already available, but it seems like the training is going to be the fun part.
- Construction phase -> training phase on training data -> testing phase on test data (do not reuse training data)
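The train/test separation can be sketched in a few lines of plain Python (toy data, and the names and 80/20 split are my own choices, not from the book):

```python
import random

# Shuffle some toy data, then split it once and keep the pieces disjoint.
random.seed(0)
data = list(range(100))
random.shuffle(data)

split = int(0.8 * len(data))          # e.g. an 80/20 split
train_data, test_data = data[:split], data[split:]

# Train only on train_data; evaluate only on test_data. No overlap:
print(len(train_data), len(test_data), set(train_data) & set(test_data))
# 80 20 set()
```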
Example 3-1 provided code for a graph and then a visual (non-TensorBoard) representation of that graph. Before peeking at the visual, I made my own to compare to the “answer”:
Example 3-2 gave a fun challenge! The book provides two visual representations of graphs, and the task is to create those graphs in TensorFlow. However, the exercise did not provide solutions, so I needed some other way to check my result. I ended up learning how to set up and start TensorBoard so it could visualize the graphs I made and let me compare them to the example graphs (they were not represented in the same TensorBoard format, but I could still make sure the node and edge connections looked correct):
I’m not sure when the rest of the book is being released, but while I wait I may as well use the time to brush up on the relevant math concepts.