Tensorflow: Saving optimizer state (adagrad/momentum/etc.)

Created on 14 Nov 2016 · 1 comment · Source: tensorflow/tensorflow

Hey everybody,

Last week I asked this question on stackoverflow: https://stackoverflow.com/questions/40547198/saving-the-state-of-the-adagrad-algorithm-in-tensorflow .
My problem is that I want to save the state of the optimizer (in my case the Adagrad accumulators) so I can stop training and continue whenever I want.

Unless I'm mistaken, the state of the optimizer can't be saved (you can't pass an optimizer to a tf.train.Saver, right?). A quick (hacky?) solution for me might be to call Optimizer.get_slot_names() and save the op behind each slot.
The next problem would be putting this op back into its slot, as I don't think there is a set_slot(name, op) at the moment.
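To make the workaround concrete, here is a minimal sketch of what I mean by reading the slots (TF 1.x graph mode, with a toy variable `w` and loss standing in for my real model); restoring the values afterwards is the part that seems to be missing:

```python
import tensorflow as tf

# Toy variable and loss, just for illustration.
w = tf.Variable(tf.zeros([10]), name="w")
loss = tf.reduce_sum(tf.square(w))

opt = tf.train.AdagradOptimizer(learning_rate=0.1)
train_op = opt.minimize(loss)  # this creates the slot variables

# Each slot name (e.g. "accumulator" for Adagrad) maps to a tf.Variable
# that could be read out and saved after training.
for slot_name in opt.get_slot_names():
    slot_var = opt.get_slot(w, slot_name)
    print(slot_name, slot_var)
```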

So my questions are:

  • Am I right that this is currently impossible?
  • Do we want to have a set_slot(name, op) function in the Optimizer class? (I am willing to help out with this.)
  • Do we want to be able to pass an optimizer to a Saver object?

All comments

Thank you for asking the question on Stack Overflow, which is a better place for it. The optimizer state is saved by default; it is only skipped in your case because you are explicitly telling the saver which variables to save.
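In other words (a minimal sketch, reusing the same toy variable as above and a hypothetical checkpoint path), constructing the Saver without a var_list picks up the optimizer's slot variables along with the model variables:

```python
import tensorflow as tf

w = tf.Variable(tf.zeros([10]), name="w")
loss = tf.reduce_sum(tf.square(w))
train_op = tf.train.AdagradOptimizer(0.1).minimize(loss)

# With no var_list argument, the Saver covers every variable in the graph,
# including the Adagrad accumulator slots created by minimize().
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)
    saver.save(sess, "/tmp/model.ckpt")  # placeholder path

# Later: rebuild the same graph and call saver.restore(sess, "/tmp/model.ckpt");
# the accumulators are restored along with the model weights.
```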
