
I have a list consisting of a mix of constants and variables that act as inputs to a function I am trying to optimize. Here is the complex part: I need to apply a function to each element of that list, each with different parameters. I believe this must be done inside the computation graph to be efficient.

What I thought of doing was putting the parameters into tensors that line up element by element with the list, and then applying the function across the three tensors so that each row of corresponding elements is evaluated together. Is this possible?

Example:


Could end up with something like:

# The list of variables and constants to optimize
optimal = [[tf.constant(10.0), tf.constant(1.0), tf.Variable(0.0)]]

# Parameters
p1 = [[None, None, 2.0]]
p2 = [[None, None, 5.0]]

# Apply the function
# ----------------------- HOW TO DO THIS -----------------------
g = f(optimal, p1, p2)
# --------------------------------------------------------------

# Loss
loss = tf.abs(tf.subtract(g, label))

# Optimize
trainstep = tf.train.ProximalGradientDescentOptimizer(.1).minimize(loss, var_list=[optimal[0][2]])

sess.run(trainstep)

Thank you so much in advance.

  • Is f a linear function? Commented Jan 25, 2017 at 12:28
  • Not necessarily Commented Jan 25, 2017 at 14:52

1 Answer


Use tf.map_fn.

g = tf.map_fn(f, tf.transpose(tf.concat([optimal, p1, p2], 0)))

First concatenate along dimension 0, then transpose the result, so that row i is (o_i, p1_i, p2_i); tf.map_fn then applies f to each row. Note that the None entries in p1 and p2 must be replaced with actual numbers (for example, identity values for f) before the tensors can be built, and that in TF 1.0+ tf.concat takes the values first and the axis second.
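To see what the concat-then-transpose-then-map pipeline computes, here is a minimal pure-Python sketch (no TensorFlow). The function f and the parameter values are placeholders of my choosing, with the None entries replaced by numbers, not part of the original question:

```python
# Pure-Python sketch of the concat/transpose/map_fn pipeline.
# 'optimal', 'p1', 'p2' mirror the question's lists, with the None
# parameters replaced by placeholder identity values (assumption).
optimal = [10.0, 1.0, 0.0]
p1 = [1.0, 1.0, 2.0]
p2 = [0.0, 0.0, 5.0]

def f(row):
    # Placeholder per-row function: o * p1 + p2.
    o, a, b = row
    return o * a + b

# Stacking the three lists and transposing yields rows
# (o_i, p1_i, p2_i); mapping f over the rows is what
# tf.map_fn does inside the graph.
rows = list(zip(optimal, p1, p2))
g = [f(row) for row in rows]
print(g)  # [10.0, 1.0, 5.0]
```

In the TensorFlow version, the same row-wise evaluation happens symbolically, so gradients flow back through f to the tf.Variable entries of the list.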
