I have a list containing a mix of constants and variables that serve as inputs to a function I am trying to optimize. Here is the tricky part: I need to apply a function to each element of that list, each with different parameters, and I believe this must happen inside the computation graph to be efficient.
What I thought of doing was putting the parameters into tensors that line up element by element with the inputs, then applying the function to the three tensors so that each row of elements is evaluated together. Is this possible?
Example: it could end up looking something like this:
# The list of variables and constants to optimize
optimal = [tf.constant(10.0), tf.constant(1.0), tf.Variable(0.0)]
# Parameters
p1 = [None, None, 2.0]
p2 = [None, None, 5.0]
# Apply the function
# ----------------------- HOW TO DO THIS -----------------------
g = f(optimal, p1, p2)
# --------------------------------------------------------------
# Loss
loss = tf.abs(tf.subtract(g, label))
# Optimize
trainstep = tf.train.ProximalGradientDescentOptimizer(0.1).minimize(loss, var_list=[optimal[2]])
sess.run(trainstep)
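To make the intended row-wise evaluation concrete, here is a minimal sketch, assuming a hypothetical elementwise f (a linear f(x, p1, p2) = p1 * x + p2, which is my assumption and not given in the question) and encoding the None parameters as identity values (p1 = 1, p2 = 0) so the constant elements pass through unchanged. The sketch uses NumPy for brevity; `tf.stack` plus the same elementwise arithmetic ops would behave the same way inside a TensorFlow graph, and gradients flow through them.

```python
import numpy as np

# Hypothetical elementwise function: f(x, p1, p2) = p1 * x + p2.
# Applied to whole arrays, it evaluates every row in one vectorized op.
def f(x, p1, p2):
    return p1 * x + p2

# The input values (constants and the variable's initial value).
values = np.array([10.0, 1.0, 0.0])

# None parameters mean "leave this element unchanged"; encode that
# as the identity parameters p1 = 1.0, p2 = 0.0 for a linear f.
p1 = np.array([1.0, 1.0, 2.0])
p2 = np.array([0.0, 0.0, 5.0])

g = f(values, p1, p2)
print(g.tolist())  # [10.0, 1.0, 5.0]
```

Whether this works for your f depends on whether it can be expressed in elementwise tensor ops; if it can, stacking the aligned parameter tensors as above is exactly the pattern TensorFlow is built for.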
Thank you so much in advance.

Is f a linear function?