This repository has been archived by the owner on Nov 25, 2020. It is now read-only.

how to send two placeholder variables with remote.execute #11

Open
wangce888 opened this issue Sep 9, 2018 · 3 comments

Comments

@wangce888

import tensorflow as tf
from tensorflow.python.framework import graph_util

x = tf.placeholder(tf.int32, shape=[1], name='x')
y = tf.placeholder(tf.int32, shape=[1], name='y')
b = tf.Variable(1, name='b')
xy = tf.multiply(x, y, name='multiply')
output = tf.add(xy, b, name='op_to_store')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Freeze the variables into constants so the graph can be served as model.pb
    constant_graph = \
        graph_util.convert_variables_to_constants(sess, sess.graph_def, ['op_to_store'])

    print(sess.run(output, feed_dict={x: [2], y: [3]}))

    with tf.gfile.FastGFile('./model.pb', 'wb') as f:
        f.write(constant_graph.SerializeToString())

############################################

pred = remote.execute

@sleepsonthefloor
Contributor

sleepsonthefloor commented Sep 9, 2018

You can do it with remote.execute_multi:

from graphpipe import remote
import numpy as np
x = np.array([0, 1]).astype(np.int32)
y = np.array([0, 1]).astype(np.int32)
result = remote.execute_multi("http://127.0.0.1:9000", [x, y], None, None) # inputNames and outputNames are not specified, so defaults are inferred from the model structure

print(result)

Or, if you want to specify the names of the inputs and outputs associated with your call:

from graphpipe import remote
import numpy as np
x = np.array([0, 1]).astype(np.int32)
y = np.array([0, 1]).astype(np.int32)
result = remote.execute_multi(
        "http://127.0.0.1:9000",
        [x, y],            # inputs
        ['x', 'y'],        # inputNames
        ['op_to_store'])   # outputNames
print(result)

That said, I don't see why remote.execute shouldn't be able to handle this through type introspection. Perhaps we should allow remote.execute to detect when a list of arrays is passed in and, if so, treat them as multiple inputs.
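In the meantime, a small user-side wrapper can approximate that behavior today. This is only an illustrative sketch (execute_any is a hypothetical helper, not part of graphpipe-py): it routes a list of arrays to remote.execute_multi and a single array to plain remote.execute.

from graphpipe import remote
import numpy as np

def execute_any(uri, inputs, input_names=None, output_names=None):
    # Hypothetical helper (not part of graphpipe-py): dispatch a list/tuple of
    # arrays to execute_multi, and a single array to plain execute.
    if isinstance(inputs, (list, tuple)):
        return remote.execute_multi(uri, list(inputs), input_names, output_names)
    return remote.execute(uri, inputs)

x = np.array([0, 1]).astype(np.int32)
y = np.array([0, 1]).astype(np.int32)
print(execute_any("http://127.0.0.1:9000", [x, y], ['x', 'y'], ['op_to_store']))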

btw, I ran your model like this:

docker run -it --rm \
        -v "$PWD:/models/"  \
        -p 9000:9000 \
        sleepsonthefloor/graphpipe-tf:cpu \
        --model=/models/model.pb \
        --listen=0.0.0.0:9000

Generated the model like this:

import tensorflow as tf
x = tf.placeholder(tf.int32, shape=[1], name='x')
y = tf.placeholder(tf.int32, shape=[1], name='y')
b = tf.Variable(1, name='b')
xy = tf.multiply(x, y, name='multiply')
output = tf.add(xy, b, name='op_to_store')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    constant_graph = \
        tf.graph_util.convert_variables_to_constants(sess, sess.graph_def, ['op_to_store'])

    print(sess.run(output, feed_dict={x: [2], y: [3]}))

    with tf.gfile.FastGFile('./model.pb', 'wb') as f:
        f.write(constant_graph.SerializeToString())
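As an optional sanity check (purely illustrative, not something graphpipe requires), you can list the node names of the frozen graph before serving it, to confirm that 'x', 'y', and 'op_to_store' survived the freeze:

    # Print every node name in the frozen GraphDef
    print([n.name for n in constant_graph.node])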

@sleepsonthefloor
Contributor

Perhaps oracle/graphpipe-py#5 is worth considering here.

@wangce888
Author

@sleepsonthefloor
