
How to change the batch size dynamically? #2684

Closed
chwangaa opened this issue Jul 3, 2015 · 7 comments

Comments

chwangaa commented Jul 3, 2015

In pycaffe in particular: is there any way to change the batch_size dynamically, without modifying the prototxt file?

I noticed the reshape method, but apparently it does not really change the shape: after I run forward(), the shape is altered back to its original value.

Many thanks


chwangaa commented Jul 3, 2015

Basically I am trying to see the relationship between the batch size and the running time

seanbell commented Jul 5, 2015

You need to reshape the input blob and then call reshape on the net, before calling forward.
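The order of calls seanbell describes can be sketched without Caffe itself, since the point is only how shapes propagate. The `ToyNet`/`Blob` classes below are hypothetical stand-ins (in real pycaffe you would use `caffe.Net` and `net.blobs['data']`); the sketch just shows: reshape the input blob first, then reshape the net, then call forward.

```python
import numpy as np

class Blob:
    """Minimal stand-in for a pycaffe blob holding a data array."""
    def __init__(self, shape):
        self.data = np.zeros(shape, dtype=np.float32)

    def reshape(self, *shape):
        self.data = np.zeros(shape, dtype=np.float32)

class ToyNet:
    """Hypothetical one-layer net mimicking the pycaffe call pattern."""
    def __init__(self, in_features, out_features, batch):
        self.blobs = {'data': Blob((batch, in_features))}
        self.w = np.random.randn(in_features, out_features).astype(np.float32)
        self._out = None

    def reshape(self):
        # Propagate the (possibly changed) input shape through the net.
        batch = self.blobs['data'].data.shape[0]
        self._out = np.zeros((batch, self.w.shape[1]), dtype=np.float32)

    def forward(self):
        self._out = self.blobs['data'].data @ self.w
        return self._out

net = ToyNet(in_features=4, out_features=3, batch=8)
# 1) reshape the input blob, 2) reshape the net, 3) run forward
net.blobs['data'].reshape(32, 4)
net.reshape()
out = net.forward()
print(out.shape)  # (32, 3)
```

If step 2 is skipped, downstream shapes stay stale, which matches the symptom reported above.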


chwangaa commented Jul 5, 2015

I did the following:

    net.blobs['data'].reshape(BLOCK_SIZE, DIMENSION, HEIGHT, WIDTH)
    net.blobs['label'].reshape(BLOCK_SIZE,)
    net.reshape()

The shape of every layer changes correctly. However, as soon as I run forward, the shape will automatically change back :-(

lukeyeager (Contributor) commented:

This example may be helpful to you. See the code in forward_pass():

for chunk in [caffe_images[x:x+batch_size] for x in xrange(0, len(caffe_images), batch_size)]:
    new_shape = (len(chunk),) + tuple(dims)
    if net.blobs['data'].data.shape != new_shape:
        net.blobs['data'].reshape(*new_shape)
    for index, image in enumerate(chunk):
        image_data = transformer.preprocess('data', image)
        net.blobs['data'].data[index] = image_data
    output = net.forward()[net.outputs[-1]]
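Stripped of the Caffe specifics, the chunking logic above can be exercised on its own. This is a Python 3 adaptation (range replaces xrange) in which the net's input blob and the transformer are replaced by hypothetical numpy stubs; the sizes (70 images, batch of 32) are arbitrary:

```python
import numpy as np

dims = (3, 4, 4)          # hypothetical C, H, W
batch_size = 32
caffe_images = [np.random.rand(*dims).astype(np.float32) for _ in range(70)]

# Stub for net.blobs['data'].data: a reshapeable input array.
data = np.zeros((batch_size,) + dims, dtype=np.float32)

chunk_shapes = []
for chunk in [caffe_images[x:x + batch_size]
              for x in range(0, len(caffe_images), batch_size)]:
    new_shape = (len(chunk),) + dims
    if data.shape != new_shape:
        # Only the last, partial chunk (70 % 32 = 6 images) triggers this,
        # mirroring the net.blobs['data'].reshape(*new_shape) call above.
        data = np.zeros(new_shape, dtype=np.float32)
    for index, image in enumerate(chunk):
        data[index] = image        # in real code: transformer.preprocess(...)
    chunk_shapes.append(data.shape)

print(chunk_shapes)  # [(32, 3, 4, 4), (32, 3, 4, 4), (6, 3, 4, 4)]
```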


chwangaa commented Jul 7, 2015

Somehow I cannot get it to work. I am not trying to classify here, so I used the method suggested above to change the batch size to 1000. However, the running time of forward() does not change at all (whereas when I modify the value in the prototxt file, the running time gets longer). Therefore I think the above method does not work.
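Since the goal here is to relate batch size to running time, a minimal timing harness helps check whether a reshape actually took effect. The sketch below times a single dense layer's forward pass as a stand-in for net.forward(); the layer sizes and repetition count are arbitrary assumptions. If the reshape is effective, the measured time should grow roughly with the batch size; a flat time at batch 1000, as reported above, would suggest the net is still running at its old shape.

```python
import time
import numpy as np

def forward(batch, in_f=512, out_f=512, reps=5):
    """Time a dense layer's forward pass at a given batch size.
    Hypothetical stand-in for net.forward(); returns (seconds, output shape)."""
    x = np.random.rand(batch, in_f).astype(np.float32)
    w = np.random.rand(in_f, out_f).astype(np.float32)
    t0 = time.perf_counter()
    for _ in range(reps):
        y = x @ w
    return (time.perf_counter() - t0) / reps, y.shape

for batch in (1, 100, 1000):
    elapsed, shape = forward(batch)
    print(f'batch={batch:5d}  output={shape}  {elapsed * 1e3:.3f} ms')
```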

shelhamer (Member) commented:

Please discuss usage on caffe-users; I've answered in the thread Regarding Change BatchSize in python.

shellyfung commented:

@chwangaa Did you get the answer?

> Basically I am trying to see the relationship between the batch size and the running time
