
It seems like the label is being fed into the network while training #23

Open

supergrover opened this issue Feb 14, 2015 · 3 comments

@supergrover
I was wondering why you add 1 to the AffineTransform layer geometry in
https://github.com/botonchou/libdnn/blob/master/src/nnet.cpp#L233

After some nosing around, I found that you feed the label into the network as one of its inputs! (I printed the data at https://github.com/botonchou/libdnn/blob/master/src/nn-train.cpp#L164.) I don't think this is the desired behaviour. Can you please explain this?

@poweic
Owner

poweic commented Feb 14, 2015

In rand(out + 1, in + 1), the first +1 is for the biases in this layer, and the second +1 is reserved for the +1 appended in the next layer ( [x, 1] ).

No, the label won't be fed into the network; I'm pretty sure about that. Maybe you can tell me your command-line arguments so that I can reproduce your situation.
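The "+1 geometry" described above can be sketched as follows. This is a minimal illustration of the augmented-weight-matrix trick, not libdnn's actual code; the names `forward` and `W` are hypothetical. An `(out + 1) × (in + 1)` matrix lets the extra input column multiply a constant 1 appended to the activations (the bias term), while the extra output row reproduces that constant 1 for the next layer's [x, 1]:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch (not libdnn's implementation): a layer with `in`
// inputs and `out` outputs stores an (out + 1) x (in + 1) matrix W.
// The last column holds the biases; the last row is chosen as
// [0, ..., 0, 1] so that the output vector ends in the constant 1
// expected by the next layer.
std::vector<double> forward(const std::vector<std::vector<double>>& W,
                            std::vector<double> x) {
  x.push_back(1.0);  // augment the input: [x, 1]
  std::vector<double> y(W.size(), 0.0);
  for (std::size_t r = 0; r < W.size(); ++r)
    for (std::size_t c = 0; c < x.size(); ++c)
      y[r] += W[r][c] * x[c];
  return y;  // y.back() stays 1, ready for the next layer
}
```

With this layout the bias is just another weight, so one matrix multiply covers both the affine transform and the bias addition.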

@supergrover
Author

I printed the data being fed into the network, and it included my label. It is relatively easy to see when testing the XOR example I showed in issue #18.

@poweic
Owner

poweic commented Feb 16, 2015

I just printed it. It didn't include your label; the last row is always 1.

About the problem you encountered, I think it's because I count the number of \n characters to determine how many lines are in the file. Since the data you provided (xor.train.dat and xor.test.dat) have no trailing \n after line 4, the line count comes out one short, which may cause unexpected behavior.
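The off-by-one described above is easy to reproduce. The sketch below is illustrative, not libdnn's actual parser; `count_lines_naive` and `count_lines` are hypothetical names. Counting '\n' characters undercounts by one whenever the file lacks a trailing newline:

```cpp
#include <cassert>
#include <string>

// Naive count: number of '\n' characters. A file whose last line has
// no trailing newline is counted one line short.
std::size_t count_lines_naive(const std::string& data) {
  std::size_t n = 0;
  for (char c : data)
    if (c == '\n') ++n;
  return n;
}

// Safer count: treat a final partial line (no trailing '\n') as a line.
std::size_t count_lines(const std::string& data) {
  std::size_t n = count_lines_naive(data);
  if (!data.empty() && data.back() != '\n') ++n;
  return n;
}
```

On a 4-line file without a trailing newline, the naive version reports 3 lines while the corrected version reports 4, which matches the behavior reported for xor.train.dat and xor.test.dat.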
