
NwJS in combination with py-torch.js #41

Open
3 tasks done
KPrins opened this issue Dec 19, 2024 · 0 comments

Comments

KPrins commented Dec 19, 2024

Describe the Bug

I took a browser-version example, which works well. But if I use it as the main page in an NwJS environment, it produces an error: `Error: GPU.GPU is not a constructor`.

Also: NW.js's `require` seems to be redefined by js-pytorch. `require('fs')` returns `{}`, not the fs module.
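A possible workaround for the clobbered `require` — this is only a sketch, and I haven't confirmed that js-pytorch is what overwrites it: save a reference to NW.js's Node `require` before the library's `<script>` tag runs, then use the saved reference afterwards. The save-and-restore pattern in plain JS (with a stand-in function simulating NW.js's `require`):

```javascript
// Sketch of the save-and-restore pattern for a clobbered global.
// `g.require` here is a stand-in for NW.js's Node require (hypothetical).
const g = globalThis;
g.require = (name) => ({ module: name });  // NW.js provides require

const savedRequire = g.require;            // 1. save it BEFORE the library loads
g.require = () => ({});                    // 2. the library overwrites it
g.require = savedRequire;                  // 3. restore it (or just call savedRequire)

console.log(savedRequire('fs').module);    // → fs
```

In the real page this would mean a small inline `<script>` before the js-pytorch `<script src=...>` tag that stashes `window.require`, and a second one after it that restores it.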

Minimal Reproducible Code

  <head>
      <title>My Project</title>
	  
      <script src="https://cdnjs.cloudflare.com/ajax/libs/js-pytorch/0.7.2/js-pytorch-browser.js" 
              integrity="sha512-l22t7GnqXvHBMCBvPUBdFO2TEYxnb1ziCGcDQcpTB2un16IPA4FE5SIZ8bUR+RwoDZGikQkWisO+fhnakXt9rg=="
              crossorigin="anonymous" 
              referrerpolicy="no-referrer">
      </script>
 
  </head>
  <body>
      <script>
		  // The browser bundle exposes a global `torch` object:
		  const nn = torch.nn;
		  const optim = torch.optim;

		  const device = 'gpu';
  
		  // Define training hyperparameters:
		  const vocab_size = 52;
		  const hidden_size = 32;
		  const n_timesteps = 16;
		  const n_heads = 4;
		  const dropout_p = 0;
		  const batch_size = 8;
  
		  // Create Transformer decoder Module:
		  class Transformer extends nn.Module {
		    constructor(vocab_size, hidden_size, n_timesteps, n_heads, dropout_p, device) {
			  super();
			  // Instantiate Transformer's Layers:
			  this.embed = new nn.Embedding(vocab_size, hidden_size);
			  this.pos_embed = new nn.PositionalEmbedding(n_timesteps, hidden_size);
			  this.b1 = new nn.Block(hidden_size, hidden_size, n_heads, n_timesteps, dropout_p, device);
			  this.b2 = new nn.Block(hidden_size, hidden_size, n_heads, n_timesteps, dropout_p, device);
			  this.ln = new nn.LayerNorm(hidden_size);
			  this.linear = new nn.Linear(hidden_size, vocab_size, device);
		    }
  
		    forward(x) {
			  let z;
			  z = torch.add(this.embed.forward(x), this.pos_embed.forward(x));
			  z = this.b1.forward(z);
			  z = this.b2.forward(z);
			  z = this.ln.forward(z);
			  z = this.linear.forward(z);
			  return z;
		    }
		  }
  
		  // Instantiate your custom nn.Module:
		  const model = new Transformer(vocab_size, hidden_size, n_timesteps, n_heads, dropout_p, device);
  
		  // Define loss function and optimizer:
		  const loss_func = new nn.CrossEntropyLoss();
		  const optimizer = new optim.Adam(model.parameters(), 5e-3, 0); // lr = 5e-3, reg = 0
  
		  // Instantiate sample input and output:
		  let x = torch.randint(0, vocab_size, [batch_size, n_timesteps, 1]);
		  let y = torch.randint(0, vocab_size, [batch_size, n_timesteps]);
		  let loss;
  
		  // Training Loop:
		  for (let i = 0; i < 40; i++) {
		    // Forward pass through the Transformer:
		    let z = model.forward(x);
  
		    // Get loss:
		    loss = loss_func.forward(z, y);
  
		    // Backpropagate the loss using torch.tensor's backward() method:
		    loss.backward();
  
		    // Update the weights:
		    optimizer.step();
  
		    // Reset the gradients to zero after each training step:
		    optimizer.zero_grad();
  
		    // Print loss at every iteration:
		    console.log(`Iter ${i} - Loss ${loss.data[0].toFixed(4)}`)
		  }
      </script>
  </body>

Error message

Uncaught TypeError: GPU.GPU is not a constructor
at Tensor.matmul (js-pytorch-browser.js:332)
at Linear.forward (js-pytorch-browser.js:1655)
at MultiHeadSelfAttention.forward (js-pytorch-browser.js:1696)
at Block.forward (js-pytorch-browser.js:1766)
at Transformer.forward (Test2.html:42)
at Test2.html:65
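Until the root cause is found, a defensive fallback might help. This assumes the error comes from GPU.js failing to initialize under NW.js (which I haven't verified) — the check simply mirrors the failing `GPU.GPU` lookup from the stack trace and drops to CPU when it is absent:

```javascript
// Hypothetical fallback: use the CPU when GPU.js is unavailable.
let device = 'gpu';
if (typeof GPU === 'undefined' || typeof GPU.GPU !== 'function') {
  console.warn('GPU.js not available; falling back to CPU.');
  device = 'cpu';
}
console.log(device);
```

Training will be slower on `'cpu'`, but it would at least show whether everything besides the GPU path works inside NW.js.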

package.json for NwJS

{
  "name": "Start NwJS",
  "version__": "0.0.1",
  "node-remote": ["https://.{domain}.nl/", "file://*"],
  "main": "E:\\Temp\\JS-PyTorch\\Test2.html",
  "single-instance": false,
  "chromium-args": "--disk-cache-size=2147483647 --media-cache-size=2147483647 --incognito",
  "window": {
    "position": "center",
    "resizable": true,
    "height": 800,
    "width": 1000
  }
}
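Aside: JSON treats `\` as an escape character, so Windows paths like the `main` entry above need doubled backslashes — an unescaped sequence such as `\T` is an invalid JSON escape and makes the whole file unparseable. A quick check (path is illustrative only):

```javascript
// Windows paths in JSON need doubled backslashes.
const ok = JSON.parse('{"main": "E:\\\\Temp\\\\Test2.html"}');
console.log(ok.main); // E:\Temp\Test2.html

// An unescaped "\T" is an invalid JSON escape and throws:
let threw = false;
try {
  JSON.parse('{"main": "E:\\Temp\\Test2.html"}'); // \T is invalid
} catch (e) {
  threw = true;
}
console.log(threw); // true
```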

.bat file to start NwJS

@echo on
rem Start Applicatie
rem @echo off

set rootDirSite=E:\Temp\JS-PyTorch
set rootDirSessionFiles=E:\Temp\JS-PyTorch\NwJsSessionFiles
set rootDirNW=E:\NwJs\nwjs-sdk-v0.59.0-win-x64

mkdir "%rootDirSessionFiles%%UserName%"

"%rootDirNW%\nw.exe" --query=%1 --enable-spell-checking --allow-file-access-from-files --allow-file-access --enable-node-worker --user-data-dir="%rootDirSessionFiles%%UserName%" %rootDirSite%

@echo on
pause

Screenshot or Screen Recording (optional)

No response

Issue Checkbox

  • I added a descriptive title to this issue.
  • I searched the other issues on js-torch, and didn't find any reporting this same bug.
  • The bug is not resolved by updating to the latest version of js-pytorch on npm.

Would you like to work on this issue?

None

@KPrins KPrins changed the title NW.js in combination with py-torch.js NwJS in combination with py-torch.js Dec 23, 2024