
Error: Failed to fetch | TypeError: Cannot read properties of undefined (reading 'status') #26

Closed
Mariotim3 opened this issue Apr 12, 2023 · 11 comments
Labels
bug Something isn't working

Comments

@Mariotim3

No description provided.

@realcoloride
Owner

Hello!

Could you specify more details about what happened? Code, error path, etc...

@Mariotim3
Author

I believe it's due to attempting to send multiple messages while one is already being generated. I resolved the issue by waiting until one message is completed before sending the next. I'll let you know if I find more issues. Thanks for making this!

@Mariotim3
Author

I decided to run multiple instances of node_characterai separately, so that if the bot is already generating a message for one response, it can use another instance to generate one for someone else. (It seems to have worked, and it can now generate multiple responses at the same time.)

I have only had the issue once since then, and I'm still trying to find more info about it.
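For reference, the round-robin idea above can be sketched in a few lines. `ClientPool` is hypothetical and not part of node_characterai; how the clients themselves are constructed is left out:

```javascript
// Sketch: rotate over several pre-created clients so a busy instance
// doesn't block messages that could run on an idle one.
class ClientPool {
  constructor(clients) {
    this.clients = clients; // e.g. several authenticated node_characterai clients
    this.i = 0;
  }

  // Return the next client in round-robin order.
  next() {
    const client = this.clients[this.i];
    this.i = (this.i + 1) % this.clients.length;
    return client;
  }
}
```

The trade-off, as noted below, is that each client carries its own browser overhead.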

The issue also seems to occur in the chat.js file, on line 65:

`if (request.status() === 200) {` → `TypeError: Cannot read properties of undefined (reading 'status')`

@realcoloride
Owner

If `request` is undefined, it means something went wrong with the puppeteer evaluation: it normally returns an object, and `status` is a procedurally generated function attached when /streaming/ is called.
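A defensive guard around that call would at least turn the crash into a retry. This is only a sketch; `requester.send` is a stand-in for whatever actually performs the puppeteer evaluation in the library:

```javascript
// Sketch: if the puppeteer evaluation failed (e.g. because a concurrent
// request interfered), `request` comes back undefined and calling
// request.status() throws. Guard the result and retry instead.
async function sendSafely(requester, payload, retries = 2) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    const request = await requester.send(payload); // hypothetical API
    if (request && typeof request.status === "function" && request.status() === 200) {
      return request;
    }
    // Undefined (or non-200) response: back off briefly, then retry.
    await new Promise((resolve) => setTimeout(resolve, 250));
  }
  throw new Error("request was undefined (or non-200) after all retries");
}
```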

@cshadd

cshadd commented Apr 13, 2023

I can confirm the same thing is happening on my end. I have not tried the similar fix of running multiple instances of node_characterai.

I can confirm this happens when multiple fetches run at the same time.

One other solution would probably be to queue each request and wait for each one to finish.

edit: Now that I think about it, it might also affect other functions, such as fetching and searching characters.

@Mariotim3
Author

I did try queueing each request and waiting for it to finish, and it worked. However, with multiple users conversing at the same time, the chat ended up falling far behind (each response takes around 4–10 seconds to generate) and couldn't respond to new messages quickly enough.
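A possible middle ground between one global queue and one client per user is to serialize per conversation, so different users' messages still run concurrently. A minimal sketch (not part of node_characterai; the keying scheme is an assumption):

```javascript
// Sketch: serialize work per key (e.g. per conversation/user) so messages
// for different users run concurrently while messages for the same user
// still wait for the previous one to finish.
const tails = new Map(); // key -> tail promise of that key's queue

function enqueue(key, task) {
  const tail = tails.get(key) ?? Promise.resolve();
  // Chain after the previous task for this key, ignoring its failure so
  // one bad request doesn't wedge the whole queue.
  const next = tail.catch(() => {}).then(() => task());
  tails.set(key, next);
  return next;
}
```

With one client this still serializes same-user messages, but a slow generation for one user no longer delays everyone else.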

@realcoloride realcoloride added the bug Something isn't working label Apr 13, 2023
@cshadd

cshadd commented Apr 14, 2023

Yeah, queueing will probably take a long time depending on how many users are active. It's just an alternative to creating multiple client instances for each request.

The problem with spawning multiple puppeteer instances is that each one opens a Chromium instance, and depending on how you are hosting this, that might not be feasible, so there are trade-offs to each solution.

I'm trying to look into it and see if I can find a solution. I really think they should just remove the Cloudflare blocking.

@cshadd

cshadd commented Apr 14, 2023

One other solution I can think of at the moment is to create multiple puppeteer instances per request call without forcing the developer to make a new client. This would still spawn a Chromium instance and multiple puppeteer instances across all the requests, but it is probably better for the end developer.

@creepycats
Contributor

I believe node_characterai has the same issue I had before with multiple messages being sent at once.

I don't know exactly how requester.js works just yet, but here's a suggestion for coloride:

Add some sort of "request_id" header to requests. I use this to make sure whatever I request is what I intercept.

This basically fixes issues with Eval-Fetching and makes sure we only listen for the requests we want.

[three screenshots attached]

Of course, I don't know what might break with this change; it could be set up this way for a very specific reason. It would just let you use one single type of requesting that should hopefully fix whatever this issue is.

@realcoloride
Owner

Hello!
Is that about managing multiple requests?

@Parking-Master
Contributor

Parking-Master commented Jul 17, 2023

You can close this issue now. It was definitely related to a caching problem and has been resolved as of v1.0.8.
