
Faceswap feature causes the app to crash on Google Colab #1175

Closed
ruipedrodias94 opened this issue Dec 4, 2023 · 4 comments
Comments

@ruipedrodias94

Describe the problem
When trying to use the Faceswap feature in Google Colab, the app crashes.

Full Console Log
Image generated with private log at: /content/Fooocus/outputs/2023-12-04/log.html
Generating and saving time: 30.31 seconds
Requested to load SDXLClipModel
Requested to load GPT2LMHeadModel
Loading 2 new models
[Fooocus Model Management] Moving model(s) has taken 0.85 seconds
Total time: 70.66 seconds
[Parameters] Adaptive CFG = 7
[Parameters] Sharpness = 2
[Parameters] ADM Scale = 1.5 : 0.8 : 0.3
[Parameters] CFG = 4.0
[Parameters] Seed = 51281143977912873
[Fooocus] Downloading control models ...
[Fooocus] Loading control models ...
^C

@cad1231

cad1231 commented Dec 4, 2023

If you are using the free version of Google Colab, there is not enough RAM for faceswap. You need the paid version; then go to 'Change runtime type' and make sure 'High-RAM' is turned on. You don't need to change anything else. The 50 compute units you get with a monthly subscription should last you 20-30 hours of processing time.

@RukshanJS

> If you are using the free version of Google Colab, there is not enough RAM for faceswap. You need the paid version, and then go to 'change runtime type' and make sure that 'high RAM' is turned on. You don't need to change anything else. 50 compute units that you get with a monthly subscription should last you 20-30 hours of processing time.

That's a helpful answer. Could someone kindly add this to the README under the Colab section?

@tbuyle

tbuyle commented Dec 20, 2023

Try changing the last line of the Colab notebook to:

!python entry_with_update.py --share --preset --always-high-vram --all-in-fp16

See also #1377

@Damarcreative

Always monitor RAM and GPU usage.
By default, running in normal VRAM mode will fill your RAM and the process will stop.
Since GPU usage is quite low, try adding the --highvram parameter; this decreases RAM usage and increases GPU usage slightly.
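To watch RAM usage while a face swap runs, a small helper like the following can be pasted into a Colab cell. This is a hypothetical monitoring sketch, not part of Fooocus; it assumes a Linux VM (as Colab provides) and reads /proc/meminfo directly, so it needs no extra packages.

```python
def read_meminfo():
    """Parse /proc/meminfo into a dict of {field: kilobytes}."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            # Each value is reported in kB, e.g. "MemTotal:  13297228 kB"
            info[key] = int(rest.strip().split()[0])
    return info

def ram_usage_gib():
    """Return (used, total) system RAM in GiB."""
    m = read_meminfo()
    used_kb = m["MemTotal"] - m["MemAvailable"]
    return used_kb / 1024**2, m["MemTotal"] / 1024**2

used, total = ram_usage_gib()
print(f"RAM used: {used:.1f} GiB of {total:.1f} GiB")
```

Running this before and during generation makes it easy to confirm whether the crash above is a RAM exhaustion (usage climbing toward the total just before the ^C). GPU memory can be checked separately with nvidia-smi.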
[Screenshot from 2023-12-25 11-08-45: Colab resource monitor showing RAM and GPU usage]

@mashb1t mashb1t closed this as completed Dec 30, 2023
6 participants