Update: everything is working great

```
pip install -r requirements.txt; pip install -e .
PYTORCH_ENABLE_MPS_FALLBACK=1 python scripts/dream.py --full_precision -A plms --web
```

You should see a warning `>> cuda not available, using device mps` - this is good! On a fully loaded M1 Max, this runs at 1.5 it/s with k_euler or 3.91 it/s with PLMS.
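The device selection behind that warning can be sketched roughly like this — a simplified illustration, not the repo's actual code, and `choose_device` is a hypothetical helper; with real torch the two flags would come from `torch.cuda.is_available()` and `torch.backends.mps.is_available()`:

```python
# Sketch of CUDA -> MPS -> CPU device selection (hypothetical helper,
# not the repo's actual implementation).

def choose_device(cuda_available: bool, mps_available: bool) -> str:
    """Return the torch device string to use, preferring CUDA."""
    if cuda_available:
        return "cuda"
    if mps_available:
        # The expected path on Apple Silicon.
        print(">> cuda not available, using device mps")
        return "mps"
    return "cpu"

print(choose_device(cuda_available=False, mps_available=True))
```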
I got various errors attempting to run the main repo as is (commit 3ee82d8), using pip install instead of conda install; this laptop doesn't have conda on it at all.

The workaround is to set up a Python 3.10 env and update requirements.txt. I also need to run `pip install -e .` so that `ldm` can be found. This enables me to run scripts and get output:

```
python ./scripts/orig_scripts/txt2img.py --prompt "ocean" --ddim_steps 5 --n_samples 1 --n_iter 1
```

✨ ✨ ✨ ✨ (output isn't great, but it works!)
However, `dream.py` has trouble running, even with the `--full_precision` flag. It shows errors like:

```
RuntimeError: expected scalar type BFloat16 but found Float
```

Searching issues, I see this is possibly due to hardcoded CPU-vs-GPU assumptions (see #44 (comment)): somewhere internally the code expects half precision on GPU vs full precision on CPU. Since the other scripts work but `dream.py` doesn't, is there somewhere in `dream.py` with similar hardcoded assumptions that we need to update? -- this is fixed in #319
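The kind of hardcoded assumption described above can be sketched as follows (`choose_dtype` is a hypothetical helper, not the actual `dream.py` code): half precision gets picked whenever the device isn't CPU, which mismatches dtypes on MPS unless an explicit `full_precision` override is honored.

```python
# Sketch of the precision-selection issue described above (hypothetical
# helper, not the actual dream.py code). The fragile pattern is
# "half precision unless CPU"; honoring an explicit full_precision
# flag avoids dtype mismatches like "expected scalar type BFloat16
# but found Float" on MPS.

def choose_dtype(device: str, full_precision: bool = False) -> str:
    """Return the weight dtype name for a given device."""
    if full_precision or device in ("cpu", "mps"):
        # MPS and CPU are safest at full precision.
        return "float32"
    # CUDA default: half precision for speed and memory.
    return "float16"

print(choose_dtype("mps", full_precision=True))
```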
Everything runs great now on M1, though I have to set the env var `PYTORCH_ENABLE_MPS_FALLBACK=1`. So, running

```
PYTORCH_ENABLE_MPS_FALLBACK=1 python scripts/dream.py --full_precision -A plms --web
```

works great!
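For reference, the fallback variable can also be set from Python rather than on the command line, as long as it happens before `import torch` (the env var name is real PyTorch; the snippet itself is just a sketch):

```python
import os

# PYTORCH_ENABLE_MPS_FALLBACK=1 makes PyTorch run ops that have no
# MPS kernel on the CPU (with a warning) instead of raising
# NotImplementedError. Set it before importing torch -- which is why
# prefixing the command line, as above, is the usual approach.
os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")

print(os.environ["PYTORCH_ENABLE_MPS_FALLBACK"])
```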