Installing BitsAndBytes for Windows - So That You Can Do PEFT

When working with the Hugging Face ecosystem, you may find tutorials that have you install a library called ‘bitsandbytes’, particularly if you want to use PEFT or LoRA to fine-tune your own Large Language Models (LLMs). This should be as easy as:

pip install bitsandbytes

But if you are on Windows (rather than Linux), you’ll quickly find that you get errors. I often get this error:

CUDA_SETUP: WARNING! not found in any environmental path. Searching in backup paths...

Or possibly this error:

CUDA Setup failed despite GPU being available. Please run the following command to get more information:

python -m bitsandbytes

Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes and open an issue at:

Not Windows Compatible?

It turns out that bitsandbytes isn’t yet compatible with Windows (as of this article's publication) and only works on Linux. So, are you just out of luck on Windows?

One possibility you might come across if you search the Internet is installing a custom Windows build created by Tim Dettmers, using this suggested command:

python -m pip install bitsandbytes --prefer-binary --extra-index-url=

However, if you are like me, this didn’t work for you. It turns out that the install is a bit sensitive: if your environment contains anything it doesn’t expect, it may fail. After trial and error, we managed to get it to work by following the order below.

Script To Install BitsAndBytes for Windows

First, start with a fresh Conda environment (this may also work with a fresh Python virtual environment, though we haven’t tried that yet). We know it works with a fresh Conda environment with basically nothing installed.

Now, presumably you’ll need PyTorch if you are going to do anything useful with bitsandbytes, so start by installing PyTorch using this URL:

Here is the conda command it suggested for us:

conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia

Next, you’ll want to install the Hugging Face Transformers library with PyTorch integration. Here is the relevant URL on how to do that:

There are a few install options: pip install transformers, pip install 'transformers[torch]', or conda install conda-forge::transformers. Since we want the PyTorch integration, here is the pip we used:

pip install 'transformers[torch]'

Now install the Hugging Face Hub library:

python -m pip install huggingface_hub

Next, install PEFT. Here is the URL that explains how:

I used this pip:

pip install peft

Next, you’ll need CUDA for Windows. Here is the URL that explains how:

I did this one via Conda:

conda install cuda -c nvidia

Verify that you now have CUDA working by doing this:

import torch
torch.cuda.is_available()  # returns True

Also, you can verify that you do NOT yet have bitsandbytes by doing this:

import transformers
transformers.is_bitsandbytes_available()  # returns False
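As a broader sanity check at this point, you can see which pieces of the stack are importable without actually importing them, using only the standard library. This is a minimal sketch (the list below assumes the import names match the package names we installed above; bitsandbytes should still show as missing at this stage):

```python
import importlib.util

def check_installed(names):
    """Return {name: True/False} for whether each package is importable."""
    return {n: importlib.util.find_spec(n) is not None for n in names}

# Import names for the stack installed above; bitsandbytes should still
# be MISSING at this point in the walkthrough.
stack = ["torch", "transformers", "huggingface_hub", "peft", "bitsandbytes"]
for name, ok in check_installed(stack).items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```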

Finally, we’re ready for bitsandbytes. Here is a custom version for Windows by Tim Dettmers, as explained here:

Here is the command that worked for me:

python.exe -m pip install

Now check transformers.is_bitsandbytes_available() again; it should return True.

And that’s it! You now have bitsandbytes installed for Windows!
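For reference, the whole sequence can be collected into one small Python script that prints the commands in the order that worked for us. It prints rather than runs them, so nothing installs by accident; swap print for subprocess.run to execute. Two assumptions to note: we use the 'transformers[torch]' variant of the Transformers install, and the final bitsandbytes command is deliberately left incomplete, since the wheel/index URL must come from the link above:

```python
# Install order from this walkthrough. The last command is a placeholder:
# fill in the bitsandbytes wheel / index URL from the link in the article.
INSTALL_COMMANDS = [
    "conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia",
    "pip install 'transformers[torch]'",
    "python -m pip install huggingface_hub",
    "pip install peft",
    "conda install cuda -c nvidia",
    "python -m pip install bitsandbytes ...",  # placeholder; see link above
]

for step, cmd in enumerate(INSTALL_COMMANDS, start=1):
    print(f"Step {step}: {cmd}")
```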

However, one word of warning: this build is older than the current (Linux) release of bitsandbytes, so I can’t vouch for how well it will work, comparatively speaking.

