Error: Repository storage limit reached (Max: 1 GB)

There isn’t a way to increase the 1 GB storage limit for a Space repo.

You can free up storage; there are more tips here: How can I free up storage space in my account/organization?

Thanks @meganariley. I'm now migrating the model to the HF Hub. I noticed that some of my models are single .safetensors files that are loaded with DiffusionPipeline.from_single_file("model.safetensors"). Would you recommend using the standard layout ("model_index.json", "diffusion_pytorch_model.safetensors", etc.), or could I efficiently use

from huggingface_hub import hf_hub_download
from diffusers import DiffusionPipeline
path = hf_hub_download("username/model-repo", "model.safetensors")
pipe = DiffusionPipeline.from_single_file(path)

Thanks.

Once the model is loaded into the pipeline, from_single_file ends up in the same state as from_pretrained, so either is fine. However, since from_single_file converts the checkpoint on the fly, from_pretrained is slightly faster to load.

If you want to save the model in the converted state, just do the following:

from huggingface_hub import hf_hub_download
from diffusers import DiffusionPipeline

path_diffusers = "./model_diffusers"
path = hf_hub_download("username/model-repo", "model.safetensors")
pipe = DiffusionPipeline.from_single_file(path)
pipe.save_pretrained(path_diffusers)  # writes model_index.json plus the per-component weights
# new_pipe = DiffusionPipeline.from_pretrained(path_diffusers)  # to reload the converted copy
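
If you also want the converted copy back on the Hub so you can load it later with from_pretrained, something like the following should work (the target repo id is only a placeholder):

# assumes you are logged in (e.g. via `huggingface-cli login`); repo id is a placeholder
pipe.push_to_hub("username/model-repo-diffusers")
# later, from anywhere:
# new_pipe = DiffusionPipeline.from_pretrained("username/model-repo-diffusers")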

Nice tip. Thanks!

I’ve created a sample code/space for the transition.

I completely deleted my repo and started a new one, this time with a total repo size of 1.1 MB, but even after changing the remote to the new repo it's still telling me that I've reached the 1 GB limit.

Which repo?

I started with this one: https://huggingface.co/spaces/jonorl/fugazzeta which has now been deleted.
I now have Fugazzetav2 - a Hugging Face Space by jonorl. The only way I found to upload files is through the web UI and committing there, as opposed to the terminal. My local repo weighs 1.1 MB, so I'm not sure why I keep getting the same "batch response: Repository storage limit reached (Max: 1 GB)" error. There are no big files listed when inspecting large files in the repo settings either.

Maybe just having LFS pointers left in the local Git history is enough to cause problems…?
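
If the local Git history is the problem, one way to sidestep git entirely is to upload the Space files with the huggingface_hub Python API instead. A minimal sketch, with the repo id as a placeholder:

from huggingface_hub import HfApi

api = HfApi()  # assumes you are already logged in via `huggingface-cli login`
api.upload_folder(
    folder_path=".",                # local Space files
    repo_id="username/space-name",  # placeholder: use the actual Space repo id
    repo_type="space",
    ignore_patterns=[".git/*"],     # skip local git metadata
)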

Yep, Gemini suggested the same thing, but it didn't work out. After trying pruning in a million different ways, untracking the large .pkl files from LFS, and creating a third repo, I finally managed to make it work. Thanks anyway.
