Converting a HuggingFace model for Telegram use

THIS IS AN ALPHA FEATURE

Note: This feature is in alpha; it may not work as expected

Recommended reading before starting: Concepts and PT files

Our Telegram bots such as PirateDiffusion require the full model to be downloaded, not just the CKPT file. You’ll therefore need to clone the model, merge in your changes, and give it its own unique name, treating it as a standalone model.

#1. Go to the HuggingFace website

#2. Log into your account, or sign up for a new one.

#3. Click on the “Repos” tab, which is located in the navigation bar at the top of the page.

#4. You should see an alphabetical list of all available repos. Hover your mouse over any repo and click the clone button on its right side to copy its link address to your clipboard. Alternatively, you can click any of these repos to open its main page and view or download its contents.

#5. Once you have copied or downloaded a repo link, open your terminal (or equivalent command-line interface), navigate to where you want to store it, and run this command: git clone URLOFREPO. Replace URLOFREPO with the repo address you copied in step #4. The command downloads all of the files in that repository to your local machine at that location.
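
For example, with a hypothetical repository (the user and model names below are placeholders, not a real repo), the command would look like this:

$ git clone https://huggingface.co/someuser/some-model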

Transformers Library

1. Install the HuggingFace Transformers library:

$ pip install transformers

2. Load the checkpoint file into Python (If it’s not a TensorFlow checkpoint, you’ll first need to convert it):

import tensorflow as tf

# tf.keras.models.load_model returns a Keras model, whose architecture can be exported in the next step
model = tf.keras.models.load_model("path/to/checkpointfile")
print(model)

3. Create the configuration file for your model in JSON format:

with open("configname.json", "w") as fp:
    fp.write(model.to_json())

This file describes the structure of your model, so the weights from the checkpoint file can later be loaded into a matching architecture.
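
As a rough illustration of that idea, the JSON file can later be read back to rebuild the same architecture and load the checkpoint weights into it. This is only a sketch; the paths are placeholders and it assumes a standard Keras model and weights file:

with open("configname.json") as fp:
    rebuilt = tf.keras.models.model_from_json(fp.read())  # rebuild the architecture described in step 3

rebuilt.load_weights("path/to/checkpointfile")  # load the checkpoint weights into that structure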

4. Instantiate and load a tf.keras Transformer model from the configuration file and loaded checkpoint file:

from transformers import TFBertModel

transformers_model = TFBertModel.from_pretrained("config-name", from_pt=True, config="config-name")
transformers_model(tf.zeros([1, 128], dtype=tf.int32))  # check that it works with some input data (BERT expects integer token IDs)
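
If the call succeeds, the returned object is a standard Transformers model output, so a quick sanity check (assuming default BERT dimensions, which may differ for your model) is to look at its shape:

outputs = transformers_model(tf.zeros([1, 128], dtype=tf.int32))
print(outputs.last_hidden_state.shape)  # e.g. (1, 128, 768) for a bert-base sized model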

5. Save the model as a diffuser (in this case, we are using a BertDiffuser):

bert_diffuser = BertDiffuser(transformers_model)
bert_diffuser.save("/path/to/saved/models", "diffusers-name")
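
If a BertDiffuser class isn’t available in your environment, one possible fallback (our assumption, not part of this guide’s exact pipeline) is the save_pretrained method that Transformers models provide, which writes the config and weights to a directory:

# Writes config.json plus the TF weights into the target folder (path is a placeholder)
transformers_model.save_pretrained("/path/to/saved/models/diffusers-name")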

Once the model is created, please notify an @admin to evaluate it in Telegram