Adam 2023-02-12 21:40:39 -05:00
parent eac217cc09
commit 553ded1bc8
4 changed files with 31 additions and 12 deletions

.gitignore vendored Normal file

@@ -0,0 +1 @@
+*.tar.gz


@@ -1,9 +1,15 @@
 # Chatbots API
-FastAPI and PyTorch
+[FastAPI](https://fastapi.tiangolo.com/) and [PyTorch](https://pytorch.org/)
-To build yourself you'll need to first train a model with ../train
+To build one yourself you'll need to first [train a model](../train),
+place the entire directory (checkpoints aren't needed) containing pytorch_model.bin in [bots](./src/bots),
+then edit or duplicate [cartman.py](./src/bots/cartman.py).
-My image compressed is 1.4GB
+Cartman Docker images are available for
+[x86_64](https://doordesk.net/files/chatbots_api_x86_64.tar.gz) (1.6GB) and
+[aarch64](https://doordesk.net/files/chatbots_api_x86_64.tar.gz) (1.4GB)
-Scripts in test to talk to it
+See [run](./run) and [test](./test) to interact with it
+Live demo [here](https://doordesk.net/cartman)
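
For orientation, here is a minimal sketch of what a bot module plus its FastAPI route could look like, assuming the trained model is a Hugging Face causal language model saved in a directory under [bots](./src/bots). The directory name, endpoint, and sampling settings below are illustrative assumptions, not the contents of the actual cartman.py:

```python
# Hypothetical sketch; paths, route name, and generation settings are assumptions,
# not copied from the repository's cartman.py.
from fastapi import FastAPI
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_DIR = "./src/bots/cartman"  # assumed: the directory containing pytorch_model.bin

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR)

app = FastAPI()


@app.get("/chat")
def chat(message: str) -> dict:
    # Tokenize the prompt, sample a continuation, and return only the new tokens.
    input_ids = tokenizer.encode(message + tokenizer.eos_token, return_tensors="pt")
    output_ids = model.generate(
        input_ids,
        max_length=128,
        do_sample=True,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
    return {"reply": reply}
```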

train/.gitignore vendored

@@ -1,3 +1,5 @@
 __pycache__/
 .ipynb_checkpoints/
 cartman/
+cached/
+runs/


@@ -1,18 +1,28 @@
 import pandas as pd
 
-df = pd.read_csv('./data/All-seasons.csv')
+INPUT_FILE_PATH = './data/All-seasons.csv'
+OUTPUT_FILE_PATH = './data/train_data.csv'
+
+df = pd.read_csv(INPUT_FILE_PATH)
 
-cleanlines = pd.Series(
-    [cell
+clean_lines = pd.Series(
+    [filter_lines
         .replace('\n', '')
         .replace('(', '')
         .replace(')', '')
         .replace('  ', ' ')
         .strip()
-        for cell in df.Line
+        for filter_lines in df.Line
     ]
 )
 
-train = pd.DataFrame(df.Character)
-train['line'] = cleanlines
-train.columns = ['name', 'line']
-train.to_csv('./data/train.csv', index=False)
+train_data = pd.DataFrame(df.Character)
+del df
+
+train_data['line'] = clean_lines
+train_data.columns = ['name', 'line']
+train_data.to_csv(OUTPUT_FILE_PATH, index=False)