Integrate existing model inference codes into procslib.
Assuming the full codebase is given, gpt-o1 usually one-shots these problems and only needs minor tweaks for output formatting.
Prompt Template:
I have a disorganized repository with some Python files, a pyproject.toml, and random data/checkpoint directories. I want to:
- Make it a standard Python package named `my_project`, using the `src/my_project` layout, so the final structure is something like:
```
my_project/
├── pyproject.toml
├── requirements.txt
├── src/
│   └── my_project/
│       ├── __init__.py
│       ├── main_inference_class.py
│       └── ...
└── ...
```
- Remove any local path hacks (`sys.path.append(...)`, etc.) and switch them to either `from .some_module import ...` or `from my_project.some_module import ...`, so that the code runs after `pip install .`.
- Convert hard-coded local checkpoint paths into code that fetches from Hugging Face via `huggingface_hub.hf_hub_download`. For example:

```python
from huggingface_hub import hf_hub_download

local_ckpt_path = hf_hub_download("some_user/my_model_repo", "my_checkpoint.pt")
```
- Create a single `Inference` class that loads the model (downloading from Hugging Face if no local path is provided), plus a demonstration snippet that uses it. Example:

```python
from my_project.main_inference_class import MyInference

# Instantiate
infer = MyInference(
    model_repo="some_user/my_model_repo",
    model_filename="my_checkpoint.pt",
)

# Inference
audio = infer("Hello world")
```
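As a reference point for what the request should produce, here is a minimal sketch of such a class. The class and argument names mirror the examples above; the model-loading and inference bodies are placeholders, not the real implementation:

```python
from pathlib import Path


class MyInference:
    """Loads a checkpoint from a local path, or downloads it from the Hub."""

    def __init__(self, model_repo=None, model_filename=None, local_path=None):
        if local_path is not None:
            # An explicit local checkpoint takes precedence over the Hub.
            self.ckpt_path = Path(local_path)
        else:
            # Deferred import: only needed when we actually download.
            from huggingface_hub import hf_hub_download
            self.ckpt_path = Path(hf_hub_download(model_repo, model_filename))
        # The real class would load the model here,
        # e.g. torch.load(self.ckpt_path).

    def __call__(self, text):
        # Placeholder: the real implementation runs the model on `text`.
        raise NotImplementedError
```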
- **Update `pyproject.toml`** so that it includes any needed dependencies in `[project.dependencies]` and also includes a `[project.scripts]` entry if we want a CLI. Example:
```toml
[project]
name = "my_project"
version = "0.1.0"
description = "..."
requires-python = ">=3.10"
dependencies = [
    "torch",
    "huggingface_hub",
    "soundfile",
    # ...
]

[project.scripts]
my_project-cli = "my_project.__main__:main"
```
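The `[project.scripts]` entry above expects a `main` function in `my_project/__main__.py`. A minimal sketch of that module might look like this (the CLI argument is illustrative, not from the actual repo, and the inference call is stubbed out):

```python
# src/my_project/__main__.py (hypothetical)
import argparse


def main(argv=None):
    parser = argparse.ArgumentParser(prog="my_project-cli")
    parser.add_argument("text", help="input text to run inference on")
    args = parser.parse_args(argv)
    # The real entry point would build MyInference and run it; stubbed here.
    print(f"running inference on: {args.text}")
```

Adding an `if __name__ == "__main__": main()` guard at the bottom also makes `python -m my_project` work alongside the installed `my_project-cli` command.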
- Demonstrate that the user can install with `pip install .` and then run the inference code or the CLI command.
Given these steps, please provide a concise diff or snippet for each file so we can see exactly what needs to be changed. The final result should be a neatly installable Python package.
code dump:
<YOUR-CONTENT-HERE>