Improved readme
parent b842c7bd94
commit 0dbbde5835

README.md
Meant to be used with a finetuned GPT-2 model

## Usage
To run the bot, you need a valid `bot_config.json` file at the path where you're running the bot. If you do not have one, izzilis will generate a default one for you to fill out.
This bot currently *requires* a Telegram bot token to work, as there is no option to disable curation.

To create a bot, use the [Botfather](https://t.me/botfather). Once the bot is created and running, set a channel for it to post curation options in via the `/setmain` bot command. Usually this means sending `/setmain@bot_username` in a chat with the bot (group chats work too).
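As a rough sketch of how this maps onto `bot_config.json` (the field names come from the config table below; the token string is a placeholder, not a real value), the Telegram-related entries look something like:

```json
"bot_token": "<token from BotFather>",
"chat_ref": 0
```

`chat_ref` stays at `0` until you send `/setmain`, after which the bot fills it in.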
The default config uses the `Misskey` publisher. If you want to publish to Mastodon, Pleroma, or any other Mastodon-compatible API, replace `Misskey` in the `publisher` object with `Mastodon`.
## Config values
| Name                | Value                                                                                            |
|---------------------|--------------------------------------------------------------------------------------------------|
| `python_path`       | The path to the system's python3 interpreter                                                     |
| `model_name`        | The name of the GPT-2 model to use (see the gpt-2 docs)                                          |
| `temperature`       | The `temperature` value to call gpt-2 with (see the gpt-2 docs)                                  |
| `top_k`             | The `top_k` value to call gpt-2 with (see the gpt-2 docs)                                        |
| `gpt_code_path`     | The path to where the gpt-2 source & models are located                                          |
| `interval_seconds`  | See [interval_seconds](#interval_seconds)                                                        |
| `bot_token`         | Telegram Bot API token                                                                           |
| `chat_ref`          | The chat reference ID for the Telegram bot; leave at 0, it is filled in once `/setmain` is sent  |
| `post_buffer`       | The maximum number of curated samples the bot holds at a time                                    |
| `publisher`         | See [publisher](#publisher)                                                                      |
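Putting the table together, a filled-out `bot_config.json` might look roughly like the sketch below. Every value here is an illustrative placeholder (paths, model name, sampling settings, buffer size), the exact types may differ from the generated default, and the nested `interval_seconds` and `publisher` objects are detailed in the sections that follow:

```json
{
    "python_path": "/usr/bin/python3",
    "model_name": "your-finetuned-model",
    "temperature": 0.8,
    "top_k": 40,
    "gpt_code_path": "/path/to/gpt-2",
    "interval_seconds": {
        "min": 1800,
        "max": 7200
    },
    "bot_token": "<token from BotFather>",
    "chat_ref": 0,
    "post_buffer": 10,
    "publisher": {
        "Misskey": {
            "base_url": "",
            "token": ""
        }
    }
}
```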
### interval_seconds
| Name  | Value                                            |
|-------|--------------------------------------------------|
| `min` | Minimum number of seconds to wait between posts  |
| `max` | Maximum number of seconds to wait between posts  |
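For instance, an entry that waits somewhere between 30 minutes and 2 hours between posts (the numbers are only an illustration) would look like:

```json
"interval_seconds": {
    "min": 1800,
    "max": 7200
}
```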
### publisher

The publisher can currently hold one of two JSON objects, named either `Mastodon` or `Misskey`, which determines which posting API it will use. Whether the object is `Misskey` or `Mastodon`, it has the following members:
| Name       | Value                                                                                                       |
|------------|--------------------------------------------------------------------------------------------------------------|
| `base_url` | The base URL of the instance                                                                                  |
| `token`    | The auth token for the account; leave empty for `Mastodon`, as you will be prompted to log in and authorize  |
An example `Misskey` publisher entry looks like this:
```json
"publisher": {
    "Misskey": {
        "base_url": "",
        "token": ""
    }
}
```
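As noted above, to publish to Mastodon, Pleroma, or another Mastodon-compatible API instead, the entry is the same shape with the object renamed to `Mastodon`; the `token` is left empty because you will be prompted to log in and authorize:

```json
"publisher": {
    "Mastodon": {
        "base_url": "",
        "token": ""
    }
}
```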