Commit 8b25ff6 (0 parents). Showing 27 changed files with 1,490 additions and 0 deletions.
@@ -0,0 +1,7 @@
src/config/load_env.py
volumes/*
*.pyc
*.csv
*.json
.env
.DS_Store
@@ -0,0 +1,55 @@
# ViMedicine: A Vietnamese LLM agent for drug and illness suggestion

## Introduction
In this project, we present a medicine agent for Vietnamese medical suggestion.
The medicine agent can:
1. Predict illnesses based on the patient's symptoms
2. Suggest medicines based on the disease
3. Check a specific medicine's suitability given the disease and the patient's symptoms
4. Predict a medicine's side effects and whether it interacts with or conflicts with the patient's medical record

We use the OpenAI GPT-3.5 model to read documents and generate medical entities, but the agent flow is not restricted to OpenAI GPT models or to the Vietnamese medical domain.
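As a rough illustration of this document-reading step (the project's actual prompts and entity schema are not shown in this commit), a chat model can be asked to return structured medical entities from a text chunk:

```python
# Hedged sketch: extracting medical entities from a document chunk with a chat model.
# The prompt wording, entity fields, and model name are illustrative assumptions.
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

chunk = "Paracetamol 500mg: giảm đau, hạ sốt; thận trọng với bệnh nhân suy gan."
messages = [
    SystemMessage(content=(
        "Extract medical entities (drug, dosage, indication, warning) from the "
        "following Vietnamese text and return them as a JSON object."
    )),
    HumanMessage(content=chunk),
]
response = llm(messages)   # returns an AIMessage
print(response.content)    # e.g. {"drug": "Paracetamol", "dosage": "500mg", ...}
```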
> ⚠️ Note: the full documentation is coming soon.

## Install requirements
Put your chat model API key in the `config/.env` file.
Install the dependencies via `pip`:
```bash
pip install -r requirements.txt
```
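Note that `src/config/load_env.py` is gitignored in this commit, so each user supplies their own copy. A minimal sketch of what it might contain, assuming python-dotenv and the conventional `OPENAI_API_KEY` variable name (both assumptions, not confirmed by the repository):

```python
# Hypothetical config/load_env.py (the real file is gitignored in this commit).
# Assumes python-dotenv and the conventional OPENAI_API_KEY variable name.
import os
from dotenv import load_dotenv

load_dotenv("config/.env")                     # read key=value pairs into the environment
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")   # picked up later by the chat model client
```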
## Initialize Milvus server and deploy API
**Initialize the Milvus server and create the collection**
```bash
docker-compose up -d
python setup_milvus.py
```
Note: for the data, check out the [data_v1 release](https://github.com/nmd2k/vi-medicine/releases/tag/data_v1).
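`setup_milvus.py` itself is not shown in this excerpt; in outline, creating a Milvus collection for the embeddings looks roughly like the sketch below. The collection name, field layout, and the 1536-dimension OpenAI embedding size are assumptions.

```python
# Hedged sketch of a Milvus collection setup (pymilvus 2.x).
# Collection name, schema fields, and embedding dimension are assumptions.
from pymilvus import (CollectionSchema, FieldSchema, DataType,
                      Collection, connections)

connections.connect(host="localhost", port="19530")   # default Milvus port from docker-compose

fields = [
    FieldSchema(name="id", dtype=DataType.INT64, is_primary=True, auto_id=True),
    FieldSchema(name="text", dtype=DataType.VARCHAR, max_length=4096),
    FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=1536),
]
schema = CollectionSchema(fields, description="Vietnamese medical documents")
collection = Collection(name="vi_medicine", schema=schema)

# Index the vector field so similarity search is fast.
collection.create_index(
    field_name="embedding",
    index_params={"index_type": "IVF_FLAT", "metric_type": "L2", "params": {"nlist": 128}},
)
```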
**Deploy the API server**
```bash
uvicorn main:app --reload --port=<port> --workers=<n_worker>
```
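The `main:app` target suggests an ASGI app (most likely FastAPI, though `main.py` is not shown in this excerpt). If so, a quick smoke test is to request the auto-generated docs page on whatever port you passed to `--port`:

```python
# Hedged smoke test: FastAPI (if that is what main:app is) serves interactive docs at /docs.
# Replace 8000 with the port you passed to --port.
import requests

resp = requests.get("http://localhost:8000/docs")
print(resp.status_code)   # expect 200 once the server is running
```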
## Acknowledgement
This is a research project, so we do not guarantee the accuracy of the model.
Since the performance heavily depends on the quality of the database (in the vector store), the current model's precision might not meet the expectations of production use.

## Citation
```
@misc{vimedicine,
    author = {Tran Tuan Anh, Phan Quang Hung, Dung Manh Nguyen},
    title = {ViMedicine: A Vietnamese LLM agent for drug and illness suggestion},
    year = {2023},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/nmd2k/vi-medicine/}},
}
```
**Our team:**
- Tran Tuan Anh*
- Phan Quang Hung*
- Dung Manh Nguyen*

*: Equal contribution
Empty file.
@@ -0,0 +1,65 @@
import sys

sys.path.append("../")  # sys.path entries must be strings, not lists

from pathlib import Path
from typing import Any, List, Dict

from langchain.schema import HumanMessage, SystemMessage

from utils.prompts import *
from utils.chatmodel import ChatModel
from app.exception.custom_exception import CustomException


class Functions:
    """Simple generate and embedding functions wrapping a chat model."""

    def __init__(self,
                 top_p: float = 1,
                 max_tokens: int = 512,
                 temperature: float = 0,
                 n_retry: int = 2,
                 request_timeout: int = 30, **kwargs) -> None:

        self.chatmodel = ChatModel(
            temperature=temperature,
            max_tokens=max_tokens,
            top_p=top_p,
            n_retry=n_retry,
            request_timeout=request_timeout,
            **kwargs
        )

    async def generate(self, message, prompt=None):
        """
        Chat model generate function.
        Args:
            - message (str): human query/message
            - prompt (str): optional system message
        Return:
            - str: generated output
        """
        try:
            messages = []
            if prompt:
                messages.append(SystemMessage(content=prompt))
            messages.append(HumanMessage(content=message))
            generate = self.chatmodel.generate(messages)

            return generate
        except Exception as exc:
            raise CustomException(exc)

    async def embed(self, message):
        """
        Embed a string input.
        Args:
            - message (str): message to embed
        Return:
            - List: embedding vector for the message
        """
        try:
            assert isinstance(message, str)
            embed = self.chatmodel.embed(message=message)
            print(len(embed))  # debug output: embedding dimensionality
            return embed
        except Exception as exc:
            raise CustomException(exc)
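For context, a hypothetical caller of this class might look like the following. The import path is a placeholder, since file paths are not visible in this excerpt, and the example query is purely illustrative.

```python
# Hypothetical usage sketch; the module path is assumed, not shown in this excerpt.
import asyncio

from functions import Functions  # adjust to wherever this module lives in the repo


async def main():
    fn = Functions(temperature=0, max_tokens=256)

    # Ask the chat model a question, optionally steering it with a system prompt.
    answer = await fn.generate(
        message="Bệnh nhân bị sốt và ho, có thể mắc bệnh gì?",  # "fever and cough, what illness?"
        prompt="You are a Vietnamese medical assistant.",
    )
    print(answer)

    # Embed a string for insertion into, or search against, the Milvus vector store.
    vector = await fn.embed("Paracetamol 500mg giảm đau, hạ sốt")
    print(len(vector))


asyncio.run(main())
```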