๐Ÿฅ OpenMedLLM-70B just released โ€” State-of-the-art on MedQA benchmark  ยท  Download now โ†’

📚 Documentation

Everything you need to get started with OpenMedLLM: quickstart, API reference, and model guides.

🚀 Quickstart

Welcome to OpenMedLLM, the open-source platform for medical AI. This guide will get you up and running in under 5 minutes.

New: OpenMedLLM-70B achieves 78.9% on the MedQA benchmark, surpassing GPT-4 on medical reasoning tasks. See models →

Installation

Install the OpenMedLLM Python SDK using pip:

bash
pip install openmedllm

# Or install from source for the latest features
git clone https://github.com/deepcog-ai/openmedllm
cd openmedllm && pip install -e .

Python SDK โ€” Basic Usage

Load and run the OpenMedLLM-70B model for clinical decision support:

python
from openmedllm import MedicalLLM

# Initialize the model
model = MedicalLLM("deepcog-ai/OpenMedLLM-70B")

# Run a diagnostic query
result = model.diagnose(
    symptoms="chest pain, dyspnea, diaphoresis",
    patient_age=58,
    ethnicity="South Asian"
)

print(result.differential)    # Ranked differential diagnoses
print(result.urgency)         # Triage urgency assessment
print(result.treatment_plan)  # Full narrative report
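The `symptoms` argument above is a single comma-separated string. If your application collects symptoms as a list, a small helper (hypothetical, not part of the SDK) can normalize the input first:

```python
def format_symptoms(symptoms):
    """Join a list of symptom strings into the comma-separated
    form used by the symptoms argument (hypothetical helper)."""
    # Trim whitespace and drop empty entries before joining.
    cleaned = [s.strip() for s in symptoms if s and s.strip()]
    return ", ".join(cleaned)

print(format_symptoms(["chest pain", " dyspnea ", "diaphoresis"]))
# chest pain, dyspnea, diaphoresis
```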

Clinical Note Processing

Process an entire clinical note and generate a comprehensive clinical report:

python
from openmedllm import MedicalLLM, ClinicalNoteProcessor

model = MedicalLLM("deepcog-ai/OpenMedLLM-70B")
processor = ClinicalNoteProcessor()

# Load and process the clinical note
note = processor.load("patient_note_001.txt")
report = model.generate_report(
    note=note,
    patient_info={"age": 45, "sex": "F", "ethnicity": "South Asian"},
    format="pdf"
)

report.save("clinical_report_001.pdf")
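For a batch of notes, the per-file pattern above extends naturally to a loop. A sketch, assuming an illustrative `patient_note_NNN.txt` naming convention (the helper below is hypothetical, not part of the SDK):

```python
from pathlib import Path

def report_path(note_path):
    """Derive an output name like clinical_report_001.pdf from a note
    file like patient_note_001.txt (naming convention is illustrative)."""
    ident = Path(note_path).stem.rsplit("_", 1)[-1]  # e.g. "001"
    return f"clinical_report_{ident}.pdf"

# Batch loop (sketch; `model` and `processor` as defined above):
# for note_file in sorted(Path("notes").glob("patient_note_*.txt")):
#     report = model.generate_report(...)  # same arguments as above
#     report.save(report_path(note_file))

print(report_path("patient_note_001.txt"))  # clinical_report_001.pdf
```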

REST API

Use the OpenMedLLM REST API from any programming language. All endpoints require an API key from your account dashboard.

bash
curl -X POST https://api.openmedllm.org/v1/diagnose \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openmedllm-70b",
    "symptoms": "chest pain, dyspnea, diaphoresis",
    "patient_age": 58,
    "ethnicity": "South Asian"
  }'
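The same call can be made from Python with only the standard library. A minimal sketch: the endpoint path and field names here are assumptions that mirror the SDK's `diagnose()` parameters, and the request is constructed but not sent:

```python
import json
import urllib.request

# Assumed endpoint path, mirroring the SDK's diagnose() method.
API_URL = "https://api.openmedllm.org/v1/diagnose"

def build_diagnose_request(api_key, symptoms, patient_age, ethnicity):
    """Construct (but do not send) the HTTP request for a diagnosis call."""
    payload = {
        "model": "openmedllm-70b",
        "symptoms": symptoms,
        "patient_age": patient_age,
        "ethnicity": ethnicity,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_diagnose_request(
    "YOUR_API_KEY", "chest pain, dyspnea, diaphoresis", 58, "South Asian"
)
# Send with: response = urllib.request.urlopen(req)
```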

Docker Deployment

Deploy OpenMedLLM-70B in your own infrastructure using our official Docker image:

bash
# Pull and run with GPU support
docker pull deepcogai/openmedllm:70b-latest

docker run --gpus all -p 8000:8000 \
  -e MODEL=openmedllm-70b \
  -e QUANTIZATION=4bit \
  deepcogai/openmedllm:70b-latest

# API will be available at http://localhost:8000