API Usage Guide

This project supports multiple methods to start and interact with the API.

Method 1: Standard API Start

Start the API using the main script:

python main.py
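
Once the server is running, you can call it from Python. The port and route below are placeholders (the actual routes live in the endpoints/ directory), so treat this as a sketch rather than the exact API surface:

import requests

# Hypothetical base URL and route -- check main.py and endpoints/ for the real ones.
url = "http://localhost:8000/predict"

with open("sample.jpg", "rb") as f:
    response = requests.post(url, files={"file": f})

print(response.status_code, response.text)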

Method 2: Gradio UI

To use the Gradio UI:

  1. In the endpoints/ directory, uncomment the code that enables the 'gradio' option.
  2. Run the UI script (a minimal sketch of a Gradio interface follows):
python ui.py
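
ui.py wires the project's pipeline into a Gradio interface. Its exact contents are not shown here, but a minimal Gradio app has roughly this shape (the echo function is only a stand-in for the real inference call):

import gradio as gr

def echo(text):
    # Stand-in for the project's actual inference function.
    return text

demo = gr.Interface(fn=echo, inputs="text", outputs="text")
demo.launch()  # serves the UI on http://localhost:7860 by default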

Method 3: Celery Optimization

Use Celery for optimized background task processing; preprocessing and inference are handled by separate worker queues. Adjust the concurrency parameters in the commands below as needed.
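
The commands below assume that tasks.py defines a Celery app with Redis as the broker and routes work to preprocess_queue and inference_queue. The task names and routing shown here are illustrative, not the project's actual definitions:

from celery import Celery

# Assumed broker/backend URLs -- adjust to match the configuration in tasks.py.
app = Celery("tasks", broker="redis://localhost:6379/0", backend="redis://localhost:6379/0")

# Hypothetical routing: send each task to the queue its worker listens on.
app.conf.task_routes = {
    "tasks.preprocess": {"queue": "preprocess_queue"},
    "tasks.run_inference": {"queue": "inference_queue"},
}

@app.task(name="tasks.preprocess")
def preprocess(path):
    # Placeholder for image/video preprocessing.
    return path

@app.task(name="tasks.run_inference")
def run_inference(path):
    # Placeholder for model inference on the preprocessed input.
    return {"input": path, "result": None}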

Step 1: Start Redis Server

redis-server
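
To confirm the broker is reachable before starting the workers, you can ping it from Python (assuming the redis client package is installed, which Celery's Redis support requires):

import redis

# Default host and port -- adjust if your redis-server uses a different address.
client = redis.Redis(host="localhost", port=6379, db=0)
print(client.ping())  # True if the server is up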

Step 2: Start Celery Workers

Open two separate terminals and start one worker per queue. With --pool=threads, the --concurrency flag sets the number of worker threads:

Terminal 1 (Preprocessing Queue):

celery -A tasks worker --pool=threads --loglevel=info --concurrency=2 --queues=preprocess_queue

Terminal 2 (Inference Queue):

celery -A tasks worker --pool=threads --loglevel=info --concurrency=3 --queues=inference_queue

Step 3: Run Debug Script

python celery_debug.py
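
celery_debug.py exercises the queues end to end. Its exact contents are not shown here; the sketch below illustrates the general pattern of dispatching a task to a named queue and waiting for the result, using the hypothetical task names from the earlier sketch:

from tasks import preprocess, run_inference  # assumes these tasks exist in tasks.py

# Send a test job to the preprocessing queue and block until it completes.
pre_result = preprocess.apply_async(args=["sample.jpg"], queue="preprocess_queue")
print("preprocess:", pre_result.get(timeout=60))

# Push a job through the inference queue in the same way.
inf_result = run_inference.apply_async(args=["sample.jpg"], queue="inference_queue")
print("inference:", inf_result.get(timeout=60))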