runtime error
Exit code: 1. Reason:
8: UserWarning: CUDA is not available or torch_xla is imported. Disabling autocast.
  @torch.autocast(device_type="cuda", dtype=torch.float32)
/usr/local/lib/python3.13/site-packages/diffusers/models/transformers/transformer_kandinsky.py:272: UserWarning: CUDA is not available or torch_xla is imported. Disabling autocast.
  @torch.autocast(device_type="cuda", dtype=torch.float32)
2026-02-03 18:46:43 - __main__ - INFO - Using device: cpu
2026-02-03 18:46:43 - httpx - INFO - HTTP Request: HEAD https://huggingface.co/Helsinki-NLP/opus-mt-ko-en/resolve/main/config.json "HTTP/1.1 307 Temporary Redirect"
2026-02-03 18:46:43 - httpx - INFO - HTTP Request: HEAD https://huggingface.co/api/resolve-cache/models/Helsinki-NLP/opus-mt-ko-en/e42d1f41b66194e6d10512f8a27bebc1f4f5097e/config.json "HTTP/1.1 200 OK"
2026-02-03 18:46:44 - httpx - INFO - HTTP Request: GET https://huggingface.co/api/resolve-cache/models/Helsinki-NLP/opus-mt-ko-en/e42d1f41b66194e6d10512f8a27bebc1f4f5097e/config.json "HTTP/1.1 200 OK"
config.json: 100%|██████████| 1.39k/1.39k [00:00<00:00, 7.75MB/s]
Traceback (most recent call last):
  File "/app/app.py", line 57, in <module>
    translator = translation_pipeline("translation", model="Helsinki-NLP/opus-mt-ko-en")
  File "/usr/local/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 777, in pipeline
    normalized_task, targeted_task, task_options = check_task(task)
                                                   ~~~~~~~~~~^^^^^^
  File "/usr/local/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 381, in check_task
    return PIPELINE_REGISTRY.check_task(task)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/usr/local/lib/python3.13/site-packages/transformers/pipelines/base.py", line 1354, in check_task
    raise KeyError(f"Invalid translation task {task}, use 'translation_XX_to_YY' format")
KeyError: "Invalid translation task translation, use 'translation_XX_to_YY' format"
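The final KeyError pinpoints the bug at line 57 of /app/app.py: this transformers version's pipeline registry rejects the bare "translation" task name and requires the language-pair form "translation_XX_to_YY". A minimal sketch of the fix, assuming the app only needs Korean-to-English; the helper `translation_task` is hypothetical, not part of transformers, and the actual pipeline call is shown as a comment because it downloads the model:

```python
# Corrected usage for /app/app.py, line 57.  check_task() here only
# accepts language-pair task names, so "translation" must become
# "translation_ko_to_en".

def translation_task(src: str, tgt: str) -> str:
    """Build the 'translation_XX_to_YY' task name the pipeline registry expects."""
    return f"translation_{src}_to_{tgt}"

# With the real library (downloads the model weights, so commented out):
#   from transformers import pipeline
#   translator = pipeline(translation_task("ko", "en"),
#                         model="Helsinki-NLP/opus-mt-ko-en")
#   translator("...")  # returns [{"translation_text": ...}]

print(translation_task("ko", "en"))  # -> translation_ko_to_en
```

The same model can also be loaded with the generic "translation" task on newer transformers releases, but pinning the explicit pair form works on both old and new versions.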